CN117724603A - Interface display method and electronic equipment - Google Patents

Interface display method and electronic equipment

Info

Publication number
CN117724603A
Authority
CN
China
Prior art keywords
interface
target
background
picture
node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310628628.5A
Other languages
Chinese (zh)
Inventor
马生博
周向春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310628628.5A priority Critical patent/CN117724603A/en
Publication of CN117724603A publication Critical patent/CN117724603A/en
Pending legal-status Critical Current

Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

The present application provides an interface display method and an electronic device, relates to the field of terminal technologies, and aims to reduce the number of times a background picture is acquired and blurred when the background of a drawing node in a target interface to be displayed is blurred, thereby saving system power consumption. The interface display method includes the following steps: after the electronic device determines, according to the blur parameters of a drawing node in the target interface to be displayed, that the drawing node is a target drawing node whose background is to be blurred, if the electronic device determines that a stored background picture has already been blurred based on those blur parameters, the electronic device crops a first background area corresponding to the position of the target drawing node from the background picture; then, after drawing the first background area at the position of the target drawing node, the electronic device draws the content of the target drawing node on the layer above the first background area and generates the target interface, so that the target interface is displayed on the display screen.

Description

Interface display method and electronic equipment
Technical Field
Embodiments of the present application relate to the field of terminal technologies, and in particular to an interface display method and an electronic device.
Background
To achieve a good interface display effect, an electronic device blurs the background of an interface in scenarios such as a notification bar, a status bar, a pop-up window, or a card. This improves the quality of the displayed interface, highlights its visual design, and enhances the user experience.
In the related art, a system generally blurs the background of each interface control in an interface to be displayed using real-time background blurring. However, during this process the system separately acquires the background picture corresponding to the background of each interface control and separately blurs each acquired picture. This background blurring process therefore consumes considerable system power.
Disclosure of Invention
Embodiments of the present application provide an interface display method and an electronic device. For a target drawing node whose background is to be blurred in a target interface to be displayed, when the electronic device determines that a stored background picture has already been blurred based on the node's blur parameters, the electronic device directly crops the background area corresponding to the position of the target drawing node from the stored background picture and uses the cropped area as the background of the area corresponding to the target drawing node's interface control. Because the electronic device can store the blurred background picture, the background area corresponding to each target drawing node's position can be cropped directly from the blurred picture; the background picture does not need to be acquired and blurred once per target drawing node. Reducing the number of times the background picture is acquired and blurred saves system power.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, an interface display method is provided, which may include: after the electronic device determines, according to the blur parameters of a drawing node in a target interface to be displayed, that the drawing node is a target drawing node whose background is to be blurred, if the electronic device determines that a stored background picture has already been blurred based on those blur parameters, the electronic device crops a first background area corresponding to the position of the target drawing node from the background picture; then, the electronic device draws the first background area at the position of the target drawing node, draws the content of the target drawing node on the layer above the first background area, generates the target interface, and displays it.
The target interface includes an interface control corresponding to the target drawing node, and the background area of that interface control is the blurred first background area.
The blur parameters of a drawing node may include a blur radius and/or a blur filter. The stored background picture is one processed based on the drawing node's blur parameters.
It can be understood that once the electronic device determines that the background picture of the target drawing node has already been blurred with the node's blur parameters and stored, the electronic device can directly crop the background area corresponding to the node's position from that picture, without re-acquiring the background picture and blurring it each time the target drawing node is drawn. Reducing the number of acquisitions and blur passes saves system power.
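The caching scheme described above can be sketched as follows. This is an illustrative model, not the patented implementation: all names (`BlurCache`, `crop_region`, `background_for_node`) are hypothetical, the "picture" is modeled as a 2-D list of pixel values, and the expensive blur pass is a stand-in. The point it demonstrates is that one blur pass serves every target drawing node whose parameters match.

```python
def crop_region(picture, x, y, width, height):
    """Crop a rectangular region from a picture stored as a 2-D list of pixel rows."""
    return [row[x:x + width] for row in picture[y:y + height]]


class BlurCache:
    """Caches the blurred background picture so each target drawing node can
    crop its own region instead of re-acquiring and re-blurring the background."""

    def __init__(self):
        self._blurred = {}   # (picture_id, blur_params) -> blurred picture
        self.blur_calls = 0  # counts how often the expensive blur actually runs

    def _blur(self, picture, blur_params):
        self.blur_calls += 1
        return picture  # stand-in for a real blur pass

    def background_for_node(self, picture_id, picture, blur_params, node_rect):
        """Return the blurred background area for one target drawing node."""
        key = (picture_id, blur_params)
        if key not in self._blurred:          # blur at most once per parameter set
            self._blurred[key] = self._blur(picture, blur_params)
        x, y, w, h = node_rect
        return crop_region(self._blurred[key], x, y, w, h)
```

With two nodes sharing the same picture identifier and blur parameters, `blur_calls` stays at 1: the second node's background area is cropped from the cached blurred picture.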
In a possible implementation manner of the first aspect, the interface display method may further include:
after the electronic device acquires the background picture through the target application corresponding to the target interface, the electronic device stores the picture identifier of the background picture as a parameter in the attribute information of each drawing node.
The picture identifier of the background picture may indicate the storage location of the background picture.
That the electronic device determines that the stored background picture has been blurred based on the blur parameters includes:
the electronic device determines that the background picture indicated by the picture identifier stored in the attribute information of the target drawing node has already been blurred based on the blur parameters.
It can be understood that, because the electronic device stores the picture identifier of the background picture in the attribute information of each drawing node, once the electronic device has identified the target drawing node from the blur parameters stored in each node's attribute information, it can directly obtain, from the identifier stored in the target node's attribute information, the background picture used to blur that node, and that picture has already been blurred based on the blur parameters. The electronic device therefore does not need to acquire and blur the background picture once per target drawing node; reducing the number of acquisitions and blur passes saves system power.
In another possible implementation manner of the first aspect, the interface display method may further include:
if the electronic device determines that the stored background picture has not been blurred based on the blur parameters, the electronic device blurs the background picture with those parameters to obtain a blurred background picture; then, the electronic device crops the first background area corresponding to the position of the target drawing node from the blurred background picture.
That is, in one possible case of the embodiments of the present application, the background picture that the electronic device obtains through the target application corresponding to the target interface may already be blurred with the drawing node's blur parameters, may not be blurred at all, or may be blurred but with parameters other than the drawing node's.
When the electronic device determines that the background picture indicated by the picture identifier stored in the attribute information of the target drawing node has not been blurred with the target node's blur parameters, the electronic device can blur the picture with those parameters, so that when the node's background is later blurred, the background area can be cropped from an already-blurred picture.
In another possible implementation manner of the first aspect, that the electronic device determines that the stored background picture has not been blurred based on the blur parameters includes:
if the electronic device determines that the background picture has been acquired from the target application for the first time, the electronic device determines that the picture has not been blurred based on the blur parameters; or, if the electronic device determines that the parameters with which the background picture was blurred differ from the target drawing node's blur parameters, the electronic device likewise determines that the picture has not been blurred based on the blur parameters.
It can be understood that when the electronic device determines that the target application has obtained the background picture from the application layout file for the first time, the picture has not undergone any blurring. That is, when the acquired background picture is still a clear picture, the electronic device determines that it has not been blurred based on the blur parameters.
In addition, the electronic device may determine that the background picture obtained through the target application has been blurred, but with parameters different from the target drawing node's. In this case, the electronic device also determines that the picture has not been blurred based on the target drawing node's blur parameters.
As an example, assume the electronic device determines that the background picture acquired by the target application was blurred with blur parameter A, while the blur parameter in the target drawing node's attribute information is blur parameter B, and B differs from A. In this case, the electronic device determines that the picture has not been blurred based on the target drawing node's blur parameters.
In another possible implementation manner of the first aspect, the blur parameters include a blur radius, and blurring the background picture with the blur parameters to obtain the blurred background picture includes:
after the electronic device generates a blur filter from the blur radius, it blurs the background picture through the filter to obtain the blurred background picture.
The blur radius characterizes the degree of blur of the background picture: the larger the radius, the more blurred the picture; the smaller the radius, the clearer the picture.
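The relationship between radius and blur degree can be made concrete with the simplest possible filter. This is a box blur over a single row of pixels, chosen only for illustration; the patent does not specify the filter type, and `box_blur_row` is a hypothetical name. A larger radius averages over a wider window, so edges spread out more.

```python
def box_blur_row(row, radius):
    """Blur a 1-D row of pixel values with a box filter of the given radius."""
    if radius <= 0:
        return list(row)  # radius 0: the picture stays sharp
    n = len(row)
    out = []
    for i in range(n):
        # average over the window [i - radius, i + radius], clipped at the edges
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = row[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

For a single bright pixel in a dark row, radius 1 spreads its value over 3 pixels and radius 2 over 5, so the peak value drops as the radius grows — exactly the "larger radius, more blurred" behavior described above.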
In another possible implementation manner of the first aspect, the interface display method may further include:
the electronic device displays a first interface that includes first interface content; in response to a sliding operation on the first interface, the electronic device displays a sliding-process interface or an after-sliding interface. Either interface includes the first interface content, which in turn includes the interface control corresponding to the target drawing node; the interface control is in a compressed state, and the background area of the interface control corresponding to the target drawing node is blurred.
In one possible case, while the electronic device displays the first interface and detects the user's sliding operation on it, the electronic device displays the sliding-process interface in response.
That is, while the display screen receives the sliding operation, the electronic device displays the transition from the blurred first display page to the target display page.
In another possible case, while the electronic device displays the first interface and detects the user's sliding operation on it, the electronic device displays the after-sliding interface in response. The electronic device thus displays a blurred interface after sliding, so that even when the interface is scaled, the target drawing node's background remains blurred.
The sliding operation may be leftward, rightward, upward or downward; embodiments of the present application do not limit the sliding direction. For example, the electronic device displays the sliding-process interface in response to the user sliding the first interface to the left.
In another possible implementation manner of the first aspect, displaying the sliding-process interface or the after-sliding interface in response to a sliding operation on the first interface includes:
in response to the sliding operation, the electronic device determines the lateral and longitudinal compression ratios of the control corresponding to the target drawing node relative to a canvas of preset size, based on the node's first coordinate value in the canvas corresponding to the first interface and its second coordinate value relative to the preset-size canvas; the electronic device then compresses the first display page by these ratios to obtain the target display page, and displays it.
The first display page includes the interface control corresponding to the target drawing node, and the area corresponding to that control is blurred. The interface control in the target display page is in a compressed state, and its background area is blurred.
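A hedged sketch of the ratio computation described above: the node's extent in the first interface's canvas versus its extent relative to the preset-size canvas yields the lateral (x) and longitudinal (y) compression ratios, which then scale the page. The rectangle representation and the function names are assumptions for illustration.

```python
def compression_ratios(first_rect, preset_rect):
    """Lateral and longitudinal compression ratios of a control.

    first_rect:  (x, y, width, height) in the canvas of the first interface.
    preset_rect: (x, y, width, height) relative to the preset-size canvas.
    """
    _, _, w1, h1 = first_rect
    _, _, w2, h2 = preset_rect
    return w2 / w1, h2 / h1  # (lateral ratio, longitudinal ratio)


def compress_rect(rect, lateral, longitudinal):
    """Apply the compression ratios to a rectangle on the page."""
    x, y, w, h = rect
    return (x * lateral, y * longitudinal, w * lateral, h * longitudinal)
```

For example, a 200x100 control mapped to a 100x80 extent on the preset canvas gives ratios (0.5, 0.8), and compressing any rectangle on the first display page by those ratios yields its position and size on the target display page.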
In another possible implementation manner of the first aspect, after responding to the sliding operation, the interface display method may further include:
if the electronic device determines that the stored background picture has been blurred based on the blur parameters, the electronic device crops, from the background picture, a second background area corresponding to the compressed position of the target drawing node; the electronic device stretches the second background area by the lateral and longitudinal compression ratios to obtain the target background area; the electronic device then draws the target background area at the position of the target drawing node and draws the node's content on the layer above it, obtaining the blurred first display page.
It can be understood that, compared with directly displaying the second background area at the target drawing node's compressed position, drawing the stretched target background area at the node's position and then drawing the node's content on the layer above it yields the blurred first display page and avoids displaying an unblurred background area at the node's position while the page slides.
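The stretch step above can be sketched with nearest-neighbour resampling: the second background area, cropped at the compressed node position, is expanded back so it fills the target drawing node's area. Nearest-neighbour is an illustrative choice only; the patent does not specify the resampling method, and `stretch` is a hypothetical name.

```python
def stretch(region, target_w, target_h):
    """Nearest-neighbour stretch of a 2-D pixel region to target_w x target_h."""
    src_h, src_w = len(region), len(region[0])
    out = []
    for ty in range(target_h):
        sy = ty * src_h // target_h  # map each target row back to a source row
        row = [region[sy][tx * src_w // target_w] for tx in range(target_w)]
        out.append(row)
    return out
```

Stretching a 2x2 region to 4x4 duplicates each source pixel into a 2x2 block, which is the inverse of compressing by ratio 0.5 in both directions.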
In one possible case, compressing the first display page by the lateral and longitudinal compression ratios includes:
the electronic device compresses the blurred first display page by the lateral and longitudinal compression ratios.
In another possible implementation manner of the first aspect, before the electronic device displays the target display page, displaying the sliding-process interface may further include:
the electronic device displays the transition from the blurred first display page to the target display page.
In another possible implementation manner of the first aspect, before the electronic device determines the lateral and longitudinal compression ratios of the control corresponding to the target drawing node relative to the preset-size canvas, based on the node's first coordinate value in the canvas corresponding to the first interface and its second coordinate value relative to the preset-size canvas, the interface display method may further include:
the electronic device determines the first coordinate values of all drawing nodes in the canvas corresponding to the first interface; starting from the target drawing node, the electronic device traverses upward, determines the first coordinate value of each parent node, and computes an intermediate coordinate value from the parent's first coordinate value and the target node's first coordinate value, until the parent reached is the root node; it then determines the target drawing node's second coordinate value from the root node's first coordinate value and the intermediate coordinate value.
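One plausible reading of the upward traversal is accumulating each ancestor's offset until the root is reached, converting a node's local coordinates into coordinates in the root canvas. This is a sketch under that assumption; the `Node` shape and `absolute_coords` name are hypothetical, and the patent's intermediate-coordinate computation may differ in detail.

```python
class Node:
    """A drawing node with coordinates relative to its parent."""

    def __init__(self, x, y, parent=None):
        self.x, self.y, self.parent = x, y, parent


def absolute_coords(node):
    """Walk up through parent nodes to the root, accumulating offsets."""
    x, y = node.x, node.y
    p = node.parent
    while p is not None:  # stop once the traversed parent is the root's parent
        x += p.x
        y += p.y
        p = p.parent
    return x, y
```

A leaf at (2, 3) inside a parent at (5, 5) inside a root at (0, 10) resolves to (7, 18) in the root canvas.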
In another possible implementation manner of the first aspect, the interface display method may further include:
the electronic device displays a second interface that includes second interface content;
in response to a zoom-in/zoom-out operation on the second interface, the electronic device displays a third interface. The third interface includes the second interface content, which includes the interface control corresponding to the target drawing node; the interface control in the third interface is in a compressed state, and its background area is blurred.
In one possible scenario, after the electronic device detects the user's zoom operation on the second interface, the interface control in the displayed third interface is in a compressed state. While the third interface is displayed, after the electronic device detects the user's trigger operation anywhere on the display screen, the electronic device may display a fourth interface that includes the second interface content; the interface control in the fourth interface is in an uncompressed state, and the background area of the interface control corresponding to the target drawing node is blurred.
In another possible implementation manner of the first aspect, determining, according to the blur parameters of a drawing node in the target interface to be displayed, that the drawing node is the target drawing node includes:
the electronic device obtains the attribute information of each drawing node in the target interface to be displayed; if the attribute information of a drawing node includes blur parameters, the electronic device determines that the node is a target drawing node.
It is understood that when the attribute information of a drawing node does not include blur parameters, the electronic device determines that the node needs no blurring; when the attribute information does include blur parameters, the electronic device determines that the node is a target drawing node whose background is to be blurred.
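The selection rule reduces to a simple filter over the node list. A sketch under assumed data shapes (nodes as dicts with an `attrs` dict; the key name `blur_params` is hypothetical):

```python
def target_nodes(nodes):
    """Return the drawing nodes whose attribute information carries blur
    parameters — i.e. the target drawing nodes whose background is to be blurred."""
    return [n for n in nodes if "blur_params" in n["attrs"]]
```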
In another possible implementation manner of the first aspect, the electronic device includes an application layer, a framework layer and a system library, the application layer includes the target application, and before the electronic device determines that the stored background picture has been blurred based on the blur parameters, the method further includes:
after the electronic device acquires the background picture through the target application of the application layer, the target application sends the picture identifier of the background picture to the drawing node corresponding to each interface control in the framework layer; each such drawing node sends the picture identifier to its corresponding drawing node in the system library; after receiving the picture identifier, each drawing node in the system library stores it as a parameter in its attribute information.
It can be understood that, after the picture identifier obtained through the application-layer target application is sent to the drawing node corresponding to each interface control and stored as a parameter in the node's attribute information, the electronic device determines the target drawing node, obtains the background picture from the location indicated by the picture identifier stored in the target node's attribute information, directly crops the background area corresponding to the node's position from that picture, and uses it as the background of the area corresponding to the node's control. Because the electronic device can store the blurred background picture, the background area for each target drawing node can be cropped directly from it; the picture does not need to be acquired and blurred once per node, and reducing the number of acquisitions and blur passes saves system power.
In a second aspect, the present application provides an electronic device, comprising: a display screen; one or more processors; a memory; wherein the memory stores one or more computer programs, the one or more computer programs comprising instructions that, when executed by the electronic device, cause the electronic device to perform the interface display method of any of the first aspects above.
In a third aspect, the present application provides a computer readable storage medium having instructions stored therein that, when executed on an electronic device, cause the electronic device to perform the interface display method of any one of the first aspects.
In a fourth aspect, the present application provides a computer program product comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the interface display method according to any one of the first aspects.
It will be appreciated that the electronic device of the second aspect, the computer storage medium of the third aspect and the computer program product of the fourth aspect are all configured to perform the corresponding methods provided above; their beneficial effects are therefore the same as those of the corresponding methods and are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a software structure diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a rendering tree according to an embodiment of the present application;
FIG. 4 is a first exemplary diagram of an interface display provided in an embodiment of the present application;
fig. 5 is a coordinate example diagram of a drawing node according to an embodiment of the present application;
FIG. 6 is a diagram of a background blur example provided in an embodiment of the present application;
FIG. 7 is a second exemplary diagram of an interface display provided in an embodiment of the present application;
FIG. 8 is a third exemplary diagram of an interface display provided in an embodiment of the present application;
FIG. 9 is a fourth exemplary diagram of an interface display provided in an embodiment of the present application;
FIG. 10 is a fifth exemplary diagram of an interface display provided in an embodiment of the present application;
FIG. 11 is a sixth exemplary diagram of an interface display provided in an embodiment of the present application;
FIG. 12 is a seventh exemplary diagram of an interface display provided in an embodiment of the present application;
FIG. 13 is an eighth exemplary diagram of an interface display provided in an embodiment of the present application;
FIG. 14 is a ninth exemplary diagram of an interface display provided in an embodiment of the present application;
fig. 15 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone.
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the related art, in order to achieve a better interface display effect, in some scenarios, an electronic device may perform fuzzy processing on a background area of an interface control displayed in an interface. The interface control and the drawing node are mutually corresponding. In the process that the electronic device performs the blurring process on the background area of the drawing node, the electronic device generally performs the blurring process on the background area of the drawing node according to the background picture after performing the blurring process on the acquired background picture after acquiring the background picture corresponding to the drawing node.
The background area of the drawing node where the background change is not frequent is usually a fixed background picture, such as a notification bar or a card displayed in a desktop application, and the like, where the background change is not frequent. However, in the process of blurring a background area of a drawing node where background changes are not frequent, a rendering thread of the electronic device needs to acquire a background picture corresponding to each drawing node from an application layer, and blur the acquired background picture. Therefore, when the electronic device draws each drawing node to be subjected to background blurring, a background picture is acquired once, blurring processing is carried out on the acquired background picture, and the problem of high power consumption exists, so that the performance of the electronic device is influenced, and the use experience of a user is influenced.
Therefore, the embodiments of this application provide an interface display method. For each target drawing node whose background is to be blurred in a target interface to be displayed, when the electronic device determines that a stored background picture is a picture that has already been blurred based on the blur parameters, the electronic device directly crops, from the background picture, a first background area corresponding to the position of the target drawing node, and draws the first background area at that position. In this way, the electronic device does not need to acquire a background picture and blur it once for every target drawing node, and system power consumption is saved by reducing the number of acquisition and blurring operations on the background picture.
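The reuse described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation; `BlurParams`, `BackgroundCache`, and the injected `blur_fn`/`crop_fn` callables are hypothetical names standing in for the system's actual blurring and cropping facilities.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BlurParams:
    radius: int          # blur radius: pixels a point is expanded outward
    shape: str           # e.g. "rounded_rect", "circle", "diamond"

class BackgroundCache:
    """Caches one blurred background picture keyed by its blur parameters."""
    def __init__(self):
        self._blurred = None     # the stored, already-blurred picture
        self._params = None      # the parameters it was blurred with

    def get_region(self, picture, params, rect, blur_fn, crop_fn):
        # Blur only when no stored picture matches the requested parameters.
        if self._params != params:
            self._blurred = blur_fn(picture, params)
            self._params = params
        # Otherwise just crop the area covering the target drawing node.
        return crop_fn(self._blurred, rect)
```

Under this sketch, when several target drawing nodes share the same blur parameters, the background picture is blurred once and only cropped per node, which is the power saving the method aims at.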
For example, the interface display method provided in the embodiments of this application may be applied to electronic devices having a display screen, such as a mobile phone, a tablet computer, a personal computer (personal computer, PC), a personal digital assistant (personal digital assistant, PDA), a smart watch, a netbook, a wearable electronic device, an augmented reality (augmented reality, AR) device, a virtual reality (virtual reality, VR) device, a vehicle-mounted device, a smart car, or a smart speaker. This is not limited in the embodiments of this application.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of this application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate an operation control signal according to an instruction operation code and a timing signal, to complete control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may store instructions or data just used or cyclically used by the processor 110. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect another electronic device, such as an AR device.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. Wireless communication techniques may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In this embodiment of the present application, the display screen 194 is further configured to display a target interface, where the target interface includes an interface control corresponding to a target drawing node, and a background area of the interface control is a first background area after the blurring process.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency, the digital signal processor is used to perform Fourier transform or the like on the frequency energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a plurality of encoding formats, for example: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip cover using the magnetic sensor 180D.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to applications such as landscape/portrait switching and pedometers.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen".
The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to come into contact with or be separated from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and may also be compatible with an external memory card. The electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device may use a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of this application take a system with a layered architecture as an example to illustrate the software structure of the electronic device.
Fig. 2 is a software structure diagram of an electronic device according to an embodiment of the present application.
It will be appreciated that the layered architecture divides the software into several layers, each with a clear role and division. The layers communicate with each other through a software interface. In some embodiments, the system may include an application layer (simply referred to as an application layer), an application Framework layer (Framework), a system library (Native Libs), and a hardware abstraction layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include system applications. A system application is an application preinstalled in the electronic device before the electronic device leaves the factory. For example, system applications may include programs such as desktop, pull-down search, camera, gallery, calendar, music, messages, and weather.
The application package may also include a third party application, which refers to an application installed after a user downloads the installation package from an application store (or application marketplace). For example, a map class application, a take-away class application, a reading class application (e.g., e-book), a social class application, a travel class application, and the like.
The application framework layer (abbreviated as framework layer) provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the framework layer may include a view system and a rendering node (renderNode).
The view system comprises visual controls, such as a control for displaying characters, a control for displaying pictures and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The system library may include a rendering node, a rendering pipeline (pipeline), an image rendering engine (Skia), and the like.
The rendering node is used to represent each element in the interface.
In this embodiment of the present application, the interface displayed by the electronic device may be implemented by a rendering tree (RenderTree), where the tree structure includes a plurality of drawing nodes, and each drawing node includes a corresponding control image drawing logic. Each rendering node may be configured with various attributes, for example, the rendering node may be configured with a background effects attribute to add a background effect to the background content contained in the control image corresponding to the node.
For example, fig. 3 is a schematic structural diagram of a rendering tree according to an embodiment of this application. As shown in fig. 3, it is assumed that interface A can be implemented by drawing node 1 to drawing node 5.
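As a rough illustration of such a tree of drawing nodes, the sketch below builds a small tree and walks it to find the nodes whose background-effect attribute is set. The class and function names are hypothetical; the real rendering tree (RenderTree) is a framework-internal structure whose details the description above does not fix.

```python
class RenderNode:
    """Illustrative drawing node: optional blur attribute plus child nodes."""
    def __init__(self, name, blur_params=None):
        self.name = name
        self.blur_params = blur_params   # None means no background blur is set
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

def collect_blur_targets(node, out=None):
    """Depth-first walk returning the nodes whose background must be blurred."""
    if out is None:
        out = []
    if node.blur_params is not None:
        out.append(node)
    for child in node.children:
        collect_blur_targets(child, out)
    return out
```

For the five-node example of fig. 3, only the nodes carrying a background-effect attribute would be collected as target drawing nodes.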
The rendering pipeline is a channel for transporting data.
The image rendering engine provides the ability to draw pictures and draw shapes.
In embodiments of the present application, the hardware abstraction layer may include a graphics processor and a display screen.
In the embodiments of this application, after the application layer acquires the background picture and sets the blur parameters of each drawing node in the interface to be displayed by invoking the background blurring capability, it sends the blur parameters of each drawing node and the picture identifier of the corresponding background picture to the framework layer. The blur parameters include, but are not limited to, the blur radius and a shape feature (for example, rounded rectangle, circle, or diamond). The blur radius represents the degree of blurring of the picture; its value is the number of pixels by which a pixel point is expanded outward. The background picture is the picture used to blur the background of the drawing node. The picture identifier of the background picture carries the storage address of the background picture.
In the embodiment of the present application, the background picture obtained by the application layer may be a clear picture or a picture after blurring processing, which is not limited herein. The shape features refer to the shape of the background picture after being processed.
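The parameters described above can be sketched as a small data structure. This is an illustrative sketch only; the names and layout are assumptions, not the actual attribute structure of the patent's implementation.

```python
# Hypothetical sketch of the per-node blur parameters and picture identifier
# described above: a blur radius, a shape feature, and a picture identifier
# carrying the background picture's storage address.
from dataclasses import dataclass

@dataclass
class BlurParams:
    blur_radius: int   # number of pixels each pixel point is expanded outward
    shape: str         # e.g. "rounded_rect", "circle", "diamond"

@dataclass
class PictureId:
    storage_address: str   # where the background picture is stored

def make_node_attributes(radius, shape, address):
    """Bundle the values the application layer sends to the framework layer."""
    return {"blur_params": BlurParams(radius, shape),
            "picture_id": PictureId(address)}
```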
After receiving the blur parameters and the picture identifiers sent by the application layer, the framework layer sends them to the system library through the Java Native Interface (JNI).
After receiving the blur parameters of each drawing node and the picture identifier of the background picture sent by the framework layer, the system library stores the blur parameters and the picture identifier as parameters in the attribute information of the drawing nodes. The picture identifier indicates the storage location of the background picture.
In one possible case of the embodiment of the present application, if the system library determines that the background picture indicated by the picture identifier stored in the attribute information of a drawing node is a picture already processed based on the blur parameters, the system library may blur the background area of the drawing node directly from that background picture.
For easy understanding, the following embodiments of the present application will take a mobile phone having the above hardware structure and software architecture as an example, and the interface display method provided in the embodiments of the present application will be described by way of example with reference to the accompanying drawings.
In the embodiment of the application, in the process of displaying the target interface to be displayed, the mobile phone determines, according to the blur parameters of the drawing nodes in the target interface, which drawing nodes are target drawing nodes whose background is to be blurred. For each target drawing node in the target interface, the mobile phone crops, from the stored background picture, a first background area corresponding to the position of the target drawing node. Then, after drawing the first background area at the position of the target drawing node, the mobile phone draws the content of the target drawing node on the layer above the first background area and generates the target interface. The target interface is then displayed on the display screen of the mobile phone; it includes the interface control corresponding to the target drawing node, and the background area of that interface control is the blurred first background area.
In one possible case of the application, if the mobile phone determines that the stored background picture has already been blurred based on the blur parameters, the mobile phone crops the first background area corresponding to the position of the target drawing node directly from the background picture. Then, after drawing the first background area at the position of the target drawing node and drawing the content of the target drawing node on the layer above it, the mobile phone displays the target interface on its display screen. The target interface includes the interface control corresponding to the target drawing node, and the background area of the interface control is the blurred first background area.
In another possible case of the application, if the mobile phone determines that the stored background picture has not been blurred based on the blur parameters, the mobile phone blurs the background picture using the blur parameters of the target drawing node to obtain a background-blurred picture. Then, the mobile phone crops the first background area corresponding to the position of the target drawing node from the background-blurred picture.
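The two cases above can be sketched as a single decision: reuse the stored background picture when it was already blurred with the node's blur parameters, otherwise blur it once and remember that fact. All names here are illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch of the reuse-or-blur decision described above. The
# "store" remembers which blur parameters the background picture was last
# blurred with; matching parameters let later nodes skip the blur pass.
def render_target_node(node, store):
    """Return the ordered steps taken to produce the node's blurred background."""
    steps = []
    if store.get("blurred_with") != node["blur_params"]:
        steps.append("blur")                       # blur the whole background picture
        store["blurred_with"] = node["blur_params"]
    steps += ["crop", "draw_background", "draw_content"]
    return steps
```

The second call with identical parameters skips the "blur" step, which is the source of the power saving the application claims.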
The above method for displaying the target interface on the display screen of the mobile phone is applicable to a scenario in which an application first displays the target interface after the mobile phone is powered on, to a scenario in which the target interface is displayed when the application is updated, and to a scenario in which the target interface is displayed when an interface within the application is refreshed; this is not limited in the embodiment of the application. In addition, the application corresponding to the target interface displayed by the electronic device may be a desktop application, a negative-one-screen application, or another application whose background is not frequently switched, which is not limited in the embodiment of the present application.
The interface display method in the embodiment of the present application is described in an exemplary manner by taking the above application as a desktop application and taking a scenario of displaying a target interface in a process of starting the desktop application when the mobile phone is started as an example.
In the embodiment of the application, when the desktop application lays out its target interface to be displayed according to the application layout file during startup of the desktop application by the mobile phone, the desktop application can design the display effect of each interface control in the target interface according to the application layout file and generate the blur parameters corresponding to the target interface controls. A target interface control is a control whose background is to be blurred and whose background picture resource does not change frequently.
Illustratively, as shown in fig. 4, the target interface control may be some of the controls in the notification bar, for example, the control 401 shown in (a) of fig. 4; may be a control displayed on the desktop, for example, the control 402 and the control 403 shown in (b) of fig. 4; or may be another control displayed on the desktop, for example, the control 404 shown in (c) of fig. 4.
The application layout file of the desktop application records the code logic of the drawing nodes corresponding to the controls, the association relationships (such as parent-child relationships) between the drawing nodes corresponding to the controls, the target controls whose background is to be blurred, the timing at which the target controls are to be background-blurred, and the like.
The background blurring timing of a target interface control may be the moment when the target interface control is displayed on the desktop, or the moment when the mobile phone detects that the target interface control slides on the desktop; the background blurring timing of the target control is not limited here.
It should be explained that, in the embodiment of the present application, the specific application layout file used when the desktop application lays out the target interface of the desktop is not limited, and it may be set according to actual application requirements.
In the embodiment of the application, the desktop application can set a horizontal blur radius and a vertical blur radius according to the display-effect design of each interface control in the application layout file, so as to generate the blur parameters corresponding to the target interface control. After generating the blur parameters corresponding to the target interface controls, the desktop application can call an interface to send the blur parameters to the drawing node corresponding to the target interface control in the framework layer. For example, the desktop application may pass the blur parameters as an input parameter of the setHnBlurParameters interface to the drawing node corresponding to the target interface control in the framework layer. After the drawing node corresponding to the target interface control in the framework layer receives the blur parameters, it sends them through the JNI interface to the drawing node corresponding to the same target control in the system library. The drawing node of the system library then stores the blur parameters as attribute information in the structure of the drawing node corresponding to the target interface control.
The drawing node of the system library and the drawing node of the framework layer corresponding to the target interface control are representations of the same drawing node at different levels of the system software architecture. That is, the drawing node of the system library and the drawing node of the framework layer point to the content of the same target control.
In the embodiment of the application, after the mobile phone obtains, through the desktop application, the background picture for blurring the background area of the target interface control, the mobile phone can store the picture identifier of the background picture as a parameter in the attribute information of each drawing node. The picture identifier indicates the storage address of the background picture; for example, the picture identifier may carry that storage address. In this way, when the background area of a target drawing node in the target interface of the desktop application is blurred, the background picture can be obtained directly from the storage location indicated by the picture identifier in the attribute information of the target drawing node, without re-acquiring it from the application layer, which saves part of the system's power consumption.
For example, when the desktop application draws its target interface for the first time, after determining the background picture of each interface control in the target interface according to the application layout file, the desktop application may transmit the picture identifier of the background picture as a parameter to the system library. For example, the desktop application may pass the picture identifier as an input parameter of the setHnBlurParameters interface to the drawing node corresponding to the interface control. After the drawing node of the interface control receives the picture identifier, it sends the picture identifier to the drawing node of the framework layer, which in turn sends it through the JNI to the drawing node of the system library. The drawing node of the system library stores the picture identifier as attribute information in the structure of the drawing node corresponding to the interface control. The system library can then obtain the background picture from the storage address indicated by the picture identifier stored in the attribute information of each drawing node, without acquiring it from the application layer, saving part of the system's power consumption.
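The delivery path described above can be sketched as a chain of objects. The class and method names below are illustrative only (the actual interface is setHnBlurParameters passing through JNI); this sketch just shows a value set at the framework layer ending up in the system-library node's attribute structure.

```python
# Hypothetical sketch of the parameter delivery chain: application ->
# framework-layer drawing node -> (JNI) -> system-library drawing node,
# which stores the value in its attribute structure.
class SystemLibNode:
    def __init__(self):
        self.attributes = {}

    def receive(self, key, value):
        # In the real flow this is reached through the JNI interface.
        self.attributes[key] = value

class FrameworkNode:
    def __init__(self, peer):
        self.peer = peer   # the same control's node in the system library

    def set_blur_parameters(self, key, value):
        self.peer.receive(key, value)   # forwarded "through JNI"

def publish_picture_id(framework_node, address):
    """What the application does after determining the background picture."""
    framework_node.set_blur_parameters("picture_id", address)
```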
For convenience of subsequent description, the picture used for blurring the background of the target interface control is referred to as the background picture in the embodiment of the present application; of course, naming the picture that serves as the background the "background picture" is merely an exemplary description and is not limiting. The background picture obtained by the desktop application may be a picture that has already been blurred, or may be a clear picture, which is not limited herein.
In the embodiment of the application, after the drawing node of the system library receives the blur parameters, the image rendering engine of the system library can generate a blur filter according to the blur radius in the blur parameters, and store the generated blur filter, together with the blur parameters, as parameters in the attribute information of the drawing node. The blur filter is used to blur the background picture at the storage address carried by the picture identifier in the attribute information of the drawing node.
Optionally, the image rendering engine may employ an image blur algorithm (for example, a Gaussian blur algorithm, a Kawase blur algorithm, or a radial blur algorithm) to generate the blur filter based on the horizontal blur radius and the vertical blur radius. For example, the image rendering engine uses a Gaussian blur algorithm to generate a Gaussian blur filter from the blur radius. The working principle of the Gaussian blur filter is as follows: for each pixel point on a picture, taking that pixel point as the center, the RGB components of the surrounding pixel points are weighted and averaged to replace the RGB components of the pixel point, and the pixel point is redrawn, thereby obtaining the Gaussian blur effect.
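The weighted-average principle described above can be illustrated in one dimension. This is a toy sketch, not the engine's filter: real engines apply the kernel in two dimensions to RGB components, and the choice of sigma here is an assumption.

```python
# Toy 1-D Gaussian blur: each pixel becomes a weighted average of itself and
# its neighbours within the blur radius, with Gaussian weights.
import math

def gaussian_kernel(radius, sigma=None):
    sigma = sigma or max(radius / 2.0, 1e-6)   # assumed relation, for the sketch
    weights = [math.exp(-(i * i) / (2 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]        # normalise so weights sum to 1

def blur_row(pixels, radius):
    kernel = gaussian_kernel(radius)
    out = []
    for x in range(len(pixels)):
        acc = 0.0
        for offset, w in zip(range(-radius, radius + 1), kernel):
            # clamp at the edges so border pixels reuse the nearest value
            acc += w * pixels[min(max(x + offset, 0), len(pixels) - 1)]
        out.append(acc)
    return out
```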
In the embodiment of the application, when the desktop application is started, the main thread of the desktop application starts along with it, and the main thread can call a rendering thread (Render Thread) to draw and render the background picture corresponding to the target interface control, so as to blur the background of the target interface control. The Render Thread is a rendering thread that shares the main thread's drawing tasks so as to relieve the main thread's burden. Having the main thread delegate the drawing and rendering of the background picture to the Render Thread avoids rendering stutter when the main thread performs time-consuming operations, thereby improving rendering fluency.
In the embodiment of the application, the main thread and the Render Thread each maintain a copy of the view information of the application window. Because each maintains its own copy, the two do not interfere with each other, parallelism is maximized, and the startup time of the desktop application is reduced. The view information of the application window maintained by the Render Thread comes from the main thread; therefore, when the view information of the application window maintained by the main thread changes, the changed view information needs to be synchronized into the Render Thread. The view information of the application window may include the Display List of each drawing node, the attribute information (Property), and the background picture referenced in the Display List (that is, the Bitmap referenced by the Display List).
For example, the main thread may call the syncFrameState function to synchronize the application window view information into the Render Thread. For example, the main thread may call the syncFrameState function to synchronize the attribute information of the drawing nodes of the framework layer into the attribute information of the drawing nodes of the system library.
The main thread synchronizes the Display List it maintains into the Display List maintained by the Render Thread, so that the main thread and the Render Thread can execute in parallel. This means that while the Render Thread renders the Display List of the current frame of the application window, the main thread can prepare the Display List of the next frame, so that the interface display of the application window is smoother.
In embodiments of the present application, each drawing node may include two attributes. For example, the drawing node may include a stagingProperty attribute and a property attribute. The parameters in the stagingProperty attribute are synchronized from the attribute information of the drawing node of the framework layer. For example, the main thread calls the syncFrameState function to synchronize the attribute information of each drawing node in the framework layer into the stagingProperty attribute of the corresponding drawing node in the system library. The attribute information of the stagingProperty attribute includes, but is not limited to, the blur radius of the drawing node, the position information of the drawing node, the size of the drawing node, and the shape of the drawing node.
The position information of the drawing node may refer to its position in the screen coordinate system. In the embodiment of the present application, the position information of the drawing node may be a first coordinate value of the drawing node relative to the screen. The screen coordinate system is a two-dimensional coordinate system built on the screen, and its origin may be at the upper-left corner of the entire screen. The position information of the drawing node in the screen coordinate system may be expressed as (StartX, StartY, EndX, EndY), where StartX and EndX are measured from the leftmost side of the screen (0 representing the left edge), and StartY and EndY are measured from the uppermost side of the screen (0 representing the top edge). Fig. 5 shows an example of the coordinate information of a drawing node in the screen coordinate system. The drawing node in fig. 5 has its left side at 10, its top at 10, its right side at 60, and its bottom at 60, so its coordinate information is (10, 10, 60, 60).
To ensure thread safety and prevent the same attribute of a drawing node from being operated on by two threads at the same time, before the Render Thread draws and renders each frame of the background picture, the Render Thread can synchronize the attribute information in the stagingProperty attribute of each drawing node into the attribute information of the property attribute. When the Render Thread subsequently blurs the background picture according to the blur parameters, it directly uses the blur parameters included in the attribute information of the property attribute.
In the process of synchronizing the attribute information in the stagingProperty attribute of each drawing node into the attribute information of the property attribute, the Render Thread also synchronizes the previously generated blur filter into the corresponding drawing node, so that when the Render Thread later blurs the background picture, it can call the blur filter to perform the blurring and achieve the blur effect.
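The double-buffered attribute scheme above can be sketched as follows. The attribute names mirror the stagingProperty/property pair described in the text, but the class layout is an illustrative assumption.

```python
# Hypothetical sketch of the two-copy attribute scheme: the main thread writes
# only the staging copy; the Render Thread copies staging into the live
# properties before drawing each frame, so the two threads never share a copy.
class DrawingNode:
    def __init__(self):
        self.staging_property = {}   # written by the main thread
        self.live_property = {}      # read by the Render Thread while drawing

    def set_staging(self, key, value):
        self.staging_property[key] = value

    def sync_before_frame(self):
        # Performed by the Render Thread before rendering each frame; the
        # generated blur filter would be carried over here as well.
        self.live_property = dict(self.staging_property)
```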
In addition, the Render Thread can calculate the coordinate value of the rendering node relative to the canvas according to the position information of the rendering node contained in the property attribute of the rendering node, so as to be used for acquiring the correct region content on the transferred background picture as the background region of the rendering node in the subsequent rendering process.
It should be explained that the position information in the property attribute of the drawing node is the coordinate value of the drawing node with respect to the position of the root node. The coordinate values of the drawing node relative to the canvas refer to the coordinate values of the drawing node relative to the canvas when drawing the interface control in the target interface. Since the canvas may not be full-screen, i.e., the canvas and the display screen may not be exactly the same size, it is necessary to consider converting screen coordinate values of the drawing node into coordinate values of the canvas when drawing the drawing node.
As a possible implementation, if the Render Thread determines that the parent node of the current drawing node is not null, it adds the current offsetTop to the accumulated offsetTop, then continues to traverse upward to obtain the parent node's offsetTop and adds it as well, finally obtaining recursively the ordinate of the drawing node relative to the whole canvas, thereby converting the ordinate of the drawing node's screen coordinates into an ordinate relative to the canvas. Here, offsetTop refers to the height of the drawing node from the top of the parent element specified by the offsetParent attribute.
As another possible implementation, the Render Thread may call the canvas getBoundingClientRect() function to convert the coordinates of the drawing node from the first coordinate value in the screen coordinate system into a coordinate value on the canvas.
Here, the abscissa in the coordinate values of the drawing node relative to the canvas also adopts this method, and the description thereof will not be repeated here.
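The recursive offset accumulation described above can be sketched as a walk up the parent chain (written iteratively here for clarity). The node layout is hypothetical.

```python
# Hypothetical sketch of accumulating offsetTop up the parent chain to obtain
# a node's ordinate relative to the whole canvas.
class LayoutNode:
    def __init__(self, offset_top, parent=None):
        self.offset_top = offset_top   # height from the parent's top edge
        self.parent = parent

def canvas_top(node):
    total = 0
    while node is not None:            # iterative form of the recursion
        total += node.offset_top
        node = node.parent
    return total
```

The abscissa is obtained the same way with offsetLeft-style horizontal offsets.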
In the embodiment of the application, after the Display List of the desktop application window is constructed, the main thread sends a rendering instruction to the Render Thread. After receiving the rendering instruction from the main thread, when rendering each interface control of the target interface to be displayed onto the canvas of the display screen and reaching the drawing node corresponding to an interface control, the Render Thread first judges whether to perform background blurring on that drawing node. That is, the Render Thread first determines whether the drawing node is a target drawing node, so as to decide whether to blur its background.
If the Render Thread determines that the drawing node is not a target drawing node, that is, that the drawing node does not need background blurring, then after rendering the background picture on the canvas of the display screen, the Render Thread renders the content of the drawing node (for example, an application's icon, an application's name, or other text content) directly on the layer above the background picture. If the Render Thread determines that the drawing node is a target drawing node whose background is to be blurred, that is, that the drawing node is to be background-blurred, then after rendering the background picture on the canvas of the display screen, the Render Thread renders, at the position corresponding to the drawing node, the content of the first background area used to blur the drawing node's background, and then renders the content corresponding to the drawing node onto the canvas of the display screen.
The above procedure will be described in detail below by taking the rendering node rendered by Render Thread as the first rendering node.
Optionally, the Render Thread may determine, according to the attribute information of the first drawing node, whether the first drawing node is a target drawing node whose background is to be blurred. After obtaining the attribute information of each first drawing node in the target interface to be displayed of the desktop application, the Render Thread may determine whether the attribute information of each first drawing node includes a blur filter and blur parameters, so as to determine whether each first drawing node is a target drawing node.
In one possible case, if the Render Thread determines that the attribute information of the first drawing node does not include the blur filter and/or the blur parameters, the Render Thread determines that the first drawing node does not need background blurring. In this case, the Render Thread renders the first drawing node normally on the canvas of the display screen, and then continues to determine whether the next drawing node is a target drawing node for background blurring.
It can be understood that when the desktop application does not set the blur parameters of the first drawing node, the attribute information of the first drawing node that the desktop application transmits to the system library through the framework layer does not include blur parameters, and the system library does not generate a blur filter from them. In this case, the Render Thread renders the first drawing node normally on the canvas of the display screen.
In another possible case, if the Render Thread determines that the attribute information of the first drawing node includes the blur filter and/or the blur parameters, the Render Thread determines that the first drawing node is a target drawing node. That is, the Render Thread determines that the target drawing node's background is to be blurred before the target drawing node is rendered on the canvas of the display screen.
It can be understood that when the desktop application sets the blur parameters of the target drawing node, the attribute information of the target drawing node that the desktop application transmits to the system library through the framework layer includes the blur parameters, and the system library generates the blur filter according to those blur parameters; the Render Thread therefore determines that the attribute information of the target drawing node includes the blur filter and/or the blur parameters.
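The target-node check above can be sketched as a predicate on the node's attribute information. Reading the two cases together, a node qualifies when its attributes carry both a blur filter and blur parameters; the attribute layout below is an illustrative assumption.

```python
# Hypothetical sketch of the target-node determination: a first drawing node
# is a target node only when its attribute information carries both the blur
# filter (generated by the system library) and the blur parameters (set by
# the application).
def is_target_node(attributes):
    return (attributes.get("blur_filter") is not None
            and attributes.get("blur_params") is not None)
```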
In the embodiment of the application, before the Render Thread uses the background picture to blur the background of the target drawing node, it first determines whether the background picture needs to be blurred using the blur parameters included in the attribute information of the target drawing node.
Optionally, the Render Thread may determine whether to blur the background picture according to whether the background picture received by the system library is a picture transferred by the desktop application for the first time.
In the embodiment of the application, after the desktop application sends the picture identifier of the background picture to the framework layer, the framework layer sends the picture identifier to the system library through the JNI interface. If the Render Thread determines that the system library has received the picture identifier of the background picture through the JNI interface for the first time, the Render Thread sets a flag bit to a first preset value. After the Render Thread blurs the background picture, it sets the flag bit to a second preset value.
For example, the Render Thread determines that the system library receives the picture identifier of the background picture through the JNI interface for the first time, and the Render Thread may set the flag bit to 0. After the Render Thread performs blurring processing on the background picture, the flag bit is set to 1. The value of the flag bit is used for representing whether the picture identifier of the background picture is received by the system library for the first time.
It should be noted that, setting the flag bit to 0 or 1 by the above-mentioned Render Thread is merely described as an example, and this is not limited in the embodiment of the present application. For example, when the Render Thread determines that the system library receives the picture identifier of the background picture through the JNI interface for the first time, the Render Thread may also set the flag bit to TRUE; when the Render Thread blurs the background picture, the Render Thread may set the flag bit to FALSE.
In one possible case, if the Render Thread determines that the value of the flag bit is the first preset value, the Render Thread determines that the JNI interface has received the picture identifier of the background picture for the first time. That is, the Render Thread determines that the background picture has not yet been used to blur the background of a target drawing node. In this case, the Render Thread determines to blur the background picture using the blur parameters included in the attribute information of the target drawing node.
In another possible case, if the Render Thread determines that the flag bit is the second preset value, the Render Thread determines that the picture identifier of the background picture was not transmitted by the desktop application for the first time; that is, the background picture has already been used to blur the background of a target drawing node. In other words, the background picture is a picture that has already been blurred.
In the embodiment of the present application, when the Render Thread determines that the background picture has already been blurred, the Render Thread may further determine whether the background picture was blurred based on the blur parameters of the target drawing node, so as to decide, according to the result, whether to blur the background picture again using the target drawing node's blur parameters.
As a possible implementation, the Render Thread may obtain the blur parameters from the attribute information of the target drawing node and determine whether the blur parameters used to blur the background picture are the same as the blur parameters of the target drawing node, so as to determine whether to blur the background picture again.
In the embodiment of the application, after the Render Thread determines that the background picture has already been blurred, the Render Thread obtains the blur parameters included in the attribute information of the target drawing node. If the Render Thread determines that the blur parameters of the target drawing node are the same as the blur parameters used to blur the background picture, the Render Thread determines that the background picture does not need to be blurred again using the target drawing node's blur parameters. If the Render Thread determines that they are different, that is, that the blur parameters of the target drawing node have changed, the Render Thread determines to blur the background picture using the blur parameters of the target drawing node.
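The flag-bit check and the parameter comparison above combine into a single decision; the sketch below uses illustrative names and 0/1 preset values (the text notes TRUE/FALSE would work equally well).

```python
# Hypothetical sketch: blur the background picture again only if it has never
# been blurred (flag still at its first-receipt value) or if the target
# node's blur parameters differ from those used for the stored picture.
FIRST_RECEIPT, ALREADY_BLURRED = 0, 1   # first and second preset values

def needs_blur(flag, picture_blur_params, node_blur_params):
    if flag == FIRST_RECEIPT:
        return True                      # picture never used for blurring yet
    return picture_blur_params != node_blur_params
```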
In the embodiment of the application, when the Render Thread blurs the background picture using the blur parameters of the target drawing node, the Render Thread sends a rendering instruction to the graphics processor, and the graphics processor blurs the background picture through the blur filter in the attribute information of the target drawing node, obtains the background-blurred picture, and stores it.
Here, the graphics processor may send the background blurred image to the storage location of the background image for storage, or may replace the background image stored in the storage location with the background blurred image, so as to save storage memory, which is not limited herein.
In a possible case of the embodiment of the present application, when the Render Thread determines that the background picture does not need to be blurred, the Render Thread may crop from the background picture a first background area corresponding to the position of the target drawing node. That is, the Render Thread may obtain, from the calculation above, the coordinate value of the target drawing node relative to the canvas and crop from the background picture the first background area corresponding to the position of that canvas coordinate value. Then, the Render Thread draws the first background area as the background at the position of the corresponding canvas coordinate value, so that the first background area serves as the background of the target drawing node, achieving the effect of blurring the target drawing node's background.
Illustratively, as shown in FIG. 6, assume that (a) in FIG. 6 shows a canvas 601 of a display screen and (b) in FIG. 6 is a background picture 603. The Render Thread determines that the position of a drawing node on the canvas is the region 602 in (a) in FIG. 6, and crops from the background picture 603 in (b) in FIG. 6 a background region 604 at the same position as the region 602. The Render Thread then uses the background region 604 as the blurred background of the drawing node; that is, the Render Thread crops the background region 604 from the background picture 603 and draws the background region 604 as the drawing node's background into the canvas 601 of the display screen. After the Render Thread performs background blurring on the drawing node, the display effect in the canvas of the display screen is as shown in (c) in FIG. 6.
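The crop-and-redraw step in the FIG. 6 example can be sketched as two small helpers working on a row-major 2-D picture. This is a hypothetical illustration of the geometry only; the rectangle convention `(left, top, width, height)` and the function names are assumptions, not the patent's API.

```python
def crop_region(picture, rect):
    """Crop rect = (left, top, width, height) from a row-major 2-D picture."""
    left, top, w, h = rect
    return [row[left:left + w] for row in picture[top:top + h]]

def draw_region(canvas, region, rect):
    """Draw a cropped region back onto the canvas at the same coordinates,
    so the region serves as the node's background at that position."""
    left, top, w, h = rect
    for dy in range(h):
        canvas[top + dy][left:left + w] = region[dy]
```

Because the crop and the draw use the same rectangle, the pasted area lines up exactly with the node's position on the canvas, as the description requires.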
In another possible case of the embodiment of the present application, when the Render Thread determines that the background picture needs to be blurred with the blur parameters of the target drawing node, the Render Thread blurs the background picture using those blur parameters to obtain a background blurred picture. The Render Thread may then calculate the coordinate values of the target drawing node relative to the canvas and crop, from the background blurred picture, a first background area corresponding to those coordinate values. Then, the Render Thread draws the first background area as a background at the corresponding coordinate position on the canvas, so that the first background area serves as the background of the target drawing node, achieving the effect of background blurring for the target drawing node.
In this embodiment of the present application, after drawing the first background area as the background of the target drawing node at the corresponding coordinate position on the canvas, the Render Thread draws the content corresponding to the target drawing node (such as an application icon or text) on the layer above the first background area. The Render Thread then continues to draw the other drawing nodes of the desktop application on the canvas until all drawing nodes are drawn.
Illustratively, as shown in FIG. 7, after the Render Thread draws the first background area as the background of the drawing node into the area 701 corresponding to the drawing node on the canvas, the Render Thread draws the content corresponding to the drawing node on the layer above the first background area; for example, the Render Thread draws the icons of application A, application B, application C, and application D in FIG. 7 on the layer above the first background area.
When actually drawing a target interface of the desktop application, the Render Thread first draws the whole background picture of the target interface on the canvas; the Render Thread then draws the first background area, as the background of the target drawing node, at the position corresponding to the target drawing node on the canvas, and then draws the content of the target drawing node on the layer above the first background area. Drawing the whole background picture of the target interface on the canvas is not shown in FIG. 7.
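The painter's order just described — whole background first, then the blurred crop at each node's rectangle, then the node's content on top — can be sketched end to end. This is a minimal, hypothetical model of the layering only; the node dictionary shape, the character "pixels", and the function name are assumptions for illustration and stand in for real render-node state.

```python
def draw_target_interface(background, blurred_background, nodes):
    """Painter's order per the description above: whole background first,
    then, per node, the blurred crop at its position, then its content."""
    # 1. Draw the whole background picture onto a fresh canvas.
    canvas = [row[:] for row in background]
    for node in nodes:
        # 2. Replace the node's rectangle with the matching blurred crop,
        #    so the node appears to have a blurred background.
        left, top, w, h = node["rect"]
        for dy in range(h):
            canvas[top + dy][left:left + w] = \
                blurred_background[top + dy][left:left + w]
        # 3. Draw the node's content (e.g. icon glyphs) on the layer above.
        for x, y, glyph in node["content"]:
            canvas[y][x] = glyph
    return canvas
```

Each later step only ever overwrites pixels inside its own rectangle, so drawing order alone produces the layered result that SurfaceFlinger later composites.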
In the embodiment of the application, after the Render Thread determines that all drawing nodes in the target interface to be displayed of the desktop application have been drawn in the canvas of the display screen, the Render Thread sends a Vsync signal to the Surface Flinger function. After the Surface Flinger function receives the Vsync signal, it composites the layers corresponding to the background picture of the target interface, each drawing node, and the first background area, and sends the composited image to the display screen for display, thereby realizing the display of the target interface of the desktop application.
For example, the Surface Flinger function may call the message handling function handleMessageRefresh to composite the image and display the composited image on the display screen.
In a possible case of the embodiment of the present application, the areas of two controls in the target interface to be displayed of the desktop application may overlap; for example, one control corresponds to the target drawing node and the other corresponds to a second drawing node. Suppose the Render Thread determines that the control corresponding to the target drawing node overlaps, in the target interface, with the control corresponding to the second drawing node, and that the control corresponding to the target drawing node is larger than the control corresponding to the second drawing node. After the Render Thread draws the background area cropped from the background picture as the background of the target drawing node, and when it then draws the second drawing node on the canvas, if the Render Thread determines that the background of the second drawing node is to be blurred, the Render Thread can crop from the background picture, according to the coordinate values of the second drawing node relative to the canvas, a background area at the same position as the second drawing node. Then, the Render Thread draws the background area corresponding to the second drawing node on the layer above the background area corresponding to the target drawing node, and then draws the content corresponding to the second drawing node. Therefore, when different drawing nodes share the same background, the background picture does not need to be blurred repeatedly; the background area corresponding to the position of each drawing node is cropped directly from the cached background picture, and the power consumption of the device is reduced by reducing the number of times the background picture is blurred.
Illustratively, as shown in (a) of FIG. 8, the backgrounds corresponding to the interface control 801 and the interface control 802 are the same. After the Render Thread draws the background area as the background of the drawing node corresponding to the interface control 801 at the corresponding position on the canvas of the display screen, the background blurred area 803 is shown in (b) of FIG. 8. When the Render Thread draws the drawing node corresponding to the interface control 802 on the canvas, if the Render Thread determines that the background of that drawing node is to be blurred, and that the drawing node corresponding to the interface control 801 has the same background as the drawing node corresponding to the interface control 802, the Render Thread may crop from the background picture, according to the coordinate values of the drawing node corresponding to the interface control 802 relative to the canvas, a background area at the same position as that drawing node. Then, the Render Thread draws the background area corresponding to the drawing node of the interface control 802 on the layer above the background area corresponding to the drawing node of the interface control 801.
In the embodiment of the application, after the desktop application is started and while the mobile phone displays a display interface of the desktop application, if the mobile phone detects a sliding operation of the user on the display interface, the mobile phone may, in response to the sliding operation, display a sliding-process interface or a post-sliding interface.
In one possible scenario, after the desktop application is started and while the mobile phone displays a first interface, if the mobile phone detects a sliding operation of the user on the first interface, the mobile phone displays a sliding-process interface in response to the sliding operation on the first interface.
The first interface may display first interface content, and the sliding-process interface also includes the first interface content. The first interface content includes an interface control corresponding to the target drawing node; the interface controls of the sliding-process interface are in a compressed state, and the background area of the interface control corresponding to the target drawing node in the sliding-process interface has been blurred.
It can be understood that, in response to the sliding operation of the user on the first interface, the mobile phone displays the changing-process interface between the first display page and the target display page after the background of the target drawing node has been blurred.
FIG. 9 is an exemplary diagram of an interface display according to an embodiment of the present application. The mobile phone in (a) in FIG. 9 displays a first interface 901, where the first interface 901 includes an interface control 902 that has not been blurred. After detecting the leftward sliding operation of the user on the first interface 901, the mobile phone, in response to the sliding operation, displays an interface 903 in which the interface control 902 in the first interface 901 has been blurred, as shown in (b) in FIG. 9. The interface 903 includes a blurred interface control 904. In response to the sliding operation, the mobile phone continues to display the sliding-process interfaces: the interface 905 shown in (c) in FIG. 9 and the interface 907 shown in (d) in FIG. 9. The interface controls in the interface 905 and the interface 907 are in a compressed state. The interface 905 includes a compressed interface control 906, which is the interface control 904 after compression. The interface 907 likewise includes a compressed interface control 908.
It should be noted that, in the above example, the interface control 902 displayed in the first interface 901 shown in (a) in FIG. 9 is taken as an example of an interface control that has not undergone background blurring; the interface control displayed in the first interface may also be one that has undergone background blurring. The sliding process shown in FIG. 10 is described below as an example.
FIG. 10 is an exemplary diagram five of an interface display according to an embodiment of the present application. The mobile phone in (a) in FIG. 10 displays a first interface 1001, where the first interface 1001 includes an interface control 1002 that has been blurred. After detecting the leftward sliding operation of the user on the first interface 1001, the mobile phone displays sliding-process interfaces in response to the sliding operation: the interface 1003 shown in (b) in FIG. 10 and the interface 1005 shown in (c) in FIG. 10. The interface controls in the interface 1003 are in a compressed state, and the interface 1003 includes a compressed interface control 1004, which is the interface control 1002 after compression. The interface 1005 is also a sliding-process interface; its interface controls are likewise in a compressed state, and the interface 1005 includes a compressed interface control 1006.
In another possible scenario, after the desktop application is started and while the mobile phone displays the first interface, if the mobile phone detects a sliding operation of the user on the first interface, the mobile phone displays the post-sliding interface in response to the sliding operation on the first interface.
The post-sliding interface also includes the first interface content; the interface controls of the post-sliding interface are in a compressed state, and the background area of the interface control corresponding to the target drawing node in the post-sliding interface has been blurred.
It can be understood that, in response to the sliding operation of the user on the first interface, the mobile phone directly displays the post-sliding interface without displaying the changing process of the interface during sliding.
FIG. 11 is an exemplary diagram six of an interface display according to an embodiment of the present application. As shown in FIG. 11, the mobile phone in (a) in FIG. 11 displays a first interface 1101, where the first interface 1101 includes an interface control 1102 that has been blurred. After detecting the leftward sliding operation of the user on the first interface 1101, the mobile phone displays the post-sliding interface in response to the sliding operation, that is, the interface 1103 shown in (b) in FIG. 11, and displays in the interface 1103 an interface control 1104, which is the interface control 1102 after compression.
It should be noted that, in the above example, the interface control 1102 displayed in the first interface 1101 shown in (a) in FIG. 11 is taken as an example of an interface control that has undergone background blurring; the interface control displayed in the first interface may also be one that has not undergone background blurring. The sliding process shown in FIG. 12 is described below as an example.
FIG. 12 is an exemplary diagram seven of an interface display according to an embodiment of the present application. As shown in FIG. 12, the mobile phone in (a) in FIG. 12 displays a first interface 1201, where the first interface 1201 includes an interface control 1202 that has not been blurred. After detecting the leftward sliding operation of the user on the first interface 1201, the mobile phone, in response to the sliding operation, directly displays the post-sliding interface, that is, the interface 1203 shown in (b) in FIG. 12, and displays in the interface 1203 an interface control 1204 obtained by background-blurring and compressing the interface control 1202.
In the embodiment of the application, after the desktop application is started and while the mobile phone displays the first interface, the mobile phone detects a sliding operation of the user on the first interface. In response to the sliding operation, the mobile phone may determine the transverse compression ratio and the longitudinal compression ratio of the control corresponding to the target drawing node relative to a canvas of a preset size, according to the first coordinate values of the target drawing node in the canvas corresponding to the first interface and its second coordinate values relative to the canvas of the preset size. Then, the mobile phone compresses the first display page according to the transverse compression ratio and the longitudinal compression ratio to obtain a target display page, and displays the target display page. The first display page includes the interface control corresponding to the target drawing node, and the area corresponding to the interface control has been blurred. The interface controls in the target display page are in a compressed state, and the background area of the interface control corresponding to the target drawing node has been blurred.
In one possible case, after responding to the sliding operation, if the mobile phone determines that the background picture indicated by the picture identifier in the attribute information of the target drawing node is a picture that has been blurred based on the blur parameters, the mobile phone crops from the background picture a second background area corresponding to the position of the compressed target drawing node. Then, the mobile phone stretches the second background area according to the transverse compression ratio and the longitudinal compression ratio to obtain a target background area. After the mobile phone draws the target background area at the position of the target drawing node, it draws the content of the target drawing node on the layer above the target background area to obtain the blurred first display page. The mobile phone then compresses the blurred first display page according to the transverse compression ratio and the longitudinal compression ratio to obtain the target display page.
Optionally, the Render Thread may determine the transverse compression ratio based on the ratio of the abscissa of the second coordinate values of the target drawing node, relative to the canvas of the preset size, to the abscissa of the first coordinate values of the target drawing node; and the Render Thread may determine the longitudinal compression ratio based on the ratio of the ordinate of the second coordinate values of the target drawing node, relative to the canvas of the preset size, to the ordinate of the first coordinate values of the target drawing node.
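The two ratio definitions above reduce to one line each. The following sketch is an illustration under the assumption that coordinates are simple (x, y) pairs; the function name is hypothetical.

```python
def compression_ratios(first_coords, second_coords):
    """Transverse ratio = x2 / x1 and longitudinal ratio = y2 / y1, where
    (x1, y1) are the node's first coordinate values in the original canvas
    and (x2, y2) its second coordinate values in the preset-size canvas."""
    (x1, y1), (x2, y2) = first_coords, second_coords
    return x2 / x1, y2 / y1
```

For example, a node at (200, 400) that maps to (100, 300) in the preset-size canvas yields a transverse ratio of 0.5 and a longitudinal ratio of 0.75.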
In the embodiment of the present application, if the Render Thread determines that the background picture stored in the attribute information of the target drawing node is a picture that has been blurred based on the blur parameters, the Render Thread crops from the background picture a second background area corresponding to the position of the compressed target drawing node; that is, the Render Thread determines the second background area corresponding to the position of the target drawing node in the canvas of the preset size. Then, the Render Thread stretches the second background area according to the transverse compression ratio and the longitudinal compression ratio to obtain a target background area. After the Render Thread draws the target background area at the position of the target drawing node, it draws the content of the target drawing node on the layer above the target background area to obtain the first display page. The first display page includes the interface control corresponding to the target drawing node, and the background area of the interface control has been blurred.
The Render Thread compresses the first display page according to the transverse compression ratio and the longitudinal compression ratio to obtain the target display page, which is then displayed on the display screen of the mobile phone. The target display page includes the interface control corresponding to the compressed target drawing node, and the background area of the interface control has been blurred. In this way, after the Render Thread crops from the background picture the second background area corresponding to the compressed target drawing node, it stretches the second background area according to the transverse and longitudinal compression ratios and draws the stretched target background area on the canvas of the preset size, thereby blurring the background of the target drawing node during interface sliding and adapting to the compression animation of the canvas in real time.
FIG. 13 is an exemplary diagram eight of an interface display provided in the embodiment of the present application. The display effect of the target drawing node in the post-sliding interface is shown in (a) in FIG. 13. The Render Thread obtains the transverse compression ratio and the longitudinal compression ratio from, respectively, the ratio of the abscissa of the second coordinate values of the target drawing node to the abscissa of its first coordinate values, and the ratio of the ordinate of its second coordinate values to the ordinate of its first coordinate values. The Render Thread may calculate the coordinate values of the compressed target drawing node in the compressed canvas 1 of the display screen, and crop a first background region 1301 corresponding to the position of the target drawing node in the canvas 1 from the background picture, or from the background blurred picture obtained by blurring the background picture. Then, the Render Thread stretches this background region according to the transverse and longitudinal compression ratios and draws the stretched target background region 1302 onto the canvas 2, as shown in (b) in FIG. 13.
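The stretch step applied to the cropped second background area can be sketched as a nearest-neighbour resample by the two ratios. This is a hypothetical illustration of the geometry, not the patent's implementation; real renderers would typically use filtered GPU sampling, and the function name and factor convention are assumptions.

```python
def stretch_region(region, sx, sy):
    """Nearest-neighbour stretch of a row-major 2-D region by factors
    (sx, sy): each output pixel samples the nearest source pixel."""
    h, w = len(region), len(region[0])
    new_h, new_w = round(h * sy), round(w * sx)
    return [
        [region[min(h - 1, int(y / sy))][min(w - 1, int(x / sx))]
         for x in range(new_w)]
        for y in range(new_h)
    ]
```

Cropping at the compressed position and then stretching back by the ratios lets the same cached blurred picture track the canvas-compression animation frame by frame.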
In one possible scenario of the embodiment of the present application, after the desktop application is started and while the mobile phone displays a second interface, the mobile phone displays a third interface in response to a compression (zoom-out) operation of the user on the second interface. The third interface includes second interface content, the second interface content includes the interface control corresponding to the target drawing node, the interface controls of the third interface are in a compressed state, and the background area of the interface control corresponding to the target drawing node has been blurred.
It can be understood that, while the display screen of the mobile phone displays the second interface, after the mobile phone detects the compression operation of the user on the second interface, the mobile phone displays the compressed third interface.
Illustratively, an interface 1401 is shown in (a) in FIG. 14, and an interface control 1402 corresponding to the target drawing node is displayed in the interface 1401. The mobile phone detects a compression operation (e.g., a slide-up operation) of the user on the interface 1401, and in response the mobile phone displays the interface 1403 in (b) in FIG. 14, in which a compressed interface control 1404 is displayed. When the mobile phone detects a trigger operation (e.g., a click operation) of the user at any position in the interface 1403, the mobile phone resumes displaying the interface 1401 in (a) in FIG. 14.
In summary, in the embodiment of the present application, for an interface control whose background is not switched frequently, the Render Thread may directly crop a background area, from the background picture stored in the attribute information of the drawing node corresponding to the control, to serve as the background of the drawing node, without obtaining a background picture each time a drawing node is drawn and blurring the obtained picture. This not only reduces the number of times the background picture is obtained, but also reduces the number of times the background picture is blurred, so the power consumption of the device can be effectively reduced.
Alternatively, if the Render Thread determines that the background picture stored in the attribute information of the drawing node has not been blurred with the blur parameters of the drawing node, the Render Thread may blur that background picture according to the blur parameters in the attribute information of the drawing node, and crop a background area from the resulting background blurred picture to serve as the background of the drawing node. In this way, the Render Thread does not need to obtain the background picture from the application layer again, which saves the time for obtaining the background picture and reduces the power consumption of the device.
As shown in FIG. 15, an embodiment of the present application discloses an electronic device, which may be the mobile phone described above. The electronic device may specifically include: a touch screen 1501, the touch screen 1501 including a touch sensor 1506 and a display screen 1507; one or more processors 1502; a memory 1503; one or more applications (not shown); and one or more computer programs 1504; these components may be connected via one or more communication buses 1505. The one or more computer programs 1504 are stored in the memory 1503 and configured to be executed by the one or more processors 1502, and the one or more computer programs 1504 include instructions that can be used to perform the relevant steps of the above embodiments.
It will be appreciated that the electronic device and the like may include hardware structures and/or software modules that perform the functions described above. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or as combinations of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the present application may divide the functional modules of the electronic device and the like according to the above method example; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic and is merely a logical function division; other division manners may be used in actual implementation.
In the case of dividing the functional modules by corresponding functions, a possible composition diagram of the electronic device involved in the above embodiments may include: a display unit, a transmission unit, a processing unit, and the like. It should be noted that, for all relevant content of each step of the above method embodiments, reference may be made to the functional description of the corresponding functional module, which is not repeated herein.
Embodiments of the present application also provide an electronic device including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the related method steps described above to implement the interface display method of the above embodiments.
Embodiments of the present application also provide a computer-readable storage medium having stored therein computer instructions that, when executed on an electronic device, cause the electronic device to perform the above-described related method steps to implement the interface display method in the above-described embodiments.
Embodiments of the present application also provide a computer program product comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the above-described related method steps to implement the interface display method of the above-described embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is configured to store computer-executable instructions, and when the apparatus is running, the processor may execute the computer-executable instructions stored in the memory, so that the apparatus executes the interface display method executed by the electronic device in the above method embodiments.
The electronic device, the computer-readable storage medium, the computer program product, or the apparatus provided in this embodiment is configured to execute the corresponding method provided above; therefore, for the advantages achieved thereby, reference may be made to the advantages of the corresponding method provided above, which are not repeated herein.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above. The specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not described herein.
The functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application — in essence, the part contributing to the prior art, or all or part of the technical solution — may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, and the like.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. An interface display method, characterized in that the method comprises:
the electronic equipment determines, according to the blur parameters of the drawing nodes in a target interface to be displayed, that a drawing node is a target drawing node, wherein the target drawing node refers to a drawing node whose background is to be subjected to blurring processing;
for each target drawing node, if the electronic equipment determines that a stored background picture is a picture that has been subjected to blurring processing based on the blur parameters, the electronic equipment crops, from the background picture, a first background area corresponding to the position of the target drawing node;
after the electronic equipment draws the first background area at the position of the target drawing node, the electronic equipment draws the content of the target drawing node on the layer above the first background area, and generates the target interface;
The electronic equipment displays the target interface, wherein the target interface comprises an interface control corresponding to the target drawing node, and the background area of the interface control is the first background area after the blurring process.
2. The method according to claim 1, wherein the method further comprises:
the electronic device obtaining the background picture based on a target application corresponding to the target interface; and
the electronic device storing a picture identifier of the background picture as a parameter in the attribute information of each drawing node;
wherein the electronic device determining that the stored background picture is a picture that has been blurred based on the blur parameter comprises:
the electronic device determining that the picture identifier stored in the attribute information of the target drawing node indicates the background picture, and that the background picture is a picture that has been blurred based on the blur parameter.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
if the electronic device determines that the stored background picture has not been blurred based on the blur parameter, the electronic device blurring the background picture with the blur parameter to obtain a blurred background picture; and
the electronic device cropping, from the blurred background picture, the first background area corresponding to the position of the target drawing node.
4. The method according to claim 3, wherein the electronic device determining that the stored background picture has not been blurred based on the blur parameter comprises:
if the electronic device determines that the background picture is obtained from the target application for the first time, the electronic device determining that the background picture has not been blurred based on the blur parameter; or,
if the electronic device determines that the blur parameter with which the background picture was blurred differs from the blur parameter of the target drawing node, the electronic device determining that the background picture has not been blurred based on the blur parameter.
5. The method according to claim 3 or 4, wherein the blur parameter comprises a blur radius, and the electronic device blurring the background picture with the blur parameter to obtain a blurred background picture comprises:
the electronic device generating a blur filter according to the blur radius, wherein the blur radius represents the degree to which the background picture is blurred; and
the electronic device blurring the background picture through the blur filter to obtain the blurred background picture.
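Claim 5 only says that a blur filter is generated from the blur radius; it does not name the filter kernel. As a minimal stand-in, the sketch below uses a plain box blur on a 2D grayscale image, where a larger radius averages over a larger window and so blurs more strongly. The function name and the choice of box blur (rather than, say, a Gaussian) are assumptions for illustration.

```python
def box_blur(pixels, radius):
    """Box blur on a 2D grayscale image (list of rows of numbers).
    Each output pixel is the mean of the (2*radius+1)^2 window around it,
    with the window clipped at the image borders."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += pixels[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out
```

A uniform image is unchanged by the filter, while a single bright pixel is spread over its neighborhood, which is the behavior the blur radius controls.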
6. The method according to any one of claims 1-5, further comprising:
the electronic device displaying a first interface, wherein the first interface comprises first interface content; and
in response to a sliding operation on the first interface, the electronic device displaying a sliding process interface or a post-sliding interface, wherein the sliding process interface or the post-sliding interface comprises the first interface content, the first interface content comprises an interface control corresponding to the target drawing node, the interface control in the sliding process interface or the post-sliding interface is in a compressed state, and the background area of the interface control corresponding to the target drawing node is blurred.
7. The method of claim 6, wherein, in response to the sliding operation on the first interface, the electronic device displaying a sliding process interface or a post-sliding interface comprises:
in response to the sliding operation, the electronic device determining, according to a first coordinate value of the target drawing node in a canvas corresponding to the first interface and a second coordinate value relative to a canvas of a preset size, a lateral compression ratio and a longitudinal compression ratio of the control corresponding to the target drawing node relative to the canvas of the preset size;
the electronic device compressing a first display page according to the lateral compression ratio and the longitudinal compression ratio to obtain a target display page, wherein the first display page comprises the interface control corresponding to the target drawing node, and the area corresponding to the interface control is blurred; and
the electronic device displaying the target display page, wherein the interface control in the target display page is in a compressed state, and the background area of the interface control corresponding to the target drawing node is blurred.
8. The method of claim 7, wherein, after the responding to the sliding operation, the method further comprises:
if the electronic device determines that the stored background picture is a picture that has been blurred based on the blur parameter, the electronic device cropping, from the background picture, a second background area corresponding to the position of the compressed target drawing node;
the electronic device stretching the second background area according to the lateral compression ratio and the longitudinal compression ratio to obtain a target background area; and
after drawing the target background area at the position of the target drawing node, the electronic device drawing the content of the target drawing node on a layer above the target background area to obtain the blurred first display page;
wherein the electronic device compressing the first display page according to the lateral compression ratio and the longitudinal compression ratio comprises:
the electronic device compressing the blurred first display page according to the lateral compression ratio and the longitudinal compression ratio.
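Claims 7 and 8 pair a compression of the whole display page with an inverse stretch of the cropped background region, so that the blurred background still lines up with the compressed control. The patent does not spell out the arithmetic; the sketch below shows one plausible reading, and both function names and the ratio formula are assumptions.

```python
def compression_ratios(canvas_size, preset_size):
    """Assumed reading of claim 7: lateral (x) and longitudinal (y)
    compression ratios when a page of canvas_size is shrunk to preset_size."""
    cw, ch = canvas_size
    pw, ph = preset_size
    return pw / cw, ph / ch

def stretch_region(region_size, ratios):
    """Assumed reading of claim 8: dividing a cropped region's size by the
    compression ratios restores it to full-page scale before drawing."""
    rw, rh = region_size
    rx, ry = ratios
    return rw / rx, rh / ry
```

Under this reading, compressing the finished page by the same ratios used to stretch the region leaves the background exactly aligned with the control.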
9. The method of claim 7, wherein the electronic device displaying a sliding process interface before the electronic device displays the target display page further comprises:
the electronic device displaying a transition interface showing the change from the blurred first display page to the target display page.
10. The method of any of claims 7-9, wherein, before the electronic device determines, according to the first coordinate value of the target drawing node in the canvas corresponding to the first interface and the second coordinate value relative to the canvas of the preset size, the lateral compression ratio and the longitudinal compression ratio of the control corresponding to the target drawing node relative to the canvas of the preset size, the method further comprises:
the electronic device determining, in the first interface, a first coordinate value of each drawing node in the canvas corresponding to the first interface;
the electronic device traversing upward from the target drawing node to determine the first coordinate value of a parent node of the target drawing node; and
determining an intermediate coordinate value according to the first coordinate value of the parent node and the first coordinate value of the target drawing node, until the parent node reached by the upward traversal is a root node, and determining the second coordinate value of the target drawing node according to the first coordinate value of the root node and the intermediate coordinate value.
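Claim 10 describes walking up the drawing-node tree, accumulating an intermediate coordinate at each parent until the root is reached. A common way to realize this is to sum each node's offset within its parent, as in the sketch below; the class name, field names, and the additive-offset interpretation are assumptions, not from the patent.

```python
class DrawNode:
    def __init__(self, x, y, parent=None):
        self.x, self.y = x, y      # offset within the parent's canvas
        self.parent = parent

def absolute_position(node):
    """Walk parent links up to the root, summing each ancestor's offset
    into an accumulated (intermediate) coordinate value."""
    ax, ay = node.x, node.y
    p = node.parent
    while p is not None:
        ax += p.x
        ay += p.y
        p = p.parent
    return ax, ay
```

For a leaf nested two levels deep, the result is the leaf's position in the root canvas, which is what the second coordinate value relative to the preset-size canvas would be derived from.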
11. The method according to any one of claims 1-10, further comprising:
the electronic device displaying a second interface, wherein the second interface comprises second interface content; and
in response to a zoom-in/zoom-out operation on the second interface, the electronic device displaying a third interface, wherein the third interface comprises the second interface content, the second interface content comprises an interface control corresponding to the target drawing node, the interface control in the third interface is in a compressed state, and the background area of the interface control corresponding to the target drawing node is blurred.
12. The method according to any one of claims 1-11, wherein the electronic device determining that the drawing node is a target drawing node according to the blur parameter of the drawing node in the target interface to be displayed comprises:
the electronic device obtaining attribute information of each drawing node in the target interface to be displayed; and
if the electronic device determines that the attribute information of a drawing node comprises the blur parameter, the electronic device determining that the drawing node is the target drawing node.
13. The method of any of claims 1-12, wherein the electronic device comprises an application layer, a framework layer, and a system library, the application layer comprises a target application, and before the electronic device determines that the stored background picture is a picture that has been blurred based on the blur parameter, the method further comprises:
after the electronic device obtains the background picture based on the target application of the application layer, the target application sending the picture identifier of the background picture to the drawing node corresponding to each interface control in the framework layer;
the drawing node corresponding to each interface control in the framework layer sending the picture identifier to the corresponding drawing node in the system library; and
after receiving the picture identifier, the drawing node corresponding to each interface control in the system library storing the picture identifier as a parameter in the corresponding attribute information.
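Claim 13 routes the picture identifier through three layers: the target application broadcasts it to framework-layer drawing nodes, each of which forwards it to its system-library counterpart for storage in attribute information. The sketch below models that hand-off with plain objects; every class and function name here is hypothetical.

```python
class SystemLibNode:
    """Stand-in for a system-library drawing node."""
    def __init__(self):
        self.attributes = {}

    def receive(self, picture_id):
        # Store the identifier as a parameter in the attribute information.
        self.attributes["picture_id"] = picture_id

class FrameworkNode:
    """Stand-in for a framework-layer drawing node of one interface control."""
    def __init__(self, lib_node):
        self.lib_node = lib_node

    def receive(self, picture_id):
        # Forward the identifier down to the system-library counterpart.
        self.lib_node.receive(picture_id)

def app_broadcast(picture_id, framework_nodes):
    # The target application sends the background picture's identifier
    # to the drawing node corresponding to every interface control.
    for node in framework_nodes:
        node.receive(picture_id)
```

Once every system-library node holds the identifier, the check in claim 2 (does the stored identifier indicate the current blurred background?) becomes a local attribute lookup.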
14. An electronic device, comprising:
a display screen;
one or more processors;
a memory;
wherein the memory has stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform the interface display method of any of claims 1-13.
15. A computer readable storage medium having instructions stored therein, which when run on an electronic device, cause the electronic device to perform the interface display method of any one of claims 1-13.
CN202310628628.5A 2023-05-30 2023-05-30 Interface display method and electronic equipment Pending CN117724603A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310628628.5A CN117724603A (en) 2023-05-30 2023-05-30 Interface display method and electronic equipment


Publications (1)

Publication Number Publication Date
CN117724603A true CN117724603A (en) 2024-03-19

Family

ID=90198456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310628628.5A Pending CN117724603A (en) 2023-05-30 2023-05-30 Interface display method and electronic equipment

Country Status (1)

Country Link
CN (1) CN117724603A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180350031A1 (en) * 2017-05-31 2018-12-06 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image blurring method, electronic device and computer device
CN112074865A (en) * 2018-05-10 2020-12-11 谷歌有限责任公司 Generating and displaying blur in an image
CN113791857A (en) * 2021-09-03 2021-12-14 北京鲸鲮信息系统技术有限公司 Application window background fuzzy processing method and device in Linux system
CN115830173A (en) * 2021-09-17 2023-03-21 腾讯科技(深圳)有限公司 Interface element drawing method and device, equipment, storage medium and product


Similar Documents

Publication Publication Date Title
CN113362783B (en) Refresh rate switching method and electronic equipment
US11669242B2 (en) Screenshot method and electronic device
CN115866121B (en) Application interface interaction method, electronic device and computer readable storage medium
CN115473957B (en) Image processing method and electronic equipment
CN112686981B (en) Picture rendering method and device, electronic equipment and storage medium
WO2023065873A1 (en) Frame rate adjustment method, terminal device, and frame rate adjustment system
CN113935898A (en) Image processing method, system, electronic device and computer readable storage medium
CN114089932B (en) Multi-screen display method, device, terminal equipment and storage medium
CN113641271B (en) Application window management method, terminal device and computer readable storage medium
CN112541861B (en) Image processing method, device, equipment and computer storage medium
EP4280586A1 (en) Point light source image detection method and electronic device
WO2021077911A1 (en) Image flooding processing method and apparatus, and storage medium
CN114756184A (en) Collaborative display method, terminal device and computer-readable storage medium
WO2020233593A1 (en) Method for displaying foreground element, and electronic device
CN115119048B (en) Video stream processing method and electronic equipment
CN116051351B (en) Special effect processing method and electronic equipment
CN116048831B (en) Target signal processing method and electronic equipment
CN117009005A (en) Display method, automobile and electronic equipment
CN117724603A (en) Interface display method and electronic equipment
CN113495733A (en) Theme pack installation method and device, electronic equipment and computer readable storage medium
CN116700578B (en) Layer synthesis method, electronic device and storage medium
CN116703741B (en) Image contrast generation method and device and electronic equipment
CN116095512B (en) Photographing method of terminal equipment and related device
CN116233599B (en) Video mode recommendation method and electronic equipment
CN117692693A (en) Multi-screen display method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination