CN114879896B - Frozen screen processing method, electronic equipment and storage medium - Google Patents

Frozen screen processing method, electronic equipment and storage medium

Info

Publication number
CN114879896B
CN114879896B (application CN202210798657.1A)
Authority
CN
China
Prior art keywords
display
layer
display layer
application
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210798657.1A
Other languages
Chinese (zh)
Other versions
CN114879896A
Inventor
祁长乐
高杨
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210798657.1A
Publication of CN114879896A
Application granted
Publication of CN114879896B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/445 Program loading or initiating
    • G06F 9/44594 Unloading
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a frozen screen processing method, an electronic device, and a storage medium, relating to the field of terminal technologies, for solving the frozen screen problem that arises when an electronic device cannot respond to a user's touch event on the touch screen. The method includes: when the electronic device receives a touch event input by a user on a first interface, the electronic device determines a first display window corresponding to the touch event, where the display picture in the first display window corresponds to a first display layer; the electronic device acquires attribute information of each of all display layers on the first interface; the electronic device determines whether, among all the display layers, a second display layer occludes the first display layer, where a second display layer occluding the first display layer causes the electronic device to discard the touch event; and if, among all the display layers, a second display layer occludes the first display layer, the electronic device closes the process of a first application corresponding to the second display layer.

Description

Frozen screen processing method, electronic equipment and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a frozen screen processing method, an electronic device, and a storage medium.
Background
With the development of electronic devices, touch screens have come into wide use. An electronic device can receive a user's touch event on the touch screen, recognize the event, and execute the corresponding touch instruction. For example, the electronic device may receive a user's tap on an application's icon on the touch screen and display the corresponding interface of that application.
However, after being started, some applications may prevent the electronic device from responding to the user's touch events on the touch screen, causing the touch screen to freeze.
Disclosure of Invention
The embodiments of the present application provide a frozen screen processing method, an electronic device, and a storage medium, for solving the frozen screen problem that arises when an electronic device cannot respond to a user's touch event on the touch screen.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, a frozen screen processing method is provided, including: when an electronic device receives a touch event input by a user on a first interface, the electronic device determines a first display window corresponding to the touch event, where the display picture in the first display window corresponds to a first display layer; the electronic device acquires attribute information of each of all display layers on the first interface, where the attribute information includes a combination of one or more of the following: transparency, layer type, layer size, layer visibility, and layer name; the electronic device determines whether, among all the display layers, a second display layer occludes the first display layer, where a second display layer occluding the first display layer causes the electronic device to discard the touch event; and if, among all the display layers, a second display layer occludes the first display layer, the electronic device closes the process of a first application corresponding to the second display layer.
Based on the first aspect: first, when the electronic device receives a touch event input by a user on a first interface, it determines the first display window corresponding to the touch event, whose display picture corresponds to a first display layer. The electronic device then acquires attribute information of each of all display layers on the first interface and determines whether any second display layer among them occludes the first display layer. Because a second display layer occluding the first display layer causes the electronic device to discard the touch event, the electronic device cannot respond to the touch event input by the user. Therefore, if a second display layer occludes the first display layer, the electronic device can close the process of the first application corresponding to the second display layer, thereby resolving the frozen screen problem caused by the electronic device failing to respond to the user's touch event.
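The first-aspect flow above can be sketched in code form. This is an illustrative sketch only, not the patent's implementation; the `Layer` fields and function names are hypothetical:

```python
# Illustrative sketch of the first-aspect flow. All names are invented
# for illustration and do not reflect any real system API.
from dataclasses import dataclass

@dataclass
class Layer:
    name: str              # layer name (one piece of attribute information)
    app: str               # owning application
    z: int                 # priority/level: larger z is stacked higher
    alpha: float           # transparency: 0.0 fully transparent, 1.0 opaque
    untrusted: bool        # touch events cannot pass through an untrusted layer
    visible_to_user: bool  # whether the user can actually see the layer

ALPHA_THRESHOLD = 0.8  # assumed transparency threshold, e.g. 0.8

def occludes(upper: Layer, target: Layer) -> bool:
    """An upper layer occludes the target if it covers it (higher z),
    is invisible to the user, and is untrusted or sufficiently opaque."""
    return (upper.z > target.z
            and not upper.visible_to_user
            and (upper.untrusted or upper.alpha >= ALPHA_THRESHOLD))

def handle_touch_event(target: Layer, all_layers: list) -> str:
    """Return the action taken for a touch event on the window of `target`."""
    for layer in all_layers:
        if layer is not target and occludes(layer, target):
            # A second display layer occludes the first: close the
            # offending application's process instead of discarding events.
            return f"close process of {layer.app}"
    # No occluding layer: dispatch the event to the first display window.
    return "dispatch to first display window"
```

For example, a desktop layer at z=1 under an invisible untrusted mask layer at z=2 would yield "close process of" the mask's application.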
In one implementation of the first aspect, the method further includes: the electronic device revokes the floating-window permission of the first application.
In this implementation, because the electronic device revokes the floating-window permission of the first application, the first application can no longer use that permission to display the content of the second display layer, further resolving the frozen screen problem caused by the electronic device failing to respond to the user's touch event.
In one implementation of the first aspect, the method further includes: the electronic device uninstalls the first application.
In this implementation, because the electronic device uninstalls the first application, the second display layer can no longer occlude the first display layer, which fundamentally resolves the frozen screen problem caused by the electronic device failing to respond to the user's touch event.
In one implementation of the first aspect, before the electronic device closes the process of the first application corresponding to the second display layer, the method further includes: the electronic device displays prompt information, where the prompt information prompts the user that the first application has caused the frozen screen, the frozen screen being the state in which the electronic device does not respond to touch events input by the user, and the prompt information includes a first control. The electronic device closing the process of the first application corresponding to the second display layer includes: in response to the user's operation on the first control, the electronic device closes the process of the first application corresponding to the second display layer.
In this implementation, because the electronic device displays the prompt information, the user can be informed that the frozen screen was caused by the first application; the user can then choose, based on the prompt, to close the process of the first application, which resolves the frozen screen problem while improving the user experience.
In one implementation of the first aspect, the prompt information further includes a second control and a third control. The electronic device revoking the floating-window permission of the first application includes: in response to the user's operation on the second control, the electronic device revokes the floating-window permission of the first application. The electronic device uninstalling the first application includes: in response to the user's operation on the third control, the electronic device uninstalls the first application.
In this implementation, because the prompt information further includes the second control and the third control, the user can choose, based on the prompt, to revoke the floating-window permission of the first application or to uninstall the first application, which resolves the frozen screen problem while improving the user experience.
In one implementation of the first aspect, the second display layer occluding the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted layer, the untrusted layer being a display layer, preset by the electronic device, that touch events cannot pass through; or the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to a transparency threshold.
In one implementation of the first aspect, the electronic device determining whether a second display layer among all the display layers occludes the first display layer includes: according to the priorities of all the display layers, the electronic device traverses each display layer whose priority is higher than that of the first display layer and determines whether any of them is a second display layer occluding the first display layer.
In this implementation, when determining whether a second display layer occludes the first display layer, the electronic device traverses only the display layers whose priority is higher than that of the first display layer, so it does not need to traverse all display layers of the first interface, which helps reduce the device's power consumption.
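The priority-bounded traversal described above can be sketched as follows; the list-of-dicts layer representation and the `z` priority field are assumptions made for illustration:

```python
# Hypothetical sketch: visit only layers stacked above the first display layer.
def layers_above(all_layers, first_z):
    """Return layers whose priority (z) is strictly above first_z, highest first.

    Sorting in descending order and breaking at the first layer's level means
    layers at or below the first display layer are never examined, saving work.
    """
    result = []
    for layer in sorted(all_layers, key=lambda l: l["z"], reverse=True):
        if layer["z"] <= first_z:
            break  # everything from here down is at or below the first layer
        result.append(layer)
    return result
```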
In one implementation of the first aspect, the method further includes: if no second display layer among all the display layers occludes the first display layer, the electronic device dispatches the touch event to the first display window and executes the touch instruction of the touch event.
In this implementation, when no second display layer occludes the first display layer, the electronic device dispatches the touch event to the first display window and executes its touch instruction, improving the user experience.
In a second aspect, an electronic device is provided, which has the functionality to implement the method described in the first aspect. The functionality may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functionality described above.
In a third aspect, an electronic device is provided that includes a touch screen, a memory, and one or more processors, where the touch screen, the memory, and the processors are coupled; the memory stores computer program code comprising computer instructions that, when executed by the processors, cause the electronic device to perform the following steps: when the electronic device receives a touch event input by a user on a first interface, determining a first display window corresponding to the touch event, where the display picture in the first display window corresponds to a first display layer; acquiring attribute information of each of all display layers on the first interface, where the attribute information includes a combination of one or more of the following: transparency, layer type, layer size, layer visibility, and layer name; determining whether, among all the display layers, a second display layer occludes the first display layer, where a second display layer occluding the first display layer causes the electronic device to discard the touch event; and if, among all the display layers, a second display layer occludes the first display layer, closing the process of a first application corresponding to the second display layer.
In one implementation of the third aspect, the computer instructions, when executed by the processors, further cause the electronic device to perform the following step: revoking the floating-window permission of the first application.
In one implementation of the third aspect, the computer instructions, when executed by the processors, further cause the electronic device to perform the following step: uninstalling the first application.
In one implementation of the third aspect, before the electronic device closes the process of the first application corresponding to the second display layer, the computer instructions, when executed by the processors, further cause the electronic device to perform the following steps: displaying prompt information, where the prompt information prompts the user that the first application has caused the frozen screen, the frozen screen being the state in which the electronic device does not respond to touch events input by the user, and the prompt information includes a first control; and the electronic device closing the process of the first application corresponding to the second display layer includes: in response to the user's operation on the first control, closing the process of the first application corresponding to the second display layer.
In one implementation of the third aspect, the prompt information further includes a second control and a third control, and the computer instructions, when executed by the processors, cause the electronic device to specifically perform the following steps: in response to the user's operation on the second control, revoking the floating-window permission of the first application; or, in response to the user's operation on the third control, uninstalling the first application.
In one implementation of the third aspect, the second display layer occluding the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted layer, the untrusted layer being a display layer, preset by the electronic device, that touch events cannot pass through; or the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to a transparency threshold.
In one implementation of the third aspect, the computer instructions, when executed by the processors, cause the electronic device to specifically perform the following step: according to the priorities of all the display layers, traversing each display layer whose priority is higher than that of the first display layer and determining whether any of them is a second display layer occluding the first display layer.
In one implementation of the third aspect, the computer instructions, when executed by the processors, further cause the electronic device to perform the following step: if no second display layer among all the display layers occludes the first display layer, dispatching the touch event to the first display window and executing the touch instruction of the touch event.
In a fourth aspect, there is provided a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
For the technical effects of any design of the second to fifth aspects, refer to the technical effects of the corresponding designs of the first aspect; details are not repeated here.
Drawings
fig. 1 is a schematic diagram of a frozen screen phenomenon of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic diagram of a software framework of an electronic device according to an embodiment of the present application;
fig. 4 is a first flowchart of a frozen screen processing method according to an embodiment of the present application;
fig. 5 is a schematic diagram of all display layers according to an embodiment of the present application;
fig. 6 is a second flowchart of a frozen screen processing method according to an embodiment of the present application;
fig. 7 is a third flowchart of a frozen screen processing method according to an embodiment of the present application;
fig. 8 is an interface schematic diagram of a frozen screen processing method according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
To enable those skilled in the art to better understand the embodiments of the present application, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present application.
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
For ease of understanding, the technical terms referred to in this application will first be described.
Transparency (alpha): the transparency of a display layer. Its value range may be 0-255 or 0.0f-1.0f; the smaller the value, the higher the transparency. Taking the range 0.0f-1.0f as an example, 0.0f indicates that the display layer is completely transparent, and 1.0f indicates that it is completely opaque.
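Since the two value ranges express the same quantity, converting between them is a simple scaling; a minimal helper (the function name is invented here):

```python
def alpha_to_float(alpha_255: int) -> float:
    """Map an integer alpha in 0..255 to a float in 0.0..1.0.

    In both conventions, smaller values mean higher transparency:
    0 / 0.0f is fully transparent, 255 / 1.0f is fully opaque.
    """
    if not 0 <= alpha_255 <= 255:
        raise ValueError("alpha must be in 0..255")
    return alpha_255 / 255.0
```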
A layer is composed of a number of pixels, and one or more layers are stacked to form the whole display picture. By way of example, each layer may be likened to a pane of "transparent glass": if nothing is drawn on the glass, it is a completely transparent blank layer (or transparent layer); if an image is drawn on the glass, it may be called a non-transparent layer.
Display window: the display interface of the electronic device may be composed of multiple display windows, where each display window manages one display layer.
Display layer: the display interface of the electronic device is formed by overlaying one or more display layers. The electronic device superimposes the display layers in order from low priority (low level) to high priority (high level) according to their priorities (or levels) to form the whole display interface.
The level of a display layer refers to its coordinate on the vertical (z) axis: the larger the vertical-axis coordinate, the higher the level of the display layer; the smaller the coordinate, the lower the level.
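The low-to-high superposition can be illustrated with a single-pixel "over" blend. This grayscale, single-pixel model is an assumption made for illustration, not the device's actual compositor:

```python
def composite_pixel(layers):
    """Blend one pixel from a stack of (z, alpha, color) tuples.

    Layers are applied in order of increasing z (low level first), so a fully
    opaque layer with the largest z determines the final color. Grayscale
    colors in 0.0..1.0; alpha 1.0 means opaque, matching the 0.0f-1.0f range.
    """
    color = 0.0  # assume a black background
    for z, alpha, layer_color in sorted(layers, key=lambda t: t[0]):
        color = layer_color * alpha + color * (1.0 - alpha)
    return color
```

For instance, an opaque layer at z=2 completely hides an opaque layer at z=1, while a fully transparent top layer leaves the lower layer's color unchanged.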
Untrusted display layer: a display layer that touch events cannot pass through. By default, the system treats all display layers as untrusted except for certain special display layers. For example, these special display layers include: the display layer corresponding to an accessibility display window, the display layer corresponding to an input method editor (IME) display window, the display layer corresponding to a smart-assistant display window, the display layer corresponding to a display window whose root view is GONE or INVISIBLE, a display layer whose transparency is 0, and the display layer corresponding to a display window of type TYPE_APPLICATION_OVERLAY whose transparency is less than a transparency threshold (e.g., 0.8).
In the embodiments of the present application, the special display layers listed above may also be called trusted display layers, where a trusted display layer is a display layer that touch events can pass through.
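The default classification described above (untrusted unless special) can be sketched as a predicate; the window-field names used here are invented for illustration and do not reflect any real system API:

```python
TYPE_APPLICATION_OVERLAY = "TYPE_APPLICATION_OVERLAY"  # assumed type constant
ALPHA_THRESHOLD = 0.8  # example transparency threshold from the text

def is_trusted_layer(win: dict) -> bool:
    """Return True if the layer is trusted (touch events may pass through).

    Every layer is untrusted by default; only the special cases listed in
    the description are trusted.
    """
    if win.get("kind") in ("accessibility", "ime", "smart_assistant"):
        return True
    if win.get("root_view") in ("GONE", "INVISIBLE"):
        return True
    if win.get("alpha") == 0:  # fully transparent layer
        return True
    if (win.get("type") == TYPE_APPLICATION_OVERLAY
            and win.get("alpha", 1.0) < ALPHA_THRESHOLD):
        return True
    return False  # default: untrusted
```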
To improve system security and prevent a touch event from penetrating a display layer and being dispatched to the display window beneath it, the related art provides a security mechanism for untrusted display layers: when an untrusted display layer sits above the display window corresponding to a touch event, the electronic device discards the touch event rather than dispatching it to that display window. However, this means the area of the touch screen where that display window is located cannot respond to touch events input by the user, freezing the screen. Moreover, because this frozen screen is not triggered by a crash or a watchdog, the scenario is difficult to catch with existing frozen screen monitoring, which affects the user experience.
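The discard-or-dispatch rule of this security mechanism can be sketched as follows (a hypothetical representation of the layer stack, not a real API):

```python
def route_touch_event(touched_z, layers):
    """Decide what to do with a touch event on the window at level touched_z.

    layers: list of (z, untrusted) pairs for all display layers on screen.
    If any untrusted layer sits above the touched window, the event is
    discarded instead of dispatched, which is what freezes the screen.
    """
    for z, untrusted in layers:
        if z > touched_z and untrusted:
            return "discard"
    return "dispatch"
```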
For example, to alleviate the eye discomfort caused by long-term use of electronic devices, third-party vendors have developed applications with an eye-protection function (e.g., an eyeshield application). After the eyeshield application is installed and started, a light-yellow mask layer is displayed over the display window of the electronic device, and this mask layer can effectively protect the eyes.
It should be noted that although the eyeshield application displays a light-yellow mask layer in the display window after being started, the mask layer is invisible to the user: the user's eyes cannot perceive it. Displaying the mask layer therefore does not affect the user's experience of using the electronic device while still providing the eye-protection effect.
Typically, the priority of the eyeshield application's display window is higher than that of the desktop display window, the transparency of the display layer within that window is 1, and the layer type of that display layer is a NO_TOUCH type. From the foregoing, the display layer within the eyeshield application's display window can be called an untrusted display layer.
On this basis, after the eyeshield application is started, as shown in fig. 1, if the electronic device receives a user's touch event at a certain position (e.g., position A) on the touch screen, the electronic device discards the touch event and does not dispatch it to the corresponding display window, because the display layer in the eyeshield application's display window is an untrusted display layer. As a result, the electronic device cannot respond to the touch event input at position A, and the screen freezes.
In some embodiments, with the light-yellow mask layer displayed full-screen after the eyeshield application is started, the electronic device cannot respond to a touch event input by the user at any position on the touch screen.
The embodiments of the present application provide a frozen screen processing method applied to an electronic device, which can solve the electronic device's frozen screen problem: when the electronic device detects a display layer with occluding behavior above the display window corresponding to a touch event, it determines the application corresponding to that display layer and closes that application's process.
A display layer with occluding behavior is one that is an untrusted display layer, or whose transparency is greater than or equal to a transparency threshold.
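The two occluding conditions can be folded into a single predicate; the field names are again hypothetical:

```python
ALPHA_THRESHOLD = 0.8  # example transparency threshold

def has_occluding_behavior(layer: dict) -> bool:
    """True if the layer is untrusted, or at least as opaque as the threshold."""
    return (layer.get("untrusted", False)
            or layer.get("alpha", 0.0) >= ALPHA_THRESHOLD)
```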
The frozen screen processing method provided in the embodiments of the present application may be applied to an electronic device with a touch screen, such as a mobile phone, a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a smartwatch, a netbook, a wearable electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a vehicle-mounted device, a smart car, or a smart speaker; the embodiments of the present application impose no particular limitation on the device.
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 2, the electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation on the electronic apparatus 100. In other embodiments, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate an operation control signal according to an instruction operation code and a timing signal, completing the control of instruction fetching and instruction execution.
The memory is used for storing instructions and data. In some embodiments, the memory may be a random access memory (random access memory, RAM), a read-only memory (ROM), a universal flash memory (universal flash storage, UFS), an embedded multimedia card (embedded multi media card, eMMC), a NAND flash memory, a Solid State Disk (SSD) or solid state drive, a mechanical hard disk, or the like.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only illustrative, and does not limit the structure of the electronic device. In other embodiments, the electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, so that it is converted into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and color of the image. The ISP can also optimize parameters such as exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of electronic devices can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The receiver 170B, also referred to as an "earpiece," is used to convert audio electrical signals into sound signals. The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, audio, video, etc. files are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the electronic device by executing the instructions stored in the internal memory 121. For example, in an embodiment of the present application, the processor 110 may execute the instructions stored in the internal memory 121, and the internal memory 121 may include a storage program area and a storage data area.
The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device (e.g., audio data, phonebook, etc.), and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerts as well as for touch vibration feedback. The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, etc. The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, so as to make contact with or be separated from the electronic device. The electronic device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, micro SIM cards, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of hardware and software.
To make the technical solution of the present application clearer and easier to understand, the method of the embodiment of the present application is illustrated below in conjunction with the software architecture of the electronic device 100.
Fig. 3 is a software architecture block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android ™ system is divided into five layers, which are, from top to bottom, an application layer, an application framework layer, the Android runtime (Android runtime) and system libraries, a hardware abstraction layer (hardware abstraction layer, HAL), and a kernel layer. It should be understood that the Android ™ system is used here for illustration; in other operating systems (such as the iOS ™ system), the scheme of the present application can also be implemented, as long as the functions implemented by the respective functional modules are similar to those of the embodiments of the present application.
The application layer may include a series of application packages (Android application package, APK).
As shown in fig. 3, the application layer may install various applications, such as calls, memos, browser, contacts, gallery, calendar, maps, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
For example, the application framework layer may include a window manager, an activity manager, a content provider, a view system, a resource manager, a notification manager, etc., to which embodiments of the present application are not limited in any way.
For example, the window manager described above is used to manage window programs. The window manager can acquire the size of the display screen, judge whether there is a status bar, lock the screen, capture the screen, and the like. The activity manager is used to manage the life cycle of each application program and the navigation back function, and is responsible for the creation of the main thread of Android and the maintenance of the life cycle of each application program. The content provider is used to store and retrieve data and make the data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc. The view system described above may be used to build the display interface of an application. Each display interface may be composed of one or more controls. In general, controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets. The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, and the like, to the application program. The notification manager enables an application program to display notification information in the status bar; it can be used to convey a notification-type message, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give a message alert, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or present a notification on the screen in the form of a dialog window. For example, text information is presented in the status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks.
In the embodiment of the application, the application framework layer further comprises an input module (input) and a management and control module. The input module is used for distributing touch events; and the management and control module is used for processing the application corresponding to the untrusted display layer. The application framework layer also includes an activity manager and an ANR management module. The ANR management module is used for popping up an ANR prompt box.
As shown in fig. 3, android run includes a core library and virtual machines. Android run time is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used for managing the display subsystem and providing fusion of 2D and 3D layers for a plurality of application programs. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer is an interface layer between the kernel layer and the hardware, and may be used to abstract the hardware.
The kernel layer is located below the hardware abstraction layer and is the layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver, and the like; this is not limited in the embodiments of the present application.
Referring to fig. 3, in some embodiments, each time the electronic device loads a display interface, the electronic device sends, through the window manager, the attribute information of the display layer corresponding to each display window among all display windows of the current display interface to an image synthesis system (such as SurfaceFlinger); after the image synthesis system receives the attribute information of the display layer corresponding to each display window sent by the window manager, the image synthesis system sends the attribute information of the display layer corresponding to each display window to the input module. Then, the input module manages the attribute information of the display layer corresponding to each display window.
For example, an InputWindowHandle object is included in the input module, and the input module may manage the attribute information of all display layers through the InputWindowHandle object. On this basis, the input module can acquire the attribute information of all display layers of the current display interface from the InputWindowHandle object. The attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), etc. It should be noted that, when the electronic device sets the visibility of a layer to visible, the layer is visible to the user; when the electronic device sets the layer to invisible, the layer is invisible to the user.
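The per-layer attribute information listed above can be modeled as a small record type. The following Python sketch is illustrative only; the field names are assumptions mirroring the description in this paragraph, not the actual members of Android's InputWindowHandle class:

```python
from dataclasses import dataclass

# Hypothetical mirror of the per-layer attribute information described
# above; field names are illustrative, not the real InputWindowHandle
# members.
@dataclass
class LayerInfo:
    name: str            # layer name
    uid: int             # layer UID (user ID of the owning application)
    layer_type: str      # layer type
    size: tuple          # layer size as (width, height)
    transparency: float  # 0.0 = fully opaque .. 1.0 = fully transparent
    visible: bool        # layer visibility set by the electronic device
    trusted: bool        # whether this is a trusted display layer

# An invisible, highly transparent, untrusted layer of the kind the
# scheme is designed to detect:
overlay = LayerInfo("suspect_overlay", 10123, "application",
                    (1080, 2340), 0.9, False, False)
```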
Note that, since one display window manages one display layer, attribute information corresponding to the display layer may be attribute information corresponding to the display window.
Subsequently, the input module receives a touch event input by a user, and the input module determines a first display window according to position information (such as touch coordinates) corresponding to the touch event and attribute information of a display layer corresponding to each display window. The first display window corresponds to the first display layer.
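The mapping from touch coordinates to a display window in this step can be sketched as a simple hit test. This Python illustration assumes each layer records its on-screen rectangle; the real input module logic is more involved:

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    rect: tuple  # assumed on-screen bounds: (left, top, right, bottom)

def hit_test(layers_top_down, x, y):
    """Return the topmost layer whose bounds contain the touch
    coordinates, mimicking how the input module maps a touch event's
    position information to its display window."""
    for layer in layers_top_down:
        left, top, right, bottom = layer.rect
        if left <= x < right and top <= y < bottom:
            return layer
    return None  # the touch landed on no display window

layers = [Layer("status_bar", (0, 0, 1080, 120)),
          Layer("app_window", (0, 120, 1080, 2340))]
```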
After the input module determines the first display window, the input module acquires the attribute information of all display layers corresponding to the current display interface, and judges, according to the attribute information of all display layers, whether a second display layer among all the display layers obscures the first display layer. If a second display layer among all the display layers obscures the first display layer, the input module determines the first application corresponding to the second display layer according to the attribute information of the second display layer. Then, the input module informs the window manager that the second display layer obscures the first display layer, and sends the application package name of the first application to the window manager. After the window manager receives the message, issued by the input module, that the second display layer obscures the first display layer, together with the application package name of the first application, the window manager sends both to the management and control module. After receiving the message, sent by the window manager, that the second display layer obscures the first display layer, the management and control module closes the process of the first application according to the application package name of the first application.
In some embodiments, after the management and control module closes the process of the first application, the management and control module queries the package manager service (package manager service, PMS) as to whether the first application has the floating window permission. If the management and control module learns from the package manager service that the first application has the floating window permission, the management and control module triggers the electronic device to display first prompt information, where the first prompt information is used to prompt the user that the first application has occlusion behavior.
Referring to fig. 3, in other embodiments, each time the electronic device loads a display interface, the electronic device sends, through the window manager, the attribute information of the display layer corresponding to each display window among all display windows of the current display interface to an image synthesis system (such as SurfaceFlinger); after the image synthesis system receives the attribute information of the display layer corresponding to each display window sent by the window manager, the image synthesis system sends the attribute information of the display layer corresponding to each display window to the input module. Then, the input module manages the attribute information of the display layer corresponding to each display window.
For example, an InputWindowHandle object is included in the input module, and the input module may manage the attribute information of all display layers through the InputWindowHandle object. On this basis, the input module can acquire the attribute information of all display layers of the current display interface from the InputWindowHandle object. The attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), etc. It should be noted that, when the electronic device sets the visibility of a layer to visible, the layer is visible to the user; when the electronic device sets the layer to invisible, the layer is invisible to the user.
Note that, since one display window manages one display layer, attribute information corresponding to the display layer may be attribute information corresponding to the display window.
Subsequently, the input module receives a touch event input by a user, and the input module determines a first display window according to position information (such as touch coordinates) corresponding to the touch event and attribute information of a display layer corresponding to each display window. The first display window corresponds to the first display layer.
After the input module determines the first display window, the input module acquires the attribute information of all display layers corresponding to the current display interface, and judges, according to the attribute information of all display layers, whether a second display layer among all the display layers obscures the first display layer. If a second display layer among all the display layers obscures the first display layer, the input module determines the first application corresponding to the second display layer according to the attribute information of the second display layer. Afterwards, the input module informs the activity manager that the first application is not responding, i.e., that the first application has generated an ANR (application not responding); and the input module also informs the activity manager of the reason why the first application generated the ANR: the second display layer obscures the first display layer, so the touch event cannot be distributed to the first display layer. The input module also sends the application package name of the first application to the activity manager.
Subsequently, the activity manager sends the reason why the first application generated the ANR and the application package name of the first application to the ANR management module; the ANR management module triggers the electronic device to display second prompt information based on the ANR reason. The second prompt information is used to prompt the user to close the process of the first application; or, the second prompt information prompts the user to uninstall the first application.
The embodiments of the present application are introduced above in connection with a software architecture and a hardware structure, and the technical solutions of the embodiments of the present application are described in detail below in connection with the accompanying drawings of the specification.
First, it should be noted that, after the electronic device detects a touch event, if a display layer with occlusion behavior exists above the display window corresponding to the touch event, the electronic device discards the touch event and does not distribute the touch event to the corresponding display window. As a result, the area of the touch screen where the display window is located cannot respond to touch events input by the user, which causes the frozen screen phenomenon. Based on this, when the electronic device determines that a display layer with occlusion behavior exists above the display window corresponding to a touch event, the electronic device determines the application corresponding to that display layer, and closes the process of the application.
For easy understanding, the following describes a process of interaction between each module involved in the method provided in the embodiment of the present application in conjunction with a software architecture schematic diagram of an electronic device shown in fig. 3. As shown in fig. 3, the system may include: the system comprises an input module, a window manager, an activity manager, a management and control module and an ANR management module.
In some embodiments, referring to fig. 3, as shown in fig. 4, the method for processing frozen screen provided in the embodiments of the present application may include S1-S8.
S1, the window manager sends attribute information of a display layer corresponding to each display window to an input module in all display windows of a current display interface.
For example, when the electronic device loads the display interface each time, the electronic device sends attribute information of a display layer corresponding to each display window to the image synthesis system through the window manager in all display windows of the current display interface; after the image composition system receives the attribute information of the display layers corresponding to each display window sent by the window manager, the image composition system sends the attribute information of the display layers corresponding to each display window to the input module. And then, the input module manages attribute information of the display layers corresponding to each display window.
Wherein, the attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), etc.
S2, the input module receives a touch event input by a user on the current display interface.
S3, the input module determines a first display window according to the position information corresponding to the touch event and the attribute information of the display layer corresponding to each display window.
The first display window corresponds to the first display layer.
S4, the input module acquires attribute information of all display layers corresponding to the current display interface.
Wherein, the attribute information of each display layer comprises transparency, layer type, layer size, layer name, layer UID, etc.
In some embodiments, the input module includes an InputWindowHandle object, where the attribute information of all display layers is encapsulated in the InputWindowHandle object. On this basis, the input module can acquire the attribute information of all display layers corresponding to the current display interface (also referred to as the first interface) from the InputWindowHandle object.
S5, the input module judges, among all the display layers, whether a second display layer obscures the first display layer.
The second display layer obscuring the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted display layer; or, the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to a transparency threshold. By way of example, the transparency threshold may be 0.8.
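The two occlusion conditions above can be expressed directly as a predicate. The following is a Python paraphrase of the stated rule (covers + invisible + untrusted, or covers + invisible + transparency at or above the threshold), with the geometric covering test abstracted into a parameter; the dict keys are illustrative:

```python
TRANSPARENCY_THRESHOLD = 0.8  # example value given in the text

def obscures(second, first, covers):
    """Paraphrase of the rule above: `second` obscures `first` when it
    covers `first`, is invisible to the user, and is either an
    untrusted display layer or at least as transparent as the
    threshold. `second` and `first` are dicts of layer attributes;
    `covers` is a callable supplied by the caller."""
    if not covers(second, first):
        return False
    if second["visible"]:
        # A layer the user can see does not count as an occluder here.
        return False
    return (not second["trusted"]) or second["transparency"] >= TRANSPARENCY_THRESHOLD
```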
The input module traverses each of the display layers in turn and determines whether a second display layer obscures the first display layer.
It should be appreciated that the display interface of the electronic device is formed by the superposition of multiple display layers. Therefore, the electronic device can traverse each display layer in all the display layers from top to bottom in turn through the input module, and judge whether the second display layer shields the first display layer.
In some embodiments, as shown in fig. 5, it is assumed that all display layers corresponding to the display interface include display layer 1, display layer 2, display layer 3, display layer 4, and display layer 5; and, the first display layer included in the first display window is display layer 1. On this basis, the electronic device may sequentially superimpose the display layers from low priority to high priority according to the priority of the display layers. For example, the electronic device displays the layers in a superimposed order from bottom to top. The display layer at the bottom (or stack bottom) (e.g., display layer 1) has the lowest priority, and the display layer at the top (or stack top) (e.g., display layer 5) has the highest priority.
The electronic device may traverse all display layers sequentially from the top to the bottom of the stack through the input module, and determine whether the second display layer blocks the first display layer. Referring to fig. 5, the input module first determines whether the display layer 5 obscures the display layer 1. If the display layer 5 does not cover the display layer 1, the input module judges whether the display layer 4 covers the display layer 1, and so on until the input module traverses to the display layer 1.
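The top-of-stack to bottom-of-stack traversal described above can be sketched as a loop that stops either at the first obscuring layer or at the target layer itself. This is a Python illustration; the occlusion check is left as a parameter, and the layer names follow fig. 5:

```python
def find_obscuring_layer(layers_top_down, target, obscures):
    """Walk the display layers from the top of the stack toward the
    bottom, as in fig. 5. Return the first layer that obscures
    `target`, or None when the walk reaches `target` itself without
    finding one (the touch event can then be distributed to the
    display window of `target`)."""
    for layer in layers_top_down:
        if layer == target:
            return None  # reached display layer 1: nothing obscures it
        if obscures(layer, target):
            return layer
    return None

# Stack from fig. 5, top of stack first:
stack = ["layer5", "layer4", "layer3", "layer2", "layer1"]
```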
It should be noted that, if none of the display layer 2, the display layer 3, the display layer 4, and the display layer 5 shields the display layer 1, the input module distributes the touch event to the display window (e.g., the first display window) corresponding to the display layer 1.
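The top-to-bottom traversal described above can be sketched as follows. This is a minimal illustration, not the patent's or the Android framework's actual implementation; the names `Layer`, `find_occluding_layer`, and the `occludes` callback are assumptions introduced here for clarity.

```python
# Illustrative sketch of the traversal: walk the layers above the target
# from the top of the stack downward; stop at the first layer that shields
# it, or report that the touch event can be dispatched to the target window.
from dataclasses import dataclass


@dataclass
class Layer:
    name: str
    priority: int  # higher value = closer to the top of the layer stack


def find_occluding_layer(layers, target, occludes):
    """Return the first layer above `target` (top of stack first) for which
    the `occludes` predicate holds, or None if no layer shields `target`."""
    above = [layer for layer in layers if layer.priority > target.priority]
    for layer in sorted(above, key=lambda l: l.priority, reverse=True):
        if occludes(layer, target):
            return layer
    return None  # dispatch the touch event to the target's display window
```

If the function returns `None`, the touch event would be distributed to the first display window; otherwise the returned layer identifies the application whose process is to be closed.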
In some embodiments, for each display layer, the process of the input module determining whether the display layer obscures the first display layer may refer to the steps shown in fig. 6. By way of example, the process may include: s1-1 to S1-4.
In fig. 6, an example of the input module determining whether the display layer 5 blocks the first display layer is illustrated.
S1-1, the input module judges whether the display layer 5 covers the first display layer.
For example, the input module may determine whether the display layer 5 covers the first display layer according to the priority and the layer size of the display layer 5. For example, when the priority of the display layer 5 is higher than that of the first display layer, and the size of the display layer 5 is greater than or equal to that of the first display layer, the input module determines that the display layer 5 covers the first display layer.
It should be noted that, if the display layer 5 covers the first display layer, the input module continues to execute S1-2; if the display layer 5 does not cover the first display layer, the input module distributes the touch event to a first display window corresponding to the first display layer.
S1-2, the input module judges whether the display layer 5 is visible to a user.
For example, the input module may determine whether the display layer 5 is visible to the user based on the visibility of the display layer 5.
It should be noted that, if the display layer 5 is invisible to the user, the input module continues to execute S1-3; if the display layer 5 is visible to the user, the input module distributes the touch event to a first display window corresponding to the first display layer.
S1-3, the input module judges whether the layer type of the display layer 5 is an untrusted display layer.
For example, if the layer type of the display layer 5 is an untrusted display layer, the input module determines that the display layer 5 obstructs the first display layer; if the layer type of the display layer 5 is the trusted display layer, the input module continues to execute S1-4.
S1-4, the input module judges whether the transparency of the display layer 5 is larger than or equal to a transparency threshold value.
For example, if the transparency of the display layer 5 is greater than or equal to the transparency threshold, the input module determines that the display layer 5 obstructs the first display layer; if the transparency of the display layer 5 is smaller than the transparency threshold, the input module distributes the touch event to a first display window corresponding to the first display layer.
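The per-layer decision of steps S1-1 to S1-4 can be sketched as a single predicate. The field names and the example threshold of 0.8 follow the text above; the layer's size is simplified to a single scalar, and all identifiers are illustrative assumptions rather than actual framework APIs.

```python
# Hedged sketch of the fig. 6 decision flow (S1-1 to S1-4): a layer shields
# the first display layer only if it covers it, is invisible to the user,
# and is either untrusted or (almost) fully transparent.
from dataclasses import dataclass

TRANSPARENCY_THRESHOLD = 0.8  # example value given in the text


@dataclass
class LayerInfo:
    priority: int
    size: int            # simplified stand-in for the layer dimensions
    visible: bool
    trusted: bool
    transparency: float


def occludes(layer: LayerInfo, first: LayerInfo) -> bool:
    # S1-1: does the layer cover the first display layer?
    if not (layer.priority > first.priority and layer.size >= first.size):
        return False  # dispatch the touch event to the first display window
    # S1-2: a layer the user can see is a normal window, not an occluder
    if layer.visible:
        return False
    # S1-3: an invisible, untrusted layer shields the first display layer
    if not layer.trusted:
        return True
    # S1-4: a trusted layer still shields it if sufficiently transparent
    return layer.transparency >= TRANSPARENCY_THRESHOLD
```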
In some embodiments, the input module determining whether the transparency of the display layer is greater than or equal to the transparency threshold includes: the input module traverses all display layers in turn and determines whether any of them has a transparency greater than or equal to the transparency threshold. For example, the input module first traverses the display layer 5, obtains its transparency, and stores the correspondence between the transparency and the layer name of the display layer 5 in a data object (or database). The input module then traverses the display layer 4, obtains its transparency, and compares it with the stored transparency of the display layer 5: if the transparency of the display layer 4 is greater, the input module updates the transparency and layer name stored in the data object to those of the display layer 4; otherwise, the input module keeps the stored transparency and layer name unchanged.
Subsequently, the input module traverses the remaining display layers in the same way, and finally determines whether the transparency stored in the data object is greater than or equal to the transparency threshold. If it is, the input module determines the layer name of the corresponding display layer and determines that this display layer shields the first display layer. Conversely, if the stored transparency is smaller than the transparency threshold, the input module determines that no display layer has a transparency greater than or equal to the transparency threshold, i.e., no second display layer among all the display layers shields the first display layer.
In this way, the input module does not need to compare the transparency of every display layer against the transparency threshold individually, which helps reduce device power consumption.
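The single-pass scheme above can be sketched as follows: track only the most transparent layer seen so far, then compare it once against the threshold at the end. The "data object" of the text is modeled as a simple tuple; all names are illustrative assumptions.

```python
# Illustrative sketch of the single comparison against the threshold:
# keep the maximum transparency (and its layer name) while traversing,
# then test the threshold once at the end.
def most_transparent(layers):
    """layers: iterable of (layer_name, transparency) pairs.
    Returns the (layer_name, transparency) pair with the highest
    transparency, or None if the iterable is empty."""
    best = None  # plays the role of the "data object" in the text
    for name, transparency in layers:
        if best is None or transparency > best[1]:
            best = (name, transparency)
    return best


def find_blocking_by_transparency(layers, threshold=0.8):
    """Return the name of the layer that shields the first display layer
    by transparency, or None if no layer reaches the threshold."""
    best = most_transparent(layers)
    if best is not None and best[1] >= threshold:
        return best[0]
    return None
```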
S6, if a second display layer shields the first display layer among all the display layers, the input module determines a first application corresponding to the second display layer.
It should be understood that, if no second display layer shields the first display layer among all the display layers, the input module distributes the touch event to the first display window corresponding to the first display layer.
S7, the input module informs the window manager that the second display layer shields the first display layer, and sends the application package name of the first application to the window manager.
S8, the window manager informs the management and control module to close the process of the first application.
In summary, in the embodiment of the present application, when the electronic device receives the touch event, the electronic device determines whether a second display layer covers the first display layer corresponding to the touch event; if so, the electronic device determines a first application corresponding to the second display layer and then closes the process of the first application. In this way, by determining that a second display layer covers the first display layer corresponding to the touch event, the electronic device can timely close the process of the first application corresponding to the second display layer, thereby resolving the frozen-screen problem of the electronic device.
In other embodiments, as shown in fig. 7 in conjunction with fig. 3, the freeze screen processing method provided in the embodiments of the present application may include A1-A8.
A1, the window manager sends attribute information of a display layer corresponding to each display window to the input module in all display windows of the current display interface.
For example, each time the electronic device loads the display interface, the electronic device sends, through the window manager, the attribute information of the display layer corresponding to each display window among all display windows of the current display interface to the image composition system; after receiving the attribute information of the display layer corresponding to each display window from the window manager, the image composition system forwards it to the input module. The input module then manages the attribute information of the display layer corresponding to each display window.
Wherein, the attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), etc.
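The per-layer attribute information listed above could be modeled as a simple record. This is only a sketch: the field set follows the text, but the type name, field names, and the transparency scale are assumptions, not the actual framework API.

```python
# Illustrative container for the attribute information the window manager
# forwards (via the image composition system) to the input module.
from dataclasses import dataclass


@dataclass
class DisplayLayerAttributes:
    transparency: float       # assumed scale: 0.0 opaque .. 1.0 fully transparent
    layer_type: str           # e.g. "trusted" or "untrusted"
    layer_size: tuple         # (width, height) of the layer, assumed in pixels
    visible: bool             # layer visibility to the user
    layer_name: str
    layer_uid: int            # user ID (UID) of the owning application
```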
And A2, the input module receives a touch event input by a user on the current display interface.
A3, the input module determines a first display window according to the position information corresponding to the touch event and the attribute information of the display layer corresponding to each display window.
The first display window corresponds to the first display layer.
And A4, the input module acquires attribute information of all display layers corresponding to the current display interface.
Wherein, the attribute information of each display layer comprises transparency, layer type, layer size, layer name, layer UID, etc.
In some embodiments, the input module includes an InputWindowHandle object, in which the attribute information of all display layers is encapsulated. On this basis, the input module can obtain the attribute information of all display layers corresponding to the current display interface from the InputWindowHandle object.
A5, the input module judges whether a second display layer shields the first display layer among all the display layers.
Wherein the second display layer shielding the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted display layer; or, the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to the transparency threshold. By way of example, the transparency threshold may be 0.8.
It should be noted that, for the illustration of the input module determining whether the second display layer obstructs the first display layer in all the display layers, reference may be made to the above embodiment, and details are not repeated here.
A6, if a second display layer shields the first display layer among all the display layers, the input module determines a first application corresponding to the second display layer.
It should be understood that, if no second display layer shields the first display layer among all the display layers, the input module distributes the touch event to the first display window corresponding to the first display layer.
A7, the input module informs the activity manager that the first application has generated an ANR (Application Not Responding) error.
For example, the input module may send the cause of the ANR to the activity manager to inform the activity manager that the first application has generated an ANR. The cause of the ANR generated by the first application may be: the second display layer shields the first display layer, so the touch event cannot be distributed to the first display window.
A8, the activity manager informs the ANR management module to pop up an ANR prompt box.
The ANR prompt box may be, for example, the second prompt information described in the foregoing embodiments.
In some embodiments, the activity manager may send to the ANR management module a cause for the first application to generate ANR; the ANR management module pops up an ANR prompt box based on the ANR reasons generated by the first application. For example, the ANR prompt box is configured to prompt a user to close a process of the first application; or prompt the user to uninstall the first application.
In summary, since a frozen screen is not triggered by a crash or by the watchdog mechanism, it is difficult for existing frozen-screen monitoring to detect this scenario. Therefore, in the embodiment of the application, when the electronic device receives the touch event, the electronic device determines whether a second display layer covers the first display layer corresponding to the touch event; if so, the electronic device determines a first application corresponding to the second display layer, and may then pop up the ANR prompt box to prompt the user that the first application caused the frozen-screen problem. In this way, the electronic device can inform the user, through the ANR prompt box, that the frozen screen is caused by the first application.
As can be seen from the above embodiments, if the electronic device determines that a second display layer covers the first display layer corresponding to the touch event, the electronic device determines a first application corresponding to the second display layer and then closes the process of the first application. However, closing the process of the first application only resolves the current frozen-screen event; it does not prevent the first application from causing the screen to freeze again. Based on this, the method of the embodiment of the application further includes: the electronic device displays prompt information for prompting the user that the frozen screen is caused by the first application. On this basis, the prompt information further includes a first control and a second control; the first control is used for prompting the user to close the floating window function of the first application, and the second control is used for prompting the user to uninstall the first application.
For example, as shown in fig. 8, the content of the prompt information displayed by the electronic device may be: "The eye protector application has caused a frozen screen." As can be seen from fig. 8, the prompt information further includes a "close floating window" control and an "uninstall application" control. On this basis, in response to the user's operation on the "close floating window" control, the electronic device closes the floating window function of the eye protector application; or, in response to the user's operation on the "uninstall application" control, the electronic device uninstalls the eye protector application.
In this embodiment, through the prompt information, the electronic device can close the floating window function of the first application or uninstall the first application, which prevents the electronic device from freezing again and improves the user experience.
The embodiment of the application provides electronic equipment, which can comprise: a display screen (e.g., a touch screen), a memory, and one or more processors. The display, memory, and processor are coupled. The memory is for storing computer program code, the computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform the various functions or steps performed by the electronic device in the method embodiments described above. The structure of the electronic device may refer to the structure of the electronic device 100 shown in fig. 2.
Embodiments of the present application also provide a chip system, as shown in fig. 9, the chip system 1800 includes at least one processor 1801 and at least one interface circuit 1802. The processor 1801 may be the processor 110 shown in fig. 2 in the above embodiment. Interface circuit 1802 may be, for example, an interface circuit between processor 110 and an external memory; or as an interface circuit between the processor 110 and the internal memory 121.
The processor 1801 and interface circuit 1802 described above may be interconnected by wires. For example, interface circuit 1802 may be used to receive signals from other devices (e.g., a memory of an electronic apparatus). For another example, interface circuit 1802 may be used to send signals to other devices (e.g., processor 1801). The interface circuit 1802 may, for example, read instructions stored in a memory and send the instructions to the processor 1801. The instructions, when executed by the processor 1801, may cause the electronic device to perform the steps performed by the electronic device in the embodiments described above. Of course, the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
The embodiments of the present application also provide a computer storage medium, where the computer storage medium includes computer instructions that, when executed on an electronic device, cause the electronic device to perform the functions or steps performed by the electronic device in the foregoing method embodiments.
Embodiments of the present application also provide a computer program product, which when run on a computer, causes the computer to perform the functions or steps performed by the electronic device in the method embodiments described above.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional modules is illustrated; in practical applications, the above functions may be allocated to different functional modules as needed, i.e., the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A frozen screen processing method, characterized by comprising the following steps:
when the electronic equipment receives a touch event input by a user on a first interface, the electronic equipment determines a first display window corresponding to the touch event; the display picture in the first display window corresponds to a first display picture layer;
the electronic equipment acquires attribute information of each display layer in all display layers on the first interface; the attribute information includes at least a combination of one or more of the following information: transparency, layer type, layer size, layer visibility, and layer name;
the electronic equipment detects that a second display layer exists in the first display window based on the attribute information of each display layer, and the electronic equipment determines a first application corresponding to the second display layer and closes the process of the first application;
The second display layer is a display layer displayed after the first application is started, the second display layer covers the first display layer, the second display layer is a display layer invisible to a user, and the second display layer is an untrusted layer; the untrusted layer is a display layer which is preset by the electronic equipment and is not penetrable by the touch event, and the untrusted layer does not respond to the touch event input by the user.
2. The method according to claim 1, wherein the method further comprises:
and closing the floating window authority of the first application by the electronic equipment.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
the electronic device uninstalls the first application.
4. The method of claim 3, wherein before the electronic device closes the process of the first application corresponding to the second display layer, the method further comprises:
the electronic equipment displays prompt information; the prompt information is used for prompting a user that the first application causes a frozen screen; the frozen screen means that the electronic equipment does not respond to a touch event input by the user;
The prompt message comprises a first control; the electronic device closing the process of the first application corresponding to the second display layer, including:
and the electronic equipment responds to the operation of the user on the first control, and closes the process of the first application corresponding to the second display layer.
5. The method of claim 4, wherein the hint information further comprises a second control and a third control;
the electronic device closing the floating window authority of the first application, including: the electronic equipment responds to the operation of the user on the second control, and the floating window authority of the first application is closed;
the electronic device uninstalling the first application, comprising: and the electronic equipment responds to the operation of the user on the third control, and the first application is unloaded.
6. The method of claim 1, wherein the electronic device determining whether a second display layer obscures the first display layer among all display layers comprises:
the electronic equipment traverses, according to the priorities of all the display layers, each display layer having a higher priority than the first display layer, and judges whether the second display layer shields the first display layer among all the display layers.
7. The method according to claim 1, wherein the method further comprises:
if no second display layer shields the first display layer among all the display layers, the electronic equipment distributes the touch event to the first display window and executes a touch instruction of the touch event.
8. An electronic device, comprising: a touch screen, a memory, and one or more processors; the touch screen, the memory, and the processor are coupled; the memory is used for storing computer program codes, and the computer program codes comprise computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the method of any of claims 1-7.
9. A computer-readable storage medium comprising computer instructions; the computer instructions, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-7.

Publications (2)

Publication Number Publication Date
CN114879896A CN114879896A (en) 2022-08-09
CN114879896B true CN114879896B (en) 2023-05-12


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110703961A (en) * 2019-08-26 2020-01-17 北京达佳互联信息技术有限公司 Cover layer display method and device, electronic equipment and storage medium
CN111061410A (en) * 2018-10-16 2020-04-24 华为技术有限公司 Screen freezing processing method and terminal
CN111309429A (en) * 2020-02-26 2020-06-19 维沃移动通信有限公司 Display method and electronic equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9727226B2 (en) * 2010-04-02 2017-08-08 Nokia Technologies Oy Methods and apparatuses for providing an enhanced user interface
KR102105460B1 (en) * 2013-06-14 2020-06-01 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
CN110489043B (en) * 2019-07-31 2023-03-24 华为技术有限公司 Management method and related device for floating window
CN111273841B (en) * 2020-02-11 2021-08-17 天津车之家数据信息技术有限公司 Page processing method and mobile terminal
CN112835472B (en) * 2021-01-22 2023-04-07 青岛海信移动通信技术股份有限公司 Communication terminal and display method
CN113282361B (en) * 2021-04-21 2022-09-23 荣耀终端有限公司 Window processing method and electronic equipment
CN113448442A (en) * 2021-07-12 2021-09-28 交互未来(北京)科技有限公司 Large screen false touch prevention method and device, storage medium and equipment




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant