CN114879896A - Screen freezing processing method, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114879896A
CN114879896A (application CN202210798657.1A)
Authority
CN
China
Prior art keywords
display
display layer
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210798657.1A
Other languages
Chinese (zh)
Other versions
CN114879896B (en)
Inventor
祁长乐
高杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210798657.1A
Publication of CN114879896A
Application granted
Publication of CN114879896B
Legal status: Active (granted)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/445 - Program loading or initiating
    • G06F 9/44594 - Unloading
    • G06F 9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a screen-freeze processing method, an electronic device, and a storage medium, relating to the field of terminal technologies and used to solve the screen-freeze problem caused by an electronic device failing to respond to a user's touch event on the touch screen. The method includes: when the electronic device receives a touch event input by a user on a first interface, the electronic device determines a first display window corresponding to the touch event, where the display content in the first display window corresponds to a first display layer; the electronic device acquires attribute information of each of all display layers on the first interface; the electronic device determines whether a second display layer among all the display layers occludes the first display layer, where the second display layer covers the first display layer and causes the electronic device to discard the touch event; and if a second display layer among all the display layers occludes the first display layer, the electronic device closes the process of a first application corresponding to the second display layer.

Description

Screen freezing processing method, electronic equipment and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a frozen screen processing method, an electronic device, and a storage medium.
Background
With the development of electronic devices, touch screens have come into wide use. An electronic device can receive a user's touch event on the touch screen and, by recognizing the touch event, execute the corresponding touch instruction. For example, the electronic device may receive a user's tap on an application's icon on the touch screen and display the application's corresponding interface.
However, some applications display an interface after startup that can prevent the electronic device from responding to the user's touch events on the touch screen, producing a screen-freeze phenomenon.
Disclosure of Invention
The embodiments of the present application provide a screen-freeze processing method, an electronic device, and a storage medium, used to solve the screen-freeze problem caused by an electronic device failing to respond to a user's touch event on the touch screen.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, a screen-freeze processing method is provided, including: when an electronic device receives a touch event input by a user on a first interface, the electronic device determines a first display window corresponding to the touch event, where the display content in the first display window corresponds to a first display layer; the electronic device acquires attribute information of each of all display layers on the first interface, where the attribute information includes at least one of, or a combination of, the following: transparency, layer type, layer size, layer visibility, and layer name; the electronic device determines whether a second display layer among all the display layers occludes the first display layer, where the second display layer covers the first display layer and causes the electronic device to discard the touch event; and if a second display layer among all the display layers occludes the first display layer, the electronic device closes the process of a first application corresponding to the second display layer.
Based on the first aspect: first, when the electronic device receives a touch event input by a user on the first interface, the electronic device determines the first display window corresponding to the touch event, and the display content in the first display window corresponds to the first display layer. The electronic device then acquires attribute information of each of all display layers on the first interface and determines whether a second display layer among them occludes the first display layer. Because a second display layer that occludes the first display layer causes the electronic device to discard the touch event, the electronic device would otherwise fail to respond to the touch event input by the user. Therefore, if such a second display layer exists among all the display layers, the electronic device can close the process of the first application corresponding to the second display layer, which solves the screen-freeze problem caused by the electronic device failing to respond to the touch event input by the user.
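The decision flow described in this aspect can be sketched in plain Java. This is a hedged illustration only: the LayerInfo model, the occludes and findBlockingApp methods, and the 0.8f threshold default are assumptions made for exposition, not taken from the patent claims or from any actual Android API.

```java
import java.util.List;

public class FreezeHandler {
    /** Minimal model of a display layer's attribute information
     *  (transparency, hierarchy, visibility, and owner per the claim). */
    public static class LayerInfo {
        final String name;
        final int zOrder;            // layer priority (hierarchy)
        final float alpha;           // 0.0f fully transparent .. 1.0f fully opaque
        final boolean visibleToUser; // whether the user can actually see it
        final boolean trusted;       // trusted layers let touch events pass through
        final String owningApp;

        public LayerInfo(String name, int zOrder, float alpha,
                         boolean visibleToUser, boolean trusted, String owningApp) {
            this.name = name; this.zOrder = zOrder; this.alpha = alpha;
            this.visibleToUser = visibleToUser; this.trusted = trusted;
            this.owningApp = owningApp;
        }
    }

    static final float ALPHA_THRESHOLD = 0.8f; // example threshold from the text

    /** A candidate occludes the target layer if it sits above the target,
     *  is invisible to the user, and is either untrusted or opaque enough. */
    static boolean occludes(LayerInfo candidate, LayerInfo target) {
        return candidate.zOrder > target.zOrder
                && !candidate.visibleToUser
                && (!candidate.trusted || candidate.alpha >= ALPHA_THRESHOLD);
    }

    /** Returns the app whose process should be closed, or null if the
     *  touch event can be dispatched to the first display window. */
    public static String findBlockingApp(List<LayerInfo> allLayers, LayerInfo target) {
        for (LayerInfo layer : allLayers) {
            if (occludes(layer, target)) {
                return layer.owningApp; // close this app's process
            }
        }
        return null; // no occluder: distribute the touch event normally
    }
}
```

A null return corresponds to the no-occluder branch, in which the touch event is distributed to the first display window.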
In one implementation of the first aspect, the method further includes: the electronic device closes the floating-window permission of the first application.
In this implementation, because the electronic device closes the floating-window permission of the first application, the content of the second display layer can no longer be displayed through the floating-window permission, further solving the screen-freeze problem caused by the electronic device failing to respond to a touch event input by the user.
In one implementation of the first aspect, the method further includes: the electronic device uninstalls the first application.
In this implementation, because the electronic device uninstalls the first application, the second display layer can no longer occlude the first display layer, fundamentally solving the screen-freeze problem caused by the electronic device failing to respond to a touch event input by the user.
In one implementation of the first aspect, before the electronic device closes the process of the first application corresponding to the second display layer, the method further includes: the electronic device displays prompt information, where the prompt information is used to inform the user that the first application has caused the screen freeze; a screen freeze means that the electronic device does not respond to a touch event input by the user; and the prompt information includes a first control. The electronic device closing the process of the first application corresponding to the second display layer includes: in response to the user's operation on the first control, the electronic device closes the process of the first application corresponding to the second display layer.
In this implementation, because the electronic device displays the prompt information, the user can be informed that the screen freeze was caused by the first application; the user can then choose, based on the prompt information, to close the process of the first application, which solves the electronic device's screen-freeze problem and improves the user experience.
In one implementation of the first aspect, the prompt information further includes a second control and a third control. The electronic device closing the floating-window permission of the first application includes: in response to the user's operation on the second control, the electronic device closes the floating-window permission of the first application. The electronic device uninstalling the first application includes: in response to the user's operation on the third control, the electronic device uninstalls the first application.
In this implementation, because the prompt information further includes the second control and the third control, the user can choose, based on the prompt information, to close the floating-window permission of the first application or to uninstall the first application, which solves the electronic device's screen-freeze problem while improving the user experience.
In one implementation of the first aspect, the second display layer occluding the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted layer, where an untrusted layer is a display layer, preset by the electronic device, that touch events cannot pass through; or the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to a transparency threshold.
In one implementation of the first aspect, the electronic device determining whether a second display layer among all the display layers occludes the first display layer includes: according to the priorities of all the display layers, the electronic device traverses each display layer whose priority is higher than that of the first display layer and determines whether a second display layer among them occludes the first display layer.
In this implementation, when determining whether a second display layer occludes the first display layer, the electronic device traverses, according to the priorities of all the display layers, only those display layers whose priority is higher than that of the first display layer. The electronic device therefore does not need to traverse every display layer of the first interface, which helps reduce the device's power consumption.
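The priority-ordered traversal above might look like the following sketch. The OcclusionScan class, the firstSuspectAbove method, and the integer z-order model are illustrative assumptions, not the patent's implementation.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.function.IntPredicate;

public class OcclusionScan {
    /** Walk only the z-orders strictly above the target's, from the highest
     *  down, instead of visiting every layer on the interface. Returns the
     *  first z-order for which the suspect test fires, or -1 if none does. */
    public static int firstSuspectAbove(List<Integer> zOrders, int targetZ,
                                        IntPredicate isSuspect) {
        List<Integer> above = new ArrayList<>();
        for (int z : zOrders) {
            if (z > targetZ) above.add(z);     // skip layers at or below the target
        }
        above.sort(Comparator.reverseOrder()); // highest layer first
        for (int z : above) {
            if (isSuspect.test(z)) return z;
        }
        return -1;
    }
}
```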
In one implementation of the first aspect, the method further includes: if no second display layer among all the display layers occludes the first display layer, the electronic device distributes the touch event to the first display window and executes the touch instruction of the touch event.
In this implementation, when no second display layer among all the display layers occludes the first display layer, the electronic device distributes the touch event to the first display window and executes the touch instruction of the touch event, which helps improve the user experience.
In a second aspect, an electronic device is provided that has the function of implementing the method of the first aspect. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function.
In a third aspect, an electronic device is provided, including a touch screen, a memory, and one or more processors, where the touch screen, the memory, and the processors are coupled; the memory is configured to store computer program code, and the computer program code includes computer instructions. When executed by the processor, the computer instructions cause the electronic device to perform the following steps: when the electronic device receives a touch event input by a user on a first interface, determining a first display window corresponding to the touch event, where the display content in the first display window corresponds to a first display layer; acquiring attribute information of each of all display layers on the first interface, where the attribute information includes at least one of, or a combination of, the following: transparency, layer type, layer size, layer visibility, and layer name; determining whether a second display layer among all the display layers occludes the first display layer, where the second display layer covers the first display layer and causes the electronic device to discard the touch event; and if a second display layer among all the display layers occludes the first display layer, closing the process of a first application corresponding to the second display layer.
In one implementation of the third aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: the electronic device closes the floating window permission of the first application.
In one implementation of the third aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: the electronic device uninstalls the first application.
In an implementation of the third aspect, before the electronic device closes the process of the first application corresponding to the second display layer, the computer instructions, when executed by the processor, cause the electronic device to further perform the following steps: displaying prompt information, where the prompt information is used to inform the user that the first application has caused the screen freeze; a screen freeze means that the electronic device does not respond to a touch event input by the user; and the prompt information includes a first control. The electronic device closing the process of the first application corresponding to the second display layer includes: in response to the user's operation on the first control, the electronic device closes the process of the first application corresponding to the second display layer.
In an implementation of the third aspect, the prompt information further includes a second control and a third control. When executed by the processor, the computer instructions cause the electronic device to specifically perform the following steps: in response to the user's operation on the second control, closing the floating-window permission of the first application; or, in response to the user's operation on the third control, uninstalling the first application.
In an implementation of the third aspect, the second display layer occluding the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted layer, where an untrusted layer is a display layer, preset by the electronic device, that touch events cannot pass through; or the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to a transparency threshold.
In an implementation of the third aspect, the computer instructions, when executed by the processor, cause the electronic device to perform the following steps: according to the priorities of all the display layers, traversing each display layer whose priority is higher than that of the first display layer, and determining whether a second display layer among them occludes the first display layer.
In an implementation of the third aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the following steps: if no second display layer among all the display layers occludes the first display layer, distributing the touch event to the first display window and executing the touch instruction of the touch event.
In a fourth aspect, there is provided a computer-readable storage medium having stored therein instructions, which when run on a computer, cause the computer to perform the method of any of the first aspects above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
For the technical effects of any of the designs in the second to fifth aspects, reference may be made to the technical effects of the corresponding designs in the first aspect; details are not repeated here.
Drawings
Fig. 1 is a schematic diagram illustrating a screen freeze phenomenon of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a software framework diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a first flowchart illustrating a screen freezing processing method according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of all display layers according to an embodiment of the present application;
fig. 6 is a second schematic flowchart of a freeze-screen processing method according to an embodiment of the present disclosure;
fig. 7 is a third schematic flowchart of a freeze-screen processing method according to an embodiment of the present disclosure;
FIG. 8 is an interface schematic diagram of a freeze screen processing method according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a chip system according to an embodiment of the present disclosure.
Detailed Description
To enable those skilled in the art to better understand the solutions of the embodiments of the present application, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
For ease of understanding, the technical terms referred to in the present application will be described first.
Transparency (alpha): refers to the transparency of a display layer. The value range of alpha can be 0-255 or 0.0f-1.0f; the smaller the value, the higher the transparency. Taking the 0.0f-1.0f range as an example, 0.0f means the display layer is fully transparent, and 1.0f means it is fully opaque.
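The two alpha ranges can be related by a simple normalization, sketched below. The Alpha helper class and its method names are hypothetical, introduced only to illustrate the ranges described above.

```java
public class Alpha {
    /** Map an integer alpha in 0..255 to the normalized 0.0f..1.0f range. */
    public static float normalize(int alpha255) {
        return alpha255 / 255.0f;
    }

    /** 0.0f means fully transparent; 1.0f means fully opaque. */
    public static boolean isFullyOpaque(float alpha) {
        return alpha >= 1.0f;
    }
}
```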
A layer is composed of many pixels, and one or more layers are superimposed to form the complete display image. For example, each layer can be thought of as a pane of "transparent glass": if nothing is drawn on the glass, it is a fully transparent empty layer (or transparent layer); if an image is drawn on it, it is a non-transparent layer.
Display window: the display interface of the electronic device can be composed of multiple display windows, where each display window manages one display layer.
Display layer: the display interface of the electronic device is formed by superimposing one or more display layers. According to the layers' priorities (also called hierarchies), the electronic device superimposes the display layers in order from low priority (low hierarchy) to high priority (high hierarchy), forming the complete display interface.
The hierarchy of a display layer refers to its position on the vertical-axis coordinate: the larger the vertical-axis coordinate, the higher the layer's hierarchy; the smaller the coordinate, the lower its hierarchy.
Untrusted display layer: a display layer that touch events cannot pass through. By default, the system treats all display layers as untrusted except for certain special display layers. For example, the special display layers include: the display layer corresponding to an accessibility display window; the display layer corresponding to an input method editor (IME) display window; the display layer corresponding to a smart-assistant display window; the display layer corresponding to a display window whose root view is GONE or INVISIBLE; a display layer whose transparency is 0; and the display layer corresponding to a display window whose TYPE is TYPE_APPLICATION_OVERLAY and whose transparency is less than a transparency threshold (e.g., 0.8).
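The default-untrusted rule with the exceptions listed above might be modeled like this. The WindowKind enum and isTrusted method are assumed names, not actual Android APIs; the 0.8f threshold is the example value from the text.

```java
public class TrustPolicy {
    public enum WindowKind { ACCESSIBILITY, IME, SMART_ASSISTANT, APPLICATION_OVERLAY, OTHER }

    static final float ALPHA_THRESHOLD = 0.8f; // example threshold from the text

    /** Touch events may pass through trusted layers; everything else is
     *  untrusted by default and causes the touch event to be discarded. */
    public static boolean isTrusted(WindowKind kind, boolean rootViewHidden, float alpha) {
        if (kind == WindowKind.ACCESSIBILITY
                || kind == WindowKind.IME
                || kind == WindowKind.SMART_ASSISTANT) return true;
        if (rootViewHidden) return true;  // root view GONE or INVISIBLE
        if (alpha == 0.0f) return true;   // fully transparent layer
        // TYPE_APPLICATION_OVERLAY windows below the transparency threshold
        return kind == WindowKind.APPLICATION_OVERLAY && alpha < ALPHA_THRESHOLD;
    }
}
```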
In the embodiments of the present application, the special display layers listed above may also be referred to as trusted display layers, where a trusted display layer is a display layer that touch events can pass through.
To improve system security, the related art provides a security mechanism for untrusted display layers that prevents a touch event from passing through such a display layer to the display window corresponding to the touch event: when an untrusted display layer exists above the display window corresponding to a touch event, the electronic device discards the touch event and does not distribute it to the corresponding display window. However, this can leave the area of the touch screen where the display window is located unable to respond to touch events input by the user, producing a screen-freeze phenomenon. Moreover, because the screen freeze triggers neither a crash nor the watchdog, conventional freeze monitoring has difficulty detecting this scenario, which degrades the user experience.
For example, to alleviate the eye discomfort caused by prolonged use of electronic devices, third-party vendors have developed applications with an eye-protection function (e.g., an eye-protection application). After such an eye-protection application is installed and started, it displays a pale-yellow mask layer over the electronic device's display window, and this pale-yellow mask layer effectively protects the eyes.
It should be noted that although the eye-protection application displays a pale-yellow mask layer in the electronic device's display window after startup, the mask layer is invisible to the user; that is, the user's eyes cannot perceive the pale-yellow mask layer displayed by the eye-protection application.
Generally, the priority of the eye-protection application's display window is higher than that of the desktop display window, the transparency of the display layer in the eye-protection application's display window is 1, and the layer type of that display layer is a non-touchable type (NO_TOUCHABLE). From the above, the display layer in the eye-protection application's display window can be regarded as an untrusted display layer.
On this basis, after the eye-protection application is started, as shown in fig. 1, if the electronic device receives a user's touch event at a certain position (e.g., position A) on the touch screen, then, because the display layer in the eye-protection application's display window is an untrusted display layer, the electronic device discards the touch event and does not distribute it to the display window corresponding to the touch event. As a result, the electronic device cannot respond to the touch event input by the user at position A, producing a screen-freeze phenomenon.
In some embodiments, when the pale-yellow mask layer is displayed full-screen after the eye-protection application is started, the electronic device cannot respond to a touch event input by the user at any position on the touch screen.
The embodiments of the present application provide a screen-freeze processing method applied to an electronic device, which can solve the electronic device's screen-freeze problem. For example, when the electronic device detects a display layer with occluding behavior above the display window corresponding to a touch event, the electronic device determines the application corresponding to that display layer and closes the application's process.
A display layer with occluding behavior is one where: the display layer is an untrusted display layer; or the transparency of the display layer is greater than or equal to the transparency threshold.
For example, the screen-freeze processing method provided in the embodiments of the present application may be applied to an electronic device with a touch screen. The electronic device may be, for example, a mobile phone, a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a smart watch, a netbook, a wearable electronic device, an augmented reality (AR) device, a virtual reality (VR) device, an in-vehicle device, a smart car, or a smart audio device; the embodiments of the present application do not limit the electronic device.
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 2, the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like.
It should be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
The memory is used for storing instructions and data. In some embodiments, the memory may be a Random Access Memory (RAM), a read-only memory (ROM), a universal flash memory (UFS), an embedded multimedia card (eMMC), a NAND flash memory, a Solid State Drive (SSD), a mechanical hard disk, or the like.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the interface connection relationship between the modules illustrated in this embodiment is only an exemplary illustration, and does not constitute a limitation on the structure of the electronic device. In other embodiments, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and color of the image. The ISP can also optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform Fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example, the transfer mode between neurons of the human brain, and can also continuously learn by itself. Applications such as intelligent cognition of the electronic device, for example, image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as audio, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the electronic device by executing the instructions stored in the internal memory 121. For example, in the embodiment of the present application, the processor 110 may execute instructions stored in the internal memory 121, and the internal memory 121 may include a program storage area and a data storage area.
The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created in the using process of the electronic device. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. The indicator 192 may be an indicator light that may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, and the like. The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device by being inserted into or pulled out of the SIM card interface 195. The electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, an electronic device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of hardware and software.
In order to make the technical solution of the present application clearer and easier to understand, the method of the embodiment of the present application is illustrated below with reference to a software architecture of the electronic device 100.
Fig. 3 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided, from top to bottom, into an application layer, an application framework layer, an Android runtime (Android runtime) and system libraries, a hardware abstraction layer (HAL), a kernel layer, and a driver layer. It should be noted that the solution of the embodiment of the present application can also be implemented in other operating systems (e.g., the iOS system), as long as the functions implemented by the respective functional modules are similar to those of the embodiment of the present application.
The application layer may include a series of Application Packages (APKs).
As shown in fig. 3, various applications may be installed in the application layer, for example, applications such as phone, memo, browser, contacts, gallery, calendar, map, Bluetooth, music, video, and short message.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
For example, the application framework layer may include a window manager, an activity manager, a content provider, a view system, a resource manager, a notification manager, and the like, which is not limited in any way by the embodiments of the present application.
For example, the window manager described above is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like. The activity manager is used to manage the life cycle and the navigation back-stack function of each application, and is responsible for creating the Android main thread and maintaining the life cycle of each application. The content provider is used to store and retrieve data and make the data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like. The view system can be used to build the display interface of an application. Each display interface may be composed of one or more controls. Generally, controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets (Widget). The resource manager provides various resources, such as localized strings, icons, pictures, layout files, and video files, to applications. The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify download completion, message alerts, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or present a notification on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light flashes.
In this embodiment of the present application, the application framework layer further includes an input module (input) and a management and control module. The input module is used for distributing touch events; the management and control module is used for processing the application corresponding to an untrusted display layer. The application framework layer further includes an activity manager and an ANR management module. The ANR management module is used for popping up an ANR prompt box.
As shown in fig. 3, the Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for a plurality of applications. The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer is an interface layer between the kernel layer and the hardware, and may be used to abstract the hardware.
The kernel layer is located below the hardware abstraction layer and is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and the like, which is not limited in the embodiment of the present application.
Referring to fig. 3, in some embodiments, each time the electronic device loads a display interface, the electronic device sends, through the window manager, attribute information of a display layer corresponding to each display window in all display windows of the current display interface to an image synthesis system (e.g., SurfaceFlinger). After receiving the attribute information of the display layer corresponding to each display window sent by the window manager, the image synthesis system sends the attribute information of the display layer corresponding to each display window to the input module. Then, the input module manages the attribute information of the display layer corresponding to each display window.
Illustratively, the input module includes an InputWindowHandle object, and the input module may manage the attribute information of all display layers through the InputWindowHandle object. On this basis, the input module can acquire the attribute information of all display layers of the current display interface from the InputWindowHandle object. The attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), and the like. It should be noted that, when the electronic device sets the layer visibility to visible, the layer is visible to the user; when the electronic device sets the layer visibility to invisible, the layer is invisible to the user.
It should be noted that, because one display window manages one display layer, the attribute information corresponding to the display layer may also be attribute information corresponding to the display window.
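For illustration, the per-layer attribute information described above can be modeled as a simple value object. This is a minimal sketch: the class and field names below are assumptions for illustration, not actual framework classes.

```java
// Hypothetical sketch of the per-layer attribute information that the input
// module manages; the class and field names are illustrative assumptions.
final class LayerAttributes {
    final float transparency;  // 0.0 = fully opaque, 1.0 = fully transparent
    final String layerType;    // e.g. "TRUSTED" or "UNTRUSTED"
    final int width, height;   // layer size
    final boolean visible;     // layer visibility set by the electronic device
    final String name;         // layer name
    final int uid;             // UID of the application that owns the layer

    LayerAttributes(float transparency, String layerType, int width, int height,
                    boolean visible, String name, int uid) {
        this.transparency = transparency;
        this.layerType = layerType;
        this.width = width;
        this.height = height;
        this.visible = visible;
        this.name = name;
        this.uid = uid;
    }
}
```

Because the attribute information carries the layer UID, a module that later finds a blocking layer can map it back to the owning application, which is what the management and control module relies on when closing that application's process.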
Subsequently, the input module receives a touch event input by a user, and the input module determines the first display window according to the position information (such as touch coordinates) corresponding to the touch event and the attribute information of the display layer corresponding to each display window. The first display window corresponds to the first display layer.
After the input module determines the first display window, the input module acquires the attribute information of all display layers corresponding to the current display interface, and judges, according to the attribute information of all the display layers, whether there is a second display layer among all the display layers that blocks the first display layer. If a second display layer that blocks the first display layer exists among all the display layers, the input module determines the first application corresponding to the second display layer according to the attribute information of the second display layer. Then, the input module notifies the window manager that the second display layer blocks the first display layer, and sends the application package name of the first application to the window manager. After the window manager receives the message that the second display layer blocks the first display layer and the application package name of the first application sent by the input module, the window manager sends the message and the application package name to the management and control module. After receiving the message, sent by the window manager, that the second display layer blocks the first display layer, the management and control module closes the process of the first application according to the application package name of the first application.
In some embodiments, after the management and control module closes the process of the first application, the management and control module queries a Package Management Service (PMS) as to whether the first application has the floating window permission. If the management and control module learns from the package management service that the first application has the floating window permission, the management and control module triggers the electronic device to display first prompt information, and the first prompt information is used for prompting the user that the first application has a blocking behavior.
In another embodiment, as shown in fig. 3, each time the electronic device loads a display interface, the electronic device sends, through the window manager, attribute information of a display layer corresponding to each display window in all display windows of the current display interface to an image synthesis system (e.g., SurfaceFlinger). After receiving the attribute information of the display layer corresponding to each display window sent by the window manager, the image synthesis system sends the attribute information of the display layer corresponding to each display window to the input module. Then, the input module manages the attribute information of the display layer corresponding to each display window.
Illustratively, the input module includes an InputWindowHandle object, and the input module may manage the attribute information of all display layers through the InputWindowHandle object. On this basis, the input module can acquire the attribute information of all display layers of the current display interface from the InputWindowHandle object. The attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), and the like. It should be noted that, when the electronic device sets the layer visibility to visible, the layer is visible to the user; when the electronic device sets the layer visibility to invisible, the layer is invisible to the user.
It should be noted that, because one display window manages one display layer, the attribute information corresponding to the display layer may also be attribute information corresponding to the display window.
Subsequently, the input module receives a touch event input by a user, and the input module determines the first display window according to the position information (such as touch coordinates) corresponding to the touch event and the attribute information of the display layer corresponding to each display window. The first display window corresponds to the first display layer.
After the input module determines the first display window, the input module acquires the attribute information of all display layers corresponding to the current display interface, and judges, according to the attribute information of all the display layers, whether there is a second display layer among all the display layers that blocks the first display layer. If a second display layer that blocks the first display layer exists among all the display layers, the input module determines the first application corresponding to the second display layer according to the attribute information of the second display layer. Thereafter, the input module notifies the activity manager that the first application has become unresponsive, i.e., that the first application has generated an ANR; and the input module further notifies the activity manager of the reason why the first application generated the ANR: the second display layer blocks the first display layer, so the touch event cannot be distributed to the first display layer. The input module also sends the application package name of the first application to the activity manager.
Subsequently, the activity manager sends the reason why the first application generated the ANR and the application package name of the first application to the ANR management module, and the ANR management module triggers the electronic device to display second prompt information based on the ANR reason. The second prompt information is used for prompting the user to close the process of the first application; alternatively, the second prompt information prompts the user to uninstall the first application.
The embodiments of the present application are introduced above with reference to software architecture and hardware structure, and the following describes technical solutions of the embodiments of the present application in detail with reference to the drawings of the specification.
First, after the electronic device detects a touch event, if a display layer with blocking behavior exists above the display window corresponding to the touch event, the electronic device discards the touch event and does not distribute it to the corresponding display window. As a result, the area of the touch screen where the display window is located cannot respond to touch events input by the user, causing the screen-freezing phenomenon. Based on this, in the embodiment of the present application, when the electronic device determines that a display layer with blocking behavior exists above the display window corresponding to the touch event, the electronic device determines the application corresponding to the display layer and closes the process of the application.
For convenience of understanding, the following describes a process of interaction between the modules involved in the method provided by the embodiment of the present application with reference to a software architecture diagram of the electronic device shown in fig. 3. As shown in fig. 3, the system may include: the system comprises an input module, a window manager, an activity manager, a management and control module and an ANR management module.
In some embodiments, in conjunction with fig. 3, as shown in fig. 4, the freeze-screen processing method provided by the embodiments of the present application may include S1-S8.
S1, the window manager sends the attribute information of the display layer corresponding to each display window in all the display windows of the current display interface to the input module.
Illustratively, each time the electronic device loads a display interface, the electronic device sends the attribute information of the display layer corresponding to each display window in all display windows of the current display interface to the image synthesis system through the window manager. After receiving the attribute information of the display layer corresponding to each display window sent by the window manager, the image synthesis system sends the attribute information of the display layer corresponding to each display window to the input module. Then, the input module manages the attribute information of the display layer corresponding to each display window.
The attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), and the like.
S2, the input module receives a touch event input by the user on the current display interface.
S3, the input module determines the first display window according to the position information corresponding to the touch event and the attribute information of the display layer corresponding to each display window.
The first display window corresponds to the first display layer.
S4, the input module acquires the attribute information of all display layers corresponding to the current display interface.
The attribute information of each display layer comprises transparency, layer type, layer size, layer name, layer UID and the like.
In some embodiments, the input module includes an InputWindowHandle object, and the InputWindowHandle object encapsulates the attribute information of all display layers. On this basis, the input module may obtain the attribute information of all display layers corresponding to the current display interface (also referred to as a first interface) from the InputWindowHandle object.
S5, the input module judges whether there is a second display layer, among all the display layers, that blocks the first display layer.
The second display layer blocking the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted display layer; or, the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to a transparency threshold. Illustratively, the transparency threshold may be, for example, 0.8.
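The two blocking conditions above can be combined into a single predicate. The following is a minimal sketch under the assumption that the covering test is performed separately; the threshold value 0.8 follows the example in the text, and all names are illustrative.

```java
// Sketch of the blocking test of S5: a covering layer blocks the touched layer
// when it is invisible to the user AND is either untrusted or at least as
// transparent as the threshold. Assumes coverage was already determined.
final class BlockingCheck {
    static final float TRANSPARENCY_THRESHOLD = 0.8f; // example value from the text

    static boolean blocks(boolean covers, boolean visibleToUser,
                          boolean untrusted, float transparency) {
        if (!covers || visibleToUser) {
            return false; // a non-covering or user-visible layer never blocks
        }
        // Condition 1: covering, invisible, and untrusted.
        // Condition 2: covering, invisible, and transparency >= threshold.
        return untrusted || transparency >= TRANSPARENCY_THRESHOLD;
    }
}
```

Note that a covering layer that is visible to the user is treated as legitimate overlap rather than blocking, which matches the intent of distributing the touch event normally in that case.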
Illustratively, the input module sequentially traverses each display layer in all the display layers, and determines whether the second display layer covers the first display layer.
It should be understood that the display interface of the electronic device is formed by overlapping a plurality of display layers. Therefore, the electronic device can sequentially traverse each display layer in all the display layers from the top to the bottom through the input module, and judge whether the second display layer covers the first display layer.
In some embodiments, as shown in fig. 5, it is assumed that all display layers corresponding to the display interface include a display layer 1, a display layer 2, a display layer 3, a display layer 4, and a display layer 5; and the first display layer included in the first display window is display layer 1. On the basis, the electronic equipment can sequentially superpose the display layers from low priority to high priority according to the priorities of the display layers. For example, the electronic device sequentially superimposes the display layers from bottom to top. The display layer (e.g., display layer 1) at the bottom (or called the bottom of the stack) has the lowest priority, and the display layer (e.g., display layer 5) at the top (or called the top of the stack) has the highest priority.
For example, the electronic device may sequentially traverse all display layers from the stack top to the stack bottom through the input module, and determine whether there is a second display layer that blocks the first display layer. Referring to fig. 5, the input module first determines whether the display layer 5 blocks the display layer 1. If the display layer 5 does not block the display layer 1, the input module determines whether the display layer 4 blocks the display layer 1, and so on, until the input module traverses to the display layer 1.
It should be noted that, if none of the display layer 2, the display layer 3, the display layer 4, and the display layer 5 blocks the display layer 1, the input module distributes the touch event to the display window (for example, the first display window) corresponding to the display layer 1.
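The top-to-bottom traversal described above can be sketched as a simple loop. Representing layers by their stack indices (index 0 at the stack top) and precomputing the per-layer blocking results into an array are simplifying assumptions for illustration.

```java
// Sketch of the S5 traversal: walk the layer stack from the top toward the
// touched layer, returning the index of the first layer found to block it,
// or -1 if no layer blocks. The per-layer blocking test (see S1-1 to S1-4)
// is abstracted into the precomputed 'blocksTarget' array.
final class LayerTraversal {
    static int firstBlockingLayer(boolean[] blocksTarget, int targetIndex) {
        // Index 0 is the stack top; targetIndex is the touched (bottom) layer.
        for (int i = 0; i < targetIndex; i++) {
            if (blocksTarget[i]) {
                return i; // a blocker exists: report it instead of dispatching
            }
        }
        return -1; // no blocker: dispatch the touch event to the target window
    }
}
```

For the five-layer example of fig. 5, the touched display layer 1 sits at the bottom, so the loop checks display layers 5 through 2 in order before concluding the event can be dispatched.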
In some embodiments, for each display layer, the process of determining, by the input module, whether the display layer blocks the first display layer may refer to the steps shown in fig. 6. Illustratively, the process may include: s1-1 to S1-4.
It should be noted that fig. 6 illustrates an example in which the input module determines whether the display layer 5 blocks the first display layer.
S1-1, the input module judges whether the display layer 5 covers the first display layer.
For example, the input module may determine whether the display layer 5 covers the first display layer according to the priority of the display layer 5 and the layer size. For example, when the priority of the display layer 5 is higher than that of the first display layer and the size of the display layer 5 is greater than or equal to that of the first display layer, the input module determines that the display layer 5 covers the first display layer.
It should be noted that, if the display layer 5 covers the first display layer, the input module continues to execute S1-2; if the display layer 5 does not cover the first display layer, the input module distributes the touch event to a first display window corresponding to the first display layer.
S1-2, the input module judges whether the display layer 5 is visible to the user.
For example, the input module may determine whether the display layer 5 is visible to the user according to the visibility of the display layer 5.
It should be noted that, if the display layer 5 is invisible to the user, the input module continues to execute S1-3; if the display layer 5 is visible to the user, the input module distributes the touch event to a first display window corresponding to the first display layer.
S1-3, the input module judges whether the layer type of the display layer 5 is an untrusted display layer.
For example, if the layer type of the display layer 5 is an untrusted display layer, the input module determines that the display layer 5 blocks the first display layer; and if the layer type of the display layer 5 is the trusted display layer, the input module continues to execute step S1-4.
S1-4, the input module judges whether the transparency of the display layer 5 is larger than or equal to the transparency threshold.
For example, if the transparency of the display layer 5 is greater than or equal to the transparency threshold, the input module determines that the display layer 5 blocks the first display layer; if the transparency of the display layer 5 is smaller than the transparency threshold, the input module distributes the touch event to a first display window corresponding to the first display layer.
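The per-layer checks S1-1 through S1-4 can be combined into one predicate, sketched below under stated assumptions: the dictionary keys, the coverage test (a priority comparison plus a size comparison, as in the example above), and the boolean return convention are all illustrative; outcomes other than "blocks" are treated here simply as not blocking.

```python
TRANSPARENCY_THRESHOLD = 0.8  # example threshold value from the description

def covers(layer, first):
    # S1-1: assumed test -- stacked above (higher priority) and at least
    # as large as the first display layer.
    return (layer["priority"] > first["priority"]
            and layer["size"][0] >= first["size"][0]
            and layer["size"][1] >= first["size"][1])

def layer_blocks(layer, first, threshold=TRANSPARENCY_THRESHOLD):
    """Sketch of checks S1-1..S1-4 for one candidate display layer."""
    if not covers(layer, first):            # S1-1: must cover the first layer
        return False
    if layer["visible"]:                    # S1-2: a visible cover is not a blocker
        return False
    if layer["type"] == "untrusted":        # S1-3: untrusted layer -> blocks
        return True
    return layer["transparency"] >= threshold  # S1-4: near-transparent trusted layer

first = {"priority": 1, "size": (1080, 2400), "visible": True,
         "type": "trusted", "transparency": 0.0}
layer5 = {"priority": 5, "size": (1080, 2400), "visible": False,
          "type": "untrusted", "transparency": 0.0}
```

For example, `layer_blocks(layer5, first)` returns `True` because layer 5 covers the first layer, is invisible to the user, and is untrusted; making the layer visible, or trusted with transparency below the threshold, makes the predicate return `False`.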
In some embodiments, the input module determines whether the transparency of any display layer is greater than or equal to the transparency threshold as follows: the input module traverses all the display layers in sequence and determines whether they include a display layer whose transparency is greater than or equal to the transparency threshold. For example, the input module traverses the display layer 5, obtains its transparency, and stores the correspondence between the transparency of the display layer 5 and its layer name in a data object (or called a database). Then, the input module traverses the display layer 4, obtains its transparency, and determines whether the transparency of the display layer 4 is greater than that of the display layer 5. If so, the input module updates the transparency and layer name stored in the data object to those of the display layer 4; otherwise, the input module keeps the transparency and layer name in the data object unchanged.
Subsequently, the input module traverses all the remaining display layers according to the above steps, and finally determines whether the transparency stored in the data object is greater than or equal to the transparency threshold. If it is, the input module determines the layer name of the corresponding display layer and determines that this display layer blocks the first display layer. Conversely, if the transparency stored in the data object is smaller than the transparency threshold, the input module determines that none of the display layers has a transparency greater than or equal to the transparency threshold, that is, no second display layer among all the display layers blocks the first display layer.
In this way, the input module only needs to compare the single stored maximum transparency against the transparency threshold, instead of comparing the transparency of every display layer against the threshold individually, which reduces the power consumption of the device.
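The single-pass maximum-transparency bookkeeping described above can be sketched as follows. The dictionary `data_object` stands in for the "data object (or database)" mentioned in the description; the function name and return convention are assumptions.

```python
def find_most_transparent(layers, threshold=0.8):
    """One pass over the layers: keep only the most transparent layer
    seen so far, then compare that single maximum against the threshold."""
    data_object = {"transparency": None, "name": None}
    for layer in layers:  # e.g. display layer 5, then display layer 4, ...
        t = layer["transparency"]
        if data_object["transparency"] is None or t > data_object["transparency"]:
            data_object["transparency"] = t      # update stored maximum
            data_object["name"] = layer["name"]
    if data_object["name"] is not None and data_object["transparency"] >= threshold:
        return data_object["name"]  # this layer blocks the first display layer
    return None                     # no layer reaches the transparency threshold

layers = [{"name": "display layer 5", "transparency": 0.5},
          {"name": "display layer 4", "transparency": 0.9}]
```

Here display layer 4 holds the maximum transparency (0.9, at or above the example threshold of 0.8), so it is reported as the blocking layer after a single comparison against the threshold.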
S6, if a second display layer that blocks the first display layer exists among all the display layers, the input module determines the first application corresponding to the second display layer.
It should be understood that if no second display layer among all the display layers blocks the first display layer, the input module distributes the touch event to the first display window corresponding to the first display layer.
S7, the input module informs the window manager that the second display layer blocks the first display layer, and sends the application package name of the first application.
S8, the window manager informs the management and control module to close the process of the first application.
In summary, in the embodiment of the present application, when the electronic device receives a touch event, the electronic device determines whether a second display layer exists above the first display layer corresponding to the touch event and blocks it; if so, the electronic device determines the first application corresponding to the second display layer and then closes the process of the first application. In this way, once it is determined that a second display layer above the first display layer blocks it, the electronic device closes the process of the corresponding first application in time, so that the screen freezing problem of the electronic device can be resolved.
In other embodiments, as shown in FIG. 7 in conjunction with FIG. 3, the screen freezing processing method provided by the embodiments of the present application may include A1-A8.
A1, the window manager sends the attribute information of the display layer corresponding to each display window in all the display windows of the current display interface to the input module.
Illustratively, each time the electronic device loads the display interface, the electronic device sends, through the window manager, the attribute information of the display layer corresponding to each of the display windows of the current display interface to the image synthesis system; after receiving the attribute information of the display layer corresponding to each display window from the window manager, the image synthesis system forwards it to the input module. The input module then manages the attribute information of the display layer corresponding to each display window.
The attribute information of each display layer includes transparency, layer type, layer size, layer visibility, layer name, layer UID (user ID), and the like.
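The attribute information above can be modeled as a simple record; the field names and types below are assumptions chosen for illustration, since the description only lists the kinds of attributes.

```python
from dataclasses import dataclass

@dataclass
class LayerAttributes:
    """Illustrative attribute information for one display layer."""
    name: str            # layer name
    uid: int             # layer UID (user ID of the owning application)
    transparency: float  # e.g. 0.0 = opaque, 1.0 = fully transparent (assumption)
    layer_type: str      # e.g. "trusted" or "untrusted"
    size: tuple          # (width, height) of the layer (assumption)
    visible: bool        # layer visibility

attrs = LayerAttributes(name="display layer 5", uid=10086, transparency=0.9,
                        layer_type="untrusted", size=(1080, 2400), visible=False)
```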
A2, the input module receives the touch event input by the user on the current display interface.
A3, the input module determines a first display window according to the position information corresponding to the touch event and the attribute information of the display layer corresponding to each display window.
The first display window corresponds to the first display layer.
A4, the input module obtains attribute information of all display layers corresponding to the current display interface.
The attribute information of each display layer comprises transparency, layer type, layer size, layer name, layer UID and the like.
In some embodiments, the input module includes an InputWindowHandle object, and the InputWindowHandle object encapsulates the attribute information of all display layers. On this basis, the input module can acquire the attribute information of all display layers corresponding to the current display interface from the InputWindowHandle object.
A5, the input module determines whether a second display layer that blocks the first display layer exists among all the display layers.
Wherein the second display layer blocking the first display layer includes: the second display layer covers the first display layer, the second display layer is invisible to the user, and the second display layer is an untrusted display layer; or the second display layer covers the first display layer, the second display layer is invisible to the user, and the transparency of the second display layer is greater than or equal to the transparency threshold. Illustratively, the transparency threshold may be, for example, 0.8.
It should be noted that, for an example that the input module determines whether the second display layer exists in all the display layers and blocks the first display layer, reference may be made to the above embodiment, and details are not repeated here.
A6, if a second display layer that blocks the first display layer exists among all the display layers, the input module determines the first application corresponding to the second display layer.
It should be understood that if no second display layer among all the display layers blocks the first display layer, the input module distributes the touch event to the first display window corresponding to the first display layer.
A7, the input module notifies the activity manager that the first application has generated an ANR (Application Not Responding) event.
For example, the input module may send the reason why the first application generated the ANR to the activity manager, so as to notify the activity manager that the first application generated the ANR. The reason why the first application generated the ANR may be: the second display layer covers the first display layer, so the touch event cannot be distributed to the first display window.
A8, the activity manager notifies the ANR management module to pop up an ANR prompt box.
For example, the ANR prompt box may be the second prompt message described in the above embodiment.
In some embodiments, the activity manager may send the reason why the first application generated the ANR to the ANR management module, and the ANR management module pops up an ANR prompt box based on that reason. Illustratively, the ANR prompt box is used to prompt the user to close the process of the first application, or to prompt the user to uninstall the first application.
In summary, since a frozen screen does not trigger the system's crash or watchdog mechanisms, conventional freeze-screen monitoring has difficulty detecting this scenario. Therefore, in the embodiment of the present application, when the electronic device receives a touch event, the electronic device determines whether a second display layer exists above the first display layer corresponding to the touch event and blocks it; if so, the electronic device determines the first application corresponding to the second display layer, and may then pop up an ANR prompt box. In this way, the electronic device can notify the user, through the ANR prompt box, that the frozen screen is caused by the first application.
According to the above embodiment, if the electronic device determines that a second display layer exists above the first display layer corresponding to the touch event and blocks it, the electronic device determines the first application corresponding to the second display layer and then closes the process of the first application. However, closing the process of the first application only resolves the current screen freezing event. Based on this, the method of the embodiment of the present application further includes: the electronic device displays a prompt message for notifying the user that the screen freeze is caused by the first application. On this basis, the prompt message further includes a first control and a second control; the first control is used to prompt the user to close the floating window function of the first application, and the second control is used to prompt the user to uninstall the first application.
For example, as shown in fig. 8, the content of the prompt message displayed by the electronic device may be: "the screen freezing problem is caused by the eye protection application". As can be seen from fig. 8, the prompt message further includes a close-floating-window control and an uninstall-application control. On this basis, in response to the user's operation on the close-floating-window control, the electronic device closes the floating window function of the eye protection application; or, in response to the user's operation on the uninstall-application control, the electronic device uninstalls the eye protection application.
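The two controls in the prompt of fig. 8 map to two remedial actions. The sketch below only illustrates the dispatch logic; the action names and the `close_floating_window` / `uninstall_app` callbacks are hypothetical, standing in for the actual system services.

```python
def handle_prompt_action(action, app, close_floating_window, uninstall_app):
    """Dispatch the user's choice on the freeze-screen prompt box."""
    if action == "close_floating_window":
        close_floating_window(app)   # revoke the app's floating window function
        return "floating window closed"
    if action == "uninstall":
        uninstall_app(app)           # remove the offending application
        return "application uninstalled"
    raise ValueError(f"unknown action: {action}")

# Record which app each callback was invoked on, for illustration.
calls = []
result = handle_prompt_action("close_floating_window", "eye protection app",
                              calls.append, calls.append)
```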
In this embodiment, through the prompt message, the electronic device may close the floating window function of the first application or uninstall the first application, so that the electronic device is prevented from freezing the screen again, which improves user experience.
An embodiment of the present application provides an electronic device, which may include: a display screen (e.g., a touch screen), memory, and one or more processors. The display screen, memory and processor are coupled. The memory is for storing computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform the various functions or steps performed by the electronic device in the above-described method embodiments. The structure of the electronic device may refer to the structure of the electronic device 100 shown in fig. 2.
An embodiment of the present application further provides a chip system, as shown in fig. 9, the chip system 1800 includes at least one processor 1801 and at least one interface circuit 1802. The processor 1801 may be the processor 110 shown in fig. 2 in the foregoing embodiment. The interface circuit 1802 may be, for example, an interface circuit between the processor 110 and an external memory; or an interface circuit between the processor 110 and the internal memory 121.
The processor 1801 and the interface circuit 1802 may be interconnected by wires. For example, the interface circuit 1802 may be used to receive signals from other devices (e.g., a memory of an electronic device). Also for example, the interface circuit 1802 may be used to send signals to other devices, such as the processor 1801. Illustratively, the interface circuit 1802 may read instructions stored in the memory and send the instructions to the processor 1801. The instructions, when executed by the processor 1801, may cause the electronic device to perform the steps performed by the electronic device in the embodiments described above. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
Embodiments of the present application further provide a computer storage medium, where the computer storage medium includes computer instructions, and when the computer instructions are run on an electronic device, the electronic device is caused to perform various functions or steps performed by the electronic device in the foregoing method embodiments.
Embodiments of the present application further provide a computer program product, which, when running on a computer, causes the computer to execute each function or step performed by the electronic device in the above method embodiments.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially contributed to by the prior art, or all or part of the technical solutions may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A screen freezing processing method is characterized by comprising the following steps:
when electronic equipment receives a touch event input by a user on a first interface, the electronic equipment determines a first display window corresponding to the touch event; a display picture in the first display window corresponds to a first display layer;
the electronic equipment acquires attribute information of each display layer in all display layers on the first interface; the attribute information includes at least one or more of the following information in combination: transparency, layer type, layer size, layer visibility and layer name;
the electronic equipment judges whether a second display layer exists in all the display layers to shield the first display layer; the second display layer covers the first display layer, so that the electronic equipment discards the touch event;
and if the second display layer exists in all the display layers to shield the first display layer, the electronic equipment closes the process of the first application corresponding to the second display layer.
2. The method of claim 1, further comprising:
the electronic device closes the floating window permission of the first application.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
the electronic device uninstalls the first application.
4. The method according to claim 3, wherein before the electronic device closes the process of the first application corresponding to the second display layer, the method further comprises:
the electronic equipment displays prompt information; the prompt message is used for prompting the user that the first application causes screen freezing; the frozen screen is a touch event which does not respond to user input by the electronic equipment;
wherein the prompt message comprises a first control; the electronic device closing the process of the first application corresponding to the second display layer includes:
and the electronic equipment responds to the operation of the user on the first control, and closes the process of the first application corresponding to the second display layer.
5. The method of claim 4, wherein the hint information further comprises a second control and a third control;
the electronic device closing the floating window permission of the first application comprises: the electronic equipment responds to the operation of a user on the second control, and closes the floating window permission of the first application;
the electronic device uninstalls the first application, including: and the electronic equipment responds to the operation of the user on the third control and unloads the first application.
6. The method of claim 1, wherein the second display layer obscuring the first display layer comprises:
the second display layer covers the first display layer, the second display layer is a display layer invisible to a user, and the second display layer is an untrusted layer; the untrusted layer is a display layer which is preset by the electronic equipment and is impenetrable to the touch event; alternatively,
the second display layer covers the first display layer, the second display layer is a display layer which is invisible to a user, and the transparency of the second display layer is larger than or equal to the transparency threshold.
7. The method according to claim 1, wherein the determining, by the electronic device, whether a second display layer exists in all display layers to block the first display layer comprises:
and the electronic equipment traverses each display layer with higher priority than the priority of the first display layer according to the priorities of all the display layers, and judges whether the second display layer exists in all the display layers to shield the first display layer.
8. The method of claim 1, further comprising:
and if the second display layer does not exist in all the display layers to shield the first display layer, the electronic equipment distributes the touch event to the first display window and executes a touch instruction of the touch event.
9. An electronic device, comprising: a touch screen, memory, and one or more processors; the touch screen, the memory and the processor are coupled; the memory for storing computer program code, the computer program code comprising computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the method of any of claims 1-8.
10. A computer-readable storage medium comprising computer instructions; the computer instructions, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-8.
CN202210798657.1A 2022-07-08 2022-07-08 Frozen screen processing method, electronic equipment and storage medium Active CN114879896B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210798657.1A CN114879896B (en) 2022-07-08 2022-07-08 Frozen screen processing method, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN114879896A true CN114879896A (en) 2022-08-09
CN114879896B CN114879896B (en) 2023-05-12

Family

ID=82683131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210798657.1A Active CN114879896B (en) 2022-07-08 2022-07-08 Frozen screen processing method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114879896B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110246916A1 (en) * 2010-04-02 2011-10-06 Nokia Corporation Methods and apparatuses for providing an enhanced user interface
US20140372938A1 (en) * 2013-06-14 2014-12-18 Lg Electronics Inc. Mobile terminal and method of controlling the mobile terminal
CN110489043A (en) * 2019-07-31 2019-11-22 华为技术有限公司 A kind of management method and relevant apparatus of suspension windows
CN110703961A (en) * 2019-08-26 2020-01-17 北京达佳互联信息技术有限公司 Cover layer display method and device, electronic equipment and storage medium
CN111061410A (en) * 2018-10-16 2020-04-24 华为技术有限公司 Screen freezing processing method and terminal
CN111273841A (en) * 2020-02-11 2020-06-12 天津车之家数据信息技术有限公司 Page processing method and mobile terminal
CN111309429A (en) * 2020-02-26 2020-06-19 维沃移动通信有限公司 Display method and electronic equipment
CN112835472A (en) * 2021-01-22 2021-05-25 青岛海信移动通信技术股份有限公司 Communication terminal and display method
CN113282361A (en) * 2021-04-21 2021-08-20 荣耀终端有限公司 Window processing method and electronic equipment
CN113448442A (en) * 2021-07-12 2021-09-28 交互未来(北京)科技有限公司 Large screen false touch prevention method and device, storage medium and equipment


Also Published As

Publication number Publication date
CN114879896B (en) 2023-05-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant