CN116028148B - Interface processing method and device, and electronic device

Info

Publication number
CN116028148B
Authority
CN
China
Prior art keywords: application, interface, displayed, party, pulling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211017564.7A
Other languages
Chinese (zh)
Other versions
CN116028148A
Inventor
刘雅坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211017564.7A
Publication of CN116028148A
Application granted
Publication of CN116028148B
Legal status: Active


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose an interface processing method and apparatus, and an electronic device, applicable to the technical field of data processing. The method includes: in response to an application switching operation for switching from a target application to a first application, starting to destroy the currently displayed interface of the target application; during the destruction of the currently displayed interface, identifying the pulling-up party of the application to be displayed, to which the interface to be displayed belongs, where the pulling-up party of the application to be displayed is the application program that pulls up the application to be displayed; and when the pulling-up party of the application to be displayed contains the target application, canceling the display of the interface to be displayed. The embodiments of the present application can mitigate interface flash.

Description

Interface processing method and device and electronic equipment
Technical Field
The present disclosure relates to the field of data processing, and in particular, to an interface processing method, an interface processing device, and an electronic device.
Background
In practice, it has been found that when an electronic device returns from an application to the desktop, interface flash sometimes occurs; that is, the electronic device suddenly displays an application interface on its own. For example, a user operates the electronic device to return from application A to the desktop, and while returning from application A to the desktop the electronic device suddenly displays an interface of application A. Interface flash severely degrades the user experience.
Disclosure of Invention
In view of this, the embodiments of the present application provide an interface processing method and apparatus, and an electronic device, which can mitigate interface flash.
A first aspect of an embodiment of the present application provides an interface processing method, including:
In response to an application switching operation for switching from a target application to a first application, destruction of the currently displayed interface of the target application is started. During the destruction of the currently displayed interface, the pulling-up party of the application to be displayed, to which the interface to be displayed belongs, is identified. When the pulling-up party of the application to be displayed contains the target application, the display of the interface to be displayed is canceled. The pulling-up party of the application to be displayed is the application program that pulled up the application to be displayed.
By actively detecting the pulling-up party of the interface to be displayed and intercepting its display in time when that pulling-up party contains the application whose interface is being destroyed, the interface flash caused by a background application pulling up an application interface while the electronic device switches applications can be avoided. This reduces the probability of interface flash and thus mitigates it.
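For illustration only (this code is not part of the original disclosure), the flow described above can be modeled with the following minimal Java sketch. All class, method, and field names are assumptions, and the actual framework hooks are not specified here.

```java
import java.util.List;

// Hypothetical model of the flow described in the first aspect; names are
// illustrative and not actual framework APIs.
public class InterfaceProcessor {

    private volatile boolean destroyingTargetInterface;
    private String targetApp;   // identifier of the application being switched away from

    // Called when the application switching operation (e.g. return-to-desktop) is detected.
    public void onApplicationSwitch(String targetApp) {
        this.targetApp = targetApp;
        this.destroyingTargetInterface = true;    // destruction of the current interface begins
    }

    // Called, while destruction is in progress, for every interface waiting to be displayed.
    public void onInterfaceReadyToDisplay(PendingInterface pending) {
        if (!destroyingTargetInterface) {
            return;                               // destruction finished: display normally
        }
        // The pulling-up parties are the applications that pulled up the app to be displayed.
        if (pending.pullUpParties().contains(targetApp)) {
            pending.cancelDisplay();              // intercept: avoid the interface flash
        }
    }

    public void onInterfaceDestroyed() {
        destroyingTargetInterface = false;        // stop intercepting once destruction completes
    }

    // Minimal abstraction of an interface that has been drawn and is waiting for display.
    public interface PendingInterface {
        List<String> pullUpParties();             // direct and/or indirect pulling-up parties
        void cancelDisplay();
    }
}
```

The key design point in this sketch is that interception is active only while destruction of the target interface is in progress, so displays after the switch completes are unaffected.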
As one embodiment of the present application, the following three optional application switching operation modes may be provided:
Operation mode 1: when the electronic device detects that the user performs a specific operation gesture, it determines that an application switching operation of the user has been detected.
Operation mode 2: the electronic device determines that an application switching operation of the user has been detected after detecting that the user presses a physical or virtual application-switching key.
Operation mode 3: when the electronic device receives an externally transmitted application switching instruction, it determines that an application switching operation of the user has been detected.
In a first possible implementation manner of the first aspect, the first application is the desktop, and the application switching operation is a return-to-desktop operation.
In this embodiment, the scenario of returning from an application to the desktop is optimized. The interface flash caused by a background application pulling up an application interface while the electronic device returns from the application to the desktop can be avoided, which reduces the probability of interface flash and thus mitigates it.
In a second possible implementation manner of the first aspect, identifying the pulling-up party of the application to be displayed includes:
identifying a direct pulling-up party and/or an initial pulling-up party of the application to be displayed; and when the identified direct pulling-up party and/or initial pulling-up party is the target application, determining that the pulling-up party of the application to be displayed contains the target application. The direct pulling-up party is the application program that directly pulled up the application to be displayed, and the initial pulling-up party is the first application program in a relationship in which a plurality of application programs, including the application to be displayed, are successively pulled up.
In this embodiment, the pulling-up party covers both the direct pulling-up party and the initial pulling-up party. Both the case in which the target application directly pulls up the interface to be displayed and the case in which the target application, as the initial pulling-up party, pulls it up through several successive pull-ups are identified, and the determination is made when the target application is the direct and/or initial pulling-up party of the interface to be displayed. The interface to be displayed that was pulled up by the target application can therefore be identified more comprehensively and accurately, and intercepted precisely, which improves the mitigation of interface flash while the electronic device switches applications.
On the basis of the second possible implementation manner of the first aspect, as a third possible implementation manner of the first aspect, the operation of identifying the initial pulling-up party includes:
obtaining a pull-up relationship that contains the application to be displayed, where the pull-up relationship is relationship data, recorded during the destruction of the currently displayed interface, describing the pull-ups that occurred between application programs; and
determining the first application program in the pull-up relationship to obtain the initial pulling-up party of the application to be displayed.
In this embodiment, the electronic device records the pull-up relationship data generated while the application is being switched. By querying the pull-up relationship, the electronic device can quickly and accurately determine the real pulling-up party (i.e., the initial pulling-up party) of the interface to be displayed, which improves both the accuracy and the efficiency of pulling-up party identification.
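As an illustration (not part of the disclosure), if the recorded pull-up relationship is assumed to be an ordered list of application identifiers — the disclosure does not fix the data structure — the initial pulling-up party is simply its first entry:

```java
import java.util.List;

// Minimal sketch only; class and field names are assumptions. The pull-up relationship
// recorded while the current interface is being destroyed is kept as an ordered list,
// e.g. [target, a, b, c].
final class PullUpRelation {
    private final List<String> orderedApps;   // first pulling-up party ... last pulled-up app

    PullUpRelation(List<String> orderedApps) {
        this.orderedApps = orderedApps;
    }

    // The initial pulling-up party of the application to be displayed is the first
    // application program in the recorded pull-up relationship.
    String initialPullUpParty() {
        return orderedApps.isEmpty() ? null : orderedApps.get(0);
    }
}
```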
In a fourth possible implementation manner of the first aspect, identifying the pulling-up party of the application to be displayed includes:
identifying a direct pulling-up party and/or an indirect pulling-up party of the application to be displayed. The direct pulling-up party is the application program that directly pulled up the application to be displayed, and an indirect pulling-up party is an application program that, in a relationship in which a plurality of application programs including the application to be displayed are successively pulled up, is pulled up earlier than the application to be displayed.
When the identified direct pulling-up party and/or indirect pulling-up party contains the target application, it is determined that the pulling-up party of the application to be displayed contains the target application.
In this embodiment, the pulling-up party covers both the direct pulling-up party and the indirect pulling-up party. Both the case in which the target application directly pulls up the interface to be displayed and the case in which it indirectly pulls it up within a relationship of several successive pull-ups are identified, and the determination is made when the target application is a direct and/or indirect pulling-up party of the interface to be displayed. The interface to be displayed that was pulled up by the target application can therefore be identified more comprehensively and accurately, and intercepted precisely, which improves the mitigation of interface flash while the electronic device switches applications.
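A minimal sketch of this determination, assuming the direct pulling-up party and the list of indirect pulling-up parties have already been obtained (names are illustrative, not from the disclosure):

```java
import java.util.List;

// Minimal sketch only; names are assumptions. Decides whether the pulling-up party of
// the application to be displayed contains the target application, given its direct
// pulling-up party and its indirect pulling-up parties (all applications pulled up
// before it in the continuous pull-up relationship).
final class PullUpPartyCheck {
    static boolean containsTarget(String directPullUpParty,
                                  List<String> indirectPullUpParties,
                                  String targetApp) {
        return targetApp.equals(directPullUpParty)
                || indirectPullUpParties.contains(targetApp);
    }
}
```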
In a fifth possible implementation manner of the first aspect, canceling the display of the interface to be displayed includes:
deleting the display task of the interface to be displayed; or intercepting the display task when its execution is detected.
In this embodiment, to cancel the interface to be displayed, the corresponding display task may be deleted, or, optionally, intercepted while it is being executed. Actively deleting the display task prevents it from ever being executed, which avoids the display of the interface to be displayed more reliably.
In a sixth possible implementation manner of the first aspect, deleting the display task of the interface to be displayed includes:
reading the list of interfaces waiting to be displayed and deleting the display task of the interface to be displayed from that list.
In this embodiment, the electronic device stores interfaces that have been drawn and are waiting for display in a list. Deleting the display task of the interface to be displayed from that list therefore implements the deletion of the display task.
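A minimal sketch of this deletion, assuming the waiting interfaces are kept in a simple in-memory list (the actual list structure used by the electronic device is not specified in the disclosure, and the names below are assumptions):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: interfaces that have been drawn and are waiting for display are kept
// in a list; canceling a display means removing the corresponding display task from it.
final class DisplayTaskQueue {
    private final List<DisplayTask> waitingInterfaces = new ArrayList<>();

    synchronized void enqueue(DisplayTask task) {
        waitingInterfaces.add(task);
    }

    // Deleting the display task of the interface to be displayed: once removed,
    // the task is never executed, so the interface is never shown.
    synchronized boolean cancelDisplay(String interfaceToCancel) {
        return waitingInterfaces.removeIf(task -> task.interfaceName.equals(interfaceToCancel));
    }

    static final class DisplayTask {
        final String interfaceName;
        DisplayTask(String interfaceName) { this.interfaceName = interfaceName; }
    }
}
```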
In a seventh possible implementation manner of the first aspect, a status flag bit is set in the electronic device. The flag bit is set when destruction of the currently displayed interface of the target application starts, and restored after the destruction is complete. Accordingly, as long as the flag bit has not been restored, the destruction of the target application's interface is regarded as incomplete, and the operations of identifying the pulling-up party and canceling the display may be performed; once the flag bit is restored, those operations may stop.
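A minimal sketch of the status flag bit, using the example name returnHome mentioned later in the description; everything else is an assumption made for this sketch:

```java
// Minimal sketch of the status flag bit; names other than "returnHome" are assumptions.
final class DestroyStateFlag {
    private volatile boolean returnHome;   // set while the target application's interface is being destroyed

    void onDestroyStarted()  { returnHome = true;  }   // set the flag when destruction starts
    void onDestroyFinished() { returnHome = false; }   // restore the flag when destruction completes

    // While the flag is set, pulling-up party identification and display cancellation run;
    // once the flag is restored, those operations stop.
    boolean shouldInterceptDisplays() { return returnHome; }
}
```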
As an embodiment of the present application, when a GUI of the target application itself is to be displayed, the application to be displayed and the pulling-up party of the application to be displayed are both the target application. In this case, the operation of canceling the display of the interface to be displayed may not be performed, and the interface to be displayed may instead be displayed normally.
A second aspect of an embodiment of the present application provides an interface processing apparatus, including:
a destroying module, configured to start destroying the currently displayed interface of the target application in response to an application switching operation for switching from the target application to a first application;
a pulling-up party identification module, configured to identify, during the destruction of the currently displayed interface, the pulling-up party of the application to be displayed, to which the interface to be displayed belongs, where the pulling-up party of the application to be displayed is the application program that pulled up the application to be displayed; and
a display cancellation module, configured to cancel the display of the interface to be displayed when the pulling-up party of the application to be displayed contains the target application.
In a third aspect, embodiments of the present application provide an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the method according to any one of the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which, when executed by a processor, implements a method as in any of the first aspects described above.
In a fifth aspect, embodiments of the present application provide a chip system, the chip system including a processor, the processor being coupled to a memory, the processor executing a computer program stored in the memory to implement a method as described in any one of the first aspects. The chip system can be a single chip or a chip module composed of a plurality of chips.
In a sixth aspect, embodiments of the present application provide a computer program product for, when run on an electronic device, causing the electronic device to perform the method of any one of the first aspects.
It will be appreciated that the advantages of the second to sixth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
FIG. 1a is a schematic diagram of a scenario without interface flash according to an embodiment of the present application;
FIG. 1b is a schematic diagram of a scenario in which interface flash occurs according to an embodiment of the present application;
FIG. 1c is a schematic diagram of another scenario in which interface flash occurs according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of an interface processing method according to an embodiment of the present application;
FIG. 3 is a timing diagram of an electronic device returning to the desktop according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a pull-up scenario according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of an interface processing method according to an embodiment of the present application;
FIG. 6 is a schematic flowchart of an interface processing method according to an embodiment of the present application;
FIG. 7 is a timing diagram of an interface processing method according to an embodiment of the present application;
FIG. 8 is a timing diagram of another interface processing method according to an embodiment of the present application;
FIG. 9 is a comparative timing diagram of several application scenarios according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of an interface processing apparatus according to an embodiment of the present application;
FIG. 11a is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
FIG. 11b is a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The interface processing method provided by the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, and wearable devices. The electronic device is the execution body of the interface processing method, and the embodiments of the present application do not limit the specific type of the electronic device.
In practical applications, it has been found that when the electronic device switches from an application to another application (for example, returns to the desktop), interface flash sometimes occurs; that is, the electronic device suddenly displays an application interface on its own.
By way of example, assume that a user, while using a video player, operates the electronic device to exit the video player and return to the desktop. The electronic device then plays the desktop return animation until the animation finishes and the desktop is reached; (1), (2), (3), and (4) in FIG. 1a illustrate the desktop return animation played when returning from the video player to the desktop. Assume further that, after returning to the desktop, the user operates the electronic device to open the file manager. The electronic device then starts the file manager and displays its interface, shown as (5) in FIG. 1a. FIG. 1a as a whole therefore shows the normal case, without interface flash, of returning from an application to the desktop.
Referring to FIG. 1b, a schematic diagram of a scenario in which interface flash occurs, based on FIG. 1a. In this scenario, after the electronic device has returned to the desktop normally, the user operates the electronic device to open the file manager; see (4), (6), and (5) in FIG. 1b. The difference from FIG. 1a is that the electronic device suddenly displays the interface of the video player first ((6) in FIG. 1b) and only then opens the interface of the file manager. In other words, the video player interface, which the user did not open, displays itself before the file manager interface.
Referring to FIG. 1c, another schematic diagram of a scenario in which interface flash occurs, based on FIG. 1a. In this scenario, after the electronic device has returned to the desktop normally, the user operates the electronic device to open the file manager; see (4), (7), and (5) in FIG. 1c. The difference from FIG. 1a is that the electronic device suddenly displays the interface of the camera first ((7) in FIG. 1c) and only then opens the interface of the file manager. In other words, the camera interface, which the user did not open, displays itself before the file manager interface.
For another example, in other scenarios where interface flash occurs, the interface of the video player, or some other application interface, may suddenly display itself while the electronic device is playing the desktop return animation, after which the desktop return animation continues or the desktop is shown directly.
Investigating the interface flash, the inventor found that when the electronic device switches from an application to another application, the interface of that application is destroyed at the same time. Because interface destruction lasts for a period of time, the application being destroyed remains in the foreground during that period. If, at this time, the application pulls up an interface of itself or of another application, the electronic device displays the pulled-up interface, and interface flash occurs. The pulled-up interfaces that flash while switching to another application were not opened by the user and do not match the interface that the user's switching operation is expected to display.
Interface flash interferes with the user's normal use of the electronic device and severely degrades the user experience. For example, it may cause the user to operate the electronic device by mistake, or disturb the user while returning to the desktop or while using other applications after returning to the desktop. A solution is therefore needed that mitigates interface flash when the electronic device switches applications.
It can be seen from the above analysis that interface flash is caused by a background application pulling up its own interface, or another application's interface, while the application's interface has not yet been fully destroyed. In practice, for the return-to-desktop scenario, interface destruction starts at the same time as the desktop return animation starts playing, whereas the time at which the destruction completes may or may not coincide with the time at which the animation finishes and the desktop is reached. Therefore, in the embodiments of the present application, the electronic device displaying an application interface not opened by the user when returning from an application to the desktop covers both cases: the electronic device displaying such an interface on its own while returning from the application to the desktop, and displaying it on its own after the desktop has been reached.
To prevent a background application from pulling up an application interface while the electronic device returns from an application to the desktop, and thereby mitigate interface flash, the electronic device in the embodiments of the present application behaves as follows. While displaying the interface of an application, the electronic device responds when it detects an application switching operation (such as a return-to-desktop operation): it starts switching applications and simultaneously starts destroying the interface of that application. During the destruction of the application interface, if it detects that an application interface waiting for display has been drawn (referred to herein simply as an interface to be displayed), it checks the pulling-up party (callingUid) of the interface to be displayed. When the pulling-up party of the interface to be displayed is the application being moved to the background, the display of the interface to be displayed is canceled, because a pulled-up application interface is not an interface that the user opened. By actively detecting the pulling-up party of the interface to be displayed and intercepting its display in time when the pulling-up party is an application that the user intends to leave in the background, the interface flash caused by a background application pulling up an application interface while the electronic device switches applications can be avoided, which reduces the probability of interface flash and thus mitigates it.
Some concepts that may be involved in the embodiments of the present application are described here:
Interface: in the embodiments of the present application, an interface of an application is an activity of the application, sometimes also referred to as a page or a window.
Destroying an interface: deleting the information related to the activity and performing operations such as reclaiming memory resources.
Desktop return animation: the animation or dynamic effect played by the electronic device while returning from an application to the desktop, also referred to as the desktop return effect or the desktop return animation effect.
Target application, first application, pulled-up application, and application to be displayed: in the embodiments of the present application, in a scenario in which the electronic device switches from one application to another, the application before the switch (i.e., the application moving to the background) is referred to as the target application, and the application to be switched to (i.e., the application after the switch) is referred to as the first application. In the scenario of returning from an application to the desktop, the first application is the desktop. An application that is directly or indirectly pulled up by the target application is referred to as a pulled-up application. The application to which the interface to be displayed belongs is referred to as the application to be displayed. When the target application pulls up itself, the pulled-up application may also be the target application itself.
The application programs in the embodiments of the present application may be conventional application programs installed through an installation package, or other types of application programs such as applets or web applications (e.g., HTML5 applications).
Pull-up, pulling-up party, and pulled-up party: in the embodiments of the present application, for two different application programs, pull-up means that one application program starts an interface of the other application program by means of associated starting or calling. For an application pulling up itself, pull-up means that the application starts one of its own interfaces. Since what is started is always a specific interface, in the embodiments of the present application, saying that the pulling-up party of application B is application A and saying that the pulling-up party of an interface of application B is application A have the same meaning. The specific manner of pulling up, and the specific type and content of the interface being started, are not limited here.
The pulling-up party is the application program that actively performs the pull-up operation in a pull-up scenario, and the pulled-up party is the application program that is pulled up. In the scenario where an application pulls up itself, the application is both the pulling-up party and the pulled-up party.
For example, when application A pulls up another application B, application A starts an interface of application B by means of associated starting or calling. In this case, the pulling-up party is application A and the pulled-up party is application B.
Direct pulling-up party, indirect pulling-up party, and initial pulling-up party: the direct pulling-up party of an application is the application program that directly pulls up that application's interface. The indirect pulling-up party and the initial pulling-up party mainly apply to the case in which a plurality of application programs are pulled up in succession (a continuous pull-up involves at least two pull-up actions). The initial pulling-up party is the application program that acts as the first pulling-up party in the continuous pull-up relationship; that is, for all the applications pulled up in the chain, the initial pulling-up party is the first application in that relationship. For a given application, an indirect pulling-up party is any application program that comes earlier than itself in the continuous pull-up relationship. Therefore, in a continuous pull-up scenario, the indirect pulling-up parties include the initial pulling-up party and the direct pulling-up party.
As an example, assume that the continuous pull-up relationship among the target application and applications a, b, and c is: the target application, as the first pulling-up party, pulls up application a; application a, once pulled up, acts as a new pulling-up party and pulls up application b; and application b, once pulled up, acts as a new pulling-up party and pulls up application c. The direct pulling-up parties of applications a, b, and c are then the target application, application a, and application b, respectively, and the initial pulling-up party of all three is the target application. For application b, the indirect pulling-up parties include the target application and application a; for application c, they include the target application, application a, and application b.
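The example above can be restated as a short, runnable illustration (code added here for clarity only; it is not part of the original disclosure):

```java
import java.util.Arrays;
import java.util.List;

// Worked restatement of the example above: the continuous pull-up relation is
// target -> a -> b -> c.
public class PullUpRolesExample {
    public static void main(String[] args) {
        List<String> chain = Arrays.asList("target", "a", "b", "c");

        for (int i = 1; i < chain.size(); i++) {
            String app = chain.get(i);
            String directPullUp = chain.get(i - 1);              // immediate predecessor
            String initialPullUp = chain.get(0);                 // first application in the relation
            List<String> indirectPullUps = chain.subList(0, i);  // everything pulled up before 'app'
            System.out.printf("%s: direct=%s, initial=%s, indirect=%s%n",
                    app, directPullUp, initialPullUp, indirectPullUps);
        }
        // Prints:
        // a: direct=target, initial=target, indirect=[target]
        // b: direct=a, initial=target, indirect=[target, a]
        // c: direct=b, initial=target, indirect=[target, a, b]
    }
}
```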
The application scenario of the embodiment of the present application is described herein:
the embodiment of the application can be suitable for switching the electronic device from any application program with a display interface to other application programs, such as a scene of returning to a desktop from an application program display interface. On this basis, the embodiment of the application does not limit the application type of the target application and the like.
In some application scenarios, the target application may be some application program built into the electronic device. Such as phone applications, text messaging applications, browser applications, audio and video player applications, etc. built into the electronic device. The target application may also be a third party application installed by the user in the electronic device, etc. Such as some instant messaging applications, browser applications, video applications, and the like.
To illustrate the technical solution described in the present application, the following description uses the example of returning from an application to the desktop, by way of specific embodiments.
FIG. 2 shows a schematic implementation flowchart of an interface processing method according to an embodiment of the present application, described in detail below:
S201: when the electronic device detects the user's return-to-desktop operation while displaying the interface of the target application, it plays the desktop return animation and starts destroying the interface of the target application.
A user often needs to return to the desktop while using an application, either switching the application to the background or ending it. For example, suppose a user listening to songs with a music player wants to chat with an instant messaging application. The user can return from the music player to the desktop, that is, switch the music player to the background, and then open the instant messaging application from the desktop. As another example, suppose a user wants to close a video player after watching a video. The user may choose to exit the video player directly; after detecting the user's exit operation, the electronic device ends the application and returns to the desktop. Whether the application is switched to the background or ended, the electronic device destroys the interface of the application in the process of returning to the desktop. The interface displayed by the target application before destruction may be referred to as the currently displayed interface.
The embodiments of the present application do not restrict the manner of operation by which the electronic device returns from an application to the desktop; it can be determined according to the actual configuration of the electronic device. As an alternative embodiment of the present application, the following optional operation modes may be provided to implement the return from an application to the desktop:
Operation mode 1: a technician presets, or the user selects or sets, a specific operation gesture for returning to the desktop. When the electronic device detects that the user performs the specific operation gesture, it determines that the user's return-to-desktop operation has been detected.
The types and number of gestures included in the specific operation gesture are not limited here, and include, for example, a swipe-up gesture starting from the bottom of the screen or a slide gesture starting from the side of the screen.
Operation mode 2: the electronic device contains a physical or virtual key with a return-to-desktop function, and the user returns to the desktop by operating the key. For example, when the electronic device provides physical or virtual three-key navigation, the user may press the home key to return to the desktop with one key, or double-tap the return key in the three-key navigation to return to the desktop with one key. Alternatively, when the user is on the main interface of an application, pressing the return key can also implement the return-to-desktop operation. Therefore, in operation mode 2, the electronic device determines that the user's return-to-desktop operation has been detected after detecting that the user presses a physical or virtual return-to-desktop key.
Operation mode 3: the user controls the electronic device in a wired or wireless manner and sends a return-to-desktop instruction to it. When the electronic device receives the externally transmitted return-to-desktop instruction, it determines that the user's return-to-desktop operation has been detected. For example, the user may control the electronic device with a Bluetooth remote control and press its return-to-desktop key; the Bluetooth remote control generates a return-to-desktop instruction and sends it to the electronic device. When the electronic device receives the instruction, it determines that the user's return-to-desktop operation has been detected and starts returning to the desktop.
Similarly, the above operation modes 1 to 3 are also applicable to the case in which the target application is switched to an application other than the desktop, which is not repeated here.
In practical applications, the electronic device may support any one or more of the above operation modes 1 to 3 to implement the return from an application to the desktop. Other operation modes may also be added on this basis, which is not limited here.
As an alternative embodiment of the present application, the electronic device may support all three of the above operation modes at the same time. In that case, when the electronic device detects the specific operation gesture, detects that the user presses the return-to-desktop key, or receives a return-to-desktop instruction, it determines that the user's return-to-desktop operation has been detected, and the operation of S201 is triggered.
If the user's return-to-desktop operation is detected while the interface of the target application is being displayed, the electronic device starts returning to the desktop and simultaneously starts destroying the interface of the target application. In the embodiments of the present application there are two alternative ways for the electronic device to return to the desktop: one plays the desktop return animation, the other does not. S201 can therefore be generalized as: when the electronic device detects the user's return-to-desktop operation while displaying the interface of the target application, it starts returning to the desktop and starts destroying the interface of the target application. Which way is used can be determined according to the actual situation of the electronic device.
As an alternative embodiment of the present application, the return-to-desktop operation may also be triggered other than actively by the user. For example, the electronic device may trigger the return-to-desktop operation automatically through a scheduled task or the like. In that case, the operation of S201 is still performed, except that the detected return-to-desktop operation is not necessarily triggered by the user.
For the case in which the electronic device returns to the desktop by playing the desktop return animation, the embodiments of the present application do not restrict the content of the animation. For example, in an alternative embodiment, one or more fixed animations or animation contents may be preset by a technician as the desktop return animation and played when returning to the desktop. In other alternative embodiments, a generation rule for the desktop return animation may be preset by a technician; when the user's return-to-desktop operation is detected, the corresponding desktop return animation is generated according to the rule and played. For example, in some embodiments, the generation rule may be: gradually shrink the interface of the target application until it disappears, and then display the desktop.
As a specific embodiment of returning to the desktop in the present application, the above operation mode 1 is used to implement the return from an application to the desktop, where the specific operation gesture is a swipe gesture starting from the bottom of the screen. Referring to FIG. 3, a timing diagram of returning to the desktop (not including the interface destruction process) according to an embodiment of the present application is described in detail below:
S301: when the electronic device detects the user pressing down at the bottom of the screen, it starts playing the recent-tasks (Recents) animation.
S302: when the electronic device detects the user's gesture starting from the bottom of the screen, it starts playing the desktop return (Home) animation.
S303: the desktop return animation finishes playing.
S304: the finish method of the recent-tasks animation is called to end the recent-tasks animation.
S305: a move-task-to-front (moveTaskToFront) operation pulls the desktop launcher to the foreground.
S306: the recent-tasks animation finishes playing.
The embodiments of the present application do not restrict how the application interface is destroyed; this can be determined according to the actual situation of the electronic device. Destroying may also be referred to as closing or ending.
S202: during the destruction of the interface of the target application, detect whether there is an interface to be displayed.
During the destruction of the interface of the target application, the detection of S202 may be performed multiple times, regardless of whether an application interface waiting to be displayed (i.e., an interface to be displayed) has already been detected, until the interface of the target application has been destroyed. Whenever an interface to be displayed exists, the operation of S203 is performed as well.
In practical applications, destroying an application interface takes a certain amount of time, during which the electronic device may take the following actions:
Possible action 1: the electronic device plays the desktop return animation.
Possible action 2: in response to a user operation to open a new application, the electronic device opens the new application.
Possible action 3: the target application pulls up itself or another application (i.e., the pulled-up application).
Possible action 3 involves two pull-up cases:
Pull-up case 1: the target application directly pulls up itself or another application program. This is a single pull-up, with only one pulling-up party and one pulled-up party.
Pull-up case 2: the target application, as the initial pulling-up party, pulls up one application, and that application in turn acts as a new pulling-up party and pulls up the next application, producing a continuous pull-up. The number of successive pull-ups depends on the actual situation. In this case there are several pulling-up parties and several pulled-up parties, and the target application is the first pulling-up party. For the application that is pulled up last, the target application, although not its direct pulling-up party, is its initial pulling-up party.
Pull-up cases 1 and 2 are illustrated by assuming that the electronic device contains the target application and applications a, b, c, and e. Referring to FIG. 4, a schematic diagram of a pull-up scenario according to an embodiment of the present application. In pull-up case 1, the target application acts as the pulling-up party and pulls up application e while the electronic device is destroying the target application's interface. In pull-up case 2, the target application, as the initial pulling-up party, pulls up application a while the electronic device is destroying the target application's interface; application a, once pulled up, acts as a new pulling-up party and pulls up application b; and application b, once pulled up, acts as a new pulling-up party and pulls up application c. During this period, the target application, application a, and application b are all pulling-up parties, while application a, application b, and application c are all pulled-up applications whose initial pulling-up party is the target application. It can also be understood that, for each of the pulled-up applications a, b, and c, either its direct pulling-up party or one of its indirect pulling-up parties is the target application.
In practice it has been found that any of the above possible actions may occur at any point in time during the destruction of the target application's interface. When possible action 3 occurs, that is, when the target application pulls up itself or another application, the electronic device draws and displays the interface of the pulled-up application. If the electronic device is simultaneously performing possible action 1 or 2, the pulled-up interface may suddenly be displayed in the middle of possible action 1 or 2, which is exactly the interface flash. For example, when possible action 3 occurs while possible action 1 is being performed, interface flash may occur during the playback of the desktop return animation (i.e., during the return to the desktop). When possible action 3 occurs while possible action 2 is being performed, the interface flash may appear first and the new application may only be opened normally afterwards when the user opens a new application after the electronic device has returned to the desktop.
To avoid interface flash, during the destruction of the interface of the target application, the electronic device continuously detects whether there is an application interface waiting to be displayed. If an interface to be displayed exists, it further identifies whether the application to which the interface belongs is a pulled-up application.
S203: during the destruction of the interface of the target application, if an interface to be displayed is detected, detect whether the pulling-up party of the application program to which the interface to be displayed belongs contains the target application.
In practical applications, an interface to be displayed on the electronic device is first drawn. In some embodiments, after the interface has been drawn, the electronic device places it in a list of interfaces to be displayed (also referred to as the waiting-interface list) and then displays the listed interfaces in order. In the embodiments of the present application, detecting whether an interface to be displayed exists can therefore be implemented by querying whether the list of interfaces to be displayed contains any interface.
As is apparent from the description of pull-up cases 1 and 2 above, an application in the electronic device may be pulled up by a single pull-up or by several successive pull-ups. When an application is pulled up through several successive pull-ups, the direct pulling-up party of the finally pulled-up application may not be the target application. Therefore, in the embodiments of the present application, the pulling-up party of an application covers at least the two cases of its direct pulling-up party and its indirect pulling-up parties, where the indirect pulling-up parties include the initial pulling-up party, i.e., the first pulling-up party in the continuous pull-up of pull-up case 2. On this basis, detecting whether the pulling-up party of the application program contains the target application may be implemented with any one or more of the following detection schemes:
Detection scheme 1: detect whether the direct pulling-up party of the application is the target application.
Detection scheme 2: detect whether the initial pulling-up party of the application is the target application.
Detection scheme 3: detect whether the indirect pulling-up parties of the application include the target application.
For the single pull-up scenario of pull-up case 1, the direct pulling-up party and the initial pulling-up party of the application to be displayed are the same, namely the target application. Detection schemes 1 and 2 can then both be implemented by reading the direct pulling-up party of the application to be displayed, with the same result. For example, whether the package name of the direct pulling-up party of the application to be displayed is consistent with the package name of the target application can be checked to determine whether the direct pulling-up party of the application to be displayed is the target application.
In the scenario of pull-up case 2, in which the application is pulled up through several successive pull-ups, the target application and the application to be displayed are very likely not in a direct pulling-up relationship. In practice it is therefore difficult to implement detection schemes 2 and 3 by merely reading the direct pulling-up party of the application to be displayed. For this reason, during the destruction of the interface of the target application, the electronic device in the embodiments of the present application records the pull-up relationship between application programs. Taking the embodiment shown in FIG. 4 as an example, for pull-up case 2 in FIG. 4, the embodiments of the present application may record the pull-up relationship among the target application, application a, application b, and application c. The manner of recording the pull-up relationship is not restricted; for example, in some embodiments, the pull-up relationship may be recorded as a relationship sequence, which may also be called a pull-up sequence, a start sequence, a jump sequence, or the like.
Based on the recorded pull-up relationship between application programs, when an interface to be displayed is detected, S203 may query the pull-up relationship corresponding to the application to be displayed and determine the initial pulling-up party (i.e., the first pulling-up party in the relationship). If the initial pulling-up party is the target application, it can be determined that the pulling-up party of the application to be displayed contains the target application, which implements detection scheme 2. For detection scheme 3, all pulling-up parties corresponding to the application to be displayed can be queried from the pull-up relationship, and when any of them is the target application, it can be determined that the pulling-up party of the application to be displayed contains the target application.
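A minimal sketch of the recording described above, assuming a single relationship sequence per destruction window (the disclosure leaves the recording format open, and all names below are assumptions):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Minimal sketch: while the target application's interface is being destroyed, every
// pull-up event extends an ordered relationship sequence, which detection schemes 2
// and 3 can later query.
final class PullUpRecorder {
    private final List<String> sequence = new ArrayList<>();   // e.g. [target, a, b, c]

    synchronized void onPullUp(String pullingParty, String pulledApp) {
        if (sequence.isEmpty()) {
            sequence.add(pullingParty);    // the first pulling-up party opens the sequence
        }
        sequence.add(pulledApp);
    }

    // Snapshot of the recorded pull-up relationship, ordered from the initial
    // pulling-up party to the most recently pulled-up application.
    synchronized List<String> snapshot() {
        return Collections.unmodifiableList(new ArrayList<>(sequence));
    }

    synchronized void clear() {            // reset once the destruction completes
        sequence.clear();
    }
}
```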
In practical applications, any one or more of the above detection schemes may be selected to implement the pulling-up party detection in S203. For example, in some alternative embodiments, detection schemes 1 and 2 may be used together; the electronic device then determines the direct pulling-up party and the initial pulling-up party of the application to be displayed, and if either or both of them is the target application, it can be determined that the pulling-up party of the application to be displayed contains the target application. Accordingly, referring to FIG. 5, S203 may be replaced with S501-S502.
S501: during the destruction of the interface of the target application, if an interface to be displayed is detected, identify the direct pulling-up party and the initial pulling-up party of the application program to which the interface to be displayed belongs.
S502: when the direct pulling-up party and/or the initial pulling-up party is the target application, determine that the pulling-up party of the application program to which the interface to be displayed belongs contains the target application.
In still other alternative embodiments, detection scheme 1 or 2 alone may be used. S501 may then be replaced with: during the destruction of the interface of the target application, if an interface to be displayed is detected, identify the direct pulling-up party or the initial pulling-up party of the application program to which the interface to be displayed belongs. S502 may be replaced with: when the direct pulling-up party or the initial pulling-up party is the target application, determine that the pulling-up party of the application program to which the interface to be displayed belongs contains the target application.
As an alternative embodiment of the present application, corresponding to pulling-up party detection using detection scheme 1, S203 may be replaced with:
during the destruction of the interface of the target application, if an interface to be displayed is detected, identifying the direct pulling-up party of the application program to which the interface to be displayed belongs; and
when the direct pulling-up party is the target application, determining that the pulling-up party of the application program to which the interface to be displayed belongs contains the target application.
As another alternative embodiment of the present application, corresponding to the case in which pulling-up party detection uses detection scheme 2 and the pull-up relationship is recorded, referring to FIG. 6, S203 may be replaced with S601-S603.
S601: during the destruction of the interface of the target application, if an interface to be displayed is detected, obtain the pull-up relationship of the application program to which the interface to be displayed belongs.
S602: determine the initial pulling-up party of the application program to which the interface to be displayed belongs according to the pull-up relationship.
S603: when the initial pulling-up party is the target application, determine that the pulling-up party of the application program to which the interface to be displayed belongs contains the target application.
As an optional embodiment of the present application, when the pulling-up party of the application program to which the interface to be displayed belongs does not contain the target application, no processing needs to be performed.
It should be understood that, in practical applications, the pull-up relationship between applications is sometimes described in terms of an application pulling up an interface of another application. For example, assume that application a pulls up application b and the home interface of application b is the interface to be displayed. One may then say that the pulling-up party of application b, to which the interface to be displayed belongs, is application a, or equivalently that the pulling-up party of the interface to be displayed is application a. The two descriptions differ only in wording and are substantially the same. On this basis, in S203 and the other related steps, "detecting the pulling-up party of the application program to which the interface to be displayed belongs" may alternatively be described as "detecting the pulling-up party of the interface to be displayed"; the equivalent descriptions are not repeated one by one below.
And S204, when detecting that the pulling-up party of the application program of the interface to be displayed contains the target application, the electronic equipment cancels the display of the interface to be displayed.
When the fact that the target application is contained in the pulling-up party of the application to be displayed is detected, the application to be displayed is automatically pulled up by the target application. If the interface to be displayed of the application to be displayed is displayed normally, the interface flash condition can be caused. At this point, the embodiment of the present application may cancel the display of the interface to be displayed. The method for canceling the interface display is not limited too much.
As an optional specific embodiment of canceling the display of the interface to be displayed in the present application, the display task of the interface to be displayed may be deleted, so that the electronic device does not display the interface to be displayed any more. For example, in some application scenarios, after the electronic device draws the interface, all the interface display tasks are placed in one interface list to be displayed or an interface list. At this time, the embodiment of the application may remove the display task of the interface to be displayed from the interface list or list to be displayed. The display task of the interface to be displayed is not contained in the interface list or the list to be displayed after the rejection operation, so that the electronic equipment does not need to display the interface to be displayed any more, and the display of the interface to be displayed is canceled. At this time S204 may be replaced with: and when detecting that the pulling-up party of the application program to which the interface to be displayed belongs contains the target application, deleting the display task of the interface to be displayed by the electronic equipment.
As another optional embodiment of canceling the display of the interface to be displayed in the present application, the electronic device may instead actively intercept the display task of the interface to be displayed when it is about to be executed, so that the interface to be displayed is never shown. S204 may then be replaced with: when detecting that the pull-up party of the application program to which the interface to be displayed belongs includes the target application, the electronic device monitors the display task of the interface to be displayed, and intercepts the display task when its execution is detected.
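Likewise, the interception approach can be sketched as a check performed immediately before a display task runs. The names below are illustrative assumptions and do not correspond to any framework API.

```java
/** Illustrative interception point: before a display task is executed, it is
 *  checked once more and dropped if its pull-up party includes the target app. */
class DisplayTaskExecutor {
    interface PullUpChecker {
        boolean pulledUpByTarget(String owningApp);
    }

    private final PullUpChecker checker;

    DisplayTaskExecutor(PullUpChecker checker) {
        this.checker = checker;
    }

    /** Returns true when the interface was actually shown. */
    boolean execute(String owningApp, Runnable showInterface) {
        if (checker.pulledUpByTarget(owningApp)) {
            return false; // intercepted: the interface to be displayed is not shown
        }
        showInterface.run();
        return true;
    }
}
```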
As an optional embodiment of the present application, S202-S204, that is, the detection of the interface to be displayed, the identification of the pull-up party, and the cancellation of display, are performed during the destruction of the interface of the target application. To make it convenient for the electronic device to determine whether destruction of the target application interface is complete, the embodiment of the present application may preset a status flag bit that marks whether the destruction is complete. The status flag bit is set when destruction of the target application interface starts in S201, and is restored when the destruction is complete. Whether the destruction of the target application interface is complete can therefore be determined from whether the status flag bit has been restored: as long as the status flag bit has not been restored, the destruction is considered incomplete. The name of the status flag bit may be chosen by the technician and is not limited here; for example, it may be named returnHome.
As an optional embodiment of the present application, in order to prevent an abnormal status flag bit from making the electronic device believe indefinitely that destruction of the target application interface is not complete, the embodiment of the present application may add a timeout detection mechanism for the status flag bit. Timing starts when the status flag bit is set; if the elapsed time exceeds a preset duration threshold and the status flag bit is still set, the status flag bit is restored automatically. The specific value of the duration threshold is not limited here.
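The status flag bit and its timeout can be sketched as follows. Handler and Looper are standard Android SDK classes; the ReturnHomeFlag name, the field layout, and the 5-second threshold are assumptions made for this sketch only.

```java
import android.os.Handler;
import android.os.Looper;

/** Sketch of the "return home" status flag bit with the timeout described above. */
class ReturnHomeFlag {
    private static final long TIMEOUT_MS = 5000; // assumed duration threshold
    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Runnable timeoutReset = this::restore;
    private boolean destroying = false;

    /** Called when destruction of the target application interface starts (S201). */
    void set() {
        destroying = true;
        handler.postDelayed(timeoutReset, TIMEOUT_MS); // auto-restore on timeout
    }

    /** Called when destruction of the target application interface completes. */
    void restore() {
        destroying = false;
        handler.removeCallbacks(timeoutReset);
    }

    /** While true, pull-up identification and display cancellation stay active. */
    boolean isDestroying() {
        return destroying;
    }
}
```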
As an optional embodiment of the present application, the case where the interface to be displayed is a desktop interface or an interface of the call application may be treated as an exception. When it is detected in S203 that the pull-up party of the interface to be displayed includes the target application, the display-canceling operation of S204 may be skipped for these interfaces and normal display may continue.
As another optional embodiment of the present application, the case where the interface to be displayed is a picture-in-picture (Picture In Picture, PIP) interface of the target application may likewise be treated as an exception. When it is detected in S203 that the pull-up party of the interface to be displayed includes the target application, the display-canceling operation of S204 may be skipped and normal display may continue. In this way, the embodiment of the present application keeps the picture-in-picture content of the target application displayed normally.
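The two exceptions above, together with the general rule, can be summarized in a small decision helper. The enum values and method name are assumptions made for illustration only.

```java
/** Sketch of the exception handling: even when the pull-up party is the target
 *  application, the desktop, the call application, and the target application's
 *  picture-in-picture interface are still displayed normally. */
class CancelPolicy {
    enum InterfaceKind { NORMAL, DESKTOP, CALL, PICTURE_IN_PICTURE }

    static boolean shouldCancel(InterfaceKind kind, boolean pulledUpByTarget) {
        if (!pulledUpByTarget) {
            return false;              // not a flash candidate, display normally
        }
        switch (kind) {
            case DESKTOP:
            case CALL:
            case PICTURE_IN_PICTURE:
                return false;          // exception cases: keep normal display
            default:
                return true;           // cancel display to avoid an interface flash
        }
    }
}
```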
In the embodiment of the present application, when the electronic device detects a return-to-desktop operation while displaying the interface of the target application, it starts returning to the desktop and synchronously starts destroying the interface of the target application. During the destruction of the target application interface, if an interface to be displayed is detected, the pull-up party of the interface to be displayed (which may also be described as the pull-up party of the application to be displayed) is checked. When the direct pull-up party and/or the initial pull-up party is the target application, the display task of the interface to be displayed is actively deleted or intercepted. Therefore, during the destruction of the interface of the target application, the embodiment of the present application can prevent the electronic device from automatically displaying application interfaces that were pulled up by the target application rather than opened by the user.
The root cause of the interface flash when the electronic device switches to another application is as follows: when the electronic device switches to another application, the interface of the background application is destroyed synchronously, and the application being destroyed automatically pulls up an interface of itself or of another application. Therefore, by canceling the display of the application interfaces pulled up by the target application during the destruction of the target application's interface, the interface flash caused by a background application pulling up an application interface during application switching can be avoided, and the interface flash problem is thereby improved.
As a specific embodiment of implementing interface processing in the present application, an upward swipe gesture starting from the bottom of the screen is used as the operation for returning from the target application to the desktop, and a return home status flag bit is set to mark whether destruction of the target application interface is complete. In the embodiment of the present application, the electronic device includes an input management (input) module, a desktop management module, and a window management module (WindowManagerService, WMS). Referring to fig. 7, an interface processing timing diagram provided in an embodiment of the present application is described in detail below:
S701, when the input management module detects, while the interface of the target application is displayed, that the user presses down at the bottom of the screen, it reports the touch event to the desktop management module.
The target application may also be referred to as a background application at this time.
S702, the desktop management module identifies the received touch event and reports the touch event to the window management module.
S703, after the window management module receives the touch event, it plays the recent tasks (Recents) animation.
S704, when the input management module detects that the finger of the user is lifted, the gesture track characteristics are reported to the desktop management module.
S705, the desktop management module recognizes, from the gesture track characteristics, the user's upward swipe gesture starting from the bottom of the screen, and reports the information of the upward swipe gesture to the window management module.
S706, after receiving the information of the upward swipe gesture, the window management module starts the interface processing operation. Specifically, it sets the return home status flag bit, starts destroying the interface of the target application, records the target application, and starts playing the return-to-desktop animation.
S707, during the control window in which the return home status flag bit has not been restored, the window management module checks, for each interface to be displayed that has been drawn, whether its direct pull-up party is the target application.
During this control window, any interface pulled up by the target application would in theory cause an interface flash, so detection of the direct pull-up party is required.
S708, if the direct pulling-up party of the interface to be displayed is the target application, the window management module deletes the display task of the interface to be displayed so as to intercept the interface to be displayed.
The direct pull-up party, which may also be called the last pull-up party, is the application program that directly pulls up the interface to be displayed.
S709, after the return-to-desktop animation finishes playing, the window management module pulls the desktop to the foreground (i.e., sets the desktop as the foreground application).
S710, when the interface destruction of the target application is completed, the window management module restores the return home state flag bit.
At this point, the interface processing operation is marked as complete, and the process of intercepting interfaces that the target application started in the background without being opened by the user ends.
In this embodiment, details and beneficial effects of operations such as detection of an interface to be displayed, identification of a pulling-up party, cancellation of display, and the like may refer to related descriptions in the embodiments shown in fig. 2 to 6, and are not repeated here.
In the embodiment of the present application, the interface to be displayed is handled by detecting its direct pull-up party and intercepting it when the direct pull-up party is the target application. The embodiment of the present application can thereby prevent the electronic device, when returning to the desktop, from displaying an application interface that was directly pulled up by the target application rather than opened by the user.
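As a sketch under the same assumptions, the fig. 7 style flow can be reduced to three callbacks on the window management side. It reuses the ReturnHomeFlag sketch above, and none of the names correspond to actual WindowManagerService APIs.

```java
/** Illustrative fig. 7 style flow: while the return home flag is set, any newly
 *  drawn interface whose direct (last) pull-up party is the target application
 *  has its display task deleted. */
class WindowFlowSketch {
    interface TaskSink {
        void deleteDisplayTask(String interfaceId);
    }

    private final ReturnHomeFlag flag;   // sketch from the earlier example
    private final TaskSink sink;
    private String targetApp;

    WindowFlowSketch(ReturnHomeFlag flag, TaskSink sink) {
        this.flag = flag;
        this.sink = sink;
    }

    /** S706: upward swipe recognized, record the target application and set the flag. */
    void onSwipeUpFromTarget(String targetApp) {
        this.targetApp = targetApp;
        flag.set();
    }

    /** S707/S708: called whenever an interface finishes drawing. */
    void onInterfaceDrawn(String interfaceId, String directPuller) {
        if (flag.isDestroying() && targetApp.equals(directPuller)) {
            sink.deleteDisplayTask(interfaceId); // intercept the flash interface
        }
    }

    /** S710: destruction complete, restore the flag. */
    void onTargetInterfaceDestroyed() {
        flag.restore();
    }
}
```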
As another specific embodiment of implementing interface processing in the present application, an upward swipe gesture starting from the bottom of the screen is used as the operation for returning from the target application to the desktop, and a return home status flag bit is set to mark whether destruction of the target application interface is complete. In the embodiment of the present application, the electronic device includes an input management (input) module, a desktop management module, and a window management module (WindowManagerService, WMS). Referring to fig. 8, another interface processing timing diagram provided in an embodiment of the present application is described in detail below:
S801, when the input management module detects, while the interface of the target application is displayed, that the user presses down at the bottom of the screen, it reports the touch event to the desktop management module.
The target application may also be referred to as a background application at this time.
S802, the desktop management module recognizes the received touch event and reports the touch event to the window management module.
S803, after the window management module receives the touch event, it plays the recent tasks (Recents) animation.
S804, when the input management module detects the lifting operation of the fingers of the user, the gesture track characteristics are reported to the desktop management module.
S805, the desktop management module recognizes, from the gesture track characteristics, the user's upward swipe gesture starting from the bottom of the screen, and reports the information of the upward swipe gesture to the window management module.
S806, after receiving the information of the upward swipe gesture, the window management module starts the interface processing operation. Specifically, it sets the return home status flag bit, starts destroying the interface of the target application, records the interface startup sequence, and starts playing the return-to-desktop animation.
The interface startup sequence records the pull-up relationships among the application programs. Through the interface startup sequence, the pull-up relationships between applications, including the jump relationships formed by continuous pull-ups, can be queried, for example, application A pulls up application B, application B pulls up application C, and so on.
S807, during the control window in which the return home status flag bit has not been restored, the window management module checks, for each interface to be displayed that has been drawn, whether its initial pull-up party is the target application according to the interface startup sequence.
During this control window, an interface that is finally pulled up through continuous pull-ups between the target application and other applications would in theory cause an interface flash, so detection of the initial pull-up party is required.
In the embodiment of the present application, the continuous pull-up chain corresponding to the interface to be displayed can be queried through the interface startup sequence, and the first pull-up party in that chain (i.e., the initial pull-up party) can thereby be determined.
S808, if the initial pulling-up party of the interface to be displayed is the target application, the window management module deletes the display task of the interface to be displayed so as to intercept the interface to be displayed.
The direct pull-up party, which may also be called the last pull-up party, is the application program that directly pulls up the interface to be displayed.
S809, after the return-to-desktop animation finishes playing, the window management module pulls the desktop to the foreground (i.e., sets the desktop as the foreground application).
S810, when the interface destruction of the target application is completed, the window management module restores the return home state flag bit.
At this point, the interface processing operation is marked as complete, and the process of intercepting interfaces that the target application started in the background without being opened by the user ends.
In this embodiment, details and beneficial effects of operations such as detection of an interface to be displayed, identification of a pulling-up party, cancellation of display, and the like may refer to related descriptions in the embodiments shown in fig. 2 to 6, and are not repeated here.
In the embodiment of the present application, the initial pull-up party of the interface to be displayed is detected through the interface startup sequence, and the interface to be displayed is intercepted when the initial pull-up party is the target application. This prevents the electronic device, when returning to the desktop, from displaying an application interface that was pulled up through continuous pull-ups between the target application and other applications rather than opened by the user.
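For illustration, the interface startup sequence and the initial pull-up query can be sketched as a simple list of pull-up entries that is walked backwards. The StartupSequence name and its structure are assumptions, not the format actually used by the window management module.

```java
import java.util.ArrayList;
import java.util.List;

/** Assumed interface startup sequence: one entry per pull-up (puller launched pulled). */
class StartupSequence {
    static final class Entry {
        final String puller;
        final String pulled;
        Entry(String puller, String pulled) {
            this.puller = puller;
            this.pulled = pulled;
        }
    }

    private final List<Entry> entries = new ArrayList<>();

    void record(String puller, String pulled) {
        entries.add(new Entry(puller, pulled));
    }

    /** Walk backwards from the application owning the interface to be displayed to
     *  the first puller in the chain, e.g. A -> B -> C yields A for C. */
    String initialPuller(String app) {
        String current = app;
        int guard = entries.size();           // guards against a malformed cyclic record
        boolean moved = true;
        while (moved && guard-- >= 0) {
            moved = false;
            for (Entry e : entries) {
                if (e.pulled.equals(current)) {
                    current = e.puller;
                    moved = true;
                    break;
                }
            }
        }
        return current;
    }
}
```

A real implementation would more likely key this data on task or activity records rather than application names; the string keys here are purely illustrative.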
For ease of understanding the application scenarios of the present application, as an optional embodiment, reference is made to fig. 9, which is a timing diagram comparing application scenarios in several cases provided in the embodiments of the present application. In the embodiment of the present application, it is assumed that the target application is application A and that the application directly or indirectly pulled up by the target application is application B.
The embodiment of the present application covers three scenarios: a normal scenario, an abnormal scenario, and the scenario after the present scheme is applied. The difference between them is that, during the destruction of the interface of application A, application A in the normal scenario does not pull up itself or any other application, whereas in the abnormal scenario and in the scenario after the scheme is applied, application A pulls up application B. The details are as follows:
While the electronic device is displaying the interface of application A, a return-to-desktop operation by the user is detected. At this time, the electronic device starts destroying the interface of application A and plays the return-to-desktop animation. Specifically:
For the normal scenario: since application A does not pull up itself or any other application, the electronic device simply displays the desktop after the return-to-desktop animation finishes playing.
For the abnormal scenario: since application A pulls up application B, the electronic device draws the interface of application B and displays it once drawing is complete, causing an interface flash. If the flash appears before the return-to-desktop animation finishes, the animation may be terminated early. If the flash appears after the animation finishes, i.e., after the desktop has been reached, the electronic device jumps from the desktop to the interface of application B. After the interface of application A has been destroyed, the electronic device may then choose to display the desktop.
For the scenario after the scheme is applied: since application A pulls up application B, the electronic device still draws the interface of application B. To prevent the interface flash, the electronic device may operate according to the schemes of the embodiments shown in fig. 2 to 8, canceling the display of the drawn interface of application B. The return-to-desktop animation can then play normally, and the electronic device displays the desktop after the animation finishes.
In each of the above scenarios, the desktop launcher may perform an onResume operation to interact with the user while the electronic device is displaying the desktop.
It should be noted that fig. 2 to fig. 9 all describe embodiments using the example of returning from the target application to the desktop. In practical applications, when the target of application switching is an application other than the desktop (i.e., the first application is a non-desktop application), the embodiment of the present application can likewise achieve the effect of improving the interface flash situation. For details and beneficial effects, reference may be made to the embodiments shown in fig. 2 to 9 and other related embodiments, which are not repeated here.
As an optional embodiment of the present application, for the general scenario in which the target application switches to another application, the response of the present application to the switching operation may be described as: starting to destroy the current display interface of the target application in response to an application switching operation of switching from the target application to the first application. Depending on what the first application is, the type of application switching operation may also vary. For example, when the first application is the desktop, the application switching operation is the return-to-desktop operation in the embodiment shown in fig. 2.
Accordingly, S201 may be replaced with: when the electronic device detects an application switching operation by the user while displaying the interface of the target application, it starts switching applications and starts destroying the interface of the target application.
As can be seen from the above description, in the scenario after the scheme is applied, the interface flash is avoided and the handling occurs silently in the background, so the scenario after the scheme is applied is essentially the same as the normal scenario from the user's point of view. Therefore, while improving the interface flash problem, the embodiment of the present application can reduce or avoid any impact on the user's normal use of the electronic device.
Fig. 10 shows a schematic structural diagram of an interface processing apparatus according to an embodiment of the present application, corresponding to the interface processing method described in the above embodiment, and for convenience of explanation, only a portion related to the embodiment of the present application is shown.
Referring to fig. 10, the interface processing apparatus includes:
the destroying module 1001 is configured to start destroying a current display interface of the target application in response to an application switching operation for switching from the target application to the first application.
The pulling-up party identifying module 1002 is configured to identify a pulling-up party of an application to be displayed to which the interface to be displayed belongs in a process of destroying the current display interface; the pulling-up party of the application to be displayed is an application program for pulling up the application to be displayed.
And the cancellation display module 1003 is used for canceling the display of the interface to be displayed when the pulling-up party of the application to be displayed contains the target application.
As one embodiment of the present application, the pull-up party identification module 1002 includes:
And the identification sub-module is used for identifying a direct pull-up party and/or an initial pull-up party of the application to be displayed. The direct pull-up party is the application program that directly pulls up the application to be displayed, and the initial pull-up party is the first application in a chain of successive pull-ups that includes the application to be displayed.
And the judging module is used for judging that the pulling-up party of the application to be displayed contains the target application when the identified direct pulling-up party and/or the initial pulling-up party are the target application.
As one embodiment of the present application, the identification sub-module includes:
and the pulling-up relation acquisition module is used for acquiring the pulling-up relation containing the application to be displayed. The pulling relation is the relation data which is recorded in the process of destroying the current display interface and is pulled up among all application programs.
And the initial determining module is used for determining a first application program in the pull-up relation to obtain an initial pull-up party of the application to be displayed.
As one embodiment of the present application, the pull-up party identification module 1002 includes:
And the identification sub-module is used for identifying a direct pull-up party and/or an indirect pull-up party of the application to be displayed. The direct pull-up party is the application program that directly pulls up the application to be displayed, and an indirect pull-up party is any application that, in a chain of successive pull-ups including the application to be displayed, is pulled up before the application to be displayed.
And the judging module is used for judging that the pulling-up party of the application to be displayed contains the target application when the identified indirect pulling-up party contains the target application.
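A minimal sketch of the indirect pull-up check reads as follows; the ordered chain passed in is assumed to end at the application to be displayed, and the helper name is an illustrative assumption.

```java
import java.util.List;

/** Illustrative indirect pull-up check: the target application counts as a
 *  pull-up party if it appears anywhere in the chain before the application
 *  to be displayed, not only as the direct or initial puller. */
class IndirectPullUpCheck {
    /** chain is the ordered pull-up chain ending at the application to be
     *  displayed, e.g. ["A", "B", "C"] when A pulled up B and B pulled up C. */
    static boolean pullUpPartyContainsTarget(List<String> chain, String targetApp) {
        // every application before the last element is a direct or indirect puller
        for (int i = 0; i < chain.size() - 1; i++) {
            if (chain.get(i).equals(targetApp)) {
                return true;
            }
        }
        return false;
    }
}
```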
As one embodiment of the present application, the cancel display module 1003 includes:
and the deleting module is used for deleting the display task of the interface to be displayed. Or the interception module is used for intercepting the display task when the execution of the display task of the interface to be displayed is detected.
As an embodiment of the present application, the deletion module specifically includes:
and reading an interface list to be displayed, and deleting the display task of the interface to be displayed from the interface list.
As one embodiment of the present application, a status flag bit is set in the electronic device: it is set when destruction of the current display interface of the target application starts, and restored after the destruction is complete. Correspondingly, as long as the status flag bit has not been restored, destruction of the target application interface is considered incomplete, and during that time the pull-up party identification module 1002 and the cancel display module 1003 perform the operations of identifying the pull-up party and canceling display. Once the status flag bit has been restored, these operations may be stopped.
As one embodiment of the present application, when an interface of the target application itself is to be displayed, the application to be displayed and its pull-up party are both the target application. For this case, the embodiment of the present application may skip the display-canceling operation of the cancel display module 1003 and display the interface to be displayed normally.
The process of implementing respective functions by each module in the interface processing apparatus provided in this embodiment of the present application may refer to the foregoing description of the embodiments shown in fig. 2 to 9 and other related method embodiments, which are not repeated herein.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when..once" or "in response to a determination" or "in response to detection" depending on the context. Similarly, the phrase "if a determination" or "if a [ described condition or event ] is detected" may be interpreted in the context of meaning "upon determination" or "in response to determination" or "upon detection of a [ described condition or event ]" or "in response to detection of a [ described condition or event ]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance. It will also be understood that, although the terms "first," "second," etc. may be used in this document to describe various elements in some embodiments of the present application, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first table may be named a second table, and similarly, a second table may be named a first table without departing from the scope of the various described embodiments. The first table and the second table are both tables, but they are not the same table.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The interface processing method provided by the embodiment of the application can be applied to electronic devices with display capability, such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (augmented reality, AR)/Virtual Reality (VR) devices, notebook computers, ultra-mobile personal computer (UMPC), netbooks, personal digital assistants (personal digital assistant, PDA) and the like, and the specific types of the electronic devices are not limited.
For example, the electronic device may be a cellular telephone, a cordless telephone, a personal digital processing (Personal Digital Assistant, PDA) device, a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, an in-vehicle device, a car networking terminal, a computer, a laptop computer, a handheld communication device, a handheld computing device, a satellite radio device, a wireless modem card, a customer premise equipment (customer premise equipment, CPE) and/or other devices for communicating over a wireless system, as well as electronic devices in a next generation communication system, e.g., an electronic device in a 5G network or in a future evolving public land mobile network (Public Land Mobile Network, PLMN) network, etc.
By way of example and not limitation, when the electronic device is a wearable device, the wearable device may also be a general term for devices that apply wearable technology to the intelligent design of everyday wearables, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it realizes powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can realize all or part of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus only on a certain type of application function and need to be used together with other devices such as a smartphone, for example various smart bands and smart jewelry for vital sign monitoring.
Taking the example that the electronic device is a mobile phone, fig. 11a shows a schematic structural diagram of the mobile phone 100.
The handset 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a SIM card interface 195, etc. The sensor module 180 may include a gyroscope sensor 180A, an acceleration sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an ambient light sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, and a touch sensor 180K (of course, the mobile phone 100 may also include other sensors such as a temperature sensor, a pressure sensor, an air pressure sensor, a bone conduction sensor, etc., which are not shown).
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a Neural network processor (Neural-network Processing Unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller may be a neural center or a command center of the mobile phone 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The processor 110 may run the interface processing method provided in the embodiments of the present application, so as to improve or even avoid interface flash situations and improve the user experience. The processor 110 may include different devices; for example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the interface processing method provided in the embodiments of the present application, for example with part of the algorithm executed by the CPU and another part executed by the GPU, to obtain higher processing efficiency.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the cell phone 100 may include 1 or N display screens 194, N being a positive integer greater than 1. The display 194 may be used to display information entered by or provided to the user as well as various graphical user interfaces (graphical user interface, GUI). For example, the display 194 may display photographs, videos, web pages, or files. As another example, the display 194 may display a graphical user interface including a status bar, a hidden navigation bar, time and weather widgets, and icons of applications such as a browser icon. The status bar includes the operator name (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining battery level. The navigation bar includes a back key icon, a home screen key icon, and a forward key icon. Further, it is to be appreciated that in some embodiments, a Bluetooth icon, a Wi-Fi icon, an external device icon, and the like may also be included in the status bar. It will also be appreciated that in other embodiments, the graphical user interface may include a Dock bar, and the Dock bar may include commonly used application icons and the like. When the processor detects a touch event of a user's finger (or a stylus or the like) on an application icon, it opens, in response to the touch event, the user interface of the application corresponding to that icon and displays the user interface on the display screen 194.
In the embodiment of the present application, the display 194 may be an integral flexible display, or a tiled display formed of two rigid screens and a flexible screen located between the two rigid screens may be used. After the processor 110 runs the interface processing method provided in the embodiments of the present application, the processor 110 may control the external audio output device to switch the output audio signal.
The camera 193 (front camera or rear camera, or one camera may be used as both front camera and rear camera) is used to capture still images or video. In general, the camera 193 may include a photosensitive element such as a lens group including a plurality of lenses (convex lenses or concave lenses) for collecting optical signals reflected by an object to be photographed and transmitting the collected optical signals to an image sensor. The image sensor generates an original image of the object to be photographed according to the optical signal.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store, among other things, code for an operating system, an application program (e.g., a camera application, a WeChat application, etc.), and so on. The storage data area may store data created during use of the handset 100 (e.g., images, video, etc. acquired by the camera application), etc.
The internal memory 121 may also store one or more computer programs corresponding to the interface processing methods provided in the embodiments of the present application. The one or more computer programs stored in the memory 121 and configured to be executed by the one or more processors 110 include instructions that may be used to perform the various steps as in the corresponding embodiments of fig. 2-9, which may include an account verification module, a priority comparison module. The account verification module is used for authenticating system authentication accounts of other electronic devices in the local area network; the priority comparison module can be used for comparing the priority of the audio output request service with the priority of the current output service of the audio output equipment. And the state synchronization module can be used for synchronizing the equipment state of the audio output equipment currently accessed by the electronic equipment to other electronic equipment or synchronizing the equipment state of the audio output equipment currently accessed by other equipment to the local. When the code of the interface processing method stored in the internal memory 121 is executed by the processor 110, the processor 110 may control the electronic device to perform interface data processing.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
Of course, the code of the interface processing method provided in the embodiment of the present application may also be stored in an external memory. In this case, the processor 110 may run the codes of the interface processing method stored in the external memory through the external memory interface 120, and the processor 110 may control the electronic device to perform the interface data processing.
The function of the sensor module 180 is described below.
The gyro sensor 180A may be used to determine the motion gesture of the handset 100. In some embodiments, the angular velocity of the handset 100 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 180A. I.e., the gyro sensor 180A may be used to detect the current motion state of the handset 100, such as shaking or being stationary.
When the display screen in the embodiment of the present application is a foldable screen, the gyro sensor 180A may be used to detect a folding or unfolding operation acting on the display screen 194. The gyro sensor 180A may report the detected folding operation or unfolding operation to the processor 110 as an event to determine the folding state or unfolding state of the display screen 194.
The acceleration sensor 180B can detect the magnitude of acceleration of the mobile phone 100 in various directions (typically along three axes); that is, the acceleration sensor 180B may also be used to detect the current motion state of the handset 100, such as shaking or stationary. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 180B may be used to detect a folding or unfolding operation acting on the display screen 194. The acceleration sensor 180B may report the detected folding operation or unfolding operation as an event to the processor 110 to determine the folded state or unfolded state of the display screen 194.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone emits infrared light outwards through the light emitting diode. The cell phone uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object in the vicinity of the handset. When insufficient reflected light is detected, the handset may determine that there is no object in the vicinity of the handset. When the display screen in the embodiment of the present application is a foldable screen, the proximity light sensor 180G may be disposed on a first screen of the foldable display screen 194, and the proximity light sensor 180G may detect a folding angle or an unfolding angle of the first screen and the second screen according to an optical path difference of the infrared signal.
The gyro sensor 180A (or the acceleration sensor 180B) may transmit detected motion state information (such as angular velocity) to the processor 110. The processor 110 determines whether it is currently in a handheld state or a foot rest state based on the motion state information (e.g., when the angular velocity is not 0, it is indicated that the mobile phone 100 is in a handheld state).
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100 at a different location than the display 194.
Illustratively, the display 194 of the handset 100 displays a main interface that includes icons of a plurality of applications (e.g., camera applications, weChat applications, etc.). The user clicks on an icon of the camera application in the main interface by touching the sensor 180K, triggering the processor 110 to launch the camera application, opening the camera 193. The display 194 displays an interface for the camera application, such as a viewfinder interface.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110. In the embodiment of the present application, the mobile communication module 150 may also be used for information interaction with other electronic devices.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied to the handset 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2. In the embodiment of the present application, the wireless communication module 160 may be used for accessing an access point device, and sending and receiving messages to other electronic devices.
In addition, the mobile phone 100 may implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor, etc. Such as music playing, recording, etc. The handset 100 may receive key 190 inputs, generating key signal inputs related to user settings and function control of the handset 100. The cell phone 100 may generate a vibration alert (such as an incoming call vibration alert) using the motor 191. The indicator 192 in the mobile phone 100 may be an indicator light, which may be used to indicate a state of charge, a change in power, an indication message, a missed call, a notification, etc. The SIM card interface 195 in the handset 100 is used to connect to a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to enable contact and separation with the handset 100.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the camera function of cell phone 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display function of the handset 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present invention is only illustrative, and is not limited to the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
It should be understood that in practical applications, the mobile phone 100 may include more or fewer components than shown in fig. 11a, and embodiments of the present application are not limited. The illustrated handset 100 is only one example, and the handset 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The software system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the invention, an Android system with a layered architecture is taken as an example, and the software structure of the electronic equipment is illustrated. Fig. 11b is a software architecture block diagram of an electronic device according to an embodiment of the invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 11b, the application package may include applications such as phone, camera, gallery, calendar, talk, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 11b, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is for providing communication functions of the electronic device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries can support multiple audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least includes a display driver, a camera driver, an audio driver, and a sensor driver.
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 12, the electronic device 12 of this embodiment includes: at least one processor 120 (only one is shown in fig. 12), a memory 121, said memory 121 having stored therein a computer program 122 executable on said processor 120. The processor 120, when executing the computer program 122, implements the steps of the various interface processing method embodiments described above, such as steps 201 through 204 shown in fig. 2. Alternatively, the processor 120, when executing the computer program 122, performs the functions of the modules/units in the above-described apparatus embodiments, such as the functions of the modules 101 to 104 shown in fig. 10.
The electronic device 12 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or the like. The electronic device may include, but is not limited to, the processor 120 and the memory 121. It will be appreciated by those skilled in the art that fig. 12 is merely an example of the electronic device 12 and does not limit the electronic device 12, which may include more or fewer components than shown, combine certain components, or have different components; for example, the electronic device may further include input/output devices, a network access device, a bus, and the like.
The processor 120 may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 121 may, in some embodiments, be an internal storage unit of the electronic device 12, such as a hard disk or memory of the electronic device 12. The memory 121 may also be an external storage device of the electronic device 12, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the electronic device 12. Further, the memory 121 may include both an internal storage unit and an external storage device of the electronic device 12. The memory 121 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 121 may also be used to temporarily store data that has been transmitted or is to be transmitted.
In addition, it will be clearly understood by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The embodiment of the application also provides an electronic device, which comprises at least one memory, at least one processor and a computer program stored in the at least one memory and capable of running on the at least one processor, wherein the processor executes the computer program to enable the electronic device to realize the steps in any of the method embodiments.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
Embodiments of the present application provide a computer program product which, when run on an electronic device, causes the electronic device to perform the steps of the method embodiments described above.
Embodiments of the present application also provide a chip system, where the chip system includes a processor, where the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the steps in the foregoing method embodiments.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods of the above embodiments may be implemented by a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. An interface processing method, applied to an electronic device, comprising:
responding to an application switching operation of switching from a target application to a first application, and starting to destroy a current display interface of the target application; wherein destroying the interface means deleting activity-related information and reclaiming memory resources;
in the process of destroying the current display interface, identifying a pulling-up party of the application to be displayed, to which the interface to be displayed belongs; the method comprises the steps that a pulling-up party of an application to be displayed is an application program for pulling up the application to be displayed;
and when the pulling-up party of the application to be displayed contains the target application, canceling the display of the interface to be displayed.
2. The interface processing method according to claim 1, wherein the first application is a desktop, and the application switching operation is a return desktop operation.
3. The interface processing method according to claim 1, characterized in that identifying a pull-up party of the application to be displayed includes:
identifying a direct pull-up party and/or an initial pull-up party of the application to be displayed; the direct pulling-up party is an application program for directly pulling up the application to be displayed, and the initial pulling-up party is: a first application in a relationship in which a plurality of applications including the application to be displayed are successively pulled up;
And when the identified direct pulling-up party and/or the initial pulling-up party are the target application, judging that the pulling-up party of the application to be displayed contains the target application.
4. The interface processing method according to claim 3, wherein the operation of identifying the initial pull-up party includes:
acquiring a pull-up relation containing the application to be displayed; the pulling-up relation is relation data which is recorded in the process of destroying the current display interface and is mutually pulled up among all application programs;
and determining a first application program in the pull-up relation, and obtaining the initial pull-up party of the application to be displayed.
5. The interface processing method according to claim 1, characterized in that identifying a pull-up party of the application to be displayed includes:
identifying a direct pull-up party and/or an indirect pull-up party of the application to be displayed; the direct pulling-up party is an application program that directly pulls up the application to be displayed, and the indirect pulling-up party is: an application program that, in a relationship in which a plurality of application programs including the application to be displayed are successively pulled up, is pulled up earlier in the sequence than the application to be displayed;
and when the identified direct pulling-up party and/or the indirect pulling-up party contain the target application, judging that the pulling-up party of the application to be displayed contains the target application.
6. The interface processing method according to any one of claims 1 to 5, wherein the canceling the display of the interface to be displayed includes:
deleting the display task of the interface to be displayed; or
intercepting the display task when it is detected that the display task of the interface to be displayed is being executed.
7. The interface processing method according to claim 6, wherein deleting the display task for the interface to be displayed includes:
and reading an interface list to be displayed, and deleting the display task of the interface to be displayed from the interface list.
8. An interface processing apparatus, comprising:
the destroying module is used for responding to an application switching operation of switching from a target application to a first application and starting to destroy a current display interface of the target application; wherein destroying the interface means deleting activity-related information and reclaiming memory resources;
the pulling-up party identification module is used for identifying a pulling-up party of the application to be displayed, to which the interface to be displayed belongs, in the process of destroying the current display interface; the method comprises the steps that a pulling-up party of an application to be displayed is an application program for pulling up the application to be displayed;
And the cancellation display module is used for canceling the display of the interface to be displayed when the pulling-up party of the application to be displayed contains the target application.
9. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 7.
10. A chip system comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the interface processing method of any of claims 1 to 7.
CN202211017564.7A 2022-08-23 2022-08-23 Interface processing method and device and electronic equipment Active CN116028148B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211017564.7A CN116028148B (en) 2022-08-23 2022-08-23 Interface processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211017564.7A CN116028148B (en) 2022-08-23 2022-08-23 Interface processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN116028148A CN116028148A (en) 2023-04-28
CN116028148B true CN116028148B (en) 2024-04-12

Family

ID=86076524

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211017564.7A Active CN116028148B (en) 2022-08-23 2022-08-23 Interface processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN116028148B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117149340B (en) * 2023-10-30 2024-02-06 北京小米移动软件有限公司 USIM card application interface display method and device and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110825301A (en) * 2019-09-25 2020-02-21 华为技术有限公司 Interface switching method and electronic equipment
CN111736931A (en) * 2019-03-25 2020-10-02 青岛海信移动通信技术股份有限公司 Application display interface interception method and terminal
WO2022100239A1 (en) * 2020-11-16 2022-05-19 Oppo广东移动通信有限公司 Device cooperation method, apparatus and system, electronic device and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111736931A (en) * 2019-03-25 2020-10-02 青岛海信移动通信技术股份有限公司 Application display interface interception method and terminal
CN110825301A (en) * 2019-09-25 2020-02-21 华为技术有限公司 Interface switching method and electronic equipment
WO2022100239A1 (en) * 2020-11-16 2022-05-19 Oppo广东移动通信有限公司 Device cooperation method, apparatus and system, electronic device and storage medium

Also Published As

Publication number Publication date
CN116028148A (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN110825301A (en) Interface switching method and electronic equipment
CN115297199A (en) Touch method of equipment with folding screen and folding screen equipment
WO2023284415A1 (en) Power key mistouch detection method and electronic device
CN113805797B (en) Processing method of network resource, electronic equipment and computer readable storage medium
US20220358089A1 (en) Learning-Based Keyword Search Method and Electronic Device
WO2021185352A1 (en) Version upgrade method and related apparatus
CN112835495B (en) Method and device for opening application program and terminal equipment
CN116028148B (en) Interface processing method and device and electronic equipment
CN114666433B (en) Howling processing method and device in terminal equipment and terminal
CN115801943A (en) Display method, electronic device, and storage medium
WO2024093103A1 (en) Handwriting processing method, and terminal device and chip system
CN116048833B (en) Thread processing method, terminal equipment and chip system
CN116954409A (en) Application display method and device and storage medium
CN114911400A (en) Method for sharing pictures and electronic equipment
CN116662150B (en) Application starting time-consuming detection method and related device
CN116048829B (en) Interface calling method, device and storage medium
CN115828227B (en) Method for identifying advertisement popup, electronic equipment and storage medium
CN116991274B (en) Upper sliding effect exception handling method and electronic equipment
CN111475363B (en) Card death recognition method and electronic equipment
CN114244951B (en) Method for opening page by application program, medium and electronic equipment thereof
CN116027933B (en) Method and device for processing service information
CN116916093B (en) Method for identifying clamping, electronic equipment and storage medium
CN115794272B (en) Display method and electronic equipment
CN116662150A (en) Application starting time-consuming detection method and related device
CN117077703A (en) Image processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant