CN112835472B - Communication terminal and display method - Google Patents

Info

Publication number
CN112835472B
CN112835472B (application number CN202110152969.0A)
Authority
CN
China
Prior art keywords
window
user
application program
application
information
Prior art date
Legal status (assumption; not a legal conclusion)
Active
Application number
CN202110152969.0A
Other languages
Chinese (zh)
Other versions
CN112835472A
Inventor
荆楠楠
王旭光
孙哲
Current Assignee (listing may be inaccurate)
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date (assumption; not a legal conclusion)
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd
Publication of CN112835472A
Application granted
Publication of CN112835472B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412: Digitisers structurally integrated in a display
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques in which the display area of the touch-screen or the surface of the digitising tablet is partitioned into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The application discloses a communication terminal and a display method. In the application, a second window is displayed on the upper layer of a first window, a first user interface of a first application is displayed in the first window, a second user interface of a second application is displayed in the second window, and the first user interface is partially shielded by the second user interface; in response to a user touch operation acting on a first area, the second window is hidden or closed, the first area being an area in which the first window is not shielded by the second window. With the method and device of the present application, the second window can be prevented from interfering with the user's use of the first application program.

Description

Communication terminal and display method
Cross Reference to Related Applications
The present application claims priority to Chinese patent application No. 202110090800.7, entitled "Communication terminal and display method", filed with the China National Intellectual Property Administration on January 22, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of communications, and in particular, to a communication terminal and a display method.
Background
With the popularization of intelligent communication terminals, communication terminals support more and more functions. To give users a better experience, current communication terminals support multi-window display, in which different windows display the interfaces of different applications. In some scenarios, however, this interferes with user operation and degrades the user experience.
For example, as shown in fig. 1, a mobile phone is currently running a game, and the user interface 10 of the game is displayed on the screen of the mobile phone in a full-screen manner. When a call comes in, a user interface 11 of a call application program is displayed on the screen of the mobile phone; the user interface 11 includes function keys such as an answer key and a hang-up key, and may also display the number of the incoming call. The user interface 11 is displayed in a small-window manner, so that it is displayed on the upper layer of the user interface 10 and is smaller than the user interface 10 in size; as shown in fig. 1, the user interface 10 of the game is partially blocked by the user interface 11 of the call application.
The presence of the user interface 11 of the call application interferes with the user's game operations. For example, because the user interface 11 of the call application blocks part of the game interface, when the user needs to operate the blocked part of the game interface, the user must first touch the hang-up key in the user interface 11 so that the small window bearing the user interface 11 is closed; this is cumbersome and disrupts the user's game operations. Alternatively, if the user mistakenly touches the answer key in the user interface 11, the call application is entered and the game can no longer be operated.
For another example, as shown in fig. 2, a mobile phone is currently running a game, and the user interface 20 of the game is displayed on the screen of the mobile phone in a full-screen manner. When the user touches the edge of the screen, a screen-recording shortcut icon 21 is displayed on the screen (triggering the shortcut icon opens a screen-recording application to record the game interface). The shortcut icon 21 is displayed in a small-window manner, so that it is displayed on the upper layer of the user interface 20 and is smaller than the user interface 20 in size; as shown in fig. 2, the user interface 20 of the game is partially shielded by the shortcut icon 21. If the user does not need to record the screen, the shortcut icon interferes with the user's operation of the game interface.
For another example, as shown in fig. 3, a mobile phone is currently running a reading application, and a page 30 of an electronic reading material is displayed on the screen of the mobile phone in a full-screen manner. At a certain moment, an advertisement popup window 31 is displayed on the screen. The advertisement popup window 31 is displayed in a small-window manner, so that it is displayed on the upper layer of the page 30 and is smaller than the page 30 in size; as shown in fig. 3, the page 30 of the electronic reading material is partially shielded by the advertisement popup window 31, and the user must touch a close function key in the advertisement popup window to close it before normal reading can continue. In some cases, the user interface in the small window provides no close function key, and the user must click to enter the small window and then exit the corresponding application program before the small window is closed and normal reading can continue.
Disclosure of Invention
The exemplary embodiments of the present application provide a communication terminal and a display method to avoid interference of a second window of a second application program with a user using a first application program.
In a first aspect, a communication terminal is provided, which includes:
a touch screen configured to receive a touch operation from a user;
a display screen configured to display a user interface;
a processor coupled to the touch screen and the display screen, respectively;
the display screen displays a first window and a second window, a first user interface of a first application is displayed in the first window, a second user interface of a second application is displayed in the second window, and the first user interface is partially shielded by the second user interface;
the processor is configured to:
in response to a user touch operation acting on a first area, closing the second window, wherein the first area is an area where the first window is not shielded by the second window;
the processor is configured with an input controller configured to:
obtaining information of a user input event corresponding to the user touch operation;
and transmitting the information of the user input event to the second application program, which has registered the user input event, so that the second application program closes the second window when it determines, according to the information of the user input event, that the user touch operation did not act within the second window.
Further, the input controller is further configured to: and directly acquiring the information of the user input event from a driving node of a hardware driving layer, wherein the driving node of the hardware driving layer is used for storing the information of the user input event corresponding to the user touch operation.
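As an illustrative sketch only (the patent specifies behaviour, not an API, so every name below is hypothetical), the interaction just described between the input controller and the second application program might look like this: the controller obtains an event from a driver-node-like source and passes its information to every registered application, and the application owning the second window closes that window when the touch did not act within it.

```python
class InputController:
    """Hypothetical system-layer input controller: reads user input events
    from a driver-node-like source and forwards them to every application
    that has registered for user input events."""

    def __init__(self, driver_node):
        self.driver_node = driver_node      # iterator of event dicts, standing in
                                            # for a read on the hardware driver node
        self.registered_apps = []

    def register(self, app):
        self.registered_apps.append(app)

    def pump_one(self):
        # Obtain one event directly from the driver node and pass its
        # information to each registered application.
        event = next(self.driver_node)      # e.g. {"x": 5, "y": 5, "action": "down"}
        for app in self.registered_apps:
            app.on_input_event(event)
        return event


class SecondApp:
    """Hypothetical application owning the small upper-layer (second) window;
    it closes its window when the touch did not act within the window."""

    def __init__(self, left, top, width, height):
        self.bounds = (left, top, width, height)
        self.window_open = True

    def _contains(self, x, y):
        left, top, w, h = self.bounds
        return left <= x < left + w and top <= y < top + h

    def on_input_event(self, event):
        # A touch outside the second window hides/closes the window.
        if self.window_open and not self._contains(event["x"], event["y"]):
            self.window_open = False
```

For instance, dispatching a touch at (5, 5) to an application whose window occupies the rectangle (100, 100, 200, 120) closes the window, while a touch inside the window leaves it open.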
In a possible implementation manner, the second window is a caller identification window of the call application program; the call application program is further configured to: hang up the incoming call if it determines, according to the information of the user input event, that the user touch operation did not act within the second window.
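A minimal sketch of this caller-identification variant, again with all names hypothetical: a touch outside the small window both closes the window and hangs up the incoming call, with `TelephonyStub` standing in for whatever telephony service the real terminal exposes.

```python
class TelephonyStub:
    """Stand-in for the terminal's telephony service (hypothetical)."""

    def __init__(self):
        self.hung_up = False

    def hang_up(self):
        self.hung_up = True


class CallApp:
    """Call application owning the caller-identification window."""

    def __init__(self, bounds, telephony):
        self.bounds = bounds                # (left, top, width, height)
        self.window_open = True
        self.telephony = telephony

    def on_input_event(self, event):
        left, top, w, h = self.bounds
        inside = left <= event["x"] < left + w and top <= event["y"] < top + h
        if self.window_open and not inside:
            self.window_open = False        # close the caller-ID window
            self.telephony.hang_up()        # and reject the incoming call
```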
In one possible implementation, the processor is configured with a window manager configured to:
obtaining information of a user input event corresponding to the user touch operation;
and determining an area acted by the user touch operation according to the user input event information, and if the area acted by the user touch operation is a visible area of the first window, placing the first window on an upper layer of the second window, so that the second user interface in the second window is shielded by the first user interface in the first window.
Further, the window manager is further configured to: and inquiring an application program visible window list according to the position information of the user touch operation coordinate in the user input event information to obtain a visible area of an application program visible window where the user touch operation coordinate is located, wherein the application program visible window list stores visible window identification information, identification information of an application program to which the visible window belongs, and the position and the size of the visible area of the visible window.
Further, the window manager is further configured to: before the first window is placed on the upper layer of the second window, the visible area of the first window in the application program visible window list is updated according to the position and the size of the second window, and the visible area of the first window does not include the area of the first window, which is shielded by the second window.
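The visible-window list and the visible-area update described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the list stores each window's identification, owning application, and visible rectangle; hit-testing walks the list from top to bottom; and subtracting the second window's rectangle from the first window's rectangle yields the first window's updated visible area.

```python
class VisibleWindow:
    """One entry of the application visible window list: window id, the id
    of the owning application, and the visible area's position and size."""

    def __init__(self, window_id, app_id, rect):
        self.window_id = window_id
        self.app_id = app_id
        self.rect = rect                    # (left, top, width, height)


def hit_test(window_list, x, y):
    """Return the topmost visible window containing (x, y); the list is
    assumed to be ordered top-to-bottom."""
    for win in window_list:
        left, top, w, h = win.rect
        if left <= x < left + w and top <= y < top + h:
            return win
    return None


def subtract(rect, hole):
    """Split `rect` minus `hole` into up to four remaining rectangles,
    modelling how the first window's visible area is updated so that it
    excludes the part shielded by the second window."""
    l, t, w, h = rect
    r, b = l + w, t + h
    hl, ht = max(hole[0], l), max(hole[1], t)
    hr, hb = min(hole[0] + hole[2], r), min(hole[1] + hole[3], b)
    if hl >= hr or ht >= hb:
        return [rect]                       # no overlap: visible area unchanged
    out = []
    if ht > t: out.append((l, t, w, ht - t))            # strip above the hole
    if hb < b: out.append((l, hb, w, b - hb))           # strip below the hole
    if hl > l: out.append((l, ht, hl - l, hb - ht))     # strip left of the hole
    if hr < r: out.append((hr, ht, r - hr, hb - ht))    # strip right of the hole
    return out
```

Querying the list with the touch coordinates then directly yields the application to which the touched visible area belongs.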
Further, the window manager is further configured to: transmitting the user touch operation coordinate position information, the identification information of the application program where the user touch operation coordinate position is located and application program window display level information to the second application program, so that the second application program judges that the user touch operation is not acted in a second window of the second application program according to the identification information of the application program where the user touch operation coordinate position is located, and closes the second window when the second window is determined to be located at the upper layer of the first window of the first application program according to the application program window display level information; the identification information of the application program where the user touch operation coordinate position is located is the identification information of the first application program, and the application program window display level information indicates that the second window of the second application program is on the upper layer of the first window of the first application program.
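The decision the second application program makes from the information handed to it above (the touch coordinates, the identification of the application at the touch position, and the window display level information) reduces to a small predicate. A sketch, with hypothetical names:

```python
def should_close_second_window(touched_app_id, own_app_id, second_above_first):
    """Close the second window only when the touch landed in another
    application's window AND the second window currently sits on the
    upper layer of the first window."""
    return touched_app_id != own_app_id and second_above_first
```

In the running example, a touch in the game's window while the call window is on top satisfies both conditions, so the call window is closed.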
In a second aspect, a communication terminal is provided, including:
the input controller is positioned on a system layer and is configured to respond to a user touch operation, obtain information of a user input event corresponding to the user touch operation and transmit the information of the user input event to a second application program, the second application program registers the user input event, a user interface of the second application program is displayed on a second window, the second window is positioned on an upper layer of a first window, the first window displays the user interface of the first application program, and the first user interface is partially shielded by the second user interface;
the second application program is configured to close the second window if it is determined that the user touch operation is not applied to the second window according to the information of the user input event.
In one possible implementation, the input controller is specifically configured to:
and directly acquiring the information of the user input event from a driving node of a hardware driving layer, wherein the driving node of the hardware driving layer is used for storing the information of the user input event corresponding to the user touch operation.
In a third aspect, a communication terminal is provided, including:
the hardware input capturer is positioned in a system layer and is configured to respond to a user touch operation, obtain user input event information corresponding to the user touch operation and transmit the user input event information to the window manager;
the window manager, located at an application framework layer, is configured to: and determining an area acted by the user touch operation according to the user input event information, and if the area acted by the user touch operation is a visible area of the first window, placing the first window on an upper layer of the second window, so that the second user interface in the second window is shielded by the first user interface in the first window.
In a fourth aspect, a display method is provided, including:
displaying a second window on an upper layer of a first window, wherein a first user interface of a first application is displayed in the first window, a second user interface of a second application is displayed in the second window, and the first user interface is partially shielded by the second user interface;
and in response to a user touch operation acting on a first area, the second window is hidden or closed, and the first area is an area where the first window is not shielded by the second window.
In one possible implementation manner, the closing the second window in response to a user touch operation applied to the first area includes:
an input controller of a system layer obtains information of a user input event corresponding to the user touch operation;
the input controller passing information of the user input event to the second application program, the second application program registering the user input event;
and if the second application program judges that the user touch operation does not act in the second window according to the information of the user input event, closing the second window.
Further, the obtaining, by the input controller of the system layer, information of a user input event corresponding to the user touch operation includes: the input controller directly acquires the information of the user input event from a driving node of a hardware driving layer, and the driving node of the hardware driving layer is used for storing the information of the user input event corresponding to the user touch operation.
In a possible implementation manner, the second window is a caller identification window of a call application program, and if the call application program determines, according to the information of the user input event, that the user touch operation did not act within the second window, the call application program hangs up the incoming call.
In one possible implementation, the hiding the second window in response to a user touch operation applied to the first area includes:
the window manager obtains information of a user input event corresponding to the user touch operation;
and the window manager determines the area acted by the user touch operation according to the user input event information, and if the area acted by the user touch operation is the visible area of the first window, the first window is placed on the upper layer of the second window, so that the second user interface in the second window is shielded by the first user interface in the first window.
Further, the determining, by the window manager, of the area on which the user touch operation acts according to the user input event information includes: the window manager queries an application program visible window list according to the coordinate position information of the user touch operation in the user input event information, to obtain the visible area of the application program visible window in which the user touch operation coordinates are located, wherein the application program visible window list stores visible window identification information, identification information of the application program to which each visible window belongs, and the position and size of the visible area of each visible window.
Further, before the placing the first window on the upper layer of the second window, the method further comprises: and the window manager updates the visible area of the first window in the application program visible window list according to the position and the size of the second window, wherein the visible area of the first window does not comprise the area of the first window, which is shielded by the second window.
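The hiding behaviour in this window-manager variant amounts to reordering the z-order so that the first window moves above the second. A minimal sketch (the list is ordered top to bottom; names are illustrative):

```python
def raise_window(z_order, window_id):
    """Move window_id to the top of the z-order so that its user interface
    shields whatever was above it before."""
    z_order = [w for w in z_order if w != window_id]
    return [window_id] + z_order
```

Note that the second window is not destroyed here; it is merely shielded by the first window, matching the "hidden" (rather than "closed") branch of the method.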
In a fifth aspect, a computer storage medium is provided, in which computer program instructions are stored, which instructions, when run on a computer, cause the computer to perform any of the above-described display methods.
A sixth aspect provides a computer program product which, when invoked by a computer, causes the computer to perform any of the display methods described above.
On the basis of common knowledge in the art, the above preferred features can be combined arbitrarily to obtain the preferred embodiments of the present application.
According to the embodiment of the application, when the second window is displayed on the upper layer of the first window, and the first user interface is partially shielded by the second user interface, the second window can be hidden or closed through the user touch operation acting on the first area (the first area is an area where the first window is not shielded by the second window), so that the influence of the second window on the use of the first application program by a user is avoided.
Drawings
To illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description are merely examples of the present application, and those skilled in the art may derive other drawings from them without inventive effort.
FIGS. 1, 2 and 3 are schematic diagrams of a user interface being partially obscured, respectively;
fig. 4 is a schematic structural diagram illustrating a communication terminal provided in an embodiment of the present application;
fig. 5 is a schematic diagram illustrating a software architecture of a communication terminal according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating a software architecture of a communication terminal provided by an embodiment of the present application;
fig. 7 is a schematic flowchart illustrating a display method provided by an embodiment of the present application;
fig. 8 is a schematic diagram illustrating a software architecture of a communication terminal according to another embodiment of the present application;
fig. 9 schematically illustrates a window visible region in an embodiment of the present application;
fig. 10 is a schematic flowchart illustrating a display method provided by an embodiment of the present application;
fig. 11a, 11b, and 11c respectively illustrate schematic diagrams of a user interface in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise stated, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless stated otherwise.
Fig. 4 shows a schematic configuration of the communication terminal 100.
The communication terminal 100 in the embodiment of the present application supports a multi-window mode, for example, the communication terminal may be a smartphone, a wearable device, a tablet computer, and the like, which support the multi-window mode.
The following describes an embodiment specifically taking the communication terminal 100 as an example. It should be understood that the communication terminal 100 shown in fig. 4 is only an example, and the communication terminal 100 may have more or less components than those shown in fig. 4, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
A block diagram of a hardware configuration of the communication terminal 100 according to the exemplary embodiment is exemplarily shown in fig. 4. As shown in fig. 4, the communication terminal 100 includes: a Radio Frequency (RF) circuit 110, a memory 120, a display unit 130, a camera 140, a sensor 150, an audio circuit 160, a Wireless Fidelity (Wi-Fi) module 170, a processor 180, a bluetooth module 181, and a power supply 190.
The RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 180 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The memory 120 may be used to store software programs and data. The processor 180 executes various functions of the communication terminal 100 and data processing by executing software programs or data stored in the memory 120. The memory 120 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. The memory 120 stores an operating system that enables the communication terminal 100 to operate. The memory 120 in the embodiment of the present application may store an operating system and various application programs, and may also store codes for performing the methods described in the embodiment of the present application.
The display unit 130 may be used to receive input numeric or character information and generate signal input related to user settings and function control of the communication terminal 100, and particularly, the display unit 130 may include a touch screen 131 disposed on the front surface of the communication terminal 100 and may collect touch operations of a user thereon or nearby, such as clicking a button, dragging a scroll box, and the like.
The display unit 130 may also be used to display a Graphical User Interface (GUI) of information input by or provided to the user and various menus of the communication terminal 100. Specifically, the display unit 130 may include a display screen 132 disposed on the front surface of the communication terminal 100. The display 132 may be configured in the form of a liquid crystal display, a light emitting diode, or the like. The display unit 130 may be used to display various graphical user interfaces described in the embodiments of the present application.
The touch screen 131 may cover the display screen 132, or the touch screen 131 and the display screen 132 may be integrated to implement the input and output functions of the communication terminal 100, and after the integration, the touch screen may be referred to as a touch display screen for short. In the embodiment of the present application, the display unit 130 may display the application program and the corresponding operation steps.
The camera 140 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals which are then passed to the processor 180 for conversion into digital image signals.
The communication terminal 100 may further comprise at least one sensor 150, such as an acceleration sensor 151, a distance sensor 152, a fingerprint sensor 153, a temperature sensor 154. The communication terminal 100 may also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, optical sensor, motion sensor, and the like.
The audio circuitry 160, speaker 161, microphone 162 may provide an audio interface between a user and the communication terminal 100. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161. The communication terminal 100 may also be configured with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 162 converts the collected sound signal into an electrical signal, converts the electrical signal into audio data after being received by the audio circuit 160, and outputs the audio data to the RF circuit 110 to be transmitted to, for example, another terminal or outputs the audio data to the memory 120 for further processing. In the embodiment of the present application, the microphone 162 may acquire the voice of the user.
Wi-Fi is a short-range wireless transmission technology. Through the Wi-Fi module 170, the communication terminal 100 can help a user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband internet access.
The processor 180 is the control center of the communication terminal 100; it connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the communication terminal 100 and processes data by running or executing software programs stored in the memory 120 and calling data stored in the memory 120. In some embodiments, the processor 180 may include one or more processing units; the processor 180 may also integrate an application processor, which mainly handles the operating system, user interfaces, applications, etc., and a baseband processor, which mainly handles wireless communications. It will be appreciated that the baseband processor described above may alternatively not be integrated into the processor 180. The processor 180 in the embodiment of the present application may run the operating system, application programs, user interface display and touch response, as well as the processing method of the embodiment of the present application. In addition, the processor 180 is coupled with the display unit 130 and the camera 140.
And the bluetooth module 181 is configured to perform information interaction with other bluetooth devices having a bluetooth module through a bluetooth protocol. For example, the communication terminal 100 may establish a bluetooth connection with a wearable electronic device (e.g., a smart watch) having a bluetooth module via the bluetooth module 181, so as to perform data interaction.
The communication terminal 100 also includes a power supply 190 (such as a battery) to power the various components. The power supply may be logically connected to the processor 180 through a power management system to manage charging, discharging, power consumption, etc. through the power management system. The communication terminal 100 may also be configured with power buttons for powering the terminal on and off, and for locking the screen.
Fig. 5 is a block diagram of a software configuration of the communication terminal 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, which are an application layer, an application framework layer, an Android runtime (Android runtime) and system library (also called system layer), and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 5, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 5, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the communication terminal 100, for example, management of call status (including connected, disconnected, etc.).
The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application.
The notification manager enables the application to display notification information in the status bar and can be used to convey notification-type messages; such a notification can disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications that appear in the system's top status bar in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the communication terminal vibrates, or an indicator light flashes.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part contains the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
When a first application program is currently running on the communication terminal, a user interface of the first application program is displayed in a first window. If a window of a second application program (hereinafter referred to as a second window) appears on an upper layer of the first window at this moment, the user interface of the second application program is displayed in the second window; if the user interface in the second window partially blocks the user interface in the first window, the user's use of the first application program is affected.
At present, the second window can be closed only by the user's touch operation on a related function key in the second window, or by the user entering the corresponding application program or page through a touch operation in the second window and then exiting. The user operation is therefore cumbersome, and the second window interferes with the user's use of the first application program.
In the embodiment of the application, a user can perform a touch operation in the area of the first window that is not covered by the second window (hereinafter referred to as the first area), and in response the second window is hidden or closed, so that the user interface of the first application program is no longer covered by the user interface of the second application program. This avoids the interference of the second window with the user's use of the first application program; the user operation is simple, and the user experience can be improved.
The "application program" described in this embodiment of the present application may be a system application program, a third party application program, or other types of application programs, which is not limited in this embodiment of the present application.
Referring to fig. 6, a schematic diagram of a system architecture of a communication terminal provided in the embodiment of the present application is shown, where components related to the embodiment of the present application are shown, and in implementation, the system architecture of the communication terminal may include more components.
As shown in fig. 6, the application layer includes one or more applications, and only a first application 611 and a second application 612 are shown in this embodiment. The first application 611 and the second application 612 may be system applications or third-party applications. For example, the first application 611 may be a game, or an electronic reading browser, or a video player, etc.; the second application 612 may be a telephony application, or a popup application or the like.
In some embodiments, the window used to display the user interface of the first application 611 is typically a full-screen window, and the window used to display the user interface of the second application 612 is typically a "small window", i.e., a window smaller in size than the full-screen window. If the window of the second application is located above the window of the first application 611, the user interface of the second application 612 partially blocks the user interface of the first application.
The window of the second application 612 may be located on a layer above the window of the first application 611, or a plurality of layers may be spaced between the window of the second application 612 and the window of the first application 611, which is not limited in this embodiment of the present application.
In this embodiment of the present application, the size of the window for displaying the user interface of the first application 611 and the size of the window for displaying the user interface of the second application 612 are not limited, and as long as the user interface of the second application partially covers the user interface of the first application, the method provided in this embodiment of the present application may be applied, so that the user interface of the first application is no longer covered by the user interface of the second application.
In this embodiment of the application, a user input event (input event) is registered for the second application 612, whose user interface may block the user interfaces of other applications and cause interference to the user. User input events (input events) may include one or more types of events, such as the following exemplary events:
an onClick event, which is triggered when the user touches the item (in touch mode), or focuses on the item using the navigation keys and presses the appropriate "enter" button;
an onLongClick event, which is triggered when the user keeps touching the item (in touch mode), or focuses on the item using the navigation keys and keeps the appropriate "enter" button pressed;
an onTouch event, which is triggered when the user performs a qualifying touch operation, including a press, a release, or any on-screen gesture (within the boundaries of the item).
The embodiment of the application does not limit the specific type of the user input event.
In this embodiment, the second application 612, which registers the user input event, may receive information of the user input event transmitted from the input controller 631 in the system layer. The information (or called event parameter) of the user input event may include a coordinate position on the screen of the communication terminal acted by a user touch operation corresponding to the user input event. The second application 612 may determine a position on the screen of the communication terminal where a corresponding user touch operation acts according to the received information of the user input event, and perform a set operation, such as closing the window of the second application 612, when it is determined that the position is not within the window of the second application 612.
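The decision described above, closing the second window when a touch falls outside it, can be sketched in plain Java. The class and method names below are illustrative stand-ins rather than Android SDK APIs, and the window bounds are hypothetical:

```java
// Illustrative sketch: the second application receives the screen
// coordinates carried in a user input event and closes its own window
// when the touch falls outside the window bounds. Names are hypothetical.
public class SecondAppEventHandler {
    // Screen bounds of the second application's "small window".
    private final int left, top, right, bottom;
    private boolean windowOpen = true;

    public SecondAppEventHandler(int left, int top, int right, int bottom) {
        this.left = left;
        this.top = top;
        this.right = right;
        this.bottom = bottom;
    }

    // Called with the coordinate position carried by the user input event.
    public void onUserInputEvent(int x, int y) {
        boolean inside = x >= left && x < right && y >= top && y < bottom;
        if (!inside) {
            // The touch did not act within the second window:
            // perform the set operation, here closing the window.
            windowOpen = false;
        }
    }

    public boolean isWindowOpen() { return windowOpen; }
}
```

A touch inside the bounds leaves the window open; any touch outside them closes it.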
The application framework layer includes a window manager 621, and the window manager 621 is implemented by software and represents a system service, which may be referred to as wmmsrvice. The window manager 621 has a function of a conventional window manager, for example, to pass information of a user input event received from a lower layer to a visible window of an application program in a broadcast manner.
The input controller 631 is included in the system layer, and the input controller 631 is implemented by software and represents a system service, which may be referred to as SysInputCtrl. The input controller 631 may receive information of a user input event from the lower layer (kernel layer) and transfer the information of the user input event to the window manager 621 in the application framework layer and to the second application 612 in the application layer registered with the user input event.
Optionally, in some embodiments, a hardware input capturer 641 is included in the kernel layer, and the hardware input capturer 641 is implemented in a software manner and represents a system service, which may be referred to as hardwareinputcapturedevice. The hardware input capturer 641 may obtain information for a user input event from the underlying layer and pass the information for the user input event to the input controller 631 in the system layer.
Alternatively, the hardware input capturer 641 may obtain information of the user input event directly from the driver node of the hardware driver layer. The driving node of the hardware driving layer is used for storing information of user input events corresponding to user touch operations.
Specifically, in the embodiment of the present application, a hardware input capturer 641 (HardwareInputCapture service) may be added to the Hardware Abstraction Layer (HAL), so as to read information of an input event directly from the /dev/input/event0 (event1/event2 ...) bottom-layer driver node and call it back to the input controller 631 (SysInputCtrl) in the system layer. Because the hardware input capturer 641 may directly acquire information of the user operation event from the bottom-layer driver node, acquiring the user input event information is more convenient than in the conventional method, in which it is acquired based on the EventHub mechanism.
The HAL is a hardware interface layer abstracted from the hardware platform. This interface layer is responsible for implementing the functions and control of the specific hardware platform and provides a uniform application programming interface (API) for the upper layer, realizing isolation between the upper layer and the bottom-layer hardware.
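For illustration, the record format read from a /dev/input/event* node can be decoded as below. This is a minimal sketch assuming the 64-bit little-endian Linux layout of struct input_event (two 8-byte timestamp fields, a 2-byte type, a 2-byte code, and a 4-byte value); the class name is illustrative, not part of the patent's services:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Decodes one 24-byte Linux input_event record as stored by the driver
// node, assuming a 64-bit little-endian platform. Field names follow
// <linux/input.h>; the class name itself is illustrative.
public class InputEventRecord {
    public final long sec, usec;        // timestamp of the touch operation
    public final int type, code, value; // e.g. EV_ABS / ABS_MT_POSITION_X / coordinate

    public InputEventRecord(byte[] raw) {
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        sec   = buf.getLong();            // seconds part of struct timeval
        usec  = buf.getLong();            // microseconds part
        type  = buf.getShort() & 0xFFFF;  // event type
        code  = buf.getShort() & 0xFFFF;  // event code
        value = buf.getInt();             // event value (coordinate, key state, ...)
    }
}
```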
Based on the system architecture of the communication terminal shown in fig. 6, fig. 7 exemplarily shows a flowchart of the display method provided in the embodiment of the present application, and as shown in the drawing, the flowchart may include:
step 701: and displaying a second window on the upper layer of the first window.
The first window displays a first user interface of a first application, the second window displays a second user interface of a second application, and the first user interface is partially shielded by the second user interface.
In some scenarios, the communication terminal is currently running a first application, and a first user interface of the first application is displayed on the screen of the communication terminal in a full-screen manner. When a certain event is triggered, the second window of the second application program is displayed on the upper layer of the first window of the first application program, and the user interface of the second application program partially blocks the user interface of the first application program.
Step 702: and closing the second window in response to the user touch operation acting on the first area. The first area is an area where the first window is not shielded by the second window, and the first area does not include the second window area.
In this step, the user may perform a touch operation in the first area, so that the second window is closed.
Specifically, after the user performs the touch operation in the first area and the touch screen of the communication terminal receives the touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a user input event (including the touch coordinates, a time stamp of the touch operation, and the like). Information of the user input event is stored in a driver node of the hardware driver layer. The hardware input capturer of the kernel layer reads the information of the user input event from the driver node of the hardware driver layer and then transmits it to the input controller of the system layer. The input controller of the system layer transmits the information of the user input event to the window manager of the application framework layer on one hand, and to the second application program that has registered the user input event on the other hand.
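The fan-out performed by the input controller can be modeled as follows. The real interface of SysInputCtrl is not specified in the text, so the sink interface and method names here are illustrative assumptions:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the input controller's fan-out: event information
// read from the driver node is delivered both to the window manager and
// to every application that registered the user input event.
public class InputControllerSketch {
    public interface EventSink { void onEvent(int x, int y); }

    private EventSink windowManager;
    private final List<EventSink> registeredApps = new ArrayList<>();

    public void setWindowManager(EventSink wm) { windowManager = wm; }
    public void register(EventSink app) { registeredApps.add(app); }

    public void deliver(int x, int y) {
        if (windowManager != null) windowManager.onEvent(x, y); // framework path
        for (EventSink app : registeredApps) app.onEvent(x, y); // direct path
    }
}
```

The direct path is what lets a registered second application react to an event even though the touch did not act on its own window.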
After receiving the information of the user input event from the input controller, the second application program judges, according to the information, whether the corresponding user touch operation acts on its own window (namely, the second window); if not, the second window is closed.
After receiving the information of the user input event from the input controller, the window manager of the application framework layer processes the information in the conventional manner. Specifically, after receiving the information of the user input event, the window manager determines the window corresponding to the user input event according to the position on which the corresponding user touch operation acts, determines the application program to which the window belongs, and broadcasts the information of the user input event to the visible window of each application program, where the information carries identification information (such as the application package name) of the determined application program. Each application program receiving the information of the user input event broadcast by the window manager judges whether the carried identification information is its own; if so, the application program responds to the user input event, and otherwise it discards the information of the user input event.
For example, if the user touch operation is performed in the first window, the information of the user input event broadcast by the window manager includes a first application package name, the first application responds to the user input event after receiving the information of the user input event, and the second application discards the information of the user input event after receiving the information of the user input event; for another example, if the user touch operation is performed in the second window, the information of the user input event broadcast by the window manager includes a second application package name, the second application responds to the user input event after receiving the information of the user input event, and the first application discards the information of the user input event after receiving the information of the user input event.
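The receiving side of this broadcast can be sketched as below; the package names used are hypothetical examples:

```java
// Illustrative receiver: every application with a visible window gets
// the broadcast event information, compares the carried package name
// with its own, and either responds or discards the information.
public class AppReceiver {
    private final String ownPackage;
    private boolean responded = false;

    public AppReceiver(String ownPackage) { this.ownPackage = ownPackage; }

    public void onBroadcast(String targetPackage) {
        if (ownPackage.equals(targetPackage)) {
            responded = true;   // respond to the user input event
        }                       // otherwise: discard the information
    }

    public boolean hasResponded() { return responded; }
}
```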
It should be noted that there may be a plurality of second applications, with a correspondingly plural number of second windows. In this case, each second application may execute according to the flow shown in fig. 7 after receiving the user input event information from the input controller of the system layer.
It can be seen from the foregoing embodiments that, in the embodiments of the present application, an input controller is added in the system layer, and the input controller can transfer information of a user input event acquired from a lower layer to an application program in which the user input event is registered, so that the application program can process the received user input event. Because the input controller of the system layer transfers the user input event directly to the second application program that registered the event, bypassing the window manager of the application framework layer, the limitation of the window-manager-based event broadcasting mechanism, under which only the application program on which the user touch operation acts (such as the first application program) can respond to the user input event, is removed; an application program on which the touch operation does not act (such as the second application program) is also allowed to process it (close its window). This avoids the interference of the second window with the user's use of the first application program; the user operation is simple, and the user experience can be improved.
Referring to fig. 8, a system architecture provided for the embodiment of the present application shows components related to the embodiment of the present application, and in implementation, the system architecture of the communication terminal may include more components.
As shown in fig. 8, one or more applications are included in the application layer, and only a first application 811 and a second application 812 are shown in this embodiment. The first application 811 and the second application 812 may be system applications or third-party applications. For example, the first application 811 may be a game, or an electronic reading browser, or a video player, etc.; the second application 812 may be a telephony application, or a popup application, or the like.
In some embodiments, the window used to display the user interface of the first application 811 is typically a full-screen window, and the window used to display the user interface of the second application 812 is typically a "small window", i.e., a window smaller in size than the full-screen window. If the window of the second application is located above the window of the first application 811, the user interface of the second application 812 partially blocks the user interface of the first application.
In this embodiment, the size of the window for displaying the user interface of the first application 811 and the size of the window for displaying the user interface of the second application 812 are not limited, and as long as the user interface of the second application partially covers the user interface of the first application, the method provided in this embodiment may be applied, so that the user interface of the first application is no longer covered by the user interface of the second application.
In the embodiment of the present application, the first application 811 and the second application 812 may receive information of a user input event broadcast from the window manager 821 of the application framework layer and process it in a conventional manner. For the description related to the user input event and the description related to the information related to the user input event, reference may be made to the foregoing embodiments.
The application framework layer includes a window manager 821, which is implemented in software and represents a system service that may be referred to as wmmsrvice. In addition to the functions of a conventional window manager, the window manager 821 has the following function: according to the user input event information, it judges whether a certain condition is met and, if so, adjusts the window of the first application program (the first window) and the window of the second application program (the second window). Specifically, the window manager 821 may determine, according to the user input event information, the area on which the user touch operation acts; if that area is the visible area of the first window, it places the first window on the upper layer of the second window, so that the second user interface in the second window is blocked by the first user interface in the first window, that is, the second window is hidden.
Optionally, the window manager 821 may query the application visible window list according to the information of the user touch operation coordinate position in the user input event information, so as to determine the visible area of the application visible window where the user touch operation coordinate position is located. The application program visible window list stores visible window identification information, identification information of an application program to which the visible window belongs, and the position and size of a visible area of the visible window. Further, the application visible window list may further include a hierarchical relationship of the visible window of each application, and of course, the hierarchical relationship information of the visible window of each application may also be stored in an independent application window hierarchical list. Wherein the identification information of the application program can be the name of the application program package, and the position and the size of the visible window area can be characterized by using the coordinates of the visible area.
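A minimal model of the application visible window list described above, assuming rectangular visible areas and a list ordered top layer first; the entry fields and class names are illustrative, not the actual framework data structures:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the application visible window list: each entry stores the
// visible window's identification information, the package name of the
// owning application, and the visible-area coordinates. Entries are kept
// top layer first, so a touch coordinate maps to the topmost window.
public class VisibleWindowList {
    static final class Entry {
        final String windowId, pkg;
        final int left, top, right, bottom; // visible-area coordinates

        Entry(String windowId, String pkg, int l, int t, int r, int b) {
            this.windowId = windowId;
            this.pkg = pkg;
            left = l; top = t; right = r; bottom = b;
        }
    }

    private final List<Entry> topFirst = new ArrayList<>();

    public void add(String windowId, String pkg, int l, int t, int r, int b) {
        topFirst.add(new Entry(windowId, pkg, l, t, r, b));
    }

    // Returns the package owning the topmost visible window that
    // contains the user touch operation coordinate position (x, y).
    public String query(int x, int y) {
        for (Entry e : topFirst) {
            if (x >= e.left && x < e.right && y >= e.top && y < e.bottom) {
                return e.pkg;
            }
        }
        return null;
    }
}
```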
Further, the window manager 821 may also transmit the information of the position of the user touch operation coordinate, the identification information of the application program where the user touch operation coordinate is located, and the display level information of the application program window to the second application program. For example, the window manager 821 may pass the above information to an application having a visible window in the application layer by broadcasting.
If the current user touch operation is applied to the visible area of the visible window of the first application 811, the window manager 821 transfers information in which the identification information of the application where the user touch operation coordinate position is located is the identification information of the first application 811, and the application window display level information indicates that the second window of the second application 812 is on the upper layer of the first window of the first application 811.
Further, after receiving the information transmitted by the window manager 821, the second application 812 may close the second window if it determines, according to the identification information of the application in which the user touch operation coordinate position is located (for example, the application package name of the first application 811), that the user touch operation does not act on the second window of the second application, and determines, according to the application window display level information, that the second window is located on the upper layer of the first window of the first application 811.
Further, after receiving the information transmitted by the window manager 821, the first application 811 determines, according to the identification information of the application in which the user touch operation coordinate position is located (for example, the application package name of the first application 811), whether the user touch operation acts on the window of the first application; if so, it determines the position of the user touch operation according to the information of the user touch operation coordinate position and responds to the user touch operation.
Alternatively, the window manager 821 may update the application visible window list each time a window is opened or its display position changes. Specifically, before the window hierarchy is adjusted, the window manager 821 may update the visible area of the first window in the application visible window list according to the position and size of the second window; since the second window is displayed on the upper layer of the first window, the first window is partially blocked by the second window, and the visible area of the first window therefore excludes the area in which the first window is blocked by the second window. After the window hierarchy is adjusted (i.e., after the first window is placed on the upper layer of the second window), the window manager 821 updates the visible areas of the first window and the second window in the application visible window list again according to the adjusted hierarchy relationship: if the first window is a full-screen window, its visible area is the full-screen window area, and since the second window is completely blocked by the first window, the second window has no visible area.
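The visible-area bookkeeping for the occluded full-screen window can be illustrated with simple rectangle arithmetic. This is a sketch, not the actual window manager computation: before the adjustment, the first window's visible area is the full screen minus the part covered by the second window; after the first window is raised, the second window's visible area drops to zero:

```java
// Illustrative computation: visible area (in pixels) of a full-screen
// window of size w x h that is partially covered by a small window with
// screen bounds [l, t, r, b]. The covered part is the intersection of
// the small window with the screen.
public class VisibleArea {
    public static long visibleAreaOfLower(int w, int h,
                                          int l, int t, int r, int b) {
        int iw = Math.max(0, Math.min(r, w) - Math.max(l, 0)); // intersection width
        int ih = Math.max(0, Math.min(b, h) - Math.max(t, 0)); // intersection height
        return (long) w * h - (long) iw * ih; // full area minus occluded part
    }
}
```

When the small window covers the whole screen, the lower window's visible area is zero, matching the "completely blocked" case above.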
For example, as shown in fig. 9, if the current communication terminal is running a game, the game window 910 for displaying the game interface is a full-screen window, so that the game interface is displayed on the screen of the communication terminal in a full-screen manner. At a certain moment, the communication terminal receives an incoming call, and as a response to the event, the incoming call display window 911 of the call application program is displayed on the upper layer of the game window to partially shield the game interface. The window manager 821 updates the application visible window list, wherein the visible area of the game window 910 is shown as the first area 912 (the area is filled by oblique lines), and the visible area of the caller ID window 911 is the whole area of the window.
The kernel layer includes a hardware input capturer 841, and the hardware input capturer 841 is implemented by software and represents a system service, which may be referred to as HardwareInputCapture service. The hardware input capturer 841 may obtain information of the user input event from the underlying layer and pass the information of the user input event to the window manager 821 of the application framework layer.
Alternatively, the hardware input capturer 841 may directly obtain information of the user input event from a driver node of the hardware driver layer. The driver node of the hardware driver layer is used for storing information of user input events corresponding to user touch operations.
Based on the system architecture of the communication terminal shown in fig. 8, fig. 10 exemplarily shows a flowchart of the display method provided in the embodiment of the present application, and as shown in the drawing, the flowchart may include:
step 1001: and displaying a second window on the upper layer of the first window.
The first window displays a first user interface of a first application, the second window displays a second user interface of a second application, and the first user interface is partially shielded by the second user interface.
In some scenarios, the communication terminal is currently running a first application, and the first user interface of the first application is displayed full-screen on the terminal screen. When a certain event is triggered, the second window of the second application program is displayed on the upper layer of the first window of the first application program, and the user interface of the second application program partially blocks the user interface of the first application program.
Step 1002: hiding the second window in response to a user touch operation acting on a first area, where the first area is the area of the first window that is not covered by the second window, that is, the visible area of the first window.
In this step, the user may perform a touch operation in the first area, so that the second window is hidden.
Specifically, after the user performs the touch operation in the first area and the touch screen of the communication terminal receives it, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a user input event (including the touch coordinates, a timestamp of the touch operation, and the like), and the information of the user input event is stored in a driver node of the hardware driver layer. The hardware input capturer of the kernel layer reads the information of the user input event directly from the driver node of the hardware driver layer and then passes it to the window manager of the application framework layer.
After the window manager of the application framework layer receives the information of the user input event, it determines the area on which the user touch operation acts according to that information. If the area on which the touch operation acts is the visible area of the first window, the window manager places the first window on the upper layer of the second window, so that the second window is hidden, that is, the user interface in the second window is blocked by the user interface in the first window.
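The hit-test-and-raise step just described can be sketched as follows. The names (`Window`, `onTouch`, the single-rectangle visible area) are simplifications invented for the sketch, not the patent's implementation; the topmost window is the head of a deque standing in for the window z-order.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch: hit-test the touch against the first window's visible area and,
// on a hit, move the first window to the top of the z-order so the second
// window is hidden beneath it.
public class WindowReorder {
    static class Window {
        final String name;
        int left, top, right, bottom;  // visible area (one rectangle, for simplicity)
        Window(String name, int l, int t, int r, int b) {
            this.name = name; left = l; top = t; right = r; bottom = b;
        }
        boolean visibleAreaContains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    // zOrder: topmost window first.
    static void onTouch(Deque<Window> zOrder, Window first, int x, int y) {
        if (first.visibleAreaContains(x, y)) {
            zOrder.remove(first);
            zOrder.addFirst(first);  // first window now blocks the second
        }
    }

    public static void main(String[] args) {
        Window game = new Window("game", 0, 400, 1080, 2340); // visible below the call window
        Window call = new Window("call", 0, 0, 1080, 400);
        Deque<Window> zOrder = new ArrayDeque<>();
        zOrder.addFirst(game);
        zOrder.addFirst(call);             // call window starts on top
        onTouch(zOrder, game, 540, 1200);  // touch inside the game's visible area
        System.out.println("top=" + zOrder.peekFirst().name);
    }
}
```

A touch inside the call window's own area would leave the z-order unchanged, which matches the condition in the text: only a touch in the first window's visible area triggers the reordering.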
Optionally, in some embodiments, the window manager of the application framework layer determines, according to the position on which the user touch operation acts, the window corresponding to the user input event and the application program to which that window belongs, and broadcasts the information of the user input event to the visible window of each application program, carrying the identification information (such as the application package name) of the determined application program. Each application program that receives the broadcast checks whether the carried identification information matches its own identification information; if so, it responds to the user input event, otherwise it discards the information. For example, if the user touch operation acts in the visible area of the first window, the broadcast information carries the first application package name, so the first application responds to the user input event while the second application discards it; conversely, if the touch operation acts in the visible area of the second window, the broadcast information carries the second application package name, so the second application responds while the first application discards it.
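The broadcast-and-filter scheme can be sketched as below. The package names and the `onEventBroadcast` callback are illustrative assumptions for the sketch: the event information is tagged with the hit application's package name, every application with a visible window receives it, and each one acts only if the tag matches its own package.

```java
import java.util.List;

// Sketch: the window manager broadcasts the event to all apps with visible
// windows; each app responds only if the carried package name is its own.
public class EventBroadcast {
    static class InputEventInfo {
        final int x, y;
        final String targetPackage;  // package name of the app whose window was hit
        InputEventInfo(int x, int y, String pkg) { this.x = x; this.y = y; targetPackage = pkg; }
    }

    static class App {
        final String packageName;
        boolean handled = false;
        App(String pkg) { packageName = pkg; }
        void onEventBroadcast(InputEventInfo info) {
            if (packageName.equals(info.targetPackage)) {
                handled = true;  // respond to the user input event
            }                    // otherwise discard the information
        }
    }

    public static void main(String[] args) {
        App game = new App("com.example.game");
        App call = new App("com.example.call");
        List<App> appsWithVisibleWindows = List.of(game, call);

        // The touch hit the game window's visible area, so the broadcast
        // carries the game's package name.
        InputEventInfo info = new InputEventInfo(540, 1200, "com.example.game");
        for (App app : appsWithVisibleWindows) app.onEventBroadcast(info);

        System.out.println("game=" + game.handled + " call=" + call.handled);
    }
}
```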
Optionally, in another embodiment, the window manager may instead transmit the coordinate position of the user touch operation, the identification information of the application program at that coordinate, and the window display hierarchy information to the second application program. The second application program then determines, from the identification information, that the user touch operation does not act on its second window, and closes the second window when the display hierarchy information indicates that the second window is on the upper layer of the first window of the first application program.
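In this alternative the decision moves into the second application itself, which can be sketched as below. The names (`SecondApp`, `onTouchInfo`) and the boolean encoding of the hierarchy information are assumptions for the sketch.

```java
// Sketch: the second application receives the hit app's identity and the
// window z-order from the window manager and closes its own window when the
// touch fell outside it while it was stacked above the first window.
public class SelfClosingWindow {
    static class SecondApp {
        final String packageName;
        boolean windowOpen = true;
        SecondApp(String pkg) { packageName = pkg; }

        // hitPackage: identification of the app at the touch coordinate.
        // isAboveFirstWindow: display-hierarchy info from the window manager.
        void onTouchInfo(String hitPackage, boolean isAboveFirstWindow) {
            boolean touchOutsideOurWindow = !packageName.equals(hitPackage);
            if (touchOutsideOurWindow && isAboveFirstWindow) {
                windowOpen = false;  // close the second window
            }
        }
    }

    public static void main(String[] args) {
        SecondApp callApp = new SecondApp("com.example.call");
        // Touch landed in the first (game) app's window while the call
        // window was stacked above it, so the call app closes its window.
        callApp.onTouchInfo("com.example.game", true);
        System.out.println("open=" + callApp.windowOpen);
    }
}
```

A touch inside the second window itself, or a second window that is not above the first, leaves the window open, matching the two conditions in the text.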
In this case, after receiving the user input event information from the input controller of the system layer, the window manager of the application framework layer determines that the user touch operation is applied to the visible area of the window of the first application according to the flow shown in fig. 10, and then places the first window on top of all the second windows.
As can be seen from the foregoing embodiment, the window manager in the application framework layer may intercept a user input event, adjust the window hierarchy when it determines from the event that a specific condition is satisfied, and then distribute the event information. Because the window hierarchy is adjusted by the window manager of the application framework layer, the interference of the second window with the user's use of the first application is resolved at the system level: the application programs themselves need not be modified, so the approach is widely applicable, the user operation is simple, and the user experience is improved.
It should be noted that in another embodiment of the present application, the system architectures shown in fig. 5 and fig. 8 may be combined; for example, in addition to adding an input controller in the system layer, the window manager in the application framework layer may be extended with the functions of the window manager in fig. 8, so that both the flow of fig. 7 and the flow of fig. 10 can be implemented.
According to the above embodiment, as shown in fig. 11a, if the communication terminal is currently running a game, the game window 1110 displaying the game interface is a full-screen window, so the game interface is displayed full-screen on the terminal screen. At a certain moment the terminal receives an incoming call; in response, the incoming call display window 1111 of the call application is displayed on the upper layer of the game window and partially blocks the game interface. After the user clicks in the first area 1112, according to the flow shown in fig. 7 or fig. 10, the incoming call display window 1111 is closed by the call application or hidden below the game window 1110, and the game responds to the click with the corresponding processing, so the user's game operation is not affected.
Further, if the call application determines that the touch operation of the user is not applied to the incoming call display window 1111, the incoming call may be hung up before the incoming call display window is closed.
According to the above flow, as shown in fig. 11b, if the communication terminal is currently running a game, the game window 1120 displaying the game interface is a full-screen window, so the game interface is displayed full-screen on the terminal screen. Because the user accidentally touches the edge of the screen, a small window 1121 displaying a screen-recording shortcut icon is displayed on the upper layer of the game window and partially blocks the game interface. After the user clicks in the first region 1122, according to the flow shown in fig. 7 or fig. 10, the small window 1121 is closed or hidden below the game window 1120, and the game responds to the click with the corresponding processing, so the user's game operation is not affected.
According to the above flow, as shown in fig. 11c, if the current communication terminal is running the electronic reading application, the window 1130 for displaying the e-book page is a full screen window, so that the e-book page is displayed on the screen of the communication terminal in a full screen manner. At a certain moment, the advertisement popup 1131 is displayed on the upper layer of the e-book page, and partially shields the e-book page. After the user performs the leftward sliding operation in the first region 1132, according to the flow shown in fig. 7 or 10, the advertisement popup 1131 is closed or hidden to a lower layer of the window 1130 of the e-book page, and at the same time, the next page of the e-book is displayed on the screen of the communication terminal in response to the leftward sliding operation.
According to a further aspect of the exemplary embodiments, there is provided a computer storage medium having stored therein computer program instructions which, when run on a computer, cause the computer to perform a display method as described above.
According to a further aspect of the exemplary embodiments, there is provided a computer program product which, when invoked by a computer, causes the computer to perform any of the display methods described above.
Since the communication terminal and the computer storage medium in the embodiments of the present application may apply the above display method, their technical effects can be found in the method embodiments and are not repeated here.
Those of ordinary skill in the art will understand that all or a portion of the steps of the above method embodiments may be completed by program instructions controlling the relevant hardware. The program may be stored in a computer-readable storage medium; when executed, it performs the steps of the method embodiments. The storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks, or optical disks.
While specific embodiments of the present application have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and that the scope of the present application is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the spirit and principles of this application, and these changes and modifications are intended to be included within the scope of this application.

Claims (8)

1. A communication terminal, comprising:
a touch screen configured to receive a touch operation from a user;
a display screen configured to display a user interface;
a processor coupled to the touch screen and the display screen, respectively;
the display screen displays a first window and a second window, a first user interface of a first application is displayed in the first window, a second user interface of a second application is displayed in the second window, and the first user interface is partially shielded by the second user interface;
the processor is configured to:
in response to a user touch operation acting on a first area, closing the second window, wherein the first area is an area where the first window is not shielded by the second window;
the processor is configured with an input controller configured to:
obtaining information of a user input event corresponding to the user touch operation;
and transmitting the information of the user input event to a second application program that has registered for the user input event, so that the second application program closes the second window when it determines, according to the information of the user input event, that the user touch operation does not act in the second window.
2. The communication terminal of claim 1, wherein the input controller is further configured to:
and directly acquiring the information of the user input event from a driving node of a hardware driving layer, wherein the driving node of the hardware driving layer is used for storing the information of the user input event corresponding to the user touch operation.
3. The communication terminal of claim 1, wherein the second window is a caller id window of a call application;
the call application is further configured to: hang up the incoming call if it determines, according to the information of the user input event, that the user touch operation does not act in the second window.
4. The communication terminal of claim 1, wherein the processor is configured with a window manager configured to:
obtaining information of a user input event corresponding to the user touch operation;
and determining an area acted by the user touch operation according to the user input event information, and if the area acted by the user touch operation is a visible area of the first window, placing the first window on the upper layer of the second window, so that the second user interface in the second window is shielded by the first user interface in the first window.
5. The communication terminal of claim 4, wherein the window manager is further configured to:
and inquiring an application program visible window list according to the position information of the user touch operation coordinate in the user input event information to obtain a visible area of an application program visible window where the user touch operation coordinate is located, wherein the application program visible window list stores visible window identification information, identification information of an application program to which the visible window belongs, and the position and the size of the visible area of the visible window.
6. The communication terminal of claim 5, wherein the window manager is further configured to:
before the first window is placed on the upper layer of the second window, the visible area of the first window in the application program visible window list is updated according to the position and the size of the second window, and the visible area of the first window does not include the area of the first window, which is shielded by the second window.
7. The communication terminal of claim 5, wherein the window manager is further configured to:
transmitting the user touch operation coordinate position information, the identification information of the application program where the user touch operation coordinate position is located and application program window display level information to the second application program, so that the second application program judges that the user touch operation is not acted in a second window of the second application program according to the identification information of the application program where the user touch operation coordinate position is located, and closes the second window when the second window is determined to be located at the upper layer of the first window of the first application program according to the application program window display level information; the identification information of the application program where the user touch operation coordinate position is located is the identification information of the first application program, and the application program window display level information indicates that the second window of the second application program is on the upper layer of the first window of the first application program.
8. A display method, comprising:
displaying a second window on an upper layer of a first window, wherein a first user interface of a first application is displayed in the first window, a second user interface of a second application is displayed in the second window, and the first user interface is partially shielded by the second user interface;
in response to a user touch operation acting on a first area, closing the second window, wherein the first area is an area where the first window is not shielded by the second window;
the method further comprises the following steps:
acquiring information of a user input event corresponding to user touch operation;
and transmitting the information of the user input event to a second application program that has registered for the user input event, so that the second application program closes the second window when it determines, according to the information of the user input event, that the user touch operation does not act in the second window.
CN202110152969.0A 2021-01-22 2021-02-03 Communication terminal and display method Active CN112835472B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021100908007 2021-01-22
CN202110090800 2021-01-22

Publications (2)

Publication Number Publication Date
CN112835472A CN112835472A (en) 2021-05-25
CN112835472B true CN112835472B (en) 2023-04-07
