Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Before introducing the solution provided by the embodiments of the present application, an implementation of a multi-window system is first introduced. Specifically, the method includes steps S101 and S102:
S101: For each started application, the terminal stores each display unit of the application in a container corresponding to that application, where different applications correspond to different containers.
In the embodiment of the present application, the terminal includes but is not limited to: mobile phones, tablet computers, and the like.
On the terminal, after an application is started, display units for displaying the interfaces of the application are started during the running of the application. The terminal can instantiate a container for each started application, store each display unit started by that application in the corresponding container, and subsequently use the stored display units to display the corresponding application interfaces.
The container may be a stack. The display unit located at the top of the stack is in the Active state; a display unit in the Active state is called by a process or thread of the application to display the corresponding application interface to the user, the content on that application interface may be refreshed in real time, and the interface may interact with the user through various preset input instructions. At this time, all other display units in the stack are in the Paused state or the Stopped state, and a display unit in the Paused or Stopped state is not currently displayed to the user. When the application jumps from the current application interface to another application interface, the display unit corresponding to the other application interface is popped out of the stack and pushed back at the top of the stack, that is, it is activated and switched to the Active state; correspondingly, the display unit originally at the top of the stack moves down and is switched to the Paused or Stopped state, so that the other application interface becomes the interface currently displayed to the user.
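The container behavior described above can be sketched as follows. This is an illustrative Python model only, not the terminal's actual implementation; the class and method names are hypothetical:

```python
class ActivityStack:
    """Illustrative model of the per-application container described above."""
    ACTIVE, PAUSED = "Active", "Paused"

    def __init__(self):
        self._stack = []          # index -1 is the top of the stack

    def push(self, display_unit):
        self._stack.append(display_unit)

    def top(self):
        return self._stack[-1] if self._stack else None

    def state_of(self, display_unit):
        # Only the top-of-stack display unit is Active; all others are
        # Paused (or Stopped) and not currently displayed to the user.
        return self.ACTIVE if display_unit == self.top() else self.PAUSED

    def jump_to(self, display_unit):
        # Jumping to another interface: its display unit is removed from its
        # current position and placed back at the top, i.e. re-activated.
        self._stack.remove(display_unit)
        self._stack.append(display_unit)


stack = ActivityStack()
stack.push("a1")
stack.push("a2")
assert stack.state_of("a2") == "Active"   # top of stack is Active
stack.jump_to("a1")                       # switch to the interface of a1
assert stack.state_of("a1") == "Active"
assert stack.state_of("a2") == "Paused"   # former top moves down
```

The jump operation moves the target display unit to the top rather than duplicating it, which mirrors the pop-and-push-back behavior described in the text.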
S102: The terminal displays the window of the application according to the window attribute of the application and each display unit stored in the container corresponding to the application.
In the embodiment of the application, each display unit of the application includes various predefined display elements on the application interface, such as text, pictures, videos, controls, and the like. Generally, when an application process calls a display unit to display an application interface, the resolution information of the current terminal screen is acquired, and the application interface is then displayed adaptively in full screen according to that resolution information. If the application interface corresponding to each display unit is instead to be displayed in a window, the size and position of the window to be displayed need to be determined; window attributes can therefore be predefined to store this information, which can be tracked and updated during the lifetime of the window (the time from the creation of the window to its closing).
The window attributes may include a window length value, a window width value, a window coordinate value, and the like. The coordinate referred to by the window coordinate value is a two-dimensional coordinate on the current terminal screen, used to locate the position of the window; it may be the coordinate of the intersection of the window's two diagonals, or the coordinate of another specific point in the window region (e.g., a vertex of the window). For a rectangular window, once the window length value, window width value, and window coordinate value are determined, the size and position of the window are determined.
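The relationship between these attributes and the window's on-screen rectangle can be illustrated with a small sketch. Here the window coordinate value is assumed to be the intersection of the window's diagonals (its center), one of the options mentioned above; the function name and tuple layout are illustrative, not taken from the source:

```python
def window_rect(length, width, center):
    """Derive (left, top, right, bottom) screen bounds from a window's
    length value, width value, and center coordinate value."""
    cx, cy = center
    left, top = cx - length / 2, cy - width / 2
    return (left, top, left + length, top + width)


# A 400x300 window whose diagonals intersect at screen point (500, 400):
assert window_rect(400, 300, (500, 400)) == (300.0, 250.0, 700.0, 550.0)
```

Once length, width, and the coordinate value are fixed, the rectangle (and hence the window's size and position) is fully determined, as the text states.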
In this embodiment of the application, the terminal may determine the size and the position of the window on the terminal screen according to the window attribute of the application, and then call, through a corresponding process or thread, a corresponding display unit stored in a container corresponding to the application to display the window of the application.
In an embodiment of the present application, the display unit specifically includes an activity, and the container specifically includes an activity stack.
The activity stack is a first-in/last-out (FILO) data structure, and the valid data stored in the activity stack is each activity of the application corresponding to that stack. The following illustrates an activity saving manner provided in the embodiment of the present application. Assuming that the terminal has started an application A and an application B, where application A has started 2 activities (activity a1 and activity a2) and application B has started 3 activities (activity b1 to activity b3), the terminal instantiates two activity stacks for respectively saving the activities of application A and the activities of application B.
FIG. 2 shows the activity stacks for application A and application B, with application A on the left and application B on the right. Only activity a2 and activity b3, at the tops of the two stacks, are in the Active state; the other activities are in either the Paused state or the Stopped state. Obviously, at this time application A is displaying the application interface corresponding to activity a2, and application B is displaying the application interface corresponding to activity b3. Assuming that application A jumps from the current application interface to the application interface corresponding to activity a1, the terminal may sequentially pop activity a2 and activity a1 from the activity stack of application A and then sequentially push activity a2 and activity a1 back, so that activity a1 is located at the top of the stack. Activity a1 is switched to the Active state, activity a2 is switched to the Paused or Stopped state, and the related process or thread of application A may then call activity a1 to display the corresponding application interface.
In the embodiment of the present application, another implementation manner of a multi-window system may specifically include the following first step and second step:
Step one: for each started application, store each display unit of the application in at least one container.
In an embodiment of the present application, the display unit may be an activity, and the container may be an activity stack.
The terminal may instantiate an activity stack for each application to store that application's activities, or may directly instantiate one or more activity stacks to be shared by all applications, so that the activities of each application can be stored in these activity stacks and subsequently used for display.
Step two: displaying a plurality of windows according to the set window attributes and each display unit stored in the container, where at least one window displays two or more visual units.
In this embodiment of the application, the terminal may display a window for each started application according to the set window attribute and each display unit stored in the activity stack, and in this way, only the visual unit of the application itself may be displayed in the window of each application.
The terminal may also instantiate one or more windows according to the set window attributes, and then display two or more visual units in each window according to each display unit stored in the activity stack, so that the use efficiency of the windows may be improved. Two or more visual units displayed in the same window can belong to the same application or belong to different applications.
Fig. 3 shows a schematic diagram of a multi-window display in this scenario, and it can be seen that there are two windows displayed on the terminal screen.
The window "applications A & B" is a window shared by applications A and B: on the left side of the window's display area (i.e., on the left side of the dotted line) is a visual unit of application A (also referred to as real-time content) displayed according to an activity of application A, and on the right side of the window's display area (i.e., on the right side of the dotted line) is a visual unit of application B displayed according to an activity of application B.
The window "application C" is a window of application C: a first visual unit of application C, displayed according to one activity of application C, is on the left side of the window's display area (i.e., on the left side of the dotted line), and a second visual unit of application C, displayed according to another activity of application C, is on the right side of the window's display area (i.e., on the right side of the dotted line).
In practical applications, when the terminal needs to display two or more visual units in the same window at the same time, the size and position of each visual unit within the window display area (referred to herein as a sub-area) may be predefined in the window attributes of the window, so that each visual unit can be correctly displayed in its corresponding sub-area. Depending on how the sub-areas are set, the visual units may be displayed in the same window in a completely overlapping manner, a partially overlapping manner, or a non-overlapping manner.
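The three sub-area arrangements just mentioned can be distinguished by a simple rectangle-intersection check. This is an illustrative sketch, not part of the described system; the rectangle format (left, top, right, bottom) and function name are assumptions:

```python
def overlap_kind(rect1, rect2):
    """Classify two sub-areas as completely overlapping (one contains the
    other), partially overlapping, or non-overlapping."""
    ix1 = max(rect1[0], rect2[0])     # intersection bounds
    iy1 = max(rect1[1], rect2[1])
    ix2 = min(rect1[2], rect2[2])
    iy2 = min(rect1[3], rect2[3])
    if ix2 <= ix1 or iy2 <= iy1:      # empty intersection
        return "non-overlapping"
    inter = (ix1, iy1, ix2, iy2)
    if inter == rect1 or inter == rect2:
        return "completely overlapping"
    return "partially overlapping"


left_half = (0, 0, 50, 100)           # sub-area for one visual unit
right_half = (50, 0, 100, 100)        # sub-area for another visual unit
assert overlap_kind(left_half, right_half) == "non-overlapping"
assert overlap_kind((0, 0, 100, 100), (10, 10, 40, 40)) == "completely overlapping"
```

The side-by-side layout of FIG. 3 corresponds to the non-overlapping case.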
By this method, the terminal can instantiate a plurality of activity stacks to save the activities of each started application, so that it can simultaneously display a plurality of windows according to the display unit at the top of each activity stack. It can also pre-divide a window display area into a plurality of sub-areas according to the set window attributes, so that one visual unit is displayed in each sub-area of the same window and the visual units of multiple applications can be displayed on the terminal screen at the same time.
For convenience of description, hereinafter, "window" and "visual unit" will be collectively referred to as "display object".
Based on the above description, to solve the problem that window switching cannot be realized based on an input event in the Android native system, an embodiment of the present application first provides a method for switching a display object in a multi-window system. As shown in fig. 4, the method includes the following steps:
Step 41: obtaining the coordinate value of the click point contained in the input event;
Step 42: judging whether the display object corresponding to the TouchableRegion containing the coordinate value of the click point is the top display object; if the determination result indicates that the display object is not the top display object, go to Step 43;
It should be noted that, when the display object is a window, the TouchableRegion may be a window attribute in an input list maintained by the input module described in the background art. The window attribute corresponding to a single window represents at least the coordinate range of the clickable area of that window, i.e., the coordinate values of the clickable points located within that area. In addition, the window attribute may also represent other parameters of the window, such as its size, position on the display screen, and/or width. In the embodiment of the present application, the coordinate range of the clickable area of the window and the values of the other parameters may change.
When the display object is a visual unit, the TouchableRegion may be a new window attribute obtained by modifying the window attribute described in the background art. Specifically, when the display object is a visual unit, the new window attribute may be an attribute of a visual unit in the window. The attribute corresponding to a single visual unit represents at least the coordinate range of the clickable area of that visual unit, i.e., the coordinate values of the clickable points located within that area. Furthermore, the attribute may also represent other parameters of the visual unit, such as its size, position on the display screen, and/or width. In the embodiment of the present application, the coordinate ranges of the clickable areas of the visual units and the values of the other parameters may change.
Step 43: setting the display object as the top display object, and refreshing the display interface.
When the display object is a visual unit, setting the visual unit as the top visual unit (which may also be referred to as the focus visual unit) means that the window in which the visual unit is located is also set as the top window (which may also be referred to as the focus window).
By adopting the method provided by the embodiment of the application, when it is judged that the display object corresponding to the clickable area containing the coordinate value of the click point is not the top display object, that display object is set as the top display object and displayed, thereby solving the problem that window switching cannot be realized based on an input event in the Android native system.
Some alternative embodiments of the above steps are described in detail below.
In Step 42, considering that the TouchableRegion of a display object changes after the display object is dragged, zoomed in, or zoomed out, in order to ensure the accuracy of the determination result obtained by performing Step 42, in one embodiment the TouchableRegion in the input list may be updated before judging whether the display object corresponding to the TouchableRegion containing the coordinate value of the click point is the top display object.
The manner of determining whether the display object corresponding to the clickable area containing the coordinate value of the click point is the top display object based on the updated input list may include:
determining the TouchableRegion containing the coordinate value of the click point by traversing the TouchableRegion in the updated input list; and judging whether the display object corresponding to the TouchableRegion containing the coordinate value is the top display object or not.
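The update-then-traverse procedure just described can be sketched as follows. This is an illustrative Python model; the real system operates on native input structures, and the data shapes and names here are assumptions:

```python
def hit_test(input_list, click):
    """Traverse the input list in order and return the display object whose
    TouchableRegion contains the click point, or None if no region matches."""
    x, y = click
    for entry in input_list:
        left, top, right, bottom = entry["touchable_region"]
        if left <= x < right and top <= y < bottom:
            return entry["object"]
    return None


# A hypothetical (already updated) input list with two display objects:
input_list = [
    {"object": "A", "touchable_region": (0, 0, 100, 100)},
    {"object": "B", "touchable_region": (100, 0, 200, 100)},
]
top_object = "A"                       # assume A is currently on top

hit = hit_test(input_list, (150, 50))  # click lands in B's region
assert hit == "B"
assert hit != top_object               # Step 42 result: not the top object,
                                       # so Step 43 (bring to top) would run
```

Because the list is updated before traversal, a region that moved due to dragging or resizing is still matched correctly.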
In the embodiment of the present application, the update to the input list may be periodic or aperiodic. For an aperiodic update, in one embodiment, the update may include: after determining that the TouchableRegion of the display object has changed, updating the TouchableRegion in the input list according to the changed TouchableRegion.
For Step 43:
In one embodiment, setting the display object as the top display object may be implemented by revising the Z-order list.
The Z-order list stores the Z-order value of each display object. Generally, the larger the Z-order value of a display object, the closer the display object is to the top in the stacking order; the smaller the Z-order value, the closer it is to the bottom. When displaying multiple display objects, the multi-window system takes the Z-order values stored in the Z-order list as the basis for determining the arrangement order of the display objects. Based on this characteristic, in the embodiment of the present application, a non-top display object may be set as the top display object by updating its Z-order value to be greater than the Z-order values of all other display objects in the multi-window system.
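The Z-order update can be sketched as follows (an illustrative Python model of the Z-order list; the dictionary representation and function names are assumptions, not the system's actual data structures):

```python
def top_object(z_orders):
    """Larger Z-order value = closer to the top of the stacking order."""
    return max(z_orders, key=z_orders.get)


def bring_to_top(z_orders, obj):
    """Step 43: make obj the top display object by raising its Z-order
    value strictly above every other value in the multi-window system."""
    z_orders[obj] = max(z_orders.values()) + 1


z_orders = {"A": 3, "B": 1, "C": 2}
assert top_object(z_orders) == "A"     # A is currently the top display object
bring_to_top(z_orders, "B")            # click selected the non-top object B
assert top_object(z_orders) == "B"     # after the update, B is on top
```

A subsequent UI refresh would then redraw the display objects in the new order.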
In this embodiment of the present application, in order to implement processing of the input event, the method may further include the step of: sending the input event to the display unit of the display object corresponding to the TouchableRegion containing the coordinate value of the click point.
Another embodiment of the present application, taking a window as an example of the display object, discloses an apparatus 100 for switching windows in a multi-window system, which can be used in a device such as a kiosk. Referring to fig. 5, the apparatus 100 for switching windows in a multi-window system according to this embodiment includes an update module 1, an input module 2, a determination module 3, and a window management module 4.
An input list (InputList) 21 is maintained in the input module 2, and the input list 21 stores the window attributes corresponding to each display unit. Referring to FIG. 6, the input list 21 corresponds to the Z-order list 5. For example, the window attributes contained in the input list 21 are O1, O2 through On, where O1 is the window attribute of the window numbered 1, and O2 through On have similar meanings; the Z-order values contained in the Z-order list 5 are Z1, Z2 through Zn, where Z1 is the Z-order value of the window numbered 1, and Z2 through Zn have similar meanings. Then O1 corresponds to Z1, O2 corresponds to Z2, and On corresponds to Zn.
Among the window attributes maintained by the input module 2, there is a window attribute named "clickable area" (TouchableRegion), which at least indicates the coordinate range of the clickable area of the window, that is, the coordinate values of the clickable points located in that area. In addition, the attribute may also indicate other parameters of the window, such as its size, position on the display screen, and/or width. In embodiments of the present application, the coordinate range of the clickable area of the window and the values of the other parameters may vary.
Based on the above introduction, a method for switching between multiple windows of the present application is described below with reference to fig. 7; the method includes the following steps:
S1: updating the input list 21 maintained by the input module 2 according to the WindowPanel parameter of the application;
The function of the WindowPanel is to record the state of the window.
In this embodiment of the present application, the implementation manner of step S1 mainly includes: the TouchableRegion in the input list 21 is updated according to the current TouchableRegion of the application, so that the TouchableRegion in the input list 21 is synchronized with the current TouchableRegion of the application.
S2: when a clicking action is detected, the Java layer of the system generates input events and sends them to the input module 2, each input event containing the coordinate value of a click point; the input module 2 then traverses the input list 21 in order; if the TouchableRegion of a window is found to contain the coordinate value contained in the input event, the event is distributed to the display unit of that window, even if the window is a non-top window;
S3: the display unit receiving the input event judges whether its corresponding window is the top window; if yes, the display unit processes the event, for example, determining the position of the pressed button; if not, the display unit sends a request to the window management module 4;
S4: after receiving the request, the window management module 4 modifies the ordering of the window list (windowList), which triggers a change of the Z-order and finally modifies the Z-order list 5; the User Interface (UI) is then refreshed, thereby realizing the switching of multiple windows.
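The flow S1 through S4 can be condensed into one end-to-end sketch. The data structures below are assumptions (a dict of window regions and a list ordered by Z-order); the real system distributes the event to a display unit, which in turn requests the window management module, but the observable effect is the same:

```python
def dispatch_click(windows, z_list, click):
    """S2: traverse the (updated) regions; deliver the click to the covering
    window even if it is not on top. S3/S4: if that window is not the top
    window, reorder the Z-order list so it becomes the top window."""
    x, y = click
    for name, region in windows.items():
        left, top, right, bottom = region
        if left <= x < right and top <= y < bottom:
            if z_list[-1] != name:        # S3: window is not on top
                z_list.remove(name)       # S4: reorder the window list so
                z_list.append(name)       #     this window has the highest Z
            return name                   # event delivered to this window
    return None


windows = {"A": (0, 0, 100, 100), "B": (50, 0, 150, 100)}
z_list = ["B", "A"]                       # last element = top window (A)
clicked = dispatch_click(windows, z_list, (120, 50))  # only B covers x=120
assert clicked == "B"                     # event reached the blocked window
assert z_list[-1] == "B"                  # B has been switched to the top
```

A UI refresh after the reorder would then draw B's window above A's, completing the switch.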
Some alternative embodiments of the above steps are detailed below:
In one embodiment, the execution timing of step S1 may include: when the window is dragged, when the size of the window is changed, and the like, so as to ensure that subsequent click actions are based on the latest window state.
In one embodiment, in step S2, please refer to fig. 1B: for example, the started applications are application A and application B, where application A is the application currently displayed normally and the window of application B is partially blocked by the window of application A. When the mouse clicks the window of application B, the multi-window system 100 of the present application can still ensure that the input event is correctly transmitted to the display unit of the partially blocked window of application B.
Still taking application A and application B as an example, in one embodiment, in step S4, after the Z-order list 5 is modified, the Z-order value of application B may be made greater than the Z-order value of application A. Further, when executing the UI refresh, the window management module 4 may determine, according to the modified Z-order list 5, that the Z-order value of application B is currently the largest, so as to display the window of application B as the top window, that is, swap all display units of application B with those of application A, thereby completing the switching of the top window to the window of application B based on the input event (see fig. 8).
Compared with the Android native system, the WindowPanel parameter of the application in the embodiment of the present application can be updated; the distribution mechanism of the input event is changed so that the input event can be transmitted to a non-top display unit, and the Z-order list 5 is modified, finally realizing the function of window switching through a mouse.
Referring to fig. 5, the functions of the modules of the apparatus 100 for switching windows in a multi-window system are as follows:
The update module 1 is used for updating the clickable area of the corresponding window according to the WindowPanel parameter of the application, where the WindowPanel is used for recording the state of the window;
The input module 2 is configured to generate input events when there is a click action, where each input event includes the coordinate value (e.g., an x-y coordinate value) of the click action, and then to traverse the input list 21 in order; if the clickable area of a window is found to contain the input coordinates, the event is distributed to that window, even if the window is a non-top window;
The judging module 3 is used for judging whether the window corresponding to the event is the top window; if yes, the window processes the event, for example, determining the position of the pressed button; if not, a request is sent to the window management module 4;
The window management module 4 is used for modifying the ordering of the window list (windowList) after receiving the request, which triggers a change of the Z-order and finally modifies the Z-order list 5; the UI is then notified to refresh, thus enabling multi-window switching.
Wherein:
The WindowPanel parameter of the application is updated when, for example, the window is dragged or the size of the window is changed, to ensure that subsequent click actions are based on the latest window state.
Referring to fig. 2, for example, the started applications are application A and application B, where application A is the application currently displayed normally and the window of application B is partially blocked by the window of application A. In this case, the multi-window system 100 of the present application still enables the input event to be correctly transmitted to the partially blocked window of application B.
Referring to fig. 8, after the Z-order list 5 is modified, all display units of application B are swapped with those of application A, thus completing the switching to application B by a mouse click.
Compared with the Android native system, the WindowPanel parameter of the application in the embodiment of the present application can be updated; the distribution mechanism of the input event is changed so that the click event can be transmitted to a non-top display unit, and the Z-order list 5 is modified, finally realizing the function of window switching through a mouse.
In order to solve the problem that window switching cannot be realized based on an input event in the Android native system, an embodiment of the present application further provides a device for switching a display object in a multi-window system. A specific structural schematic diagram of the device is shown in fig. 9; it includes a coordinate value obtaining unit 91, a determining unit 92, and an interface refreshing unit 93. The functions of the units are described below:
a coordinate value obtaining unit 91 for obtaining coordinate values of the click points included in the input event;
a judging unit 92 configured to judge whether a display object (hereinafter, referred to as the display object for short) corresponding to the clickable area containing the coordinate value obtained by the coordinate value obtaining unit 91 is a top display object;
and an interface refreshing unit 93, configured to set the display object as a top display object and refresh the display interface when the determination result obtained by the determining unit 92 is negative.
In one embodiment, the interface refreshing unit 93 may be configured to: and updating the Z-order value of the display object to be larger than the Z-order values of other display objects in the multi-window system.
In one embodiment, the apparatus may further include an attribute information updating unit. This unit is used for updating the clickable area (TouchableRegion) in the input list before the judging unit judges whether the display object corresponding to the clickable area containing the coordinate value of the click point is the top display object.
Based on the function of the attribute information updating unit, the judging unit 92 may be configured to determine the TouchableRegion containing the coordinate value by traversing the TouchableRegions in the input list, and to judge whether the display object corresponding to the TouchableRegion containing the coordinate value of the click point is the top display object.
In one embodiment, the attribute information updating unit may be configured to: after the TouchableRegion of the display object in the multi-window system is determined to be changed, updating the TouchableRegion in the input list according to the changed TouchableRegion.
In one embodiment, the apparatus may further include: and the sending unit is used for sending the input event to the display unit of the display object.
In one embodiment, when the display object is a window, the apparatus shown in fig. 9 may further include an execution saving unit and a display unit. Wherein:
an execution saving unit configured to save, for an application that has been started, each display unit of the application in a container corresponding to the application before the coordinate value obtaining unit 91 obtains the coordinate value of the click point included in the input event; wherein, at least two started applications are provided, and the containers corresponding to different applications are different;
and the display unit is used for displaying the window of the application according to the set window attribute of the application and each display unit stored in the container corresponding to the application.
In one embodiment, when the display object is a visual unit in a window, the apparatus shown in fig. 9 may further include an execution saving unit and a display unit. Wherein:
an execution saving unit configured to save a display unit of each application in at least one container for each application that has been started before the coordinate value obtaining unit 91 obtains the coordinate values of the click points included in the input event;
a display unit for displaying a plurality of windows according to the set window attributes and each display unit stored in the container; wherein at least one window displays more than two visual units.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.