CN115885245A - Application window control method and device, interactive panel and storage medium - Google Patents


Info

Publication number
CN115885245A
Authority
CN
China
Prior art keywords: touch, screen, interface, application window, touch position
Prior art date
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN202180005735.1A
Other languages
Chinese (zh)
Inventor
丁静静
黄业
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Innovation Technology Co ltd
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Original Assignee
Guangzhou Shiyuan Innovation Technology Co ltd
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shiyuan Innovation Technology Co ltd, Guangzhou Shiyuan Electronics Thecnology Co Ltd filed Critical Guangzhou Shiyuan Innovation Technology Co ltd
Publication of CN115885245A publication Critical patent/CN115885245A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]

Abstract

The application discloses an application window control method and apparatus, an interactive tablet, and a storage medium. The method comprises the following steps: displaying a target application window in a full-screen state; receiving a first touch operation, in which a touch object slides from the edge of the screen toward the middle of the screen and stays at a first touch position; displaying a first preview interface of the target application window; receiving a second touch operation, in which the touch object slides from the first touch position toward the middle of the screen and stays at a second touch position; displaying a second preview interface of the target application window; receiving a third touch operation, in which the touch object leaves the screen from the second touch position; and displaying a first interface of the target application window. This method solves the problem that the large size of the interactive tablet prevents the user from effectively controlling an application window to exit full screen, makes user control convenient, and greatly improves the usability of the interactive tablet's system.

Description

Application window control method and device, interactive panel and storage medium

Technical Field
The present application relates to the field of application window management technologies, and in particular, to an application window control method and apparatus, an interactive tablet, and a storage medium.
Background
The interactive tablet is a representative large-size integrated device suitable for group-interaction settings such as conferences, teaching, and commercial exhibitions. It integrates multiple functions, such as screen projection and video conferencing, and realizes information interaction mainly through touch technology.
In practical applications of the interactive tablet, such as classroom teaching, in order to enrich the displayed teaching content and make better use of the tablet's screen space, an application window frequently needs to enter or exit full-screen mode through user operations, and such switching is very frequent. Currently, the user can take an application window out of full-screen mode in the following ways: 1) clicking the "exit full screen" icon in the application window's operation bar, so that the window exits full screen; 2) clicking the window-minimize icon in the application window's operation bar, so that the window is collapsed into process management; 3) clicking the "home" icon in the system operation bar, so that the window is collapsed into process management and the desktop is displayed.
However, when the user performs these operations on the interactive tablet to make an application window exit full screen, the following problems arise, which degrade the interactive experience of the interactive tablet:
1. The large size of the interactive tablet is inconvenient for the user. For example, the application window's operation bar is usually placed at the top of the window, so in full-screen mode it sits at the top of the tablet's screen, which is difficult for the user to reach. As another example, to improve the immersive display effect, some application windows collapse the operation bar while the application is running; to use it, the user must slide downward from the top edge of the interactive tablet, an action that is both hard to perform and hard for the tablet to detect.
2. The operation steps are cumbersome. For example, when using the "home" icon in the system operation bar to exit full screen, the system operation bar must first be called up and the "home" icon clicked; when the application window is restored, the system operation bar must be called up again.
Disclosure of Invention
In view of this, embodiments of the present application provide an application window control method and apparatus, an interactive tablet, and a storage medium, which achieve convenient and fast control of application windows and improve the usability of the system.
In a first aspect, an embodiment of the present application provides an application window management method, including:
displaying a target application window in a full screen state;
receiving a first touch operation, wherein the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position;
displaying a first preview interface of the target application window, wherein the interface position of the first preview interface is related to the first touch position;
receiving a second touch operation, wherein the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at a second touch position;
displaying a second preview interface of the target application window, wherein the interface position of the second preview interface is related to the second touch position;
receiving a third touch operation, wherein the third touch operation is that the touch object leaves the screen from the second touch position;
and displaying a first interface of the target application window, wherein the first interface has the same interface position as the second preview interface.
Further, the third touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed at which the touch object slid to the second touch position is less than a set first speed threshold.
Further, the method also includes:
receiving a fourth touch operation, wherein the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is greater than or equal to the first speed threshold;
and displaying a second interface of the target application window, wherein the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
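The distinction between the third and fourth touch operations above amounts to a dispatch on the sliding speed at the moment the touch object leaves the screen. The following sketch illustrates this; the threshold value and the one-half size multiple are assumptions for illustration — the patent only specifies that the second interface's display size is a set multiple of the full-screen size.

```python
# Illustrative sketch, not the patent's implementation.
FIRST_SPEED_THRESHOLD = 800.0  # px/s, assumed value

def on_touch_release(exit_speed, preview_rect, screen_w, screen_h, scale=0.5):
    """Decide the window's final interface when the touch object leaves the screen.

    exit_speed   -- sliding speed (px/s) when reaching the release position
    preview_rect -- (x, y, w, h) of the preview interface at release time
    Returns the final window rect (x, y, w, h).
    """
    if exit_speed < FIRST_SPEED_THRESHOLD:
        # Third touch operation: slow release keeps the preview's position/size.
        return preview_rect
    # Fourth touch operation: fast release ("fling") shows the second interface,
    # sized as a set multiple of the full-screen size (here: centred, half size).
    w, h = screen_w * scale, screen_h * scale
    return ((screen_w - w) / 2, (screen_h - h) / 2, w, h)
```

A slow release thus docks the window exactly where the preview was, while a fast fling snaps it to a fixed-size small window.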
Further, the method also includes:
receiving a fifth touch operation, wherein the fifth touch operation is that the touch object slides from the second touch position to the edge of the screen and stays at a third touch position;
displaying a third preview interface of the target application window, wherein the interface position of the third preview interface is related to the third touch position;
receiving a sixth touch operation, wherein the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at a fourth touch position;
and displaying a fourth preview interface of the target application window, wherein the interface position of the fourth preview interface is related to the fourth touch position.
Further, the method further comprises:
receiving a seventh touch operation, wherein the seventh touch operation is that the touch object leaves the screen from the fourth touch position;
and displaying a third interface of the target application window, wherein the display position of the third interface is the same as that of the fourth preview interface.
Further, the seventh touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed at which the touch object slid to the fourth touch position is less than a set second speed threshold.
Further, the method also includes:
receiving an eighth touch operation, wherein the eighth touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is greater than or equal to the second speed threshold;
displaying the target application window in a full screen state.
Further, the method also includes:
receiving a ninth touch operation, wherein the ninth touch operation is that the touch object slides to the edge of the screen from the fourth touch position;
displaying the target application window in a full screen state.
Further, for each preview interface displayed for the target application window, the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface is characterized by the vertex coordinates of its interface vertices;
each vertex coordinate is determined by combining the target touch coordinate of a target touch position with the width and height values of the screen, where the target touch position is the touch position at which the touch object stays when the preview interface is displayed.
Further, the step of determining each vertex coordinate from the target touch coordinate of the target touch position and the width and height values of the screen includes:
when the screen edge is the left edge or the right edge, determining the vertex abscissa of each interface vertex from the abscissa in the target touch coordinate;
and determining the vertex ordinate corresponding to each interface vertex by combining the vertex abscissa with the width and height values of the screen.
Further, the step of determining each vertex coordinate from the target touch coordinate of the target touch position and the width and height values of the screen includes:
when the screen edge is the bottom edge or the top edge, determining the vertex ordinate of each interface vertex from the ordinate in the target touch coordinate;
and determining the vertex abscissa corresponding to each interface vertex by combining the vertex ordinate with the width and height values of the screen.
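One plausible reading of the determination steps above, sketched in code for a swipe beginning at the right edge: the vertex abscissas follow directly from the touch abscissa, and the ordinates are then derived from the screen's width and height so the preview keeps the screen's aspect ratio. The specific anchoring (preview pinned to the right edge and vertically centred) is an assumption for illustration; the patent does not fix it.

```python
# Assumed geometry: preview spans from the touch abscissa to the right edge,
# keeps the screen aspect ratio, and is vertically centred.
def preview_vertices_right_edge(touch_x, screen_w, screen_h):
    """Vertex coordinates of a preview interface for a right-edge swipe.

    Returns the four vertices in order: top-left, top-right,
    bottom-right, bottom-left.
    """
    width = screen_w - touch_x              # abscissas come from the touch x
    height = width * screen_h / screen_w    # ordinates follow from screen W/H
    top = (screen_h - height) / 2           # vertical centring is assumed
    bottom = top + height
    return [(touch_x, top), (screen_w, top),
            (screen_w, bottom), (touch_x, bottom)]
```

The bottom-edge/top-edge case is symmetric: the ordinates come from the touch ordinate, and the abscissas follow from the same aspect-ratio constraint.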
Further, the operation of the touch object leaving a touch position is identified by a registered touch-listening function.
Further, the sliding speed of the touch object when it slides from one touch position to another is determined from the touch coordinates and the touch time point of each touch position.
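The speed determination just described reduces to dividing the distance between two touch coordinates by the difference of their touch time points; a minimal sketch:

```python
import math

def sliding_speed(p1, t1, p2, t2):
    """Sliding speed (px/s) between two touch samples.

    p1, p2 -- (x, y) touch coordinates; t1, t2 -- touch time points in seconds.
    """
    (x1, y1), (x2, y2) = p1, p2
    dt = t2 - t1
    if dt <= 0:
        return 0.0  # degenerate sample pair: report no speed
    return math.hypot(x2 - x1, y2 - y1) / dt
```

Comparing this value against the first or second speed threshold then selects between the slow-release and fast-release branches of the method.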
In a second aspect, an embodiment of the present application provides an apparatus for controlling an application window, where the apparatus includes:
the full screen display module is used for displaying the target application window in a full screen state;
the first receiving module is used for receiving a first touch operation, wherein the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position;
the first display module is used for displaying a first preview interface of the target application window, and the interface position of the first preview interface is related to the first touch position;
the second receiving module is used for receiving a second touch operation, wherein the second touch operation is that the touch object slides to the middle of the screen from the first touch position and stays at the second touch position;
the second display module is used for displaying a second preview interface of the target application window, and the interface position of the second preview interface is related to the second touch position;
a third receiving module, configured to receive a third touch operation, where the third touch operation is that the touching object leaves the screen from the second touch position;
and the third display module is used for displaying the first interface of the target application window, and the first interface has the same interface position as the second preview interface.
In a third aspect, an embodiment of the present application further provides an interactive tablet, including:
a touch component, which responds to touch operations of a touch object through its hardware circuitry;
a display screen, covered by the touch component to form a touch screen, for displaying application windows;
one or more processors;
a storage device for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method provided in the first aspect of the application.
In a fourth aspect, embodiments of the present application further provide a storage medium containing computer-executable instructions for performing the method according to the first aspect when executed by a computer processor.
Embodiments of the present application provide an application window control method and apparatus, an interactive tablet, and a storage medium. The method may be executed by an interactive tablet. A target application window is first displayed on the interactive tablet in a full-screen state; after a first touch operation is received, a first preview interface can be displayed, whose interface position is related to the first touch position at which the first touch operation stays; after a second touch operation is received, a second preview interface is displayed, whose interface position is related to the second touch position; and after a third touch operation is received, the first interface is displayed, whose interface position is the same as that of the second preview interface associated with the second touch position from which the touch object left the screen. In this technical solution, the touch operations generated by the user's interaction with the interactive tablet involve only the touch object sliding from the edge of the screen toward the middle of the screen, and the interactive tablet can directly adjust the interface position of the target application window in response to each touch operation, thereby controlling the target application window to exit the full-screen state. This interaction solves the problem that the user cannot effectively control an application window to exit full screen because the interactive tablet is too large, and also avoids requiring frequent interaction just to exit full screen. User operation is thus made convenient, and the usability of the interactive tablet's system is greatly improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is a schematic flowchart of an application window control method according to an embodiment of the present invention;
fig. 1a is an interface schematic diagram of a first preview interface displayed on an interactive tablet in the first embodiment;
fig. 1b is an interface schematic diagram of a second preview interface displayed on the interactive tablet in the first embodiment;
fig. 1c is a schematic interface diagram of a first interface displayed on the interactive tablet in the first embodiment;
FIG. 1d is a schematic interface diagram illustrating a second interface displayed on the interactive tablet according to the first embodiment;
fig. 2 is a schematic flowchart of an application window control method according to a second embodiment of the present invention;
fig. 2a is an interface schematic diagram of a third preview interface displayed on the interactive tablet in the second embodiment;
fig. 2b is an interface schematic diagram of a fourth preview interface displayed on the interactive tablet in the second embodiment;
fig. 2c is a schematic interface diagram of a third interface displayed on the interactive tablet according to the second embodiment;
FIG. 2d is a schematic interface diagram of a full screen state target application window displayed on the interactive tablet in the second embodiment;
FIG. 2e is a schematic diagram of another interface of a full-screen state target application window displayed on the interactive tablet in the second embodiment;
fig. 3 is a schematic structural diagram of an application window control apparatus according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an interactive tablet according to a fourth embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings. It should be understood that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
In the description of the present application, it should be understood that the terms "first," "second," "third," and the like are used solely to distinguish between similar elements; they do not necessarily describe a particular sequence or chronological order, nor should they be construed to indicate or imply relative importance. The specific meaning of these terms in the present application can be understood by those of ordinary skill in the art as appropriate. Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "And/or" describes an association between associated objects and indicates that three relationships are possible: for example, "A and/or B" may mean that A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the preceding and following objects.
Example one
Fig. 1 is a schematic flowchart of an application window control method according to an embodiment of the present application; this embodiment is applicable to controlling the display position and display size of an application window shown on a screen. The method may be performed by an application window control device, which may be implemented in software and/or hardware and configured in the interactive tablet, in particular in a processor of the interactive tablet; the processor may be the host processor of the tablet's intelligent processing system. The interactive tablet is equipped with a touch screen, which may be regarded as an electrically connected combination of a touch frame and a display screen.
In practical application, at least one display interface is shown on the interactive tablet. The specific display content is not limited in the embodiments: it may be, for example, a window interface of an application program on the smart interactive tablet, a screen-projection interface sent by an external device, or video data sent by an external device, where the external device is in data connection with the smart interactive tablet. This embodiment takes the window interface of an application program as the main processing object, so as to realize convenient control of the application window's interface display.
Meanwhile, this embodiment takes intelligent teaching based on an interactive tablet as the practical application scenario. When a teacher conducts online teaching, multiple functional application windows usually need to be opened and displayed on the tablet's screen. For the contents of these windows to be displayed without occluding one another, each application window must be taken out of the full-screen state and shown as a small-window interface. The existing interactive controls for an application window's display state (such as clicking a minimize/exit-full-screen button at the top of the screen) are ill-suited to a large-size terminal device such as the interactive tablet.
The application window control method provided by this embodiment responds to touch operations that the user performs on the application window in a manner different from existing interactive operations, thereby controlling the application window to change from full-screen display to other display sizes.
As shown in fig. 1, a method for controlling an application window provided in an embodiment of the present application specifically includes the following operations:
and S101, displaying the target application window in a full screen state.
In this embodiment, after a functional application is triggered, its application window may be presented on the display screen of the interactive tablet in a full screen state by default. In theory, there may be multiple application windows in a full screen state on the interactive tablet, but only the content of the application window at the uppermost layer of the screen can be shown to the user, and the content of other application windows is covered by the uppermost layer of the application window.
Correspondingly, when the user performs interactive operation with the interactive tablet, only the element (application window) at the uppermost layer of the screen can be controlled. In this step, the application window displayed in the uppermost layer in the full screen state may be recorded as a target application window that can be interactively controlled by the user, and it may be considered that the subsequent steps in this embodiment are all execution steps for the target application window.
S102, receiving a first touch operation, wherein the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position.
In this embodiment, the touch object may be the user's finger, an active stylus, a passive stylus, or the like, and the user controls the touch object to slide on the surface of the interactive tablet's display screen. The sliding behavior can be determined through the cooperation of the relevant hardware and software of the execution subject, and the specific operation the user performs on the display screen can be determined by analyzing the data associated with that sliding behavior; the touch operation triggered by the user is thereby received in this step.
In this embodiment, the way the execution subject determines that the user has performed the first touch operation on the screen may be described as follows: the touch frame configured on the execution subject responds to the touch signal generated when the touch object touches the screen, obtains the touch-point information generated by the contact, and feeds it back to the upper-layer processing system; by analyzing the touch-point information, the processing system determines whether the user performed a sliding operation on the display screen and, if so, its sliding direction and sliding track.
Finally, from the sliding direction and sliding track, the processing system can determine that the user moved the touch object from the edge of the screen toward the middle of the screen and is currently staying at the first touch position. At this point, the execution subject considers that the user has performed the first touch operation on the target application window, and the first touch operation generated by the interaction is received in this step. In this embodiment, the touch object may stay at the first touch position only very briefly (for example, merely passing through it while sliding before continuing under the user's control) or for a certain interval (for example, sliding to a position on the screen and stopping there for some time).
It should be noted that the interactive tablet serving as the execution subject consists of a display screen, an intelligent processing system, and other components, combined in an integral structure and supported by a dedicated software system. The display screen may be a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, a Liquid Crystal Display (LCD), or the like. Specifically, the interactive tablet's display is a touch screen: an inductive display device in which, when a graphic button on the screen is touched, the touch feedback system drives various connected devices according to a pre-programmed program. It can thus replace a mechanical button panel and produce vivid video and audio effects through the display picture. By technical principle, touch screens fall into five basic categories: vector-pressure-sensing, resistive, capacitive, infrared, and surface-acoustic-wave touch screens. By working principle and the medium that transmits information, they fall into four categories: resistive, capacitive, infrared, and surface-acoustic-wave. When the user touches the screen with a finger or a pen, the touched point's coordinates are located, thereby controlling the intelligent processing system, and different functional applications are then realized by the software built into the intelligent processing system.
For example, optical touch sensors are disposed on the sides of the display screen's surface to form a touch frame, together constituting a touch display screen. The touch-information recognition process can be described as follows: the optical touch sensors of the touch frame scan, with optical signals, any touch object on the display surface, such as the user's finger or a stylus. When a touch object touches the display screen and triggers an interface element, or performs an operation such as positioning, the touch frame responds to the touch operation and transmits the corresponding touch-operation information to the intelligent processing system at the application layer, through which various interactive applications are realized.
In this embodiment, the screen edge may be taken as the frame position of the display screen on the interactive tablet. Since the display screen has four sides, the screen edge may be the left edge, right edge, top edge, or bottom edge of the display screen. The sliding tracks from different screen edges toward the middle of the screen differ: the track from the bottom edge can be described as sliding upward from the bottom; from the top edge, as sliding downward from the top; from the left edge, as sliding from left to right; and from the right edge, as sliding from right to left. However, in this embodiment the sliding track from the screen edge toward the middle of the screen is not limited to horizontal or vertical sliding.
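A minimal sketch of how the processing system might classify which screen edge a touch-down point belongs to, before matching the subsequent sliding track against the per-edge directions described above. The edge margin is an assumed tolerance, not a value from the patent:

```python
# Illustrative edge classification; EDGE_MARGIN is an assumed tolerance in px.
def start_edge(x, y, screen_w, screen_h, margin=20):
    """Return which screen edge a touch-down point lies on, or None."""
    if x <= margin:
        return "left"      # inward track: left to right
    if x >= screen_w - margin:
        return "right"     # inward track: right to left
    if y <= margin:
        return "top"       # inward track: top to bottom
    if y >= screen_h - margin:
        return "bottom"    # inward track: bottom to top
    return None            # touch-down in the interior: not an edge swipe
```

A touch-down that classifies as `None` would simply not arm the edge-swipe gesture, leaving ordinary in-window touches unaffected.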
S103, displaying a first preview interface of the target application window, wherein the interface position of the first preview interface is related to the first touch position.
In this embodiment, in response to the first touch operation received in the above step, the object acted upon by the first touch operation may be considered to be the target application window, and the purpose of the touch is to adjust the window display position and window display size of the target application window. The result of responding to the first touch operation is then displayed; specifically, a first preview interface of the target application window is displayed.
From the user's point of view, the user interaction corresponding to the first touch operation is specifically: the touch object slides from the edge of the screen toward the middle of the screen and stays at the first touch position. After receiving and responding to the first touch operation, the execution subject can control the target application window to be temporarily presented on the display screen in the form of the first preview interface. The display form (interface position and interface size) of the first preview interface is related to the first touch position at which the touch object is staying; from the first touch position, the interface position of the first preview interface on the screen can be determined, which is equivalent to determining its interface size as well.
In this embodiment, the association between the interface position of the first preview interface and the first touch position may take several forms: the touch coordinate of the first touch position may lie exactly on one of the four edges of the displayed first preview interface; the touch coordinate may serve as the center-point coordinate of the first preview interface; or the touch coordinate may serve as one vertex coordinate of the first preview interface. In short, the interface state of the target application window changes as the touch object slides from the screen edge toward the middle of the screen, and the interface position of the resulting first preview interface is related to the touch position passed through (or stayed at) by the sliding touch object.
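As an illustrative sketch only (the embodiment deliberately leaves the exact geometry open), the bottom-edge case — preview keeping the screen's aspect ratio, centered on the screen, with its bottom edge passing through the touch position — might be computed as follows; all names and the centering choice are assumptions:

```python
def preview_rect(screen_w, screen_h, touch_y):
    """Candidate preview rectangle for a bottom-edge upward swipe.

    Illustrative geometry only: the preview keeps the screen's aspect
    ratio, is centered on the screen, and its bottom edge passes
    through the touch position. touch_y is measured in pixels from
    the top of the screen.
    """
    height = max(2 * (touch_y - screen_h / 2), 0)  # bottom edge at touch_y
    width = height * screen_w / screen_h           # same aspect ratio as screen
    left = (screen_w - width) / 2                  # centered horizontally
    top = (screen_h - height) / 2                  # centered vertically
    return (left, top, width, height)

rect_a = preview_rect(1920, 1080, 1080)  # finger still at the bottom edge
rect_b = preview_rect(1920, 1080, 810)   # finger slid a quarter-screen up
# rect_b is smaller than rect_a and sits closer to the screen center
```

Under these assumptions a touch still at the bottom edge yields the full screen, matching the initial full-screen state, and sliding upward shrinks the preview toward the screen center.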
S104, receiving a second touch operation, wherein the second touch operation is that the touch object slides from the first touch position toward the middle of the screen and stays at a second touch position.
As described in S102 of this embodiment, this step is equivalent to receiving a touch operation performed by the user via the touch object; this embodiment denotes the received touch operation as the second touch operation. The second touch operation is characterized in that the touch object continues to slide from the first touch position toward the middle of the screen and stays at a second touch position during the sliding. The touch object may stay at the second touch position momentarily or for a certain interval.
For the identification of the second touch operation, the execution subject may likewise respond to the touch signal generated as the touch object slides across the equipped touch frame, feed the corresponding touch-point information back to the upper layer, and then analyze the sliding direction and sliding track corresponding to that information. When both the sliding direction and the sliding track indicate that the touch object continues to slide from the first touch position toward the middle of the screen and stays at the second touch position, it is determined that the operation performed by the user is the second touch operation, which is then received in this step.
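The direction/track analysis described above might be sketched as follows. This is a toy classifier over raw touch samples; a real touch framework reports richer events (down/move/up) and handles multi-touch, and the edge-zone width and names here are hypothetical:

```python
EDGE_MARGIN = 20  # px; hypothetical width of the bottom edge-detection zone

def classify_swipe(points, screen_h):
    """Classify a sequence of (x, y, t) touch samples, y from the top.

    Returns "first_touch" for an upward slide that starts inside the
    bottom edge zone, "second_touch" for an upward slide that starts
    mid-screen, and "other" for anything not moving toward the middle.
    """
    (x0, y0, _), (x1, y1, _) = points[0], points[-1]
    if y1 >= y0:                      # not sliding toward the middle (upward)
        return "other"
    if y0 >= screen_h - EDGE_MARGIN:  # started inside the bottom edge zone
        return "first_touch"
    return "second_touch"

track = [(960, 1075, 0.00), (960, 900, 0.12)]  # upward slide from the edge zone
kind = classify_swipe(track, 1080)             # classified as "first_touch"
```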
S105, displaying a second preview interface of the target application window, wherein the interface position of the second preview interface is related to the second touch position.
In this embodiment, in response to the second touch operation of S104, the display form of the target application window may again be adjusted according to the obtained information about the second touch position. Specifically, as the touch object slides from the first touch position toward the middle of the screen to the second touch position, the target application window changes from the displayed first preview interface to the display form of a second preview interface, where the interface position of the second preview interface is related to the second touch position at which the touch object currently stays.
Similarly, after the execution subject responds to the second touch operation corresponding to the second touch position, one edge of the second preview interface may be controlled to pass through the touch coordinate of the second touch position, or one interface vertex of the second preview interface may be controlled to be the touch coordinate of the second touch position, or the center point of the second preview interface may be controlled to be the touch coordinate of the second touch position; the second preview interface formed by this adjustment based on the second touch position is then displayed in this step.
Since the touch object stays at the second touch position during its slide from the first touch position toward the middle of the screen, the second touch position is necessarily closer to the middle of the screen than the first touch position. Consequently, whichever of the above associations holds between the second preview interface and the second touch position, the interface size of the second preview interface is reduced relative to that of the first preview interface. In this embodiment, the interface position of the preview interface of the target window therefore gradually approaches the center of the screen as the touch object slides toward the middle of the screen, and the interface size of the preview interface gradually decreases accordingly.
It should be noted that, since the interface positions of the first preview interface and the second preview interface are related to the first touch position and the second touch position respectively, this embodiment may determine the specific position information of each preview interface from the touch coordinates of the corresponding touch position combined with the width and height values of the screen.
S106, receiving a third touch operation, wherein the third touch operation is that the touch object leaves the screen from the second touch position.
In this embodiment, since the execution subject can determine the various touch operations (e.g., the first and second touch operations) performed by the user while the touch object slides from the screen edge toward the middle of the screen, it can also monitor the touch operation in which the user controls the touch object to leave the screen of the display. This operation is denoted as the third touch operation, and the third touch operation generated when the touch object leaves the screen from the second touch position is received in this step.
In this embodiment, the execution subject may monitor the event of the touch object leaving the screen through a preset touch-monitoring function. Specifically, while the user operates the touch object to perform interactive operations on the display screen, the execution subject monitors, through the touch-monitoring function, whether a leave event occurs. If a leave event is detected, the execution subject is considered to have recognized that the user has lifted the touch object off the screen from a certain touch position, and the corresponding touch operation of the touch object leaving the screen is generated.
S107, displaying a first interface of the target application window, wherein the first interface has the same interface position as the second preview interface.
In this embodiment, in response to the third touch operation of S106, when the touch object finishes sliding on the display screen and leaves it, the final display form of the target application window, namely the first interface, is presented on the screen in this step. The display form of the first interface is related to the touch position at which the touch object left the screen; specifically, its interface position and interface size may be the same as those of the preview interface presented by the target application window at the moment the touch object left the screen.
For example, when the touch object leaves the screen at the second touch position, the second preview interface associated with the target application window at the second touch position may be displayed directly as the first interface. That is, the interface size and interface position of the first interface presented in this step are the same as those of the second preview interface.
From the above description, the first preview interface and the second preview interface presented in this embodiment may or may not be the interface form in which the user ultimately wishes the target application window to be presented. While the touch object is on the screen and sliding from the screen edge toward the middle, the display form of the target application window changes with the touch position at which the touch object stays: when the touch object slides from the screen edge to the first touch position, the display form of the target application window is adjusted from the full-screen state to the interface position of the first preview interface; and when the touch object slides from the first touch position toward the middle of the screen to the second touch position, the display form is adjusted from the first preview interface to the interface position of the second preview interface.
When the touch object leaves the screen at a touch position, the target application window presents, as its final form, the preview interface associated with that touch position. If the touch object leaves at the second touch position, whose associated display form is the second preview interface, then the second preview interface is displayed as the final form in this step and denoted as the first interface.
To better illustrate the application window control method provided by this embodiment, the following example describes, from a visual perspective, a specific implementation of adjusting the display form of an application window under the user's interactive control.
Specifically, from the visual perspective, an application window in the full-screen state is first displayed on the interactive tablet. The screen edges of the interactive tablet include a left edge, a right edge, a bottom edge, and a top edge; the bottom edge is taken as the example here. The user then uses fingers as the touch object (the number of fingers is not limited in this embodiment: one finger, two fingers, or even more) to slide upward from the bottom edge of the interactive tablet. The operation of the user sliding a finger upward from the bottom edge is processed by the application window control method provided by this embodiment; different operations receive different responses, and each response result is embodied in the display form of the presented interface of the target application window.
In this example, one of the response results is presented in the form of fig. 1a. Specifically, fig. 1a is an interface schematic diagram of the first preview interface displayed on the interactive tablet in the first embodiment. As shown in fig. 1a, the first preview interface 14 is the interface presented by the interactive tablet 1 for the target application window after receiving the user's first touch operation. In fig. 1a, the first touch operation appears as: the finger 11 slides upward from the bottom edge 12 of the interactive tablet 1 and stays at the first touch position 13 on the screen; at this moment, the preview interface of the target application window is presented in the display form of the first preview interface 14. It can be seen that the first touch position 13 determines the interface position of the first preview interface 14, specifically in that the first touch position 13 lies on the bottom edge of the first preview interface 14. In addition, to ensure the display effect of the first preview interface 14, the first preview interface 14 may preferably have the same aspect ratio as the screen.
Continuing the above example, after responding to a further touch operation by the user, the response result shown in fig. 1b may be presented. Specifically, fig. 1b is an interface schematic diagram of the second preview interface displayed on the interactive tablet in the first embodiment. As shown in fig. 1b, the second preview interface 16 is the interface presented by the interactive tablet 1 for the target application window after receiving the user's second touch operation. In fig. 1b, the second touch operation appears as: the finger 11 continues to slide upward from the first touch position 13 of the interactive tablet 1 and stays at the second touch position 15 on the screen; at this moment, the preview interface of the target application window is presented in the display form of the second preview interface 16. It can be seen that the second touch position 15 determines the interface position of the second preview interface 16, specifically in that the second touch position 15 lies on the bottom edge of the second preview interface 16. To ensure the display effect of the second preview interface 16, the second preview interface 16 may preferably have the same aspect ratio as the screen.
Continuing further, after responding to yet another touch operation by the user, the response result shown in fig. 1c may be presented. Fig. 1c is an interface schematic diagram of the first interface displayed on the interactive tablet in the first embodiment. As shown in fig. 1c, the first interface 17 is the interface presented by the interactive tablet 1 for the target application window after receiving the user's third touch operation. In fig. 1c, the third touch operation appears as: the finger 11 leaves the screen from the second touch position 15 of the interactive tablet 1. At this moment, the target application window is no longer presented as a preview interface but in the fixed form of the first interface 17.
It can be seen that the finger 11 leaving the screen at a touch position (e.g., the second touch position 15) determines that the target application window is no longer adjusted in preview mode with the movement of the finger 11 but is instead presented as a fixed interface, whose interface position (e.g., that of the first interface 17) is the same as that of the preview interface presented at the touch position where the finger left the screen. Comparing fig. 1b and fig. 1c, the first interface 17 in fig. 1c has the same interface position and interface size as the second preview interface in fig. 1b.
In addition, comparing fig. 1a and fig. 1b, it can be seen that as the finger slides upward from the bottom edge, the interface position of the preview interface presented by the target application window moves closer to the center of the screen, and the interface size is reduced accordingly.
In the application window control method provided by this embodiment of the present invention, the touch operations generated by the interaction between the user and the interactive tablet involve only the touch object sliding from the screen edge toward the middle of the screen, and the interactive tablet, by responding to each touch operation, can directly adjust the interface position of the target application window to control it to exit the full-screen state. Throughout the interaction, this solves the problem that a user cannot effectively control an application window to exit full screen because the interactive tablet is too large, as well as the problem that exiting full screen otherwise requires frequent interaction with the user. User operation is thereby made convenient, and the usability of the interactive tablet is greatly improved.
As an optional embodiment of the first embodiment of the present invention, on the basis of the foregoing embodiment, the third touch operation is further refined as: the touch object leaves the screen from the second touch position, and the sliding speed at which the touch object slides to the second touch position is less than a set first speed threshold.
The first interface displayed in the foregoing embodiment is displayed after the third touch operation is received, and this optional embodiment further qualifies the third touch operation. Specifically, in addition to the touch object leaving the screen from the second touch position, the third touch operation includes the condition that the sliding speed at which the touch object reaches the second touch position during its slide toward the middle of the screen must be less than the first speed threshold.
In this optional embodiment, the execution subject may evaluate the above condition by monitoring the sliding speed while the touch object slides; when the sliding speed is determined to be less than the first speed threshold, the generation condition of the third touch operation is met, the third touch operation is generated, and the display operation of S107 described above is executed.
The execution subject may determine the sliding speed of the touch object from one touch position to another through the touch coordinates and touch time points of the two touch positions.
Specifically, the determination of the sliding speed from one touch position to another may be described as follows: the touch position where the touch object starts sliding is recorded as the initial touch position, and the touch position it slides to is recorded as the target touch position; the initial touch time point at the initial touch position and the target touch time point at the target touch position are acquired, together with the initial touch coordinate of the initial touch position and the target touch coordinate of the target touch position; the distance value between the two touch coordinates is then determined; and finally, the sliding time is determined from the target touch time point and the initial touch time point, and the ratio of the distance value to the sliding time is taken as the sliding speed of the touch object reaching the target touch position.
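The ratio described above can be sketched directly. Names are illustrative, and treating the threshold of "0.3 times the screen height" as pixels per second is an assumption (the embodiment states the multiple but not the time unit):

```python
def sliding_speed(start, target):
    """Sliding speed between two touch samples, as described above.

    start and target are (x, y, t) tuples: touch coordinates in
    pixels and touch time points in seconds.
    """
    (x0, y0, t0), (x1, y1, t1) = start, target
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # distance value
    return distance / (t1 - t0)                          # distance / sliding time

screen_h = 1080
threshold = 0.3 * screen_h  # first speed threshold: 0.3 x screen height (assumed px/s)
speed = sliding_speed((960, 900, 0.0), (960, 600, 0.5))  # 300 px covered in 0.5 s
```

Here `speed` is 600 px/s, which exceeds the assumed 324 px/s threshold, so this slide would fall into the fast-release branch of the optional embodiment below.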
This embodiment may preferably set the first speed threshold as a set multiple of the screen height of the interactive tablet, where the set multiple may be 0.3.
On the basis of the above optimization, this optional embodiment further includes:
receiving a fourth touch operation, wherein the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is greater than or equal to the first speed threshold;
and displaying a second interface of the target application window, wherein the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
In this optional embodiment, after determining that the touch object has left the second touch position, the execution subject further compares the sliding speed at which the touch object slid to the second touch position with the first speed threshold: when the sliding speed is less than the first speed threshold, the interactive operation performed by the user is considered to satisfy the generation condition of the third touch operation; when the sliding speed is greater than or equal to the first speed threshold, it is considered to satisfy the fourth touch operation.
On this basis, this optional embodiment may receive the fourth touch operation recognized by the execution subject. In response to the received fourth touch operation, the touch object is determined to have left the screen at the second touch position, which is equivalent to triggering display of the target application window in a fixed form; then, since the sliding speed of the touch object to the second touch position is greater than or equal to the first speed threshold, the display size at which the target application window should be fixedly displayed can be determined, and the interface displayed at that size is taken as the second interface.
For example, in this optional embodiment, the display size of the second interface may preferably be a set multiple of the display size of the target application window in the full-screen state, where the set multiple may preferably be 1/4.
For ease of understanding, fig. 1d is an interface schematic diagram of the second interface displayed on the interactive tablet in the first embodiment. As shown in fig. 1d, the second interface 18 is the interface presented by the interactive tablet 1 for the target application window after receiving the user's fourth touch operation. In fig. 1d, the fourth touch operation appears as: the finger 11 leaves the screen from the second touch position 15 of the interactive tablet 1, and the sliding speed at which the finger 11 slid to the second touch position 15 before leaving the screen is greater than or equal to 0.3 times the screen height. At this moment, the target application window is likewise no longer presented as a preview interface but in the fixed form of the second interface 18, which is presented at a display size 1/4 of the display size of the target application window in the full-screen state.
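Putting the two release behaviors together, the branch taken when the touch object leaves the screen might be sketched as follows. The names are hypothetical, and realizing the 1/4 display size by halving the width and height of a centered window is an assumption (the embodiment does not specify the placement):

```python
def on_release(speed, speed_threshold, preview, screen_w, screen_h):
    """Final window rect once the touch object leaves the screen.

    Slow release (third touch operation): keep the current preview
    rect as the first interface. Fast release (fourth touch
    operation): show the second interface at 1/4 of the full-screen
    display size - here halved width and height, centered (assumed).
    """
    if speed < speed_threshold:
        return preview                     # dock at the preview position
    w, h = screen_w / 2, screen_h / 2      # 1/4 of the full-screen area
    return ((screen_w - w) / 2, (screen_h - h) / 2, w, h)

slow = on_release(100.0, 324.0, (480, 270, 960, 540), 1920, 1080)  # keeps preview
fast = on_release(600.0, 324.0, (480, 270, 960, 540), 1920, 1080)  # quarter-size
```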
It should be noted that the dashed-line graphics appearing in the above figures can be understood as historical states corresponding to historical events. For example, the first preview interface 14 shown in dashed lines in fig. 1b represents the historical preview state of the target application window before the second preview interface 16 was displayed; likewise, in fig. 1c, the dashed finger pointing at the second touch position 15 can be regarded as the historical state of touching the screen before the finger left it.
Following the schematic diagrams of the target application window interface changes on the interactive tablet illustrated in figs. 1a to 1d, this embodiment provides a specific interaction scenario to further illustrate how the display size of the application window interface changes through interaction between the user and the interactive tablet.
From the user's perspective, after the target application window in the full-screen state is presented on the interactive tablet, the user slides a finger upward from the bottom edge of the interactive tablet.
On the interactive tablet side, first, as the user's finger slides upward continuously, the target application window changes in real time with the finger position in the form of a preview interface, and the window display size of the preview interface shrinks continuously as the finger slides up (refer to the change from fig. 1a to fig. 1b).
Second, when the user's finger slides upward and leaves the screen at a certain touch position, if the sliding speed of the finger at that position is less than 0.3 times the screen height, the target application window is triggered to exit full screen, and the touch position where the finger left the screen serves as the presentation position of the bottom edge of the target application window's interface (refer to the interface effect shown in fig. 1c).
Correspondingly, if the sliding speed of the finger at the touch position is greater than or equal to 0.3 times the screen height, the target application window is likewise triggered to exit full screen, and the target application window is controlled to display its interface at 1/4 of the full-screen size (refer to the interface effect shown in fig. 1d).
On the basis of the first embodiment, this optional embodiment further refines what happens after the touch object leaves the screen from a touch position by introducing the sliding speed at which the touch object reaches that position and responding with different display results for different sliding speeds. This adds a gesture-based mode of application window control, gives the user flexible control over the display state of the application window, and effectively improves the usability of the interactive tablet.
Embodiment Two
Fig. 2 is a schematic flowchart of an application window control method according to the second embodiment of the present invention. The second embodiment is optimized on the basis of the above embodiments by adding the following steps: receiving a fifth touch operation, wherein the fifth touch operation is that the touch object slides from the second touch position toward the screen edge and stays at a third touch position; displaying a third preview interface of the target application window, wherein the interface position of the third preview interface is related to the third touch position; receiving a sixth touch operation, wherein the sixth touch operation is that the touch object slides from the third touch position toward the screen edge and stays at a fourth touch position; and displaying a fourth preview interface of the target application window, wherein the interface position of the fourth preview interface is related to the fourth touch position.
The optimization also adds: receiving a seventh touch operation, wherein the seventh touch operation is that the touch object leaves the screen from the fourth touch position; and displaying a third interface of the target application window, wherein the display position of the third interface is the same as that of the fourth preview interface.
Furthermore, the optimization adds: receiving a ninth touch operation, wherein the ninth touch operation is that the touch object slides from the fourth touch position to the screen edge; and displaying the target application window in the full-screen state.
As shown in fig. 2, the application window control method provided in the second embodiment specifically includes the following steps:
S201, displaying the target application window in a full screen state.
For example, the state displayed in this step may be the initial presentation state for application window control; specifically, the application window in the full-screen state at the uppermost layer on the screen may be determined as the target application window that the user can interactively operate.
S202, receiving a first touch operation, wherein the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position.
It can be understood that, in this embodiment, sliding the touch object from the screen edge toward the middle of the screen may serve as the trigger operation for application window control; that is, when the user wants to control the adjustment of the application window through interaction with the interactive tablet, the touch object must first be manipulated to slide from the screen edge toward the middle of the screen.
In addition, the execution subject can identify the different touch operations corresponding to the touch object passing through or staying at different touch positions during the slide. The first touch operation received in this step may preferably be understood as the touch object staying at the first touch position during the slide.
S203, displaying a first preview interface of the target application window, wherein the interface position of the first preview interface is related to the first touch position.
For example, after receiving a touch operation, the execution subject executes the corresponding response step, and the response result is shown in this step. For instance, when the received first touch operation is responded to, it is determined that the interface display form of the target application window currently needs to be adjusted.
The specific adjustment of the interface display form is related to the first touch position at which the touch object stays in the first touch operation; therefore, the interface position of the interface to be presented for the target application window can be determined from the first touch position, and the interface is displayed at the determined position in the form of the first preview interface.
S204, receiving a second touch operation, wherein the second touch operation is that the touch object slides from the first touch position toward the middle of the screen and stays at a second touch position.
Similarly to the execution of S202, the second touch operation, recognized from the sliding and stopping of the touch object, may also be received in this step.
S205, displaying a second preview interface of the target application window, wherein the interface position of the second preview interface is related to the second touch position.
As with the execution of S203, this step presents the result of responding to the received second touch operation, specifically a second preview interface whose interface position is related to the second touch position.
In the execution of S202 to S205, as the touch object slides from the screen edge toward the middle of the screen, each touch position at which it stays gradually approaches the screen center. Since the interface position of the displayed preview interface of the target application window is related to the touch position at which the touch object stays, as the touch position approaches the screen center, the displayed preview interface also approaches the screen center. Meanwhile, to preserve the display effect, each preview interface must have the same aspect ratio as the screen; therefore, as the interface position of the preview interface moves toward the center of the screen, the interface size of the preview interface gradually decreases.
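The two invariants stated above — constant aspect ratio and monotonically shrinking size as the touch position approaches the center — can be checked with a small sketch (the parameterization by distance from the screen center is an illustrative assumption, not the claimed method):

```python
def preview_size(screen_w, screen_h, half_extent):
    """Interface size of a preview whose edge lies `half_extent` px
    from the screen center, keeping the screen's aspect ratio
    (illustrative parameterization only)."""
    height = 2 * half_extent
    return (height * screen_w / screen_h, height)

# Touch positions closer to the center -> smaller previews, same ratio
sizes = [preview_size(1920, 1080, d) for d in (540, 360, 180)]
```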
S206, receiving a fifth touch operation, wherein the fifth touch operation is that the touch object slides from the second touch position to the edge of the screen and stays at a third touch position.
In this embodiment, this step can be regarded as a continuation of the above steps, and the received fifth touch operation relates to the user's manipulation of the touch object. Unlike the first and second touch operations, under the generation condition of the fifth touch operation the sliding direction of the touch object has changed.
Specifically, the touch object no longer slides from the screen edge toward the middle of the screen but instead slides from its current touch position (e.g., the second touch position) toward the screen edge. When the touch object slides from the second touch position toward the screen edge and stays at the third touch position, the execution subject in effect recognizes the generated fifth touch operation, which can be received in this step.
To identify the fifth touch operation, the execution body may likewise respond to the touch signal produced as the touch object slides across the equipped touch frame, feed the corresponding touch point information back to the upper layer, and analyze the sliding direction and sliding track corresponding to that information. When both the sliding direction and the sliding track indicate that the touch object slid from the current touch position (e.g., the second touch position) toward the edge of the screen and stayed at the third touch position, the operation performed by the user is determined to be the fifth touch operation, which is then received in this step.
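The direction-and-track analysis described here can be sketched as follows. This is a simplified illustration under assumed conventions (y grows downward; a "stay" is detected when the last two samples are nearly identical), not the actual touch-frame protocol:

```python
def classify_slide(points, dwell_eps=3):
    """Classify a vertical slide from a sequence of (x, y) touch
    points fed back by the touch frame.  Returns 'toward_edge' when
    the net motion heads back toward the bottom edge and the last
    samples dwell in place, else 'toward_middle'."""
    dy = points[-1][1] - points[0][1]                   # net vertical motion
    dwelling = abs(points[-1][1] - points[-2][1]) <= dwell_eps
    if dy > 0 and dwelling:                             # downward and stayed
        return 'toward_edge'
    return 'toward_middle'
```

A real implementation would examine the full track and timing rather than only the endpoints, but the dispatch principle is the same.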
And S207, displaying a third preview interface of the target application window, wherein the interface position of the third preview interface is related to the third touch position.
In this embodiment, in response to the fifth touch operation of S206, it can be determined that the fifth touch operation meets the condition for presenting the target application window preview interface on the screen; thus, the display form of the preview interface can be adjusted again according to the obtained information about the third touch position. The adjustment may specifically be: as the touch object slides from the second touch position toward the edge of the screen and reaches the third touch position, the target application window is adjusted from the displayed second preview interface to the display form of a third preview interface, and the interface position of the third preview interface is related to the third touch position where the touch object currently stays.
Similarly, regarding the association between the third preview interface and the third touch position: one edge of the third preview interface may contain the touch coordinate of the third touch position, or one vertex of the third preview interface may be located at the touch coordinate, or the center point of the third preview interface may coincide with the touch coordinate. Accordingly, before the third preview interface is displayed in this step, the interface position of the preview interface of the target application window may be adjusted based on the third touch position until the interface position corresponding to the third preview interface is reached.
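The three possible associations can be made concrete with a small helper. This is a sketch of the assumed semantics; the mode names and the choice of which edge or vertex carries the touch point are illustrative only:

```python
def anchor_preview(touch, w, h, mode='bottom_edge'):
    """Position a w-by-h preview relative to touch point (tx, ty):
    the touch may lie on one edge, at one vertex, or at the centre
    of the preview.  Returns (x, y, w, h) with a top-left origin."""
    tx, ty = touch
    if mode == 'bottom_edge':    # touch lies on the bottom edge, centred
        return (tx - w / 2, ty - h, w, h)
    if mode == 'vertex':         # touch is the top-left vertex
        return (tx, ty, w, h)
    if mode == 'center':         # touch is the centre point
        return (tx - w / 2, ty - h / 2, w, h)
    raise ValueError(mode)
```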
And S208, receiving a sixth touch operation, wherein the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at a fourth touch position.
As with S206 of the present embodiment, receiving the sixth touch operation in this step is likewise equivalent to receiving a touch operation generated by the user manipulating the touch object. As in the fifth touch operation, the sliding direction of the touch object for the sixth touch operation is also from the touch position where it currently stays (the third touch position) toward the edge of the screen.
The sixth touch operation may be specifically characterized as: the touch object continues to slide from the third touch position toward the edge of the screen and stays at the fourth touch position during the slide. In this embodiment, the touch object may stay at the third and fourth touch positions briefly or for a certain time interval, and the execution body identifies the sixth touch operation through the combination of the equipped touch frame and the upper-layer processing system.
S209, displaying a fourth preview interface of the target application window, wherein the interface position of the fourth preview interface is related to the fourth touch position.
In this embodiment, in response to the sixth touch operation in S208, it can be analyzed that the sixth touch operation also meets the condition of presenting the target application window preview interface on the screen, and therefore, the display form of the target application window preview interface can be adjusted again along with the obtained related information of the fourth touch position.
Wherein, the adjustment may specifically be: as the touch object slides from the third touch position toward the edge of the screen and reaches the fourth touch position, the target application window is adjusted from the displayed third preview interface to the display form of a fourth preview interface, and the interface position of the fourth preview interface is related to the fourth touch position where the touch object currently stays.
In addition, the association relationship between the fourth preview interface and the fourth touch position may also be referred to the above description of the association relationship between the third preview interface and the third touch position, and is not described herein again.
S210, receiving a seventh touch operation, wherein the seventh touch operation is that the touch object leaves the screen from the fourth touch position.
It can be understood that, while the touch object slides from its current touch position toward the edge of the screen, it may also leave the screen at some touch position. Since the execution body identifies the various touch operations performed as the user manipulates the touch object, it can also identify the touch operation in which the touch object leaves the display screen.
Specifically, when the execution body recognizes that the user manipulated the touch object to slide toward the edge of the screen and leave the screen at the fourth touch position, that touch operation may be recorded as a seventh touch operation. In this step, the seventh touch operation generated by the touch object leaving the screen from the fourth touch position may be received.
In this embodiment, the execution body may monitor the leave event of the touch object through a preset touch monitoring function. Specifically, while the user manipulates the touch object to slide toward the edge of the screen, the execution body monitors through this function whether a leave event occurs. If a leave event is detected, the execution body recognizes that the user manipulated the touch object to leave the screen from a touch position, and accordingly the seventh touch operation can be generated and received through this step.
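The monitoring logic can be sketched as a tiny state holder. The event names ('move', 'up') are assumptions of this illustration, not the embodiment's actual touch-listening API:

```python
class TouchMonitor:
    """Watches a stream of touch events and reports the position
    from which the touch object left the screen (the 'leave event'
    described above)."""
    def __init__(self):
        self.last_pos = None     # last position reported while touching
        self.left_at = None      # position at which the object left

    def feed(self, event, pos=None):
        if event == 'move':
            self.last_pos = pos
        elif event == 'up':      # leave event detected
            self.left_at = self.last_pos
            return self.last_pos
        return None
```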
S211, displaying a third interface of the target application window, wherein the display positions of the third interface and the fourth preview interface are the same.
In this embodiment, in response to the seventh touch operation of S210, when the touch object finishes sliding on the display screen and leaves it, the final display form of the target application window, namely the third interface, may be presented on the screen through this step. The display form of the third interface is related to the touch position at which the touch object leaves the screen; specifically, its interface position and interface size may be the same as those of the preview interface presented by the target application window at the moment the touch object leaves.
For example, when the touch position of the touch object when the touch object leaves the screen is a fourth touch position, a fourth preview interface associated with the target application window at the fourth touch position may be directly displayed as the third interface. That is, the interface size and the interface position of the third interface presented in this step are the same as those of the fourth preview interface.
From the above description it can also be seen that, while the user manipulates the touch object to touch the screen and slide from the middle toward the edge, the display form of the target application window changes with the touch position where the touch object stays. For example, when the touch object slides from the second touch position toward the edge of the screen and reaches the third touch position, the display form of the target application window is adjusted from the second preview interface to the interface position of the third preview interface; likewise, when the touch object slides from the third touch position toward the edge and reaches the fourth touch position, the display form is adjusted from the third preview interface to the interface position of the fourth preview interface.
Through these adjustments, the interface position of the preview interface gradually approaches the edge of the screen along with the touch position, so that, while keeping the same interface aspect ratio, the interface size presented by the preview interface of the target application window gradually increases. When the touch object leaves the screen from a touch position, the target application window is presented on the screen in the final form of the preview interface associated with that touch position. For example, if the touch object leaves from the fourth touch position, the fourth preview interface associated with that position serves as the final display form of the target application window and may be recorded as the third interface when displayed in this step.
Further, in the present embodiment, the seventh touch operation is refined on the basis of step S210. Specifically, in addition to the touch object leaving the screen from the fourth touch position, the seventh touch operation includes a limiting condition: the sliding speed at which the touch object slides to the fourth touch position while sliding toward the edge of the screen must be less than a second speed threshold. The seventh touch operation is thus refined as: the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is less than the set second speed threshold.
It can be understood that, in addition to recognizing that the touch object left the screen at the fourth touch position, the execution body needs to evaluate this limiting condition by monitoring the sliding speed during the slide. When it determines that the sliding speed at the fourth touch position is less than the second speed threshold, the generation condition of the seventh touch operation is met, the seventh touch operation is generated, and the display operation of S211 is performed.
In this embodiment, the second speed threshold may preferably be a set multiple of the height of the interactive tablet screen, and the set multiple may take a value of 0.3.
In addition, as an execution branch different from the seventh touch operation, when the execution body recognizes that the touch object left the screen from the fourth touch position but determines that the sliding speed of the touch object sliding to the fourth touch position is greater than or equal to the second speed threshold, the user's interactive operation may be considered to constitute an eighth touch operation. The execution body may then receive and respond to the identified eighth touch operation.
Specifically, regarding the response to the eighth touch operation: first, since the eighth touch operation includes the touch object leaving the screen at the fourth touch position, it is determined that the target application window should be displayed in a fixed form on the interface; then, since the sliding speed of the touch object sliding to the fourth touch position is greater than or equal to the second speed threshold, it is determined that the target application window should be displayed in a full-screen state. That is, in this embodiment, when the eighth touch operation is received, the target application window is displayed in the full-screen state.
It should be noted that the sliding speed of the touch object sliding to the fourth touch position may be determined with reference to the sliding speed determination method described in the first embodiment, which is not repeated here.
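The branch between the seventh and eighth touch operations reduces to a single threshold comparison. A minimal sketch, using the embodiment's stated factor of 0.3 times the screen height as the second speed threshold (the speed's units are left abstract here):

```python
def on_leave(slide_speed, screen_h, factor=0.3):
    """Decide the window's final form when the touch object leaves
    the screen at the fourth touch position.  Below the second speed
    threshold the window is fixed at the preview position (seventh
    touch operation); otherwise it returns to full screen (eighth)."""
    threshold = factor * screen_h            # second speed threshold
    if slide_speed < threshold:
        return 'fix_at_preview_position'     # seventh touch operation
    return 'restore_full_screen'             # eighth touch operation
```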
S212, receiving a ninth touch operation, wherein the ninth touch operation is that the touch object slides to the edge of the screen from the fourth touch position.
In this embodiment, this step can be regarded as an execution branch parallel to S210 above. Under this branch, the execution body determines that a ninth touch operation is generated when it recognizes that the user manipulated the touch object to slide from the fourth touch position all the way to the edge of the screen. When the ninth touch operation is identified, it may be received through this step.
And S213, displaying the target application window in a full-screen state.
In this embodiment, in response to the ninth touch operation in S212, the target application window in the full screen state is directly displayed on the screen as the presented response result.
It can be understood that the execution of S212 and S213 in this embodiment constitutes an implementation of controlling the application window to return to the full-screen state. It should be noted that this implementation mainly addresses the case in which the user manipulates the touch object to slide from the edge of the screen toward the middle, and then, after reaching a certain touch position, slide back toward the edge without leaving the screen; that is, throughout the process the target application window is never presented on the screen in the final form of the first interface, second interface, or third interface.
In order to better illustrate the application window control method provided by this embodiment, the following example again describes, from a visualization perspective, a specific implementation of controlling and adjusting the display form of the application window based on the user's interactive manipulation.
For example, from a visualization perspective, an application window in a full-screen state is displayed on the interactive tablet, and the user performs interactive operations on the interactive tablet using a finger as the touch object. Referring to the exemplary description of the embodiment, the display states of the target application window when the execution body receives the first, second, third, and fourth touch operations are as shown in fig. 1a, 1b, 1c, and 1d, respectively. The touch operations performed by the user on the interactive tablet are not described again here.
As another execution branch, after receiving the second touch operation and displaying the second preview interface accordingly, but before presenting the first interface or the second interface, the execution body may perform a display response of the third preview interface when it recognizes that the user manipulated the finger to perform the fifth touch operation.
In this example, the displayed third preview interface may be illustrated with fig. 2a. Specifically, fig. 2a is an interface schematic diagram of the third preview interface displayed on the interactive tablet in the second embodiment. As shown in fig. 2a, the third preview interface 25 is the interface presented by the interactive tablet 2 for the target application window after receiving the user's fifth touch operation. In fig. 2a, the finger 21 slides from the second touch position 22 toward the bottom edge 23 of the interactive tablet 2 and stays at the third touch position 24; at this time, the preview interface of the target application window is presented in the display form of the third preview interface 25. It can be seen that the third touch position 24 determines the interface position of the third preview interface 25; specifically, the third touch position 24 lies on the bottom edge of the third preview interface 25. In addition, to ensure the display effect, the third preview interface 25 may preferably have the same aspect ratio as the screen.
As the user continues to manipulate the finger to slide downward from the second touch position toward the bottom edge, the execution body responds to the further touch operation and may present the response effect shown in fig. 2b. Specifically, fig. 2b is an interface schematic diagram of the fourth preview interface displayed on the interactive tablet in the second embodiment. As shown in fig. 2b, the fourth preview interface 27 is the interface presented by the interactive tablet 2 for the target application window after receiving the user's sixth touch operation. In fig. 2b, the sixth touch operation is embodied as: the finger 21 continues to slide downward from the third touch position 24 of the interactive tablet 2 and stays at the fourth touch position 26 on the screen; at this time, the preview interface of the target application window is presented in the display form of the fourth preview interface 27. It can be seen that the fourth touch position 26 determines the interface position of the fourth preview interface 27; specifically, the fourth touch position 26 lies on the bottom edge of the fourth preview interface 27. To ensure the display effect, the fourth preview interface 27 may preferably have the same aspect ratio as the screen.
Comparing fig. 2a and fig. 2b in the vertical direction, it can be seen that as the finger slides back from the second touch position toward the bottom edge, the interface position of the preview interface of the target application window moves closer to the edge of the screen and its interface size increases.
In the above example, after responding to a further touch operation performed by the user, the response result shown in fig. 2c or fig. 2d may be presented. The two response results differ only in the outcome of a single limiting-condition determination, and therefore correspond to responses to different touch operations. Specifically, fig. 2c is an interface schematic diagram of the third interface displayed on the interactive tablet in the second embodiment, and fig. 2d is an interface schematic diagram of one full-screen-state target application window displayed on the interactive tablet in the second embodiment.
As shown in fig. 2c, the third interface 28 is the interface presented by the interactive tablet 2 for the target application window after receiving the user's seventh touch operation. In fig. 2c, the seventh touch operation is embodied as: the finger 21 leaves the screen from the fourth touch position 26 of the interactive tablet 2, and the sliding speed of the finger 21 at the fourth touch position 26 is less than the second speed threshold (e.g., 0.3 times the height of the screen). At this point, the target application window is no longer presented in preview form, but in the fixed form of the third interface 28.
It can be seen that the finger 21 leaving the screen at a touch position (e.g., the fourth touch position 26) determines that the target application window is no longer adjusted in preview mode with the movement of the finger 21 but is presented as a fixed interface, subject to the further limiting condition that the sliding speed of the finger 21 at the fourth touch position 26 is less than the second speed threshold. Once this condition is satisfied, the interface position of the presented fixed interface (e.g., the third interface 28) is the same as that of the preview interface presented at the touch position when the finger left the screen. Referring to fig. 2b and 2c, the third interface 28 in fig. 2c has the same interface position and interface size as the fourth preview interface in fig. 2b.
As shown in fig. 2d, the target application window is displayed in a full-screen state, which can be regarded as the state presented after the eighth touch operation is received. In fig. 2d, the eighth touch operation is embodied as: the finger 21 leaves the screen from the fourth touch position 26 of the interactive tablet 2, and the sliding speed of the finger 21 at the fourth touch position 26 is greater than or equal to the second speed threshold (e.g., 0.3 times the height of the screen). At this time, the target application window is no longer presented in preview form but directly in the full-screen state.
In addition, the present example also gives the interface schematic of fig. 2e for the display of the target application window in a full screen state. Specifically, fig. 2e is another interface schematic diagram of a full screen state target application window displayed on the interactive tablet in the second embodiment. The full screen state shown in fig. 2e may also be regarded as a state shown after receiving the ninth touch operation.
As shown in fig. 2e, the ninth touch operation is embodied as: the finger 21 continues to slide downward from the fourth touch position 26 of the interactive tablet 2 until it reaches the bottom edge 23 of the screen. At this time, the target application window is triggered to return to the full-screen state, so that after the ninth touch operation is received in this embodiment the target application window is directly presented in the full-screen state.
It should also be understood that the dashed figures appearing in the above drawings all represent historical states of earlier events. For example, the third preview interface 25 shown in dashed lines in fig. 2b represents the historical preview state of the target application window before the fourth preview interface 27 was displayed. Likewise, the dashed finger pointing to the fourth touch position 26 in fig. 2c represents the historical touch state before the finger left the screen at the fourth touch position 26. In fig. 2d, the third preview interface 25 and the fourth preview interface 27, both shown in dashed lines, represent the historical preview states that existed before the target application window was displayed in the full-screen state. And the dashed finger extending continuously downward in fig. 2e illustrates the historical sliding state of the finger sliding down to the bottom edge.
Through the schematic diagrams of the target application window interface change on the interaction tablet illustrated in fig. 2a to fig. 2e, this embodiment also provides a specific application interaction scenario example to further illustrate the application window interface display size change realized through the interaction operation between the user and the interaction tablet.
From the user's perspective: first, after the target application window in the full-screen state is presented on the interactive tablet, the user manipulates a finger to slide upward from the bottom edge of the interactive tablet; the corresponding change of the target application window is described in the above embodiments and is not repeated here.
Second, after sliding the finger upward to a certain touch position on the screen, the user changes the sliding direction, that is, manipulates the finger to slide downward from that touch position toward the bottom edge of the screen.
On the interactive tablet side, as the user's finger changes direction and slides toward the bottom edge, the target application window changes in real time in the form of a preview interface following the finger position, and the window display size of the preview interface continuously increases as the finger slides (refer to the change from fig. 2a to fig. 2b).
Then, when the user's finger slides downward and leaves the screen at a certain touch position, if the sliding speed of the finger at that position is less than 0.3 times the height of the screen, the target application window is triggered to exit full screen, and the touch position where the finger left the screen serves as the presentation position of the bottom edge of the target application window interface (refer to the interface effect shown in fig. 2c).
Accordingly, if the sliding speed of the finger at the touch position is greater than or equal to 0.3 times of the height of the screen, it may be considered that the operation of returning the target application window to the full screen is triggered, so that the interface of the target application window may be returned to the full screen state and displayed in the display size of the full screen state (refer to the interface effect given in fig. 2 d).
In addition, the user may also control the finger to slide downward until it reaches the bottom edge of the screen (with no limit on the sliding speed). This interaction is likewise considered to trigger the operation of restoring the target application window to full screen, and the interface of the target application window returns to the full-screen state and is displayed at the full-screen display size (refer to the interface effect shown in fig. 2e).
On the basis of the first embodiment, the application window control method provided by the second embodiment of the present invention specifically describes application window control implemented through further touch operations. In this control, the touch operations generated by the interaction between the user and the interactive tablet relate both to the touch object sliding from the edge of the screen toward the middle and to the touch object sliding from the middle toward the edge. Through these sliding operations, the interactive tablet can flexibly adjust the interface position of the target application window, control it to exit the full-screen state, and control it to return to the full-screen state. This interaction solves the problem that an overly large interactive tablet prevents the user from effectively controlling the application window to exit full screen, and also avoids the frequent interactions otherwise needed to do so. It thereby enriches the interaction control gestures between the user and the interactive tablet, provides convenient control for the user, and greatly improves the system usability of the interactive tablet.
On the basis of the second embodiment, the present optional embodiment further provides a specific implementation that different touch positions affect the interface position of the target application window preview interface from the bottom layer perspective.
Specifically, for each preview interface displayed by the target application window, the embodiment preferably selects that the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface may be represented by the vertex coordinates of the interface vertex.
For each vertex coordinate, the vertex coordinate may be determined by combining a target touch coordinate of a target touch position with a width and height value of a screen, where the target touch position is a touch position where the touch object stays when the preview interface is displayed.
It can be known that the preview interfaces in this optional embodiment include the first preview interface, the second preview interface, the third preview interface, the fourth preview interface, and the like appearing in the first embodiment and the second embodiment, and the target touch positions in this optional embodiment include the first touch position, the second touch position, the third touch position, the fourth touch position, and the like passed by the touch object in the first embodiment and the second embodiment.
For any target touch position related to the interface position of the preview interface, the embodiment may extract a target touch coordinate of the target touch position, and on the premise that the interface aspect ratio of the target application window is the same as the screen aspect ratio, vertex coordinates of vertices of each interface in the preview interface may be determined through the target touch coordinate and the aspect value of the screen.
For example, when the user manipulates the touch object to slide from an edge of the screen, which edge is selected affects the determination of the vertex coordinates of the preview interface. This embodiment illustrates the determination of each vertex coordinate case by case according to the screen edge selected by the user.
In this embodiment, when the edge of the screen is a left edge or a right edge, determining a vertex abscissa of each interface vertex through an abscissa in the target touch coordinate; and determining the vertex vertical coordinate corresponding to each interface vertex by combining the vertex horizontal coordinate with the width and height value of the screen.
It can be known that, in the present embodiment, the touch coordinate of the target touch position is first set to be on the side parallel to the edge of the screen and closest to the edge of the screen, that is, the side corresponding to the preview interface moves along with the movement of the target touch position. Meanwhile, as can be seen from the above description of the embodiment, the executing entity may obtain a touch coordinate fed back by the touch frame relative to the target touch position, where the touch coordinate is recorded as a target touch coordinate, and the target touch coordinate includes an abscissa and an ordinate.
Based on this, when the screen edge is the left or right edge, sliding from the screen edge toward the screen middle is predominantly horizontal. In that case, the side of the preview interface parallel to the left or right edge is first translated horizontally, and the amount of the translation is determined by the abscissa of the target touch position.
Illustratively, assume the preview interface of the target application window is represented by ABCD, where side AB is the side parallel to and closest to the left/right edge, and side CD is the side parallel to but farther from that edge. In addition, point E represents the target touch position and can be considered to lie on side AB; the abscissas of vertices A and B are therefore the abscissa of point E, which also gives the distance the touch object has slid from the left/right edge to point E. Meanwhile, side CD is considered to move horizontally toward the middle of the screen by the same distance, so the abscissas of vertices C and D at this time can also be calculated.
With the abscissas of vertices A, B, C, and D obtained and the screen width and height values known, the lengths of sides AC and BD can be calculated. Meanwhile, because the aspect ratio of the preview interface equals the screen aspect ratio, once the lengths of sides AC and BD are determined, the lengths of sides AB and CD can also be determined, yielding the ordinates of vertices A, B, C, and D and, finally, the vertex coordinates of all four vertices.
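As a concrete illustration of the left/right-edge case above, the following sketch computes the four vertex coordinates from the touch abscissa and the screen's width and height values. The function name, the vertical centering of the preview, and the coordinate convention (origin at the top left, x to the right, y downward) are illustrative assumptions, not details fixed by this embodiment.

```python
def preview_vertices_lr(x_e, screen_w, screen_h, edge="left"):
    """Vertex coordinates A, B, C, D of the preview interface for a swipe
    from the left or right screen edge staying at abscissa x_e.
    Vertical centering of the preview is an assumption."""
    # Distance the touch object has slid inward from the chosen edge.
    dist = x_e if edge == "left" else screen_w - x_e
    # Side AB (nearest the edge) follows the touch abscissa; side CD
    # moves toward the screen middle by the same distance.
    x_near = x_e
    x_far = screen_w - dist if edge == "left" else dist
    # Preview width; height follows from the screen aspect ratio.
    width = abs(x_far - x_near)
    height = width * screen_h / screen_w
    y_top = (screen_h - height) / 2.0  # assumed vertical centering
    y_bottom = y_top + height
    return ((x_near, y_top), (x_near, y_bottom),
            (x_far, y_top), (x_far, y_bottom))
```

For a 1920x1080 screen and a swipe from the left edge staying at abscissa 100, the preview spans x = 100 to x = 1820, and its height of 967.5 preserves the 16:9 screen aspect ratio.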
In this embodiment, when the screen edge is the top edge or the bottom edge, the vertex ordinate of each interface vertex is determined from the ordinate in the target touch coordinate, and the vertex abscissa corresponding to each interface vertex is then determined from the vertex ordinate combined with the width and height values of the screen.
Illustratively, when the screen edge is the top or bottom edge, sliding from the screen edge toward the screen middle is predominantly vertical. In that case, the side of the preview interface parallel to the top or bottom edge is first translated vertically, and the amount of the translation is determined by the ordinate of the target touch position.
Specifically, assume the preview interface of the target application window is still represented by ABCD, where side AC is the side parallel to and closest to the top/bottom edge, and side BD is the side parallel to but farther from that edge. Point E again represents the target touch position and can be considered to lie on side AC; the ordinates of vertices A and C are therefore the ordinate of point E, which also gives the distance the touch object has slid from the top/bottom edge to point E. Meanwhile, side BD is considered to move vertically toward the middle of the screen by the same distance, so the ordinates of vertices B and D at this time can also be calculated.
With the ordinates of vertices A, B, C, and D obtained and the screen width and height values known, the lengths of sides AB and CD can be calculated. Meanwhile, because the aspect ratio of the preview interface equals the screen aspect ratio, once the lengths of sides AB and CD are determined, the lengths of sides AC and BD can also be determined, yielding the abscissas of vertices A, B, C, and D and, finally, the vertex coordinates of all four vertices.
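The top/bottom-edge case above can be sketched analogously; horizontal centering of the preview and the function name are again illustrative assumptions.

```python
def preview_vertices_tb(y_e, screen_w, screen_h, edge="top"):
    """Vertex coordinates A, B, C, D of the preview interface for a swipe
    from the top or bottom screen edge staying at ordinate y_e.
    Horizontal centering of the preview is an assumption."""
    # Distance the touch object has slid inward from the chosen edge.
    dist = y_e if edge == "top" else screen_h - y_e
    # Side AC (nearest the edge) follows the touch ordinate; side BD
    # moves toward the screen middle by the same distance.
    y_near = y_e
    y_far = screen_h - dist if edge == "top" else dist
    # Preview height; width follows from the screen aspect ratio.
    height = abs(y_far - y_near)
    width = height * screen_w / screen_h
    x_left = (screen_w - width) / 2.0  # assumed horizontal centering
    x_right = x_left + width
    # A and C lie on the near side AC; B and D lie on the far side BD.
    return ((x_left, y_near), (x_left, y_far),
            (x_right, y_near), (x_right, y_far))
```

For a 1920x1080 screen and a swipe from the top edge staying at ordinate 90, the preview spans y = 90 to y = 990, and its width of 1600 preserves the 16:9 screen aspect ratio.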
This second optional embodiment provides a concrete underlying implementation for determining the interface position of the preview interface from the touch coordinate of the touch position where the touch object stays, and thereby provides underlying technical support for the application window control method provided by the second embodiment.
Embodiment Three
Fig. 3 is a schematic structural diagram of an application window control apparatus according to a third embodiment of the present invention. As shown in Fig. 3, the apparatus includes: a full screen display module 31, a first receiving module 32, a first display module 33, a second receiving module 34, a second display module 35, a third receiving module 36, and a third display module 37;
the full-screen display module 31 is configured to display a target application window in a full-screen state;
a first receiving module 32, configured to receive a first touch operation, where the first touch operation is that the touch object slides from an edge of the screen to a middle of the screen and stays at a first touch position;
the first display module 33 is configured to display a first preview interface of the target application window, where an interface position of the first preview interface is related to the first touch position;
a second receiving module 34, configured to receive a second touch operation, where the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at a second touch position;
a second display module 35, configured to display a second preview interface of the target application window, where an interface position of the second preview interface is related to the second touch position;
a third receiving module 36, configured to receive a third touch operation, where the third touch operation is that the touch object leaves the screen from the second touch position;
and a third display module 37, configured to display a first interface of the target application window, where the first interface has the same interface position as the second preview interface.
In the application window control apparatus provided by the third embodiment of the present invention, the touch operations generated by the interaction between the user and the interactive tablet involve only the touch object sliding from the screen edge toward the middle of the screen, and the interactive tablet can directly adjust the interface position of the target application window in response to each touch operation, so as to control the target application window to exit the full screen state. Throughout the interaction, this solves the problem that a user cannot effectively control the application window to exit full screen because the interactive tablet is too large, and also avoids requiring frequent interaction before the application window can be controlled to exit full screen. User operation is thereby made convenient, and the usability of the interactive tablet is greatly improved.
Further, the third touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is less than the set first speed threshold.
Further, the apparatus further comprises:
a fourth receiving module, configured to receive a fourth touch operation, where the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is greater than or equal to the first speed threshold;
and the fourth display module is used for displaying a second interface of the target application window, and the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
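The distinction handled by the third and fourth display modules can be sketched as a release-time decision on the sliding speed. The function name, return values, and the default multiple below are illustrative assumptions, not the patent's interface.

```python
def on_release(sliding_speed, first_speed_threshold, preview_rect,
               scale_multiple=0.8):
    """Decide which interface to show when the touch object leaves the
    screen at the second touch position (illustrative sketch)."""
    if sliding_speed < first_speed_threshold:
        # Slow release (third touch operation): show the first interface
        # at the same position as the second preview interface.
        return ("first_interface", preview_rect)
    # Fast release (fourth touch operation): show the second interface at
    # a set multiple of the full-screen display size.
    return ("second_interface", scale_multiple)
```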
Further, the apparatus further comprises:
a fifth receiving module, configured to receive a fifth touch operation, where the fifth touch operation is that the touch object slides from the second touch position to the edge of the screen and stays at a third touch position;
the fifth display module is used for displaying a third preview interface of the target application window, and the interface position of the third preview interface is related to the third touch position;
a sixth receiving module, configured to receive a sixth touch operation, where the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at a fourth touch position;
and the sixth display module is used for displaying a fourth preview interface of the target application window, wherein the interface position of the fourth preview interface is related to the fourth touch position.
Further, the apparatus further comprises:
a seventh receiving module, configured to receive a seventh touch operation, where the seventh touch operation is that the touch object leaves the screen from the fourth touch position;
and the seventh display module is used for displaying a third interface of the target application window, and the display positions of the third interface and the fourth preview interface are the same.
Further, the seventh touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is less than a set second speed threshold.
Further, the apparatus further comprises:
an eighth receiving module, configured to receive an eighth touch operation, where the eighth touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is greater than or equal to the second speed threshold;
and the eighth display module is used for displaying the target application window in a full screen state.
Further, the apparatus further comprises:
a ninth receiving module, configured to receive a ninth touch operation, where the ninth touch operation is a sliding of the touch object from the fourth touch position to the edge of the screen;
and the ninth display module is used for displaying the target application window in a full screen state.
Further, for each preview interface displayed by the target application window, the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface is characterized by vertex coordinates of the interface vertices;
each of the vertex coordinates is determined by an information determination module included in the apparatus;
the information determination module is configured to determine each vertex coordinate from a target touch coordinate of a target touch position combined with the width and height values of the screen, where the target touch position is the touch position at which the touch object stays when the preview interface is displayed.
On the basis of the above, the information determination module is specifically configured to:
when the edge of the screen is a left edge/a right edge, determining a vertex abscissa of each interface vertex through an abscissa in the target touch coordinate;
and determining the vertex vertical coordinate corresponding to each interface vertex by combining the vertex horizontal coordinate with the width and height value of the screen.
On the basis of the above, the information determination module is further specifically configured to:
when the edge of the screen is a bottom edge/a top edge, determining a vertex ordinate of each interface vertex through the ordinate in the target touch coordinate;
and determining vertex horizontal coordinates corresponding to the vertexes of the interfaces by combining the vertex vertical coordinates and the width and height values of the screen.
Further, the operation of the touch object leaving the screen from a touch position is identified by a called touch listening function.
Further, the sliding speed of the touch object when sliding from one touch position to another touch position is determined by the touch coordinates and the touch time point of each touch position.
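A minimal sketch of this speed determination, assuming timestamps in seconds and coordinates in pixels (the units and function name are assumptions for illustration):

```python
import math

def sliding_speed(p1, t1, p2, t2):
    """Sliding speed between two touch positions, computed from the touch
    coordinates and touch time points of each position."""
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("touch time points must be strictly increasing")
    # Euclidean distance covered divided by the elapsed time.
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / dt
```

The resulting value can then be compared against the first or second speed threshold described above.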
Embodiment Four
Fig. 4 is a schematic structural diagram of an interactive tablet according to a fourth embodiment of the present application. The interactive tablet includes: a processor 40, a memory 41, a display screen 42, an input device 43, an output device 44, and a touch assembly 45. The number of processors 40 in the interactive tablet may be one or more, and one processor 40 is taken as an example in Fig. 4. The number of memories 41 in the interactive tablet may be one or more, and one memory 41 is taken as an example in Fig. 4. The processor 40, the memory 41, the display screen 42, the input device 43, the output device 44, and the touch assembly 45 of the interactive tablet may be connected via a bus or other means; in Fig. 4, a bus connection is taken as an example.
The memory 41 serves as a computer readable storage medium for storing software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the interactive tablet according to any embodiment of the present invention (for example, the full screen display module 31, the first receiving module 32, the first display module 33, the second receiving module 34, the second display module 35, the third receiving module 36, and the third display module 37 in the application window control device). The memory 41 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the device, and the like. Further, the memory 41 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, memory 41 may further include memory located remotely from processor 40, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The display screen 42 and the touch component 45 are overlaid (the overlay relationship is not shown in Fig. 4) and may form a touch screen for displaying interactive content. Generally speaking, the display screen 42 is used to display data as instructed by the processor 40, while the touch screen also receives touch operations applied to it and sends corresponding signals to the processor 40 or other devices.
The input device 43 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the device; it may also be a camera for acquiring images or a sound pickup device for acquiring audio data. The output device 44 may include an audio device such as a speaker. It should be noted that the specific composition of the input device 43 and the output device 44 can be set according to actual conditions.
The touch assembly 45 is configured to respond to touch operations of a touch object through the hardware circuits it includes.
The processor 40 executes various functional applications and data processing of the device by running software programs, instructions and modules stored in the memory 41, that is, implements the application window control method provided by any of the above-described embodiments.
The interactive tablet provided by the above can be used for executing the application window control method provided by any of the above embodiments, and has corresponding functions and beneficial effects.
Embodiment Five
A fifth embodiment of the present application further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform an application window control method including:
displaying a target application window in a full screen state;
receiving a first touch operation, wherein the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position;
displaying a first preview interface of the target application window, wherein the interface position of the first preview interface is related to the first touch position;
receiving a second touch operation, wherein the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at a second touch position;
displaying a second preview interface of the target application window, wherein the interface position of the second preview interface is related to the second touch position;
receiving a third touch operation, wherein the third touch operation is that the touch object leaves the screen from the second touch position;
and displaying a first interface of the target application window, wherein the first interface has the same interface position as the second preview interface.
Of course, the storage medium including the computer-executable instructions provided in the embodiments of the present application is not limited to the above-described operations of the application window control method, and may also perform related operations in the application window control method provided in any embodiment of the present invention, and has corresponding functions and advantages.
Embodiment Six
A sixth embodiment of the present application further provides a computer program which, when executed by a computer processor, implements an application window control method, the method including:
displaying a target application window in a full screen state;
receiving a first touch operation, wherein the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position;
displaying a first preview interface of the target application window, wherein the interface position of the first preview interface is related to the first touch position;
receiving a second touch operation, wherein the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at a second touch position;
displaying a second preview interface of the target application window, wherein the interface position of the second preview interface is related to the second touch position;
receiving a third touch operation, wherein the third touch operation is that the touch object leaves the screen from the second touch position;
and displaying a first interface of the target application window, wherein the first interface has the same interface position as the second preview interface.
Further, the third touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is less than the set first speed threshold.
Further, the method further comprises:
receiving a fourth touch operation, wherein the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is greater than or equal to the first speed threshold;
and displaying a second interface of the target application window, wherein the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
Further, the method also includes:
receiving a fifth touch operation, wherein the fifth touch operation is that the touch object slides from the second touch position to the edge of the screen and stays at a third touch position;
displaying a third preview interface of the target application window, wherein the interface position of the third preview interface is related to the third touch position;
receiving a sixth touch operation, wherein the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at a fourth touch position;
and displaying a fourth preview interface of the target application window, wherein the interface position of the fourth preview interface is related to the fourth touch position.
Further, the method further comprises:
receiving a seventh touch operation, wherein the seventh touch operation is that the touch object leaves the screen from the fourth touch position;
and displaying a third interface of the target application window, wherein the display position of the third interface is the same as that of the fourth preview interface.
Further, the seventh touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is less than a set second speed threshold.
Further, the method further comprises:
receiving an eighth touch operation, wherein the eighth touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is greater than or equal to the second speed threshold;
displaying the target application window in a full screen state.
Further, the method further comprises:
receiving a ninth touch operation, wherein the ninth touch operation is that the touch object slides to the edge of the screen from the fourth touch position;
displaying the target application window in a full-screen state.
Further, for each preview interface displayed by the target application window, the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface is characterized by vertex coordinates of the interface vertices;
and determining each vertex coordinate by combining a target touch coordinate of a target touch position with a width and height value of a screen, wherein the target touch position is a touch position where the touch object stays when the preview interface is displayed.
Further, the step of determining each vertex coordinate by the target touch coordinate of the target touch position and the width and height value of the screen includes:
when the edge of the screen is a left edge/a right edge, determining a vertex abscissa of each interface vertex through an abscissa in the target touch coordinate;
and determining the vertex vertical coordinate corresponding to each interface vertex by combining the vertex horizontal coordinate with the width and height value of the screen.
Further, the step of determining each vertex coordinate by the target touch coordinate of the target touch position and the width and height value of the screen includes:
when the edge of the screen is a bottom edge/a top edge, determining a vertex ordinate of each interface vertex through the ordinate in the target touch coordinate;
and determining vertex horizontal coordinates corresponding to the vertexes of the interfaces by combining the vertex vertical coordinates and the width and height values of the screen.
Further, the operation of the touch object leaving the screen from a touch position is identified by a called touch listening function.
Further, the sliding speed of the touch object when sliding from one touch position to another is determined by the touch coordinates and the touch time points of the touch positions.
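The touch-operation flow described above can be summarized as a small state tracker: sliding in from a screen edge previews the window, releasing commits it according to the sliding speed, and sliding back to the edge keeps full screen. The state names and method names below are illustrative assumptions rather than the patent's implementation.

```python
class SwipeGestureTracker:
    """Minimal sketch of the edge-swipe gesture flow (illustrative)."""

    def __init__(self):
        self.state = "fullscreen"
        self.preview_pos = None

    def slide_inward(self, touch_pos):
        # First/second (or fifth/sixth) touch operations: the preview
        # interface position follows the touch position where it stays.
        self.state = "previewing"
        self.preview_pos = touch_pos

    def slide_back_to_edge(self):
        # Ninth touch operation: returning to the edge keeps full screen.
        self.state = "fullscreen"
        self.preview_pos = None

    def release(self, speed, speed_threshold):
        # Third/fourth touch operations: a slow release keeps the preview
        # position; a fast release scales the window instead.
        if self.state == "previewing":
            self.state = "windowed" if speed < speed_threshold else "scaled"
        return self.state
```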
From the above description of the embodiments, it is obvious for those skilled in the art that the present application can be implemented by software and necessary general hardware, and certainly can be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling an interactive tablet (which may be a robot, a personal computer, a server, or a network device) to execute the application window control method according to any embodiment of the present application.
It should be noted that, in the above application window control apparatus, the included units and modules are divided only according to functional logic, but the division is not limited thereto as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only used to distinguish them from one another and are not used to limit the protection scope of the application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.

Claims (16)

  1. An application window control method, comprising:
    displaying a target application window in a full screen state;
    receiving a first touch operation, wherein the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position;
    displaying a first preview interface of the target application window, wherein the interface position of the first preview interface is related to the first touch position;
    receiving a second touch operation, wherein the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at a second touch position;
    displaying a second preview interface of the target application window, wherein the interface position of the second preview interface is related to the second touch position;
    receiving a third touch operation, wherein the third touch operation is that the touch object leaves the screen from the second touch position;
    and displaying a first interface of the target application window, wherein the first interface has the same interface position as the second preview interface.
  2. The method of claim 1, wherein the third touch operation is that the touch object leaves the screen from the second touch position, and a sliding speed of the touch object sliding to the second touch position is less than a set first speed threshold.
  3. The method of claim 2, further comprising:
    receiving a fourth touch operation, wherein the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is greater than or equal to the first speed threshold;
    and displaying a second interface of the target application window, wherein the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
  4. The method according to any one of claims 1-3, further comprising:
    receiving a fifth touch operation, wherein the fifth touch operation is that the touch object slides from the second touch position to the edge of the screen and stays at a third touch position;
    displaying a third preview interface of the target application window, wherein the interface position of the third preview interface is related to the third touch position;
    receiving a sixth touch operation, wherein the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at a fourth touch position;
    and displaying a fourth preview interface of the target application window, wherein the interface position of the fourth preview interface is related to the fourth touch position.
  5. The method of claim 4, further comprising:
    receiving a seventh touch operation, wherein the seventh touch operation is that the touch object leaves the screen from the fourth touch position;
    and displaying a third interface of the target application window, wherein the display position of the third interface is the same as that of the fourth preview interface.
  6. The method of claim 5, wherein the seventh touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is less than a set second speed threshold.
  7. The method of claim 6, further comprising:
    receiving an eighth touch operation, wherein the eighth touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is greater than or equal to the second speed threshold;
    displaying the target application window in a full-screen state.
  8. The method of any one of claims 4-7, further comprising:
    receiving a ninth touch operation, wherein the ninth touch operation is that the touch object slides from the fourth touch position to the edge of the screen;
    displaying the target application window in a full-screen state.
  9. The method of any one of claims 4-8, wherein, for each preview interface displayed by the target application window, the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface is characterized by vertex coordinates of interface vertices;
    and each vertex coordinate is determined by combining a target touch coordinate of a target touch position with the width and height values of the screen, wherein the target touch position is the touch position where the touch object stays when the preview interface is displayed.
  10. The method of claim 9, wherein the step of determining each vertex coordinate from the target touch coordinate of the target touch position in combination with the width and height values of the screen comprises:
    when the edge of the screen is a left edge or a right edge, determining a vertex abscissa of each interface vertex from the abscissa in the target touch coordinate;
    and determining the vertex ordinate corresponding to each interface vertex by combining the vertex abscissa with the width and height values of the screen.
  11. The method according to claim 9 or 10, wherein the step of determining each vertex coordinate from the target touch coordinate of the target touch position in combination with the width and height values of the screen comprises:
    when the edge of the screen is a bottom edge or a top edge, determining a vertex ordinate of each interface vertex from the ordinate in the target touch coordinate;
    and determining the vertex abscissa corresponding to each interface vertex by combining the vertex ordinate with the width and height values of the screen.
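One plausible reading of the vertex computation in claims 9-11, sketched in Python. The claims only fix that one coordinate axis follows the touch coordinate and that the other follows from the screen's aspect ratio; anchoring the preview to the gesture's starting edge and centering it on the touch point are illustrative assumptions:

```python
def preview_vertices(touch_x, touch_y, screen_w, screen_h, edge):
    """Compute the four vertex coordinates of a preview interface whose
    aspect ratio equals the screen's (claims 9-11). Edge anchoring and
    centering on the touch point are assumptions, not claimed details."""
    if edge in ("left", "right"):
        # Claim 10: the vertex abscissae follow the touch abscissa.
        x0, x1 = (0, touch_x) if edge == "left" else (touch_x, screen_w)
        width = x1 - x0
        height = width * screen_h / screen_w   # keep the screen's aspect ratio
        y0 = touch_y - height / 2              # assumed: centered on the touch ordinate
        y1 = y0 + height
    else:  # "top" or "bottom"
        # Claim 11: the vertex ordinates follow the touch ordinate.
        y0, y1 = (0, touch_y) if edge == "top" else (touch_y, screen_h)
        height = y1 - y0
        width = height * screen_w / screen_h
        x0 = touch_x - width / 2
        x1 = x0 + width
    return [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
```

On a 1920x1080 screen, a drag from the left edge paused at x = 480 would, under these assumptions, yield a 480x270 preview centered vertically on the touch point.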
  12. The method of any of claims 1-11, wherein the act of moving the touching object away from a touch location is identified by a called touch listening function.
  13. The method according to any one of claims 1-11, wherein the sliding speed of the touching object when sliding from one touch position to another touch position is determined by the touch coordinates and touch time of each touch position.
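Claim 13 derives the sliding speed from the touch coordinates and touch times of the two positions. A straightforward sketch; the coordinate units and the use of Euclidean distance are assumptions:

```python
import math

def sliding_speed(pos_a, time_a, pos_b, time_b):
    """Speed of the touch object sliding from pos_a to pos_b (claim 13):
    Euclidean distance between the two touch coordinates divided by the
    elapsed touch time. Pixels and seconds are assumed units."""
    distance = math.hypot(pos_b[0] - pos_a[0], pos_b[1] - pos_a[1])
    elapsed = time_b - time_a
    return distance / elapsed if elapsed > 0 else 0.0
```

For instance, `sliding_speed((0, 0), 0.0, (300, 400), 0.5)` gives 1000.0 pixels per second, which the method would then compare against the first or second speed threshold.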
  14. An application window control apparatus, comprising:
    the full screen display module is used for displaying the target application window in a full screen state;
    the first receiving module is used for receiving a first touch operation, wherein the first touch operation is that a touch object slides from the edge of the screen to the middle of the screen and stays at a first touch position;
    the first display module is used for displaying a first preview interface of the target application window, and the interface position of the first preview interface is related to the first touch position;
    the second receiving module is used for receiving a second touch operation, wherein the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at a second touch position;
    the second display module is used for displaying a second preview interface of the target application window, and the interface position of the second preview interface is related to the second touch position;
    the third receiving module is used for receiving a third touch operation, wherein the third touch operation is that the touch object leaves the screen from the second touch position;
    and the third display module is used for displaying the first interface of the target application window, and the first interface has the same interface position as the second preview interface.
  15. An interactive panel, comprising:
    the touch component is used for responding to a touch operation of a touch object through included hardware circuits;
    the display screen is covered with the touch component to form a touch screen and is used for displaying an application window;
    one or more processors;
    storage means for storing one or more programs;
    wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-13.
  16. A storage medium containing computer-executable instructions which, when executed by a computer processor, perform the method of any one of claims 1-13.
CN202180005735.1A 2021-07-27 2021-07-27 Application window control method and device, interactive panel and storage medium Pending CN115885245A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/108753 WO2023004600A1 (en) 2021-07-27 2021-07-27 Application window control method and apparatus, and interactive flat panel and storage medium

Publications (1)

Publication Number Publication Date
CN115885245A true CN115885245A (en) 2023-03-31

Family

ID=85086107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180005735.1A Pending CN115885245A (en) 2021-07-27 2021-07-27 Application window control method and device, interactive panel and storage medium

Country Status (2)

Country Link
CN (1) CN115885245A (en)
WO (1) WO2023004600A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101702111B (en) * 2009-11-13 2013-07-03 宇龙计算机通信科技(深圳)有限公司 Method for realizing content scaling of touch screen and terminal
CN102981596A (en) * 2012-12-21 2013-03-20 东莞宇龙通信科技有限公司 Terminal and screen interface display method
CN104503682A (en) * 2014-11-07 2015-04-08 联发科技(新加坡)私人有限公司 Method for processing screen display window and mobile terminal
CN105549824A (en) * 2015-12-26 2016-05-04 魅族科技(中国)有限公司 Display control method and mobile terminal
US20180203596A1 (en) * 2017-01-19 2018-07-19 Microsoft Technology Licensing, Llc Computing device with window repositioning preview interface
CN111124338B (en) * 2019-12-18 2024-03-08 青岛海信商用显示股份有限公司 Screen control method and touch display device
CN111966252A (en) * 2020-05-14 2020-11-20 华为技术有限公司 Application window display method and electronic equipment

Also Published As

Publication number Publication date
WO2023004600A1 (en) 2023-02-02

Similar Documents

Publication Publication Date Title
US11301200B2 (en) Method of providing annotation track on the content displayed on an interactive whiteboard, computing device and non-transitory readable storage medium
CN106775313B (en) Split screen operation control method and mobile terminal
KR102611858B1 (en) Method for operating intelligent interactive tablet, and storage medium and related device
US9880727B2 (en) Gesture manipulations for configuring system settings
US20160004373A1 (en) Method for providing auxiliary information and touch control display apparatus using the same
US20130307765A1 (en) Contactless Gesture-Based Control Method and Apparatus
CN101667058B (en) Interactive method for switching focuses among multiple systems
CN108829314B (en) Screenshot selecting interface selection method, device, equipment and storage medium
CN104331246A (en) Device and method for split screen display in terminal
CN109885222B (en) Icon processing method and device, electronic equipment and computer readable medium
CN110750197A (en) File sharing method, device and system, corresponding equipment and storage medium
CN110928614B (en) Interface display method, device, equipment and storage medium
CN112099707A (en) Display method and device and electronic equipment
WO2023207145A1 (en) Screenshot capturing method and apparatus, electronic device and computer readable medium
CN109976614B (en) Method, device, equipment and medium for marking three-dimensional graph
CN105589636A (en) Method and mobile terminal used for realizing virtual pointer control on touch screen
US11455071B2 (en) Layout method, device and equipment for window control bars
CN105824401A (en) Mobile terminal control method and mobile terminal thereof
CN110688190A (en) Control method and device of intelligent interactive panel
CN113721808A (en) Control method and device
CN107817927B (en) Application icon management method and device
WO2020143387A1 (en) Table processing method, device and system, and storage medium and intelligent interactive tablet
WO2017075958A1 (en) Terminal operation method and device
CN104216624A (en) Display method and electronic device
CN115885245A (en) Application window control method and device, interactive panel and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination