CN111338550A - Window control method, device and equipment - Google Patents

Window control method, device and equipment

Info

Publication number
CN111338550A
CN111338550A (application number CN202010091478.5A)
Authority
CN
China
Prior art keywords
window
touch points
screen display
large screen
determining
Prior art date
Legal status
Pending
Application number
CN202010091478.5A
Other languages
Chinese (zh)
Inventor
魏巍 (Wei Wei)
罗永浩 (Luo Yonghao)
Current Assignee
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority: CN202010091478.5A
Publication: CN111338550A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486: Drag-and-drop
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412: Digitisers structurally integrated in a display
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041: Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104: Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a window control method, apparatus, and device for controlling windows shown on a large-screen display through multi-point touch. The method is applied to a smart terminal connected to a large-screen display on which at least one window is displayed, and comprises the following steps: acquiring, in real time, the positions of the touch points detected by the large-screen display; determining a controlled window among the windows displayed on the large-screen display according to the number of touch points and the regions in which they are located, wherein the number of touch points is at least three; determining a corresponding control instruction according to how the touch-point positions change; and performing, on the controlled window, the operation corresponding to the control instruction.

Description

Window control method, device and equipment
This application is a divisional of the parent application entitled "A window control method, device and equipment", application No. 201810457284.5, filed on May 14, 2018.
Technical Field
The application relates to the field of internet, in particular to a window control method, device and equipment.
Background
In recent years, technologies in the field of human-computer interaction (HCI) have developed rapidly. HCI technology refers to technology that enables efficient interaction between people and computers through computer input and output devices. With the rapid rise of artificial intelligence (AI), one of the most closely watched future technologies, AI has been increasingly applied to the field of human-computer interaction.
At present, interaction between people and AI large screens, as one form of human-computer interaction, is used more and more widely in offices, classrooms, and similar settings. As AI large screens have developed, users increasingly want to display multiple windows on a large-screen display at the same time, for example a video player window alongside a word-processing window, so that a user can take notes on a video while it plays. In the prior art, however, when a user needs to perform a control operation on a window, such as closing, maximizing, or minimizing it, the user can only do so with a mouse, for example by clicking the corresponding control button on the window. When the control buttons of the currently controlled window are blocked by overlapping windows, the user must first move the other windows aside and only then control the target window. This approach is cumbersome and inflexible, resulting in a poor user experience.
Disclosure of Invention
To solve the above problems, the present application provides a window control method, apparatus, and device, so as to address the technical problem in the prior art that controlling a window is cumbersome and inflexible.
To this end, the technical solutions provided by the embodiments of the present application are as follows:
In a first aspect, an embodiment of the present application provides a window control method. The method is applied to a smart terminal connected to a large-screen display on which at least one window is displayed, and the method includes:
acquiring, in real time, the positions of the touch points detected by the large-screen display;
determining a controlled window among the windows displayed on the large-screen display according to the number of touch points and the regions in which they are located, wherein the number of touch points is at least three;
determining a corresponding control instruction according to how the touch-point positions change;
and performing, on the controlled window, the operation corresponding to the control instruction.
In a possible implementation, determining a controlled window among the windows displayed on the large-screen display according to the number of touch points and the regions in which they are located includes:
when the number of touch points is at least three and half or more of them are located within the region of a window displayed on the large-screen display, determining that window to be the controlled window.
In a possible implementation, determining a corresponding control instruction according to how the touch-point positions change includes:
when the initial positions of the touch points show that the initial distance between every pair of touch points is smaller than a first threshold, each touch point moves outward, and the current positions show that the current distance between every pair of touch points is larger than a second threshold, determining the corresponding control instruction to be a window maximization instruction.
In a possible implementation, determining a corresponding control instruction according to how the touch-point positions change includes:
when the positions of the touch points move simultaneously in the same direction, determining the corresponding control instruction to be a window movement instruction.
In a possible implementation, determining a corresponding control instruction according to how the touch-point positions change includes:
when the initial positions of the touch points show that the initial distance between every pair of touch points is larger than a third threshold, each touch point moves inward, and the current positions show that the current distance between every pair of touch points is smaller than a fourth threshold, determining the corresponding control instruction to be a window closing instruction.
In a possible implementation, determining a corresponding control instruction according to how the touch-point positions change includes:
when the positions of the touch points accelerate downward simultaneously and the acceleration is larger than a fifth threshold, or when the positions of the touch points move down simultaneously into a preset area, determining the corresponding control instruction to be a window minimization instruction.
In a possible implementation, determining a controlled window among the windows displayed on the large-screen display according to the number of touch points and the regions in which they are located includes:
when the number of touch points is at least six and, according to their positions, at least three touch points are in the left area of the large-screen display and at least three are in the right area, determining all windows to be controlled windows.
In a possible implementation, determining a corresponding control instruction according to how the touch-point positions change includes:
when the touch points in the left area of the large-screen display move left simultaneously and the touch points in the right area move right simultaneously, determining the corresponding control instruction to be a hide-all-windows instruction.
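The gesture rules above can be condensed into a small classifier. The following sketch is illustrative only: the threshold values, the centre split of the screen into left and right areas, and every name are assumptions, since the embodiments deliberately leave these concrete choices open.

```python
import math

# Assumed threshold values; the embodiments leave them open.
FIRST_T, SECOND_T = 60.0, 300.0   # pinch-out: start close together, end far apart
THIRD_T, FOURTH_T = 300.0, 60.0   # pinch-in: start far apart, end close together
FIFTH_T = 1000.0                  # downward acceleration threshold, px/s^2

def _pair_dists(pts):
    """All pairwise distances between touch points."""
    return [math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:]]

def classify(initial, current, downward_accel=0.0, in_preset_area=False):
    """Map the change in touch-point positions to a control instruction."""
    if max(_pair_dists(initial)) < FIRST_T and min(_pair_dists(current)) > SECOND_T:
        return "maximize"          # multi-finger pinch-out
    if min(_pair_dists(initial)) > THIRD_T and max(_pair_dists(current)) < FOURTH_T:
        return "close"             # multi-finger pinch-in
    if downward_accel > FIFTH_T or in_preset_area:
        return "minimize"          # accelerated or targeted downward move
    # Window movement: every point's displacement agrees with the mean direction.
    disps = [(c[0] - i[0], c[1] - i[1]) for i, c in zip(initial, current)]
    mx = sum(d[0] for d in disps) / len(disps)
    my = sum(d[1] for d in disps) / len(disps)
    if (mx or my) and all(d[0] * mx + d[1] * my > 0 for d in disps):
        return "move"
    return None

def all_windows_selected(points, screen_width):
    """Six or more points with at least three in each half of the screen
    (the centre split is an assumption); leftward/rightward motion of the
    two groups then yields the hide-all-windows instruction."""
    left = sum(px < screen_width / 2 for px, _ in points)
    return len(points) >= 6 and left >= 3 and len(points) - left >= 3
```

In `classify`, the pinch checks run before the translation check because a pinch also moves every point, just not in a common direction.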
In a second aspect, an embodiment of the present application provides a window control apparatus. The apparatus is applied to a smart terminal connected to a large-screen display on which at least one window is displayed, and the apparatus includes:
an acquisition unit, configured to acquire, in real time, the positions of the touch points detected by the large-screen display;
a first determining unit, configured to determine a controlled window among the windows displayed on the large-screen display according to the number of touch points and the regions in which they are located, where the number of touch points is at least three;
a second determining unit, configured to determine a corresponding control instruction according to how the touch-point positions change;
and a control unit, configured to perform, on the controlled window, the operation corresponding to the control instruction.
In a possible implementation manner, the first determining unit is specifically configured to:
and when the number of the touch points is at least three and half or more than half of the touch points are positioned in a window area displayed on the large screen display, determining the window as a controlled window.
In a possible implementation manner, the second determining unit is specifically configured to:
and when the initial distance between every two touch points is determined to be smaller than a first threshold value according to the initial positions of the touch points, the position of each touch point moves outwards, and the current distance between every two touch points is determined to be larger than a second threshold value according to the current positions of the touch points, determining that the corresponding control instruction is a window maximization instruction.
In a possible implementation manner, the second determining unit is specifically configured to:
and when the positions of the touch points move to the same direction at the same time, determining that the corresponding control instruction is a window moving instruction.
In a possible implementation manner, the second determining unit is specifically configured to:
and when the initial distance between every two touch points is determined to be larger than a third threshold value according to the initial positions of the touch points, the position of each touch point moves inwards, and the current distance between every two touch points is determined to be smaller than a fourth threshold value according to the current positions of the touch points, determining that the corresponding control instruction is a window closing instruction.
In a possible implementation manner, the second determining unit is specifically configured to:
and when the positions of the touch points are simultaneously moved downwards in an accelerated manner and the moving acceleration is larger than a fifth threshold value, or when the positions of the touch points are simultaneously moved downwards to a preset area, determining that the corresponding control instruction is a window minimizing instruction.
In a possible implementation manner, the first determining unit is specifically configured to:
and when the number of the touch points is at least six, and the number of the touch points in the left area of the large screen display is determined to be more than or equal to three and the number of the touch points in the right area of the large screen display is determined to be more than or equal to three according to the positions of the touch points, determining all windows as controlled windows.
In a possible implementation manner, the second determining unit is specifically configured to:
and when the positions of the touch points in the left area of the large screen display move leftwards at the same time and the positions of the touch points in the right area of the large screen display move rightwards at the same time, determining that the corresponding control instruction is a full window hiding instruction.
In a third aspect, embodiments of the present application provide a window control apparatus, the apparatus including a memory and a processor,
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is used for executing the window control method according to instructions in the program codes.
Therefore, the embodiment of the application has the following beneficial effects:
the embodiment of the application can acquire the position of the touch point detected by the large screen display in real time, the touch points can be generated when a user touches the large screen display through a plurality of fingers, and then the controlled window is determined in the window displayed by the large screen display according to the number of the touch points and the position area of the touch points, the corresponding control instruction is determined according to the change condition of the position of the touch points, the operation corresponding to the control instruction is executed aiming at the controlled window, so that the user can realize different control over the window displayed by the large screen display through different gestures generated by the fingers, the operation is convenient and more flexible.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a flowchart of an embodiment of a window control method according to an embodiment of the present disclosure;
fig. 3 is a flowchart of another embodiment of a window control method according to an embodiment of the present application;
fig. 4 is a flowchart of another embodiment of a window control method according to an embodiment of the present application;
fig. 5 is a flowchart of another embodiment of a window control method according to an embodiment of the present application;
fig. 6 is a flowchart of another embodiment of a window control method according to an embodiment of the present application;
fig. 7 is a flowchart of another embodiment of a window control method according to an embodiment of the present application;
fig. 8 is a schematic diagram illustrating an embodiment of a window control method according to an embodiment of the present disclosure;
fig. 9 is a schematic diagram of an embodiment of a window control apparatus according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of an embodiment of a window control apparatus according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In order to facilitate understanding of the technical solutions provided by the present application, an application scenario of the present application in practice is described below.
Referring to fig. 1, which shows a schematic diagram of an exemplary application scenario of an embodiment of the present application. As shown in fig. 1, the window control method provided by the present application may be applied to a scenario comprising a smart terminal 101 and a large-screen display 102. The smart terminal 101 is any user device, such as a smartphone, tablet computer, or laptop, that can establish a connection with the large-screen display 102, interact with the user, and control the large-screen display 102 according to the touch-point position information the display sends. The large-screen display 102 is a large-screen display device that can communicate with the smart terminal 101 and display multiple windows on its behalf. The large-screen display 102 has a touch function: when a user touches it, it can detect the positions of the touch points and send them to the smart terminal 101 in real time. The large-screen display 102 may be, for example, the large screen of a color television or rear-projection television, with a diagonal size of 40 inches or more.
In practical applications, the smart terminal 101 first connects to the large-screen display 102 so that multiple windows of the smart terminal can be shown on it, for example an instant-messaging window and a video player window at the same time. When the large-screen display 102 detects that a user is touching it with several fingers, it sends the touch-point position information to the smart terminal 101 in real time. After receiving this information, the smart terminal 101 determines the controlled window according to the number of touch points and the regions in which they are located, and determines the corresponding control instruction according to how the touch-point positions change, so that the operation corresponding to the control instruction is performed on the controlled window. For example, when a user spreads three fingers and presses on a window displayed on the large-screen display 102, the smart terminal 101 acquires the positions of the three touch points, determines from those positions that they lie within the region of that window, and designates it the controlled window. As the user's fingers move, for instance contracting the three touch points inward, the touch-point positions acquired by the smart terminal 101 change accordingly, and from this change a corresponding control instruction, here a window closing instruction, can be determined, so that the controlled window is closed.
In this way, different gestures made with several fingers provide different controls over the windows displayed on the large-screen display, which is convenient to operate and flexible to implement.
Based on the above application scenarios, embodiments of the present application provide a window control method, which will be described in detail below with reference to the accompanying drawings.
Referring to fig. 2, which shows a flowchart of an embodiment of a window control method provided in an embodiment of the present application. This embodiment may be applied to a smart terminal connected to a large-screen display on which at least one window is displayed. As shown in fig. 2, the method may include:
step 201: and acquiring the position of the touch point detected by the large screen display in real time.
In practical applications, the smart terminal may establish a connection with the large screen display, so that a window of at least one application program on the smart terminal may be displayed on the large screen display. A window is the most important part of the user interface, being the visual interface between the user and the application that generated the window.
The large screen display can detect the touch operation of the user on the large screen display in real time and send the position of the touch point of the user to the intelligent terminal connected with the large screen display in real time, namely the intelligent terminal can acquire the position of the touch point detected by the large screen display in real time. In this embodiment, the number of touch points triggered by the user on the large-screen display is at least three, for example, the user may simultaneously touch the large-screen display with a plurality of fingers, and the intelligent terminal may simultaneously acquire the positions of the plurality of touch points. The location of the touch point may be identified by coordinate values of the touch point on the large screen display.
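Step 201 thus amounts to receiving lists of touch-point coordinates from the display in real time. The sketch below assumes a JSON payload of coordinate objects; the actual transport and message format are not specified in the application.

```python
import json

def parse_touch_report(raw: str):
    """Parse one assumed report like '[{"x": 10, "y": 20}, ...]' into
    (x, y) tuples identifying each touch point on the large-screen display."""
    return [(p["x"], p["y"]) for p in json.loads(raw)]
```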
By way of example: assume the smart terminal is smartphone A, connected to the large-screen display, on which 3 windows opened by the terminal are shown. When a user touches the display with three fingers, the large-screen display determines the positions of the three touch points and sends them to smartphone A in real time, so that smartphone A can carry out the subsequent steps.
Step 202: determine a controlled window among the windows displayed on the large-screen display according to the number of touch points and the regions in which they are located, wherein the number of touch points is at least three.
After the intelligent terminal acquires the positions of the touch points, the number of the touch points can be determined according to the number of the acquired positions of the touch points. For example, the intelligent terminal obtains three sets of coordinate values for identifying the positions of the touch points, and then determines that the number of the touch points is three. In the embodiment of the application, the control of the window is realized by multi-point touch on the large-screen display, so that the number of touch points is at least three.
Meanwhile, the smart terminal can determine the region in which each touch point lies. Specifically, the smart terminal knows the screen region of every window displayed on the large-screen display, so from the touch-point positions it can determine the relationship between each touch point and each window region; for example, with three touch points in total, it may determine that two lie within a certain window's region while the third lies outside it. The smart terminal can likewise determine, from its position, whether a touch point lies in the left or the right area of the large-screen display.
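The region tests described above reduce to simple containment checks. In this sketch, a window region is assumed to be an axis-aligned rectangle `(x, y, width, height)`, and the left/right areas are assumed to be the two halves of the screen split at its centre:

```python
def in_window(point, rect):
    """True when a touch point lies inside a window's screen rectangle.
    rect is an assumed (x, y, width, height) representation."""
    px, py = point
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h

def screen_half(point, screen_width):
    """Classify a touch point as lying in the left or right area of the
    display, assuming a split at the screen's centre line."""
    return "left" if point[0] < screen_width / 2 else "right"
```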
According to the number of the touch points and the areas where the touch points are located, a controlled window can be determined from a plurality of windows displayed by the large-screen display, and the controlled window can be one or a plurality of windows. There are various implementation processes for determining a controlled window in the embodiment of the present application, which will be described in detail in the following embodiments.
Step 203: determine a corresponding control instruction according to how the touch-point positions change.
The user can control windows through different touch gestures on the large-screen display. The touch-point positions acquired by the smart terminal change in real time, and from these changes the smart terminal can determine which gesture the user is currently making, and thus the control instruction corresponding to that gesture.
The control instructions for a window in the embodiments of the present application may include: a window maximization instruction, a window movement instruction, a window closing instruction, a window minimization instruction, and a hide-all-windows instruction (i.e., a show-desktop instruction). How the control instruction is determined is also described in detail in the following embodiments.
Step 204: perform, on the controlled window, the operation corresponding to the control instruction.
After the controlled window and the control instruction have been determined, the operation corresponding to the control instruction can be performed on the controlled window. For example, if the control instruction is a window maximization instruction, the controlled window is maximized; if it is a window movement instruction, the controlled window is moved; and if it is a hide-all-windows instruction, the controlled windows are all of the displayed windows, and the operation of hiding them all is performed.
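Step 204 can be pictured as a dispatch from instruction to window operation. The instruction names and the toy `WindowManager` below are assumptions for illustration; a real implementation would call into the terminal's windowing system rather than record to a log.

```python
class WindowManager:
    """Toy window manager that records the operations it performs."""
    def __init__(self):
        self.log = []

    def execute(self, instruction, window_id=None):
        """Perform the operation corresponding to the control instruction
        on the controlled window (or on all windows for hide_all)."""
        handlers = {
            "maximize": lambda: self.log.append(("maximize", window_id)),
            "move":     lambda: self.log.append(("move", window_id)),
            "close":    lambda: self.log.append(("close", window_id)),
            "minimize": lambda: self.log.append(("minimize", window_id)),
            "hide_all": lambda: self.log.append(("hide_all", "all")),
        }
        handlers[instruction]()
```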
In this way, the positions of the touch points detected by the large-screen display can be acquired in real time; the touch points are generated when a user touches the display with several fingers. The controlled window is determined among the windows displayed on the large-screen display according to the number of touch points and the regions in which they are located, the corresponding control instruction is determined according to how the touch-point positions change, and the operation corresponding to the control instruction is performed on the controlled window. The user can thus exercise different controls over the displayed windows through different multi-finger gestures, which is convenient, quick, and flexible.
The different window control modes are described separately below: window maximization, window movement, window closing, window minimization, and hiding all windows (i.e., showing the desktop).
Referring to fig. 3, which shows a flowchart of another embodiment of a window control method provided in an embodiment of the present application. This embodiment may be applied to a smart terminal connected to a large-screen display on which at least one window is displayed. As shown in fig. 3, the method may include:
step 301: and acquiring the position of the touch point detected by the large screen display in real time.
The step is the same as the implementation process of step 201, and for related description, reference may be made to the above-mentioned embodiment, which is not described herein again.
Step 302: when the number of touch points is at least three and half or more of them are located within the region of a window displayed on the large-screen display, determine that window to be the controlled window.
Step 302 is a possible implementation manner of step 202, and in this embodiment, after the intelligent terminal acquires the positions of the touch points, the number of the touch points may be determined according to the number of the acquired positions of the touch points. And when the number of the touch points is at least three, the positions of the touch points are further judged, the intelligent terminal knows the position area of each window displayed on the large screen display, and when at least half of the touch points are in one window area displayed on the large screen display, the window is a controlled window. For example, when a user touches a large-screen display with three fingers to generate three touch points, and two or three of the three touch points fall into a certain display window, the window is a controlled window.
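The half-in-window rule described above can be sketched in code. The following Python function is a minimal, hypothetical illustration only: the point format, the rectangle representation of window areas, and all names are assumptions made for the example, not the patent's actual implementation.

```python
def find_controlled_window(touch_points, windows):
    """Return the name of the controlled window, or None.

    touch_points: list of (x, y) touch-point positions.
    windows: dict mapping window name -> (x, y, width, height) area.
    """
    if len(touch_points) < 3:  # the gesture requires at least three fingers
        return None
    for name, (wx, wy, ww, wh) in windows.items():
        inside = sum(1 for (x, y) in touch_points
                     if wx <= x <= wx + ww and wy <= y <= wy + wh)
        if 2 * inside >= len(touch_points):  # half or more of the points
            return name
    return None
```

For instance, with three touch points of which two fall inside one window's area, that window would be selected as the controlled window.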
Step 303: when it is determined from the initial positions of the touch points that the initial distance between every two touch points is smaller than a first threshold, the position of each touch point then moves outward, and it is determined from the current positions of the touch points that the current distance between every two touch points is larger than a second threshold, determining that the corresponding control instruction is a window maximization instruction.
Step 303 is a possible implementation of step 203. In this embodiment, if the number of touch points is determined to be at least three, the current gesture of the user is further determined from the positions of the touch points. The initial position of a touch point is its position when the user's finger first touches the large screen display, and its current position is its real-time position as the finger moves on the large screen display. If the initial distance between every two touch points is smaller than the first threshold, each touch point then moves outward, and the current distance between every two touch points is larger than the second threshold, this indicates that the user's fingers were slightly spread when they touched the large screen display and were then spread outward to a certain extent. It can therefore be determined that the user wants to maximize the window, and the control instruction is a window maximization instruction. In the embodiments of the present application, both the first threshold and the second threshold may be set according to the actual situation, which is not limited by the present application.
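The spread-to-maximize test in step 303 can be sketched as follows. This is an illustrative assumption of one way to realize it: interpreting "moves outward" as moving away from the initial centroid, and all names and thresholds, are choices made for the example, not the patent's definitive implementation.

```python
import math


def is_maximize_gesture(initial, current, first_threshold, second_threshold):
    """initial/current: (x, y) positions of the same fingers, in order."""
    def pairwise(points):
        # distance between every two touch points
        return [math.dist(a, b) for i, a in enumerate(points)
                for b in points[i + 1:]]
    # fingers start close together (every initial distance below threshold 1)
    started_close = all(d < first_threshold for d in pairwise(initial))
    # each finger moves outward, away from the initial centroid
    cx = sum(x for x, _ in initial) / len(initial)
    cy = sum(y for _, y in initial) / len(initial)
    moved_outward = all(math.dist(c, (cx, cy)) > math.dist(i, (cx, cy))
                        for i, c in zip(initial, current))
    # fingers end spread apart (every current distance above threshold 2)
    ended_apart = all(d > second_threshold for d in pairwise(current))
    return started_close and moved_outward and ended_apart
```

The two thresholds correspond to the first and second thresholds of step 303 and would be tuned to the actual screen.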
Step 304: executing a window maximization operation for the controlled window.
The window maximization operation is the operation corresponding to the window maximization instruction, and maximizes the display of the controlled window.
The present embodiment is further described below through a practical application scenario. For example, a user touches a window displayed on the large screen display with three slightly spread fingers and then gradually spreads the three fingers outward. During this process, the large screen display detects, in real time, the positions of the touch points generated by the fingers and sends them to the intelligent terminal. The intelligent terminal determines that there are three touch points, that the three touch points are located within the same window, and that their positions change from inside to outside; it can therefore determine that the current gesture is a gesture for maximizing the window, and it controls the window displayed on the large screen display to be maximized.
In this embodiment, the user can control a window displayed on the large screen display to be maximized through a gesture of spreading a plurality of fingers; the operation is convenient and rapid, and the implementation is more flexible.
The embodiments of the present application are not limited to the above-mentioned embodiments, and there are various ways to implement and optimize the maximization window by using multiple fingers in the embodiments of the present application, and these implementations are also considered to be within the scope of the embodiments of the present application.
Referring to fig. 4, which shows a flowchart of another embodiment of a window control method provided in the embodiment of the present application, this embodiment may be applied to an intelligent terminal, where the intelligent terminal is connected to a large-screen display, and at least one window is displayed on the large-screen display, as shown in fig. 4, the method may include:
Step 401: acquiring, in real time, the positions of the touch points detected by the large screen display.
This step is implemented in the same way as step 201; for the related description, reference may be made to the above embodiment, and details are not repeated here.
Step 402: when the number of the touch points is at least three and half or more of the touch points are located within the area of one window displayed on the large screen display, determining that window as the controlled window.
Step 402 is a possible implementation of step 202. In this embodiment, after acquiring the positions of the touch points, the intelligent terminal may determine the number of touch points from the number of positions acquired. When the number of touch points is at least three, the positions of the touch points are further examined. The intelligent terminal knows the area occupied by each window displayed on the large screen display, and when at least half of the touch points fall within the area of one window, that window is the controlled window. For example, when a user touches the large screen display with three fingers, generating three touch points, and two or three of those touch points fall within a certain displayed window, that window is the controlled window.
Step 403: when the positions of the touch points move in the same direction at the same time, determining that the corresponding control instruction is a window movement instruction.
Step 403 is a possible implementation of step 203. In this embodiment, if the number of touch points is determined to be at least three, the current gesture of the user is further determined from the positions of the touch points. When the positions of the touch points move in the same direction at the same time, this represents a gesture in which the user's fingers drag across the large screen display after touching it. It can therefore be determined that the user wants to move the window, with the direction of the movement matching the movement direction of the touch points, and the control instruction is a window movement instruction.
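The same-direction test in step 403 can be sketched as follows. The dominant-axis classification, the minimum-shift guard, and all names are assumptions chosen for this example; the y-axis is assumed to grow downward, as is common for touch coordinates.

```python
def is_move_gesture(initial, current, min_shift=1.0):
    """Return 'left'/'right'/'up'/'down' if all fingers move the same way, else None.

    initial/current: (x, y) positions of the same fingers, in order.
    """
    deltas = [(cx - ix, cy - iy)
              for (ix, iy), (cx, cy) in zip(initial, current)]
    if any(abs(dx) < min_shift and abs(dy) < min_shift for dx, dy in deltas):
        return None  # at least one finger barely moved

    def direction(dx, dy):
        # classify by the dominant axis and its sign (y grows downward)
        if abs(dx) >= abs(dy):
            return "right" if dx > 0 else "left"
        return "down" if dy > 0 else "up"

    directions = {direction(dx, dy) for dx, dy in deltas}
    return directions.pop() if len(directions) == 1 else None
```

The returned direction would then be used as the movement direction of the controlled window.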
Step 404: executing a window movement operation for the controlled window.
The window movement operation is the operation corresponding to the window movement instruction, and moves the controlled window in the corresponding direction.
The present embodiment is further described below through a practical application scenario. For example, a user touches a window displayed on the large screen display with three fingers and then moves the three fingers to the right at the same time. During this process, the large screen display detects, in real time, the positions of the touch points generated by the fingers and sends them to the intelligent terminal. The intelligent terminal determines that there are three touch points, that the three touch points are located within the same window, and that their positions all move to the right at the same time; it can therefore determine that the current gesture is a gesture for moving the window to the right, and it controls the window displayed on the large screen display to move to the right.
In this embodiment, the user can control a window displayed on the large screen display to be moved through a gesture of moving a plurality of fingers; the operation is convenient and rapid, and the implementation is more flexible.
The embodiment of the present application is not limited to the above-mentioned embodiments, and there are various ways to implement and optimize the moving window by using multiple fingers in the embodiment of the present application, and these implementation manners are also considered to be within the protection scope of the embodiment of the present application.
Referring to fig. 5, which shows a flowchart of another embodiment of a window control method provided in the embodiment of the present application, this embodiment may be applied to an intelligent terminal, where the intelligent terminal is connected to a large-screen display, and at least one window is displayed on the large-screen display, as shown in fig. 5, the method may include:
Step 501: acquiring, in real time, the positions of the touch points detected by the large screen display.
This step is implemented in the same way as step 201; for the related description, reference may be made to the above embodiment, and details are not repeated here.
Step 502: when the number of the touch points is at least three and half or more of the touch points are located within the area of one window displayed on the large screen display, determining that window as the controlled window.
Step 502 is a possible implementation of step 202. In this embodiment, after acquiring the positions of the touch points, the intelligent terminal may determine the number of touch points from the number of positions acquired. When the number of touch points is at least three, the positions of the touch points are further examined. The intelligent terminal knows the area occupied by each window displayed on the large screen display, and when at least half of the touch points fall within the area of one window, that window is the controlled window. For example, when a user touches the large screen display with three fingers, generating three touch points, and two or three of those touch points fall within a certain displayed window, that window is the controlled window.
Step 503: when it is determined from the initial positions of the touch points that the initial distance between every two touch points is larger than a third threshold, the position of each touch point then moves inward, and it is determined from the current positions of the touch points that the current distance between every two touch points is smaller than a fourth threshold, determining that the corresponding control instruction is a window closing instruction.
Step 503 is a possible implementation of step 203. In this embodiment, if the number of touch points is determined to be at least three, the current gesture of the user is further determined from the positions of the touch points. The initial position of a touch point is its position when the user's finger first touches the large screen display, and its current position is its real-time position as the finger moves on the large screen display. If the initial distance between every two touch points is larger than the third threshold, each touch point then moves inward, and the current distance between every two touch points is smaller than the fourth threshold, this indicates that the user's fingers were fully spread when they touched the large screen display and were then contracted inward to a certain extent. It can therefore be determined that the user wants to close the window, and the control instruction is a window closing instruction. In the embodiments of the present application, both the third threshold and the fourth threshold may be set according to the actual situation, which is not limited by the present application.
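The pinch-to-close test in step 503 mirrors the spread test and can be sketched as follows. As before, this is an illustrative assumption: interpreting "moves inward" as moving toward the initial centroid, and all names and thresholds, are choices made for the example.

```python
import math


def is_close_gesture(initial, current, third_threshold, fourth_threshold):
    """initial/current: (x, y) positions of the same fingers, in order."""
    def pairwise(points):
        # distance between every two touch points
        return [math.dist(a, b) for i, a in enumerate(points)
                for b in points[i + 1:]]
    # fingers start fully spread (every initial distance above threshold 3)
    started_apart = all(d > third_threshold for d in pairwise(initial))
    # each finger contracts inward, toward the initial centroid
    cx = sum(x for x, _ in initial) / len(initial)
    cy = sum(y for _, y in initial) / len(initial)
    contracted = all(math.dist(c, (cx, cy)) < math.dist(i, (cx, cy))
                     for i, c in zip(initial, current))
    # fingers end close together (every current distance below threshold 4)
    ended_close = all(d < fourth_threshold for d in pairwise(current))
    return started_apart and contracted and ended_close
```

The third and fourth thresholds correspond to those of step 503 and would be tuned to the actual screen.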
Step 504: executing a window closing operation for the controlled window.
The window closing operation is the operation corresponding to the window closing instruction, and closes the controlled window.
The present embodiment is further described below through a practical application scenario. For example, a user touches a window displayed on the large screen display with five spread fingers and then gradually contracts the five fingers inward. During this process, the large screen display detects, in real time, the positions of the touch points generated by the fingers and sends them to the intelligent terminal. The intelligent terminal determines that there are five touch points, that the five touch points are located within the same window, and that their positions change from outside to inside; it can therefore determine that the current gesture is a gesture for closing the window, and it controls the window displayed on the large screen display to be closed.
In this embodiment, the user can control a window displayed on the large screen display to be closed through a gesture of contracting a plurality of fingers; the operation is convenient and rapid, and the implementation is more flexible.
The embodiment of the present application is not limited to the above-mentioned embodiments, and there are various ways to implement and optimize the closing of the window by using multiple fingers in the embodiment of the present application, and these implementations are also considered to be within the scope of the embodiment of the present application.
Referring to fig. 6, which shows a flowchart of another embodiment of a window control method provided in the embodiment of the present application, this embodiment may be applied to an intelligent terminal, where the intelligent terminal is connected to a large-screen display, and at least one window is displayed on the large-screen display, as shown in fig. 6, the method may include:
Step 601: acquiring, in real time, the positions of the touch points detected by the large screen display.
This step is implemented in the same way as step 201; for the related description, reference may be made to the above embodiment, and details are not repeated here.
Step 602: when the number of the touch points is at least three and half or more of the touch points are located within the area of one window displayed on the large screen display, determining that window as the controlled window.
Step 602 is a possible implementation of step 202. In this embodiment, after acquiring the positions of the touch points, the intelligent terminal may determine the number of touch points from the number of positions acquired. When the number of touch points is at least three, the positions of the touch points are further examined. The intelligent terminal knows the area occupied by each window displayed on the large screen display, and when at least half of the touch points fall within the area of one window, that window is the controlled window. For example, when a user touches the large screen display with three fingers, generating three touch points, and two or three of those touch points fall within a certain displayed window, that window is the controlled window.
Step 603: when the positions of the touch points simultaneously accelerate downward and the movement acceleration is larger than a fifth threshold, or when the positions of the touch points simultaneously move downward into a preset area, determining that the corresponding control instruction is a window minimization instruction.
Step 603 is a possible implementation of step 203. In this embodiment, if the number of touch points is determined to be at least three, the current gesture of the user is further determined from the positions of the touch points. When the positions of the touch points simultaneously accelerate downward with an acceleration larger than the fifth threshold, or simultaneously move downward into a preset area, this represents an action in which the user's fingers, after touching the large screen display, are dragged downward with increasing speed or dragged down into the preset area. It can therefore be determined that the user wants to minimize the window, and the control instruction is a window minimization instruction. In the embodiments of the present application, the fifth threshold may be set according to the actual situation, which is not limited by the present application.
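The two minimize triggers in step 603 can be sketched as follows. This is a hypothetical illustration: the uniformly sampled per-finger y positions, the discrete second difference standing in for acceleration, the bottom strip standing in for the preset area, and all names are assumptions for the example. The y-axis is assumed to grow downward.

```python
def is_minimize_gesture(samples, accel_threshold, bottom_y, screen_height):
    """samples: one list of y positions per finger, taken at uniform intervals.

    True when either every finger accelerates downward faster than the fifth
    threshold, or every finger ends inside the preset bottom area.
    """
    def downward_accel(ys):
        # discrete second difference over the last three samples
        if len(ys) < 3:
            return 0.0
        v1, v2 = ys[1] - ys[0], ys[2] - ys[1]
        return v2 - v1

    accelerating = all(downward_accel(ys) > accel_threshold for ys in samples)
    in_preset_area = all(bottom_y <= ys[-1] <= screen_height for ys in samples)
    return accelerating or in_preset_area
```

Either branch alone suffices, matching the "or" in step 603.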
Step 604: executing a window minimization operation for the controlled window.
The window minimization operation is the operation corresponding to the window minimization instruction, and minimizes the display of the controlled window.
The present embodiment is further described below through two practical application scenarios. In the first, a user touches a window displayed on the large screen display with three slightly spread fingers and then moves the three fingers downward with increasing speed. During this process, the large screen display detects, in real time, the positions of the touch points generated by the fingers and sends them to the intelligent terminal. The intelligent terminal determines that there are three touch points, that they are located within the same window, and that their positions all accelerate downward; it can therefore determine that the current gesture is a gesture for minimizing the window, and it controls the window displayed on the large screen display to be minimized. In the second, the user instead moves the three fingers downward into a preset area, such as the bottommost area of the large screen display. The intelligent terminal determines that there are three touch points, that they are located within the same window, and that their positions all move downward into the preset area; it likewise determines that the current gesture is a gesture for minimizing the window and controls the window displayed on the large screen display to be minimized.
In this embodiment, the user can control a window displayed on the large screen display to be minimized through a gesture of moving a plurality of fingers downward; the operation is convenient and rapid, and the implementation is more flexible.
The embodiments of the present application are not limited to the above-mentioned embodiments, and there are various ways to implement and optimize the minimization window by using multiple fingers in the embodiments of the present application, and these implementations are also considered to be within the scope of the embodiments of the present application.
Referring to fig. 7, which shows a flowchart of another embodiment of a window control method provided in the embodiment of the present application, this embodiment may be applied to an intelligent terminal, where the intelligent terminal is connected to a large-screen display, and at least one window is displayed on the large-screen display, as shown in fig. 7, the method may include:
Step 701: acquiring, in real time, the positions of the touch points detected by the large screen display.
This step is implemented in the same way as step 201; for the related description, reference may be made to the above embodiment, and details are not repeated here.
Step 702: when the number of the touch points is at least six, and it is determined from the positions of the touch points that at least three touch points are located in the left area of the large screen display and at least three touch points are located in the right area of the large screen display, determining all windows as controlled windows.
Step 702 is a possible implementation of step 202. In this embodiment, after acquiring the positions of the touch points, the intelligent terminal may determine the number of touch points from the number of positions acquired. When the number of touch points is at least six, the positions of the touch points are further examined: when at least three touch points are located in the left area of the large screen display and at least three are located in the right area, all windows are determined to be controlled windows. For example, if the current user touches the left area of the large screen display with three fingers of the left hand and the right area with three fingers of the right hand, generating six touch points, all windows displayed on the large screen display are controlled windows.
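The six-point check in step 702 can be sketched as follows. Splitting the screen at its vertical midline is an assumed simplification; the patent does not specify how the left and right areas are delimited, and all names are illustrative.

```python
def all_windows_controlled(touch_points, screen_width):
    """True when at least six points exist, with at least three in each half.

    touch_points: list of (x, y) positions; screen_width: display width.
    """
    if len(touch_points) < 6:
        return False
    mid = screen_width / 2
    left = sum(1 for x, _ in touch_points if x < mid)    # left-area points
    right = sum(1 for x, _ in touch_points if x >= mid)  # right-area points
    return left >= 3 and right >= 3
```

When this returns True, every window displayed on the large screen display would be treated as a controlled window.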
Step 703: when the positions of the touch points in the left area of the large screen display move leftward at the same time and the positions of the touch points in the right area of the large screen display move rightward at the same time, determining that the corresponding control instruction is an all-window hiding instruction.
Step 703 is a possible implementation of step 203. In this embodiment, if the number of touch points is determined to be at least six, the current gesture of the user is further determined from the positions of the touch points. When the touch points in the left area of the large screen display move leftward at the same time and the touch points in the right area move rightward at the same time, this represents a gesture in which the user touches the left area with a plurality of fingers of the left hand and the right area with a plurality of fingers of the right hand, and then moves both hands toward the two sides of the large screen display at the same time. It can therefore be determined that the user wants to hide all windows, and the control instruction is an all-window hiding instruction.
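The hide-all trigger in step 703 can be sketched as follows. The midline partition, the pairing of initial and current positions by index, and all names are illustrative assumptions, not the patent's definitive implementation.

```python
def is_hide_all_gesture(initial, current, screen_width):
    """True when left-area points all move further left and right-area
    points all move further right.

    initial/current: (x, y) positions of the same fingers, in order.
    """
    mid = screen_width / 2
    left_pairs = [(i, c) for i, c in zip(initial, current) if i[0] < mid]
    right_pairs = [(i, c) for i, c in zip(initial, current) if i[0] >= mid]
    if len(left_pairs) < 3 or len(right_pairs) < 3:
        return False  # the gesture requires at least three fingers per side
    return (all(c[0] < i[0] for i, c in left_pairs)       # left points move left
            and all(c[0] > i[0] for i, c in right_pairs))  # right points move right
```

When this returns True, all windows would be hidden and the desktop displayed.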
Step 704: executing an all-window hiding operation for all windows.
The all-window hiding operation is the operation corresponding to the all-window hiding instruction, and hides all windows so that the desktop is displayed.
The present embodiment is further described below through a practical application scenario. Referring to fig. 8, for example, a user touches the left area of the large screen display with three fingers of the left hand and the right area with three fingers of the right hand, and then moves both hands toward the two sides of the large screen display. During this process, the large screen display detects, in real time, the positions of the touch points generated by the fingers and sends them to the intelligent terminal. The intelligent terminal determines that there are six touch points, that three of them are located in the left area of the large screen display and the other three in the right area, and that the positions of the touch points in the left area move leftward while those in the right area move rightward; it can therefore determine that the current gesture is a gesture for hiding all windows, and it controls all windows displayed on the large screen display to be hidden.
In this embodiment, the user can control all windows displayed on the large screen display to be hidden through a gesture of moving a plurality of fingers of both hands toward the two sides; the operation is convenient and rapid, and the implementation is more flexible.
The embodiment of the present application is not limited to the above-mentioned embodiments, and there are various ways to implement and optimize the hiding of all windows by using multiple fingers in the embodiment of the present application, and these implementation manners are also considered to be within the scope of the embodiment of the present application.
Correspondingly, as shown in fig. 9, an embodiment of the present application further provides an embodiment of a window control apparatus, where the embodiment of the apparatus may be applied to an intelligent terminal, the intelligent terminal is connected with a large screen display, and at least one window is displayed on the large screen display, and the apparatus may include:
an obtaining unit 901, configured to obtain, in real time, a position of a touch point detected by the large-screen display.
The first determining unit 902 is configured to determine a controlled window in a window displayed on the large screen display according to the number of the touch points and an area where the touch points are located, where the number of the touch points is at least three.
A second determining unit 903, configured to determine the corresponding control instruction according to how the positions of the touch points change.
A control unit 904, configured to execute, for the controlled window, the operation corresponding to the control instruction.
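The cooperation of the four units in fig. 9 can be sketched as a simple dispatch loop. The unit names follow the text; everything else (the callable fields, the event shapes, the `tick` method) is an assumption made for this illustration, not a description of the patent's actual apparatus.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Point = Tuple[float, float]


@dataclass
class WindowControlApparatus:
    acquire_positions: Callable[[], List[Point]]          # obtaining unit 901
    select_window: Callable[[List[Point]], Optional[str]] # first determining unit 902
    classify_gesture: Callable[[List[Point]], Optional[str]]  # second determining unit 903
    execute: Callable[[str, str], None]                   # control unit 904

    def tick(self):
        # one pass: acquire points, select the controlled window,
        # classify the gesture, and execute the resulting instruction
        points = self.acquire_positions()
        window = self.select_window(points)
        if window is None:
            return
        instruction = self.classify_gesture(points)
        if instruction is not None:
            self.execute(window, instruction)
```

Each unit can then be implemented independently (for example, the gesture classifiers sketched in the method embodiments above would slot into the second determining unit).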
In some possible implementation manners of the embodiment of the present application, the first determining unit may be specifically configured to:
when the number of the touch points is at least three and half or more of the touch points are located within the area of one window displayed on the large screen display, determine that window as the controlled window.
In some possible implementation manners of the embodiment of the present application, the second determining unit may be specifically configured to:
when it is determined from the initial positions of the touch points that the initial distance between every two touch points is smaller than a first threshold, the position of each touch point then moves outward, and it is determined from the current positions of the touch points that the current distance between every two touch points is larger than a second threshold, determine that the corresponding control instruction is a window maximization instruction.
In some possible implementation manners of the embodiment of the present application, the second determining unit may be specifically configured to:
when the positions of the touch points move in the same direction at the same time, determine that the corresponding control instruction is a window movement instruction.
In some possible implementation manners of the embodiment of the present application, the second determining unit may be specifically configured to:
when it is determined from the initial positions of the touch points that the initial distance between every two touch points is larger than a third threshold, the position of each touch point then moves inward, and it is determined from the current positions of the touch points that the current distance between every two touch points is smaller than a fourth threshold, determine that the corresponding control instruction is a window closing instruction.
In some possible implementation manners of the embodiment of the present application, the second determining unit may be specifically configured to:
when the positions of the touch points simultaneously accelerate downward and the movement acceleration is larger than a fifth threshold, or when the positions of the touch points simultaneously move downward into a preset area, determine that the corresponding control instruction is a window minimization instruction.
In some possible implementation manners of the embodiment of the present application, the first determining unit may be specifically configured to:
when the number of the touch points is at least six, and it is determined from the positions of the touch points that at least three touch points are located in the left area of the large screen display and at least three touch points are located in the right area of the large screen display, determine all windows as controlled windows.
In some possible implementation manners of the embodiment of the present application, the second determining unit may be specifically configured to:
when the positions of the touch points in the left area of the large screen display move leftward at the same time and the positions of the touch points in the right area of the large screen display move rightward at the same time, determine that the corresponding control instruction is an all-window hiding instruction.
Therefore, the positions of the touch points detected by the large screen display can be obtained in real time, the touch points being generated when a user touches the large screen display with a plurality of fingers. A controlled window is determined among the windows displayed on the large screen display according to the number of the touch points and the area in which the touch points are located, a corresponding control instruction is determined according to how the positions of the touch points change, and the operation corresponding to the control instruction is executed for the controlled window. In this way, the user can exert different control over the windows displayed on the large screen display through different multi-finger gestures; the operation is convenient and rapid, and the implementation is more flexible.
Correspondingly, an embodiment of the present invention further provides a window control device, as shown in fig. 10, where the window control device may include:
a processor 1001, a memory 1002, an input device 1003, and an output device 1004. The number of processors 1001 in the window control device may be one or more, and one processor is taken as an example in fig. 10. In some embodiments of the present invention, the processor 1001, the memory 1002, the input device 1003, and the output device 1004 may be connected through a bus or by other means; a bus connection is taken as an example in fig. 10.
The memory 1002 may be used to store software programs and modules, and the processor 1001 executes the various functional applications and data processing of the window control device by running the software programs and modules stored in the memory 1002. The memory 1002 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function, and the like. Further, the memory 1002 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The input device 1003 may be used to receive input numeric or character information and to generate signal inputs related to user settings and function control of the window control device.
Specifically, in this embodiment, the processor 1001 loads an executable file corresponding to the process of one or more application programs into the memory 1002 according to instructions, and the processor 1001 runs the application programs stored in the memory 1002, thereby implementing the various functions of the window control method.
As for the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the relevant parts of the description of the method embodiments. The device embodiments described above are merely illustrative: the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. A person of ordinary skill in the art can understand and implement the solution without inventive effort.
It is noted that, herein, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing describes in detail the window control method, device, and equipment provided in the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is intended only to help in understanding the method and its core ideas. Meanwhile, a person skilled in the art may, based on the ideas of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (14)

1. A window control method is characterized in that an intelligent terminal is connected with a large screen display, at least one window is displayed on the large screen display, and the method comprises the following steps:
acquiring the position of a touch point detected by the large screen display in real time;
determining a controlled window in the window displayed by the large screen display according to the number of the touch points and the area of the positions of the touch points;
determining a corresponding control instruction according to the change condition of the position of the touch point;
executing the operation corresponding to the control instruction aiming at the controlled window;
determining a controlled window in a window displayed by the large screen display according to the number of the touch points and the area of the positions of the touch points, wherein the determining comprises the following steps:
and when the number of the touch points is at least three, and half or more of the touch points are located within the area of a window displayed on the large screen display, determining the window as the controlled window.
2. The method according to claim 1, wherein the determining the corresponding control command according to the change of the position of the touch point comprises:
when the initial distance between every two touch points is determined to be smaller than a first threshold value according to the initial positions of the touch points, the position of each touch point moves outwards, and the current distance between every two touch points is determined to be larger than a second threshold value according to the current positions of the touch points, determining that a corresponding control instruction is a window maximization instruction;
or when the positions of the touch points move to the same direction at the same time, determining that the corresponding control instruction is a window moving instruction;
or when the initial distance between every two touch points is determined to be larger than a third threshold value according to the initial positions of the touch points, the position of each touch point moves inwards, and the current distance between every two touch points is determined to be smaller than a fourth threshold value according to the current positions of the touch points, determining that the corresponding control instruction is a window closing instruction;
or when the positions of the touch points are simultaneously moved downwards in an accelerated manner and the moving acceleration is greater than a fifth threshold value, or when the positions of the touch points are simultaneously moved downwards to a preset area, determining that the corresponding control instruction is a window minimizing instruction;
or when the position of the touch point in the left area of the large screen display moves leftwards at the same time and the position of the touch point in the right area of the large screen display moves rightwards at the same time, determining that the corresponding control instruction is a full window hiding instruction.
3. A window control method is characterized in that an intelligent terminal is connected with a large screen display, at least one window is displayed on the large screen display, and the method comprises the following steps:
acquiring the position of a touch point detected by the large screen display in real time;
determining a controlled window in the window displayed by the large screen display according to the number of the touch points and the area of the positions of the touch points;
determining a corresponding control instruction according to the change condition of the position of the touch point;
executing the operation corresponding to the control instruction aiming at the controlled window;
determining a controlled window in a window displayed by the large screen display according to the number of the touch points and the area of the positions of the touch points, wherein the determining comprises the following steps:
and when the number of the touch points is at least six, and the number of the touch points in the area on one side of the large screen display is determined to be more than or equal to three and the number of the touch points in the area on the other side of the large screen display is determined to be more than or equal to three according to the positions of the touch points, determining all windows as controlled windows.
4. The method according to claim 3, wherein the determining the corresponding control command according to the change of the position of the touch point comprises:
when the initial distance between every two touch points is determined to be smaller than a first threshold value according to the initial positions of the touch points, the position of each touch point moves outwards, and the current distance between every two touch points is determined to be larger than a second threshold value according to the current positions of the touch points, determining that a corresponding control instruction is a window maximization instruction;
or when the positions of the touch points move to the same direction at the same time, determining that the corresponding control instruction is a window moving instruction;
or when the initial distance between every two touch points is determined to be larger than a third threshold value according to the initial positions of the touch points, the position of each touch point moves inwards, and the current distance between every two touch points is determined to be smaller than a fourth threshold value according to the current positions of the touch points, determining that the corresponding control instruction is a window closing instruction;
or when the positions of the touch points are simultaneously moved downwards in an accelerated manner and the moving acceleration is greater than a fifth threshold value, or when the positions of the touch points are simultaneously moved downwards to a preset area, determining that the corresponding control instruction is a window minimizing instruction;
or when the position of the touch point in the left area of the large screen display moves leftwards at the same time and the position of the touch point in the right area of the large screen display moves rightwards at the same time, determining that the corresponding control instruction is a full window hiding instruction.
5. A window control method is characterized in that an intelligent terminal is connected with a large screen display, at least one window is displayed on the large screen display, and the large screen display sends the position of a touch point of a user touching the large screen display to the intelligent terminal, and the method comprises the following steps:
the intelligent terminal acquires the position of the touch point detected by the large screen display in real time;
the intelligent terminal determines a controlled window in a window displayed by the large screen display according to the number of the touch points and the area of the positions of the touch points;
the intelligent terminal determines a corresponding control instruction according to the change condition of the position of the touch point;
and the intelligent terminal enables the controlled window to execute the operation corresponding to the control instruction.
6. The method according to claim 5, wherein the intelligent terminal determines a controlled window in the window displayed by the large screen display according to the number of the touch points and the area where the touch points are located, and the method comprises:
and the position of the touch point is identified by using a coordinate value of the touch point on the large screen display, and the intelligent terminal knows the position area of each window in the at least one window, so that the intelligent terminal determines the relationship between each touch point and the position area of each window according to the position of the touch point.
7. The method according to claim 5 or 6, wherein the intelligent terminal determines a controlled window in the window displayed by the large screen display according to the number of the touch points and the area where the touch points are located, and the method comprises the following steps:
when the number of the touch points is at least three, and half or more than half of the touch points are positioned in a window area displayed on the large screen display, determining the window as a controlled window;
alternatively,
and when the number of the touch points is at least six, and the number of the touch points in the area on one side of the large screen display is determined to be more than or equal to three and the number of the touch points in the area on the other side of the large screen display is determined to be more than or equal to three according to the positions of the touch points, determining all windows as controlled windows.
8. The method according to claim 7, wherein the determining the corresponding control command according to the change of the position of the touch point comprises:
when the initial distance between every two touch points is determined to be smaller than a first threshold value according to the initial positions of the touch points, the position of each touch point moves outwards, and the current distance between every two touch points is determined to be larger than a second threshold value according to the current positions of the touch points, determining that a corresponding control instruction is a window maximization instruction;
or when the positions of the touch points move to the same direction at the same time, determining that the corresponding control instruction is a window moving instruction;
or when the initial distance between every two touch points is determined to be larger than a third threshold value according to the initial positions of the touch points, the position of each touch point moves inwards, and the current distance between every two touch points is determined to be smaller than a fourth threshold value according to the current positions of the touch points, determining that the corresponding control instruction is a window closing instruction;
or when the positions of the touch points are simultaneously moved downwards in an accelerated manner and the moving acceleration is greater than a fifth threshold value, or when the positions of the touch points are simultaneously moved downwards to a preset area, determining that the corresponding control instruction is a window minimizing instruction;
or when the position of the touch point in the left area of the large screen display moves leftwards at the same time and the position of the touch point in the right area of the large screen display moves rightwards at the same time, determining that the corresponding control instruction is a full window hiding instruction.
9. A window control device is characterized in that an intelligent terminal is connected with a large screen display, at least one window is displayed on the large screen display, the large screen display sends the position of a touch point of a user touching the large screen display to the intelligent terminal, and the device comprises:
the acquisition unit is used for acquiring the position of the touch point detected by the large screen display in real time by the intelligent terminal;
the first determining unit is used for determining a controlled window in the window displayed by the large screen display according to the number of the touch points and the area where the positions of the touch points are located by the intelligent terminal;
the second determining unit is used for determining a corresponding control instruction according to the change condition of the position of the touch point by the intelligent terminal;
and the control unit is used for enabling the intelligent terminal to cause the controlled window to execute the operation corresponding to the control instruction.
10. The apparatus of claim 9, wherein the first determining unit comprises:
and the position of the touch point is identified by using a coordinate value of the touch point on the large screen display, and the intelligent terminal knows the position area of each window in the at least one window, so that the intelligent terminal determines the relationship between each touch point and the position area of each window according to the position of the touch point.
11. The apparatus according to claim 9 or 10, wherein the first determining unit comprises:
when the number of the touch points is at least three and the positions of half or more than half of the touch points are in a window area displayed on the large-screen display, determining the window as a controlled window;
alternatively,
and when the number of the touch points is at least six, and the number of the touch points in the area on one side of the large screen display is determined to be more than or equal to three and the number of the touch points in the area on the other side of the large screen display is determined to be more than or equal to three according to the positions of the touch points, determining all the windows as controlled windows.
12. The apparatus according to claim 11, wherein the second determining unit is specifically configured to:
when the initial distance between every two touch points is determined to be smaller than a first threshold value according to the initial positions of the touch points, the position of each touch point moves outwards, and the current distance between every two touch points is determined to be larger than a second threshold value according to the current positions of the touch points, determining that a corresponding control instruction is a window maximization instruction;
or when the positions of the touch points move to the same direction at the same time, determining that the corresponding control instruction is a window moving instruction;
or when the initial distance between every two touch points is determined to be larger than a third threshold value according to the initial positions of the touch points, the position of each touch point moves inwards, and the current distance between every two touch points is determined to be smaller than a fourth threshold value according to the current positions of the touch points, determining that the corresponding control instruction is a window closing instruction;
or when the positions of the touch points are simultaneously moved downwards in an accelerated manner and the moving acceleration is greater than a fifth threshold value, or when the positions of the touch points are simultaneously moved downwards to a preset area, determining that the corresponding control instruction is a window minimizing instruction;
or when the position of the touch point in the left area of the large screen display moves leftwards at the same time and the position of the touch point in the right area of the large screen display moves rightwards at the same time, determining that the corresponding control instruction is a full window hiding instruction.
13. A system for implementing window control, comprising an intelligent terminal and a large screen display, wherein at least one window is displayed on the large screen display, characterized in that the system comprises:
the large screen display is used for sending the position of a touch point of a user touching the large screen display to the intelligent terminal;
the intelligent terminal is used for acquiring the position of the touch point detected by the large screen display in real time;
the intelligent terminal is used for determining a controlled window in the window displayed by the large screen display according to the number of the touch points and the area of the positions of the touch points;
the intelligent terminal is used for determining a corresponding control instruction according to the change condition of the position of the touch point;
and the intelligent terminal is used for enabling the controlled window to execute the operation corresponding to the control instruction.
14. A window control apparatus, characterized in that the apparatus comprises a memory and a processor,
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the window control method of any one of claims 1-8 according to instructions in the program code.
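The gesture-to-instruction mapping recited in claims 2, 4, 8 and 12 can be sketched in code as follows. This is an illustrative sketch, not part of the patent; the function names, threshold values, and the way acceleration is supplied are all hypothetical, and only the five gesture rules named in the claims (pinch out to maximize, same-direction movement to move, pinch in to close, accelerated downward movement to minimize, and a left/right spread to hide all windows) are reproduced.

```python
import math

# Hypothetical threshold values standing in for the first to fifth
# thresholds recited in the claims (distances in px, acceleration in px/s^2).
T1, T2, T3, T4 = 50.0, 300.0, 300.0, 50.0
A5 = 2000.0

def pairwise_distances(points):
    """Distance between every pair of touch points."""
    return [math.dist(p, q) for i, p in enumerate(points) for q in points[i + 1:]]

def classify(initial, current, screen_width, down_accel=0.0):
    """Map a multi-touch position change to a window control instruction."""
    d0, d1 = pairwise_distances(initial), pairwise_distances(current)
    # Pinch out: every initial pairwise distance below the first threshold,
    # every current pairwise distance above the second -> maximize.
    if all(d < T1 for d in d0) and all(d > T2 for d in d1):
        return "maximize"
    # Pinch in: initial distances above the third threshold, current
    # distances below the fourth -> close.
    if all(d > T3 for d in d0) and all(d < T4 for d in d1):
        return "close"
    # All points move downwards with acceleration above the fifth
    # threshold -> minimize.
    if all(c[1] > i[1] for i, c in zip(initial, current)) and down_accel > A5:
        return "minimize"
    # Points on the left half move left while points on the right half
    # move right -> hide all windows.
    moves = list(zip(initial, current))
    left = [c[0] - i[0] for i, c in moves if i[0] < screen_width / 2]
    right = [c[0] - i[0] for i, c in moves if i[0] >= screen_width / 2]
    if left and right and all(m < 0 for m in left) and all(m > 0 for m in right):
        return "hide_all"
    # All points translate in the same direction -> move the window.
    deltas = [(c[0] - i[0], c[1] - i[1]) for i, c in moves]
    if all(dx > 0 for dx, _ in deltas) or all(dx < 0 for dx, _ in deltas) \
            or all(dy > 0 for _, dy in deltas) or all(dy < 0 for _, dy in deltas):
        return "move"
    return "none"
```

A real implementation would additionally have to track pointer identities across touch events and debounce partial gestures; the sketch assumes the initial and current positions are given in matching order.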
CN202010091478.5A 2018-05-14 2018-05-14 Window control method, device and equipment Pending CN111338550A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010091478.5A CN111338550A (en) 2018-05-14 2018-05-14 Window control method, device and equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010091478.5A CN111338550A (en) 2018-05-14 2018-05-14 Window control method, device and equipment
CN201810457284.5A CN109739422B (en) 2018-05-14 2018-05-14 Window control method, device and equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810457284.5A Division CN109739422B (en) 2018-05-14 2018-05-14 Window control method, device and equipment

Publications (1)

Publication Number Publication Date
CN111338550A true CN111338550A (en) 2020-06-26

Family

ID=66354218

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810457284.5A Active CN109739422B (en) 2018-05-14 2018-05-14 Window control method, device and equipment
CN202010091478.5A Pending CN111338550A (en) 2018-05-14 2018-05-14 Window control method, device and equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810457284.5A Active CN109739422B (en) 2018-05-14 2018-05-14 Window control method, device and equipment

Country Status (1)

Country Link
CN (2) CN109739422B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112035023A (en) * 2020-08-14 2020-12-04 江苏铁锚玻璃股份有限公司 Intelligent regional display control method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115113770A (en) * 2021-03-17 2022-09-27 北京有竹居网络技术有限公司 Terminal control method and device, terminal and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103399710A (en) * 2013-08-07 2013-11-20 福州瑞芯微电子有限公司 Multi-window operating achieving mobile terminal and method
CN104571906A (en) * 2013-10-28 2015-04-29 联想(北京)有限公司 Information processing method and electronic equipment
CN106527858A (en) * 2016-10-28 2017-03-22 广州视睿电子科技有限公司 Window movement method and device
CN107870754A (en) * 2016-09-28 2018-04-03 法乐第(北京)网络科技有限公司 A kind of method and device of the content shown on control device
CN110618769A (en) * 2019-08-22 2019-12-27 华为技术有限公司 Application window processing method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101515220B (en) * 2009-04-13 2011-10-12 青岛海信移动通信技术股份有限公司 Method and related device for moving, zooming and controlling window in touch screen
CN102098361A (en) * 2010-12-21 2011-06-15 许富民 Mobile phone with external large display screen
KR101873787B1 (en) * 2011-02-10 2018-07-03 삼성전자주식회사 Method for processing multi-touch input in touch screen terminal and device thereof
CN102902469B (en) * 2011-07-25 2015-08-19 宸鸿光电科技股份有限公司 Gesture identification method and touch-control system
CN203872252U (en) * 2014-06-05 2014-10-08 江波 Link system between vehicle device and mobile terminal
CN104317474B (en) * 2014-09-22 2018-08-03 腾讯科技(深圳)有限公司 Windows switching method and device
CN107391011A (en) * 2017-07-07 2017-11-24 环球智达科技(北京)有限公司 The control method of intelligent terminal


Also Published As

Publication number Publication date
CN109739422B (en) 2020-02-18
CN109739422A (en) 2019-05-10

Similar Documents

Publication Publication Date Title
US20190302984A1 (en) Method and device for controlling a flexible display device
US20130307765A1 (en) Contactless Gesture-Based Control Method and Apparatus
CN109753215B (en) Window split-screen display method, device and equipment
JP7233109B2 (en) Touch-sensitive surface-display input method, electronic device, input control method and system with tactile-visual technology
CN104063128B (en) A kind of information processing method and electronic equipment
WO2021203815A1 (en) Page operation method and apparatus, and terminal and storage medium
CN110007800B (en) Control method, device and equipment of touch operation mode and storage medium
US11693556B2 (en) Creating tables using gestures
CN107797750A (en) A kind of screen content identifying processing method, apparatus, terminal and medium
EP2965181B1 (en) Enhanced canvas environments
US20230091771A1 (en) Device Control Method, Storage Medium, and Non-Transitory Computer-Readable Electronic Device
CN109739422B (en) Window control method, device and equipment
US11455071B2 (en) Layout method, device and equipment for window control bars
CN103150103B (en) The method and system of gesture operation object and form
CN114415886A (en) Application icon management method and electronic equipment
CN113721808A (en) Control method and device
CN111831196B (en) Control method of folding screen, terminal device and storage medium
WO2023169499A1 (en) Single-hand control method and control apparatus for touch screen, electronic device, and storage medium
CN113031817B (en) Multi-touch gesture recognition method and false touch prevention method
CN106020691B (en) A kind of item selection method and mobile terminal
CN104951211A (en) Information processing method and electronic equipment
CN113986106A (en) Double-hand operation method and device of touch screen, electronic equipment and storage medium
CN114327717A (en) User interface processing method, device, equipment and computer readable storage medium
CN109739590B (en) Desktop wallpaper replacing method, device and equipment
KR101381878B1 (en) Method, device, and computer-readable recording medium for realizing touch input using mouse

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination