CN104571904A - Information processing method and electronic equipment - Google Patents

Info

  • Publication number: CN104571904A
  • Application number: CN201310516854.0A
  • Authority: CN (China)
  • Prior art keywords: touch, information, application, full screen
  • Legal status: Granted; currently Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
  • Other languages: Chinese (zh)
  • Other versions: CN104571904B (en)
  • Inventor: 王超
  • Assignee (current and original): Lenovo Beijing Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
  • Application filed by Lenovo Beijing Ltd
  • Priority: CN201310516854.0A (granted as CN104571904B); related US application US14/229,917 (published as US20150121301A1)

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Digital Computer Display Output (AREA)

Abstract

The invention discloses an information processing method. The method comprises the following steps: when a first touch operation is detected, parsing it to obtain first touch event information and the position coordinates of the first touch operation; if, according to the position coordinates, the first touch event information is located in an overlapping area of at least two of the non-full-screen windows corresponding to N applications, acquiring the priority information corresponding to the at least two non-full-screen windows, determining a first application according to the priority information, and calculating first operation information corresponding to the first touch event information by using the first touch event information and a first conversion parameter corresponding to the first application's non-full-screen window, the first application then responding to the first touch operation based on the first operation information. The invention further discloses an electronic device. With the method and the device, the problem of operation confusion when a plurality of non-full-screen windows are open can be avoided, and the user experience is improved.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to the field of communications, and in particular, to an information processing method and an electronic device.
Background
With the development of mobile terminals, in particular the improvement of screen resolution and the increase of screen size, users have gradually come to demand multi-window operation interfaces, in which several small windows are open simultaneously on the same mobile device and each small window displays and operates one application. However, once a multi-window operation interface is put into use, when at least two windows overlap it may become impossible to determine which window a touch operation corresponds to; this causes confusion of operation and degrades the user experience.
Disclosure of Invention
In view of this, an object of the present invention is to provide an information processing method and an electronic device, which can avoid the problem of operation confusion when a plurality of non-full screen windows are opened, and improve the user experience.
In order to achieve this object, the technical solution of the invention is realized as follows:
an embodiment of the invention provides an information processing method applied to an electronic device, wherein the electronic device has a touch display unit, can run a plurality of applications, and displays them in a display area of the touch display unit; when N windows running applications in a non-full-screen mode are opened, N being an integer greater than or equal to 2, the method comprises the following steps:
when a first touch operation is detected, analyzing to obtain first touch event information, and obtaining a position coordinate of the first touch operation;
judging whether the first touch event information is located in an overlapping area of at least two non-full screen windows in the non-full screen windows corresponding to the N applications according to the position coordinates;
if so, acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, and calculating to obtain first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, wherein the first application responds to the first touch operation based on the first operation information.
An embodiment of the present invention further provides an electronic device, where the electronic device includes a touch display unit and a processing unit, wherein:
the touch display unit is used for opening N windows which are operated in a non-full screen mode, wherein N is an integer greater than or equal to 2, and when a first touch operation is detected, first touch event information is obtained through analysis, and position coordinates of the first touch operation are obtained;
the processing unit is configured to determine, according to the position coordinates, whether the first touch event information is located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications; if so, acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, and calculating to obtain first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, wherein the first application responds to the first touch operation based on the first operation information.
According to the information processing method and the electronic device provided by the invention, when two or more non-full-screen windows overlap, the application corresponding to a touch operation can be determined from the current touch operation and the priority information of the non-full-screen windows, so that that application responds to the touch operation. The problem of operation confusion when a plurality of non-full-screen windows are open is thereby avoided, and the user experience is improved.
Drawings
FIG. 1 is a first flowchart illustrating an information processing method according to an embodiment of the present invention;
FIG. 2a is a schematic diagram illustrating an overlapping state of two non-full screen windows according to an embodiment of the present invention;
FIG. 2b is a schematic diagram illustrating a zoom operation performed on an overlapping area of two non-full-screen windows according to an embodiment of the present invention;
FIG. 2c is a schematic diagram illustrating a display effect of performing a zoom operation on two non-full-screen windows according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a second information processing method according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Embodiment I
The embodiment of the invention provides an information processing method which is applied to electronic equipment, wherein the electronic equipment is provided with a touch display unit and can be a mobile terminal, such as a smart phone or a tablet computer.
The electronic device can run a plurality of applications and display them in a display area of a touch display unit, wherein, when N windows running applications in a non-full-screen mode are opened, N is an integer greater than or equal to 2; as shown in FIG. 1, the method comprises the following steps:
step 101: when a first touch operation is detected, analyzing to obtain first touch event information, and obtaining a position coordinate of the first touch operation;
step 102: judging whether the first touch event information is located in an overlapping area of at least two non-full screen windows in the non-full screen windows corresponding to the N applications according to the position coordinates;
step 103: if so, acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, and calculating to obtain first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, wherein the first application responds to the first touch operation based on the first operation information.
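As a rough illustration only (the patent gives no code), steps 101 to 103 can be sketched as follows. The window records, field names, and the use of a last-interaction timestamp as the priority information are assumptions for this sketch; each window stores its first conversion matrix, i.e. the inverse of its second conversion matrix.

```python
# Illustrative sketch of steps 101-103; data layout and names are assumptions,
# not the patent's API. Each non-full-screen window carries its on-screen rect,
# its depth z_t in the frame cache, a last-interaction timestamp used as the
# priority information, and `inv`, the first conversion matrix.

def matvec3(m, v):
    # 3x3 matrix times a column vector
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def dispatch_touch(windows, x, y):
    hits = [w for w in windows
            if w["rect"][0] <= x < w["rect"][2]
            and w["rect"][1] <= y < w["rect"][3]]
    if not hits:
        return None  # outside every non-full-screen window
    # priority decision: the most recently used window wins (one of the
    # priority criteria named in the text; zoom priority is handled elsewhere)
    target = max(hits, key=lambda w: w["last_interaction"])
    # map the screen coordinates back into the application's full-screen space
    xo, yo, _ = matvec3(target["inv"], (x, y, target["z"]))
    return target["app"], (xo, yo)

# Window A: full screen halved; window B: halved and shifted by (400, 600),
# i.e. second conversion matrix [[1/2,0,400],[0,1/2,600],[0,0,1/2]]. For
# simplicity both use z_o = 1 here (so z_t = 0.5); the patent actually gives
# each window a distinct z_o to tell them apart in the frame cache.
win_a = {"app": "A", "rect": (0, 0, 540, 960), "z": 0.5,
         "last_interaction": 5, "inv": [[2, 0, 0], [0, 2, 0], [0, 0, 2]]}
win_b = {"app": "B", "rect": (400, 600, 940, 1560), "z": 0.5,
         "last_interaction": 9,
         "inv": [[2, 0, -1600], [0, 2, -2400], [0, 0, 2]]}

app, (xo, yo) = dispatch_touch([win_a, win_b], 500, 700)
```

Here the point (500, 700) lies in the overlap, window B wins on recency, and its inverse matrix maps the touch back to (200, 200) in B's own full-screen coordinate space.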
Preferably, the first touch event information includes the number of touch points and their position coordinates, and the position coordinates include: the start coordinates of the operation touch point and the end coordinates of the operation touch point.
Preferably, the determining whether the first touch event information is located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications includes: checking, according to the start coordinates and the end coordinates of the operation touch point in the first touch event information and against the frame cache data currently stored in the touch display unit, whether the start coordinates and the end coordinates of the operation touch point are located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications.
Preferably, the priority information may include: a zoom operation priority, and/or the time of the last interactive operation of each of the at least two non-full-screen windows.
Preferably, the first conversion parameter is the inverse of a second conversion parameter corresponding to the application's non-full-screen window;
the first conversion parameter may take at least one of the following forms: a conversion matrix, a parameter, or a parameter set.
The second transformation parameters comprise a second transformation matrix; the second conversion matrix is used for converting the full-screen display window of the application into a non-full-screen window, and the generation method comprises the following steps: when a first instruction is received, acquiring a preset matrix; and converting the full-screen display window corresponding to the application by using the preset matrix to obtain a display area of the non-full-screen window of the application.
The converting the display window corresponding to the selected application by using the preset matrix to obtain the display area of the non-full screen window of the application comprises: reading the graph cache data of the application; converting the read graph cache data by using the preset matrix, and generating frame cache data corresponding to the touch display unit by using the graph cache data; and displaying the non-full screen window of the application on the touch display unit by utilizing the frame cache data.
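The read-convert-display pipeline described above can be sketched as follows; the pixel representation and the function name are invented for illustration and are not the patent's API.

```python
# Hypothetical sketch of generating frame cache data from an application's
# graphics cache with a preset 3x3 matrix. A pixel is ((x_o, y_o, z_o), rgb).

def to_frame_cache(graphics_cache, preset):
    frame = []
    for (xo, yo, zo), rgb in graphics_cache:
        xt = preset[0][0] * xo + preset[0][1] * yo + preset[0][2] * zo
        yt = preset[1][0] * xo + preset[1][1] * yo + preset[1][2] * zo
        zt = preset[2][0] * xo + preset[2][1] * yo + preset[2][2] * zo
        frame.append(((xt, yt, zt), rgb))  # converted coordinates, same RGB
    return frame

# The 1/2-scaling matrix used as the running example in this document:
half = [[0.5, 0, 0], [0, 0.5, 0], [0, 0, 0.5]]
cache = [((100, 200, 1), (255, 0, 0))]  # one red pixel of the full-screen window
frame = to_frame_cache(cache, half)
```

The red pixel at (100, 200, 1) in the graphics cache lands at (50.0, 100.0, 0.5) in the frame cache, with its RGB value unchanged.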
The graphics cache data comprise the two-dimensional coordinate information of each pixel point and the Red, Green and Blue (RGB) three-color information of each pixel point.
Considering that there may be an overlapping area between the non-full-screen windows corresponding to two applications, as shown in fig. 2, in this embodiment the two-dimensional coordinates (x_o, y_o) identifying each pixel point in the graphics cache data of the non-full-screen window corresponding to an application are extended to three-dimensional coordinates (x_o, y_o, z_o), where different non-full-screen windows have different third-dimension coordinates z_o; thus, different non-full-screen windows can be distinguished by their third-dimension coordinates.
The preset matrix can be an identity matrix. The extended three-dimensional coordinates (x_o, y_o, z_o) in the graphics cache data are converted to obtain the application's non-full-screen window, and the graphics cache data corresponding to the non-full-screen window comprise the converted (x_o, y_o, z_o) and the RGB information of the corresponding pixel points.
Taking as an example reducing the full-screen display window by 1/2 to convert it into a non-full-screen window, the corresponding second conversion matrix is
\begin{pmatrix} 1/2 & 0 & 0 \\ 0 & 1/2 & 0 \\ 0 & 0 & 1/2 \end{pmatrix},
and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are given by formula (1):
\begin{pmatrix} x_t \\ y_t \\ z_t \end{pmatrix} = \begin{pmatrix} 1/2 & 0 & 0 \\ 0 & 1/2 & 0 \\ 0 & 0 & 1/2 \end{pmatrix} \begin{pmatrix} x_o \\ y_o \\ z_o \end{pmatrix}    (1)
Taking as an example a non-full-screen window moved laterally by Δx and longitudinally by Δy, the corresponding second conversion matrix is
\begin{pmatrix} 1/2 & 0 & \Delta x \\ 0 & 1/2 & \Delta y \\ 0 & 0 & 1/2 \end{pmatrix},
and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are given by formula (2):
\begin{pmatrix} x_t \\ y_t \\ z_t \end{pmatrix} = \begin{pmatrix} 1/2 & 0 & \Delta x \\ 0 & 1/2 & \Delta y \\ 0 & 0 & 1/2 \end{pmatrix} \begin{pmatrix} x_o \\ y_o \\ z_o \end{pmatrix}    (2)
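Formulas (1) and (2) can be checked numerically. The sample Δx, Δy and pixel coordinates below are arbitrary values chosen for the check, and the last step verifies the earlier statement that the first conversion matrix is the inverse of the second.

```python
# Numeric check of formulas (1) and (2) using exact rational arithmetic.
from fractions import Fraction as F

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

h = F(1, 2)
dx, dy = F(30), F(40)                     # sample lateral/longitudinal shifts
m1 = [[h, 0, 0], [0, h, 0], [0, 0, h]]    # second conversion matrix, formula (1)
m2 = [[h, 0, dx], [0, h, dy], [0, 0, h]]  # second conversion matrix, formula (2)

po = [F(100), F(200), F(1)]               # a pixel (x_o, y_o, z_o) with z_o = 1
pt1 = matvec(m1, po)                      # formula (1): (50, 100, 1/2)
pt2 = matvec(m2, po)                      # formula (2): (80, 140, 1/2)

# First conversion matrix = inverse of the second (shown here for m2):
m2_inv = [[2, 0, -4 * dx], [0, 2, -4 * dy], [0, 0, 2]]
identity = matmul(m2, m2_inv)             # should be the 3x3 identity matrix
```

Note that with z_o = 1 the third column of the formula (2) matrix acts as a plain translation by (Δx, Δy), which matches the window-moving example above.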
For example, when the mobile terminal currently displays non-full-screen window 1 and non-full-screen window 2 as shown in fig. 2, the first touch operation is a click operation, and the touch point is point A in the overlap area, it is determined according to the priority information of non-full-screen window 1 and non-full-screen window 2 that non-full-screen window 2 responds to the first touch operation.
Therefore, with the embodiment, when two or more non-full screen windows overlap, the application corresponding to the touch operation can be determined according to the current touch operation and the priority information of the non-full screen windows, and the application can respond to the touch operation. Therefore, the problem of operation confusion when a plurality of non-full screen windows are opened is avoided, and the use experience of a user is improved.
Embodiment II
The embodiment of the invention provides an information processing method which is applied to electronic equipment, wherein the electronic equipment is provided with a touch display unit and can be a mobile terminal, such as a smart phone or a tablet computer.
The electronic device can run a plurality of applications and display the applications in a display area of a touch display unit, and when N windows of the applications running in a non-full screen mode are opened, N is an integer greater than or equal to 2, as shown in fig. 3, the method includes:
step 301: when the first touch operation is detected, analyzing to obtain first touch event information, and obtaining the position coordinate of the first touch operation.
Step 302: judging whether the first touch event information is located in an overlapping area of at least two non-full screen windows in the non-full screen windows corresponding to the N applications according to the position coordinates, if so, executing a step 303; otherwise, step 304 is performed.
Step 303: acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, calculating to obtain first operation information corresponding to the first touch event information by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, responding to the first touch operation by the first application based on the first operation information, and ending the processing flow.
Step 304: judging whether the first touch event information is located in a touch area of a non-full screen window of a first application, if so, calculating to obtain first operation information corresponding to the first touch event by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, and responding to the first touch operation by the first application based on the first operation information; otherwise, the subsequent operation is performed according to the prior art.
Preferably, the first touch event information includes the number of touch points and their position coordinates, and the position coordinates include: the start coordinates of the operation touch point and the end coordinates of the operation touch point.
Preferably, the determining whether the first touch event information is located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications includes: checking, according to the start coordinates and the end coordinates of the operation touch point in the first touch event information and against the frame cache data currently stored in the touch display unit, whether the start coordinates and the end coordinates of the operation touch point are located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications.
Preferably, the priority information may include: a zoom operation priority, and/or a time of a last interactive operation of the two non-full screen windows.
The conversion parameter may take at least one of the following forms: a conversion matrix, a parameter, or a parameter set. The first conversion matrix is the inverse of the second conversion matrix corresponding to the application's non-full-screen window;
the second conversion matrix is used for converting the full-screen display window of the application into a non-full-screen window, and the generation method comprises the following steps: when a first instruction is received, acquiring a preset matrix; and converting the full-screen display window corresponding to the application by using the preset matrix to obtain a display area of the non-full-screen window of the application.
The converting the display window corresponding to the selected application by using the preset matrix to obtain the display area of the non-full screen window of the application comprises: reading the graph cache data of the application; converting the read graph cache data by using the preset matrix, and generating frame cache data corresponding to the touch display unit by using the graph cache data; and displaying the non-full screen window of the application on the touch display unit by utilizing the frame cache data.
The graphics cache data comprise the coordinate information of each pixel point and the Red, Green and Blue (RGB) information of each pixel point.
Considering that there may be an overlapping area between the non-full-screen windows corresponding to two applications, as shown in fig. 2, in this embodiment the two-dimensional coordinates (x_o, y_o) identifying each pixel point in the graphics cache data of the non-full-screen window corresponding to an application are extended to three-dimensional coordinates (x_o, y_o, z_o), where different non-full-screen windows have different third-dimension coordinates z_o; thus, different non-full-screen windows can be distinguished by their third-dimension coordinates.
The preset matrix can be an identity matrix. The extended three-dimensional coordinates (x_o, y_o, z_o) in the graphics cache data are converted to obtain the application's non-full-screen window, and the graphics cache data corresponding to the non-full-screen window comprise the converted (x_o, y_o, z_o) and the RGB information of the corresponding pixel points.
Taking as an example reducing the full-screen display window by 1/2 to convert it into a non-full-screen window, the corresponding second conversion matrix is
\begin{pmatrix} 1/2 & 0 & 0 \\ 0 & 1/2 & 0 \\ 0 & 0 & 1/2 \end{pmatrix},
and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are given by formula (1):
\begin{pmatrix} x_t \\ y_t \\ z_t \end{pmatrix} = \begin{pmatrix} 1/2 & 0 & 0 \\ 0 & 1/2 & 0 \\ 0 & 0 & 1/2 \end{pmatrix} \begin{pmatrix} x_o \\ y_o \\ z_o \end{pmatrix}    (1)
Taking as an example a non-full-screen window moved laterally by Δx and longitudinally by Δy, the corresponding second conversion matrix is
\begin{pmatrix} 1/2 & 0 & \Delta x \\ 0 & 1/2 & \Delta y \\ 0 & 0 & 1/2 \end{pmatrix},
and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are given by formula (2):
\begin{pmatrix} x_t \\ y_t \\ z_t \end{pmatrix} = \begin{pmatrix} 1/2 & 0 & \Delta x \\ 0 & 1/2 & \Delta y \\ 0 & 0 & 1/2 \end{pmatrix} \begin{pmatrix} x_o \\ y_o \\ z_o \end{pmatrix}    (2)
For example, when the mobile terminal currently displays non-full-screen window 1 and non-full-screen window 2 as shown in fig. 2, the first touch operation is a click operation, and the touch point is point A in the overlap area, it is determined according to the priority information of non-full-screen window 1 and non-full-screen window 2 that non-full-screen window 2 responds to the first touch operation.
If the first touch operation is a click operation and the touch point is not located in the overlap area, for example touch point B in fig. 2, the system responds to the touch point directly, which is the prior art and is not described herein again; when the touch point is at the position of touch point C in fig. 2, non-full-screen window 2 responds to the operation at touch point C.
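The three cases for points A, B and C can be sketched as follows; the window layout, coordinates, and recency values are made-up numbers chosen only to mirror fig. 2 and steps 302 to 304.

```python
# Routing a click to window 1, window 2, or the system (steps 302-304).

def contains(w, p):
    x0, y0, x1, y1 = w["rect"]
    return x0 <= p[0] < x1 and y0 <= p[1] < y1

def route_touch(win1, win2, p):
    in1, in2 = contains(win1, p), contains(win2, p)
    if in1 and in2:
        # point in the overlap area: the priority information decides
        return max((win1, win2), key=lambda w: w["last_interaction"])["name"]
    if in1 or in2:
        return (win1 if in1 else win2)["name"]  # inside exactly one window
    return "system"                             # outside both: prior-art handling

w1 = {"name": "window1", "rect": (0, 0, 500, 500), "last_interaction": 3}
w2 = {"name": "window2", "rect": (300, 300, 800, 800), "last_interaction": 7}

A, B, C = (400, 400), (900, 900), (600, 600)  # overlap / outside / window 2 only
```

With this layout, point A goes to window 2 by priority, point B falls through to the system, and point C goes to window 2 as the only window containing it.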
Therefore, with the embodiment, when two or more non-full screen windows overlap, the application corresponding to the touch operation can be determined according to the current touch operation and the priority information of the non-full screen windows, and the application can respond to the touch operation. Therefore, the problem of operation confusion when a plurality of non-full screen windows are opened is avoided, and the use experience of a user is improved.
Embodiment III
The embodiment of the invention provides an information processing method which is applied to electronic equipment, wherein the electronic equipment is provided with a touch display unit and can be a mobile terminal, such as a smart phone or a tablet computer.
The electronic device can run a plurality of applications and display the applications in a display area of a touch display unit, and when N windows of the applications running in a non-full screen mode are opened, N is an integer greater than or equal to 2, as shown in fig. 3, the method includes:
step 301: when the first touch operation is detected, analyzing to obtain first touch event information, and obtaining the position coordinate of the first touch operation.
Step 302: judging whether the first touch event information is located in an overlapping area of at least two non-full screen windows in the non-full screen windows corresponding to the N applications according to the position coordinates, if so, executing a step 303; otherwise, step 304 is performed.
Step 303: acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, calculating to obtain first operation information corresponding to the first touch event information by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, responding to the first touch operation by the first application based on the first operation information, and ending the processing flow.
Step 304: judging whether the first touch event information is located in a touch area of a non-full screen window of a first application, if so, calculating to obtain first operation information corresponding to the first touch event by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, and responding to the first touch operation by the first application based on the first operation information; otherwise, the subsequent operation is performed according to the prior art.
Preferably, the first touch event information includes the number of touch points and their position coordinates, and the position coordinates include: the start coordinates of the operation touch point and the end coordinates of the operation touch point.
Preferably, the determining whether the first touch event information is located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications includes: checking, according to the start coordinates and the end coordinates of the operation touch point in the first touch event information and against the frame cache data currently stored in the touch display unit, whether the start coordinates and the end coordinates of the operation touch point are located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications.
Preferably, the priority information may include: a zoom operation priority, and/or a time of a last interactive operation of the two non-full screen windows.
Preferably, the obtaining of priority information corresponding to the at least two non-full screen windows and the determining of the first application to respond to the first touch operation according to the priority information include:
acquiring priority information corresponding to the at least two non-full screen windows, wherein the priority information represents the zooming operation priority and/or the time of the last interactive operation of the at least two non-full screen windows;
judging whether the first touch operation is a zoom operation; if so, determining, according to the zoom operation priorities corresponding to the at least two non-full-screen windows, the application with the higher zoom operation priority as the first application responding to the first touch operation;
if not, extracting the times of the last interactive operations corresponding to the at least two non-full-screen windows, comparing them, and determining the application whose last interactive operation is more recent as the first application responding to the first touch operation.
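A minimal sketch of this priority rule, with illustrative field names (the patent does not define a data structure):

```python
# Choosing the first application: a zoom operation goes to the window with the
# higher zoom-operation priority; any other operation goes to the window whose
# last interactive operation is most recent.

def pick_first_app(windows, is_zoom):
    if is_zoom:
        return max(windows, key=lambda w: w["zoom_priority"])["app"]
    return max(windows, key=lambda w: w["last_interaction"])["app"]

w1 = {"app": "gallery", "zoom_priority": 2, "last_interaction": 100}
w2 = {"app": "notes", "zoom_priority": 1, "last_interaction": 250}
```

For a zoom gesture the gallery window wins on zoom priority; for a click the notes window wins because it was interacted with more recently.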
Preferably, the first conversion parameter comprises a first conversion matrix;
the first conversion matrix is an inverse matrix of a second conversion matrix corresponding to the non-full-screen window of the application;
the second conversion matrix is used for converting the full-screen display window of the application into a non-full-screen window, and the generation method comprises the following steps: when a first instruction is received, acquiring a preset matrix; and converting the full-screen display window corresponding to the application by using the preset matrix to obtain a display area of the non-full-screen window of the application.
The converting the display window corresponding to the selected application by using the preset matrix to obtain the display area of the non-full screen window of the application comprises the following steps: reading the graph cache data of the application; converting the read graph cache data by using the preset matrix, and generating frame cache data corresponding to the touch display unit by using the graph cache data; and displaying the non-full screen window of the application on the touch display unit by utilizing the frame cache data.
The graphics cache data comprises two-dimensional coordinate information of each pixel point and Red, Green and Blue (RGB) three-color information of each pixel point.
Considering that there may be an overlapping area between the non-full-screen windows corresponding to two applications, as shown in fig. 2, in this embodiment the two-dimensional coordinates (x_o, y_o) of each pixel point identified in the graphics cache data of the non-full-screen window corresponding to the application are extended to three-dimensional coordinates (x_o, y_o, z_o), wherein different non-full-screen windows have different third-dimensional coordinates z_o; thus, different non-full-screen windows can be distinguished by their third-dimensional coordinates.
The preset matrix can be an identity matrix. The extended three-dimensional coordinates (x_o, y_o, z_o) in the graphics cache data are converted to obtain the non-full-screen window of the application, and the graphics cache data corresponding to the non-full-screen window comprises the converted (x_o, y_o, z_o) and the RGB information of the corresponding pixel points.
Taking the example of zooming out the full-screen display window by 1/2 to convert it into a non-full-screen window, the corresponding second conversion matrix is

    (1/2   0    0 )
    ( 0   1/2   0 )
    ( 0    0   1/2)

and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are as shown in formula (1):
    (x_t)   (1/2   0    0 )   (x_o)
    (y_t) = ( 0   1/2   0 ) × (y_o)        (1)
    (z_t)   ( 0    0   1/2)   (z_o)
Taking the non-full-screen window moving laterally by Δx and longitudinally by Δy as an example, the corresponding second conversion matrix is

    (1/2   0    Δx)
    ( 0   1/2   Δy)
    ( 0    0   1/2)

and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are as shown in formula (2):
    (x_t)   (1/2   0    Δx)   (x_o)
    (y_t) = ( 0   1/2   Δy) × (y_o)        (2)
    (z_t)   ( 0    0   1/2)   (z_o)
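The two conversions can be checked numerically; the sketch below is an illustration using NumPy (an assumed toolkit, not part of the disclosure), applying the matrices of formulas (1) and (2) to a sample full-screen pixel at (x_o, y_o, z_o) = (400, 600, 1) with (Δx, Δy) = (100, 50):

```python
import numpy as np

# Formula (1): halve a full-screen pixel into the half-size window.
M_scale = np.diag([0.5, 0.5, 0.5])        # second conversion matrix
p = np.array([400.0, 600.0, 1.0])         # (x_o, y_o, z_o); z_o tags the window
scaled = M_scale @ p                      # half-size window coordinates

# Formula (2): halve and also shift the window by (dx, dy) = (100, 50).
M_move = np.array([[0.5, 0.0, 100.0],
                   [0.0, 0.5,  50.0],
                   [0.0, 0.0,   0.5]])
moved = M_move @ p                        # shifted half-size window coordinates
```

Note that the translation terms act through the third coordinate (here z_o = 1), which is what makes the shift expressible as a single matrix multiplication.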
for example, when the current mobile terminal displays the non-full screen window 1 and the non-full screen window 2 as shown in fig. 2B, when it is determined that the first touch operation is the zoom operation according to the starting and ending positions of the touch point a and the touch point B, the zoom operation priorities of the non-full screen window 1 and the non-full screen window 2 are compared, and if the zoom operation priority of the non-full screen window 1 is higher, the application corresponding to the non-full screen window 1 responds to the first touch operation; the responded non-full screen window 1 is shown in fig. 2c, where the non-full screen window 1 is obviously scaled. Among them, the application having a higher priority in response to the zoom operation may be a picture or a video, or the priority of the zoom operation may be set by the user.
Therefore, with the embodiment, when two or more non-full screen windows overlap, the application corresponding to the touch operation can be determined according to the current touch operation and the priority information of the non-full screen windows, and the application can respond to the touch operation. Therefore, the problem of operation confusion when a plurality of non-full screen windows are opened is avoided, and the use experience of a user is improved.
Therefore, the operation object of the first touch event can be selected according to the priority information corresponding to the applied non-full screen window and the current first touch event, so that the non-full screen window to be operated can be determined simply according to the gesture, and the user experience is improved.
Example Four
The embodiment of the invention provides an information processing method which is applied to electronic equipment, wherein the electronic equipment is provided with a touch display unit and can be a mobile terminal, such as a smart phone or a tablet computer.
The electronic device can run a plurality of applications and display the applications in a display area of a touch display unit, and when N windows of the applications running in a non-full screen mode are opened, N is an integer greater than or equal to 2, as shown in fig. 3, the method includes:
step 301: when the first touch operation is detected, analyzing to obtain first touch event information, and obtaining the position coordinate of the first touch operation.
Step 302: judging whether the first touch event information is located in an overlapping area of at least two non-full screen windows in the non-full screen windows corresponding to the N applications according to the position coordinates, if so, executing a step 303; otherwise, step 304 is performed.
Step 303: acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, calculating to obtain first operation information corresponding to the first touch event information by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, responding to the first touch operation by the first application based on the first operation information, and ending the processing flow.
Step 304: judging whether the first touch event information is located in a touch area of a non-full screen window of a first application, if so, calculating to obtain first operation information corresponding to the first touch event by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, and responding to the first touch operation by the first application based on the first operation information; otherwise, the subsequent operation is performed according to the prior art.
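Steps 301 to 304 can be sketched as a dispatch routine; the window records, their field names, and the simplified 2D scale-and-shift coordinate mapping below are illustrative assumptions, not the disclosed implementation:

```python
def in_rect(pt, rect):
    """rect = (x0, y0, x1, y1) in screen coordinates."""
    x, y = pt
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1

def dispatch(touch, windows):
    """Hit-test the touch against the open non-full-screen windows,
    resolve an overlap by the most recent interaction (one of the
    priority rules described above), then map the screen point into
    the winner's own coordinate space (the first-conversion step,
    reduced to a 2D scale-and-shift here)."""
    hits = [w for w in windows if in_rect(touch, w['rect'])]
    if not hits:
        return None                      # step 304 "otherwise": default handling
    if len(hits) >= 2:                   # step 303: touch lies in an overlap area
        target = max(hits, key=lambda w: w['last_interaction'])
    else:                                # step 304: a single window is hit
        target = hits[0]
    x, y = touch
    x0, y0, _, _ = target['rect']
    local = ((x - x0) / target['scale'], (y - y0) / target['scale'])
    return target['app'], local

win1 = {'app': 'app1', 'rect': (0, 0, 200, 200), 'scale': 0.5, 'last_interaction': 1.0}
win2 = {'app': 'app2', 'rect': (100, 100, 300, 300), 'scale': 0.5, 'last_interaction': 2.0}
```

A touch at (150, 150) lands in the overlap and is routed to the more recently used `app2`, while a touch at (50, 50) hits only `app1`.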
Preferably, the first touch event information includes a number of touch points and position coordinates thereof, and the position coordinates include: a start coordinate of an operation touch point of each touch point, and an end coordinate of the operation touch point.
Preferably, the determining whether the first touch operation is located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications includes: checking, in the frame cache data currently stored in the touch display unit, whether the start coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information are located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications.
Preferably, the priority information may include: a zooming operation priority, and/or a time of a last interactive operation, of the at least two non-full-screen windows.
Preferably, the first conversion parameter comprises a first conversion matrix;
the first conversion matrix is an inverse matrix of a second conversion matrix corresponding to the non-full-screen window of the application;
the second conversion matrix is used for converting the full-screen display window of the application into a non-full-screen window, and the generation method comprises the following steps: when a first instruction is received, acquiring a preset matrix; and converting the full-screen display window corresponding to the application by using the preset matrix to obtain a display area of the non-full-screen window of the application.
The converting the display window corresponding to the selected application by using the preset matrix to obtain the display area of the non-full screen window of the application comprises: reading the graph cache data of the application; converting the read graph cache data by using the preset matrix, and generating frame cache data corresponding to the touch display unit by using the graph cache data; and displaying the non-full screen window of the application on the touch display unit by utilizing the frame cache data.
The graphics cache data comprises two-dimensional coordinate information of each pixel point and Red, Green and Blue (RGB) three-color information of each pixel point.
Considering that there may be an overlapping area between the non-full-screen windows corresponding to two applications, as shown in fig. 2, in this embodiment the two-dimensional coordinates (x_o, y_o) of each pixel point identified in the graphics cache data of the non-full-screen window corresponding to the application are extended to three-dimensional coordinates (x_o, y_o, z_o), wherein different non-full-screen windows have different third-dimensional coordinates z_o; thus, different non-full-screen windows can be distinguished by their third-dimensional coordinates.
The preset matrix can be an identity matrix. The extended three-dimensional coordinates (x_o, y_o, z_o) in the graphics cache data are converted to obtain the non-full-screen window of the application, and the graphics cache data corresponding to the non-full-screen window comprises the converted (x_o, y_o, z_o) and the RGB information of the corresponding pixel points.
Taking the example of zooming out the full-screen display window by 1/2 to convert it into a non-full-screen window, the corresponding second conversion matrix is

    (1/2   0    0 )
    ( 0   1/2   0 )
    ( 0    0   1/2)

and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are as shown in formula (1):
    (x_t)   (1/2   0    0 )   (x_o)
    (y_t) = ( 0   1/2   0 ) × (y_o)        (1)
    (z_t)   ( 0    0   1/2)   (z_o)
Taking the non-full-screen window moving laterally by Δx and longitudinally by Δy as an example, the corresponding second conversion matrix is

    (1/2   0    Δx)
    ( 0   1/2   Δy)
    ( 0    0   1/2)

and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are as shown in formula (2):
    (x_t)   (1/2   0    Δx)   (x_o)
    (y_t) = ( 0   1/2   Δy) × (y_o)        (2)
    (z_t)   ( 0    0   1/2)   (z_o)
therefore, with the embodiment, when two or more non-full screen windows overlap, the application corresponding to the touch operation can be determined according to the current touch operation and the priority information of the non-full screen windows, and the application can respond to the touch operation. Therefore, the problem of operation confusion when a plurality of non-full screen windows are opened is avoided, and the use experience of a user is improved.
Preferably, the obtaining priority information corresponding to the at least two non-full screen windows, and determining a first application to respond to the first touch operation according to the priority information includes:
acquiring priority information corresponding to the at least two non-full screen windows, wherein the priority information represents the zooming operation priority and/or the time of the last interactive operation of the at least two non-full screen windows;
judging whether the first touch operation is a zooming operation; if so, determining, according to the zooming operation priorities corresponding to the at least two non-full-screen windows, the application with the higher zooming operation priority as the first application responding to the first touch operation;
if not, extracting the times of the last interactive operations corresponding to the at least two non-full-screen windows, comparing them, and determining the application whose last interactive operation is more recent as the first application responding to the first touch operation.
Preferably, the calculating, by using the first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, first operation information corresponding to the first touch event information includes:
converting the start coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information by using the first conversion parameter corresponding to the non-full-screen window of the first application;
and taking the converted initial coordinate of the operation touch point and the converted end coordinate of the operation touch point as first operation information corresponding to the first touch event information.
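The inverse-matrix conversion described above can be illustrated with NumPy (an assumed toolkit, not the disclosed implementation): inverting the second conversion matrix maps a touch point in the non-full-screen window back to the application's full-screen coordinates.

```python
import numpy as np

# Second conversion matrix: half scale plus a shift of (dx, dy) = (100, 50).
M2 = np.array([[0.5, 0.0, 100.0],
               [0.0, 0.5,  50.0],
               [0.0, 0.0,   0.5]])
M1 = np.linalg.inv(M2)                    # first conversion matrix

# A touch starting at screen point (300, 350) inside the window;
# z = 0.5 tags this window, matching its scaled third coordinate.
start = np.array([300.0, 350.0, 0.5])
full = M1 @ start                         # coordinates in the full-screen window
```

Because M2 shrank and shifted the window, M1 undoes both in one multiplication, recovering the point (400, 600, 1) that the application itself expects.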
For example, when the current mobile terminal displays the non-full-screen window 1 and the non-full-screen window 2 as shown in fig. 2, the first touch operation is a click operation, and the touch point is point A in the overlapping area, it is determined according to the priority information of the non-full-screen window 1 and the non-full-screen window 2 that the non-full-screen window 2 responds to the first touch operation.
For another example, when the current mobile terminal displays the non-full-screen window 1 and the non-full-screen window 2 shown in fig. 2B, and the first touch operation is determined to be a zooming operation according to the start and end positions of touch point A and touch point B, the zooming operation priorities of the non-full-screen window 1 and the non-full-screen window 2 are compared; if the zooming operation priority of the non-full-screen window 1 is higher, the application corresponding to the non-full-screen window 1 responds to the first touch operation. The responding non-full-screen window 1 is shown in fig. 2c, where the non-full-screen window 1 has been visibly zoomed. The application with the higher zooming operation priority may be one displaying a picture or a video, or the zooming operation priority may be set by the user.
Therefore, the operation object of the first touch event can be selected according to the priority information corresponding to the applied non-full screen window and the current first touch event, so that the non-full screen window to be operated can be determined simply according to the gesture, and the user experience is improved.
Example Five
The embodiment of the invention provides an information processing method which is applied to electronic equipment, wherein the electronic equipment is provided with a touch display unit and can be a mobile terminal, such as a smart phone or a tablet computer.
The electronic device can run a plurality of applications and display the applications in a display area of a touch display unit, and when N windows of the applications running in a non-full screen mode are opened, N is an integer greater than or equal to 2, as shown in FIG. 3, the method includes:
step 301: when the first touch operation is detected, analyzing to obtain first touch event information, and obtaining the position coordinate of the first touch operation.
Step 302: judging whether the first touch event information is located in an overlapping area of at least two non-full screen windows in the non-full screen windows corresponding to the N applications according to the position coordinates, if so, executing a step 303; otherwise, step 304 is performed.
Step 303: acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, calculating to obtain first operation information corresponding to the first touch event information by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, responding to the first touch operation by the first application based on the first operation information, and ending the processing flow.
Step 304: judging whether the first touch event information is located in a touch area of a non-full screen window of a first application, if so, calculating to obtain first operation information corresponding to the first touch event by using a first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, and responding to the first touch operation by the first application based on the first operation information; otherwise, the subsequent operation is performed according to the prior art.
Preferably, the first touch event information includes a number of touch points and position coordinates thereof, and the position coordinates include: the start coordinates of the operation touch point and the end coordinates of the operation touch point.
Preferably, the determining whether the first touch operation is located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications includes: checking, in the frame cache data currently stored in the touch display unit, whether the start coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information are located in an overlapping area of at least two of the non-full-screen windows corresponding to the N applications.
Preferably, the priority information may include: a zooming operation priority, and/or a time of a last interactive operation, of the at least two non-full-screen windows.
Preferably, the calculating, by using the first conversion matrix corresponding to the non-full screen window of the first application and the first touch event information, to obtain first operation information corresponding to the first touch operation includes:
converting the initial coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information by using a first conversion matrix corresponding to the non-full screen window of the first application;
and taking the converted initial coordinate of the operation touch point and the converted end coordinate of the operation touch point as first operation information corresponding to the first touch operation.
Preferably, the obtaining priority information corresponding to the at least two non-full screen windows, and determining a first application to respond to the first touch operation according to the priority information includes:
acquiring priority information corresponding to the at least two non-full screen windows, wherein the priority information represents the zooming operation priority and/or the time of the last interactive operation of the at least two non-full screen windows;
judging whether the first touch operation is a zooming operation; if so, determining, according to the zooming operation priorities corresponding to the at least two non-full-screen windows, the application with the higher zooming operation priority as the first application responding to the first touch operation;
if not, extracting the times of the last interactive operations corresponding to the at least two non-full-screen windows, comparing them, and determining the application whose last interactive operation is more recent as the first application responding to the first touch operation.
Preferably, the responding, by the first application, to the first touch operation based on the first operation information includes:
determining the operation type of the first operation information according to the converted initial coordinate of the operation touch point and the converted end coordinate of the operation touch point in the first operation information;
when the operation type of the first operation information is click operation, judging whether the first operation information selects an event coordinate in the first application, if so, generating response information of the event coordinate, and otherwise, not responding;
and when the operation type of the first operation information is sliding operation, the first application responds according to the first operation information.
The determining the operation type of the first operation information according to the converted start coordinate of the operation touch point and the converted end coordinate of the operation touch point in the first operation information may be:
when the start coordinate of the operation touch point is the same as the end coordinate of the operation touch point, the operation type is a click operation;
when the start coordinate of the operation touch point differs from the end coordinate of the operation touch point and the distance between them is greater than a preset threshold, the operation type is a sliding operation;
and when the operation has multiple touch points and, for at least two operation touch points, the distance between the start coordinate and the end coordinate is greater than a preset threshold, the operation type is a zooming operation.
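A minimal sketch of this operation-type classification (the threshold value and all names below are hypothetical, chosen only for illustration):

```python
def classify(points, threshold=10.0):
    """points: one (start, end) coordinate pair per operation touch point."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    if len(points) >= 2:
        # Zoom: at least two touch points each moved beyond the threshold.
        if sum(dist(s, e) > threshold for s, e in points) >= 2:
            return 'zoom'
    if len(points) == 1:
        s, e = points[0]
        if s == e:
            return 'click'      # start and end coordinates coincide
        if dist(s, e) > threshold:
            return 'slide'      # single point moved beyond the threshold
    return 'unknown'
```

The thresholded distance check keeps small finger jitter from being misread as a slide or zoom.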
Preferably, the first conversion parameter comprises a first conversion matrix, wherein the first conversion matrix is an inverse matrix of the second conversion matrix corresponding to the non-full-screen window of the application. For example, when the second conversion matrix is the preset matrix preset by the system (an identity matrix), the first conversion matrix is obtained by inverting the second conversion matrix, that is:

    (1 0 0)^(-1)
    (0 1 0)
    (0 0 1)

which, for the identity matrix, is the identity matrix itself.
the second conversion matrix is used for converting the full-screen display window of the application into a non-full-screen window, and the generation method comprises the following steps: when a first instruction is received, acquiring a preset matrix; and converting the full-screen display window corresponding to the application by using the preset matrix to obtain a display area of the non-full-screen window of the application.
The converting the display window corresponding to the selected application by using the preset matrix to obtain the display area of the non-full screen window of the application comprises: reading the graph cache data of the application; converting the read graph cache data by using the preset matrix, and generating frame cache data corresponding to the touch display unit by using the graph cache data; and displaying the non-full screen window of the application on the touch display unit by utilizing the frame cache data.
The graphics cache data comprises two-dimensional coordinate information of each pixel point and Red, Green and Blue (RGB) three-color information of each pixel point.
Considering that there may be an overlapping area between the non-full-screen windows corresponding to two applications, as shown in fig. 2, in this embodiment the two-dimensional coordinates (x_o, y_o) of each pixel point identified in the graphics cache data of the non-full-screen window corresponding to the application are extended to three-dimensional coordinates (x_o, y_o, z_o), wherein different non-full-screen windows have different third-dimensional coordinates z_o; thus, different non-full-screen windows can be distinguished by their third-dimensional coordinates.
The preset matrix can be an identity matrix. The extended three-dimensional coordinates (x_o, y_o, z_o) in the graphics cache data are converted to obtain the non-full-screen window of the application, and the graphics cache data corresponding to the non-full-screen window comprises the converted (x_o, y_o, z_o) and the RGB information of the corresponding pixel points.
Taking the example of zooming out the full-screen display window by 1/2 to convert it into a non-full-screen window, the corresponding second conversion matrix is

    (1/2   0    0 )
    ( 0   1/2   0 )
    ( 0    0   1/2)

and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are as shown in formula (1):
    (x_t)   (1/2   0    0 )   (x_o)
    (y_t) = ( 0   1/2   0 ) × (y_o)        (1)
    (z_t)   ( 0    0   1/2)   (z_o)
Taking the non-full-screen window moving laterally by Δx and longitudinally by Δy as an example, the corresponding second conversion matrix is

    (1/2   0    Δx)
    ( 0   1/2   Δy)
    ( 0    0   1/2)

and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full-screen window are as shown in formula (2):
    (x_t)   (1/2   0    Δx)   (x_o)
    (y_t) = ( 0   1/2   Δy) × (y_o)        (2)
    (z_t)   ( 0    0   1/2)   (z_o)
therefore, with the embodiment, when two or more non-full screen windows overlap, the application corresponding to the touch operation can be determined according to the current touch operation and the priority information of the non-full screen windows, and the application can respond to the touch operation. Therefore, the problem of operation confusion when a plurality of non-full screen windows are opened is avoided, and the use experience of a user is improved.
For example, when the current mobile terminal displays a non-full screen window 1 and a non-full screen window 2 as shown in fig. 2, and the first touch operation is a click operation whose touch point is a point A in the overlapping area, it is determined according to the priority information of the non-full screen window 1 and the non-full screen window 2 that the non-full screen window 2 responds to the first touch operation.
For another example, when the current mobile terminal displays a non-full screen window 1 and a non-full screen window 2 as shown in fig. 2B, and the first touch operation is determined to be a zoom operation according to the start and end positions of a touch point A and a touch point B, the zoom operation priorities of the non-full screen window 1 and the non-full screen window 2 are compared; if the zoom operation priority of the non-full screen window 1 is higher, the application corresponding to the non-full screen window 1 responds to the first touch operation. The non-full screen window 1 after the response is shown in fig. 2c, where the non-full screen window 1 has been visibly zoomed. Here, the application having the higher zoom operation priority may be a picture or video application, or the zoom operation priority may be set by the user.
Therefore, the operation object of the first touch event can be selected according to the priority information corresponding to the applied non-full screen window and the current first touch event, so that the non-full screen window to be operated can be determined simply according to the gesture, and the user experience is improved.
Example six,
An embodiment of the present invention provides an electronic device, as shown in fig. 4, the electronic device includes: a touch display unit 41 and a processing unit 42; wherein,
the touch display unit 41 is configured to open N windows operating in a non-full screen mode, where N is an integer greater than or equal to 2; when a first touch operation is detected, analyzing to obtain the first touch event information, obtaining a position coordinate of the first touch operation, and sending the position coordinate to the processing unit 42;
the processing unit 42 is configured to determine, according to the position coordinates, whether the first touch event information is located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications; if so, acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, and calculating to obtain first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, wherein the first application responds to the first touch operation based on the first operation information.
Preferably, the first touch event information includes a number of touch points and position coordinates thereof, and the position coordinates include: the start coordinates of the operation touch point and the end coordinates of the operation touch point of each touch point.
Preferably, the processing unit 42 is specifically configured to, according to the start coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information, check whether the start coordinate of the operation touch point and the end coordinate of the operation touch point are located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications from the frame buffer data stored in the touch display unit 41.
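The overlap check just described can be sketched as follows; the rectangle bookkeeping below stands in for the stored frame cache data, and all names are illustrative assumptions rather than the patent's implementation:

```python
# Hypothetical sketch: decide whether a touch's start and end coordinates fall
# in an area covered by at least two non-full-screen windows.

def windows_at(point, rects):
    """IDs of windows whose rectangle (left, top, width, height) contains point."""
    x, y = point
    return [wid for wid, (left, top, w, h) in rects.items()
            if left <= x < left + w and top <= y < top + h]

def in_overlap_area(start, end, rects):
    """True when both coordinates lie where two or more windows overlap."""
    return len(windows_at(start, rects)) >= 2 and len(windows_at(end, rects)) >= 2
```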
Preferably, the priority information may include: a zoom operation priority, and/or a time of a last interactive operation of the two non-full screen windows.
Preferably, the first conversion matrix is an inverse matrix of a second conversion matrix corresponding to the non-full screen window of the application;
the processing unit 42 is specifically configured to, when receiving the first instruction, obtain a preset matrix; and converting the full-screen display window corresponding to the application by using the preset matrix to obtain a display area of the non-full-screen window of the application.
The processing unit 42 is specifically configured to read the graphics cache data of the application; converting the read graph cache data by using the preset matrix, and generating frame cache data corresponding to the touch display unit by using the graph cache data; and displaying the non-full screen window of the application on the touch display unit by utilizing the frame cache data.
The graph cache data comprises coordinate information of each pixel point and Red, Green and Blue (RGB) information of each pixel point.
Considering that there may be an overlapping area between the non-full screen windows corresponding to two applications, as shown in fig. 2, in this embodiment the two-dimensional coordinates (x_o, y_o) identifying each pixel point in the graphics cache data of the non-full screen window corresponding to the application are extended to three-dimensional coordinates (x_o, y_o, z_o); different non-full screen windows have different third-dimension coordinates z_o, so that different non-full screen windows can be distinguished by their third-dimension coordinates.
The preset matrix can be an identity matrix; the extended three-dimensional coordinates (x_o, y_o, z_o) in the graphics cache data are converted to obtain the non-full screen window of the application, where the graphics cache data corresponding to the non-full screen window comprises the converted (x_o, y_o, z_o) and the RGB information of the corresponding pixel points.
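A minimal sketch of this depth extension (the function names are assumptions, not the patent's):

```python
# Each window's 2-D pixels gain that window's depth z_o, so overlapping
# windows remain distinguishable in the composed frame cache data.

def extend_to_3d(pixels, z_o):
    """pixels: iterable of (x_o, y_o) pairs; returns [(x_o, y_o, z_o), ...]."""
    return [(x, y, z_o) for (x, y) in pixels]

def window_of(point3d, depth_to_window):
    """Recover the owning window of a buffered pixel from its third coordinate."""
    return depth_to_window[point3d[2]]
```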
Taking the example of zooming out the full-screen display window by 1/2 to convert it into a non-full screen window, the corresponding second conversion matrix is the diagonal matrix diag(1/2, 1/2, 1/2), and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full screen window are as shown in formula (1):
<math><mrow> <mrow> <mo>(</mo> <msub> <mi>x</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>y</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>z</mi> <mi>t</mi> </msub> <mo>)</mo> </mrow> <mo>=</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> </mtr> </mtable> </mfenced> <mo>&times;</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <msub> <mi>x</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>y</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>z</mi> <mi>o</mi> </msub> </mtd> </mtr> </mtable> </mfenced> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow></math>
taking the non-full screen window moving laterally by Δx and moving longitudinally by Δy as an example, the corresponding second transformation matrix is <math><mrow> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mi>&Delta;x</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mi>&Delta;y</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> </mtr> </mtable> </mfenced> <mo>,</mo> </mrow></math> and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full screen window are as shown in formula (2),
<math><mrow> <mrow> <mo>(</mo> <msub> <mi>x</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>y</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>z</mi> <mi>t</mi> </msub> <mo>)</mo> </mrow> <mo>=</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mi>&Delta;x</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mi>&Delta;y</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> </mtr> </mtable> </mfenced> <mo>&times;</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <msub> <mi>x</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>y</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>z</mi> <mi>o</mi> </msub> </mtd> </mtr> </mtable> </mfenced> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>2</mn> <mo>)</mo> </mrow> </mrow></math>
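Since the first conversion matrix is the inverse of the second conversion matrix, mapping an on-screen touch coordinate back into the application's full-screen coordinate space can be sketched as follows (a general 3×3 inversion; the helper names and sample values are assumptions):

```python
# Illustrative: invert the second conversion matrix of formula (2) and use it
# to map a displayed touch coordinate back to full-screen coordinates.

def invert3(m):
    """Inverse of a 3x3 matrix via the adjugate."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
            [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
            [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

# Second conversion matrix of formula (2) with dx = 10, dy = 20:
second = [[0.5, 0.0, 10.0], [0.0, 0.5, 20.0], [0.0, 0.0, 0.5]]
first = invert3(second)  # the first conversion matrix

# The displayed pixel (60, 120, 0.5) maps back to full-screen (100, 200, 1):
print(mat_vec(first, (60.0, 120.0, 0.5)))  # (100.0, 200.0, 1.0)
```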
for example, when the current mobile terminal displays a non-full screen window 1 and a non-full screen window 2 as shown in fig. 2, and the first touch operation is a click operation whose touch point is a point A in the overlapping area, it is determined according to the priority information of the non-full screen window 1 and the non-full screen window 2 that the non-full screen window 2 responds to the first touch operation.
Therefore, with the embodiment, when two or more non-full screen windows overlap, the application corresponding to the touch operation can be determined according to the current touch operation and the priority information of the non-full screen windows, and the application can respond to the touch operation. Therefore, the problem of operation confusion when a plurality of non-full screen windows are opened is avoided, and the use experience of a user is improved.
Example seven,
An embodiment of the present invention provides an electronic device, as shown in fig. 4, the electronic device includes: a touch display unit 41 and a processing unit 42; wherein,
the touch display unit 41 is configured to open N windows operating in a non-full screen mode, where N is an integer greater than or equal to 2; when a first touch operation is detected, analyze to obtain first touch event information, obtain a position coordinate of the first touch operation, and send the position coordinate to the processing unit 42;
the processing unit 42 is configured to determine, according to the position coordinates, whether the first touch event information is located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications; if so, acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, and calculating to obtain first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, wherein the first application responds to the first touch operation based on the first operation information.
Preferably, the first touch event information includes a number of touch points and position coordinates thereof, and the position coordinates include: the start coordinates of the operation touch point and the end coordinates of the operation touch point of each touch point.
Preferably, the processing unit 42 is specifically configured to, according to the start coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information, check whether the start coordinate of the operation touch point and the end coordinate of the operation touch point are located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications from the frame buffer data stored in the touch display unit 41.
Preferably, the priority information may include: a zoom operation priority, and/or a time of a last interactive operation of the two non-full screen windows.
Preferably, the processing unit 42 is further configured to determine, according to the position coordinate, whether the first touch event information is located in an overlapping area of at least two non-full-screen windows in non-full-screen windows corresponding to the N applications, if not, determine whether the first touch event is located in a touch area of a non-full-screen window of a first application, if so, calculate first operation information corresponding to the first touch operation by using a first conversion parameter of the first application and the first touch event, where the first application responds according to the first operation information; otherwise, the subsequent operation is performed according to the prior art.
Preferably, the first touch event information includes a number of touch points and position coordinates thereof, and the position coordinates include: the start coordinates of the operation touch point and the end coordinates of the operation touch point of each touch point.
Preferably, the processing unit 42 is specifically configured to, according to the start coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event, check whether the start coordinate of the operation touch point and the end coordinate of the operation touch point are located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications from the frame buffer data stored in the touch display unit 41.
Preferably, the priority information may include: a zoom operation priority, and/or a time of a last interactive operation of the two non-full screen windows.
Preferably, the processing unit 42 is specifically configured to obtain priority information corresponding to the at least two non-full-screen windows, where the priority information represents a zoom operation priority and/or a last time of an interactive operation of the at least two non-full-screen windows;
judging whether the first touch operation is zooming operation, if so, determining an application with a high zooming operation priority as a first application responding to the first touch operation according to the zooming operation priorities corresponding to the at least two non-full screen windows;
if not, extracting the time of the last interactive operation corresponding to the at least two non-full screen windows, comparing the time of the last interactive operation, and determining the application with the later time of the last interactive operation as the first application responding to the first touch operation.
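The two-step selection rule above can be sketched as follows; the tuple layout is an assumption for illustration, not the patent's data structure:

```python
# Pick the first application among overlapping windows: a zoom gesture goes to
# the window with the higher zoom operation priority; any other gesture goes to
# the window interacted with most recently.

def pick_first_application(windows, is_zoom_operation):
    """windows: list of (app_id, zoom_priority, last_interaction_time) tuples."""
    if is_zoom_operation:
        return max(windows, key=lambda w: w[1])[0]
    return max(windows, key=lambda w: w[2])[0]
```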
The processing unit 42 is specifically configured to determine that the operation type is a click operation when the start coordinate of the operation touch point is the same as the end coordinate of the operation touch point; when the initial coordinate of the operation touch point is different from the end coordinate of the operation touch point and the distance is greater than a preset threshold, determining that the operation type is sliding operation; and when the operation has multiple touch points, determining the operation type to be zoom when the distance between the initial coordinate of the operation touch point of each touch point and the end coordinate of the operation touch point is greater than a preset threshold.
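The operation-type rules just described can be sketched as below; the distance threshold value is an assumed placeholder for the preset threshold:

```python
# Operation-type rules: identical start/end -> click; a single point moved
# beyond the preset threshold -> slide; multiple points all moved beyond it
# -> zoom.

THRESHOLD = 10.0  # preset distance threshold (assumed value)

def distance(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def classify(touch_points):
    """touch_points: list of (start, end) coordinate pairs, one per touch point."""
    if len(touch_points) >= 2 and all(distance(s, e) > THRESHOLD
                                      for s, e in touch_points):
        return "zoom"
    start, end = touch_points[0]
    if start == end:
        return "click"
    if distance(start, end) > THRESHOLD:
        return "slide"
    return "undetermined"
```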
Preferably, the processing unit 42 is specifically configured to convert, by using a first conversion matrix corresponding to the non-full screen window of the first application, the start coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information; and take the converted start coordinate of the operation touch point and the converted end coordinate of the operation touch point as the first operation information corresponding to the first touch event.
Preferably, the processing unit 42 is specifically configured to determine the operation type of the first operation information according to the converted start coordinate of the operation touch point and the converted end coordinate of the operation touch point in the first operation information; when the operation type of the first operation information is click operation, judging whether the first operation information selects an event coordinate in the first application, and if so, generating response information of the event coordinate; and when the operation type of the first operation information is sliding operation, the first application responds according to the first operation information.
Preferably, the first conversion matrix is an inverse matrix of a second conversion matrix corresponding to the non-full screen window of the application; for example, when the second conversion matrix is a preset matrix preset by the system, such as the 3×3 identity matrix, the first conversion matrix is calculated by inverting the second conversion matrix, that is, the inverse of the identity matrix (which is the identity matrix itself).
the processing unit 42 is further configured to generate the non-full screen window, and specifically, when a first instruction is received, obtain a preset matrix; and converting the full-screen display window corresponding to the application by using the preset matrix to obtain a display area of the non-full-screen window of the application.
The processing unit 42 is further configured to read graphics cache data of the application; converting the read graph cache data by using the preset matrix, and generating frame cache data corresponding to the touch display unit by using the graph cache data; and displaying the non-full screen window of the application on the touch display unit by utilizing the frame cache data.
The graph cache data comprises coordinate information of each pixel point and Red, Green and Blue (RGB) information of each pixel point.
Considering that there may be an overlapping area between the non-full screen windows corresponding to two applications, as shown in fig. 2, in this embodiment the two-dimensional coordinates (x_o, y_o) identifying each pixel point in the graphics cache data of the non-full screen window corresponding to the application are extended to three-dimensional coordinates (x_o, y_o, z_o); different non-full screen windows have different third-dimension coordinates z_o, so that different non-full screen windows can be distinguished by their third-dimension coordinates.
The preset matrix can be an identity matrix; the extended three-dimensional coordinates (x_o, y_o, z_o) in the graphics cache data are converted to obtain the non-full screen window of the application, where the graphics cache data corresponding to the non-full screen window comprises the converted (x_o, y_o, z_o) and the RGB information of the corresponding pixel points.
Taking the example of zooming out the full-screen display window by 1/2 to convert it into a non-full screen window, the corresponding second conversion matrix is the diagonal matrix diag(1/2, 1/2, 1/2), and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full screen window are as shown in formula (1):
<math><mrow> <mrow> <mo>(</mo> <msub> <mi>x</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>y</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>z</mi> <mi>t</mi> </msub> <mo>)</mo> </mrow> <mo>=</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> </mtr> </mtable> </mfenced> <mo>&times;</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <msub> <mi>x</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>y</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>z</mi> <mi>o</mi> </msub> </mtd> </mtr> </mtable> </mfenced> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow></math>
taking the non-full screen window moving laterally by Δx and moving longitudinally by Δy as an example, the corresponding second conversion matrix is <math><mrow> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mi>&Delta;x</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mi>&Delta;y</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> </mtr> </mtable> </mfenced> <mo>,</mo> </mrow></math> and the three-dimensional coordinates (x_t, y_t, z_t) of each pixel point in the frame cache data corresponding to the non-full screen window are as shown in formula (2),
<math><mrow> <mrow> <mo>(</mo> <msub> <mi>x</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>y</mi> <mi>t</mi> </msub> <mo>,</mo> <msub> <mi>z</mi> <mi>t</mi> </msub> <mo>)</mo> </mrow> <mo>=</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mi>&Delta;x</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> <mtd> <mi>&Delta;y</mi> </mtd> </mtr> <mtr> <mtd> <mn>0</mn> </mtd> <mtd> <mn>0</mn> </mtd> <mtd> <mn>1</mn> <mo>/</mo> <mn>2</mn> </mtd> </mtr> </mtable> </mfenced> <mo>&times;</mo> <mfenced open='(' close=')'> <mtable> <mtr> <mtd> <msub> <mi>x</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>y</mi> <mi>o</mi> </msub> </mtd> </mtr> <mtr> <mtd> <msub> <mi>z</mi> <mi>o</mi> </msub> </mtd> </mtr> </mtable> </mfenced> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>2</mn> <mo>)</mo> </mrow> </mrow></math>
therefore, the operation object of the first touch event can be selected according to the priority information corresponding to the applied non-full screen window and the current first touch event, so that the non-full screen window to be operated can be determined simply according to the gesture, and the user experience is improved.
For example, when the current mobile terminal displays a non-full screen window 1 and a non-full screen window 2 as shown in fig. 2, and the first touch operation is a click operation whose touch point is a point A in the overlapping area, it is determined according to the priority information of the non-full screen window 1 and the non-full screen window 2 that the non-full screen window 2 responds to the first touch operation.
For another example, when the current mobile terminal displays a non-full screen window 1 and a non-full screen window 2 as shown in fig. 2B, and the first touch operation is determined to be a zoom operation according to the start and end positions of a touch point A and a touch point B, the zoom operation priorities of the non-full screen window 1 and the non-full screen window 2 are compared; if the zoom operation priority of the non-full screen window 1 is higher, the application corresponding to the non-full screen window 1 responds to the first touch operation. The non-full screen window 1 after the response is shown in fig. 2c, where the non-full screen window 1 has been visibly zoomed. Here, the application having the higher zoom operation priority may be a picture or video application, or the zoom operation priority may be set by the user.
Therefore, with the embodiment, when two or more non-full screen windows overlap, the application corresponding to the touch operation can be determined according to the current touch operation and the priority information of the non-full screen windows, and the application can respond to the touch operation. Therefore, the problem of operation confusion when a plurality of non-full screen windows are opened is avoided, and the use experience of a user is improved.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may be separately used as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. An information processing method is applied to an electronic device, the electronic device is provided with a touch display unit, the electronic device can run a plurality of applications and displays the applications in a display area of the touch display unit, and when N windows of the applications running in a non-full screen mode are opened, N is an integer greater than or equal to 2, the method comprises the following steps:
when a first touch operation is detected, analyzing to obtain first touch event information, and obtaining a position coordinate of the first touch operation;
judging whether the first touch event information is located in an overlapping area of at least two non-full screen windows in the non-full screen windows corresponding to the N applications according to the position coordinates;
if so, acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, and calculating to obtain first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, wherein the first application responds to the first touch operation based on the first operation information.
2. The method according to claim 1, wherein the determining whether the first touch event information is located in an overlapping area of at least two non-full screen windows of the non-full screen windows corresponding to the N applications according to the position coordinates further comprises:
if not, judging whether the first touch event information is located in a touch area of a non-full screen window of a first application, if so, calculating to obtain first operation information corresponding to the first touch operation by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, and responding to the first touch operation by the first application based on the first operation information.
3. The method according to claim 2, wherein the obtaining priority information corresponding to the at least two non-full screen windows and determining, according to the priority information, a first application to respond to the first touch operation comprises:
acquiring priority information corresponding to the at least two non-full screen windows, wherein the priority information represents the zooming operation priority and/or the time of the last interactive operation of the at least two non-full screen windows;
judging whether the first touch operation is zooming operation, if so, determining an application with a high zooming operation priority as a first application responding to the first touch operation according to the zooming operation priorities corresponding to the at least two non-full screen windows;
if not, extracting the time of the last interactive operation corresponding to the at least two non-full screen windows, comparing the time of the last interactive operation, and determining the application with the later time of the last interactive operation as the first application responding to the first touch operation.
4. The method of claim 3, wherein calculating first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to a non-full screen window of the first application and the first touch event information comprises:
converting the initial coordinate of the operation touch point and the end coordinate of the operation touch point in the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application;
and taking the converted initial coordinate of the operation touch point and the converted end coordinate of the operation touch point as first operation information corresponding to the first touch operation.
5. The method of claim 4, wherein the first application responding to the first touch operation based on the first operation information comprises:
determining the operation type of the first operation information according to the converted initial coordinate of the operation touch point and the converted end coordinate of the operation touch point in the first operation information;
when the operation type of the first operation information is click operation, judging whether the first operation information selects an event coordinate in the first application, if so, generating response information of the event coordinate, and executing an event corresponding to the event coordinate according to the response information;
and when the operation type of the first operation information is sliding operation, the first application responds according to the first operation information.
6. The method according to any of claims 1-5, wherein the form of the first conversion parameter comprises at least one of: a conversion matrix, a parameter, or a parameter set.
7. An electronic device, the electronic device comprising: the touch control display unit and the processing unit; wherein,
the touch display unit is used for opening N windows which are operated in a non-full screen mode, wherein N is an integer greater than or equal to 2, and when a first touch operation is detected, first touch event information is obtained through analysis, and position coordinates of the first touch operation are obtained;
the processing unit is configured to determine, according to the position coordinates, whether the first touch event information is located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications; if so, acquiring priority information corresponding to the at least two non-full screen windows, determining a first application to respond to the first touch operation according to the priority information, and calculating to obtain first operation information corresponding to the first touch event information by using a first conversion parameter corresponding to the non-full screen window of the first application and the first touch event information, wherein the first application responds to the first touch operation based on the first operation information.
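The overlap test performed by the processing unit in claim 7 amounts to counting how many window rectangles contain the touch position; the rectangle representation below is an assumption for illustration.

```python
# Hedged sketch of the overlap test in claim 7: the position coordinates lie in
# an overlapping area exactly when at least two non-full-screen window
# rectangles contain them. The (left, top, width, height) layout is assumed.
def windows_hit(windows, x, y):
    """Return the windows whose (left, top, width, height) rectangle contains (x, y)."""
    return [win for win, (left, top, w, h) in windows
            if left <= x < left + w and top <= y < top + h]

def in_overlap_area(windows, x, y):
    """True when the touch position lies inside at least two window rectangles."""
    return len(windows_hit(windows, x, y)) >= 2
```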
8. The electronic device of claim 7,
the processing unit is further configured to determine, according to the position coordinate, whether the first touch event information is located in an overlapping area of at least two non-full-screen windows in the non-full-screen windows corresponding to the N applications, if not, determine whether the first touch event information is located in a touch area of a non-full-screen window of a first application, if so, calculate first operation information corresponding to the first touch operation by using a first conversion parameter corresponding to the non-full-screen window of the first application and the first touch event information, where the first application responds to the first touch operation based on the first operation information.
9. The electronic device of claim 8,
the processing unit is specifically configured to acquire priority information corresponding to the at least two non-full-screen windows, where the priority information represents a zoom operation priority and/or a last interactive operation time of the at least two non-full-screen windows;
judging whether the first touch operation is zooming operation, if so, determining an application with a high zooming operation priority as a first application responding to the first touch operation according to the zooming operation priorities corresponding to the at least two non-full screen windows;
if not, extracting the times of the last interactive operations corresponding to the at least two non-full screen windows, comparing the times, and determining the application whose last interactive operation is more recent as the first application responding to the first touch operation.
10. The electronic device of claim 9,
the processing unit is specifically configured to convert, by using a first conversion parameter corresponding to a non-full screen window of the first application, a start coordinate of an operation touch point and an end coordinate of the operation touch point in the first touch event information; and taking the converted initial coordinate of the operation touch point and the converted end coordinate of the operation touch point as first operation information corresponding to the first touch operation.
11. The electronic device of claim 10,
the processing unit is specifically configured to determine an operation type of the first operation information according to the converted start coordinate of the operation touch point and the converted end coordinate of the operation touch point in the first operation information;
when the operation type of the first operation information is a click operation, judging whether the first operation information selects an event coordinate in the first application, if so, generating first response information for the event coordinate, and executing the event corresponding to the event coordinate according to the first response information;
and when the operation type of the first operation information is a sliding operation, the first application responds according to the first operation information.
12. The electronic device of any of claims 7-11, wherein a form of the first conversion parameter comprises at least one of: a conversion matrix, a parameter, or a parameter set.
CN201310516854.0A 2013-10-28 2013-10-28 Information processing method and electronic equipment Active CN104571904B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310516854.0A CN104571904B (en) 2013-10-28 2013-10-28 Information processing method and electronic equipment
US14/229,917 US20150121301A1 (en) 2013-10-28 2014-03-30 Information processing method and electronic device

Publications (2)

Publication Number Publication Date
CN104571904A true CN104571904A (en) 2015-04-29
CN104571904B CN104571904B (en) 2018-08-10

Family

ID=53088105

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106325720A (en) * 2015-06-16 2017-01-11 联想(北京)有限公司 Information processing method and information processing device
CN107402700A (en) * 2017-06-21 2017-11-28 北京小度信息科技有限公司 Page display method and device
CN107562346A (en) * 2017-09-06 2018-01-09 广东欧珀移动通信有限公司 Terminal control method, device, terminal and computer-readable recording medium
CN110673784A (en) * 2019-09-24 2020-01-10 华勤通讯技术有限公司 Single-hand operation method and device of large-screen intelligent equipment and intelligent equipment
CN111190532A (en) * 2019-12-31 2020-05-22 北京奇才天下科技有限公司 Interaction method and device based on gesture recognition and electronic equipment
CN112068742A (en) * 2020-04-28 2020-12-11 北京字节跳动网络技术有限公司 Method, device, terminal and storage medium for controlling application window

Citations (5)

Publication number Priority date Publication date Assignee Title
CN1959349A (en) * 2005-10-31 2007-05-09 株式会社电装 Displaying device
CN101441559A (en) * 2007-11-19 2009-05-27 盛趣信息技术(上海)有限公司 Method and system for implementing window local modal in game
EP2523129A1 (en) * 2011-05-11 2012-11-14 Dassault Systèmes Selection of a manipulator of an object among a plurality of manipulators
CN102968243A (en) * 2012-09-29 2013-03-13 顾晶 Method, device and equipment for displaying multiple application windows on mobile terminal
CN103365525A (en) * 2012-03-28 2013-10-23 百度在线网络技术(北京)有限公司 Mobile terminal and multi-window displaying method for mobile terminal

Similar Documents

Publication Publication Date Title
CN104571904B (en) Information processing method and electronic equipment
CN105912091B (en) Electronic device and method for reducing power consumption thereof
EP2950193A1 (en) Electronic device with foldable display and method of operating the same
KR20150136440A (en) Method for controlling display and electronic device supporting the same
US10789914B2 (en) Computer system, screen sharing method, and program
US9804762B2 (en) Method of displaying for user interface effect and electronic device thereof
KR20160005609A (en) Method for displaying graphic user interface and electronic device supporting the same
CN108259810A (en) Method of video calling, device, and computer storage medium
CN109298909B (en) Window adjusting method, mobile terminal and computer readable storage medium
CN104615336A (en) Information processing method and electronic equipment
US20150121301A1 (en) Information processing method and electronic device
CN113359995B (en) Man-machine interaction method, device, equipment and storage medium
US20150121270A1 (en) Information processing method and electronic device
CN104267931A (en) Electronic equipment and method for processing information
KR102183397B1 (en) Image processing method and electronic device implementing the same
KR20150110032A (en) Electronic Apparatus and Method for Image Data Processing
CN104123062B (en) A kind of information processing method and electronic equipment
CN103870115B (en) Information processing method and electronic equipment
CN115665314B (en) Screen display method, device, terminal and computer readable storage medium
CN104571791A (en) Information processing method and electronic equipment
CN104571796A (en) Information processing method and electronic equipment
CN104571844B (en) Information processing method and electronic equipment
KR20150140012A (en) Method for displaying screen and electronic device implementing the same
CN105320421B (en) Message display method, device and terminal
CN112363787A (en) Image processing method and device and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant