CN106095303B - Application program operation method and device - Google Patents

Application program operation method and device

Info

Publication number
CN106095303B
CN106095303B (application CN201610377935.0A)
Authority
CN
China
Prior art keywords
touch
application program
area
point
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610377935.0A
Other languages
Chinese (zh)
Other versions
CN106095303A
Inventor
周奇
钱伟强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yibin bond China smart technology Co.,Ltd.
Original Assignee
周奇
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 周奇
Priority to CN201610377935.0A
Publication of CN106095303A
Application granted
Publication of CN106095303B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning

Abstract

The invention discloses an application program operation method and device. The method comprises the following steps: identifying whether a touch gesture input on an intelligent terminal slides up and down, or slides down and up, in a preset application program touch identification area, wherein each application program touch identification area corresponds to one application program; and, when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in an application program touch recognition area, executing a corresponding operation on the application program corresponding to that area. The technical scheme provided by the invention can effectively reduce the probability of misoperation on an application program.

Description

Application program operation method and device
Technical Field
The invention relates to the technical field of intelligent terminals, in particular to an application program operation method and device.
Background
With the development of science and technology, smart terminals (such as smart phones and tablet computers) have become more and more powerful, and thousands of application programs have been developed for users. Application programs are an indispensable part of a smart terminal: after an application program is installed on the smart terminal, the user can use it to realize the corresponding functions (such as office work, chat, video, and games).
Because current smart terminals generally support touch control, the application programs installed on a smart terminal are mostly operated by touch (for example, to start an application program). However, the existing way of operating these application programs is too limited: a user can only operate an application program by clicking or long-pressing its icon. In addition, because the touched point is fixed during a click or long press, carelessly clicking or long-pressing an application program icon easily causes misoperation of the corresponding application program.
Disclosure of Invention
The invention provides an application program operation method and device, which are used for reducing the probability of misoperation on an application program.
The invention provides an application program operation method in a first aspect, which comprises the following steps:
identifying whether a touch gesture input on the intelligent terminal is sliding up and down or sliding down and up in a preset application program touch identification area, wherein each application program touch identification area corresponds to an application program;
and when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in the application program touch recognition area, executing a corresponding operation on the application program corresponding to the application program touch recognition area.
A second aspect of the present invention provides an application operating apparatus, including:
the touch gesture recognition unit is used for recognizing whether a touch gesture input on the intelligent terminal slides up and down or slides up and down in a preset application program touch recognition area, wherein each application program touch recognition area corresponds to one application program;
and the control unit is used for executing corresponding operation on the application program corresponding to the application program touch recognition area when the touch gesture recognition unit recognizes that the touch gesture input on the intelligent terminal is sliding up and down or sliding down and up in the preset application program touch recognition area.
Therefore, according to the scheme of the invention, an application program touch identification area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in an application program touch identification area, the corresponding operation is executed on the application program corresponding to that area. Compared with clicking or long-pressing a fixed touch point, such sliding gestures are difficult to trigger accidentally, so the scheme can effectively reduce the probability of misoperation on an application program.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1-a is a flowchart illustrating an embodiment of a method for operating an application according to the present invention;
FIG. 1-b is a schematic diagram illustrating an interface of an intelligent terminal and a touch recognition area of an application program in an application scenario;
FIG. 1-c is a schematic diagram illustrating an interface of an intelligent terminal and an application touch recognition area in another application scenario;
FIG. 2-a is a schematic flow chart diagram illustrating a method for operating an application according to another embodiment of the present invention;
FIG. 2-b is a schematic view of an x-y coordinate system provided by the present invention;
FIG. 2-c is an enlarged view of region A1 in an application scenario;
FIG. 3-a is a flowchart illustrating a method for operating an application according to yet another embodiment of the present invention;
FIG. 3-b is an enlarged view of region A1 in another application scenario;
FIG. 4 is a schematic structural diagram of an application operating device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides an application program operation method, which comprises the following steps: recognizing a touch gesture input on the intelligent terminal; and, when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in a preset application program touch recognition area, executing a corresponding operation on the application program corresponding to that area, wherein each application program touch recognition area corresponds to one application program. The embodiment of the invention also provides a corresponding application program operating device. Both are described in detail below.
Example one
An embodiment of the present invention provides an application program operating method, as shown in fig. 1-a, the application program operating method in the embodiment of the present invention includes:
step 101, identifying whether a touch gesture input on the intelligent terminal is sliding up and down or sliding down and up in a preset application program touch identification area;
wherein each application touch recognition area corresponds to an application.
In the embodiment of the invention, a corresponding application program touch identification area is preset for each application program installed on the intelligent terminal, and whether a touch gesture input on the intelligent terminal slides up and down or down and up in a preset application program touch identification area is identified by detecting the position information of each touch point triggered (namely touched) during the touch process. Each touch process lasts from the moment a touch point on the touch screen of the intelligent terminal is first triggered until the moment all touch points on the touch screen are released.
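As an illustration only (not part of the disclosed method), the touch-process bookkeeping just described could be sketched as follows on an Android-style smart terminal. GestureBuffer is a hypothetical name, and the sketch deliberately ignores multi-touch pointer indices; it only collects the (x, y) positions of the triggered touch points between the first finger-down and the release of all touch points.

```kotlin
import android.view.MotionEvent

// Collects the touch points of one touch process: from the first touch-down
// until all touch points on the screen are released.
class GestureBuffer {
    private val points = mutableListOf<Pair<Float, Float>>()  // (x, y) in screen coordinates

    // Returns the completed gesture when the touch process ends, else null.
    fun onTouchEvent(event: MotionEvent): List<Pair<Float, Float>>? {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {              // touch process starts
                points.clear()
                points.add(event.x to event.y)
            }
            MotionEvent.ACTION_MOVE -> points.add(event.x to event.y)
            MotionEvent.ACTION_UP -> {                // all touch points released
                points.add(event.x to event.y)
                return points.toList()
            }
        }
        return null
    }
}
```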
Because sliding up and down, or down and up, within a small area is somewhat difficult, the upper and lower boundary ranges of an application program's click area can be expanded, and the expanded area used as the application program touch recognition area, to make the gesture easier for the user to perform. Optionally, the left and right boundaries of the application touch recognition area are the same as those of the corresponding application's click area, while the upper and lower boundary ranges of the touch recognition area are larger than those of the click area. For example, when presetting the touch recognition area for application 1, the left and right boundaries of the touch recognition area of application 1 are set to the left and right boundaries of the click area of application 1, and the upper and lower boundary ranges of the touch recognition area of application 1 are set larger than the upper and lower boundary ranges of the click area of application 1. An application click area is the range within which the corresponding application receives a click trigger (that is, a user performs a click operation in a certain application click area to trigger the application corresponding to that click area). Specifically, the upper and lower boundary ranges of the touch recognition area may be 40% larger than those of the corresponding click area, though other values may of course be set. It should be noted that this expansion ratio cannot be too large, otherwise the expanded area may conflict with other operations (for example, the operation of swiping the screen upwards).
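A minimal sketch of how the expanded recognition area might be derived from an application's click area follows. It assumes the 40% vertical expansion mentioned above is split evenly between the upper and lower boundaries, which the text does not fix; RectArea and recognitionAreaFor are illustrative names, and y coordinates are taken to increase downward as in the later embodiments.

```kotlin
// A simple axis-aligned rectangle in screen coordinates (y grows downward).
data class RectArea(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Derive a touch recognition area from a click area: left/right boundaries
// unchanged, vertical extent enlarged by expandRatio (0.4 = 40% total).
fun recognitionAreaFor(clickArea: RectArea, expandRatio: Float = 0.4f): RectArea {
    val extra = (clickArea.bottom - clickArea.top) * expandRatio / 2  // split top/bottom
    return RectArea(
        left = clickArea.left,
        top = clickArea.top - extra,        // raise the upper boundary
        right = clickArea.right,
        bottom = clickArea.bottom + extra   // lower the lower boundary
    )
}
```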
Optionally, when it is recognized that the touch gesture input on the smart terminal meets the first condition, it is determined that the touch gesture slides up and down in a preset application program touch recognition area; otherwise, it is determined that the touch gesture does not slide up and down in a preset application program touch recognition area. The first condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the lower half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application program touch recognition area.
Optionally, when it is recognized that the touch gesture input on the smart terminal meets the second condition, it is determined that the touch gesture slides down and up in a preset application program touch recognition area; otherwise, it is determined that the touch gesture does not slide down and up in a preset application program touch recognition area. The second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area.
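The two conditions can be summarized in code. The sketch below, reusing the RectArea type from the sketch above, assumes the screen convention used later in this document (y coordinates increase from top to bottom), so the "lower half" is the half with the larger y values; all function names are illustrative.

```kotlin
fun inArea(p: Pair<Float, Float>, a: RectArea) =
    p.first in a.left..a.right && p.second in a.top..a.bottom

// y grows downward: the lower half area has the larger y values.
fun inLowerHalf(p: Pair<Float, Float>, a: RectArea) = p.second >= (a.top + a.bottom) / 2
fun inUpperHalf(p: Pair<Float, Float>, a: RectArea) = p.second < (a.top + a.bottom) / 2

// First condition: every point in the area, first and last in the lower half,
// at least one point in the upper half (slide up and then back down).
fun isUpDownSlide(points: List<Pair<Float, Float>>, a: RectArea) =
    points.isNotEmpty() &&
    points.all { inArea(it, a) } &&
    inLowerHalf(points.first(), a) && inLowerHalf(points.last(), a) &&
    points.any { inUpperHalf(it, a) }

// Second condition: the mirror image (slide down and then back up).
fun isDownUpSlide(points: List<Pair<Float, Float>>, a: RectArea) =
    points.isNotEmpty() &&
    points.all { inArea(it, a) } &&
    inUpperHalf(points.first(), a) && inUpperHalf(points.last(), a) &&
    points.any { inLowerHalf(it, a) }
```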
Step 102, when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in the application program touch recognition area, executing a corresponding operation on the application program corresponding to the application program touch recognition area;
in the embodiment of the present invention, a corresponding operation is set in advance for a touch gesture sliding up and down or sliding down and up in an application touch recognition area, so that when the touch gesture input on the smart terminal is recognized in step 101 as sliding up and down or sliding down and up in a preset application touch recognition area, a corresponding operation is performed on an application corresponding to the corresponding application touch recognition area.
Specifically, the area where the icon of an application is located may be set as that application's touch recognition area. For example, fig. 1-b is a schematic diagram of the interface and application touch recognition areas of an intelligent terminal in one application scenario, in which icon 1 to icon 6 are the icons of 6 applications installed on the intelligent terminal (for convenience of description, these 6 applications are referred to as application 1 to application 6, respectively). As shown in fig. 1-b, area A1 where icon 1 is located may be preset as the touch recognition area corresponding to application 1, area A2 where icon 2 is located as the touch recognition area corresponding to application 2, area A3 where icon 3 is located as the touch recognition area corresponding to application 3, area A4 where icon 4 is located as the touch recognition area corresponding to application 4, area A5 where icon 5 is located as the touch recognition area corresponding to application 5, and area A6 where icon 6 is located as the touch recognition area corresponding to application 6. It should be noted that in fig. 1-b blank areas are left between the application touch recognition areas (i.e., they are not connected to each other), while in other embodiments the touch recognition areas corresponding to adjacent icons may be connected: as shown in fig. 1-c, area A1 is connected to areas A2 and A4, area A2 is connected to areas A1, A3 and A5, and area A3 is connected to areas A2 and A6. Of course, in other embodiments other areas may be set as the application touch recognition areas, which is not limited herein. Further, the area where an application's icon is located may also be enlarged and the enlarged area used as that application's touch recognition area; for example, the upper boundary of the icon's area may be raised by a preset height, so as to enlarge the touch range of the application's touch recognition area in the vertical direction.
In one application scenario, the operation corresponding to a touch gesture sliding up and down or down and up in a preset application touch recognition area is set to be starting the application, and in step 102, when it is recognized that the touch gesture input on the smart terminal slides up and down or down and up in a preset application touch recognition area, the application corresponding to that area is started. Taking fig. 1-b as an example, when step 101 recognizes that the touch gesture slides up and down or down and up in area A1, the application corresponding to area A1 (i.e., application 1) is started.
In another application scenario, the operation corresponding to a touch gesture sliding up and down or down and up in a preset application touch recognition area is set to be popping up a widget interface (also referred to as a pop-up frame interface) related to the application, and in step 102, when it is recognized that the touch gesture input on the smart terminal slides up and down or down and up in a preset application touch recognition area, the widget interface related to the application corresponding to that area is popped up. Taking fig. 1-b as an example, when step 101 recognizes that the touch gesture slides up and down or down and up in area A1, the widget interface related to the application corresponding to area A1 (i.e., the aforementioned application 1) is popped up.
In yet another application scenario, the two gesture types may be set to correspond to different operations, as illustrated in the sketch after this paragraph. For example, the touch gesture sliding up and down in a preset application touch recognition area may correspond to starting the application while the touch gesture sliding down and up corresponds to popping up the widget interface related to the application; or, conversely, sliding up and down may correspond to popping up the widget interface while sliding down and up corresponds to starting the application.
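As a hedged illustration of one such configuration, the dispatch might look as follows, reusing the predicates sketched above; launchApp and showWidget are hypothetical stubs standing in for the actual start and pop-up operations.

```kotlin
// Map the two gesture types to different operations: up-and-down launches
// the application, down-and-up pops up its widget interface (one of the
// two configurations described in the text).
fun handleGesture(points: List<Pair<Float, Float>>, a: RectArea, appId: String) {
    when {
        isUpDownSlide(points, a) -> launchApp(appId)   // start the application
        isDownUpSlide(points, a) -> showWidget(appId)  // pop up the widget interface
        // otherwise: not a recognized gesture; wait for the next touch process
    }
}

fun launchApp(appId: String) { /* start the application corresponding to appId */ }
fun showWidget(appId: String) { /* pop up the widget interface for appId */ }
```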
Further, when step 101 recognizes that the touch gesture input on the smart terminal neither slides up and down nor slides down and up in an application touch recognition area, the method returns to step 101, either immediately or after waiting for a preset time or a preset event.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch identification area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in an application program touch identification area, the corresponding operation is executed on the application program corresponding to that area. Compared with clicking or long-pressing a fixed touch point, such sliding gestures are difficult to trigger accidentally, so the scheme can effectively reduce the probability of misoperation on the application program.
Example two
The difference between the embodiment of the present invention and the first embodiment is that the embodiment of the present invention further defines an identification condition of up-down sliding to further reduce the probability of misoperation of the application program, as shown in fig. 2-a, the application program operation method in the embodiment of the present invention includes:
step 201, identifying whether a touch gesture input on the intelligent terminal meets a first condition;
wherein the first condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the lower half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application program touch recognition area.
If the touch gesture input on the intelligent terminal is recognized to satisfy the first condition, step 202 is executed, and if the touch gesture input on the intelligent terminal is recognized not to satisfy the first condition, step 203 is executed.
The x-y coordinate system used by the touch screen of the intelligent terminal is shown in fig. 2-b (the coordinate system shown in fig. 2-b is also the default coordinate system of current intelligent terminals). In this coordinate system, x-axis coordinate values increase from left to right and y-axis coordinate values increase from top to bottom; correspondingly, on a touch screen applying the coordinate system shown in fig. 2-b, the x coordinate values of touch points increase from left to right and the y coordinate values of touch points increase from top to bottom. Taking a rectangular preset application touch recognition area as an example, in step 201: if the abscissa value x_i of a touch point of the touch gesture input on the intelligent terminal satisfies xleft ≤ x_i ≤ xrt and its ordinate value y_i satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, the touch point is judged to be located in the lower half area of the application touch recognition area; if x_i satisfies xleft ≤ x_i ≤ xrt and y_i satisfies ymin ≤ y_i < (ymax + ymin)/2, the touch point is judged to be located in the upper half area of the application touch recognition area. Here xleft and xrt are the abscissa values of the left and right boundaries of the application touch recognition area, respectively, and ymin and ymax are the ordinate values of the upper and lower boundaries, respectively.
In one application scenario, whether the touch gesture input on the intelligent terminal meets the first condition can be identified by detecting the position of every touch point of the input touch gesture. Taking area A1 in fig. 1-b as an example, let the x-y coordinate system used by the touch screen of the intelligent terminal be as shown in fig. 2-b, let the x coordinate values of the left and right boundaries of area A1 be xleft and xrt, respectively, and let the y coordinate values of the upper and lower boundaries of area A1 be ymin and ymax, respectively; then the y coordinate value of the boundary between the upper and lower half areas of area A1 is ymid = (ymax + ymin)/2. When it is recognized that every touch point of the input touch gesture is in area A1, that the y coordinate values of the first and last touch points are both greater than ymid (i.e., both are located in the lower half area of the application touch recognition area), and that at least one touch point of the input touch gesture has a y coordinate value less than ymid (i.e., is located in the upper half area), it is determined that the touch gesture input on the intelligent terminal meets the first condition and step 202 is executed; otherwise, it is determined that the touch gesture does not meet the first condition and step 203 is executed.
In another application scenario, whether the touch gesture input on the intelligent terminal satisfies the first condition may also be identified by detecting the positions of the first touch point, the inflection point and the last touch point of the input touch gesture. Specifically, if it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture are all in the same preset application touch recognition area, the first and last touch points are located in the lower half area of that area, and the inflection point is located in the upper half area, it is determined that the touch gesture satisfies the first condition. In this application scenario an inflection point must be determined: an inflection point is a touch point satisfying the condition (y_j < y_(j-1) and y_j < y_(j+1)) or the condition (y_j > y_(j-1) and y_j > y_(j+1)), i.e. a turning point at which the y coordinate value changes from increasing to decreasing or from decreasing to increasing, where y_j denotes the ordinate value of the inflection point, y_(j-1) the ordinate value of the touch point preceding the inflection point, and y_(j+1) the ordinate value of the touch point following it. Taking area A1 in fig. 1-b as an example, assume the x-y coordinate system of the touch screen is as shown in fig. 2-b, the x coordinate values of the left and right boundaries of area A1 are xleft and xrt, the y coordinate values of the upper and lower boundaries are ymin and ymax, and the y coordinate value of the boundary between the upper and lower half areas is ymid = (ymax + ymin)/2. In the enlarged schematic diagram of area A1 shown in fig. 2-c, P is the input touch gesture, points P1 and P3 are its first and last touch points, and point P2 is an inflection point. When it is recognized that point P1 satisfies ymid < y_1 < ymax and xleft < x_1 < xrt, point P2 satisfies ymin < y_2 < ymid and xleft < x_2 < xrt, and point P3 satisfies ymid < y_3 < ymax and xleft < x_3 < xrt, it is judged that the input touch gesture satisfies the first condition and step 202 is executed; otherwise it is judged that the input touch gesture does not satisfy the first condition and step 203 is executed.
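A short sketch of the inflection-point test just described, under the same y-increases-downward convention; inflectionPoints is an illustrative helper that returns every interior touch point whose y coordinate is a local minimum or maximum.

```kotlin
// Inflection points: touch points where the y coordinate switches from
// increasing to decreasing or vice versa, i.e. points satisfying
// (y_j < y_(j-1) and y_j < y_(j+1)) or (y_j > y_(j-1) and y_j > y_(j+1)).
fun inflectionPoints(points: List<Pair<Float, Float>>): List<Pair<Float, Float>> =
    (1 until points.size - 1).filter { j ->
        val (prev, cur, next) =
            Triple(points[j - 1].second, points[j].second, points[j + 1].second)
        (cur < prev && cur < next) || (cur > prev && cur > next)
    }.map { points[it] }
```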
Optionally, the triggered touch points are detected during the input of the touch gesture; when it is detected that a triggered touch point does not meet the requirements of the first condition, it is determined that the touch gesture currently being input does not meet the first condition, step 203 is executed, and the input of a new touch gesture is then awaited.
Step 202, judging that the touch gesture slides up and down in a preset application program touch recognition area, and executing corresponding operation on an application program corresponding to the application program touch recognition area;
in the embodiment of the present invention, when it is determined that the touch gesture input on the smart terminal is sliding up and down in a preset application touch recognition area, a preset operation corresponding to the sliding up and down is performed on an application corresponding to the application touch recognition area, specifically, the preset operation corresponding to the sliding up and down may be, for example, starting the application, popping up a widget interface related to the application, or other application operations, which is not limited herein.
And 203, judging that the touch gesture does not slide up and down in a preset application program touch recognition area, and not executing preset operation corresponding to the up-down sliding on the application program corresponding to the application program touch recognition area.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch identification area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized as sliding up and down in an application program touch identification area, the corresponding operation is executed on the application program corresponding to that area. Because the first condition further restricts the positions of the touch points of the gesture, the probability of misoperation on the application program is further reduced.
EXAMPLE III
The difference between the embodiment of the present invention and the first embodiment is that the embodiment of the present invention further defines an identification condition of down-and-up sliding to further reduce the probability of misoperation of the application program. As shown in fig. 3-a, the application program operation method in the embodiment of the present invention includes:
step 301, identifying whether a touch gesture input on the intelligent terminal meets a second condition;
wherein the second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area.
And executing step 302 when the touch gesture input on the intelligent terminal is recognized to meet the second condition, and executing step 303 when the touch gesture input on the intelligent terminal is recognized to not meet the second condition.
The x-y coordinate system used by the touch screen of the intelligent terminal is shown in fig. 2-b (the coordinate system shown in fig. 2-b is also the default coordinate system of current intelligent terminals). In this coordinate system, x-axis coordinate values increase from left to right and y-axis coordinate values increase from top to bottom; correspondingly, on a touch screen applying the coordinate system shown in fig. 2-b, the x coordinate values of touch points increase from left to right and the y coordinate values of touch points increase from top to bottom. If the preset application touch recognition area in this embodiment is a rectangular area, then in step 301: if the abscissa value x_i of a touch point of the touch gesture input on the intelligent terminal satisfies xleft ≤ x_i ≤ xrt and its ordinate value y_i satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, the touch point is judged to be located in the lower half area of the application touch recognition area; if x_i satisfies xleft ≤ x_i ≤ xrt and y_i satisfies ymin ≤ y_i < (ymax + ymin)/2, the touch point is judged to be located in the upper half area of the application touch recognition area. Here xleft and xrt are the abscissa values of the left and right boundaries of the application touch recognition area, respectively, and ymin and ymax are the ordinate values of the upper and lower boundaries, respectively.
In one application scenario, whether the touch gesture input on the intelligent terminal meets the second condition may be identified by detecting the position of every touch point of the input touch gesture. Taking area A1 in fig. 1-b as an example, let the x-y coordinate system used by the touch screen be as shown in fig. 2-b; correspondingly, the x coordinate values of touch points increase from left to right and the y coordinate values of touch points increase from top to bottom. If the x coordinate values of the left and right boundaries of area A1 are xleft and xrt, respectively, and the y coordinate values of the upper and lower boundaries of area A1 are ymin and ymax, respectively, then the y coordinate value of the boundary between the upper and lower half areas of area A1 is ymid = (ymax + ymin)/2. When it is recognized that every touch point of the input touch gesture is in area A1, that the y coordinate values of the first and last touch points are both smaller than ymid (i.e., both are located in the upper half area of the application touch recognition area), and that at least one touch point has a y coordinate value greater than ymid (i.e., is located in the lower half area), it is determined that the touch gesture input on the intelligent terminal meets the second condition and step 302 is executed; otherwise, it is determined that the touch gesture does not meet the second condition and step 303 is executed.
In another application scenario, whether the touch gesture input on the intelligent terminal satisfies the second condition may also be identified by detecting the positions of the first touch point, the inflection point and the last touch point of the input touch gesture. Specifically, if it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture are all in the same preset application touch recognition area, the first and last touch points are located in the upper half area of that area, and the inflection point is located in the lower half area, it is determined that the touch gesture satisfies the second condition. In this application scenario an inflection point must be determined: an inflection point is a touch point satisfying the condition (y_j < y_(j-1) and y_j < y_(j+1)) or the condition (y_j > y_(j-1) and y_j > y_(j+1)), i.e. a turning point at which the y coordinate value changes from increasing to decreasing or from decreasing to increasing, where y_j denotes the ordinate value of the inflection point, y_(j-1) the ordinate value of the touch point preceding the inflection point, and y_(j+1) the ordinate value of the touch point following it. Taking area A1 in fig. 1-b as an example, assume the x-y coordinate system of the touch screen is as shown in fig. 2-b, the x coordinate values of the left and right boundaries of area A1 are xleft and xrt, the y coordinate values of the upper and lower boundaries are ymin and ymax, and the y coordinate value of the boundary between the upper and lower half areas is ymid = (ymax + ymin)/2. In the enlarged schematic diagram of area A1 shown in fig. 3-b, S is the input touch gesture, points S1 and S3 are its first and last touch points, and point S2 is an inflection point. When it is recognized that point S1 satisfies ymin < y_4 < ymid and xleft < x_4 < xrt, point S2 satisfies ymid < y_5 < ymax and xleft < x_5 < xrt, and point S3 satisfies ymin < y_6 < ymid and xleft < x_6 < xrt, it is judged that the input touch gesture satisfies the second condition and step 302 is executed; otherwise it is judged that the touch gesture does not satisfy the second condition and step 303 is executed.
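For completeness, a sketch of the inflection-point form of the second condition, reusing the helpers from the earlier sketches; the minimum-size check and the use of any() over all inflection points are implementation assumptions, not details fixed by the text.

```kotlin
// Second condition via first/inflection/last points: first and last points
// in the upper half area, with an inflection point in the lower half area.
fun isDownUpSlideByInflection(points: List<Pair<Float, Float>>, a: RectArea): Boolean {
    if (points.size < 3 || !points.all { inArea(it, a) }) return false
    return inUpperHalf(points.first(), a) &&
           inUpperHalf(points.last(), a) &&
           inflectionPoints(points).any { inLowerHalf(it, a) }
}
```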
Optionally, the triggered touch points are detected during the input of the touch gesture; when it is detected that a triggered touch point does not meet the requirements of the second condition, it is determined that the touch gesture currently being input does not meet the second condition, and step 303 is executed.
Step 302, judging that the touch gesture slides down and up in a preset application program touch recognition area, and executing a corresponding operation on the application program corresponding to the application program touch recognition area;
In the embodiment of the present invention, when it is determined that the touch gesture input on the smart terminal slides down and up in a preset application touch recognition area, a preset operation corresponding to the down-and-up slide is performed on the application corresponding to that area; specifically, the preset operation may be, for example, starting the application, popping up a widget interface related to the application, or another application operation, which is not limited herein.
Step 303, judging that the touch gesture does not slide down and up in a preset application program touch recognition area, and not executing the preset operation corresponding to the down-and-up slide on the application program corresponding to the application program touch recognition area.
It should be noted that, the embodiment (i.e., the third embodiment) of the present invention may be combined with the second embodiment for recognition, or, on the basis of the first embodiment and the third embodiment, the touch gesture input on the smart terminal may be recognized in other ways as sliding up and down in a preset application touch recognition area, which is not limited herein.
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
Therefore, according to the scheme of the invention, an application program touch identification area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized as sliding down and up in an application program touch identification area, the corresponding operation is executed on the application program corresponding to that area. Because the second condition further restricts the positions of the touch points of the gesture, the probability of misoperation on the application program is further reduced.
Example four
Referring to fig. 4, an application operating device 400 according to an embodiment of the present invention is described, including: a touch gesture recognition unit 401 and a control unit 402.
The touch gesture recognition unit 401 is configured to recognize whether a touch gesture input on the smart terminal slides up and down or slides down and up in a preset application touch recognition area, where each application touch recognition area corresponds to an application. The control unit 402 is configured to execute a corresponding operation on the application program corresponding to the application program touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the smart terminal is sliding up and down or sliding down and up in the preset application program touch recognition area.
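One possible, purely illustrative wiring of these two units, reusing the gesture predicates sketched in Embodiment 1; the class and method names merely mirror the text (unit 401 / unit 402) and are not taken from any real API.

```kotlin
// Recognition unit: classifies a completed gesture against the preset
// per-application touch recognition areas.
class TouchGestureRecognitionUnit(private val areas: Map<String, RectArea>) {
    // Returns the appId whose recognition area the gesture slid up-and-down
    // or down-and-up inside, or null if no area matched.
    fun recognize(points: List<Pair<Float, Float>>): String? =
        areas.entries.firstOrNull { (_, a) ->
            isUpDownSlide(points, a) || isDownUpSlide(points, a)
        }?.key
}

// Control unit: performs the preset operation on the matched application.
class ControlUnit {
    fun execute(appId: String) = launchApp(appId)  // or showWidget(appId)
}
```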
Optionally, the touch gesture recognition unit 401 includes: a first sub-recognition unit, configured to recognize whether a touch gesture input on the intelligent terminal satisfies a first condition; and a first judging unit, configured to judge that the touch gesture slides up and down in a preset application program touch recognition area when the first sub-recognition unit recognizes that the input touch gesture satisfies the first condition, and to judge that the touch gesture does not slide up and down in a preset application program touch recognition area when the first sub-recognition unit recognizes that the input touch gesture does not satisfy the first condition. The first condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the lower half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application program touch recognition area. Further, the first sub-recognition unit is specifically configured to: when the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are recognized to be in the same preset application program touch recognition area, with the first touch point and the last touch point located in the lower half area of the application program touch recognition area and the inflection point located in the upper half area, judge that the touch gesture satisfies the first condition. The inflection point is a touch point satisfying the condition (y_j < y_(j-1) and y_j < y_(j+1)) or the condition (y_j > y_(j-1) and y_j > y_(j+1)), where y_j denotes the ordinate value of the inflection point, y_(j-1) the ordinate value of the touch point preceding the inflection point, and y_(j+1) the ordinate value of the touch point following the inflection point.
Optionally, the touch gesture recognition unit 401 includes: a second sub-recognition unit, configured to recognize whether a touch gesture input on the intelligent terminal satisfies a second condition; and a second judging unit, configured to judge that the touch gesture slides down and up in a preset application program touch recognition area when the second sub-recognition unit recognizes that the input touch gesture satisfies the second condition, and to judge that the touch gesture does not slide down and up in a preset application program touch recognition area when the second sub-recognition unit recognizes that the input touch gesture does not satisfy the second condition. The second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area. Further, the second sub-recognition unit is specifically configured to: when the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are recognized to be in the same preset application program touch recognition area, with the first touch point and the last touch point located in the upper half area of the application program touch recognition area and the inflection point located in the lower half area, judge that the touch gesture satisfies the second condition. The inflection point is a touch point satisfying the condition (y_j < y_(j-1) and y_j < y_(j+1)) or the condition (y_j > y_(j-1) and y_j > y_(j+1)), where y_j denotes the ordinate value of the inflection point, y_(j-1) the ordinate value of the touch point preceding the inflection point, and y_(j+1) the ordinate value of the touch point following the inflection point.
Optionally, in the embodiment of the present invention, the preset application touch identification area is a rectangular area, and both the first sub-recognition unit and the second sub-recognition unit may be specifically configured to locate touch points as follows: when it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt and its ordinate value y_i satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, judge that the touch point is located in the lower half area of the application touch recognition area; when it is recognized that x_i satisfies xleft ≤ x_i ≤ xrt and y_i satisfies ymin ≤ y_i < (ymax + ymin)/2, judge that the touch point is located in the upper half area of the application touch recognition area. Here xleft and xrt are the abscissa values of the left and right boundaries of the application touch recognition area, respectively, and ymin and ymax are the ordinate values of the upper and lower boundaries, respectively.
Optionally, the operation corresponding to the touch gesture sliding up and down or sliding down and up in the preset application program touch recognition area is to start an application program; the control unit 402 is specifically configured to start the application program corresponding to the application program touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the smart terminal is sliding up and down or sliding down and up in a preset application program touch recognition area.
Optionally, the operation corresponding to the touch gesture sliding up and down or sliding down and up in the preset application program touch recognition area is popping up a widget interface related to the application program; the control unit 402 is specifically configured to pop up the widget interface related to the application program corresponding to the application program touch recognition area when the touch gesture recognition unit 401 recognizes that the touch gesture input on the smart terminal slides up and down or down and up in a preset application program touch recognition area.
Because sliding up and down, or down and up, within a small area is somewhat difficult, the upper and lower boundary ranges of an application program's click area can be expanded, and the expanded area used as the application program touch recognition area, to make the gesture easier for the user to perform. Optionally, the left and right boundaries of the application touch recognition area are the same as those of the corresponding application's click area, while the upper and lower boundary ranges of the touch recognition area are larger than those of the click area. An application click area is the range within which the corresponding application receives a click trigger (that is, a user performs a click operation in a certain application click area to trigger the application corresponding to that click area). Specifically, the upper and lower boundary ranges of the touch recognition area may be 40% larger than those of the corresponding click area, though other values may of course be set. It should be noted that this expansion ratio cannot be too large, otherwise the expanded area may conflict with other operations (for example, the operation of swiping the screen upwards).
It should be noted that the application operating device may be integrated into the smart terminal in a hardware and/or software manner, and the smart terminal may be specifically a smart phone, a tablet computer, or another terminal equipped with a touch screen, which is not limited herein.
It should be understood that, in the embodiment of the present invention, functions of each functional module of the application program operating apparatus may be specifically implemented according to the method in the foregoing method embodiment, and a specific implementation process thereof may refer to relevant descriptions in the foregoing method embodiment, which is not described herein again.
Therefore, according to the scheme of the invention, an application program touch identification area corresponding to each application program is preset, and when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in an application program touch identification area, the corresponding operation is executed on the application program corresponding to that area, which can effectively reduce the probability of misoperation on the application program.
In the several embodiments provided in the present application, it should be understood that the disclosed system and method may be implemented in other ways.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present invention is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no acts or modules are necessarily required of the invention.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In view of the above, it is intended that the present invention not be limited to the specific embodiments and applications described above.

Claims (2)

1. An application operating method, comprising:
identifying whether a touch gesture input on the intelligent terminal is sliding up and down or sliding down and up in a preset application program touch identification area, wherein each application program touch identification area corresponds to an application program, and a blank area is reserved between the application program touch identification areas;
when the touch gesture input on the intelligent terminal is recognized as sliding up and down or sliding down and up in the application program touch recognition area, executing a corresponding operation on the application program corresponding to the application program touch recognition area, the corresponding operation being popping up, on the smart phone, a widget interface related to the application program;
the left and right boundaries of the application program touch identification area are the same as those of the application program clicking area of the corresponding application program, and the upper and lower boundary ranges of the application program touch identification area are 40% larger than those of the application program clicking area of the corresponding application program, wherein the application program clicking area is the range within which the corresponding application program receives a click trigger, and a user executes a click operation in a certain application program clicking area to trigger the application program corresponding to that clicking area;
the identifying whether the touch gesture input on the intelligent terminal slides up and down or slides down and up in a preset application program touch identification area comprises the following steps:
identifying whether a touch gesture input on the intelligent terminal meets a first condition;
if the input touch gesture is recognized to meet the first condition, judging that the touch gesture slides up and down in a preset application program touch recognition area;
if the input touch gesture is recognized not to meet the first condition, judging that the touch gesture does not slide up and down in a preset application program touch recognition area;
and/or,
identifying whether a touch gesture input on the intelligent terminal meets a second condition;
if the input touch gesture is recognized to meet the second condition, judging that the touch gesture slides down and up in a preset application program touch recognition area;
if the input touch gesture is recognized not to meet the second condition, judging that the touch gesture does not slide down and up in a preset application program touch recognition area;
wherein the first condition is: each touch point of the input touch gesture is within the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are both located in the lower half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application program touch recognition area;
the second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are both located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area;
the identifying whether the touch gesture input on the smart terminal satisfies a first condition includes:
if it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are within the same preset application program touch recognition area, the first touch point and the last touch point are located in the lower half area of the application program touch recognition area, and the inflection point is located in the upper half area of the application program touch recognition area, judging that the touch gesture meets the first condition;
the recognizing whether the touch gesture input on the smart terminal satisfies a second condition includes:
if a first touch point, an inflection point and a last touch point of a touch gesture input on the intelligent terminal are recognized to be in the same preset application program touch recognition area, the first touch point and the last touch point are located in the upper half area of the application program touch recognition area, and the inflection point is located in the lower half area of the application program touch recognition area, judging that the touch gesture meets the second condition;
wherein the inflection point is a touch point satisfying the condition y_j < y_{j-1} and y_j < y_{j+1}, or the condition y_j > y_{j-1} and y_j > y_{j+1}, where y_j represents the ordinate value of the inflection point, y_{j-1} represents the ordinate value of the touch point preceding the inflection point, and y_{j+1} represents the ordinate value of the touch point following the inflection point;
the preset application program touch identification area is a rectangular area;
the identification is whether the touch gesture input on the intelligent terminal slides up and down or slides up and down in a preset application program touch identification area, and the method comprises the following steps:
if it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the intelligent terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, judging that the touch point is located in the lower half area of the application program touch identification area;
if it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the intelligent terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies ymin ≤ y_i < (ymax + ymin)/2, judging that the touch point is located in the upper half area of the application program touch identification area;
wherein xleft and xrt are respectively the abscissa values of the left boundary and the right boundary of the application program touch identification area, and ymin and ymax are respectively the ordinate values of the upper boundary and the lower boundary of the application program touch identification area.
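To make the geometry recited in claim 1 concrete, the following is a minimal, self-contained Python sketch rather than code from the patent: the names Area, touch_identification_area, inflection_point and slide_direction are invented for illustration, the ordinate is assumed to grow downward as on typical touchscreens, and splitting the extra 40% of vertical range evenly above and below the click area is an assumption the claim leaves open.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (x, y); y grows downward, as on typical touchscreens


@dataclass
class Area:
    """A rectangular application program touch identification area."""
    xleft: float  # abscissa of the left boundary
    xrt: float    # abscissa of the right boundary
    ymin: float   # ordinate of the upper boundary (smaller y is higher on screen)
    ymax: float   # ordinate of the lower boundary

    def contains(self, p: Point) -> bool:
        x, y = p
        return self.xleft <= x <= self.xrt and self.ymin <= y <= self.ymax

    def in_lower_half(self, p: Point) -> bool:
        # xleft <= x_i <= xrt and (ymax + ymin)/2 <= y_i <= ymax
        x, y = p
        return self.xleft <= x <= self.xrt and (self.ymax + self.ymin) / 2 <= y <= self.ymax

    def in_upper_half(self, p: Point) -> bool:
        # xleft <= x_i <= xrt and ymin <= y_i < (ymax + ymin)/2
        x, y = p
        return self.xleft <= x <= self.xrt and self.ymin <= y < (self.ymax + self.ymin) / 2


def touch_identification_area(click_area: Area) -> Area:
    # Left/right boundaries match the click area; the vertical range is 40%
    # larger. The even 20%/20% split above and below is an assumption.
    extra = 0.2 * (click_area.ymax - click_area.ymin)
    return Area(click_area.xleft, click_area.xrt,
                click_area.ymin - extra, click_area.ymax + extra)


def inflection_point(points: List[Point]) -> Optional[Point]:
    # A touch point whose ordinate is a strict local extremum:
    # y_j < y_{j-1} and y_j < y_{j+1}, or y_j > y_{j-1} and y_j > y_{j+1}.
    for j in range(1, len(points) - 1):
        y_prev, y_j, y_next = points[j - 1][1], points[j][1], points[j + 1][1]
        if (y_j < y_prev and y_j < y_next) or (y_j > y_prev and y_j > y_next):
            return points[j]
    return None


def slide_direction(points: List[Point], area: Area) -> Optional[str]:
    # "up-down" = first condition (start/end in the lower half, inflection
    # point in the upper half); "down-up" = second condition; None otherwise.
    if len(points) < 3 or not all(area.contains(p) for p in points):
        return None
    turn = inflection_point(points)
    if turn is None:
        return None
    first, last = points[0], points[-1]
    if area.in_lower_half(first) and area.in_lower_half(last) and area.in_upper_half(turn):
        return "up-down"
    if area.in_upper_half(first) and area.in_upper_half(last) and area.in_lower_half(turn):
        return "down-up"
    return None


if __name__ == "__main__":
    click = Area(xleft=100, xrt=200, ymin=300, ymax=400)
    area = touch_identification_area(click)          # Area(100, 200, 280, 420)
    gesture = [(150, 410), (150, 330), (150, 415)]   # starts low, slides up, returns down
    print(slide_direction(gesture, area))            # -> "up-down"
```

Under these assumptions, the example gesture starts and ends in the lower half of the derived area and turns in its upper half, so it is classified as the up-and-down slide of the first condition.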
2. An application operating device, comprising:
the touch gesture recognition unit is used for recognizing whether a touch gesture input on the intelligent terminal slides up and down or slides down and up in a preset application program touch recognition area, wherein each application program touch recognition area corresponds to an application program, and blank areas are reserved among the application program touch recognition areas;
the control unit is used for executing a corresponding operation on the application program corresponding to the application program touch recognition area when the touch gesture recognition unit recognizes that the touch gesture input on the intelligent terminal slides up and down or slides down and up in a preset application program touch recognition area, wherein the corresponding operation is to pop up a small window interface related to the application program on the smartphone;
the left and right boundaries of the application program touch identification area are the same as those of the application program click area of the corresponding application program, and the range between the upper and lower boundaries of the application program touch identification area is 40% larger than that of the application program click area of the corresponding application program, wherein the application program click area is the range within which the corresponding application program receives a click trigger, and a user performs a click operation in a certain application program click area to trigger the application program corresponding to that click area;
the touch gesture recognition unit includes:
the first sub-recognition unit is used for recognizing whether a touch gesture input on the intelligent terminal meets a first condition;
the first judging unit is used for judging that the touch gesture slides up and down in a preset application program touch recognition area when the first sub-recognition unit recognizes that the input touch gesture meets the first condition; when the first sub-recognition unit recognizes that the input touch gesture does not meet the first condition, judging that the touch gesture does not slide up and down in a preset application program touch recognition area;
wherein the first condition is: each touch point of the input touch gesture is within the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are both located in the lower half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the upper half area of the application program touch recognition area;
and/or,
the touch gesture recognition unit includes:
the second sub-recognition unit is used for recognizing whether the touch gesture input on the intelligent terminal meets a second condition;
the second judging unit is used for judging that the touch gesture slides down and up in a preset application program touch recognition area when the second sub-recognition unit recognizes that the input touch gesture meets the second condition; and when the second sub-recognition unit recognizes that the input touch gesture does not meet the second condition, judging that the touch gesture does not slide down and up in a preset application program touch recognition area;
wherein the second condition is: each touch point of the input touch gesture is in the same preset application program touch recognition area, the first touch point and the last touch point of the input touch gesture are both located in the upper half area of the application program touch recognition area, and at least one touch point of the input touch gesture is located in the lower half area of the application program touch recognition area;
the first sub-identification unit is specifically configured to: when it is recognized that the first touch point, the inflection point and the last touch point of the touch gesture input on the intelligent terminal are within the same preset application program touch recognition area, the first touch point and the last touch point are located in the lower half area of the application program touch recognition area, and the inflection point is located in the upper half area of the application program touch recognition area, judge that the touch gesture meets the first condition;
the second sub-identification unit is specifically configured to: when a first touch point, an inflection point and a last touch point of a touch gesture input on the intelligent terminal are recognized to be in the same preset application program touch recognition area, the first touch point and the last touch point are located in the upper half area of the application program touch recognition area, and the inflection point is located in the lower half area of the application program touch recognition area, judging that the touch gesture meets the second condition;
wherein the inflection point is a touch point satisfying the condition y_j < y_{j-1} and y_j < y_{j+1}, or the condition y_j > y_{j-1} and y_j > y_{j+1}, where y_j represents the ordinate value of the inflection point, y_{j-1} represents the ordinate value of the touch point preceding the inflection point, and y_{j+1} represents the ordinate value of the touch point following the inflection point;
the preset application program touch identification area is a rectangular area;
the first sub-identification unit is specifically further configured to: when it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, judge that the touch point is located in the lower half area of the application program touch identification area; and when it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies ymin ≤ y_i < (ymax + ymin)/2, judge that the touch point is located in the upper half area of the application program touch identification area;
the second sub-identification unit is specifically further configured to: when it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies (ymax + ymin)/2 ≤ y_i ≤ ymax, judge that the touch point is located in the lower half area of the application program touch identification area; and when it is recognized that the abscissa value x_i of a touch point of the touch gesture input on the smart terminal satisfies xleft ≤ x_i ≤ xrt, and the ordinate value y_i of the touch point satisfies ymin ≤ y_i < (ymax + ymin)/2, judge that the touch point is located in the upper half area of the application program touch identification area;
wherein xleft and xrt are respectively the abscissa values of the left boundary and the right boundary of the application program touch identification area, and ymin and ymax are respectively the ordinate values of the upper boundary and the lower boundary of the application program touch identification area.
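The device of claim 2 can be mirrored as two cooperating objects that reuse Area and slide_direction from the sketch following claim 1; TouchGestureRecognitionUnit and ControlUnit are hypothetical names chosen to echo the claim, and the pop-up of the small window interface is reduced to a print stub because the patent does not name a platform API.

```python
class TouchGestureRecognitionUnit:
    """Recognizes up-down / down-up slides inside preset identification areas."""

    def __init__(self, areas):
        # areas: dict mapping an application identifier to its Area;
        # leaving blank space between the areas is the caller's responsibility.
        self.areas = areas

    def recognize(self, points):
        # Return (app, direction) for the first area whose tests pass, else None.
        for app, area in self.areas.items():
            direction = slide_direction(points, area)
            if direction is not None:
                return app, direction
        return None


class ControlUnit:
    """Executes the corresponding operation for a recognized gesture."""

    def on_gesture(self, result):
        if result is not None:
            app, _direction = result
            # Stand-in for popping up the small window interface of the app.
            print(f"pop up small window interface for {app}")
```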
CN201610377935.0A 2016-05-31 2016-05-31 Application program operation method and device Active CN106095303B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610377935.0A CN106095303B (en) 2016-05-31 2016-05-31 Application program operation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610377935.0A CN106095303B (en) 2016-05-31 2016-05-31 Application program operation method and device

Publications (2)

Publication Number Publication Date
CN106095303A CN106095303A (en) 2016-11-09
CN106095303B true CN106095303B (en) 2021-03-23

Family

ID=57230705

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610377935.0A Active CN106095303B (en) 2016-05-31 2016-05-31 Application program operation method and device

Country Status (1)

Country Link
CN (1) CN106095303B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107045421B * 2017-04-27 2021-06-18 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Screen switching method and mobile terminal
CN107479746A * 2017-07-31 2017-12-15 Guangzhou Yuanchuang Network Technology Co., Ltd. A touch control method and device
CN112363613A * 2020-09-25 2021-02-12 Huizhou Desay SV Automotive Co., Ltd. Infrared sliding gesture induction recognition method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101568896A * 2007-06-08 2009-10-28 Sony Corporation Information processing apparatus, input device, information processing system, information processing method, and program
CN102799339A * 2011-05-24 2012-11-28 Hanwang Technology Co., Ltd. Touch implementation method and device of application function button
CN103617002A * 2013-12-16 2014-03-05 Edan Instruments, Inc. Method and device for achieving touch interface
CN105224215A * 2015-08-28 2016-01-06 Xiaomi Inc. Terminal control method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7509588B2 (en) * 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
CN103309482A * 2012-03-12 2013-09-18 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Electronic equipment and touch control method and touch control device thereof
CN102841682B * 2012-07-12 2016-03-09 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Terminal and gesture control method
CN104460999B * 2014-11-28 2017-07-28 Guangdong OPPO Mobile Telecommunications Corp., Ltd. A gesture recognition method and device with an inflection point
CN104360805B * 2014-11-28 2018-01-16 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Application icon management method and device
CN104636065A * 2014-12-31 2015-05-20 Xiaomi Inc. Method and device for waking up a terminal
CN105138241B * 2015-09-02 2018-09-18 TCL Mobile Communication Technology (Ningbo) Co., Ltd. An application program launching method and system based on a mobile terminal, and a mobile terminal


Also Published As

Publication number Publication date
CN106095303A (en) 2016-11-09

Similar Documents

Publication Publication Date Title
US11079908B2 (en) Method and apparatus for adding icon to interface of android system, and mobile terminal
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
US10551987B2 (en) Multiple screen mode in mobile terminal
RU2582854C2 (en) Method and device for fast access to device functions
EP3056982B1 (en) Terminal apparatus, display control method and recording medium
CN102855081B (en) The apparatus and method that web browser interface using gesture is provided in a device
US9007314B2 (en) Method for touch processing and mobile terminal
US9158399B2 (en) Unlock method and mobile device using the same
US20100107067A1 (en) Input on touch based user interfaces
CN102902471B (en) Input interface switching method and input interface switching device
WO2014029345A1 (en) Method and device for controlling terminal device
US20100194702A1 (en) 2010-08-05 Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
KR101251761B1 (en) Method for Data Transferring Between Applications and Terminal Apparatus Using the Method
CN106095303B (en) Application program operation method and device
KR20140024721A (en) Method for changing display range and an electronic device thereof
CN104007919A (en) Electronic device and control method thereof
KR20130097331A (en) Apparatus and method for selecting object in device with touch screen
CN112817483B (en) Multi-point touch processing method, device, equipment and storage medium
US9804706B2 (en) Systems and methods of adapting input based on a change in user interface presentation
EP2677413B1 (en) Method for improving touch recognition and electronic device thereof
WO2016173307A1 (en) Message copying method and device, and smart terminal
US20110012843A1 (en) Touch-controlled electronic apparatus and related control method
CN106293051B (en) Gesture-based interaction method and device and user equipment
CN107292279B (en) Display method, display device, terminal and computer-readable storage medium
CN106814935B (en) Application icon display method of terminal and terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20210419

Address after: No. 27, West Section of Xinggang Road, Yibin Lingang Economic and Technological Development Zone, Yibin City, Sichuan Province, 644000

Patentee after: Yibin bond China smart technology Co.,Ltd.

Address before: 9th Floor, Zhongxin Science and Technology Building, No. 31 Bagua Road, Yuanling Street, Futian District, Shenzhen, Guangdong Province, 518029 (Building 1001, Floor 10)

Patentee before: Zhou Qi