WO2014190862A1 - Method and apparatus for controlling application on intelligent terminal - Google Patents

Method and apparatus for controlling application on intelligent terminal Download PDF

Info

Publication number
WO2014190862A1
WO2014190862A1 PCT/CN2014/077843 CN2014077843W WO2014190862A1 WO 2014190862 A1 WO2014190862 A1 WO 2014190862A1 CN 2014077843 W CN2014077843 W CN 2014077843W WO 2014190862 A1 WO2014190862 A1 WO 2014190862A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch screen
application
area
item
preconfigured
Prior art date
Application number
PCT/CN2014/077843
Other languages
French (fr)
Inventor
Xiang JIANG
Xiao CHENG
Tiejun TIAN
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited filed Critical Tencent Technology (Shenzhen) Company Limited
Publication of WO2014190862A1 publication Critical patent/WO2014190862A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

The present disclosure provides a method and apparatus for controlling an application on an intelligent terminal. The method includes detecting, on a touch screen of the intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen; identifying a start instruction of a function item of the application that corresponds to the sliding gesture, according a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and triggering the identified start instruction to start the function item of the application.

Description

METHOD AND APPARATUS FOR CONTROLLING
APPLICATION ON INTELLIGENT TERMINAL
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The application claims the priority and benefits of Chinese Patent Application No. 201310202964.X, filed on May 28, 2013, the disclosure of which is incorporated herein in its entirety by reference.
TECHNICAL FIELD
[0002] The present disclosure relates to data processing, and to a method and apparatus for controlling an application on an intelligent terminal.
BACKGROUND
[0003] Currently, there are more and more kinds of intelligent terminals having a touch screen, for example, touch screen computers, touch screen cell phones, touch screen palm PCs, touch screen tablet PCs, etc. In these intelligent terminals having a touch screen, usually human-computer instruction interactions are performed by using the touch screen, and usually an operator uses the touch screen to input information such as operation instructions, etc. into the intelligent terminals.
[0004] With the improvement of the processing capability of intelligent terminals and the development of information technologies, more and more applications (APPs) are running on the intelligent terminals. Applications on portable intelligent terminals are developed with a spurt of speed. With the popularity of touch screens, applications with input gestures being used to execute all kinds of instructions are commonly used now.
[0005] In traditional reading APPs that are applicable to touch screens, usually a method for controlling an APP is realized by way of using fingers to tap a keyboard or a menu or navigation on a screen. For example, when a user taps a menu key on a keyboard of an intelligent phone, a menu or navigation of an APP will be shown. The menu or the navigation may include start icons of various function items of the APP. When a user taps a certain start icon on the touch screen, a start instruction of a function item corresponding to the start icon will be triggered. For example, for a reading application, when a user is reading an electronic book or reading an electronic magazine, usually he needs to look up content of the electronic book or content of the electronic magazine to further jump to other pages.
[0006] To realize this, a traditional technical solution is that: 1) the user needs to tap a keyboard or a menu or navigation on the screen to show menu items or navigation items; 2) if an icon of a "content" function item is shown among the menu items or among the navigation items on the current screen, the user needs to further tap the icon of the "content" function item to trigger a start instruction of the "content" function item, and content data of the electronic book that the user is reading will be shown on the current screen; and 3) if there are so many function items among the menu items or among the navigation items that there is not enough space to show the icon of the "content" function item on the current screen, then the user needs to further perform operations such as scrolling, etc. for the menu items or the navigation items, so as to display the icon of the "content" function item on the current screen, and then tap a picture of the "content" function item.
[0007] The above solution may have the following defects: when the user currently stays at a specified content page of the APP, for example, the user is reading an electronic magazine, if the user hopes to trigger a specified function item, e.g., displaying content of the electronic magazine, then he needs to perform human-computer interactions with the intelligent terminal at least twice to realize this. In case that the screen is relatively small, a search operation by scrolling in the menu or in the navigation will be involved, thus the operation efficiency thereof is low, and for a function item with a high triggering frequency, e.g., a content displaying function, the operation efficiency thereof will decrease exponentially, which is inconvenient for the user to use.
SUMMARY
[0008] In view of the above defects in the traditional solutions, the present disclosure provides a method and apparatus for controlling an application on an intelligent terminal to improve the efficiency of operating the application.
[0009] In an aspect, the method for controlling an application on an intelligent terminal includes:
[0010] detecting, on a touch screen of the intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen;
[0011] identifying a start instruction of a function item of the application that corresponds to the sliding gesture, according a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and
[0012] triggering the identified start instruction to start the function item of the application.
[0013] In another aspect, the apparatus for controlling an application on an intelligent terminal includes: a memory; one or more processors; one or more programs stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions to:
[0014] detect, on a touch screen of the intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen;
[0015] identify a start instruction of a function item of the application that corresponds to the sliding gesture, according a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and
[0016] trigger the identified start instruction to start the function item of the application.
[0017] In a further aspect, the present disclosure provides a computer readable storage medium storing one or more programs configured to be executed by a computer system, the one or more programs include instructions to:
[0018] detect, on a touch screen of the intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen;
[0019] identify a start instruction of a function item of the application that corresponds to the sliding gesture, according a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and [0020] trigger the identified start instruction to start the function item of the application.
[0021] A computer readable storage medium storing one or more programs configured to be executed by a computer system, the one or more programs comprising instructions to:
[0022] detect, on a touch screen of the intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen;
[0023] identify a start instruction of a function item of the application that corresponds to the sliding gesture, according a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and
[0024] trigger the identified start instruction to start the function item of the application.
[0025] In the present disclosure, a start instruction of a specified function item of an application can be triggered by using one sliding gesture. In this way, the operation efficiency of the application is improved and it is convenient for a user to use. Especially, when the application is a reading application, the operation efficiency of a function item of the reading application with a high triggering efficiency, e.g., a content displaying function item, will be significantly improved, which further improves the convenience when the user uses the reading application.
BRIEF DESCRIPTION OF THE DRAWINGS
[0026] For a better understanding of the present disclosure, reference should be made to the Detailed Description below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
[0027] FIG. l shows a flow chart of a method for controlling an application on an intelligent terminal according to an example of the present disclosure;
[0028] FIG.2 shows a flow chart of a method for controlling an application on an intelligent terminal according to another example of the present disclosure;
[0029] FIG.3 shows a detailed flow chart of detecting a specified sliding gesture whose start point is in a specified area of a touch screen; [0030] FIG.4 shows a schematic diagram of triggering display of an electronic book content by using a finger to slide downwards in a specified area on the left side of the touch screen;
[0031] FIG.5 shows a schematic diagram of the constitution of an apparatus for controlling an application on an intelligent terminal according to an example of the present disclosure;
[0032] FIG.6 shows a schematic diagram of the constitution of an apparatus for controlling an application on an intelligent terminal according to another example of the present disclosure;
[0033] FIG.7 shows a schematic diagram of the detailed constitution of a gesture detection module according to examples of the present disclosure;
[0034] FIG.8 shows an apparatus for controlling an application on a mobile terminal according to an example of the present disclosure; and
[0035] FIG.9 shows a non-transitory computer readable storage medium according to an example of the present disclosure.
DETAILED DESCRIPTION
[0036] Reference will now be made in detail to examples, which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure.
[0037] Fig. l shows a flow chart of a method for controlling an application on an intelligent terminal according to an example of the present disclosure.
[0038] As shown in Fig. l, the method mainly includes the following process.
[0039] At block 101, when an intelligent terminal runs an application (APP), it detects an input gesture received on a touch screen of the intelligent terminal that shows a specified content page of the APP. [0040] At block 102, if a specified sliding gesture whose start point is in a specified area of the screen is detected, then according to a preconfigured mapping relation between the specified sliding gesture and a start instruction of a specified function item of the APP, the start instruction of the specified function item of the APP corresponding to the specified sliding gesture whose start point is in the specified area of the screen is identified.
[0041] At block 103, the identified start instruction of the specified function item of the APP is triggered to start the function item of the application.
[0042] In the present disclosure, the intelligent terminal refers to a terminal device that has a touch screen and a data computing function, including but not limited to touch screen intelligent cell phones, touch screen palm PCs, touch screen tablet PCs, etc. Those intelligent terminals having a touch screen may be installed with an operation system (OS), including but not limited to: Android OS, Symbian OS, Windows mobile OS, or Apple iPhone OS, etc. Those intelligent terminals having the touch screen all have a corresponding receiving module that can receive and identify various operation actions and gestures applied on the touch screen, including but not limited to: tap actions, sliding gestures, etc. Further, the detection module provides external invoking interfaces. An external program can invoke these external invoking interfaces to detect various operation actions and gestures received and identified by the touch screen of the present intelligent terminal.
[0043] In the above, some specific types of intelligent terminals and specific types of operation systems are listed, however, a skilled person in the art can be aware that the implementation of the present disclosure is not limited to the above listed types, but is applicable to other types of intelligent terminals and other types of operation systems, which will not be elaborated herein.
[0044] In an example, the APP is a reading APP, e.g., an electronic book reading APP, a news information reading APP, a magazine reading APP, a webpage browsing APP, etc. Usually these kinds of APPs all have a content displaying function, and therefore are very suitable to use the technical solution of the present disclosure to trigger a start instruction of the content displaying function and thus can reduce the number of human-computer interactions and improve the operation efficiency. Of course, the technical solution of the present disclosure is not limited to reading APPs, it may also be used in other kinds of APPs. [0045] In the present example, specifically, detecting the input gesture received by the mobile terminal on the touch screen showing the specified content page of the APP is: on a reading page of the APP that shows the content of a specific content vector of the APP, detecting the input gesture received by the intelligent terminal; the content vector refers to an electronic book, an electronic magazine, an electronic file, or even webpage content etc.
[0046] In an example, the start instruction of the specified function item corresponding to the specified sliding gesture whose start point is in the specified area of the screen is: the start instruction of a content displaying function of the APP. However, the start instruction of the specified function item corresponding to the specified sliding gesture whose start point is in the specified area of the screen is not limited to be for the content displaying function. In other examples, the start instruction of the specified function item corresponding to the specified sliding gesture whose start point is in the specified area of the screen can be used for other function items, e.g., any function item contained in the menu, e.g., a setting function item, a reading progress adjustment & setting function item, etc. The mapping relation between the specified sliding gesture and the start instruction of the specified function item of the APP can be preconfigured in advance as needed. For example, a sliding gesture that slides vertically from the start point in the specified area of the touch screen may correspond to the start instruction of the content displaying function, a horizontal sliding gesture may correspond to the start instruction of the reading progress adjustment & setting function, and an arc-shaped sliding gesture that starts from a center point area of the touch screen may correspond to the start instruction of a screen rotation function, etc.
[0047] Based on the above examples, in a further example, referring to the flow chart of a detailed example shown in Fig.2, the start instruction of the specified function item corresponding to the specified sliding gesture whose start point is in the specified area of the screen is: the start instruction of the content displaying function of the APP. Triggering the identified start instruction of the specified function item at block 103 includes: obtaining the content data of the content vector (e.g., an electronic book, an electronic magazine, etc. to which the current content page belongs) to which the content page that is currently displayed on the screen belongs, and displaying the obtained content data in a specified area of the current screen (usually on the left side of the screen). After triggering the identified start instruction of the specified function item to start the content displaying function item, as shown in Fig.2, the method further includes the following process.
[0048] At block 104, a triggering operation received by the intelligent terminal is detected.
[0049] At block 105, if it is detected that an item of the content data is triggered, then a content page corresponding to the triggered item is looked up, and the content page corresponding to the triggered item is displayed on the current screen.
[0050] In the further example shown in Fig.2, after displaying the obtained content data in the specified area of the current screen and before it is detected that the item of the content data is triggered, the method further includes: detecting a triggering operation received by the intelligent terminal, and after it is detected that the triggering operation occurs in an area outside of the area where the content data is displayed on the screen, cancelling display of the content data, and restoring display of screen contents that are displayed before the content data is displayed. Here, the triggering operation can be a tap operation.
[0051] Fig.3 shows a detailed flow chart of detecting the specified sliding gesture whose start point is in the specified area of the touch screen according to an example. Referring to Fig.3, in the above block 101, the method of detecting the specified sliding gesture whose start point is in the specified area of the touch screen may include the following process.
[0052] At block 301, it is determined whether there is a finger touch on the touch screen, and if there is, then block 302 is entered, or otherwise, the determination continues.
[0053] At block 302, the number of fingers that fall on the screen is detected. If it is a single finger, then block 303 is entered, or otherwise block 301 is returned to perform the determination.
[0054] At block 303, the top left or the top right of the touch screen is taken as a zero position, a coordinate (x, y) of the finger touch on the touch screen is obtained, and if x is smaller than or equals to w/z, where w is the total width of the touch screen and z is a predefined value, then block 303 is entered, or otherwise, block 301 is returned to continue to perform the determination. [0055] In a further example, the range of the value of z in block 303 is [4, 5]. In this way, it more conforms to a real scenario where the user operates the APP. That is, when the user touches an area from the left edge of the screen to 1/4 to 1/5 of the screen from left (taking the top left corner of the screen as the zero position) by a single finger, or touches an area from the right edge of the screen to 1/4 to 1/5 of the screen from right (taking the top right corner of the screen as the zero position) by a single finger, block 303 is entered. When the range of the value of z is [4, 5], it more conforms to an actual scenario when the user operates the APP, which enables the user to make a sliding gesture in the most convenient operation pose.
[0056] At block 304, it is determined whether the finger slides, and if it does, then block 305 is entered; or otherwise, block 301 is returned to continue to perform the determination.
[0057] At block 305, the movement direction of the finger (X, Y) is obtained, where X is a horizontal movement distance, Y is a vertical movement distance, and if both the values of X and Y meet predefined triggering conditions, block 306 is entered; or otherwise block 304 is returned to continue to perform the determination.
[0058] At block 305, the predefined triggering conditions that the values of X and Y meet are: X equals to 0 and Y is larger than or equals to heightT, where heightT is a preconfigured smallest vertical movement distance; that is, the finger touch needs to slide vertically a certain distance that is larger than or equals to heightT; or, X is larger than 0, Y is larger than 0, arctg (X/Y) is smaller than angleT, and X2 +Y2 is larger than or equals to lengthT, where angleT is a preconfigured largest angle, lengthT is a preconfigured smallest movement distance; that is, generally, when a finger touch of the user slides vertically, it may not be absolutely vertical, but in a certain angle more or less, and thus with the triggering condition, the sliding angle and sliding distance of the user's finger touch can be defined, and as long as the sliding angle is in the range of angleT and the sliding distance is larger than or equals to lengthT, the predefined triggering condition will be met.
[0059] At block 306, it is determined whether the finger leaves the screen, and if it does, then it is determined that the specified sliding gesture whose start point is in the specified area of the screen is detected. [0060] As shown in Fig.4, Fig.4 shows a schematic diagram of triggering display of electronic book content by using a finger to slide downwards in a specified area on the left side of the touch screen.
[0061] In the above blocks, the detailed determination methods and data obtaining methods can adopt any existing solution which will be known by those skilled in the art in view of the above description of the present disclosure and will not be elaborated herein. For example, the operation system of the intelligent terminal can sense input gestures of the user on the touch screen, and each time an input gesture is sensed, an action message will be sent out. The method and apparatus of the present disclosure can make the above determination and data obtaining and further determine that a specified sliding gesture whose start point is in the specified area of the screen by only detecting the action message by using the API of the operation system of the intelligent terminal.
[0062] In correspondence to the above method, the present disclosure further discloses an apparatus for controlling an application on an intelligent terminal to execute the above method.
[0063] Fig. 5 shows a schematic diagram of the constitution of an apparatus for controlling an application on an intelligent terminal according to an example of the present disclosure. The control apparatus is inside the APP of the intelligent terminal. And referring to Fig.5, the control apparatus includes a gesture detection module 501, an identification module 502, and an instruction execution module 503.
[0064] The gesture detection module 501, when an intelligent terminal runs an application, detects an input gesture received on the touch screen showing a specified content page of the APP.
[0065] The identification module 502 identifies a start instruction of the specified function item of the APP corresponding to the specified sliding gesture whose start point is in the specified area of the screen according to a preconfigured mapping relation between the specified sliding gesture and the start instruction of the specified function item of the APP, when the gesture detection module 501 detects the specified sliding gesture whose start point is in the specified area of the screen to start the specified function item of the application. [0066] The instruction execution module 503 triggers the identified start instruction of the specified function item of the APP.
[0067] In the present example, the control apparatus can be configured inside of the APP to control the APP. In an example, the APP is a reading APP such as an electronic book reading APP, a news information reading APP, a magazine reading APP, a webpage browsing APP, etc. And usually, these APPs have a content displaying function, and are very suitable to use the technical scheme of the present disclosure to trigger a start instruction of the content displaying function, and thus can reduce the number of human-computer interactions and improve the operation efficiency. Of course, the technical solution of the present disclosure is not limited to the reading APPs, but can be used in other kinds of APPs.
[0068] The specified content page of the APP is: a reading page of the APP that shows the content of a specific content vector of the APP; the content vector is an electronic book, an electronic magazine, an electronic file, or even webpage content, etc.
[0069] The start instruction of the specified function item corresponding to the specified sliding gesture whose start point is in the specific area of the screen is: the start instruction of the content displaying function of the APP. However, the start instruction of the specified function item corresponding to the specified sliding gesture whose start point is in the specified area of the screen is not limited to the content displaying function. In other examples, the start instruction of the specified function item corresponding to the specified sliding gesture whose start point is in the specified area of the screen can refer to other function items, e.g., any function item contained in the menu, e.g., a setting function item, a reading progress adjustment & setting function item, etc.
[0070] Based on the above example, in a further example, the instruction execution module 503 further obtains the content data of the content vector (e.g., an electronic book, an electronic magazine, etc. to which current page content belongs) to which the content page that is currently displayed on the screen belongs, and displays the obtained content data in a specified area of the current screen.
[0071] In the present example, as shown in Fig.6, the control apparatus further includes: a triggering detection module 504 to detect a triggering operation received by the intelligent terminal; and a page switching module 505 to look up a content page corresponding to an item of the content data and display the content page after the item of the content data being triggered is detected by the triggering detection module 504.
[0072] In a further example, the page switching module 505 further cancels display of the content data and restore display of screen contents that are displayed before the content data is displayed, after the obtained content data is displayed in the specified area of the current screen and the triggering detection module 504 detects that a tap operation is received in an area outside of the content data on the touch screen.
[0073] Fig.7 shows a schematic diagram of the detailed constitution of a gesture detection module according to examples of the present disclosure. Referring to Fig.7, the gesture detection module 501 specifically includes: a first determination module 511, a second determination module 512, a third determination module 513, a fourth determination module 514, a fifth determination module 515, and a sixth determination module 515.
[0074] The first determination module 511 determines whether there is a finger touch on the touch screen, and if there is, then a second determination module 512 is triggered, or otherwise the determination continues.
[0075] The second determination module 512 determines the number of fingers that fall on the screen. If it is a single finger, then the third determination module 513 is triggered, or otherwise, the first determination module is returned to perform the determination.
[0076] The second determination module 513 takes the top left or the top right of the touch screen as a zero position, and obtains a coordinate (x, y) of the finger touch on the touch screen. If x is smaller than or equals to w/z, where w is the total width of the touch screen and z is a predefined value, then a third determination module 514 is triggered, or otherwise the first determination module is returned to continue to perform the determination.
[0077] In a further example, the range of the value of z is [4:5]. In this way, it more conforms to a real scenario of the user operating the APP. That is, when the user touches an area from the left edge of the screen to 1/4 to 1/5 of the screen from left (taking the top left corner of the screen as the zero position) by a single finger, or touches an area from the right edge of the screen to 1/4 to 1/5 of the screen from right (taking the top right corner of the screen as the zero position) by a single finger, further process is performed.
[0078] The fourth determination module 514 determines whether the finger touch slides, and if it does, then the fourth determination module 515 is triggered; or otherwise the first determination module 511 will be returned to continue to perform the determination.
[0079] The fourth determination module 515 obtains a sliding movement direction of the finger (X, Y), where X is a horizontal movement distance of the finger, Y is a vertical movement distance of the finger, and if both the values of X and Y meet predefined triggering conditions, the fifth determination module 516 is triggered, or otherwise, the first determination module 511 is returned to perform the determination.
[0080] In the fifth determination module 515, the predefined triggering conditions on X and Y are as follows. Either X equals 0 and Y is larger than or equal to heightT, where heightT is a preconfigured smallest vertical movement distance; that is, the finger touch needs to slide vertically a distance of at least heightT. Or X is larger than 0, Y is larger than 0, arctg(X/Y) is smaller than angleT, and X²+Y² is larger than or equal to lengthT, where angleT is a preconfigured largest angle and lengthT is a preconfigured smallest movement distance. Generally, when a finger touch of the user slides vertically, it may not be absolutely vertical, but at some small angle; with this triggering condition, the sliding angle and sliding distance of the user's finger touch can be constrained, and as long as the sliding angle is within angleT and the sliding distance is larger than or equal to lengthT, the predefined triggering condition will be met.
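The two triggering conditions of paragraph [0080] can be sketched as follows. All threshold values are illustrative; the disclosure only names heightT, angleT and lengthT as preconfigured constants, and "X2+Y2" is read here as the squared distance X² + Y², so the slide length is compared against lengthT:

```python
import math

def meets_trigger(X, Y, heightT=40.0, angleT=math.radians(30), lengthT=40.0):
    """Evaluate the sliding-trigger conditions of paragraph [0080].

    X and Y are the horizontal and vertical movement distances of the
    finger; the thresholds are assumed example values.
    """
    # Condition 1: a perfectly vertical slide of at least heightT.
    if X == 0 and Y >= heightT:
        return True
    # Condition 2: a nearly vertical slide, i.e. the deviation angle
    # from the vertical axis is below angleT and the total distance
    # sqrt(X**2 + Y**2) is at least lengthT.
    if X > 0 and Y > 0:
        return math.atan(X / Y) < angleT and math.hypot(X, Y) >= lengthT
    return False
```

For example, a slide of (X, Y) = (10, 60) deviates about 9.5° from vertical and covers about 61 pixels, so it meets condition 2 under these assumed thresholds, while (50, 20) deviates about 68° and fails.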
[0081] The sixth determination module 516 determines whether the finger leaves the screen, and if it does, then it is determined that the specified sliding gesture whose start point is in the specified area of the screen is detected.
[0082] In the above determination modules, the detailed determination methods and data obtaining methods can adopt any existing solution, which will be known by those skilled in the art in view of the above description of the present disclosure and will not be elaborated herein. For example, the operating system of the intelligent terminal can sense input gestures of the user on the touch screen, and each time an input gesture is sensed, an action message will be sent out. The method and apparatus of the present disclosure can make the above determinations, obtain the above data, and further determine that a specified sliding gesture whose start point is in the specified area of the screen has occurred, simply by detecting the action messages through the API of the operating system of the intelligent terminal.
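Putting the module chain of paragraphs [0074] to [0081] together, the flow can be sketched as a small state machine fed with touch events. The event encoding ("DOWN"/"UP" strings, a finger count, and coordinates), the class name, and all threshold values are illustrative assumptions rather than any particular operating system's API; for brevity the slide vector is evaluated once at lift-off instead of continuously during the slide as modules 514 to 516 describe:

```python
import math

class EdgeSwipeDetector:
    """Sketch of modules 511-516: accept a single-finger touch that
    starts in the edge strip of width w/z, slides (nearly) vertically
    past a threshold, and then lifts off the screen."""

    def __init__(self, width, z=4, heightT=40.0,
                 angleT=math.radians(30), lengthT=40.0):
        self.width, self.z = width, z
        self.heightT, self.angleT, self.lengthT = heightT, angleT, lengthT
        self.start = None  # (x, y) of an accepted touch-down, if any

    def on_event(self, action, fingers, x, y):
        """Feed one touch event; return True when the gesture completes."""
        if action == "DOWN":
            strip = self.width / self.z
            # Modules 511-513: single finger inside the left/right strip.
            if fingers == 1 and (x <= strip or self.width - x <= strip):
                self.start = (x, y)
            else:
                self.start = None
        elif action == "UP" and self.start is not None:
            # Modules 514-516: evaluate the slide vector at lift-off.
            X = abs(x - self.start[0])
            Y = abs(y - self.start[1])
            self.start = None
            if X == 0 and Y >= self.heightT:
                return True
            if X > 0 and Y > 0:
                return (math.atan(X / Y) < self.angleT
                        and math.hypot(X, Y) >= self.lengthT)
        return False
```

A two-finger touch-down, or a touch-down outside the edge strip, clears the pending start point, matching the return to module 511 in the description.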
[0083] In addition, the respective function modules in the examples of the present disclosure can be integrated into one processing unit, can exist individually, or two or more of them can be integrated into one unit.
[0084] Fig. 8 shows an apparatus for controlling an application on a mobile terminal according to an example of the present disclosure. Referring to Fig. 8, the apparatus includes a memory 80, one or more processors 81, and one or more programs 82 stored in the memory 80 and configured to be executed by the one or more processors 81. The one or more programs 82 include instructions to execute the blocks 101 to 105 and the blocks 301 to 306.
[0085] Fig. 9 shows a non-transitory computer readable storage medium according to an example of the present disclosure. Referring to Fig. 9, the storage medium 90 stores the instructions that execute the blocks 101 to 105 and the blocks 301 to 306. The storage medium can be, for example, a paper storage medium (e.g., paper tape), a magnetic storage medium (e.g., floppy disk, hard disk, flash disk, etc.), an optical storage medium (e.g., CD-ROM), or a magneto-optical storage medium (e.g., MO, etc.).
[0086] The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.

Claims

WHAT IS CLAIMED IS:
1. A method for controlling an application on an intelligent terminal, comprising:
detecting, on a touch screen of the intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen;
identifying a start instruction of a function item of the application that corresponds to the sliding gesture, according to a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and
triggering the identified start instruction to start the function item of the application.
2. The method according to claim 1, wherein triggering the identified start instruction to start the function item of the application comprises:
obtaining content data of a content vector to which the first content page belongs; and
displaying the obtained content data on a second area of the touch screen.
3. The method according to claim 2, wherein after triggering the identified start instruction to start the function item of the application, the method further comprises:
detecting an item in the content data being triggered;
looking up a second content page that corresponds to the triggered item; and
displaying the second content page that corresponds to the triggered item on the touch screen.
4. The method according to claim 3, wherein after displaying the obtained content data on the second area of the touch screen and before detecting the item in the content data being triggered, the method further comprises:
detecting a triggering operation occurring outside of the second area of the touch screen; and
cancelling display of the content data according to the detected triggering operation.
5. The method according to any of claims 1 to 4, wherein detecting the sliding gesture whose start point is in the first area of the touch screen comprises:
determining that there is a single touch in the first area;
determining that the single touch slides;
determining that a horizontal sliding distance X and a vertical sliding distance Y that the single touch slides meet a preconfigured triggering condition; and
determining that the single touch leaves the touch screen.
6. The method according to claim 5, wherein the first area is an area on the left or right of the touch screen, and a width of the first area is smaller than or equal to a ratio of a width of the touch screen.
7. The method according to claim 6, wherein the preconfigured triggering condition that X and Y meet is that:
X equals 0 and Y is larger than or equal to a preconfigured smallest vertical movement distance; or
X is larger than 0, Y is larger than 0, arctg(X/Y) is smaller than a preconfigured largest angle, and X²+Y² is larger than or equal to a preconfigured smallest movement distance.
8. The method according to claim 6, wherein the ratio is 1/5 or 1/4.
9. An apparatus for controlling an application on an intelligent terminal, comprising: a memory;
one or more processors;
one or more programs stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions to:
detect, on a touch screen of the intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen;
identify a start instruction of a function item of the application that corresponds to the sliding gesture, according to a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and
trigger the identified start instruction to start the function item of the application.
10. The apparatus according to claim 9, wherein the one or more programs comprise instructions to:
obtain content data of a content vector to which the first content page belongs; and
display the obtained content data on a second area of the touch screen.
11. The apparatus according to claim 10, wherein the one or more programs comprise instructions to:
detect an item in the content data being triggered;
look up a second content page that corresponds to the triggered item; and
display the second content page that corresponds to the triggered item on the touch screen.
12. The apparatus according to claim 11, wherein the one or more programs comprise instructions to
detect a triggering operation occurring outside of the second area of the touch screen; and
cancel display of the content data according to the detected triggering operation.
13. The apparatus according to any of claims 9 to 12, wherein the one or more programs comprise instructions to
determine that there is a single touch in the first area;
determine that the single touch slides;
determine that a horizontal sliding distance X and a vertical sliding distance Y that the single touch slides meet a preconfigured triggering condition; and
determine that the single touch leaves the touch screen.
14. The apparatus according to claim 13, wherein the first area is an area on the left or right of the touch screen, and a width of the first area is smaller than or equal to a ratio of a width of the touch screen.
15. The apparatus according to claim 14, wherein the preconfigured triggering condition that X and Y meet is that:
X equals 0 and Y is larger than or equal to a preconfigured smallest vertical movement distance; or
X is larger than 0, Y is larger than 0, arctg(X/Y) is smaller than a preconfigured largest angle, and X²+Y² is larger than or equal to a preconfigured smallest movement distance.
16. The apparatus according to claim 14, wherein the ratio is 1/5 or 1/4.
17. A computer readable storage medium storing one or more programs configured to be executed by a computer system, the one or more programs comprising instructions to:
detect, on a touch screen of an intelligent terminal that displays a first content page of an application, a sliding gesture whose start point is in a first area of the touch screen;
identify a start instruction of a function item of the application that corresponds to the sliding gesture, according to a preconfigured mapping relation between the detected sliding gesture and the start instruction of the function item of the application; and
trigger the identified start instruction to start the function item of the application.
18. The computer readable storage medium according to claim 17, wherein the one or more programs comprise instructions to:
obtain content data of a content vector to which the first content page belongs; and
display the obtained content data on a second area of the touch screen.
19. The computer readable storage medium according to claim 18, wherein the one or more programs comprise instructions to:
detect an item in the content data being triggered;
look up a second content page that corresponds to the triggered item; and
display the second content page that corresponds to the triggered item on the touch screen.
20. The computer readable storage medium according to any of claims 17 to 19, wherein the one or more programs comprise instructions to:
determine that there is a single touch in the first area;
determine that the single touch slides;
determine that a horizontal sliding distance X and a vertical sliding distance Y that the single touch slides meet a preconfigured triggering condition; and
determine that the single touch leaves the touch screen.
PCT/CN2014/077843 2013-05-28 2014-05-20 Method and apparatus for controlling application on intelligent terminal WO2014190862A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310202964.X 2013-05-28
CN201310202964.XA CN104182166A (en) 2013-05-28 2013-05-28 Control method and device of intelligent terminal application program

Publications (1)

Publication Number Publication Date
WO2014190862A1 true WO2014190862A1 (en) 2014-12-04

Family

ID=51963260

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/077843 WO2014190862A1 (en) 2013-05-28 2014-05-20 Method and apparatus for controlling application on intelligent terminal

Country Status (2)

Country Link
CN (1) CN104182166A (en)
WO (1) WO2014190862A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105843594A (en) * 2015-01-13 2016-08-10 阿里巴巴集团控股有限公司 Method and device for displaying application program page of mobile terminal
CN106293795A (en) * 2015-06-09 2017-01-04 冠捷投资有限公司 Startup method
CN105094801B (en) * 2015-06-12 2019-12-24 阿里巴巴集团控股有限公司 Application function activation method and device
CN105182760A (en) * 2015-07-22 2015-12-23 小米科技有限责任公司 Intelligent household equipment remote control method, device and terminal
CN108536363A (en) * 2017-03-03 2018-09-14 上海传英信息技术有限公司 Program assembly display processing method and system applied to mobile terminal
CN110099210A (en) * 2019-04-22 2019-08-06 惠州Tcl移动通信有限公司 Function items setting method, device, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102169416A (en) * 2011-04-27 2011-08-31 宇龙计算机通信科技(深圳)有限公司 Mobile terminal and method for page jump of touch panel
WO2011130919A1 (en) * 2010-04-23 2011-10-27 Motorola Mobility, Inc. Electronic device and method using touch-detecting surface
CN102855081A (en) * 2011-06-07 2013-01-02 三星电子株式会社 Apparatus and method for providing web browser interface using gesture in device
CN103092496A (en) * 2011-11-02 2013-05-08 腾讯科技(深圳)有限公司 Browser control method, browser control device and mobile equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102955672B (en) * 2012-11-06 2015-12-16 北京京东世纪贸易有限公司 A kind of method and apparatus of display page on the equipment with touch-screen


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10908868B2 (en) 2017-01-26 2021-02-02 Huawei Technologies Co., Ltd. Data processing method and mobile device
US11567725B2 (en) 2017-01-26 2023-01-31 Huawei Technologies Co., Ltd. Data processing method and mobile device
CN111381666A (en) * 2018-12-27 2020-07-07 北京右划网络科技有限公司 Control method and device based on sliding gesture, terminal equipment and storage medium
CN111381666B (en) * 2018-12-27 2023-08-01 北京右划网络科技有限公司 Control method and device based on sliding gesture, terminal equipment and storage medium
CN113608658A (en) * 2021-06-15 2021-11-05 南京统信软件技术有限公司 Page sliding control method and mobile terminal
CN113608658B (en) * 2021-06-15 2024-01-02 南京统信软件技术有限公司 Page sliding control method and mobile terminal

Also Published As

Publication number Publication date
CN104182166A (en) 2014-12-03


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14804614

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 20.04.2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14804614

Country of ref document: EP

Kind code of ref document: A1