WO2016045579A1 - 一种应用程序交互控制方法、装置及终端 - Google Patents

一种应用程序交互控制方法、装置及终端 Download PDF

Info

Publication number
WO2016045579A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
touch
touch gesture
interaction control
touch area
Prior art date
Application number
PCT/CN2015/090288
Other languages
English (en)
French (fr)
Inventor
杨见
Original Assignee
努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Publication of WO2016045579A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof

Definitions

  • This disclosure relates to touch-screen technology, and in particular to an application interaction control method, device, and terminal.
  • Physical function buttons are usually provided on the border of a mobile terminal's screen, and while an application is running, some of its functions and parameters can be adjusted relatively quickly through these buttons.
  • For example, in a camera application, the exposure and the shutter time can be adjusted with the physical volume keys.
  • However, relying on physical buttons or on the application's own interactive interface to control the application may bring the following inconveniences:
  • First, the border of the screen is relatively narrow. Because the side borders on the left and right of the terminal screen are narrow, when the user holds the terminal to perform an interaction, the palm easily touches the screen near the side frame, producing false touches.
  • Second, the touch gestures of the related art are limited in scope and do not fully replace the functions of physical buttons in some application interfaces; at the same time, the application's own interactive interface is complicated to use and has significant limitations.
  • Third, the touch gestures of the related art are generally defined over the full screen, so when many functions are required, more touch gestures must be defined for them, which increases the user's difficulty of use and degrades the user experience.
  • The technical problem to be solved by the present invention is to provide an application interaction control method, device, and terminal, so as to solve the problem that the related art cannot rely entirely on touch gestures to control the operation of an application.
  • In the related art, touch gestures are limited in scope and do not fully replace physical buttons in some application interfaces; the application's own interface is complicated to use and restrictive; and because touch gestures are generally defined over the full screen, many functional requirements demand many distinct gestures, making the terminal harder to use and degrading the user experience.
  • An application interaction control method comprising:
  • Before the step of acquiring a touch gesture corresponding to the interaction control interface of the application through the touch area of the border of the mobile terminal, the method further includes:
  • the step of acquiring a touch gesture corresponding to the interaction control interface of the application in the touch area of the border of the mobile terminal includes:
  • the step of responding to the touch gesture by the interaction control interface and executing an interaction control command corresponding to the touch gesture includes:
  • the step of responding to the touch gesture by the interaction control interface and executing an interaction control command corresponding to the touch gesture further includes:
  • the step of acquiring a touch gesture corresponding to the interaction control interface of the application in the touch area of the border of the mobile terminal includes:
  • Before the step of acquiring the interaction control interface of the application, the method further includes:
  • The application control commands corresponding to the interaction control interface are extracted, the control commands of the application are divided into categories, and touch areas corresponding to control commands of the same category are set in adjacent positions.
  • the step of setting the corresponding touch area and the touch gesture according to the interaction control interface of the application includes:
  • Trigger data of a virtual button of the application is recorded, where the trigger data includes a trigger scene and a trigger frequency;
  • the touch area is associated with the selected touch gesture, so that the same touch gesture corresponds to different mobile terminal control commands in different touch areas.
  • the step of dividing the touch area by analyzing the trigger data and the frame structure of the mobile terminal includes:
  • the frame structure of the mobile terminal is analyzed, and the touch area of the mobile terminal is divided according to the user selection or according to the trigger scene and the trigger frequency.
  • An application interaction control device comprising a setting module and a control module, wherein:
  • the setting module is configured to: obtain a touch gesture corresponding to an interaction control interface of the application in a touch area of the border of the mobile terminal;
  • the control module is configured to: execute an interaction control command corresponding to the touch gesture by using the interaction control interface to respond to the touch gesture.
  • the device further includes an initialization module, where the initialization module includes an application identification unit, a first interaction control interface acquisition unit, and a second interaction control interface acquisition unit, where
  • the application identification unit is configured to: identify a currently running application within a screen display range;
  • the first interaction control interface acquiring unit is configured to: acquire an operation interface of the application, and acquire a first interaction control interface according to the operation interface;
  • the second interaction control interface acquiring unit is configured to: acquire an operating environment of the application, and acquire a second interaction control interface according to the operating environment.
  • the setting module includes a touch area dividing unit, a touch area associating unit, and a touch gesture setting unit, where
  • the touch area dividing unit is configured to: divide the touch area into a first level touch area and a second level touch area;
  • the touch area association unit is configured to: associate the first level touch area with the first interaction control interface, and associate the second level touch area and the second interaction control interface;
  • the touch gesture setting unit is configured to: set an application-level first touch gesture according to the first-level touch area, and set an operating-system-level second touch gesture according to the second-level touch area.
  • the setting module further includes a priority setting unit, a touch gesture acquiring unit, a touch gesture dividing unit, and a touch gesture response unit, where
  • the priority setting unit is configured to: set a first priority according to the first hierarchical touch area, and set a second priority according to the second hierarchical touch area;
  • the touch gesture acquisition unit is configured to: acquire the touch gesture through the touch area within a preset time;
  • the touch gesture division unit is configured to: classify the touch gesture as the first touch gesture or the second touch gesture;
  • the touch gesture response unit is configured to: respond to the first touch gesture of the first-level touch area according to the first priority, and respond to the second touch gesture of the second-level touch area according to the second priority;
  • the priority setting unit is further configured to: set a third priority according to the first priority and the second priority;
  • the touch gesture response unit is further configured to: respond to the first touch gesture of the first-level touch area according to the third priority, and simultaneously respond to the second touch gesture of the second-level touch area according to the third priority.
  • the setting module is configured to obtain a touch gesture corresponding to the interaction control interface of the application in the touch area of the border of the mobile terminal as follows:
  • the setting module is further configured to:
  • The application control commands corresponding to the interaction control interface are extracted, the control commands of the application are divided into categories, and touch areas corresponding to control commands of the same category are set in adjacent positions.
  • the setting module is configured to set a corresponding touch area and a touch gesture according to the interaction control interface of the application according to the following manner:
  • Trigger data of a virtual button of the application is recorded, where the trigger data includes a trigger scene and a trigger frequency;
  • the touch area is associated with the selected touch gesture, so that the same touch gesture corresponds to different mobile terminal control commands in different touch areas.
  • the setting module is configured to divide the touch area by analyzing the trigger data and a frame structure of the mobile terminal as follows:
  • the frame structure of the mobile terminal is analyzed, and the touch area of the mobile terminal is divided according to the user selection or according to the trigger scene and the trigger frequency.
  • An application interaction control terminal comprising any of the above application interaction control devices.
  • a computer program comprising program instructions that, when executed by a terminal, cause the terminal to perform any of the above-described application interaction control methods.
  • Different interaction control interfaces send the control commands of the corresponding application according to the corresponding buttons, combined buttons, and the like.
  • the interaction control of the application is more in line with the user's usage habits, and the user experience is enhanced.
  • FIG. 1 is a flowchart of an application interaction method according to an embodiment of the present invention.
  • FIG. 2 is a flowchart of an application interaction method according to a second preferred embodiment of the present invention.
  • FIG. 3 is a flowchart of an application interaction method according to a third preferred embodiment of the present invention.
  • FIG. 4 is a flowchart of an application interaction method according to a fourth preferred embodiment of the present invention.
  • FIG. 5 is a flowchart of an application interaction method according to a fifth preferred embodiment of the present invention.
  • FIG. 6 is a structural block diagram of an application interaction apparatus according to an embodiment of the present invention.
  • FIG. 1 is a flowchart of an application interaction method according to an embodiment of the present invention. The method includes:
  • S1 Obtain a touch gesture corresponding to an interaction control interface of the application in a touch area of the border of the mobile terminal.
  • the interactive interface of the application of the mobile terminal is provided with one or more virtual interactive buttons, such as a scroll up button, a down button, an add button, and a delete button in the application.
  • the mobile phone is taken as an example.
  • For example, the up key N1 and the down key N2 of the mobile phone are set at positions inside the upper border of the screen, the add key N3 at a position inside the right border, and the delete key N4 at a position inside the left border.
  • an interactive control interface for the up button N1, the down button N2, the add button N3, and the delete button N4 on the mobile phone is obtained.
  • The interaction control interface refers to a calling interface reserved by the mobile phone operating system for the utility functions of the application; by calling the operating-system-based interaction control interface of the application, the application is made to execute the control command and complete the corresponding function.
  • the first step is to obtain an interactive control interface of the up button N1 of the application of the mobile phone, an interactive control interface of the down key N2, an interactive control interface of the add key N3, and an interactive control interface of the delete key N4.
  • Two or more virtual interactive buttons may also be pressed simultaneously to implement a corresponding function; for example, pressing the down key N2 and the delete key N4 together triggers the screen capture function. In that case, the interaction control interface of the down key N2 and that of the delete key N4 are acquired simultaneously, and the interface invoked by the screen capture is synthesized from the control commands of these interfaces.
  • Alternatively, the corresponding function is implemented by pressing the buttons in a certain order; for example, when the down key N2 and then the delete key N4 are pressed within one second, a screen capture is triggered. In that case, the interaction control interface of the down key N2 is acquired first; if the interface of the delete key N4 is acquired within one second, the interface invoked by the screen capture is synthesized from the control commands of the two interfaces.
  • Different interaction control interfaces may also send the control commands of the corresponding application according to factors such as combination timing, for example, simultaneous combinations and sequential combinations.
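The sequential button-combination behavior described above can be sketched roughly as follows. This is an illustrative Python sketch, not part of the patent: the key names N2 and N4 and the one-second window come from the example above, while the class name, method names, and command string are assumptions.

```python
import time

SEQUENCE_WINDOW = 1.0  # seconds within which N2 followed by N4 triggers a capture


class ComboDetector:
    """Detects the sequential combination N2 -> N4 within a time window."""

    def __init__(self, window=SEQUENCE_WINDOW):
        self.window = window
        self.last_key = None
        self.last_time = None

    def press(self, key, now=None):
        """Record a key press; return a synthesized command if a combo fired."""
        now = time.monotonic() if now is None else now
        command = None
        # Sequential combination: down key N2 followed by delete key N4
        # within the window triggers a screen capture.
        if (self.last_key == "N2" and key == "N4"
                and now - self.last_time <= self.window):
            command = "screen_capture"
        self.last_key, self.last_time = key, now
        return command


detector = ComboDetector()
detector.press("N2", now=0.0)
result = detector.press("N4", now=0.5)  # within the 1 s window -> "screen_capture"
```

A simultaneous combination could be handled the same way with a much smaller window, which is one plausible reading of "combination timing" above.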
  • The application control commands corresponding to the interaction control interfaces are extracted, and the control commands of the application are divided into categories; the categories may be divided according to the control object, or according to the combination order of the interaction control interfaces, and so on. It can be understood that by categorizing the control commands of the application and setting the touch areas corresponding to control commands of the same category in adjacent positions, the touch operation better matches the user's usage habits and the user experience is enhanced.
  • the corresponding touch area and the touch gesture are set according to the interactive control interface of the application.
  • the specific settings are as follows:
  • the trigger data of the virtual button of the application is recorded within a preset time period.
  • the trigger data includes a trigger scene and a trigger frequency.
  • This step records the trigger data of the application's virtual buttons over a period of time, obtaining the trigger scene and trigger frequency of a single virtual button of the application, as well as the scene, order, and frequency of combinations of multiple virtual buttons.
  • the triggering scenario mentioned in this embodiment refers to a scenario based on an application interface of a current operating system, for example, a display interface of a current operating system, an interactive operation interface of an application, and the like.
  • the touch area is divided by analyzing the trigger data and the frame structure of the terminal.
  • On one hand, the trigger scene and trigger frequency of each application's virtual buttons are acquired and analyzed; on the other hand, the trigger scene and trigger frequency of the virtual buttons when used in combination are acquired and analyzed.
  • The analyzed data are then sorted, for example, by the frequency of use of the application's virtual keys or by the relevance between the virtual keys.
  • The border of the terminal in the embodiment of the present invention may be a touch panel area within the screen near its edge, or an area of the terminal frame structure near the edge of the screen. It can be understood that when the latter scheme is adopted, a corresponding touch sensor needs to be provided in this area.
  • A touch gesture database is used to collect touch gestures, and the touch gestures are classified according to gesture type.
  • a touch gesture corresponding to the physical range is selected according to the physical range of the touch area.
  • the physical range of the divided touch area is analyzed. If the physical range is small, a touch gesture of long pressing or double clicking is selected. If the physical range is large, a touch gesture of dragging or sliding may be selected.
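The area-size rule just described can be sketched as a small selection function. This is an assumed Python sketch; the 20 mm threshold and the gesture names are illustrative values not taken from the patent.

```python
# Choose candidate gesture types from the physical size of a divided touch
# area: small border regions get press-style gestures, long ones get slides.
SMALL_AREA_MM = 20.0  # assumed threshold separating "small" from "large"


def select_gestures(area_length_mm):
    """Return candidate touch gestures for a touch area of the given length."""
    if area_length_mm < SMALL_AREA_MM:
        return ["long_press", "double_tap"]  # small range: point gestures
    return ["drag", "slide"]                 # large range: movement gestures
```

For example, a narrow corner region would be offered long-press or double-tap, while a full side border would be offered drag or slide.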
  • The touch area and the selected touch gesture are associated.
  • the association between the touch area and the selected touch gesture is established, that is, the same touch gesture refers to different terminal control commands in different touch areas.
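The association described above amounts to keying control commands on the pair (touch area, gesture). A minimal sketch, with assumed area and command names chosen to echo the N1–N4 key layout earlier in the text:

```python
# Map (touch area, gesture) pairs to control commands, so the same gesture
# means different things in different border areas.
GESTURE_MAP = {
    ("top_border",   "slide"): "page_up",      # cf. up/down keys N1/N2
    ("right_border", "slide"): "add_item",     # cf. add key N3
    ("left_border",  "slide"): "delete_item",  # cf. delete key N4
}


def dispatch(area, gesture):
    """Resolve a gesture in a given touch area to a control command, if any."""
    return GESTURE_MAP.get((area, gesture))
```

Here the single "slide" gesture resolves to three different commands depending on which border area it occurs in, which is the reuse the text describes.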
  • the control command of the application corresponding to the interactive control interface is responded according to the touch gesture.
  • The four interaction control interfaces of the mobile phone's up key N1, down key N2, add key N3, and delete key N4 are associated with the corresponding touch gestures; then, the corresponding application control command is responded to according to the touch area where the touch gesture occurs and the type of the touch gesture.
  • the beneficial effect of the embodiment is that the control commands of the application corresponding to the interaction control interface are extracted, so that different interaction control interfaces can send corresponding control commands of the application according to the corresponding buttons or combined buttons.
  • the touch operation is more in line with the user's usage habits, and the user experience is enhanced.
  • FIG. 2 is a flow chart of an application interaction method according to a second preferred embodiment of the present invention.
  • Before the corresponding touch area and touch gestures are set according to the interaction control interface of the application, the method includes:
  • S01: Identify the currently running application within the screen display range.
  • One way is to inspect the details of the running processes in the operating system's process manager of the mobile terminal and determine the currently running application from the processes; another way is to recognize the interactive interface on the current screen through image recognition technology and identify the application corresponding to that interface.
  • S02 Obtain an operation interface of the application, and obtain a first interaction control interface according to the operation interface.
  • The running interface in this step refers to an interface that is displayed on the touch screen and belongs to the functional scope of the application under the current interactive interface; therefore, the running interface excludes the interactive interface of the operating system.
  • the interaction control interface corresponding to the add key N3 and the delete key N4 of the application M is the first interactive control interface.
  • S03 Obtain an operating environment of the application, and obtain a second interactive control interface according to the operating environment.
  • the operating environment of this step refers to the interactive interface of the operating system of the mobile terminal at the current moment.
  • the interactive control interface corresponding to the home screen key N5, the return key N6, and the multi-tasking switch key N7 of the operating system is the second interactive control interface.
  • The beneficial effect of this embodiment is that, by acquiring the running interface of the application and obtaining the first interaction control interface from it, and by acquiring the running environment of the application and obtaining the second interaction control interface from it, control commands at the application level and control commands at the operating system level can be clearly distinguished, making the control logic clearer and more efficient.
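Steps S01–S03 can be sketched as building two separate command sets, one per level. This is a hypothetical Python sketch; the application name, command names, and function signature are all assumptions made for illustration.

```python
# Separate application-level (first) and operating-system-level (second)
# interaction control interfaces, per steps S01-S03.
APP_COMMANDS = {
    # application-level commands, keyed by the identified running application
    "video_player": ["fast_forward", "rewind", "pause", "volume"],
}
OS_COMMANDS = ["home", "back", "task_switch"]  # OS-level commands (cf. N5-N7)


def build_interfaces(running_app):
    """Return (first interface, second interface) for the running app."""
    first = APP_COMMANDS.get(running_app, [])  # from the running interface
    second = list(OS_COMMANDS)                 # from the running environment
    return first, second
```

Keeping the two sets separate is what lets later embodiments bind them to different touch-area levels without ambiguity.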
  • FIG. 3 is a flow chart of an application interaction method according to a third preferred embodiment of the present invention.
  • setting the corresponding touch area and the touch gesture according to the interaction control interface of the application includes:
  • the touch area is divided into a first level touch area and a second level touch area.
  • the divided touch regions are again divided according to the first hierarchical touch region and the second hierarchical touch region.
  • The beneficial effect of this embodiment is that, by dividing the touch area into the first-level touch area and the second-level touch area, a single touch gesture can invoke the interaction control interfaces of multiple applications, improving the utilization of touch gestures.
  • Moreover, the application and the operating system are divided into two levels, the logic is clear, and the scheme is easy to learn and use, which reduces the learning difficulty of the touch gestures.
  • FIG. 4 is a flow chart of an application interaction method according to a fourth preferred embodiment of the present invention.
  • the interactive control interface responding to the touch gesture and executing the interactive control command corresponding to the touch gesture further includes:
  • For example, in a video player application there are a fast-forward touch gesture P1, a fast-rewind touch gesture P2, a pause/play gesture P3, and a volume adjustment gesture P4; in a camera application there are a touch gesture Q1 for zooming in the focal length, a touch gesture Q2 for zooming out the focal length, and a touch gesture Q3 for adjusting the shutter time; at the operating system level there are a home-screen touch gesture R1, a return touch gesture R2, and a multi-task switch touch gesture R3.
  • The fast-forward touch gesture P1 is the same as the zoom-in touch gesture Q1; the fast-rewind touch gesture P2 is the same as the zoom-out touch gesture Q2; and the volume adjustment gesture P4 is the same as the shutter-time adjustment touch gesture Q3.
  • The touch gesture is classified as the first touch gesture or the second touch gesture.
  • The beneficial effect of this embodiment is that a first priority is set for the first-level touch area and a second priority is set for the second-level touch area, and the corresponding touch gesture is responded to according to its touch area and priority; this makes the control commands of touch gestures more diverse and avoids conflicts between the control commands of touch gestures when multiple programs run in parallel.
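The two-level dispatch described in this embodiment can be sketched as follows. This is an assumed Python sketch: the gesture names and the mapping to commands are illustrative, but the structure mirrors the text, where identical gestures (such as P1 and Q1) are resolved by the running application at the first level, while OS-level gestures live in a separate second level.

```python
# First-level (application) gestures, resolved by the identified running app;
# the same gesture names deliberately recur across apps, as with P1/Q1 above.
FIRST_LEVEL = {
    "video_player": {"slide_right": "fast_forward", "slide_left": "rewind"},
    "camera":       {"slide_right": "zoom_in",      "slide_left": "zoom_out"},
}
# Second-level (operating system) gestures, independent of the running app.
SECOND_LEVEL = {"double_tap": "home", "long_press": "task_switch"}


def respond(level, gesture, running_app=None):
    """Resolve a gesture according to its touch-area level (priority)."""
    if level == 1:  # first priority: first-level touch area
        return FIRST_LEVEL.get(running_app, {}).get(gesture)
    return SECOND_LEVEL.get(gesture)  # second priority: second-level area
```

Because each level is resolved independently, the same slide gesture triggers fast-forward in the video player but zoom-in in the camera, without colliding with the OS-level gestures.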
  • FIG. 5 is a flowchart of an application interaction method according to a fifth preferred embodiment of the present invention.
  • the interactive control interface responding to the touch gesture and executing the interactive control command corresponding to the touch gesture further includes:
  • The third priority is set according to the first priority and the second priority: first, the first priority and the second priority are determined, and the third priority is then set according to the range of the determined results and used to judge the touch gesture.
  • The first priority is of at least one type; therefore, the foregoing third priority, which is also of at least one type, may be set according to the different first priorities (one or more) and the second priority.
  • The beneficial effect of this embodiment is that the third priority is set from the first priority and the second priority, and both the first touch gesture of the first-level touch area and the second touch gesture of the second-level touch area are responded to according to the third priority, so that the preset control priorities accurately respond to the control commands required by the user.
  • FIG. 6 is a structural block diagram of an application interaction apparatus according to an embodiment of the present invention.
  • the embodiment of the invention further provides an application interaction control device, the device comprising:
  • the setting module 10 is configured to: set a corresponding touch area and a touch gesture according to an interaction control interface of the application;
  • the control module 20 is configured to: respond to the touch gesture through the interactive control interface, and execute an interactive control command corresponding to the touch gesture.
  • the device further includes an initialization module 30, where the initialization module 30 includes an application identification unit 31, a first interaction control interface acquisition unit 32, and a second interaction control interface acquisition unit 33, where
  • the application identification unit 31 is configured to: identify the currently running application within the on-screen display range;
  • the first interaction control interface obtaining unit 32 is configured to: acquire an operation interface of the application, and acquire a first interaction control interface according to the operation interface;
  • the second interaction control interface obtaining unit 33 is configured to: acquire an operating environment of the application, and acquire a second interactive control interface according to the operating environment.
  • the setting module 10 includes a touch area dividing unit 11, a touch area associating unit 12, and a touch gesture setting unit 13, wherein:
  • the touch area dividing unit 11 is configured to: divide the touch area into a first level touch area and a second level touch area;
  • the touch area association unit 12 is configured to: associate the first level touch area with the first interaction control interface, and associate the second level touch area and the second interaction control interface;
  • the touch gesture setting unit 13 is configured to: set the first touch gesture of the application level according to the first level touch area, and set the second touch gesture of the operating system level according to the second level touch area.
  • the setting module 10 further includes a priority setting unit 14, a touch gesture acquiring unit 15, a touch gesture dividing unit 16, and a touch gesture response unit 17, wherein
  • the priority setting unit 14 is configured to: set a first priority according to the first level touch area, and set a second priority according to the second level touch area;
  • the touch gesture acquiring unit 15 is configured to: acquire a touch gesture through the touch area within a preset time;
  • the touch gesture division unit 16 is configured to: divide the touch gesture according to the first touch gesture and the second touch gesture;
  • the touch gesture response unit 17 is configured to: respond to the first touch gesture of the first level touch area according to the first priority, and respond to the second touch gesture of the second level touch area according to the second priority;
  • the priority setting unit 14 is further configured to: set a third priority according to the first priority and the second priority;
  • the touch gesture response unit 17 is further configured to: respond to the first touch gesture of the first level touch area according to the third priority, and respond to the second touch gesture of the second level touch area according to the third priority.
  • the embodiment of the invention further provides an application interaction control terminal, which comprises the above application interaction control device.
  • the embodiment of the invention further discloses a computer program comprising program instructions which, when executed by a terminal, cause the terminal to execute any of the above application interaction control methods.
  • the embodiment of the invention also discloses a carrier carrying the computer program.
  • control commands of the corresponding application are sent through different interaction control interfaces according to the corresponding key or key combination.
  • the interaction control of the application better matches the user's usage habits, and the user experience is enhanced.
  • the application interaction control device can be used in a mobile phone or another communication terminal with an application interaction control function, such as a smart phone; it can be a software unit running in the communication terminal, or it can be integrated into the communication terminal as an independent add-on, or run in the application system of the mobile terminal.
  • therefore, the present invention has strong industrial applicability.
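The bullets above enumerate a module/unit structure (setting module 10, control module 20). A minimal sketch of how such modules could cooperate is shown below; the region names, gesture names, and placeholder commands are hypothetical and do not come from the patent itself.

```python
class SettingModule:
    """Module 10: associates touch areas and gestures with control commands."""

    def __init__(self):
        # (touch_area, gesture) -> callable control command
        self.bindings = {}

    def bind(self, touch_area, gesture, command):
        self.bindings[(touch_area, gesture)] = command


class ControlModule:
    """Module 20: responds to a gesture and executes the bound command."""

    def __init__(self, setting_module):
        self.setting_module = setting_module

    def handle(self, touch_area, gesture):
        command = self.setting_module.bindings.get((touch_area, gesture))
        return command() if command else None


# Hypothetical usage: a double-tap on the left bezel edge issues "delete".
setting = SettingModule()
setting.bind("left-edge", "double-tap", lambda: "delete")
control = ControlModule(setting)
print(control.handle("left-edge", "double-tap"))  # delete
```

An unbound (area, gesture) pair simply resolves to no command, mirroring the patent's idea that only gestures registered for a touch area trigger an interaction control interface.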

Abstract

A method, device, and terminal for application interaction control. The method includes: acquiring, in a touch area on the bezel of a mobile terminal, a touch gesture corresponding to an interaction control interface of an application (S1); responding to the touch gesture through the interaction control interface, and executing an interaction control command corresponding to the touch gesture (S2). By extracting the application control commands corresponding to the interaction control interfaces, the method, device, and terminal enable different interaction control interfaces to send the corresponding application control commands according to the corresponding key, key combination, or similar situations. Meanwhile, processing operations such as classifying the application control commands, dividing the touch area, and assigning priorities make the interaction control of the application better match the user's usage habits and enhance the user experience.

Description

一种应用程序交互控制方法、装置及终端 技术领域
本文涉及屏幕触控技术,尤其涉及一种应用程序交互控制方法、装置及终端。
背景技术
在相关技术中,通常会在移动终端屏幕的边框设置一些物理的功能按键,在使用应用程序的过程中,通过这些物理按键可以较为快捷地调控应用程序的一些功能和参数。例如,在相机应用程序下,通过音量增减的物理键可以调节曝光度以及定时快门时间等。
同时,除了使用物理按键对应用程序进行控制,还可以通过在应用程序的交互界面上采用点击或者滑动操作,以实现对应用程序的一些功能和参数进行调控的目的。
但是,随着移动终端的边框越做越窄,出现省去物理按键的移动终端,因此,失去了通过这些物理按键较为快捷地调控应用程序的一些功能和参数这一途径。
同时,由于应用程序的功能越来越复杂,相应的交互界面也越来越复杂,每个应用程序的交互界面和交互方式也各不相同。因此,通过在应用程序的交互界面上采用点击或者滑动操作,以实现对应用程序的一些功能和参数进行调控时,实用性较差、操作难度较大,用户体验不佳,还容易造成误触现象的发生。
更进一步地,随着移动终端的屏幕尺寸也在向大的趋势发展,在这种情况下通过物理按键或者应用程序的交互界面对应用程序进行调控可能会有以下不便之处:
屏幕的边框比较窄，一方面，当终端屏幕左右两边的侧边框比较窄时，当用户手握终端进行相应的应用程序交互操作时，手掌容易误触侧边框的屏幕，从而导致误触现象的发生；
另一方面,当终端屏幕上下两边的顶、底部边框比较窄时,当用户横握终端观看视频,并对播放器程序的音量或者光线进行调节时,手指容易误触及其它的屏幕区域,从而导致误触现象的发生,同时,当下边框过窄,用户在握持终端操作点击屏幕下端按键的过程中,容易使得终端的重心不稳,从而造成终端从手中滑落的危险。
因此,相关技术中还没有一种完全依靠触控手势对应用程序进行控制操作的方法,同时,相关技术的触控手势操作局限性较大,在有些应用程序的界面内并不能很好替代物理按键的功能,同时,应用程序本身的交互界面的使用步骤较为复杂,局限性较大。另一方面,相关技术的触控手势通常定义在全屏幕,因此,当功能较多时,需要为此划分较多的触控手势,从而使得用户的使用难度增强,用户体验较差。
发明内容
有鉴于此,本发明要解决的技术问题是提供一种应用程序交互控制方法、装置及终端,以解决相关技术中还没有一种完全依靠触控手势对应用程序进行控制操作的方法,同时,相关技术的触控手势操作局限性较大,在有些应用程序的界面内并不能很好替代物理按键的功能,应用程序本身的交互界面的使用步骤较为复杂,局限性较大,另一方面,相关技术的触控手势通常定义在全屏幕,因此,当功能需求较多时,需要为此划分较多的触控手势,从而使得用户的使用难度增强,用户体验较差的缺陷。
本发明解决上述技术问题所采用的技术方案如下:
一种应用程序交互控制方法,所述方法包括:
在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势;
通过所述交互控制接口响应所述触控手势,并执行与所述触控手势对应的交互控制命令。
可选地,所述在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势的步骤之前,该方法还包括:
在屏显范围内识别当前运行的应用程序;
获取所述应用程序的运行界面,并根据所述运行界面获取第一交互控制接口;
获取所述应用程序的运行环境,并根据所述运行环境获取第二交互控制接口。
可选地,所述在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势的步骤包括:
将所述触控区划分为第一层级触控区和第二层级触控区;
关联所述第一层级触控区和第一交互控制接口,同时,关联所述第二层级触控区和第二交互控制接口;
根据所述第一层级触控区设置应用程序级别的第一触控手势,同时,根据所述第二层级触控区设置操作系统级别的第二触控手势。
可选地,所述通过所述交互控制接口响应所述触控手势,并执行与所述触控手势对应的交互控制命令的步骤包括:
根据所述第一层级触控区设置第一优先级,同时,根据所述第二层级触控区设置第二优先级;
在预设时间内通过所述触控区内获取所述触控手势;
将所述触控手势按第一触控手势和第二触控手势进行划分;
按所述第一优先级响应所述第一层级触控区的第一触控手势,同时,按所述第二优先级响应所述第二层级触控区的第二触控手势。
可选地,所述通过所述交互控制接口响应所述触控手势,并执行与所述触控手势对应的交互控制命令的步骤还包括:
根据所述第一优先级和所述第二优先级设置第三优先级;
按所述第三优先级响应所述第一层级触控区的第一触控手势,同时,按所述第三优先级响应所述第二层级触控区的第二触控手势。
可选地,所述在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势的步骤包括:
获取所述应用程序的交互控制接口,之后,再根据所述应用程序的交互控制接口设置相应的触控区和触控手势。
可选地,所述获取所述应用程序的交互控制接口的步骤之前,该方法还包括:
提取与交互控制接口对应的应用程序控制命令,并划分应用程序的控制命令的类别,将同一类别的应用程序的控制命令所对应的触控区域按邻近方位设置。
可选地,所述根据所述应用程序的交互控制接口设置相应的触控区和触控手势的步骤包括:
在预设时间段内记录所述应用程序的虚拟按键的触发数据,其中,该触发数据包括触发场景以及触发频率;
通过分析所述触发数据以及移动终端的边框结构划分触控区域;
建立触控手势数据库,在手势数据库内,根据所述触控区域的物理范围选取与所述物理范围相应的触控手势;
关联所述触控区域与所选取的触控手势,使得同一触控手势在不同的触控区域指代的是不同的移动终端控制命令。
可选地,所述通过分析所述触发数据以及移动终端的边框结构划分触控区域的步骤包括:
获取并分析多个应用程序的虚拟按键的触发场景以及触发频率;
分析该移动终端的边框结构,根据用户选择或者根据所述触发场景和触发频率划分该移动终端的触控区域。
一种应用程序交互控制装置,所述装置包括设置模块和控制模块,其中:
所述设置模块设置成:在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势;
所述控制模块设置成:通过所述交互控制接口响应所述触控手势,执行与所述触控手势对应的交互控制命令。
可选地,该装置还包括初始化模块,所述初始化模块包括应用程序识别单元、第一交互控制接口获取单元以及第二交互控制接口获取单元,其中,
所述应用程序识别单元设置成:在屏显范围内识别当前运行的应用程序;
所述第一交互控制接口获取单元设置成:获取所述应用程序的运行界面,并根据所述运行界面获取第一交互控制接口;
所述第二交互控制接口获取单元设置成:获取所述应用程序的运行环境,并根据所述运行环境获取第二交互控制接口。
可选地,所述设置模块包括触控区划分单元、触控区关联单元以及触控手势设置单元,其中,
所述触控区划分单元设置成:将所述触控区划分为第一层级触控区和第二层级触控区;
所述触控区关联单元设置成:关联所述第一层级触控区和第一交互控制接口,同时,关联所述第二层级触控区和第二交互控制接口;
所述触控手势设置单元设置成:根据所述第一层级触控区设置应用程序级别的第一触控手势,同时,根据所述第二层级触控区设置操作系统级别的第二触控手势。
可选地,所述设置模块还包括优先级设置单元、触控手势获取单元、触控手势划分单元以及触控手势响应单元,其中,
所述优先级设置单元设置成:根据所述第一层级触控区设置第一优先级,同时,根据所述第二层级触控区设置第二优先级;
所述触控手势获取单元设置成:在预设时间内通过所述触控区内获取所述触控手势;
所述触控手势划分单元设置成:将所述触控手势按第一触控手势和第二触控手势进行划分;
所述触控手势响应单元设置成:按所述第一优先级响应所述第一层级触控区的第一触控手势,同时,按所述第二优先级响应所述第二层级触控区的第二触控手势;
所述优先级设置单元还设置成:根据所述第一优先级和所述第二优先级设置第三优先级;
所述触控手势响应单元还设置成:按所述第三优先级响应所述第一层级触控区的第一触控手势,同时,按所述第三优先级响应所述第二层级触控区的第二触控手势。
可选地,所述设置模块设置成按照如下方式在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势:
获取所述应用程序的交互控制接口,之后,再根据所述应用程序的交互控制接口设置相应的触控区和触控手势。
可选地,所述设置模块还设置成:
提取与交互控制接口对应的应用程序控制命令,并划分应用程序的控制命令的类别,将同一类别的应用程序的控制命令所对应的触控区域按邻近方位设置。
可选地,所述设置模块设置成按照如下方式根据所述应用程序的交互控制接口设置相应的触控区和触控手势:
在预设时间段内记录所述应用程序的虚拟按键的触发数据,其中,该触发数据包括触发场景以及触发频率;
通过分析所述触发数据以及移动终端的边框结构划分触控区域;
建立触控手势数据库,在手势数据库内,根据所述触控区域的物理范围选取与所述物理范围相应的触控手势;
关联所述触控区域与所选取的触控手势,使得同一触控手势在不同的触控区域指代的是不同的移动终端控制命令。
可选地,所述设置模块设置成按照如下方式通过分析所述触发数据以及移动终端的边框结构划分触控区域:
获取并分析多个应用程序的虚拟按键的触发场景以及触发频率;
分析该移动终端的边框结构,根据用户选择或者根据所述触发场景和触发频率划分该移动终端的触控区域。
一种应用程序交互控制终端,所述终端包括上述任意的应用程序交互控制装置。
一种计算机程序,包括程序指令,当该程序指令被终端执行时,使得该终端可执行上述任意的应用程序交互控制方法。
一种载有所述的计算机程序的载体。
本发明技术方案，通过提取与交互控制接口对应的应用程序的控制命令，实现了以不同的交互控制接口根据相应的按键或者组合的按键等情形，发送相应的应用程序的控制命令。同时，通过划分应用程序的控制命令的类别、划分触控区以及划分优先级等处理操作，使得应用程序的交互控制更符合用户的使用习惯，增强了用户体验。
附图概述
下面将结合附图及实施例对本发明作进一步说明,附图中:
图1是本发明实施例提出的应用程序交互方法的流程图;
图2是本发明第二较佳实施例提出的应用程序交互方法的流程图;
图3是本发明第三较佳实施例提出的应用程序交互方法的流程图;
图4是本发明第四较佳实施例提出的应用程序交互方法的流程图;
图5是本发明第五较佳实施例提出的应用程序交互方法的流程图;
图6是本发明实施例提出的应用程序交互装置的结构框图。
本发明的较佳实施方式
以下结合附图和实施例,对本发明进行进一步详细说明。应当理解,此处所描述的具体实施例仅仅用以解释本发明,并不用于限定本发明。
实施例一
图1是本发明实施例提出的应用程序交互方法的流程图。该方法包括:
S1,在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势。
通常而言,移动终端的应用程序的交互界面设置有一个或者多个虚拟交互按键,比如应用程序内的上翻键、下翻键、添加键以及删除键等。本实施例以手机为例,该手机的上翻键N1、下翻键N2设置在屏幕内靠上边框的位置,添加键N3设置在屏幕内靠右边框的位置,删除键N4则设置在屏幕内靠左边框的位置。
首先,获取该手机上翻键N1、下翻键N2、添加键N3以及删除键N4的交互控制接口。可以理解,该交互控制接口是指基于手机操作系统的应用程序留给该应用程序实用功能的一个调用接口,通过调用基于操作系统的应用程序的交互控制接口而使应用程序去执行该控制命令,以完成相应的功能。
因此,本步骤首先是获取该手机的应用程序的上翻键N1的交互控制接口、下翻键N2的交互控制接口、添加键N3的交互控制接口以及删除键N4的交互控制接口。
可选地,该手机的应用程序(例如应用程序M)的交互界面内,通过两个或两个以上的虚拟交互按键同时按下以实现相应的功能,例如采用下翻键N2与删除键N4同时按下时触发的是截屏功能。因此,同时获取下翻键N2的交互控制接口与删除键N4的交互控制接口,再根据两者接口的控制命令合成截屏所调用的交互控制接口。
可选地,该手机的应用程序M的交互界面内,通过按一定次序按下以实现相应的功能,例如采用下翻键N2与删除键N4分别在一秒内先后按下时触发的是截屏功能。因此,则首先获取下翻键N2的交互控制接口,然后在一秒的时间内检测是否获取到删除键N4的交互控制接口,若是,则根据两者接口的控制命令合成截屏所调用的交互控制接口。
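上文描述的按次序组合（下翻键 N2 按下后，一秒内再按下删除键 N4 触发截屏）可以概括为对带时间戳按键事件序列的一次扫描。下面是一个示意性实现：一秒窗口和按键名取自正文示例，事件的表示方式（时间戳、按键名的二元组列表）是为说明而做的假设。

```python
COMBO_WINDOW = 1.0  # seconds; the one-second window from the example in the text


def detect_ordered_combo(events, first_key, second_key, window=COMBO_WINDOW):
    """Return True if second_key is pressed within `window` seconds after first_key.

    `events` is a chronologically ordered list of (timestamp, key) pairs.
    """
    last_first = None
    for ts, key in events:
        if key == first_key:
            last_first = ts  # remember the most recent press of the first key
        elif key == second_key and last_first is not None:
            if ts - last_first <= window:
                return True  # ordered combination detected -> e.g. screenshot
            last_first = None  # too late; the sequence must restart
    return False


# N4 arrives 0.4 s after N2: the screenshot combination fires.
print(detect_ordered_combo([(0.0, "N2"), (0.4, "N4")], "N2", "N4"))
```

若 N4 在窗口之外到达，或两键次序颠倒，函数返回 False，对应正文中"先后按下"这一次序要求。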
可选地，该手机的应用程序M的交互界面内，通过滑动特定的交互控制虚拟手势，发送特定的应用程序控制命令，同时，不同的交互控制接口还可以根据组合的情形来发送相应的应用程序控制命令。
可选地,不同的交互控制接口还可以根据组合的时间,例如同一时刻的组合情形和按次序的先后组合情形等因素发送相应的应用程序的控制命令。
因此,本实施例是在获取终端的物理按键所对应的交互控制接口之前,提取与交互控制接口对应的应用程序控制命令,并划分应用程序的控制命令的类别,该类别可以按照控制对象划分、或者按照交互控制接口的组合次序划分等等,可以理解,通过划分应用程序的控制命令的类别,并将同一类别的应用程序的控制命令所对应的触控区域按邻近方位设置,使得触控操作更符合用户的使用习惯,增强用户体验。
当完成对交互控制接口的获取操作后,再根据应用程序的交互控制接口设置相应的触控区和触控手势。具体的设置方式如下所述:
首先,在预设时间段内记录应用程序的虚拟按键的触发数据。其中,触发数据包括触发场景以及触发频率。
由于不同的应用程序的虚拟按键所被使用的场景和频率是不同的,同时,多个应用程序的虚拟按键被使用的次序和频率也是不同的。因此,本步骤是在一段时间内,统计应用程序的虚拟按键的触发数据,获取单个应用程序的虚拟按键的触发场景以及触发频率,以及获取多个应用程序的虚拟按键被连同使用的场景、次序和频率。可以理解,本实施例所提及的触发场景是指基于当前操作系统的应用程序界面的场景,例如,当前操作系统的显示界面以及应用程序的交互操作界面等等。
然后,通过分析触发数据以及终端的边框结构划分触控区域。
一方面,获取并分析各应用程序的虚拟按键的触发场景以及触发频率,另一方面,获取并分析该应用程序的虚拟按键在组合使用时的触发场景以及触发频率。整理上述分析的数据,例如,按应用程序的虚拟按键的使用频率排序或者按应用程序的虚拟按键的使用先后次序的相关性的高低排序。
然后，分析该手机的边框结构，根据用户选择或者根据使用场景和使用频率划分该手机的边框。划分的方式可以按照使用的频率越高，划分的边框的触控区域越大，或者使得划分的边框的触控区域更接近用户手指最容易按到的区域范围。
可以理解,本发明实施例所指的终端的边框可以是屏幕内靠近屏幕边缘的触控面板区域,或者是屏幕外靠近屏幕边缘的终端框架结构的区域,可以理解,当采用后者方案时,需要在该区域内设置相应的触感感应器。
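上文"使用频率越高，划分的边框的触控区域越大"的划分方式，可以用一个简单的按频率比例分配来示意。下例中 100 个单位的边框长度与各虚拟按键的触发次数均为说明而虚构，并非专利中的数据。

```python
def allocate_regions(bezel_length, trigger_freq):
    """Split a bezel of `bezel_length` units among virtual keys,
    proportionally to each key's recorded trigger frequency."""
    total = sum(trigger_freq.values())
    return {key: bezel_length * freq / total for key, freq in trigger_freq.items()}


# Hypothetical trigger counts recorded over the preset time period.
freq = {"N1": 50, "N2": 30, "N4": 20}
regions = allocate_regions(100, freq)
print(regions["N1"])  # the most frequently used key gets the largest region: 50.0
```

实际实现还可叠加正文提到的另一约束（把高频按键的区域放在手指最容易按到的位置），此处仅示意"频率决定区域大小"这一步。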
再次,建立触控手势数据库。通过该触控手势数据库录取触控手势,并将录取的触控手势按手势类型进行划分。
在手势数据库内,根据触控区域的物理范围选取与物理范围相应的触控手势。分析上述经划分后的触控区域的物理范围,若该物理范围较小,则选择长按或者双击的触控手势,若该物理范围较大,则可以选择拖拽或者滑动的触控手势。
最后,关联触控区域与选取的触控手势。建立触控区域与选取的触控手势的关联关系,即使得同一触控手势在不同的触控区域指代的是不同的终端控制命令。
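此处建立的关联关系——同一触控手势在不同的触控区域指代不同的终端控制命令——本质上是以（区域，手势）二元组为键的查找表。下面是一个最小示意，其中的区域名与命令名均为假设：

```python
# Hypothetical association table: the same gesture ("swipe") denotes a
# different control command depending on the touch region it occurs in.
GESTURE_TABLE = {
    ("top-edge", "swipe"): "page_up",
    ("right-edge", "swipe"): "volume_up",
}


def resolve(region, gesture):
    """Look up the command associated with this (region, gesture) pair."""
    return GESTURE_TABLE.get((region, gesture))


print(resolve("top-edge", "swipe"))    # page_up
print(resolve("right-edge", "swipe"))  # volume_up
```

以复合键查表，正是"少量手势覆盖多种命令"这一设计意图的直接体现：手势种类不增加，区分度来自区域。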
S2,通过交互控制接口响应所述触控手势,并执行与触控手势对应的交互控制命令。
根据触控手势响应与交互控制接口对应的应用程序的控制命令。如上例所述,首先,将手机上翻键N1、下翻键N2、添加键N3以及删除键N4四个交互控制接口与相应的触控手势关联,然后,根据触控手势的触控区域以及触控手势的类型响应相应的应用程序控制命令。
本实施例的有益效果在于,通过提取与交互控制接口对应的应用程序的控制命令,使得不同的交互控制接口可以根据相应的按键或者组合的按键等情形,发送相应的应用程序的控制命令。同时,通过划分应用程序的控制命令的类别,使得触控操作更符合用户的使用习惯,增强了用户体验。
实施例二
图2是本发明第二较佳实施例提出的应用程序交互方法的流程图。
基于上述实施例，在完成对交互控制接口的获取操作、并根据应用程序的交互控制接口设置相应的触控区和触控手势之前，该方法还包括：
S01,在屏显范围内识别当前运行的应用程序。一种方式是,在移动终端的操作系统进程管理器中检测运行进程的详细信息,并根据该进程获取当前运行的应用程序;另一种方式是通过图像识别技术识别当前屏幕的交互界面,通过该交互界面判断识别与之对应的应用程序。
S02，获取应用程序的运行界面，并根据运行界面获取第一交互控制接口。本步骤的运行界面是指应用程序在当前交互界面下，显示于触控屏内的属于该应用程序功能范围内的界面。因此，该运行界面排除了操作系统的交互界面。例如，应用程序M的添加键N3以及删除键N4所对应的交互控制接口即为第一交互控制接口。
S03,获取应用程序的运行环境,并根据运行环境获取第二交互控制接口。本步骤的运行环境是指移动终端的操作系统在当前时刻的交互界面。例如,当应用程序M在运行时,此时操作系统的主屏幕键N5、返回键N6以及多任务切换键N7所对应的交互控制接口即为第二交互控制接口。
本实施例的有益效果在于,通过获取应用程序的运行界面,并根据运行界面获取第一交互控制接口,获取应用程序的运行环境,并根据运行环境获取第二交互控制接口,使得在对应用程序交互控制时,更能明确区分应用程序内的控制命令和操作系统级别的控制命令,控制逻辑更为清晰、高效。
实施例三
图3是本发明第三较佳实施例提出的应用程序交互方法的流程图。
基于上述实施例,根据应用程序的交互控制接口设置相应的触控区和触控手势包括:
S11,将触控区划分为第一层级触控区和第二层级触控区。在上述实施例一对触控区的划分方法的基础上,本实施例对划分后的触控区再次按照第一层级触控区和第二层级触控区进行划分。
S12，关联第一层级触控区和第一交互控制接口，同时，关联第二层级触控区和第二交互控制接口。可以理解，针对于应用程序，在第一层级触控区获取触控手势，根据该触控手势触发第一交互控制接口，同理可知，针对于操作系统，在第二层级触控区获取触控手势，根据该触控手势触发第二交互控制接口。
S13,根据第一层级触控区设置应用程序级别的第一触控手势,同时,根据第二层级触控区设置操作系统级别的第二触控手势。可以理解,由于划分了第一层级触控区和第二层级触控区,因此,第一触控手势与第二触控手势可以是相同或者不同的,从而使得同一触控手势适用的范围更为广泛,当运行于操作系统的应用程序较多,交互操作界面较为复杂时,保证了使用较少的触控手势,在较多的应用程序以及相应应用程序的交互界面内实现交互控制。
本实施例的有益效果在于,通过将触控区划分为第一层级触控区和第二层级触控区,实现了以单一的触控手势调用多个应用程序的交互控制接口,提高了触控手势的利用率,同时,应用程序与操作系统两层级的划分,逻辑清晰,便于用户学习使用,降低了触控手势的学习难度。
实施例四
图4是本发明第四较佳实施例提出的应用程序交互方法的流程图。
基于上述实施例,通过交互控制接口响应触控手势,并执行与触控手势对应的交互控制命令还包括:
S21,根据第一层级触控区设置第一优先级,同时,根据第二层级触控区设置第二优先级。
例如,在播放器应用程序P内,有快进触控手势P1、快退触控手势P2、暂停播放手势P3以及音量调节手势P4;在拍照应用程序Q内,有拉近焦距的触控手势Q1、拉远焦距的触控手势Q2以及调整延时快门时间的触控手势Q3;在该移动终端的操作系统内,有主屏幕触控手势R1、返回触控手势R2以及多任务切换触控手势R3。
其中，快进触控手势P1与拉近焦距的触控手势Q1相同；快退触控手势P2与拉远焦距的触控手势Q2相同；音量调节手势P4与调整延时快门时间的触控手势Q3相同。
设置第一优先级,其中,P1优先于Q1,P2优先于Q2,P4优先于Q3,那么,当播放器应用程序P与拍照应用程序Q同时分屏运行时,若检测到快进触控手势P1或拉近焦距的触控手势Q1,则优先响应快进操作;若检测到快退触控手势P2与拉远焦距的触控手势Q2,则优先响应快退操作;若检测到音量调节手势P4与调整延时快门时间的触控手势Q3,则优先响应音量调节。
可以理解,再按上述方法设置主屏幕触控手势R1、返回触控手势R2以及多任务切换触控手势R3的第二优先级。
S22,在预设时间内通过触控区内获取触控手势。
S23,将触控手势按第一触控手势和第二触控手势进行划分。
S24,按第一优先级响应第一层级触控区的第一触控手势,同时,按第二优先级响应第二层级触控区的第二触控手势。
本实施例的有益效果在于,通过在第一层级触控区设置第一优先级,同时,在第二层级触控区设置第二优先级,并根据相应的触控区以及相应的优先级响应相应的触控手势,从而使得触控手势的控制命令更加多样化,避免了在多程序并行运行时,容易造成触控手势对应的控制命令相冲突的缺陷。
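实施例四中的第一优先级裁决（例如 P 与 Q 分屏运行时，快进手势 P1 优先于拉近焦距手势 Q1）可以概括为：在绑定了同一物理手势的运行中应用里，选取优先级最高的那一个。下例中的数字优先级与手势名均为说明而虚构。

```python
# Each entry: physical gesture -> list of (priority, app, command).
# A smaller priority number means a higher priority, as in "P1 优先于 Q1".
FIRST_PRIORITY = {
    "two-finger-swipe-right": [(1, "P", "fast_forward"), (2, "Q", "zoom_in")],
    "two-finger-swipe-left":  [(1, "P", "rewind"),       (2, "Q", "zoom_out")],
}


def respond(gesture, running_apps):
    """Among running apps bound to `gesture`, execute the highest-priority command."""
    candidates = [c for c in FIRST_PRIORITY.get(gesture, []) if c[1] in running_apps]
    if not candidates:
        return None
    return min(candidates)[2]  # the smallest priority number wins


print(respond("two-finger-swipe-right", {"P", "Q"}))  # both run: P wins, fast_forward
print(respond("two-finger-swipe-right", {"Q"}))       # only Q runs: zoom_in
```

当高优先级应用未在运行时，同一手势自然落到次优先级的应用上，与正文"优先响应"的描述一致。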
实施例五
图5是本发明第五较佳实施例提出的应用程序交互方法的流程图。
基于上述实施例,通过交互控制接口响应触控手势,并执行与触控手势对应的交互控制命令还包括:
S25,根据第一优先级和第二优先级设置第三优先级。
S26,按第三优先级响应第一层级触控区的第一触控手势,同时,按第三优先级响应第二层级触控区的第二触控手势。
可以理解，当用户在使用播放器应用程序P观看视频的同时，还在使用拍照应用程序Q录制视频，当需要根据返回触控手势R2返回时，若该返回触控手势R2与应用程序P或者应用程序Q内的某一触控手势相同，则会引起控制命令的冲突。本实施例提出的解决方案是，根据第一优先级和第二优先级设置第三优先级，首先，判断第一优先级和第二优先级，在判断的结果范围内再根据设置的第三优先级进行触控手势的判断。
可以理解,在同一移动终端内,由于运行有不同的应用程序,第一优先级至少有一种,因此,可以根据不同的第一优先级(一个或者多个)与第二优先级共同设置上述的第三优先级。
本实施例的有益效果在于,通过第一优先级和第二优先级设置第三优先级,并按第三优先级响应第一层级触控区的第一触控手势,同时,按第三优先级响应第二层级触控区的第二触控手势。实现了在多应用程序、多交互界面的环境下,按预先设置的优先级,准确响应用户所需的控制命令。
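实施例五的第三优先级用于在应用程序级手势与相同的操作系统级手势之间裁决。下面是一个最小示意：假设合成后的第三优先级可以归结为两个层级之间的先后次序（此处各层级的手势绑定内容为虚构），并非专利实现本身。

```python
def respond_third_priority(gesture, app_bindings, os_bindings, os_first=True):
    """Resolve a gesture bound at both the application tier and the OS tier.

    `app_bindings` / `os_bindings` map gesture -> command; `os_first`
    encodes the combined (third) priority between the two tiers.
    """
    tiers = [os_bindings, app_bindings] if os_first else [app_bindings, os_bindings]
    for tier in tiers:
        if gesture in tier:
            return tier[gesture]
    return None


app = {"edge-swipe-down": "pause_video"}       # first-tier (application) binding
os_level = {"edge-swipe-down": "go_back"}      # second-tier (OS, e.g. R2) binding
print(respond_third_priority("edge-swipe-down", app, os_level))                  # go_back
print(respond_third_priority("edge-swipe-down", app, os_level, os_first=False))  # pause_video
```

通过切换 `os_first`，同一手势在冲突时可以按预设的第三优先级落到返回命令或应用内命令，对应正文"在判断的结果范围内再根据第三优先级进行判断"的思路。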
本领域普通技术人员可以理解，实现上述实施例方法中的全部或部分步骤是可以通过程序来控制相关的硬件完成，所述的程序可以存储于一计算机可读取存储介质中，所述的存储介质，如ROM/RAM、磁盘、光盘等。
实施例六
图6是本发明实施例提出的应用程序交互装置的结构框图。
本发明实施例还提出了一种应用程序交互控制装置,该装置包括:
设置模块10,设置成:根据应用程序的交互控制接口设置相应的触控区和触控手势;
控制模块20,设置成:通过交互控制接口响应触控手势,执行与触控手势对应的交互控制命令。
可选地,本装置还包括初始化模块30,初始化模块30包括应用程序识别单元31、第一交互控制接口获取单元32以及第二交互控制接口获取单元33,其中,
应用程序识别单元31设置成:在屏显范围内识别当前运行的应用程序;
第一交互控制接口获取单元32设置成:获取应用程序的运行界面,并根据运行界面获取第一交互控制接口;
第二交互控制接口获取单元33设置成:获取应用程序的运行环境,并根据运行环境获取第二交互控制接口。
可选地,设置模块10包括触控区划分单元11、触控区关联单元12以及触控手势设置单元13,其中,
触控区划分单元11设置成:将触控区划分为第一层级触控区和第二层级触控区;
触控区关联单元12设置成:关联第一层级触控区和第一交互控制接口,同时,关联第二层级触控区和第二交互控制接口;
触控手势设置单元13设置成:根据第一层级触控区设置应用程序级别的第一触控手势,同时,根据第二层级触控区设置操作系统级别的第二触控手势。
可选地,设置模块10还包括优先级设置单元14、触控手势获取单元15、触控手势划分单元16以及触控手势响应单元17,其中,
优先级设置单元14设置成:根据第一层级触控区设置第一优先级,同时,根据第二层级触控区设置第二优先级;
触控手势获取单元15设置成:在预设时间内通过触控区内获取触控手势;
触控手势划分单元16设置成:将触控手势按第一触控手势和第二触控手势进行划分;
触控手势响应单元17设置成:按第一优先级响应第一层级触控区的第一触控手势,同时,按第二优先级响应第二层级触控区的第二触控手势;
优先级设置单元14还设置成:根据第一优先级和第二优先级设置第三优先级;
触控手势响应单元17还设置成:按第三优先级响应第一层级触控区的第一触控手势,同时,按第三优先级响应第二层级触控区的第二触控手势。
需要说明的是,上述方法实施例中的技术特征在本装置均对应适用,这里不再重述。
本发明实施例还提出了一种应用程序交互控制终端,该终端包括上述应用程序交互控制装置。
本发明实施例还公开了一种计算机程序,包括程序指令,当该程序指令被终端执行时,使得该终端可执行上述任意的应用程序交互控制方法。
本发明实施例还公开了一种载有所述的计算机程序的载体。
本发明实施例，通过提取与交互控制接口对应的应用程序的控制命令，实现了以不同的交互控制接口根据相应的按键或者组合的按键等情形，发送相应的应用程序的控制命令。同时，通过划分应用程序的控制命令的类别、划分触控区以及划分优先级等处理操作，使得应用程序的交互控制更符合用户的使用习惯，增强了用户体验。
该应用程序交互控制装置可以用于移动电话,或者具有应用程序交互控制功能的其他通信终端,例如智能手机等中,可以是运行于这些通信终端内的软件单元,也可以作为独立的挂件集成到这些通信终端中或者运行于这些移动终端的应用系统中。
以上参照附图说明了本发明的优选实施例,并非因此局限本发明的权利范围。本领域技术人员不脱离本发明的范围和实质,可以有多种变型方案实现本发明,比如作为一个实施例的特征可用于另一实施例而得到又一实施例。凡在运用本发明的技术构思之内所作的任何修改、等同替换和改进,均应在本发明的权利范围之内。
工业实用性
本发明实施例，通过提取与交互控制接口对应的应用程序的控制命令，实现了以不同的交互控制接口根据相应的按键或者组合的按键等情形，发送相应的应用程序的控制命令。同时，通过划分应用程序的控制命令的类别、划分触控区以及划分优先级等处理操作，使得应用程序的交互控制更符合用户的使用习惯，增强了用户体验。因此本发明具有很强的工业实用性。

Claims (20)

  1. 一种应用程序交互控制方法,所述方法包括:
    在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势;
    通过所述交互控制接口响应所述触控手势,并执行与所述触控手势对应的交互控制命令。
  2. 根据权利要求1所述的应用程序交互控制方法,其中,所述在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势的步骤之前,该方法还包括:
    在屏显范围内识别当前运行的应用程序;
    获取所述应用程序的运行界面,并根据所述运行界面获取第一交互控制接口;
    获取所述应用程序的运行环境,并根据所述运行环境获取第二交互控制接口。
  3. 根据权利要求2所述的应用程序交互控制方法,其中,所述在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势的步骤包括:
    将所述触控区划分为第一层级触控区和第二层级触控区;
    关联所述第一层级触控区和第一交互控制接口,同时,关联所述第二层级触控区和第二交互控制接口;
    根据所述第一层级触控区设置应用程序级别的第一触控手势,同时,根据所述第二层级触控区设置操作系统级别的第二触控手势。
  4. 根据权利要求3所述的应用程序交互控制方法,其中,所述通过所述交互控制接口响应所述触控手势,并执行与所述触控手势对应的交互控制命令的步骤包括:
    根据所述第一层级触控区设置第一优先级,同时,根据所述第二层级触控区设置第二优先级;
    在预设时间内通过所述触控区内获取所述触控手势;
    将所述触控手势按第一触控手势和第二触控手势进行划分;
    按所述第一优先级响应所述第一层级触控区的第一触控手势,同时,按所述第二优先级响应所述第二层级触控区的第二触控手势。
  5. 根据权利要求4所述的应用程序交互控制方法,其中,所述通过所述交互控制接口响应所述触控手势,并执行与所述触控手势对应的交互控制命令的步骤还包括:
    根据所述第一优先级和所述第二优先级设置第三优先级;
    按所述第三优先级响应所述第一层级触控区的第一触控手势,同时,按所述第三优先级响应所述第二层级触控区的第二触控手势。
  6. 根据权利要求1所述的应用程序交互控制方法,其中,所述在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势的步骤包括:
    获取所述应用程序的交互控制接口,之后,再根据所述应用程序的交互控制接口设置相应的触控区和触控手势。
  7. 根据权利要求6所述的应用程序交互控制方法,其中,所述获取所述应用程序的交互控制接口的步骤之前,该方法还包括:
    提取与交互控制接口对应的应用程序控制命令,并划分应用程序的控制命令的类别,将同一类别的应用程序的控制命令所对应的触控区域按邻近方位设置。
  8. 根据权利要求6所述的应用程序交互控制方法,其中,所述根据所述应用程序的交互控制接口设置相应的触控区和触控手势的步骤包括:
    在预设时间段内记录所述应用程序的虚拟按键的触发数据,其中,该触发数据包括触发场景以及触发频率;
    通过分析所述触发数据以及移动终端的边框结构划分触控区域;
    建立触控手势数据库,在手势数据库内,根据所述触控区域的物理范围选取与所述物理范围相应的触控手势;
    关联所述触控区域与所选取的触控手势,使得同一触控手势在不同的触控区域指代的是不同的移动终端控制命令。
  9. 根据权利要求8所述的应用程序交互控制方法,其中,所述通过分析所述触发数据以及移动终端的边框结构划分触控区域的步骤包括:
    获取并分析多个应用程序的虚拟按键的触发场景以及触发频率;
    分析该移动终端的边框结构,根据用户选择或者根据所述触发场景和触发频率划分该移动终端的触控区域。
  10. 一种应用程序交互控制装置,所述装置包括设置模块和控制模块,其中:
    所述设置模块设置成:在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势;
    所述控制模块设置成:通过所述交互控制接口响应所述触控手势,执行与所述触控手势对应的交互控制命令。
  11. 根据权利要求10所述的应用程序交互控制装置,该装置还包括初始化模块,所述初始化模块包括应用程序识别单元、第一交互控制接口获取单元以及第二交互控制接口获取单元,其中,
    所述应用程序识别单元设置成:在屏显范围内识别当前运行的应用程序;
    所述第一交互控制接口获取单元设置成:获取所述应用程序的运行界面,并根据所述运行界面获取第一交互控制接口;
    所述第二交互控制接口获取单元设置成:获取所述应用程序的运行环境,并根据所述运行环境获取第二交互控制接口。
  12. 根据权利要求11所述的应用程序交互控制装置,其中,所述设置模块包括触控区划分单元、触控区关联单元以及触控手势设置单元,其中,
    所述触控区划分单元设置成:将所述触控区划分为第一层级触控区和第二层级触控区;
    所述触控区关联单元设置成:关联所述第一层级触控区和第一交互控制接口,同时,关联所述第二层级触控区和第二交互控制接口;
    所述触控手势设置单元设置成:根据所述第一层级触控区设置应用程序级别的第一触控手势,同时,根据所述第二层级触控区设置操作系统级别的第二触控手势。
  13. 根据权利要求12所述的应用程序交互控制装置,其中,所述设置模块还包括优先级设置单元、触控手势获取单元、触控手势划分单元以及触控手势响应单元,其中,
    所述优先级设置单元设置成:根据所述第一层级触控区设置第一优先级,同时,根据所述第二层级触控区设置第二优先级;
    所述触控手势获取单元设置成:在预设时间内通过所述触控区内获取所述触控手势;
    所述触控手势划分单元设置成:将所述触控手势按第一触控手势和第二触控手势进行划分;
    所述触控手势响应单元设置成:按所述第一优先级响应所述第一层级触控区的第一触控手势,同时,按所述第二优先级响应所述第二层级触控区的第二触控手势;
    所述优先级设置单元还设置成:根据所述第一优先级和所述第二优先级设置第三优先级;
    所述触控手势响应单元还设置成:按所述第三优先级响应所述第一层级触控区的第一触控手势,同时,按所述第三优先级响应所述第二层级触控区的第二触控手势。
  14. 根据权利要求10所述的应用程序交互控制装置,其中,所述设置模块设置成按照如下方式在移动终端边框的触控区获取与应用程序的交互控制接口对应的触控手势:
    获取所述应用程序的交互控制接口,之后,再根据所述应用程序的交互控制接口设置相应的触控区和触控手势。
  15. 根据权利要求14所述的应用程序交互控制装置,其中,所述设置模块还设置成:
    提取与交互控制接口对应的应用程序控制命令,并划分应用程序的控制命令的类别,将同一类别的应用程序的控制命令所对应的触控区域按邻近方位设置。
  16. 根据权利要求14所述的应用程序交互控制装置,其中,所述设置模块设置成按照如下方式根据所述应用程序的交互控制接口设置相应的触控区和触控手势:
    在预设时间段内记录所述应用程序的虚拟按键的触发数据,其中,该触发数据包括触发场景以及触发频率;
    通过分析所述触发数据以及移动终端的边框结构划分触控区域;
    建立触控手势数据库,在手势数据库内,根据所述触控区域的物理范围选取与所述物理范围相应的触控手势;
    关联所述触控区域与所选取的触控手势,使得同一触控手势在不同的触控区域指代的是不同的移动终端控制命令。
  17. 根据权利要求16所述的应用程序交互控制装置,其中,所述设置模块设置成按照如下方式通过分析所述触发数据以及移动终端的边框结构划分触控区域:
    获取并分析多个应用程序的虚拟按键的触发场景以及触发频率;
    分析该移动终端的边框结构,根据用户选择或者根据所述触发场景和触发频率划分该移动终端的触控区域。
  18. 一种应用程序交互控制终端,所述终端包括所述权利要求10-17中任一项所述的应用程序交互控制装置。
  19. 一种计算机程序,包括程序指令,当该程序指令被终端执行时,使得该终端可执行如权利要求1-9中任一项所述的应用程序交互控制方法。
  20. 一种载有如权利要求19所述的计算机程序的载体。
PCT/CN2015/090288 2014-09-22 2015-09-22 一种应用程序交互控制方法、装置及终端 WO2016045579A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410487312.X 2014-09-22
CN201410487312.XA CN104267902B (zh) 2014-09-22 2014-09-22 一种应用程序交互控制方法、装置及终端

Publications (1)

Publication Number Publication Date
WO2016045579A1 true WO2016045579A1 (zh) 2016-03-31

Family

ID=52159429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/090288 WO2016045579A1 (zh) 2014-09-22 2015-09-22 一种应用程序交互控制方法、装置及终端

Country Status (2)

Country Link
CN (1) CN104267902B (zh)
WO (1) WO2016045579A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110825217A (zh) * 2018-08-13 2020-02-21 珠海格力电器股份有限公司 家电控制方法及装置
CN112749046A (zh) * 2019-10-31 2021-05-04 比亚迪股份有限公司 Mss系统模拟数据构造方法、装置、设备及存储介质

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013169846A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying additional information in response to a user contact
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
WO2013169849A2 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for displaying user interface objects corresponding to an application
AU2013259613B2 (en) 2012-05-09 2016-07-21 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
CN109298789B (zh) 2012-05-09 2021-12-31 苹果公司 用于针对激活状态提供反馈的设备、方法和图形用户界面
CN104267902B (zh) * 2014-09-22 2017-03-08 努比亚技术有限公司 一种应用程序交互控制方法、装置及终端
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US10346030B2 (en) * 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
CN107025033A (zh) * 2016-02-01 2017-08-08 百度在线网络技术(北京)有限公司 一种调节屏幕亮度的方法和装置
CN106339173A (zh) * 2016-08-31 2017-01-18 新诺商桥科技(北京)有限公司 一种智慧桌面系统
CN107562262B (zh) * 2017-08-14 2020-06-19 维沃移动通信有限公司 一种响应触控操作的方法、终端及计算机可读存储介质
CN107562346A (zh) * 2017-09-06 2018-01-09 广东欧珀移动通信有限公司 终端控制方法、装置、终端及计算机可读存储介质
CN107729131A (zh) * 2017-09-25 2018-02-23 努比亚技术有限公司 一种事件处理方法、终端及计算机可读存储介质
WO2019061512A1 (zh) 2017-09-30 2019-04-04 华为技术有限公司 一种任务切换方法及终端
CN108958071B (zh) * 2018-06-07 2019-05-07 中兴高能技术有限责任公司 极片辊压机控制方法、装置及计算机可读存储介质
CN109144392B (zh) * 2018-08-22 2021-04-16 北京奇艺世纪科技有限公司 一种处理手势冲突的方法、装置及电子设备
CN109697012A (zh) * 2018-12-25 2019-04-30 华勤通讯技术有限公司 智能手表的控制方法、智能手表和存储介质
CN109933199B (zh) * 2019-03-13 2022-05-24 阿波罗智联(北京)科技有限公司 基于手势的控制方法、装置、电子设备及存储介质
CN110162238A (zh) * 2019-05-23 2019-08-23 努比亚技术有限公司 一种快捷调用关联应用方法及装置、移动终端及存储介质
CN112068743A (zh) * 2020-08-26 2020-12-11 深圳传音控股股份有限公司 交互方法、终端及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050012723A1 (en) * 2003-07-14 2005-01-20 Move Mobile Systems, Inc. System and method for a portable multimedia client
CN102122229A (zh) * 2010-02-19 2011-07-13 微软公司 使用边框作为输入机制
CN102253709A (zh) * 2010-05-19 2011-11-23 禾瑞亚科技股份有限公司 手势判断的方法与装置
CN103019554A (zh) * 2011-09-20 2013-04-03 联想(北京)有限公司 命令识别方法及使用该方法的电子设备
CN103605465A (zh) * 2013-12-06 2014-02-26 上海艾为电子技术有限公司 一种控制手持设备的方法及手持设备
CN103870171A (zh) * 2012-12-07 2014-06-18 联想(北京)有限公司 数据处理方法和装置
CN104267902A (zh) * 2014-09-22 2015-01-07 深圳市中兴移动通信有限公司 一种应用程序交互控制方法、装置及终端

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102799340A (zh) * 2011-05-26 2012-11-28 上海三旗通信科技股份有限公司 一种切换多应用到当前窗口进而激活的操作手势
CN102436347A (zh) * 2011-11-10 2012-05-02 盛乐信息技术(上海)有限公司 一种应用程序的切换方法及一种触摸屏设备
CN103324420B (zh) * 2012-03-19 2016-12-28 联想(北京)有限公司 一种多点触控板输入操作识别方法及电子设备
CN103513899A (zh) * 2012-06-21 2014-01-15 北京睿思汇通移动科技有限公司 一种移动终端浏览器的分割屏幕方法及操控浏览器的方法
CN103853481B (zh) * 2012-12-06 2021-03-12 腾讯科技(深圳)有限公司 模拟触屏移动终端按键的方法和系统
WO2014113923A1 (zh) * 2013-01-22 2014-07-31 华为终端有限公司 基于触摸屏的物理按键模拟方法及装置
CN103197885B (zh) * 2013-03-04 2018-05-15 东莞宇龙通信科技有限公司 移动终端的操控方法及其移动终端
CN103347108A (zh) * 2013-07-05 2013-10-09 中科创达软件股份有限公司 一种侧面安装可编程快捷触控板的手机及实现方法
CN103941919A (zh) * 2014-04-23 2014-07-23 宁波保税区攀峒信息科技有限公司 —种触摸事件识别模式


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110825217A (zh) * 2018-08-13 2020-02-21 珠海格力电器股份有限公司 家电控制方法及装置
CN110825217B (zh) * 2018-08-13 2023-07-11 珠海格力电器股份有限公司 家电控制方法及装置
CN112749046A (zh) * 2019-10-31 2021-05-04 比亚迪股份有限公司 Mss系统模拟数据构造方法、装置、设备及存储介质
CN112749046B (zh) * 2019-10-31 2023-08-11 比亚迪股份有限公司 Mss系统模拟数据构造方法、装置、设备及存储介质

Also Published As

Publication number Publication date
CN104267902A (zh) 2015-01-07
CN104267902B (zh) 2017-03-08

Similar Documents

Publication Publication Date Title
WO2016045579A1 (zh) 一种应用程序交互控制方法、装置及终端
CN105760102B (zh) 终端交互控制方法、装置及应用程序交互控制方法
US11740694B2 (en) Managing and mapping multi-sided touch
US10416789B2 (en) Automatic selection of a wireless connectivity protocol for an input device
CN105814522B (zh) 基于运动识别来显示虚拟输入设备的用户界面的设备和方法
US20110126094A1 (en) Method of modifying commands on a touch screen user interface
CN112118380B (zh) 相机操控方法、装置、设备及存储介质
US9571739B2 (en) Camera timer
WO2016041469A1 (zh) 触控方法、触控装置、触控终端及计算机可读存储介质
US10474324B2 (en) Uninterruptable overlay on a display
WO2019206243A1 (zh) 一种素材展示方法、终端和计算机存储介质
US20180267624A1 (en) Systems and methods for spotlight effect manipulation
JP2015537266A (ja) インターフェース制御方法及び制御装置
WO2017206383A1 (zh) 一种终端控制方法、控制装置以及终端
CN108616775A (zh) 视频播放时智能截图的方法、装置、存储介质及智能终端
CN112114653A (zh) 终端设备的操控方法、装置、设备及存储介质
WO2016145827A1 (zh) 终端的控制方法及装置
WO2015131813A1 (zh) 一种进行设备操作的方法和系统
CN108664891A (zh) 一种基于指纹识别的拍摄方法及终端
CN107024998A (zh) 一种输入方法、装置和用于输入的装置
CN108924406A (zh) 拍照方法、装置、可读存储介质及智能终端
CN109871131A (zh) 一种字符串拆分的方法及装置
CN109814764A (zh) 设备控制方法及装置、电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15845401

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 300817)

122 Ep: pct application non-entry in european phase

Ref document number: 15845401

Country of ref document: EP

Kind code of ref document: A1