WO2014063643A1 - Touch page control method and system - Google Patents

Touch page control method and system

Info

Publication number
WO2014063643A1
WO2014063643A1 (PCT/CN2013/085881)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
control
area
page
controls
Prior art date
Application number
PCT/CN2013/085881
Other languages
English (en)
French (fr)
Inventor
邹迪飞
黄�俊
罗谚君
林声炜
钟于胜
朱德亮
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority to US14/437,201 (US9772768B2)
Publication of WO2014063643A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G06F3/04842Selection of displayed objects or displayed text elements
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present invention relates to a touch control technology for a terminal, and more particularly to a touch page control method and system. Background
  • current touch-type electronic devices, such as touch-screen mobile phones and tablet computers, mainly implement specific operations through agreed touch actions, for example turning pages, advancing, going back, and enlarging pages through specific gestures; however, first-time users of such devices need long exploration or advice from others to learn these actions, which increases the user's learning cost and brings a poor user experience.
  • some software operation pages and web pages, such as software download and song playback pages, contain operation buttons; to perform an operation such as playing a video, downloading software, or playing a song, the corresponding button must be activated by touch. Because the display screens of current touch-type electronic devices are small, these buttons are relatively small, and it is easy to accidentally touch a nearby button, causing an erroneous operation.
  • a touch page control method, comprising: receiving a page open request; judging the page to be opened and determining whether a control exists in the page; if the control exists in the page, counting the number of the controls and determining their locations; creating gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and displaying in each gesture area the gesture required to activate the corresponding control.
  • a touch page control system, comprising: a request receiving unit, configured to receive a page open request; a determining unit, configured to judge the page to be opened and determine whether a control exists in the page; a control information determining unit, configured to count the number of the controls and determine their locations when the controls exist in the page; a gesture area creation unit, configured to create gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and a display unit, configured to display in each gesture area the gesture required to activate the corresponding control.
  • a method for creating a graphical user interface for a touch device, comprising: creating an operable control; creating a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and creating a gesture operation area for receiving a gesture of the user.
  • a touch device comprising a memory and a processor, wherein the memory stores executable program code, the executable program code being operable to: when executed by the processor, create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
  • a non-transitory computer program product comprising executable program code for a touch device, the executable program code operable to: when executed, create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
  • the touch page control method and system of the present invention display gesture areas corresponding to the controls on the page and display in each gesture area the gesture required to activate the corresponding control, so that pages operated with the method and system require no learning of various operation gestures, reducing the user's learning cost.
  • the gestures displayed in the gesture areas allow the controls in the page to be activated by different gestures, which reduces the chance of misoperation compared with activating controls by touch.
  • FIG. 1 is a flowchart of a method for controlling a touch page according to the present invention.
  • FIG. 2 is a schematic diagram of a page after the gesture area is created by using the method shown in FIG. 1.
  • Figure 3 is a schematic illustration of the gesture displayed by gesture area A in Figure 2.
  • FIG. 4 is a schematic diagram of another page after the gesture area is created by using the method shown in FIG. 1.
  • FIG. 5 is a schematic diagram showing the gesture displayed by the gesture area in FIG. 4.
  • FIG. 6 is a block diagram of a touch page control system according to the present invention.
  • FIG. 7 is a flow chart of a method for creating a graphical user interface for a touch device according to the present invention.
  • FIG. 8 is a flow chart of another method for creating a graphical user interface for a touch device according to the present disclosure.
  • FIG. 9 is a schematic diagram of a graphical user interface created by the method provided by the present invention.
  • FIG. 10 is a schematic diagram of another graphical user interface created by the method provided by the present invention. Detailed Description
  • the touch device in the present invention mainly includes a touch screen mobile phone, a tablet computer, etc., but is not limited thereto.
  • the page with controls in the present invention refers to a page containing operable controls, such as the operation pages of Xunlei Kankan (Thunder) and PPTV, as well as software download and song playback pages, but is not limited thereto.
  • the operation pages of Xunlei Kankan and PPTV contain controls activated by operation, for example buttons activated by clicking; these may be viewing-category selection controls, such as movie, TV, and animation, or playback operation controls, such as watch now and download.
  • the controls contained in software download and song playback pages may include controls such as download and play.
  • FIG. 1 is a flowchart of a touch page control method according to an embodiment of the present invention. As shown in Figure 1, the method of the present invention comprises the following steps:
  • Step S101: receive a page open request;
  • Step S102: judge the page to be opened and determine whether a control activated by operation exists in the page;
  • Step S103: if such controls exist in the page, count the number of controls and determine their locations; Step S104: create gesture areas according to the number of controls and their locations, the number and positions of the gesture areas corresponding to those of the controls;
  • Step S105: display in each gesture area the gesture required to activate the corresponding control.
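The flow of steps S101-S105 can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Control`, `GestureArea` types and the `gesture_for` callback are hypothetical names introduced here for clarity.

```python
# Sketch of steps S101-S105: inspect a page's controls and build one gesture
# area per control. Control/GestureArea/gesture_for are assumed names.
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    position: tuple  # (x, y) position of the control on the page

@dataclass
class GestureArea:
    control: Control
    position: tuple
    gesture: str  # identifier of the gesture shown to the user

def open_page(controls, gesture_for):
    """S102: judge whether controls exist; S103/S104: count them and create
    matching gesture areas; S105: each area carries the gesture to display."""
    if not controls:  # no operable controls on the page -> no gesture areas
        return []
    return [GestureArea(c, c.position, gesture_for(c.name)) for c in controls]

page = [Control("movie", (0, 0)), Control("tv", (0, 1)), Control("anime", (0, 2))]
areas = open_page(page, gesture_for=lambda name: f"gesture-{name}")
print(len(areas))  # one gesture area per control -> 3
```

The number of gesture areas tracks the number of controls, matching the claim that "the number of the gesture area corresponds to the number of controls".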
  • Figure 2 is a schematic diagram of a page after creating a gesture area using this method.
  • the page includes three areas: the control area, the column content area, and the gesture area.
  • the control area is located on the left side of the page and includes three controls; taking a Xunlei Kankan or PPTV page as an example, the three controls may be movie, TV, and animation controls.
  • the content area of the column is located on the right side of the page, and is used to display the content of the corresponding column of each control.
  • by default, the column content area displays the content corresponding to the first control of the control area, that is, the movie control.
  • the content may include information such as the name, picture, and the like of the plurality of movies.
  • when the TV or animation control is activated, the column content area is adjusted accordingly to the content corresponding to that control.
  • the gesture area is located in the upper right corner of the content area of the column and is used to display the gestures required to activate the corresponding control.
  • the gesture area includes three gesture areas A, B, and C.
  • the positions of the three gesture areas correspond to the positions of the three controls in the control area, and the areas of the three gesture areas are respectively larger than the operable area of the corresponding control.
  • Figure 3 is a schematic illustration of the gesture displayed by gesture area A in Figure 2.
  • the gesture for activating gesture area A in FIG. 2 starts from the left end of area A, slides horizontally or approximately horizontally to the right end of the area, and then continues to slide vertically downward for a distance; the vertical sliding distance is not less than one quarter of the horizontal sliding distance.
  • the gestures displayed in the gesture regions may be the same or different. The preferred manner is that the gestures of the gesture regions are different, thereby avoiding erroneous operations caused by the same gestures.
  • the position of the gesture area is not limited thereto; its position may be adjusted according to the position of the control, or the gesture area may be set on a blank portion of the page or a portion with less text, in which case a step of judging the distribution of text on the page needs to be added.
  • the distribution of the content (including text and/or picture, etc.) in the page is determined, and the gesture operation area is displayed in an area where the content is less than a predetermined value.
  • the predetermined value may relate to an area ratio of content in the area, such as a gesture area displayed when the area ratio of the content is less than 20%.
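The placement rule above can be sketched numerically. The 20% threshold comes from the text; the rectangle representation `(x, y, w, h)` and the function names are assumptions made for this illustration.

```python
# Sketch: place the gesture area in a candidate region whose content coverage
# (text/picture rectangles) is below a threshold. Rectangles are (x, y, w, h).
def coverage_ratio(region, contents):
    rx, ry, rw, rh = region
    covered = 0
    for (x, y, w, h) in contents:
        ox = max(0, min(rx + rw, x + w) - max(rx, x))  # horizontal overlap
        oy = max(0, min(ry + rh, y + h) - max(ry, y))  # vertical overlap
        covered += ox * oy
    return covered / (rw * rh)

def pick_region(candidates, contents, threshold=0.20):
    """Return the first region whose content covers less than `threshold`."""
    for region in candidates:
        if coverage_ratio(region, contents) < threshold:
            return region
    return None

regions = [(0, 0, 100, 100), (100, 0, 100, 100)]
texts = [(10, 10, 80, 50)]           # covers 40% of the first region
print(pick_region(regions, texts))   # -> (100, 0, 100, 100)
```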
  • the method may further include: presetting a gesture required to activate a particular kind of control.
  • the step of determining the number and location of the controls further comprises determining the type of each control; the method further comprises extracting the required gesture from the preset gestures according to the type of the control; and in the step of displaying the gesture required to activate the corresponding control in the gesture area, the displayed gesture is the gesture extracted from the preset gestures.
  • the method may further include: drawing a corresponding trajectory in the gesture area according to the displayed gesture; determining whether the trajectory matches the displayed gesture; and if it matches, activating the control corresponding to the gesture area.
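One way to realize the match-then-activate step is to compare the drawn trajectory's stroke directions against a template. This representation (direction sequences like `["R", "D"]`) is an assumption for illustration, not the patent's matching algorithm.

```python
# Illustrative matcher: reduce a trajectory to its sequence of dominant stroke
# directions (R/L/D/U) and compare with the displayed gesture's template.
def directions(points):
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        d = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) else \
            ("D" if dy > 0 else "U")
        if not out or out[-1] != d:   # collapse repeated directions
            out.append(d)
    return out

def matches(trajectory, template):
    return directions(trajectory) == template

# The gesture of FIG. 3: slide right, then down (screen y grows downward)
traj = [(0, 0), (30, 1), (60, 0), (61, 20)]
print(matches(traj, ["R", "D"]))   # True
```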
  • the method may further include: correcting the drawn trajectory after the corresponding trajectory is drawn in the gesture area according to the displayed gesture; correspondingly, in the step of determining whether the trajectory matches the displayed gesture, the judgment is made according to the correction result.
  • the correction range includes correction of the straightness, the length, and the swept area of the drawn track.
  • the correction is used to eliminate the deviation between the actual operating gesture and the ideal value. For example, there may be a curvature when the line is actually drawn, and the correction may be used to determine a curve having a radius of curvature greater than a predetermined value as a straight line.
  • the straightness includes the straightness of each segment of the trajectory; for example, in FIG. 3, it includes the straightness of the transverse segment and of the longitudinal segment of the trajectory.
  • the length includes the length of each segment of the trajectory.
  • the length includes the length of the transverse segment of the trajectory and the length of the longitudinal segment.
  • the swept area refers to the area of the smallest rectangle enclosing the trajectory.
  • correction of the swept area means that, within a certain error range, a trajectory that exceeds the range of the corresponding gesture area is still considered not to exceed it. For example, when the gesture shown in FIG. 3 is drawn, if the longitudinal slide exceeds gesture area A within the range allowed by the error, the gesture is still considered not to exceed gesture area A.
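The three corrections (straightness, length, swept area) amount to tolerating small deviations before matching. The tolerance values below are illustrative assumptions; the patent does not fix them.

```python
# Sketch of trajectory correction: a slightly bowed stroke counts as straight,
# and a small overshoot of the gesture area still counts as inside it.
def bounding_box(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

def is_roughly_horizontal(points, slope_tol=0.25):
    """Straightness correction: treat a low-slope stroke as a horizontal line."""
    x0, y0 = points[0]
    x1, y1 = points[-1]
    dx = abs(x1 - x0) or 1
    return abs(y1 - y0) / dx <= slope_tol

def stays_inside(points, area, margin=8):
    """Swept-area correction: overshoot within `margin` pixels is tolerated.
    `area` is (x, y, w, h) of the gesture area."""
    ax, ay, aw, ah = area
    minx, miny, maxx, maxy = bounding_box(points)
    return (minx >= ax - margin and miny >= ay - margin
            and maxx <= ax + aw + margin and maxy <= ay + ah + margin)

stroke = [(0, 0), (20, 2), (40, 1), (60, 3)]   # slightly bowed rightward stroke
print(is_roughly_horizontal(stroke))            # True
print(stays_inside(stroke, (0, 0, 55, 10)))     # True: 5 px overshoot < margin
```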
  • the touch page control method of the present invention displays gesture areas corresponding to the controls on the page and displays in each gesture area the gesture required to activate the corresponding control, so that pages operated with the method require no learning of various operation gestures, reducing the user's learning cost. Moreover, the gestures displayed in the gesture areas allow the controls in the page to be activated by different gestures, which reduces the probability of misoperation compared with activating controls by touch. Further, since the area of each gesture area in the present invention is larger than the operable area of the corresponding control, the probability of erroneous operation is further reduced.
  • FIG. 6 is a block diagram of a touch page control system according to the present invention.
  • the touch page control system of the present invention includes a request receiving unit 201, a determining unit 202, a control information determining unit 203, a gesture area creating unit 204, and a display unit 205.
  • the request receiving unit 201 is configured to receive an open request of the page
  • the determining unit 202 is configured to judge the page to be opened and determine whether an operable control exists in the page;
  • the gesture area creating unit 204 is configured to create a gesture area according to the number of controls and the location thereof, and the number and position of the gesture area correspond to the number and position of the control; the area of the gesture area is larger than the operable area of the corresponding control.
  • the display unit 205 is for displaying a gesture required to activate the corresponding control in the gesture area.
  • the system further includes a gesture setting unit 206 for presetting a gesture required to activate a particular type of control.
  • control information determining unit 203 is also used to determine the type of control.
  • the system further includes a gesture extraction unit 207, configured to extract the required gesture from the gestures preset by the gesture setting unit 206 according to the type of the control.
  • the determining unit 202 is further configured to, after a corresponding activation gesture is performed in the gesture area according to the displayed gesture, determine whether the gesture acting on the gesture area matches the displayed gesture, and if they match, activate the corresponding control.
  • the system further includes a correction unit 208 for correcting the gesture track acting on the gesture area.
  • the determination unit 202 makes its determination after the correction unit 208 has performed the correction.
  • the range of corrections includes corrections for the straightness, length, and swept area of the trajectory.
  • the touch page control system of the present invention displays gesture areas corresponding to the controls on the page and displays in each gesture area the gesture required to activate the corresponding control, so that pages operated with the system require no learning of various operation gestures, reducing the user's learning cost. Moreover, the gestures displayed in the gesture areas allow the controls in the page to be activated by different gestures, which reduces the probability of misoperation compared with activating controls by touch. Further, since the area of each gesture area in the present invention is larger than the operable area of the corresponding control, the probability of erroneous operation is further reduced.
  • a method of creating a graphical user interface for a touch device includes the following steps:
  • Step S701 creating an operable control.
  • the creation method can be applied to various applications including controls, and the operation corresponding to the control can be activated by operating the control.
  • Step S702 Create a gesture identification area corresponding to the control, and the gesture identification area displays a gesture that can activate the same operation as the corresponding control.
  • the gesture identification area and the control can correspond in various ways.
  • the positions of the controls and the gesture identification areas can be arranged in the same manner; see FIGS. 9 and 10, where control A at the topmost position corresponds to gesture operation area A, control C at the bottommost position corresponds to gesture operation area C, and control B at the intermediate position corresponds to gesture operation area B.
  • the number of controls is not limited to the number shown in Figures 9-10, and may be less than three or more than three.
  • control and the corresponding gesture operation area may correspond to each other by different colors, for example, the control A and the gesture operation area A are marked in red; the control B and the gesture operation area B are marked as yellow; the control C and The gesture operation area C is marked in blue.
  • other ways may be used to prompt the user which control corresponds to which gesture operating area.
  • the corresponding gesture operation area displays the gestures required to activate the corresponding control.
  • the gestures required for the activation operation will be described in detail later in connection with the two embodiments provided by the present invention.
  • Step S703 Create a gesture operation area for receiving a gesture of the user.
  • the gesture operation area may be in one-to-one correspondence with the gesture identification area, and then correspond to the control one by one, that is, creating a gesture operation area corresponding to the number of controls, and may be used for the gesture operation area by using the corresponding manner described above. Correspondence with the control, for example, by a positional relationship or a color, etc., to indicate the correspondence.
  • the gesture operation areas may respectively receive gestures of the corresponding gesture identification areas to activate the operations that the corresponding controls can activate; in this case, different gestures are displayed in the different gesture identification areas.
  • the gesture identification areas are located in the corresponding gesture operation areas, so that the correspondence between identification areas and controls is conveyed by position: gesture identification area A is located in gesture operation area A, gesture identification area B in gesture operation area B, and gesture identification area C in gesture operation area C.
  • the gesture identification area is located in the gesture operation area, and the gesture identification area coincides with the gesture operation area.
  • the area of each of the gesture operating zones is greater than the operable area of the corresponding control.
  • the gesture operation area has an operable area larger than the control to facilitate the user to perform gesture operations in the gesture operation area.
  • in another embodiment, there is a single gesture operation area.
  • the gesture operating area can be located anywhere in the interface.
  • the user's gesture is any of the gestures that can activate the same operation as the corresponding control. That is to say, various gestures for activating the operations corresponding to the respective controls can be received within the gesture operation area.
  • the operation corresponding to control A can be activated in the gesture operation area, and the operations corresponding to controls B and C can also be received in the same gesture operation area; in this way, the area of the gesture operation area is enlarged, which makes it easier for the user to perform the operation.
  • each of the controls corresponds to a different gesture, so different gesture identifiers are displayed in gesture identification areas A through C.
  • the method further includes the following steps, as shown in FIG. 8:
  • Step 801 is executed to create a gesture database in advance, and the gesture database includes a plurality of standby gestures that can activate the same operation as the corresponding control.
  • gestures can be created that activate the operations corresponding to the various controls that may occur within different interfaces.
  • Step 802 is performed: the standby gesture corresponding to the control is extracted from the gesture database according to the control, and is used to create the gesture identification area corresponding to the control.
  • the corresponding standby gesture is found from the gesture database, and is used as a gesture for activating the operation corresponding to the control.
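Steps 801-802 can be sketched with a lookup table keyed by control type. The table contents and gesture names below are invented placeholders; the patent does not specify a storage format.

```python
# Sketch of steps 801-802: a pre-built gesture database (801) and extraction
# of a standby gesture for a concrete control (802). Entries are placeholders.
GESTURE_DB = {
    "download": ["down-arrow", "v-shape"],
    "play":     ["right-arrow", "circle"],
}

def extract_gesture(control_type, assigned=None):
    """Pick the first standby gesture for this control type not already in use,
    so controls on the same page end up with distinct gestures."""
    assigned = assigned or set()
    for g in GESTURE_DB.get(control_type, []):
        if g not in assigned:
            return g
    return None  # no standby gesture available for this control type

first = extract_gesture("download")
second = extract_gesture("download", {first})
print(first, second)   # down-arrow v-shape
```

Tracking already-assigned gestures mirrors the preference, stated earlier, that gesture areas on one page display different gestures to avoid misoperation.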
  • steps 701-703 included in FIG. 8 are the same as or similar to steps 701-703 shown in FIG. 7, and therefore are not described in detail again here.
  • the graphical user interface may further include a column content area, and the column content area occupies an area other than the control area.
  • the gesture operation area and/or the gesture identification area may be located in a blank area in the content area of the column or in an area with less content.
  • the gesture operating area and/or the gesture identification area may be distinguished from the column content.
  • the column content area may coincide with the gesture operation area.
  • the predetermined content is displayed when the operation corresponding to the control is not activated.
  • the predetermined content may be set by the user or set as a default, and the predetermined content may be, for example, the content of the column corresponding to the control A.
  • the content corresponding to the activated operation is displayed; for example, when the operation corresponding to control B is activated, the column content corresponding to control B can be displayed.
  • a touch device comprising a memory and a processor, wherein the memory stores executable program code, the executable program code being operable to: when executed by the processor, create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture that can activate the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
  • a non-transitory computer program product comprising executable program code for a touch device.
  • the non-transitory computer program product includes a non-transitory digital data storage medium such as a magnetic or optical disk, a random access memory (RAM), a magnetic hard disk, a flash memory, and/or a read only memory (ROM).
  • the executable program code is operative to: when executed, create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
  • the executable program code is operable to: when executed, receive a page open request; determine a page to be opened, determine whether a control exists in the page; if the control exists in the page , counting the number of the controls and determining the location thereof; creating a gesture area according to the number of controls, the number of the gesture areas corresponding to the number of controls; displaying the gestures required to activate the corresponding controls in the gesture area.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a touch page control method, comprising: receiving a page open request; judging the page to be opened and determining whether a control exists in the page; if the control exists in the page, counting the number of the controls and determining their locations; creating gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and displaying in each gesture area the gesture required to activate the corresponding control. The touch page control method of the present invention displays gesture areas corresponding to the controls on the page and displays in each gesture area the gesture required to activate the corresponding control, so that pages managed by the method can be operated without learning various operation gestures, reducing the user's learning cost. The present invention further provides a touch page control system.

Description

Touch page control method and system

Related Application
This application claims priority to Chinese Patent Application No. 201210409083.0, filed with the Chinese Patent Office on October 24, 2012 and entitled "Touch device page control method and system", the entire contents of which are incorporated herein by reference.

Technical Field
The present invention relates to terminal touch control technology, and in particular to a touch page control method and system.

Background
Current touch-type electronic devices, such as touch-screen mobile phones and tablet computers, mainly implement specific operations through agreed touch actions, for example turning pages, advancing, going back, and enlarging pages through specific gestures. For first-time users of touch-type electronic devices, however, these actions can only be learned through long exploration or by asking others, which increases the user's learning cost and brings a poor user experience.
In addition, the operation pages of some software, as well as some web pages such as software download and song playback pages, contain operation buttons. To perform an operation such as playing a video, downloading software, or playing a song, the button must be activated by touch. However, because the display screens of current touch-type electronic devices are small, these buttons are also relatively small, and it is easy to accidentally touch a nearby button when touching the intended one, causing an erroneous operation.

Summary
In view of this, it is necessary to provide a touch page control method and system that are less prone to misoperation. The touch page control method and system are implemented through the following technical solutions. A touch page control method comprises: receiving a page open request; judging the page to be opened and determining whether a control exists in the page; if the control exists in the page, counting the number of the controls and determining their locations; creating gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and displaying in each gesture area the gesture required to activate the corresponding control. A touch page control system comprises: a request receiving unit, configured to receive a page open request; a determining unit, configured to judge the page to be opened and determine whether a control exists in the page; a control information determining unit, configured to count the number of the controls and determine their locations when the controls exist in the page; a gesture area creation unit, configured to create gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and a display unit, configured to display in each gesture area the gesture required to activate the corresponding control.
A method for creating a graphical user interface for a touch device comprises: creating an operable control; creating a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and creating a gesture operation area for receiving a gesture of the user.
A touch device comprises a memory and a processor, wherein the memory stores executable program code, the executable program code being operable to: when executed by the processor, create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
A non-transitory computer program product comprises executable program code for a touch device, the executable program code operable to: when executed, create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
The touch page control method and system of the present invention display gesture areas corresponding to the controls on the page and display in each gesture area the gesture required to activate the corresponding control, so that pages operated with the method and system require no learning of various operation gestures, reducing the user's learning cost. Moreover, the gestures displayed in the gesture areas allow the controls in the page to be activated by different gestures, which reduces the probability of misoperation compared with activating controls by touch.
To make the above and other objects, features, and advantages of the present invention clearer, preferred embodiments are described in detail below with reference to the accompanying drawings.

Brief Description of the Drawings
Fig. 1 is a flowchart of a touch page control method disclosed by the present invention.
Fig. 2 is a schematic diagram of a page after gesture areas have been created with the method of Fig. 1. Fig. 3 is a schematic diagram of the gesture displayed in gesture area A of Fig. 2.
Fig. 4 is a schematic diagram of another page after gesture areas have been created with the method of Fig. 1. Fig. 5 is a schematic diagram of the gesture displayed in the gesture area of Fig. 4.
Fig. 6 is a block diagram of a touch page control system disclosed by the present invention.
Fig. 7 is a flowchart of a method for creating a graphical user interface for a touch device disclosed by the present invention.
Fig. 8 is a flowchart of another method for creating a graphical user interface for a touch device disclosed by the present invention.
Fig. 9 is a schematic diagram of a graphical user interface created by a method provided by the present invention.
Fig. 10 is a schematic diagram of another graphical user interface created by a method provided by the present invention.

Detailed Description
To further explain the technical means adopted by the present invention to achieve its intended objects and their effects, the specific implementations, structures, features, and effects of the touch page control method and system proposed by the invention are described in detail below with reference to the accompanying drawings and preferred embodiments.
It should first be noted that the touch devices in the present invention mainly include, but are not limited to, touch-screen mobile phones and tablet computers. A page with controls refers to a page containing operable controls, for example the operation pages of Xunlei Kankan or PPTV, as well as software-download or song-playback pages, but is not limited thereto. The controls activated by operation in the pages of Xunlei Kankan or PPTV, for example buttons activated by tapping, may be viewing-category selection controls such as movie, TV, or animation, or playback-operation controls such as watch-now or download. The controls in software-download or song-playback pages may include download and play controls.
Fig. 1 is a flowchart of a touch page control method disclosed by an embodiment of the present invention. As shown in Fig. 1, the method of the present invention includes the following steps:
Step S101: receive a page-open request;
Step S102: examine the page to be opened to determine whether it contains controls activated by operation;
Step S103: if the page contains such controls, count them and determine their positions;
Step S104: create gesture areas according to the number and positions of the controls, the number and positions of the gesture areas corresponding to the number and positions of the controls;
Step S105: display in the gesture areas the gestures required to activate the corresponding controls.
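Steps S101-S105 can be sketched in a few lines. This is a hypothetical illustration only, not the patent's implementation; all class names, field names, and the `"tap"` fallback are assumptions introduced here.

```python
# Hypothetical sketch of steps S101-S105: on a page-open request, detect
# operable controls, then create one gesture area per control and attach
# the gesture that activates it. All names are illustrative.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Control:
    name: str
    position: Tuple[int, int]  # (x, y) of the control on the page

@dataclass
class GestureArea:
    control: Control
    gesture: str  # the gesture shown in this area

@dataclass
class Page:
    controls: List[Control] = field(default_factory=list)
    gesture_areas: List[GestureArea] = field(default_factory=list)

def open_page(page: Page, gesture_for: Dict[str, str]) -> Page:
    # S102: judge whether the page to be opened contains operable controls
    if not page.controls:
        return page
    # S103: the controls' count and positions are available from the list
    # S104: create one gesture area per control, so numbers correspond
    # S105: each area carries the gesture required to activate its control
    page.gesture_areas = [
        GestureArea(c, gesture_for.get(c.name, "tap")) for c in page.controls
    ]
    return page
```

A page with two controls and a mapping for one of them would yield two gesture areas, the second falling back to the assumed default gesture.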
Fig. 2 is a schematic diagram of a page after gesture areas have been created with this method. The page in Fig. 2 comprises three regions: a control region, a column-content region, and a gesture region. The control region is on the left side of the page and contains three controls; taking a Xunlei Kankan or PPTV page as an example, these may be movie, TV, and animation controls. The column-content region is on the right side of the page and displays the content of the column corresponding to each control; in the Xunlei Kankan or PPTV example, it displays by default the content corresponding to the first control in the control region, i.e. the movie control, which may include information such as the titles and pictures of several movies. When the TV or animation control is activated, the column-content region switches to the content corresponding to that control. The gesture region is at the upper-right corner of the column-content region and displays the gestures required to activate the corresponding controls. In the embodiment shown in Fig. 2, the gesture region comprises three gesture areas A, B, and C. The positions of these three gesture areas correspond to the positions of the three controls in the control region, and each gesture area is larger than the operable area of its corresponding control.
Fig. 3 is a schematic diagram of the gesture displayed in gesture area A of Fig. 2. As shown in Fig. 3, the gesture for activating gesture area A starts at the left end of area A, slides horizontally or nearly horizontally to the right end of the area, and then continues sliding downward for a distance of no less than one quarter of the horizontal sliding distance. It should be noted that in the present invention the gestures displayed in the gesture areas may be the same or different; preferably they are different, so as to avoid erroneous operations caused by identical gestures. Moreover, the positions of the gesture areas are not limited to the arrangement above: a gesture area's position may be adjusted according to the position of its control, or the gesture area may be placed on a blank part of the page or a part with little text. In the latter case, an additional step of analyzing the distribution of text on the page is needed. Specifically, after determining that the page contains controls, the distribution of content (text and/or pictures) on the page is determined, and a gesture operation area is displayed in a region whose content falls below a predetermined value. In one embodiment, the predetermined value may relate to the proportion of the region's area occupied by content, e.g. the gesture area is displayed when content occupies less than 20% of the region's area.
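The content-ratio rule just described can be illustrated with a small geometric check. This is an assumed sketch: the rectangle representation and the 0.2 threshold are the only inputs taken from the text, everything else is hypothetical.

```python
# Illustrative check of the placement rule: a candidate region may host a
# gesture area when the content (text/images) inside it covers less than
# 20% of its area. Rectangles are (x0, y0, x1, y1) tuples.
def rect_area(rect):
    x0, y0, x1, y1 = rect
    return max(0, x1 - x0) * max(0, y1 - y0)

def can_host_gesture_area(region, content_rects, threshold=0.2):
    region_area = rect_area(region)
    if region_area == 0:
        return False
    covered = 0
    for x0, y0, x1, y1 in content_rects:
        # Clip each content rectangle to the candidate region before
        # summing, so content outside the region is not counted.
        rx0, ry0, rx1, ry1 = region
        clipped = (max(x0, rx0), max(y0, ry0), min(x1, rx1), min(y1, ry1))
        covered += rect_area(clipped)
    return covered / region_area < threshold
```

For a 100×100 region, a single 10×10 block of text (1% coverage) leaves the region eligible, while a 50×50 block (25% coverage) does not.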
In the embodiment shown in Fig. 2 the control region includes three controls, but it will be understood that it may include any other number of controls. In Fig. 4, for example, the control region includes only one control, which may be a software download control or a song playback control. Correspondingly, the column-content region may display related information such as the software's developer, version, and runtime environment, or the song's performer, lyricist, composer, and album. Fig. 5 is a schematic diagram of the gesture displayed in the gesture area of Fig. 4; as shown in Fig. 5, the gesture for activating the control of Fig. 4 is a V-shaped gesture.
Further, the method may also include: presetting the gestures required to activate particular kinds of controls. Accordingly, the step of determining the number and positions of the controls also determines their kinds; before the step of displaying in the gesture areas the gestures required to activate the corresponding controls, the method further extracts the required gestures from the preset gestures according to the kinds of the controls; and in the displaying step, the gestures displayed are those extracted from the preset gestures.
Further, after step S105 the method may also include: drawing a trajectory in a gesture area according to the displayed gesture; judging whether the trajectory matches the displayed gesture; and if it matches the displayed gesture, activating the control corresponding to that gesture area.
Further, after the trajectory is drawn in the gesture area according to the displayed gesture, the method may also include: correcting the drawn trajectory; accordingly, the step of judging whether the trajectory matches the displayed gesture is performed on the corrected result. It should be noted that the scope of the correction covers the straightness and length of the drawn trajectory and the region it sweeps. The correction eliminates deviations between the gesture as actually performed and its ideal form. For example, a line drawn by hand may have some curvature, and the correction may treat a curve whose radius of curvature exceeds a predetermined value as a straight line. Straightness covers each segment of the trajectory; in Fig. 3, for instance, it covers the horizontal and vertical segments. Length likewise covers each segment; in Fig. 3, it covers the lengths of the horizontal and vertical segments. The swept region is the smallest rectangle enclosing all endpoints of the trajectory, and correcting it means that if the swept region exceeds the corresponding gesture area within a given tolerance, it is still deemed not to have exceeded that area. For example, when drawing the gesture of Fig. 3, if the downward stroke goes beyond gesture area A within the allowed tolerance, the gesture is still deemed not to have left area A.
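Two of the correction checks described above, straightness and the swept-region tolerance, can be sketched as follows. The deviation and tolerance thresholds are illustrative assumptions, not values from the patent.

```python
# Sketch of the correction step: treat a nearly straight stroke as a
# straight line, and allow the stroke's bounding box to exceed the
# gesture area by a small tolerance. Thresholds are assumed values.
import math

def is_straight(points, max_deviation=5.0):
    # A stroke counts as straight if every point lies within
    # max_deviation of the chord from its first to its last point.
    (x0, y0), (x1, y1) = points[0], points[-1]
    chord = math.hypot(x1 - x0, y1 - y0)
    if chord == 0:
        return False
    for px, py in points[1:-1]:
        # Perpendicular distance from the point to the chord.
        dist = abs((x1 - x0) * (y0 - py) - (x0 - px) * (y1 - y0)) / chord
        if dist > max_deviation:
            return False
    return True

def within_area(points, area, tolerance=10.0):
    # Bounding box of the stroke, permitted to exceed the gesture
    # area (x0, y0, x1, y1) by at most `tolerance` in each direction.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    ax0, ay0, ax1, ay1 = area
    return (min(xs) >= ax0 - tolerance and max(xs) <= ax1 + tolerance
            and min(ys) >= ay0 - tolerance and max(ys) <= ay1 + tolerance)
```

A slightly wavy horizontal swipe passes the straightness check, while a stroke that bulges well away from the chord fails it; a stroke ending a few pixels past the area's edge still counts as inside.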
From the above it can be seen that the touch page control method of the present invention displays gesture areas corresponding to the controls on the page and shows in each gesture area the gesture required to activate the corresponding control, so that users operating pages with this method need not learn the various operating gestures, which reduces the learning cost. Moreover, the gestures shown in the gesture areas allow the controls on the page to be activated by different gestures, which lowers the probability of erroneous operation compared with activating controls by direct touch. Further, since the gesture areas in the present invention are larger than the operable areas of the corresponding controls, the probability of erroneous operation is reduced still further.
Fig. 6 is a block diagram of a touch page control system disclosed by the present invention. As shown in Fig. 6, the touch page control system of the present invention comprises a request receiving unit 201, a judging unit 202, a control-information determining unit 203, a gesture-area creating unit 204, and a display unit 205, wherein:
The request receiving unit 201 is configured to receive a page-open request;
The judging unit 202 is configured to examine the page to be opened to determine whether it contains operable controls;
The control-information determining unit 203 is configured to count said controls and determine their positions when the page contains them.
The gesture-area creating unit 204 is configured to create gesture areas according to the number and positions of the controls, the number and positions of the gesture areas corresponding to the number and positions of the controls; each gesture area is larger than the operable area of its corresponding control.
The display unit 205 is configured to display in the gesture areas the gestures required to activate the corresponding controls.
Further, the system also includes a gesture setting unit 206 for presetting the gestures required to activate particular kinds of controls. In embodiments including the gesture setting unit 206, the control-information determining unit 203 is also configured to determine the kinds of the controls, and the system further includes a gesture extracting unit 207 for extracting the required gestures from the gesture setting unit 206 according to the kinds of the controls.
Further, the judging unit 202 is also configured, after the gesture areas display the gestures required to activate the corresponding controls and an activating gesture has been made in a gesture area according to the displayed gesture, to judge whether the gesture applied to the gesture area matches the displayed gesture, and if so, to activate the corresponding control.
Further, the system also includes a correcting unit 208 for correcting the gesture trajectory applied to a gesture area. In embodiments including the correcting unit 208, the judging unit 202 makes its judgment after correction by the correcting unit 208. The scope of the correction covers the straightness and length of the drawn trajectory and the region it sweeps.
From the above it can be seen that the touch page control system of the present invention displays gesture areas corresponding to the controls on the page and shows in each gesture area the gesture required to activate the corresponding control, so that users operating pages with this system need not learn the various operating gestures, which reduces the learning cost. Moreover, the gestures shown in the gesture areas allow the controls on the page to be activated by different gestures, which lowers the probability of erroneous operation compared with activating controls by direct touch. Further, since the gesture areas in the present invention are larger than the operable areas of the corresponding controls, the probability of erroneous operation is reduced still further.
According to another aspect of the present invention, a method for creating a graphical user interface for a touch device is also provided. As shown in Fig. 7, the method includes the following steps:
Step S701: create operable controls. The creation method can be applied to various applications that include controls; operating a control activates the operation corresponding to that control.
Step S702: create gesture identification areas corresponding to the controls, each gesture identification area displaying a gesture capable of activating the same operation as the corresponding control. A gesture identification area may correspond to a control in various ways. In one embodiment, the controls and the gesture identification areas are laid out in the same order; referring to Figs. 9 and 10, control A at the top corresponds to gesture operation area A, control C at the bottom corresponds to gesture operation area C, and control B in the middle corresponds to gesture operation area B. Of course, the number of controls is not limited to that shown in Figs. 9-10; there may be fewer or more than three. In another embodiment, the controls and their gesture operation areas may be matched one-to-one by color, e.g. control A and gesture operation area A marked red, control B and gesture operation area B yellow, and control C and gesture operation area C blue. In other embodiments, other means may be used to indicate which control corresponds to which gesture operation area.
The corresponding gesture operation area displays the gesture to be used to activate the operation corresponding to the control. The gestures required to activate operations are described in detail below in connection with two embodiments provided by the present invention.
Step S703: create a gesture operation area for receiving the user's gesture.
In one embodiment, the gesture operation areas may correspond one-to-one with the gesture identification areas, and hence with the controls; that is, a gesture operation area is created for each control, and the correspondence between gesture operation areas and controls may be indicated as described above, e.g. by position or color. Each gesture operation area can then receive the gesture of its corresponding identification area to activate the operation its corresponding control can activate. In this case, different gesture identification areas may display the same or different gestures. Preferably, each gesture identification area lies within its corresponding gesture operation area, so as to exploit the correspondence between identification areas and controls. As shown in Fig. 9, gesture identification area A lies within gesture operation area A, gesture identification area B within gesture operation area B, and gesture identification area C within gesture operation area C. A gesture identification area lying within a gesture operation area includes the case where the two coincide.
In a preferred embodiment, each gesture operation area is larger than the operable area of its corresponding control. Referring to Fig. 9, the gesture operation areas are larger than the operable areas of the controls, making it easier for the user to perform gestures within them.
In another embodiment, referring to Fig. 10, there is a single gesture operation area, which, by way of example, may be located anywhere in the interface. The user's gesture is any one of the gestures capable of activating the same operations as the corresponding controls; that is, this single gesture operation area can receive all the gestures that activate the operations corresponding to the various controls. The operation corresponding to control A can be performed in this gesture operation area, and so can the operations corresponding to controls B and C. This arrangement enlarges the gesture operation area and makes accurate operation easier for the user.
Preferably, when multiple controls are displayed in the interface, each control corresponds to a different gesture. As shown in Fig. 10, gesture identification areas A-C display different gestures; by way of example, each of areas A, B, and C shows a distinct mark, and if the gesture received in the gesture operation area matches the mark displayed in gesture identification area A, the operation corresponding to control A is activated.
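Because each control is bound to a distinct gesture, dispatching within the single shared operation area reduces to a lookup. This is an assumed minimal sketch, not the patent's implementation; the gesture names and control labels are hypothetical.

```python
# Sketch of the shared-operation-area arrangement of Fig. 10: a
# recognized gesture is mapped to the control whose operation it
# activates, or to None when no control is bound to that gesture.
def dispatch_gesture(recognized, gesture_to_control):
    return gesture_to_control.get(recognized)
```

With a map like `{"V-shape": "control_A", "L-shape": "control_B"}`, any gesture the recognizer reports is routed to at most one control, so the single area can serve all controls without ambiguity.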
Further, before creating the gesture identification areas corresponding to the controls (i.e., step S702), the method provided by the present invention may also include the following steps, referring to Fig. 8:
Step S801: pre-create a gesture database comprising multiple stand-by gestures capable of activating the same operations as the corresponding controls. By way of example, gestures may be created for activating the operations corresponding to the various controls that may appear in different interfaces.
Step S802: extract from the gesture database the stand-by gestures corresponding to the controls, for use in creating the gesture identification areas corresponding to the controls. According to the controls appearing in the interface to be displayed, the corresponding stand-by gestures are looked up in the gesture database and used as the gestures that activate the operations corresponding to those controls.
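Steps S801-S802 can be sketched as a keyed lookup. This is a hedged illustration: the control types, gesture names, and the fallback default are invented here for the example and do not appear in the patent.

```python
# Hypothetical gesture database (step S801): stand-by gestures keyed by
# control type, for the kinds of controls that may appear in interfaces.
GESTURE_DB = {
    "download": "V-shape",
    "play": "L-shape",
    "category": "horizontal-swipe",
}

def gestures_for_controls(control_types, db=GESTURE_DB, default="tap"):
    # Step S802: look up the stand-by gesture for each control appearing
    # in the interface to be displayed; unknown types get a fallback.
    return {ctype: db.get(ctype, default) for ctype in control_types}
```

Given a page with a download control and an unrecognized control type, the lookup returns the database gesture for the former and the assumed default for the latter.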
Steps S701-S703 in Fig. 8 are respectively the same as or similar to steps S701-S703 shown in Fig. 7 and are therefore not described again here.
As shown in Figs. 2, 4, and 9-10, the graphical user interface may also include a column-content region occupying the area outside the region occupied by the controls. The gesture operation areas and/or gesture identification areas may be located in blank parts of the column-content region or in parts with little content. In another embodiment, as shown in Fig. 9, the gesture operation areas and/or gesture identification areas may be separate from the column-content region. In embodiments with only one gesture operation area (e.g. the embodiment of Fig. 10), the column-content region may coincide with the gesture operation area.
When no operation corresponding to a control has been activated, predetermined content is displayed. The predetermined content may be set by the user or be a default setting; it may, for example, be the column content corresponding to control A. When one of the operations corresponding to the multiple controls is activated, the content corresponding to the activated operation is displayed. For example, when the operation corresponding to control B is activated, the content corresponding to that operation may be displayed.
According to another aspect of the present invention, a touch device is also provided, comprising a memory and a processor, wherein the memory stores executable program code operable, when executed by the processor, to: create operable controls; create gesture identification areas corresponding to the controls, each gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving the user's gesture.
According to a further aspect of the present invention, a non-transitory computer program product comprising executable program code for a touch device is also provided. The non-transitory computer program product comprises a non-transitory digital data storage medium, such as a magnetic or optical disk, random-access memory (RAM), a magnetic hard disk, flash memory, and/or read-only memory (ROM). In one embodiment, the executable program code is operable, when executed, to: create operable controls; create gesture identification areas corresponding to the controls, each gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving the user's gesture. In another embodiment, the executable program code is operable, when executed, to: receive a page-open request; examine the page to be opened to determine whether it contains controls; if the page contains said controls, count them and determine their positions; create gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and display in the gesture areas the gestures required to activate the corresponding controls.
The above are merely preferred embodiments of the present invention and do not limit the invention in any form. Although the invention has been disclosed above by way of preferred embodiments, they are not intended to limit it. Any person skilled in the art may, without departing from the scope of the technical solution of the invention, use the technical content disclosed above to make minor alterations or modifications as equivalent embodiments of equivalent variation; any simple modification, equivalent variation, or refinement made to the above embodiments in accordance with the technical essence of the invention, without departing from the content of its technical solution, still falls within the scope of the technical solution of the present invention.

Claims

1. A touch page control method, comprising:
receiving a page-open request;
examining the page to be opened to determine whether it contains controls;
if the page contains said controls, counting said controls and determining their positions;
creating gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and
displaying in the gesture areas the gestures required to activate the corresponding controls.
2. The touch page control method of claim 1, wherein each gesture area is larger than the operable area of its corresponding control.
3. The touch page control method of claim 1, further comprising:
presetting the gestures required to activate particular kinds of controls;
in the step of determining the number of controls, also determining the kinds of the controls;
before the step of displaying in the gesture areas the gestures required to activate the corresponding controls, extracting the required gestures from the preset gestures according to the kinds of the controls;
wherein in the step of displaying in the gesture areas the gestures required to activate the corresponding controls, the gestures displayed are those extracted from the preset gestures.
4. The touch page control method of claim 1, further comprising:
drawing a trajectory in a gesture area according to the displayed gesture;
judging whether the trajectory matches the displayed gesture; and
if the trajectory matches the displayed gesture, activating the control corresponding to that gesture area.
5. The touch page control method of claim 4, further comprising:
correcting the drawn trajectory;
wherein the step of judging whether the trajectory matches the displayed gesture is performed on the corrected result.
6. The touch page control method of claim 5, wherein the scope of the correction covers the straightness and length of the drawn trajectory and the region it sweeps.
7. A touch page control system, comprising:
a request receiving unit for receiving a page-open request;
a judging unit for examining the page to be opened to determine whether it contains controls;
a control-information determining unit for counting said controls and determining their positions when the page contains said controls;
a gesture-area creating unit for creating gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and
a display unit for displaying in the gesture areas the gestures required to activate the corresponding controls.
8. The touch page control system of claim 7, wherein each gesture area is larger than the operable area of its corresponding control.
9. The touch page control system of claim 7, further comprising:
a gesture setting unit for presetting the gestures required to activate particular kinds of controls; wherein the control-information determining unit is also configured to determine the kinds of the controls;
the system further comprising:
a gesture extracting unit for extracting the required gestures from the gesture setting unit according to the kinds of the controls.
10. The touch page control system of claim 7, wherein the judging unit is also configured to judge whether a gesture applied to a gesture area matches the displayed gesture.
11. The touch page control system of claim 10, further comprising:
a correcting unit for correcting the gesture trajectory applied to a gesture area;
wherein the judging unit makes its judgment after correction by the correcting unit.
12. The touch page control system of claim 11, wherein the scope of the correction covers the straightness and length of the drawn trajectory and the region it sweeps.
13. A method for creating a graphical user interface for a touch device, comprising:
creating operable controls;
creating gesture identification areas corresponding to the controls, each gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and
creating a gesture operation area for receiving the user's gesture.
14. The method of claim 13, wherein the gesture operation areas correspond one-to-one with the gesture identification areas, each gesture operation area receiving the gesture of its corresponding gesture identification area to activate the operation its corresponding control can activate.
15. The method of claim 14, wherein each gesture identification area lies within its corresponding gesture operation area.
16. The method of claim 14, wherein each gesture operation area is larger than the operable area of its corresponding control.
17. The method of claim 13, wherein there is a single gesture operation area, and the user's gesture is any one of the gestures capable of activating the same operations as the corresponding controls.
18. The method of claim 17, wherein there are multiple controls, each control corresponding to a different gesture.
19. The method of claim 13, further comprising, before creating the gesture identification areas corresponding to the controls:
pre-creating a gesture database comprising multiple stand-by gestures capable of activating the same operations as the corresponding controls; and
extracting from the gesture database the stand-by gestures corresponding to the controls, for use in creating the gesture identification areas corresponding to the controls.
20. The method of claim 13, wherein a column-content region is created in the area outside the region occupied by the controls, the column-content region being configured to:
display predetermined content when the operations corresponding to the controls have not been activated; and
display the content corresponding to the activated operation when one of the operations corresponding to the controls is activated.
21. A touch device, comprising a memory and a processor, wherein
the memory stores executable program code operable, when executed by the processor, to:
create operable controls;
create gesture identification areas corresponding to the controls, each gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and
create a gesture operation area for receiving the user's gesture.
22. A non-transitory computer program product comprising executable program code for a touch device, the executable program code operable, when executed, to:
create operable controls;
create gesture identification areas corresponding to the controls, each gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and
create a gesture operation area for receiving the user's gesture.
PCT/CN2013/085881 2012-10-24 2013-10-24 Touch page control method and system WO2014063643A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/437,201 US9772768B2 (en) 2012-10-24 2013-10-24 Touch page control method and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201210409083.0 2012-10-24
CN201210409083.0A CN103777881B (zh) 2012-10-24 2012-10-24 Touch device page control method and system

Publications (1)

Publication Number Publication Date
WO2014063643A1 true WO2014063643A1 (zh) 2014-05-01

Family

ID=50544030

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/085881 WO2014063643A1 (zh) 2012-10-24 2013-10-24 一种触控页面控制方法及系统

Country Status (3)

Country Link
US (1) US9772768B2 (zh)
CN (1) CN103777881B (zh)
WO (1) WO2014063643A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104461355A (zh) * 2014-11-18 2015-03-25 苏州佳世达电通有限公司 Electronic device operation method and electronic device
WO2020037469A1 (zh) * 2018-08-20 2020-02-27 华为技术有限公司 Interface display method and electronic device
CN113946271A (zh) * 2021-11-01 2022-01-18 北京字跳网络技术有限公司 Display control method and apparatus, electronic device, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193735A (zh) * 2011-03-24 2011-09-21 北京思创银联科技股份有限公司 Touch operation method
CN102193720A (zh) * 2010-03-03 2011-09-21 宏碁股份有限公司 Content selection method and touch system thereof

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6476834B1 (en) * 1999-05-28 2002-11-05 International Business Machines Corporation Dynamic creation of selectable items on surfaces
US8364688B1 (en) * 1999-09-07 2013-01-29 Thomas C Douglass System and method for providing and updating on-line forms and registrations
TW201009650A (en) * 2008-08-28 2010-03-01 Acer Inc Gesture guide system and method for controlling computer system by gesture
KR20100118366A (ko) * 2009-04-28 2010-11-05 삼성전자주식회사 Method for operating a touch screen of a portable terminal and portable terminal supporting the same
US8847880B2 (en) * 2009-07-14 2014-09-30 Cywee Group Ltd. Method and apparatus for providing motion library
US8436821B1 (en) * 2009-11-20 2013-05-07 Adobe Systems Incorporated System and method for developing and classifying touch gestures
CN102236502A (zh) * 2010-04-21 2011-11-09 上海三旗通信科技有限公司 Human-machine interaction mode for pressure touch gesture recognition on a mobile terminal
CN102063244A (zh) * 2010-05-26 2011-05-18 绩优科技(深圳)有限公司 Control method for the focus window on a touch screen
US9164542B2 (en) * 2010-08-31 2015-10-20 Symbol Technologies, Llc Automated controls for sensor enabled user interface
CN102694942B (zh) * 2011-03-23 2015-07-15 株式会社东芝 Image processing apparatus, method for displaying operation manner, and method for displaying screen
CN102681774B (zh) * 2012-04-06 2015-02-18 优视科技有限公司 Method, apparatus and mobile terminal for controlling an application interface through gestures

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193720A (zh) * 2010-03-03 2011-09-21 宏碁股份有限公司 Content selection method and touch system thereof
CN102193735A (zh) * 2011-03-24 2011-09-21 北京思创银联科技股份有限公司 Touch operation method

Also Published As

Publication number Publication date
US9772768B2 (en) 2017-09-26
CN103777881B (zh) 2018-01-09
US20150277747A1 (en) 2015-10-01
CN103777881A (zh) 2014-05-07

Similar Documents

Publication Publication Date Title
US10444976B2 (en) Drag and drop for touchscreen devices
AU2014200472B2 (en) Method and apparatus for multitasking
US20160004373A1 (en) Method for providing auxiliary information and touch control display apparatus using the same
CN105573639B (zh) Method and system for triggering the display of an application
US20130342480A1 (en) Apparatus and method for controlling a terminal using a touch input
US20120176313A1 (en) Display apparatus and voice control method thereof
US10739953B2 (en) Apparatus and method for providing user interface
WO2013091467A1 (zh) Method and device for controlling an application interface through a drag gesture
CN103597438B (zh) Information processing terminal and method, and recording medium
US20130241829A1 (en) User interface method of touch screen terminal and apparatus therefor
EP2575009A2 (en) User interface method for a portable terminal
JP6012770B2 (ja) Method and terminal for creating a new folder on a touch screen device
US20130191769A1 (en) Apparatus and method for providing a clipboard function in a mobile terminal
US20150363086A1 (en) Information processing terminal, screen control method, and screen control program
US20150286356A1 (en) Method, apparatus, and terminal device for controlling display of application interface
EP2613228A1 (en) Display apparatus and method of editing displayed letters in the display apparatus
WO2013182141A1 (zh) Human-computer interaction method and apparatus, and electronic device thereof
JP5963291B2 (ja) Method and apparatus for inputting symbols from a touch-sensitive screen
WO2022242542A1 (zh) Application icon management method and electronic device
WO2014063643A1 (zh) Touch page control method and system
US20160062601A1 (en) Electronic device with touch screen and method for moving application functional interface
US20160041960A1 (en) Method and device for controlling the same
JP7142961B2 (ja) Multilingual keyboard system
US9069398B1 (en) Electronic device having a touch panel display and a method for operating the same
JP2015166970A (ja) Program for a character input system and information processing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13848520

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14437201

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.09.2015)

122 Ep: pct application non-entry in european phase

Ref document number: 13848520

Country of ref document: EP

Kind code of ref document: A1