WO2014063643A1 - A Touch Page Control Method and System - Google Patents
A Touch Page Control Method and System
- Publication number
- WO2014063643A1 (PCT/CN2013/085881)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- control
- area
- page
- controls
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- The present invention relates to touch control technology for a terminal, and more particularly to a touch page control method and system.

Background Art
- Touch-type electronic devices, such as touch-screen mobile phones and tablet computers, have become widespread.
- Touch-type electronic devices mainly perform specific operations through agreed-upon touch actions, such as turning, advancing, rewinding, and enlarging pages through specific gestures.
- However, users of touch-sensitive electronic devices need long-term exploration, or to consult others, before they know these actions, which increases the user's learning cost and leads to a poor user experience.
- A page may contain buttons for operations such as playing videos, downloading software, or playing songs.
- These buttons need to be activated by touch to perform the desired operation.
- Because the buttons are relatively small, it is easy to accidentally touch a nearby button instead of the intended one, causing an erroneous operation.
- A touch page control method comprising: receiving a page open request; determining the page to be opened and determining whether a control exists in the page; if a control exists in the page, counting the number of controls and determining their positions; creating gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and displaying in each gesture area the gesture required to activate the corresponding control.
- A touch page control system comprising: a request receiving unit, configured to receive a page open request; a determining unit, configured to determine the page to be opened and determine whether a control exists in the page; a control information determining unit, configured to count the number of controls and determine their positions when controls exist in the page; and a gesture area creation unit, configured to create gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls.
- A method for creating a graphical user interface for a touch device comprising: creating an operable control; creating a gesture identification area corresponding to the control, wherein the gesture identification area displays a gesture capable of activating the same operation as the corresponding control; and creating a gesture operation area for receiving a gesture of the user.
- A touch device comprising a memory and a processor, wherein the memory stores executable program code operable, when executed by the processor, to: create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
- A non-transitory computer program product comprising executable program code for a touch device, the executable program code operable, when executed, to: create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture capable of activating the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
- The touch page control method and system of the present invention display gesture areas corresponding to the controls on the page, and display in each gesture area the gesture required to activate the corresponding control, so that users of pages operated by this method and system do not need to learn a variety of operating gestures, reducing the user's learning cost.
- Because the gestures displayed in the gesture areas allow the controls in the page to be activated by different gestures, the chance of misoperation is reduced compared with activating controls directly by touch.
- FIG. 1 is a flowchart of a method for controlling a touch page according to the present invention.
- FIG. 2 is a schematic diagram of a page after the gesture area is created by using the method shown in FIG. 1.
- Figure 3 is a schematic illustration of the gesture displayed by gesture area A in Figure 2.
- FIG. 4 is a schematic diagram of another page after the gesture area is created by using the method shown in FIG. 1.
- FIG. 5 is a schematic diagram showing the gesture displayed by the gesture area in FIG. 4.
- FIG. 6 is a block diagram of a touch page control system according to the present invention.
- FIG. 7 is a flow chart of a method for creating a graphical user interface for a touch device according to the present invention.
- FIG. 8 is a flow chart of another method for creating a graphical user interface for a touch device according to the present disclosure.
- FIG. 9 is a schematic diagram of a graphical user interface created by the method provided by the present invention.
- FIG. 10 is a schematic diagram of another graphical user interface created by the method provided by the present invention.

Detailed Description
- the touch device in the present invention mainly includes a touch screen mobile phone, a tablet computer, etc., but is not limited thereto.
- A page with controls in the present invention refers to a page containing operable controls, such as the operation pages of Thunder Kankan or PPTV, or software download and song playback pages, but is not limited thereto.
- The Thunder Kankan and PPTV operation pages contain controls activated by operation, such as buttons activated by clicking; these may be category selection controls, such as movie, TV, and anime, or playback controls, such as watch now and download.
- The controls included in software download and song playback pages can include controls such as download and play.
- FIG. 1 is a flowchart of a touch page control method according to an embodiment of the present invention. As shown in Figure 1, the method of the present invention comprises the following steps:
- Step S101: receive a page open request;
- Step S102: determine the page to be opened, and determine whether a control activated by operation exists in the page;
- Step S103: if such a control exists in the page, count the number of controls and determine their positions;
- Step S104: create gesture areas according to the number of controls and their positions, the number and positions of the gesture areas corresponding to the number and positions of the controls;
- Step S105: display in each gesture area the gesture required to activate the corresponding control.
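The flow of steps S101-S105 can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Control`/`GestureArea` classes, the gesture names, and the positions are all assumptions introduced for the example.

```python
# Sketch of steps S101-S105: on a page-open request, find the operable
# controls, then create one gesture area per control and attach the
# gesture that activates it. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    x: int  # position of the control on the page
    y: int

@dataclass
class GestureArea:
    control: Control
    gesture: str  # gesture displayed to the user in this area

def open_page(controls, gesture_for):
    """S102: if no operable control exists, no gesture areas are made.
    S103: count controls and note their positions.
    S104/S105: one gesture area per control, same order as the controls,
    each displaying the gesture that activates its control."""
    if not controls:
        return []
    return [GestureArea(c, gesture_for(c.name)) for c in controls]

gestures = {"movie": "L-shape", "tv": "circle", "anime": "zigzag"}
page_controls = [Control("movie", 0, 0), Control("tv", 0, 40), Control("anime", 0, 80)]
areas = open_page(page_controls, gestures.get)
print([(a.control.name, a.gesture) for a in areas])
```

The one-to-one pairing mirrors the requirement that the number and positions of gesture areas correspond to the number and positions of the controls.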
- Figure 2 is a schematic diagram of a page after creating a gesture area using this method.
- The page includes three areas: the control area, the column content area, and the gesture area.
- The control area is located on the left side of the page and includes three controls; the page may be a Thunder Kankan or PPTV page, and the three controls may be movie, TV, and animation controls.
- the content area of the column is located on the right side of the page, and is used to display the content of the corresponding column of each control.
- The column content area displays the content corresponding to the first control of the control area, that is, the movie control.
- the content may include information such as the name, picture, and the like of the plurality of movies.
- When the TV or animation control is activated, the column content area switches accordingly to the content corresponding to the TV control or the animation control.
- the gesture area is located in the upper right corner of the content area of the column and is used to display the gestures required to activate the corresponding control.
- the gesture area includes three gesture areas A, B, and C.
- the positions of the three gesture areas correspond to the positions of the three controls in the control area, and the areas of the three gesture areas are respectively larger than the operable area of the corresponding control.
- Figure 3 is a schematic illustration of the gesture displayed by gesture area A in Figure 2.
- The gesture for activating gesture area A in FIG. 2 starts from the left end of gesture area A, slides to the right horizontally or approximately horizontally to the right end of the gesture area, and then continues to slide vertically downward for a distance.
- The longitudinal sliding distance is not less than one quarter of the lateral sliding distance.
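The two-segment gesture just described can be checked by a small matcher: a roughly horizontal left-to-right stroke, then a downward stroke at least a quarter as long. The point format, the split-at-rightmost-point heuristic, and the slope tolerance are assumptions for this sketch, not details from the patent.

```python
# Minimal matcher for the FIG. 3-style gesture: horizontal slide to the
# right, then vertical slide down, with the vertical distance at least
# one quarter of the horizontal distance. Tolerances are illustrative.
def matches_l_gesture(points, slope_tol=0.2):
    """points: list of (x, y) with y growing downward.
    Split the trajectory at its rightmost point and test both segments."""
    if len(points) < 3:
        return False
    turn = max(range(len(points)), key=lambda i: points[i][0])
    (x0, y0), (xt, yt) = points[0], points[turn]
    (xe, ye) = points[-1]
    dx, dy = xt - x0, yt - y0
    # first segment must move rightward and stay approximately horizontal
    if dx <= 0 or abs(dy) > slope_tol * dx:
        return False
    drop = ye - yt
    # second segment must move downward and stay approximately vertical
    if drop <= 0 or abs(xe - xt) > slope_tol * drop:
        return False
    # longitudinal distance >= 1/4 of the lateral distance
    return drop >= dx / 4

good = [(0, 0), (40, 1), (80, 0), (80, 25)]
bad = [(0, 0), (80, 0), (80, 5)]  # vertical part too short
print(matches_l_gesture(good), matches_l_gesture(bad))
```

A real recognizer would first apply the corrections described later (straightness, length, and swept area) before this comparison.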
- The gestures displayed in the gesture areas may be the same or different; preferably, the gestures of the gesture areas are different, thereby avoiding erroneous operations caused by identical gestures.
- The position of the gesture area is not limited to the above; its position may be adjusted according to the position of the control, or the gesture area may be set on a blank portion of the page or a portion with less text.
- When the gesture area is set this way, a step of judging the page text needs to be added to determine the distribution of text on the page.
- The distribution of content (including text and/or pictures) in the page is determined, and the gesture operation area is displayed in an area where the content is less than a predetermined value.
- The predetermined value may relate to the area ratio of content in the region; for example, the gesture area is displayed where the area ratio of the content is less than 20%.
- the method may further include: presetting a gesture required to activate a particular kind of control.
- The step of determining the number and positions of the controls further comprises determining the type of each control.
- The method further comprises extracting the required gesture from the preset gestures according to the type of the control; in the step of displaying in the gesture area the gesture required to activate the corresponding control, the displayed gesture is the gesture extracted from the preset gestures.
- The method may further include: drawing a corresponding trajectory in the gesture area according to the displayed gesture; determining whether the trajectory matches the displayed gesture; and, if it matches the displayed gesture, activating the control corresponding to the gesture area.
- The method may further include correcting the drawn trajectory after the corresponding trajectory is drawn in the gesture area according to the displayed gesture; correspondingly, the step of determining whether the trajectory matches the displayed gesture is judged according to the corrected result.
- The correction covers the straightness, the length, and the swept area of the drawn trajectory.
- The correction is used to eliminate the deviation between the actual operating gesture and the ideal one. For example, a line actually drawn may have some curvature, and the correction may treat a curve whose radius of curvature is greater than a predetermined value as a straight line.
- The straightness includes the straightness of each segment of the trajectory; for example, for the gesture in FIG. 3, it includes the straightness of the transverse segment and of the longitudinal segment.
- The length includes the length of each segment of the trajectory, for example, the length of the transverse segment and the length of the longitudinal segment.
- The swept area refers to the area of the smallest rectangle that encloses the whole trajectory.
- The correction of the swept area means that, within a certain error range, a trajectory whose swept area exceeds the range of the corresponding gesture area is still considered not to exceed it. For example, when the gesture shown in FIG. 3 is drawn, if the trajectory exceeds gesture area A within the allowed error while sliding longitudinally, the gesture is still considered not to exceed gesture area A.
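The two corrections described above can be sketched as follows: a slightly curved stroke is accepted as straight when its deviation from the chord between its endpoints is small (a stand-in for the radius-of-curvature test in the text), and a trajectory that overruns the gesture area within a small margin still counts as inside it. The tolerance values are assumptions.

```python
# Sketch of the straightness and swept-area corrections.
def is_straight(points, max_dev=3.0):
    """Accept the stroke as straight if every point lies within
    max_dev of the chord from the first point to the last point."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    length = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 or 1.0
    for x, y in points:
        # perpendicular distance from (x, y) to the chord
        dev = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / length
        if dev > max_dev:
            return False
    return True

def inside_area(points, area, margin=5):
    """area: (left, top, right, bottom). The bounding box of the stroke
    may exceed the area by up to `margin` pixels and still count."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    l, t, r, b = area
    return (min(xs) >= l - margin and max(xs) <= r + margin and
            min(ys) >= t - margin and max(ys) <= b + margin)

stroke = [(0, 0), (40, 2), (80, 1)]  # a slightly wobbly horizontal line
print(is_straight(stroke), inside_area(stroke, (0, 0, 78, 10)))
```

Note how the second call tolerates the stroke reaching x = 80 against a gesture area that ends at x = 78, matching the "within the allowed error" behavior described for FIG. 3.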
- The touch page control method of the present invention displays gesture areas corresponding to the controls on the page, and displays in each gesture area the gesture required to activate the corresponding control, so that users of pages operated by this method do not need to learn various operating gestures, which reduces the user's learning cost. Moreover, the gestures displayed in the gesture areas allow the controls in the page to be activated by different gestures, which reduces the probability of misoperation compared with activating controls by touch. Further, since the area of each gesture area in the present invention is larger than the operable area of the corresponding control, the probability of erroneous operation is further reduced.
- FIG. 6 is a block diagram of a touch page control system according to the present invention.
- the touch page control system of the present invention includes a request receiving unit 201, a determining unit 202, a control information determining unit 203, a gesture area creating unit 204, and a display unit 205.
- The request receiving unit 201 is configured to receive an open request for a page.
- The determining unit 202 is configured to determine the page to be opened and determine whether an operable control exists in the page.
- the gesture area creating unit 204 is configured to create a gesture area according to the number of controls and the location thereof, and the number and position of the gesture area correspond to the number and position of the control; the area of the gesture area is larger than the operable area of the corresponding control.
- the display unit 205 is for displaying a gesture required to activate the corresponding control in the gesture area.
- the system further includes a gesture setting unit 206 for presetting a gesture required to activate a particular type of control.
- control information determining unit 203 is also used to determine the type of control.
- The system further includes a gesture extraction unit 207 for extracting the desired gesture from the gesture setting unit 206 according to the type of the control.
- The determining unit 202 is further configured to, after the gesture required to activate the corresponding control is displayed in the gesture area and a gesture is performed in the gesture area according to the displayed gesture, determine whether the gesture acting on the gesture area matches the displayed gesture, and if they match, activate the corresponding control.
- the system further includes a correction unit 208 for correcting the gesture track acting on the gesture area.
- The determining unit 202 makes its determination after the correction unit 208 has performed the correction.
- the range of corrections includes corrections for the straightness, length, and swept area of the trajectory.
- The touch page control system of the present invention displays gesture areas corresponding to the controls on the page, and displays in each gesture area the gesture required to activate the corresponding control, so that users of pages operated by this system do not need to learn various operating gestures, which reduces the user's learning cost. Moreover, the gestures displayed in the gesture areas allow the controls in the page to be activated by different gestures, which reduces the probability of misoperation compared with activating controls by touch. Further, since the area of each gesture area in the present invention is larger than the operable area of the corresponding control, the probability of erroneous operation is further reduced.
- a method of creating a graphical user interface for a touch device includes the following steps:
- Step S701 creating an operable control.
- the creation method can be applied to various applications including controls, and the operation corresponding to the control can be activated by operating the control.
- Step S702 Create a gesture identification area corresponding to the control, and the gesture identification area displays a gesture that can activate the same operation as the corresponding control.
- the gesture identification area and the control can correspond in various ways.
- The positions of the controls and gesture identification areas can be arranged in the same manner; see FIGS. 9 and 10: the control A at the topmost position corresponds to gesture operation area A; the control C at the bottommost position corresponds to gesture operation area C; and the control B at the intermediate position corresponds to gesture operation area B.
- the number of controls is not limited to the number shown in Figures 9-10, and may be less than three or more than three.
- A control and its corresponding gesture operation area may also be associated by color; for example, control A and gesture operation area A are marked red; control B and gesture operation area B are marked yellow; and control C and gesture operation area C are marked blue.
- other ways may be used to prompt the user which control corresponds to which gesture operating area.
- the corresponding gesture operation area displays the gestures required to activate the corresponding control.
- the gestures required for the activation operation will be described in detail later in connection with the two embodiments provided by the present invention.
- Step S703 Create a gesture operation area for receiving a gesture of the user.
- The gesture operation areas may correspond one-to-one with the gesture identification areas, and thus one-to-one with the controls; that is, as many gesture operation areas are created as there are controls, and the correspondence between a gesture operation area and its control may be indicated in the ways described above, for example by positional relationship or by color.
- Each gesture operation area may receive the gesture of its corresponding gesture identification area to activate the operation that the corresponding control can activate; in this case, the gestures displayed in the different gesture identification areas may be the same or different.
- Each gesture identification area is located in its corresponding gesture operation area, so that the correspondence between the gesture identification area and the control is conveyed through the gesture operation area. As shown in the figures, gesture identification area A is located in the corresponding gesture operation area A, gesture identification area B in gesture operation area B, and gesture identification area C in gesture operation area C.
- the gesture identification area is located in the gesture operation area, and the gesture identification area coincides with the gesture operation area.
- the area of each of the gesture operating zones is greater than the operable area of the corresponding control.
- The gesture operation area has a larger operable area than the control, making it easier for the user to perform gestures in the gesture operation area.
- In another embodiment, there is a single gesture operation area.
- The gesture operation area can be located anywhere in the interface.
- The user's gesture may be any of the gestures that can activate the same operation as a corresponding control; that is, any gesture that activates the operation of any of the controls can be received within the gesture operation area.
- For example, the gesture that activates the operation corresponding to control A can be performed in the gesture operation area, and the gestures corresponding to controls B and C can also be performed there. In this way, the gesture operation area is enlarged, which makes it easier for the user to operate.
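In this single-shared-area embodiment, a received gesture must be matched against every control's gesture to find which operation to activate. A minimal dispatcher could look like the following; the gesture names and control labels are illustrative, and real gesture recognition is stubbed as exact string comparison.

```python
# Dispatcher for the shared gesture operation area: any control's
# gesture may be drawn there, and the matching control is activated.
def dispatch(received, gesture_to_control):
    """Return the control whose gesture matches the received one,
    or None if no gesture matches (no control is activated)."""
    for gesture, control in gesture_to_control.items():
        if received == gesture:  # stand-in for real trajectory matching
            return control
    return None

table = {"L-shape": "control A", "circle": "control B", "check": "control C"}
print(dispatch("circle", table), dispatch("spiral", table))
```

This is why the text requires each control to correspond to a different gesture in this embodiment: with a shared area, identical gestures would make the dispatch ambiguous.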
- each of the controls respectively corresponds to a different gesture.
- Different gestures are displayed in gesture identification areas A to C; the identifiers displayed in gesture identification areas A, B, and C are three distinct gesture marks.
- the method further includes the following steps, as shown in FIG. 8:
- Step 801 is executed to create a gesture database in advance, and the gesture database includes a plurality of standby gestures that can activate the same operation as the corresponding control.
- gestures can be created that activate the operations corresponding to the various controls that may occur within different interfaces.
- Step 802 is performed: the standby gesture corresponding to the control is extracted from the gesture database according to the control, and used to create the gesture identification area corresponding to the control.
- the corresponding standby gesture is found from the gesture database, and is used as a gesture for activating the operation corresponding to the control.
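Steps 801-802 can be sketched as a lookup into a pre-built table. The control types, gesture names, and the "pick a distinct gesture per control" policy are assumptions for this example; the patent only specifies that standby gestures are created in advance and extracted per control.

```python
# Step 801: a gesture database built in advance, mapping control types
# to standby gestures that activate the same operations.
GESTURE_DB = {
    "play":     ["triangle", "double-tap"],
    "download": ["down-arrow", "check"],
    "category": ["L-shape", "circle", "zigzag"],
}

def assign_gestures(control_types):
    """Step 802: extract one standby gesture per control from the
    database, keeping gestures distinct so that no two controls on
    the same page share a gesture."""
    used, assigned = set(), {}
    for ctype in control_types:
        gesture = next(g for g in GESTURE_DB[ctype] if g not in used)
        used.add(gesture)
        assigned[ctype] = gesture
    return assigned

print(assign_gestures(["play", "download"]))
```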
- Steps 701-703 included in FIG. 8 are the same as or similar to steps 701-703 shown in FIG. 7, and are therefore not described in detail here.
- the graphical user interface may further include a column content area, and the column content area occupies an area other than the control area.
- the gesture operation area and/or the gesture identification area may be located in a blank area in the content area of the column or in an area with less content.
- the gesture operating area and/or the gesture identification area may be distinguished from the column content.
- The column content area may coincide with the gesture operation area.
- When the operation corresponding to a control has not been activated, predetermined content is displayed.
- The predetermined content may be set by the user or taken as a default; it may be, for example, the column content corresponding to control A.
- When an operation is activated, the content corresponding to that operation is displayed; for example, when the operation corresponding to control B is activated, the content corresponding to control B can be displayed.
- A touch device comprising a memory and a processor, wherein the memory stores executable program code operable, when executed by the processor, to: create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture that can activate the same operation as the corresponding control; and create a gesture operation area for receiving the user's gesture.
- a non-transitory computer program product comprising executable program code for a touch device.
- the non-transitory computer program product includes a non-transitory digital data storage medium such as a magnetic or optical disk, a random access memory (RAM), a magnetic hard disk, a flash memory, and/or a read only memory (ROM).
- The executable program code is operable, when executed, to: create an operable control; create a gesture identification area corresponding to the control, the gesture identification area displaying a gesture that can activate the same operation as the corresponding control; and create a gesture operation area for receiving a gesture of the user.
- The executable program code is operable, when executed, to: receive a page open request; determine the page to be opened and determine whether a control exists in the page; if a control exists in the page, count the number of controls and determine their positions; create gesture areas according to the number of controls, the number of gesture areas corresponding to the number of controls; and display in each gesture area the gestures required to activate the corresponding controls.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/437,201 US9772768B2 (en) | 2012-10-24 | 2013-10-24 | Touch page control method and system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210409083.0 | 2012-10-24 | ||
CN201210409083.0A CN103777881B (zh) | 2012-10-24 | 2012-10-24 | Touch device page control method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014063643A1 true WO2014063643A1 (zh) | 2014-05-01 |
Family
ID=50544030
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2013/085881 WO2014063643A1 (zh) | 2012-10-24 | 2013-10-24 | 一种触控页面控制方法及系统 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9772768B2 (zh) |
CN (1) | CN103777881B (zh) |
WO (1) | WO2014063643A1 (zh) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104461355A (zh) * | 2014-11-18 | 2015-03-25 | 苏州佳世达电通有限公司 | Electronic device operation method and electronic device |
WO2020037469A1 (zh) * | 2018-08-20 | 2020-02-27 | 华为技术有限公司 | Interface display method and electronic device |
CN113946271A (zh) * | 2021-11-01 | 2022-01-18 | 北京字跳网络技术有限公司 | Display control method and apparatus, electronic device, and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102193735A (zh) * | 2011-03-24 | 2011-09-21 | 北京思创银联科技股份有限公司 | Touch operation method |
CN102193720A (zh) * | 2010-03-03 | 2011-09-21 | 宏碁股份有限公司 | Content selection method and touch system thereof |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6476834B1 (en) * | 1999-05-28 | 2002-11-05 | International Business Machines Corporation | Dynamic creation of selectable items on surfaces |
US8364688B1 (en) * | 1999-09-07 | 2013-01-29 | Thomas C Douglass | System and method for providing and updating on-line forms and registrations |
TW201009650A (en) * | 2008-08-28 | 2010-03-01 | Acer Inc | Gesture guide system and method for controlling computer system by gesture |
KR20100118366A (ko) * | 2009-04-28 | 2010-11-05 | 삼성전자주식회사 | Touch screen operating method for a portable terminal, and portable terminal supporting the same |
US8847880B2 (en) * | 2009-07-14 | 2014-09-30 | Cywee Group Ltd. | Method and apparatus for providing motion library |
US8436821B1 (en) * | 2009-11-20 | 2013-05-07 | Adobe Systems Incorporated | System and method for developing and classifying touch gestures |
CN102236502A (zh) * | 2010-04-21 | 2011-11-09 | 上海三旗通信科技有限公司 | Human-computer interaction mode for pressure touch gesture recognition on a mobile terminal |
CN102063244A (zh) * | 2010-05-26 | 2011-05-18 | 绩优科技(深圳)有限公司 | Control method for a focus window on a touch screen |
US9164542B2 (en) * | 2010-08-31 | 2015-10-20 | Symbol Technologies, Llc | Automated controls for sensor enabled user interface |
CN102694942B (zh) * | 2011-03-23 | 2015-07-15 | 株式会社东芝 | Image processing apparatus, operation method display method, and screen display method |
CN102681774B (zh) | 2012-04-06 | 2015-02-18 | 优视科技有限公司 | Method, device, and mobile terminal for controlling an application interface through gestures |
- 2012-10-24: CN application CN201210409083.0A granted as CN103777881B (active)
- 2013-10-24: WO application PCT/CN2013/085881 published as WO2014063643A1 (application filing)
- 2013-10-24: US application US14/437,201 granted as US9772768B2 (active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102193720A (zh) * | 2010-03-03 | 2011-09-21 | 宏碁股份有限公司 | Content selection method and touch system thereof |
CN102193735A (zh) * | 2011-03-24 | 2011-09-21 | 北京思创银联科技股份有限公司 | Touch operation method |
Also Published As
Publication number | Publication date |
---|---|
US20150277747A1 (en) | 2015-10-01 |
CN103777881A (zh) | 2014-05-07 |
CN103777881B (zh) | 2018-01-09 |
US9772768B2 (en) | 2017-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10444976B2 (en) | Drag and drop for touchscreen devices | |
US20160004373A1 (en) | Method for providing auxiliary information and touch control display apparatus using the same | |
CN105573639B (zh) | Method and system for triggering display of an application | |
US20130342480A1 (en) | Apparatus and method for controlling a terminal using a touch input | |
US20190095059A1 (en) | Method and device for processing application icon and electronic apparatus | |
BR102014002492A2 (pt) | Method and apparatus for multitasking | |
US20120176313A1 (en) | Display apparatus and voice control method thereof | |
US10739953B2 (en) | Apparatus and method for providing user interface | |
CN103597438B (zh) | 信息处理终端及方法和记录介质 | |
EP2575009A2 (en) | User interface method for a portable terminal | |
JP6012770B2 (ja) | Method and terminal for creating a new folder on a touch screen device | |
US20130191769A1 (en) | Apparatus and method for providing a clipboard function in a mobile terminal | |
US20150363086A1 (en) | Information processing terminal, screen control method, and screen control program | |
US20150286356A1 (en) | Method, apparatus, and terminal device for controlling display of application interface | |
EP2613228A1 (en) | Display apparatus and method of editing displayed letters in the display apparatus | |
WO2013182141A1 (zh) | Human-computer interaction method and apparatus, and electronic device thereof | |
JP5963291B2 (ja) | Method and apparatus for inputting symbols from a touch-sensitive screen | |
WO2022242542A1 (zh) | Application icon management method and electronic device | |
WO2014063643A1 (zh) | A touch page control method and system | |
US20160062601A1 (en) | Electronic device with touch screen and method for moving application functional interface | |
US20160041960A1 (en) | Method and device for controlling the same | |
JP7142961B2 (ja) | Multilingual keyboard system | |
CN110795015A (zh) | Operation prompt method, apparatus, device, and storage medium | |
US9069398B1 (en) | Electronic device having a touch panel display and a method for operating the same | |
US9536126B2 (en) | Function execution method based on a user input, and electronic device thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13848520 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14437201 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.09.2015) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13848520 Country of ref document: EP Kind code of ref document: A1 |