WO2013174057A1 - 触控屏操作方法及装置 - Google Patents

触控屏操作方法及装置 (Touch screen operation method and device)

Info

Publication number
WO2013174057A1
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
coordinate point
interface element
touch operation
movable state
Prior art date
Application number
PCT/CN2012/077777
Other languages
English (en)
French (fr)
Inventor
房稳
郭锋
王颖
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2013174057A1 publication Critical patent/WO2013174057A1/zh

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention relates to the field of communications, and in particular to a touch screen operating method and apparatus.
  • Touch screen operation has the advantage of being intuitive and convenient compared to other modes of operation.
  • As mobile phones move into the era of large-screen touch operation, users perform more and more operations through the touch screen, and phones carry fewer and fewer physical buttons.
  • On some tablet-type products, the few remaining physical buttons are all concentrated on the side.
  • the most common operations are: click, long press and swipe.
  • Most Android devices and other large-screen touch phones in use today can complete all touch screen user interface operations with these three gestures.
  • In general, the user opens an application, a new interface, or a menu by tapping; starts special operations or switches certain interface elements into a special state by long pressing; switches between operation pages by swiping; and drags an interface element to a new position by long pressing and then swiping without lifting the finger.
  • Among existing patents in this field, Microsoft filed an application in 2000, US6897853, which was granted in 2005.
  • That application protects the tap, long press, and swipe operations on a touch screen at the level of the underlying detection logic.
  • The scheme can be summarized as follows: user input is first received through the touch screen, and the distance and duration of the input are used to decide whether it is a tap or a swipe; if it is neither, the input is examined over a further fixed period: if it does not move within that period it is judged to be a long press, otherwise it is a drag.
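  • As an illustration only, the following Kotlin sketch shows the kind of threshold-based classification described above; the thresholds, type names, and sampling model are assumptions made for this sketch and are not taken from the cited patent.

```kotlin
// Threshold-based gesture classification sketch; all thresholds are assumed values.
enum class Gesture { TAP, SWIPE, LONG_PRESS, DRAG }

data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

fun classify(
    samples: List<TouchSample>,      // samples from finger-down to finger-up
    tapMaxDistance: Float = 10f,     // maximum movement for a tap (assumed)
    tapMaxDurationMs: Long = 200,    // maximum duration for a tap (assumed)
    longPressDelayMs: Long = 500     // stillness window for a long press (assumed)
): Gesture {
    require(samples.isNotEmpty())
    val first = samples.first()
    val last = samples.last()
    val distance = kotlin.math.hypot(last.x - first.x, last.y - first.y)
    val duration = last.timeMs - first.timeMs

    // Short inputs: nearly still means tap, otherwise swipe.
    if (duration <= tapMaxDurationMs) {
        return if (distance <= tapMaxDistance) Gesture.TAP else Gesture.SWIPE
    }
    // Longer inputs: no movement inside the delay window means long press, otherwise drag.
    val movedEarly = samples
        .takeWhile { it.timeMs - first.timeMs <= longPressDelayMs }
        .any { kotlin.math.hypot(it.x - first.x, it.y - first.y) > tapMaxDistance }
    return if (movedEarly) Gesture.DRAG else Gesture.LONG_PRESS
}
```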
  • In addition, Inventec also filed a patent in this area in 2005, CN200510027700; its independent claim describes, after receiving the coordinate point of a touch screen operation, first judging whether the input is a double tap, then whether it is a single tap, and otherwise treating it as a cursor-movement operation, which is likewise a driver-level patent.
  • Based on an analysis of the above filings, in the related art the "drag" operation used to move an interface element must be performed by long pressing and then moving without releasing, and many current phones adopt this mode of operation.
  • However, the above mode of operation has the following disadvantages:
  • When the user's finger long presses the touch screen, the hand blocks part of the screen, which may leave the user unable to see the drag destination and therefore unable to confirm it.
  • Dragging an interface element without being able to confirm the destination may leave it at an inaccurate final position, and the user may have to drag it again to reach the destination, which makes the operation inconvenient.
  • Therefore, in the related art, the destination cannot be confirmed accurately because part of the screen is blocked during operation; the drag operation becomes inaccurate, which reduces the accuracy and convenience of the operation and degrades the user experience.
  • In order to solve at least the problem that a drag operation is inaccurate because part of the screen is blocked during operation, the present invention provides a touch screen operation method including: detecting a first coordinate point corresponding to a first touch operation; setting a user interface element corresponding to the first coordinate point to a movable state; and detecting a second coordinate point corresponding to a second touch operation and moving the user interface element to the second coordinate point.
  • the first touch operation comprises: a double click operation.
  • the second touch operation comprises at least one of: performing a click operation on the second coordinate point, and dragging from the first coordinate point to the second coordinate point.
  • Detecting the second coordinate point corresponding to the second touch operation and moving the user interface element to the second coordinate point comprises: after detecting the second coordinate point corresponding to the second touch operation, detecting that there is a corresponding user interface element at the second coordinate point; and interchanging the positions of the user interface element and the corresponding user interface element at the second coordinate point.
  • The corresponding user interface element at the second coordinate point and the user interface element are on different user interface pages, and both are icons.
  • Detecting the second coordinate point corresponding to the second touch operation comprises: dragging the user interface element in a first preset direction to the edge of the user interface page on which it is located, where the first preset direction is consistent with the page switching direction of the user interface page; and, after the touch screen switches user interface pages, detecting the second coordinate point corresponding to the second touch operation in the switched user interface page.
  • After the user interface element corresponding to the first coordinate point is set to the movable state, the touch screen operation method further includes: when the user interface element is an icon on the user interface, moving the user interface element out of the touch screen in a second preset direction, where the second preset direction is different from the page switching direction of the user interface pages of the touch screen; and, after confirmation, deleting the user interface element from the page on which it is located.
  • After the user interface element corresponding to the first coordinate point is set to the movable state, the touch screen operation method further includes: detecting that the second coordinate point corresponding to the second touch operation coincides with the first coordinate point corresponding to the first touch operation, or that no second touch operation is detected within a preset time period; and setting the state of the user interface element corresponding to the first coordinate point from the movable state to a non-movable state.
  • After the user interface element corresponding to the first coordinate point is set to the movable state, the touch screen operation method further includes: displaying a preset identifier on the user interface element, where the preset identifier indicates that the user interface element is in the movable state.
  • According to another aspect of the present invention, a touch screen operating device is provided, including: a detecting module configured to detect a first coordinate point corresponding to a first touch operation; a setting module configured to set a user interface element corresponding to the first coordinate point to a movable state; and a processing module configured to detect a second coordinate point corresponding to a second touch operation and move the user interface element to the second coordinate point.
  • The processing module includes: a detecting unit configured to detect the second coordinate point corresponding to the second touch operation and to detect that there is a corresponding user interface element at the second coordinate point; and a processing unit configured to interchange the positions of the user interface element and the corresponding user interface element at the second coordinate point.
  • In the present invention, the user interface element to be moved is first selected by the first touch operation: the first coordinate point corresponding to the first touch operation is detected, and the user interface element corresponding to that point is set to a movable state, so that the element remains movable after the first touch operation ends. The element is then moved to the second coordinate point detected for the second touch operation, completing the movement of the user interface element.
  • In this process, the movement of a user interface element is accomplished by two discontinuous touch operations: the first touch operation sets the element to a movable state, and the second touch operation moves it.
  • This avoids the inconvenience and inaccuracy of the related-art "drag" operation, which must move the user interface element in one continuous operation, and thereby improves the accuracy and convenience of the operation and the user experience.
  • FIG. 1 is a flow chart of a touch screen operation method according to an embodiment of the present invention;
  • FIG. 2 is a structural block diagram of a touch screen operating device according to an embodiment of the present invention;
  • FIG. 3 is a structural block diagram of a processing module according to an embodiment of the present invention;
  • FIG. 4 is a flow chart of another touch screen operation method according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of the visual effect of a user interface element set to the movable state according to an embodiment of the present invention;
  • FIG. 6 is a schematic diagram of exchanging the positions of two icons in a nine-grid interface according to an embodiment of the present invention;
  • FIG. 7 is a schematic diagram of changing the position of the predetermined screen-unlock path on a lock screen interface according to an embodiment of the present invention;
  • FIG. 8 is a schematic diagram of deleting a user interface element according to an embodiment of the present invention;
  • FIG. 9 is a schematic diagram of moving an icon across pages according to an embodiment of the present invention;
  • FIG. 10 is a schematic diagram of another way of moving an icon across pages according to an embodiment of the present invention; and
  • FIG. 11 is a schematic diagram of a mobile phone user interface page according to an embodiment of the present invention.
  • This embodiment provides a touch screen operation method; as shown in FIG. 1, the method includes steps S102 to S106.
  • Step S102: Detect the first coordinate point corresponding to the first touch operation.
  • Step S104: Set the user interface element corresponding to the first coordinate point to a movable state.
  • Step S106: Detect the second coordinate point corresponding to the second touch operation, and move the user interface element to the second coordinate point.
  • Through the above steps, the user interface element to be moved is selected by the first touch operation: the first coordinate point corresponding to the first touch operation is detected, and the user interface element corresponding to that point is set to a movable state, so that the element remains movable after the first touch operation ends; the element is then moved to the second coordinate point detected for the second touch operation, which completes the movement of the user interface element.
  • In this process the movement is accomplished by two discontinuous touch operations: the first touch operation sets the user interface element to the movable state, and the second touch operation then moves it, which avoids the inconvenience and inaccuracy of the related-art continuous "drag" operation (see the sketch below).
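  • The following Kotlin sketch is a minimal, framework-free model of steps S102 to S106; the types and the hit-test helper (Coordinate, UiElement, elementAt) are illustrative assumptions, not the patent's implementation.

```kotlin
// Minimal model of the two-step move: the first touch arms an element, the second touch moves it.
data class Coordinate(val x: Float, val y: Float)

class UiElement(val name: String, var position: Coordinate, var movable: Boolean = false)

class TouchScreen(private val elements: List<UiElement>, private val hitRadius: Float = 40f) {
    private var selected: UiElement? = null

    private fun elementAt(p: Coordinate): UiElement? = elements.firstOrNull {
        val dx = it.position.x - p.x
        val dy = it.position.y - p.y
        dx * dx + dy * dy <= hitRadius * hitRadius
    }

    /** S102 + S104: detect the first coordinate point and set its element to the movable state. */
    fun onFirstTouch(first: Coordinate) {
        selected = elementAt(first)?.also { it.movable = true }
    }

    /** S106: detect the second coordinate point and move the selected element there. */
    fun onSecondTouch(second: Coordinate) {
        selected?.takeIf { it.movable }?.let {
            it.position = second
            it.movable = false   // movement finished; back to the normal state
        }
        selected = null
    }
}
```

  • Because the two calls are independent, the finger can leave the screen between them, which is exactly the discontinuity the method relies on.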
  • In order to move a user interface element with discontinuous touch operations, this preferred embodiment provides a preferred first touch operation: for example, the first touch operation includes, but is not limited to, a double-tap operation.
  • In this preferred embodiment, the user interface element to be moved can be set to the movable state by a double tap, and the first touch operation can end once the element is movable; that is, after the element is set to the movable state the user's finger can leave the touch screen, and the element is then moved by a further touch operation.
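  • On Android, for instance, a double tap of this kind could be picked up with the platform GestureDetector; the wiring below is only one possible sketch, and the callback name onDoubleTapAt is an assumption.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Reports the coordinates of a double tap on the view, e.g. to mark the element there as movable.
fun installDoubleTapToMove(view: View, context: Context, onDoubleTapAt: (x: Float, y: Float) -> Unit) {
    val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onDoubleTap(e: MotionEvent): Boolean {
            onDoubleTapAt(e.x, e.y)   // the first coordinate point of the first touch operation
            return true
        }
    })
    view.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
}
```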
  • To make moving user interface elements more flexible, this preferred embodiment provides a preferred second touch operation: for example, the second touch operation includes, but is not limited to, at least one of the following: a tap at the second coordinate point, or a drag from the first coordinate point to the second coordinate point, so that the element can be moved flexibly once it has been set to the movable state.
  • To suit different application scenarios, this preferred embodiment provides a preferred way of detecting the second coordinate point corresponding to the second touch operation and moving the user interface element to it: after the second coordinate point corresponding to the second touch operation is detected, it is detected that there is a corresponding user interface element at the second coordinate point, and the positions of the user interface element and that corresponding element are interchanged.
  • In this preferred embodiment, when a corresponding user interface element is detected at the second coordinate point, the element that has been set to the movable state and the corresponding element at the second coordinate point exchange positions.
  • Swapping two user interface elements in this way is convenient and greatly increases the variety of operations available to the user.
  • Preferably, a user interface element may be an icon or a page, so the user can swap two icons and/or pages with a simple operation; a sketch of this move-or-swap behaviour follows.
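  • The Kotlin sketch below models the behaviour with a simple page/slot map; Cell, HomeScreenModel, and moveOrSwap are names assumed for this illustration.

```kotlin
// If the target cell holds another icon the two swap; otherwise the movable icon just moves.
data class Cell(val page: Int, val index: Int)

class HomeScreenModel(private val icons: MutableMap<Cell, String>) {
    fun moveOrSwap(source: Cell, target: Cell) {
        val moving = icons.remove(source) ?: return       // nothing movable at the source
        val occupant = icons.remove(target)               // null when the target cell is empty
        icons[target] = moving                            // place the movable icon at the target
        if (occupant != null) icons[source] = occupant    // target was occupied: complete the swap
    }

    fun dump() = icons.toSortedMap(compareBy<Cell>({ it.page }, { it.index }))
        .forEach { (cell, name) -> println("page ${cell.page} slot ${cell.index}: $name") }
}

fun main() {
    val model = HomeScreenModel(mutableMapOf(Cell(0, 0) to "Camera", Cell(0, 1) to "FM"))
    model.moveOrSwap(Cell(0, 0), Cell(0, 1))   // occupied target: Camera and FM exchange positions
    model.moveOrSwap(Cell(0, 1), Cell(0, 4))   // empty target: Camera simply moves to slot 4
    model.dump()
}
```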
  • To enhance the practicality of this embodiment, the corresponding user interface element at the second coordinate point and the movable user interface element may be on different user interface pages, where both are icons.
  • In that case the icon can be moved between pages, so icons on different pages can exchange positions, which enhances the practicality of this embodiment.
  • To improve operational flexibility, in this preferred embodiment the user interface element is dragged, in a first preset direction, to the edge of the user interface page on which it is located, where the first preset direction is consistent with the page switching direction of the user interface page; after the touch screen switches user interface pages, the second coordinate point corresponding to the second touch operation is detected in the switched page.
  • In this preferred embodiment an icon can be moved between pages by dragging: for example, the icon to be moved is first set to the movable state by a double tap and is then dragged toward the edge of the page in the page switching direction; when it reaches the edge the touch screen switches pages, and the icon is then dragged to the desired position in the new page.
  • Alternatively, the desired location in the switched page can simply be tapped to move the icon there; offering several ways to move an icon between pages increases operational flexibility (a sketch of the edge-triggered page switch follows).
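  • A minimal sketch of the edge-triggered switch, assuming horizontal page switching and an illustrative edge margin; Pager and onMovableIconDragged are hypothetical names.

```kotlin
// While a movable icon is dragged, reaching a horizontal edge switches the page.
class Pager(val pageCount: Int, var currentPage: Int = 0) {
    fun switchLeft()  { if (currentPage > 0) currentPage-- }
    fun switchRight() { if (currentPage < pageCount - 1) currentPage++ }
}

/** Called repeatedly while a movable icon is dragged; x is the finger position on screen. */
fun onMovableIconDragged(x: Float, screenWidth: Float, pager: Pager, edgeMargin: Float = 24f) {
    when {
        x <= edgeMargin               -> pager.switchLeft()    // dragged to the left edge
        x >= screenWidth - edgeMargin -> pager.switchRight()   // dragged to the right edge
        // otherwise the icon keeps being dragged inside the current page
    }
}
```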
  • To meet the needs of different application scenarios, after the user interface element corresponding to the first coordinate point has been set to the movable state, and when that element is an icon on the user interface, the element can also be deleted.
  • When the element that has been set to the movable state is dragged or swiped out of the touch screen range, a prompt asking whether to delete the icon is displayed; once the deletion is confirmed, the element is removed from its page, which meets the needs of different scenarios (see the sketch below).
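  • The sketch below assumes that horizontal swipes are reserved for page switching, so only a vertical swipe out of the screen starts the delete flow; the callback names are placeholders for this illustration.

```kotlin
// Swiping a movable icon off-screen in a non-page-switching direction asks for confirmation.
enum class SwipeDirection { LEFT, RIGHT, UP, DOWN }

fun onMovableIconSwipedOut(
    direction: SwipeDirection,
    confirmDeletion: () -> Boolean,   // e.g. a dialog; returns true if the user confirms
    deleteFromPage: () -> Unit,
    restorePosition: () -> Unit
) {
    // Horizontal directions switch pages in this sketch, so they never delete.
    if (direction == SwipeDirection.LEFT || direction == SwipeDirection.RIGHT) return
    if (confirmDeletion()) deleteFromPage() else restorePosition()
}
```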
  • After a user interface element has been set to the movable state, it can also be restored to the normal, non-movable state in the following way: after the element corresponding to the first coordinate point is set to the movable state, it is detected either that the second coordinate point corresponding to the second touch operation coincides with the first coordinate point corresponding to the first touch operation, or that no second touch operation occurs within a preset time period; the state of the element corresponding to the first coordinate point is then changed from movable to non-movable.
  • For example, after an element is set to the movable state by a double tap, tapping it again restores it to the normal non-movable state.
  • Likewise, if no operation is performed on the element within the preset time period, the element returns to the normal non-movable state.
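  • Both restore conditions can be modelled in a few lines; the timeout and tolerance values below are assumptions for this sketch.

```kotlin
// Restore to the non-movable state when the second point coincides with the first,
// or when no second touch arrives before the timeout.
class MovableState(private val timeoutMs: Long = 5_000) {
    private var armedAt: Long = -1
    private var firstPoint: Pair<Float, Float>? = null

    fun arm(x: Float, y: Float, nowMs: Long) { firstPoint = x to y; armedAt = nowMs }

    /** Returns true if the element should drop back to the normal, non-movable state. */
    fun shouldRestore(secondX: Float?, secondY: Float?, nowMs: Long, tolerance: Float = 10f): Boolean {
        val first = firstPoint ?: return true
        if (secondX == null || secondY == null) return nowMs - armedAt > timeoutMs  // no second touch yet
        val dx = secondX - first.first
        val dy = secondY - first.second
        return dx * dx + dy * dy <= tolerance * tolerance   // second point coincides with the first
    }
}
```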
  • To strengthen the visual cue that a user interface element is movable, in this preferred embodiment the element may display a preset identifier after it is set to the movable state, where the preset identifier indicates that the element is in the movable state.
  • This embodiment provides a touch screen operating device; as shown in FIG. 2, the device includes: a first detecting module 202 configured to detect the first coordinate point corresponding to the first touch operation; a setting module 204, connected to the first detecting module 202 and configured to set the user interface element corresponding to the first coordinate point to a movable state; and a processing module 206, connected to the setting module 204 and configured to detect the second coordinate point corresponding to the second touch operation and move the user interface element to the second coordinate point.
  • In the above embodiment, the user interface element to be moved is selected by the first touch operation: the first detecting module 202 detects the first coordinate point corresponding to the first touch operation, and the setting module 204 sets the user interface element corresponding to that point to a movable state, so that the element remains movable after the first touch operation ends.
  • The processing module 206 then detects the second coordinate point corresponding to the second touch operation and moves the element to it, completing the movement of the user interface element.
  • The movement is thus accomplished by two discontinuous touch operations: the first touch operation sets the element to the movable state, and the second touch operation moves it, which avoids the inconvenient, inaccurate operation caused by the related-art "drag" requirement that the element be moved in one continuous operation, and so improves the accuracy and convenience of the operation and the user experience.
  • To suit different application scenarios, as shown in FIG. 3, the processing module 206 includes: a detecting unit 2062 configured to detect the second coordinate point corresponding to the second touch operation and to detect that there is a corresponding user interface element at the second coordinate point; and a processing unit 2064, connected to the detecting unit 2062 and configured to interchange the positions of the user interface element and the corresponding user interface element at the second coordinate point.
  • To improve operational flexibility, the processing module 206 includes: a dragging unit configured to drag the user interface element, in the first preset direction, to the edge of the user interface page on which it is located, where the first preset direction is consistent with the page switching direction of the user interface page; and a processing unit, connected to the dragging unit, configured to detect the second coordinate point corresponding to the second touch operation in the switched user interface page after the touch screen switches pages.
  • To meet the needs of different application scenarios, the touch screen operating device further includes: a moving module configured to move the user interface element out of the touch screen in the second preset direction when the element is an icon on the user interface, where the second preset direction is different from the page switching direction of the user interface pages of the touch screen; and a deleting module, connected to the moving module and configured to delete the user interface element from the page on which it is located after confirmation.
  • The touch screen operating device further includes: a second detecting module configured to detect that the second coordinate point corresponding to the second touch operation coincides with the first coordinate point corresponding to the first touch operation, or that no second touch operation is detected within the preset time period; and a restoring module, connected to the second detecting module and configured to set the state of the user interface element corresponding to the first coordinate point from the movable state to the non-movable state.
  • the touch screen operating device further includes: a display module configured to display a preset identifier, wherein the preset identifier is used to indicate that the user interface element is in a movable state.
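  • As a rough illustration of this module decomposition, the Kotlin interfaces below mirror the structure of FIG. 2; the interface names and the single runOnce entry point are assumptions of this sketch, not the patent's implementation.

```kotlin
// Module decomposition sketch: detect the first point, arm the element, then process the second touch.
data class TouchPoint(val x: Float, val y: Float)

interface FirstDetectingModule { fun detectFirstPoint(): TouchPoint? }
interface SettingModule        { fun setMovable(point: TouchPoint): Boolean }
interface ProcessingModule     { fun detectSecondPointAndMove(): Boolean }

class TouchScreenOperatingDevice(
    private val firstDetector: FirstDetectingModule,  // corresponds to detecting module 202
    private val setter: SettingModule,                // corresponds to setting module 204
    private val processor: ProcessingModule           // corresponds to processing module 206
) {
    /** Runs one move cycle: detect, arm, then hand over to the processing module. */
    fun runOnce(): Boolean {
        val first = firstDetector.detectFirstPoint() ?: return false
        if (!setter.setMovable(first)) return false
        return processor.detectSecondPointAndMove()
    }
}
```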
  • FIG. 4 shows another touch screen operation method according to an embodiment of the present invention, which includes steps S402 to S418 (a compact state-machine sketch of these steps follows).
  • Step S402: In the standby state, when a graphical interactive user interface element is double-tapped, that element enters the movable state.
  • Step S404: After the element has entered the movable state, determine whether the next user operation on the screen sets another user interface element to the movable state; if yes, go to step S406, otherwise go to step S408.
  • Step S406: If the user double-taps another graphical interactive user interface element at this point, the two elements directly exchange positions, so both elements are moved at the same time.
  • Step S408: After the element has entered the movable state, determine whether the user taps or double-taps a blank position on the page where the element is located; if yes, go to step S410, otherwise go to step S412.
  • Step S410: Move the element that was set to the movable state to the tapped or double-tapped blank position, completing the positional move of the element.
  • Step S412: After the element has entered the movable state, determine whether the user taps or double-taps the element that is already in the movable state; if yes, go to step S418, otherwise go to step S414.
  • Step S414: Determine whether there is an action dragging the element; if yes, go to step S416, otherwise go to step S418.
  • Step S416: Move the element to the position where the drag operation ends.
  • Step S418: Restore the element that was set to the movable state to the normal, non-movable state.
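  • The following Kotlin state machine is a compact illustration of steps S402 to S418; the event and action names (UserAction, MoveController) are invented for this sketch, and the layout is reduced to a name-to-slot map.

```kotlin
// One movable element at a time; the next user action decides swap, move, or restore.
sealed class UserAction {
    data class DoubleTapElement(val elementId: String) : UserAction()
    data class TapBlank(val position: Int) : UserAction()
    data class TapMovableElement(val elementId: String) : UserAction()
    data class Drag(val endPosition: Int) : UserAction()
    object None : UserAction()
}

class MoveController(private val layout: MutableMap<String, Int>) {
    var movable: String? = null
        private set

    /** S402: a double tap puts the element into the movable state. */
    fun onDoubleTap(elementId: String) { movable = elementId }

    fun onNextAction(action: UserAction) {
        val selected = movable ?: return
        when (action) {
            is UserAction.DoubleTapElement ->                                 // S406: swap the two elements
                layout[action.elementId]?.let { other ->
                    layout[action.elementId] = layout.getValue(selected)
                    layout[selected] = other
                }
            is UserAction.TapBlank -> layout[selected] = action.position      // S410: move to the blank spot
            is UserAction.Drag     -> layout[selected] = action.endPosition   // S416: follow the drag
            is UserAction.TapMovableElement, UserAction.None -> Unit          // fall through to S418
        }
        movable = null                                                        // S418: back to the normal state
    }
}
```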
  • FIG. 5 is a schematic diagram showing the effect of setting a user interface element to a movable state according to an embodiment of the present invention.
  • As shown in FIG. 5, after the "Weather Query" icon on the standby interface is double-tapped, its visual appearance changes relative to the other, unoperated icons.
  • The "Weather Query" icon is displayed enlarged.
  • This visual effect tells the user that the icon is currently in the movable state and can be moved further.
  • Preferably, the visual effect applied after an icon is set to the movable state can take different forms according to the user's personal preferences and habits; enlarging the icon is only one feasible approach.
  • For example, an icon set to the movable state may instead blink, to show that its current state differs from the normal one and that it can be moved.
  • FIG. 6 is a schematic diagram of exchanging the positions of two icons in a nine-grid interface according to an embodiment of the present invention. As shown in FIG. 6, in the nine-grid interface the camera icon is first double-tapped so that it becomes movable; the FM (radio) icon is then double-tapped, and the FM icon and the camera icon exchange positions. After the two icons have swapped successfully, both return to the normal, non-movable state. The positions of any two icons can therefore be exchanged conveniently and efficiently with this operation.
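  • A hypothetical trace of this FIG. 6 example, reusing the MoveController sketch shown earlier (it is not standalone; it assumes that earlier sketch is in scope):

```kotlin
fun main() {
    val layout = mutableMapOf("Camera" to 0, "FM" to 1)
    val controller = MoveController(layout)
    controller.onDoubleTap("Camera")                              // Camera becomes movable
    controller.onNextAction(UserAction.DoubleTapElement("FM"))    // the two icons swap and become normal again
    println(layout)   // {Camera=1, FM=0}
}
```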
  • FIG. 7 is a schematic diagram of changing the position of the predetermined screen-unlock path on a lock screen interface according to an embodiment of the present invention. As shown in FIG. 7, the unlock control initially sits at the bottom of the screen; after the unlock control is double-tapped it switches to the movable state, and dragging it then changes the position of the unlock path. In addition, after the unlock control has switched to the movable state, a blank area of its page can be tapped or double-tapped to move the control to that position, likewise changing the unlock path.
  • Graphical interactive user interface objects that are no longer needed can be deleted following the same idea. FIG. 8 is a schematic diagram of deleting a user interface element according to an embodiment of the present invention. As shown in FIG. 8, the "Local Search" icon is first double-tapped to switch it to the movable state, and is then swiped upward (a direction corresponding to the second preset direction) out of the screen. Upward is only an example; any preset direction may be used as long as it differs from the page switching direction of the touch screen. The screen then shows a prompt asking whether to delete the application: if the user confirms, the application is deleted; if not, the "Local Search" icon returns to its original position and is not deleted.
  • FIG. 9 is a schematic diagram of moving an icon across pages according to an embodiment of the present invention.
  • In this approach the user moves the icon across pages by dragging.
  • First, an icon is set to the movable state by a double tap, and the user then drags the icon to the left edge of the page (this drag direction corresponds to the first preset direction).
  • When the icon reaches the left edge, the page shown on the touch screen automatically switches to the left view; the icon stays on the right side of the left view and remains in the movable state.
  • The user can then move the icon to the desired position in the left view by tapping or double-tapping a blank area of the left view, or continue dragging it into place, thus moving the icon across pages.
  • FIG. 10 is a schematic diagram of another way of moving an icon across pages according to an embodiment of the present invention.
  • As shown in FIG. 10, the user interface interaction icon is first set to the movable state by a double tap, and a swipe on a blank area of the interface then switches the standby interface to another view.
  • After entering the other view, the user taps or double-taps directly on the position where the icon should be placed, and the icon is moved from the previous page straight to the tapped or double-tapped position on this page.
  • FIG. 11 is a schematic diagram of a mobile phone user interface page in accordance with an embodiment of the present invention. As shown in FIG. 11, when the user wants to exchange the middle view of the standby interface with the left view, the following operation can be used.
  • Double-tap a blank area of the middle view to switch the entire middle view to the movable state. Once the middle view is movable there is a corresponding visual cue; as in FIG. 5, all icons in the view are enlarged. The user then swipes to the left view interface and taps or double-taps the left view, which exchanges the left view with the middle view: after the exchange, the new middle view is the former left view and the new left view is the former middle view.
  • Obviously, those skilled in the art should understand that the modules or steps of the present invention described above may be implemented with a general-purpose computing device; they may be concentrated on a single computing device or distributed over a network formed by multiple computing devices.
  • Optionally, they may be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device, and in some cases the steps shown or described may be performed in an order different from the one described here; alternatively, they may be made into individual integrated circuit modules, or several of the modules or steps among them may be made into a single integrated circuit module.
  • In this way, the present invention is not limited to any specific combination of hardware and software.
  • The above are only preferred embodiments of the present invention and are not intended to limit it; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention provides a touch screen operation method and device. The method includes: detecting a first coordinate point corresponding to a first touch operation; setting a user interface element corresponding to the first coordinate point to a movable state; and detecting a second coordinate point corresponding to a second touch operation and moving the user interface element to the second coordinate point. The present invention solves the problem in the related art that a drag operation is inaccurate because part of the screen is blocked during operation, thereby improving the accuracy and convenience of the operation and the user experience.

Description

触控屏操作方法及装置 技术领域 本发明涉及通信领域, 具体而言, 涉及一种触控屏操作方法及装置。 背景技术 触摸屏操作相对于其它操作方式具有直观, 便捷的优点。 随着手机逐渐过度到大 屏触控操作时代, 用户通过触摸屏完成的操作越来越多, 手机的按键呈现出越来越少 的趋势。 在一些 pad类产品中, 实体的少量按键已经全部集中在侧面。 在用户的触控操作中, 最常用的几种操作为: 点击, 长按和划动。 目前使用较多 的安卓系统以及其它一些大屏触控手机都可以通过这三种操作完成所有触控屏用户界 面操作。 一般情况下, 用户通过单击打开某个应用或新的界面、 菜单; 通过长按启动一些 特殊操作或将某些界面元素切换到特殊状态; 通过划动切换操作页面; 通过长按后直 接划动实现对界面元素的位置拖动等等。 在该领域的已有相关专利中, 微软公司于 2000 年申请了一篇专利, 专利号为 US6897853 , 该专利 05年获得授权, 该专利申请文件从底层的判断方式方面对触摸屏 的点击, 长按和划动操作进行了保护。 其方案可以简述为: 首先使用触控屏接收用户 输入, 根据输入的距离以及时间来判断是否为单击或划动, 如果非单击和划动, 再根 据输入持续的特定时间内是否有移动判断是否为长按, 如果特定时间内所述输入没有 移动则为长按, 否则为拖动。 另外, 英华达公司也于 2005年在此布局一篇专利, 专利号为 CN200510027700。 该专利申请文件的独立权利要求的描述如下: 接收到触摸屏操作的坐标点以后, 首先 判断是否是双击操作; 如果非双击操作, 则判断是否为单击操作; 如果非单击操作, 则判断是否为鼠标光标移动的操作。 这也是一篇驱动底层相关的专利。 基于对上述专利申请文件的分析, 在相关技术中, 对界面元素进行移动的"拖动" 操作要求必须长按后在不放开的状态下进行移动, 而且目前的很多手机也采用了这种 操作方式。 然而, 上述操作方式具有如下缺点: 当用户手指长按触控屏时, 人手会遮 挡住部分屏幕, 从而可能造成用户无法看到拖动目的地的情况, 因此, 在无法确认拖 动操作的目的地的情况下进行界面元素的拖动可能会出现拖动的最终位置不准确的现 象, 可能需要用户再此进行拖动操作以将界面元素拖动到目的地, 从而给用户的操作 带来不便。 因此, 在相关技术中, 在操作时由于遮挡住部分屏幕而无法准确地确认目的地, 导致了拖动操作的不准确性, 从而降低了操作的准确性、 便捷性, 降低了用户体验。 发明内容 本发明提供了一种触控屏操作方法及装置, 以至少解决相关技术中由于操作时遮 挡住部分屏幕而导致的拖动操作不准确的问题。 根据本发明的一个方面, 提供了一种触控屏操作方法, 其包括: 检测到第一触控 操作对应的第一坐标点; 将与第一坐标点对应的用户界面元素设置为可移动状态; 检 测第二触控操作对应的第二坐标点, 将用户界面元素移动到第二坐标点上。 优选地, 第一触控操作包括: 双击操作。 优选地, 第二触控操作包括以下至少之一: 在第二坐标点上进行点击操作、 从第 一坐标点到第二坐标点的拖动。 优选地, 检测第二触控操作对应的第二坐标点, 将用户界面元素移动到第二坐标 点上包括: 检测到第二触控操作对应的第二坐标点后, 检测出第二坐标点上有对应的 用户界面元素; 将用户界面元素与第二坐标点上对应的用户界面元素的位置互换。 优选地, 第二坐标点上对应的用户界面元素与用户界面元素在不同的用户界面页 面上, 其中, 第二坐标点上对应的用户界面元素和用户界面元素为图标。 优选地, 检测第二触控操作对应的第二坐标点包括: 按照第一预设方向将用户界 面元素拖动到用户界面元素所在用户界面页面的边缘, 其中, 第一预设方向与用户界 面页面的页面切换方向一致; 在触控屏进行用户界面页面切换后, 在切换后的用户界 面页面中检测第二触控操作对应的第二坐标点。 优选地, 在将与第一坐标点对应的用户界面元素设置为可移动状态之后, 上述触 控屏操作方法还包括: 在用户界面元素为用户界面上的图标的情况下, 将用户界面元 素按照第二预设方向移出触控屏, 其中, 第二预设方向与触控屏的用户界面页面的页 面切换方向不同; 确认后将用户界面元素从用户界面元素所在的页面上删除。 优选地, 在将与第一坐标点对应的用户界面元素设置为可移动状态之后, 上述触 控屏操作方法还包括: 检测第二触控操作对应的第二坐标点与第一触控操作对应的第 一坐标点重合, 或在预设时间段内未检测到第二触控操作; 将与第一坐标点对应的用 户界面元素的状态由可移动状态设置为不可移动状态。 优选地, 在将与第一坐标点对应的用户界面元素设置为可移动状态之后, 上述触 控屏操作方法还包括: 用户界面元素显示预设标识, 其中, 预设标识用于指示用户界 面元素处于可移动状态。 根据本发明的另一方面, 提供了一种触控屏操作装置, 其包括: 检测模块, 设置 为检测第一触控操作对应的第一坐标点; 设置模块, 设置为将与第一坐标点对应的用 户界面元素设置为可移动状态; 处理模块, 设置为检测第二触控操作对应的第二坐标 点, 将用户界面元素移动到第二坐标点上。 优选地, 处理模块包括: 检测单元, 设置为检测到第二触控操作对应的第二坐标 点后, 检测出第二坐标点上有对应的用户界面元素; 处理单元, 设置为将用户界面元 素与第二坐标点上对应的用户界面元素的位置互换。 在本发明中, 首先, 通过第一触控操作来选中需要移动的用户界面元素, 检测到 第一触控操作对应的第一坐标点, 并将与第一坐标点对应的用户界面元素设置为可移 动状态, 实现第一触控操作结束后将第一坐标点对应的用户界面元素设置为可移动状 态, 然后, 通过检测第二触控操作对应的第二坐标点, 来将用户界面元素移动到第二 坐标点上, 实现用户界面元素的移动, 在上述过程中, 实现了通过两个不连续的触控 操作来完成用户界面元素的移动, 即通过第一触控操作将用户界面元素设置为可移动 状态,然后,再通过第二触控操作来移动用户界面元素,避免了在相关技术中的"拖动" 操作要求必须通过连续的操作来完成用户界面元素的移动而导致的操作不便捷、 不准 确的问题, 从而提高了操作的准确性、 便捷性, 改善了用户体验。 附图说明 此处所说明的附图用来提供对本发明的进一步理解, 构成本申请的一部分, 本发 明的示意性实施例及其说明用于解释本发明, 并不构成对本发明的不当限定。 在附图 中: 图 1是根据本发明实施例的触控屏操作方法的流程图; 图 2是根据本发明实施例的触控屏操作装置的结构框图; 图 3是根据本发明实施例的处理模块的结构框图; 图 4是根据本发明实施例的另一种触控屏操作方法的流程图; 图 5是根据本发明实施例的用户界面元素设置为可移动状态的视效示意图; 图 6是根据本发明实施例的在九宫格界面中互换两个图标位置的示意图; 图 7是根据本发明实施例的在锁屏界面上改变屏幕解锁预定路径位置的示意图; 图 8是根据本发明实施例的删除用户界面元素的示意图; 图 9是根据本发明实施例的跨页面移动图标的示意图; 图 10是根据本发明实施例的另一种跨页面移动图标的示意图; 以及 图 11是根据本发明实施例的移动手机用户界面页面的示意图。 具体实施方式 下文中将参考附图并结合实施例来详细说明本发明。 需要说明的是, 在不冲突的 情况下, 本申请中的实施例及实施例中的特征可以相互组合。 本实施例提供了一种触控屏操作方法, 如图 1所示, 该触控屏操作方法包括步骤 S102至步骤 S106。 步骤 S102: 检测到第一触控操作对应的第一坐标点。 步骤 S104: 将与第一坐标点对应的用户界面元素设置为可移动状态。 步骤 S106: 检测第二触控操作对应的第二坐标点, 将用户界面元素移动到第二坐 标点上。 通过上述步骤, 首先, 通过第一触控操作来选中需要移动的用户界面元素, 检测 到第一触控操作对应的第一坐标点, 并将与第一坐标点对应的用户界面元素设置为可 移动状态, 实现第一触控操作结束后将第一坐标点对应的用户界面元素设置为可移动 状态, 然后, 通过检测第二触控操作对应的第二坐标点, 来将用户界面元素移动到第 二坐标点上, 实现用户界面元素的移动, 在上述过程中, 实现了通过两个不连续的触 控操作来完成用户界面元素的移动, 即通过第一触控操作将用户界面元素设置为可移 动状态, 然后, 再通过第二触控操作来移动用户界面元素, 避免了在相关技术中的"拖 
动"操作要求必须通过连续的操作来完成用户界面元素的移动而导致的操作不便捷、不 准确的问题, 从而提高了操作的准确性、 便捷性, 改善了用户体验。 为了实现通过不连续的触控操作来移动用户界面元素, 在本优选实施例中, 提供 了一种优选的第一触控操作, 例如, 第一触控操作包括但不限于: 双击操作。 在上述优选实施例中, 可以通过双击操作来将需要移动的用户界面元素设置为可 移动状态, 实现了将用户界面元素设置为可移动状态后可以结束第一触控操作, 即将 用户界面元素设置为可移动状态后, 用户手指可以离开触控屏, 然后, 再通过再次的 触控操作移动已设置为可移动状态的用户界面元素。 为了提高移动用户界面元素的操作的灵活性, 在本优选实施例中, 提供了一种优 选的第二触控操作, 例如, 第二触控操作包括但不限于以下至少之一: 在第二坐标点 上进行点击操作、 从第一坐标点到第二坐标点的拖动, 以便在用户界面元素设置为可 移动状态后, 可以灵活地对用户界面元素进行移动。 为了满足不同的应用场景, 在本优选实施例中, 提供了一种优选的检测第二触控 操作对应的第二坐标点, 将用户界面元素移动到第二坐标点上的方法, 例如, 检测到 第二触控操作对应的第二坐标点后, 检测出第二坐标点上有对应的用户界面元素; 将 用户界面元素与第二坐标点上对应的用户界面元素的位置互换。 在上述优选实施例中, 当检测出第二坐标点上有对应的用户界面元素时, 将已设 置为可移动状态的用户界面元素与第二坐标点上对应的用户界面元素的位置互换, 便 捷地实现了将两个用户界面元素互换位置, 大大提高了用户的操作多样性。 优选地, 上述用户界面元素可以是图标也可以页面, 用户可以通过简单的操作将两个图标和 / 或页面互换位置。 为了增强本实施例的实用性, 在本优选实施例中, 第二坐标点上对应的用户界面 元素与用户界面元素在不同的用户界面页面上, 其中, 第二坐标点上对应的用户界面 元素和用户界面元素为图标。 在上述优选实施例中, 当第二坐标点上对应的用户界面元素和用户界面元素为图 标时, 可以实现将图标在页面之间进行移动, 完成不同页面间的图标互换位置, 从而 增强了本实施例的实用性。 为了提高操作的灵活性, 在本优选实施例中, 按照第一预设方向将用户界面元素 拖动到用户界面元素所在用户界面页面的边缘, 其中, 第一预设方向与用户界面页面 的页面切换方向一致; 在触控屏进行用户界面页面切换后, 在切换后的用户界面页面 中检测第二触控操作对应的第二坐标点。 在上述优选实施例中, 可以通过拖动的方式将图标在页面间进行移动, 例如, 首 先通过双击的操作将需要移动的图标设置为可移动状态, 然后, 再对图标进行拖动, 按照页面切换的方向将图标拖动到页面的边缘, 此时触控屏进行页面切换, 然后在切 换后的页面中将图标拖动到希望的位置上, 也可以在切换后的页面中希望放该图标的 位置上进行点击操作, 以将图标移动到点击的位置上, 为实现图标在页面间进行移动 提供了多种方式, 从而提高了操作的灵活性。 为了满足不同场景的应用需求, 在本优选实施例中, 例如, 在将与第一坐标点对 应的用户界面元素设置为可移动状态之后, 在用户界面元素为用户界面上的图标的情 况下, 将用户界面元素按照第二预设方向移出触控屏, 其中, 第二预设方向与触控屏 的用户界面页面的页面切换方向不同; 确认后将用户界面元素从用户界面元素所在的 页面上删除。 在上述优选的实施例中, 当用户界面元素已设置为可移动状态后, 将用户界面元 素拖动或划出触控屏范围时, 此时显示出是否要删除此图标的提示, 当确认后, 则将 用户界面元素从页面上删除, 以满足不同场景的应用需求。 在将用户界面元素设置为可移动状态后, 还可以通过以下方式将用户界面元素恢 复为不可移动的正常状态, 例如, 在将与第一坐标点对应的用户界面元素设置为可移 动状态之后, 检测第二触控操作对应的第二坐标点与第一触控操作对应的第一坐标点 重合, 或在预设时间段内未检测到第二触控操作; 将与第一坐标点对应的用户界面元 素的状态由可移动状态设置为不可移动状态, 例如, 当通过双击操作将用户界面元素 设置为可移动状态后, 可以再次点击上述用户界面元素来将用户界面元素恢复为不可 移动的正常状态, 或者, 当将用户界面元素设置为可移动状态后, 在预设时间段内未 检测到对用户界面元素进行任何操作时,用户界面元素会恢复为不可移动的正常状态。 为了增强视觉效果, 以提示用户界面元素处于可移动状态, 在本优选实施例中, 在将与第一坐标点对应的用户界面元素设置为可移动状态之后, 用户界面元素可以显 示预设标识, 其中, 预设标识用于指示用户界面元素处于可移动状态。 本实施例提供了一种触控屏操作装置, 如图 2所示, 该触控屏操作装置包括: 第 一检测模块 202, 设置为检测第一触控操作对应的第一坐标点; 设置模块 204, 连接至 第一检测模块 202, 设置为将与第一坐标点对应的用户界面元素设置为可移动状态; 处理模块 206, 连接至设置模块 204, 设置为检测第二触控操作对应的第二坐标点, 将 用户界面元素移动到第二坐标点上。 在上述实施例中, 通过第一触控操作来选中需要移动的用户界面元素, 第一检测 模块 202检测到第一触控操作对应的第一坐标点, 设置模块 204将与第一坐标点对应 的用户界面元素设置为可移动状态, 实现第一触控操作结束后将第一坐标点对应的用 户界面元素设置为可移动状态, 然后, 处理模块 206检测第二触控操作对应的第二坐 标点, 并将用户界面元素移动到第二坐标点上, 实现用户界面元素的移动, 在上述过 程中, 实现了通过两个不连续的触控操作来完成用户界面元素的移动, 即通过第一触 控操作将用户界面元素设置为可移动状态, 然后, 再通过第二触控操作来移动用户界 面元素, 避免了在相关技术中的"拖动"操作要求必须通过连续的操作来完成用户界面 元素的移动而导致的操作不便捷、不准确的问题, 从而提高了操作的准确性、便捷性, 改善了用户体验。 为了满足不同的应用场景, 在本优选实施例中, 如图 3所示, 上述处理模块 206 包括: 检测单元 2062, 设置为检测到第二触控操作对应的第二坐标点后, 检测出第二 坐标点上有对应的用户界面元素; 处理单元 2064, 连接至检测单元 2062, 设置为将用 户界面元素与第二坐标点上对应的用户界面元素的位置互换。 为了提高操作的灵活性, 在本优选实施例中, 上述处理模块 206包括: 拖动单元, 设置为按照第一预设方向将用户界面元素拖动到用户界面元素所在用户界面页面的边 缘, 其中, 第一预设方向与用户界面页面的页面切换方向一致; 处理单元, 连接至拖 动单元, 设置为在触控屏进行用户界面页面切换后, 在切换后的用户界面页面中检测 第二触控操作对应的第二坐标点。 为了满足不同场景的应用需求,在本优选实施例中, 上述触控屏操作装置还包括: 移动模块, 设置为在用户界面元素为用户界面上的图标的情况下, 将用户界面元素按 照第二预设方向移出触控屏, 其中, 第二预设方向与触控屏的用户界面页面的页面切 换方向不同; 删除模块, 连接至移动模块, 设置为确认后将用户界面元素从用户界面 元素所在的页面上删除。 上述触控屏操作装置还包括: 第二检测模块, 设置为检测第二触控操作对应的第 二坐标点与第一触控操作对应的第一坐标点重合, 或在预设时间段内未检测到第二触 控操作; 恢复模块, 连接至第二检测模块, 设置为将与第一坐标点对应的用户界面元 素的状态由可移动状态设置为不可移动状态。 上述触控屏操作装置还包括: 显示模块, 设置为显示预设标识, 其中, 预设标识 用于指示用户界面元素处于可移动状态。 以下结合附图和实例对上述各个优选实施例进行详细地描述。 图 4是根据本发明实施例的另一种触控屏操作方法的流程图, 如图 4所示, 该触 控屏操作方法包括步骤 S402至步骤 S418。 步骤 S402: 在待机状态下, 当某个图形交互式用户界面元素被双击后, 该图形交 互式用户界面元素进入可移动状态。 步骤 S404: 在上述图形交互式用户界面元素进入可移动状态后, 判断在屏幕上的 后续的用户操作, 是否将其他用户界面元素设置为可移动状态, 若是, 则转至步骤 S406, 若否, 则转至步骤 S408。 步骤 S406: 如果用户在此时双击了其它的图形交互式用户界面元素, 则直接将两 
个图形交互式用户界面元素互相更换位置, 同时实现两个图形交互式用户界面元素的 位置移动。 步骤 S408: 在上述图形交互式用户界面元素进入可移动状态后, 判断是否用户此 时单击或双击了上述图形交互式用户界面元素所在页面中的一个空白位置, 若是, 则 转至步骤 S410, 若否, 则转至步骤 S412。 步骤 S410:将上述被设置为可移动状态的用户界面元素移动到被单击或双击的空 白位置, 以实现图形交互式用户界面元素的位置移动。 步骤 S412: 在上述图形交互式用户界面元素进入可移动状态后, 判断是否用户此 时再次单击或双击已经处于可移动状态的用户界面元素, 若是, 则转至步骤 S418, 若 否, 则转至步骤 S414。 步骤 S414: 判断是否有拖动该用户界面元素的动作, 若是, 则转至步骤 S416, 若否, 则转至步骤 S418。 步骤 S416: 将该用户界面元素移动至拖动操作的重点位置上。 步骤 S418:将上述被设置为可移动状态的用户界面元素恢复为不可移动的正常状 态。 图 5是根据本发明实施例的用户界面元素设置为可移动状态的视效示意图, 如图 5所示,在待机界面对 "天气查询"图标进行双击操作后,该"天气查询"图标的视觉效果 相对与其它未操作的图标发生了变化, 该"天气查询"图标被放大显示, 此视效效果可 以提示用户该图标目前处于可以移动状态, 可以进一步地对其进行移动操作。 优选地, 图标被设置为可移动状态后的视效可以根据用户的个人爱好和习惯有多 种不同的表现方式, 上述对图标进行放大处理只是其中的一个可行方式, 例如, 还可 以是被设置为可移动状态的图标以闪烁的方式显示, 以示出该图标目前的状态与正常 情况下是有差别的, 是可以对其进行移动操作的。 图 6是根据本发明实施例的在九宫格界面中互换两个图标位置的示意图, 如图 6 所示, 在九宫格界面, 首先双击 camera (照相机) 图标, 使 camera图标成为可移动状 态; 之后双击 FM (收音机) 图标, 实现 FM图标与 camera图标的位置互换, 两个图 标互换位置成功后, 两个图标都变为不可移动的正常状态。 因此, 通过上述操作可以 便捷地、 有效地实现任意两个图标的位置交换。 图 7是根据本发明实施例的在锁屏界面上改变屏幕解锁预定路径位置的示意图, 如图 7所示, 解锁控件的初始位置在屏幕下方, 双击解锁控件后, 解锁控件切换到可 移动状态, 此时再拖动解锁控件, 即可实现解锁路径的位置改动。 此外, 在解锁控件 切换到可移动状态后, 还可以在解锁控件所在页面的空白处进行单击或双击, 将解锁 控件移动至单击或双击的位置上, 以实现解锁路径的位置改动。 当需要删除某些不再需要的图形交互式用户界面对象时, 也可以采取本发明的思 路删除该对象, 图 8是根据本发明实施例的删除用户界面元素的示意图, 如图 8所示, 首先, 双击"本地搜索"图标, 将其切换到可移动状态; 在"本地搜索"图标切换到可移 动状态下, 将"本地搜索"图标向上(该方向相当于第二预设方向)划动出屏幕的范围, 当然, 这里的向上只是一种示例, 还可以是按照其他预设的方向将"本地搜索"图标划 动出屏幕的范围, 只是这里的预设的方向与触控屏的页面的切换方向不一致, 此时, 屏幕弹出提示, 询问用户是否确认删除该应用, 如果用户点击确认, 则该应用被删除, 如果用户未确认删除, 则上述用户界面元素 ("本地搜索"图标) 回到原来的位置, 不 被删除。 图 9是根据本发明实施例的跨页面移动图标的示意图, 如图 9所示, 该方式是用 户采取拖动方式跨页移动图标。 首先, 通过双击方式将某个图标设置为可移动状态, 之后用户拖动该图标至页面左边缘(该拖动方向相当于第一预设方向)。当上述图标到 达页面左边缘后, 触控屏显示的页面自动切换到左视图, 上述图标停留在左视图的右 侧, 仍然处于可移动状态, 用户可以进一步地通过双击或单击左视图的空白处, 将上 述图标移动到左视图的希望的位置。或继续以拖动的方式将其放置到自己希望的位置, 实现跨页移动图标。 图 10是根据本发明实施例的另一种跨页面移动图标的示意图, 如图 10所示, 首 先通过双击方式将用户界面交互图标设置为可移动状态, 之后通过在空白界面的划屏 操作令待机界面进入另外一个视图, 进入另一个视图后, 在视图中希望放置图标的位 置上直接单击或双击, 实现将图标从上一个页面直接被移动到该页面所单击或双击的 位置。 图 11是根据本发明实施例的移动手机用户界面页面的示意图。 如图 11所示, 当 用户希望将待机界面中视图与左视图更换时, 可以采取如图示例的操作: 在中视图中的一个空白处进行双击, 将整个中视图切换到可移动状态。 中视图切 换到可移动状态后, 有对应的视效提示, 如图 5所示, 将整个视图中的所有图标均放 大处理, 通过划屏操作移动到左视图的界面, 双击或单击左视图, 实现将左视图与中 视图的更换, 更换后新的中视图为之前的左视图, 新的左视图为更换前的中视图。 显然, 本领域的技术人员应该明白, 上述的本发明的各模块或各步骤可以用通用 的计算装置来实现, 它们可以集中在单个的计算装置上, 或者分布在多个计算装置所 组成的网络上, 可选地, 它们可以用计算装置可执行的程序代码来实现, 从而, 可以 将它们存储在存储装置中由计算装置来执行, 并且在某些情况下, 可以以不同于此处 的顺序执行所示出或描述的步骤, 或者将它们分别制作成各个集成电路模块, 或者将 它们中的多个模块或步骤制作成单个集成电路模块来实现。 这样, 本发明不限制于任 何特定的硬件和软件结合。 以上所述仅为本发明的优选实施例而已, 并不用于限制本发明, 对于本领域的技 术人员来说, 本发明可以有各种更改和变化。 凡在本发明的精神和原则之内, 所作的 任何修改、 等同替换、 改进等, 均应包含在本发明的保护范围之内。

Claims

Claims
1. A touch screen operation method, comprising: detecting a first coordinate point corresponding to a first touch operation; setting a user interface element corresponding to the first coordinate point to a movable state; and detecting a second coordinate point corresponding to a second touch operation, and moving the user interface element to the second coordinate point.
2. The method according to claim 1, wherein the first touch operation comprises a double-tap operation.
3. The method according to claim 1 or 2, wherein the second touch operation comprises at least one of: a tap operation at the second coordinate point, or a drag from the first coordinate point to the second coordinate point.
4. The method according to claim 1 or 2, wherein detecting the second coordinate point corresponding to the second touch operation and moving the user interface element to the second coordinate point comprises: after detecting the second coordinate point corresponding to the second touch operation, detecting that there is a corresponding user interface element at the second coordinate point; and interchanging the positions of the user interface element and the corresponding user interface element at the second coordinate point.
5. The method according to claim 4, wherein the corresponding user interface element at the second coordinate point and the user interface element are on different user interface pages, and the corresponding user interface element at the second coordinate point and the user interface element are icons.
6. The method according to claim 5, wherein detecting the second coordinate point corresponding to the second touch operation comprises: dragging the user interface element in a first preset direction to the edge of the user interface page on which the user interface element is located, wherein the first preset direction is consistent with the page switching direction of the user interface page; and, after the touch screen switches user interface pages, detecting the second coordinate point corresponding to the second touch operation in the switched user interface page.
7. The method according to claim 1, further comprising, after setting the user interface element corresponding to the first coordinate point to the movable state: in the case that the user interface element is an icon on the user interface, moving the user interface element out of the touch screen in a second preset direction, wherein the second preset direction is different from the page switching direction of the user interface pages of the touch screen; and, after confirmation, deleting the user interface element from the page on which the user interface element is located.
8. The method according to claim 1, further comprising, after setting the user interface element corresponding to the first coordinate point to the movable state: detecting that the second coordinate point corresponding to the second touch operation coincides with the first coordinate point corresponding to the first touch operation, or that the second touch operation is not detected within a preset time period; and setting the state of the user interface element corresponding to the first coordinate point from the movable state to a non-movable state.
9. The method according to claim 1, further comprising, after setting the user interface element corresponding to the first coordinate point to the movable state: displaying, by the user interface element, a preset identifier, wherein the preset identifier is used to indicate that the user interface element is in the movable state.
10. A touch screen operation device, comprising: a detecting module configured to detect a first coordinate point corresponding to a first touch operation; a setting module configured to set a user interface element corresponding to the first coordinate point to a movable state; and a processing module configured to detect a second coordinate point corresponding to a second touch operation and move the user interface element to the second coordinate point.
11. The device according to claim 10, wherein the processing module comprises: a detecting unit configured to, after the second coordinate point corresponding to the second touch operation is detected, detect that there is a corresponding user interface element at the second coordinate point; and a processing unit configured to interchange the positions of the user interface element and the corresponding user interface element at the second coordinate point.
PCT/CN2012/077777 2012-05-24 2012-06-28 触控屏操作方法及装置 WO2013174057A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2012101635455A CN102722324A (zh) 2012-05-24 2012-05-24 触控屏操作方法及装置
CN201210163545.5 2012-05-24

Publications (1)

Publication Number Publication Date
WO2013174057A1 true WO2013174057A1 (zh) 2013-11-28

Family

ID=46948111

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/077777 WO2013174057A1 (zh) 2012-05-24 2012-06-28 触控屏操作方法及装置

Country Status (2)

Country Link
CN (1) CN102722324A (zh)
WO (1) WO2013174057A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105786713A (zh) * 2016-03-28 2016-07-20 努比亚技术有限公司 移动终端的分屏排查方法、装置及移动终端

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102981711A (zh) * 2012-11-22 2013-03-20 中兴通讯股份有限公司 一种在触摸屏上移动应用图标的方法和系统
CN103019547B (zh) * 2012-12-24 2015-09-09 广东欧珀移动通信有限公司 一种调整移动终端应用程序位置的方法及系统
CN103076948A (zh) * 2013-01-15 2013-05-01 广东欧珀移动通信有限公司 一种非自动对主菜单图标进行排序的方法和装置
JP2014182652A (ja) 2013-03-19 2014-09-29 Canon Inc 情報処理装置およびその制御方法、ならびにプログラム
CN104750406B (zh) * 2013-12-31 2019-12-24 深圳迈瑞生物医疗电子股份有限公司 监护设备及其显示界面布局调整方法、装置
CN104978135B (zh) * 2014-04-09 2019-10-18 腾讯科技(深圳)有限公司 一种图标显示方法、装置及移动终端
CN105022687B (zh) * 2014-04-22 2018-10-26 腾讯科技(深圳)有限公司 自动化测试方案中滑动操作的实现方法及装置
CN104436657B (zh) * 2014-12-22 2018-11-13 青岛烈焰畅游网络技术有限公司 游戏控制方法、装置以及电子设备
CN104881225A (zh) * 2015-05-18 2015-09-02 百度在线网络技术(北京)有限公司 一种调节条的控制方法和装置
CN104951228B (zh) * 2015-05-22 2018-05-08 小米科技有限责任公司 图标的放置方法、装置及终端设备
CN106367913A (zh) * 2015-07-23 2017-02-01 博西华电器(江苏)有限公司 衣物处理机及其操作界面
CN106610830B (zh) * 2015-10-26 2020-04-03 北京国双科技有限公司 页面元素的拖放方法及装置
CN105468263A (zh) * 2015-11-19 2016-04-06 中科创达软件股份有限公司 一种信息处理方法、装置及电子设备
CN105511757B (zh) * 2015-12-10 2019-02-12 Oppo广东移动通信有限公司 一种播放列表控制方法及移动终端
CN109432766B (zh) * 2015-12-24 2021-06-25 网易(杭州)网络有限公司 一种游戏控制方法及装置
CN107368230A (zh) * 2016-05-13 2017-11-21 中兴通讯股份有限公司 一种界面元素移动的方法和装置
CN107870705B (zh) * 2016-09-28 2021-12-28 珠海金山办公软件有限公司 一种应用菜单的图标位置的改变方法及装置
CN109901766B (zh) * 2017-12-07 2023-03-24 珠海金山办公软件有限公司 文档视口的移动方法、装置及电子设备
CN114564134A (zh) * 2022-02-14 2022-05-31 维沃移动通信有限公司 应用图标显示方法、装置
CN117130515A (zh) * 2023-02-10 2023-11-28 荣耀终端有限公司 创建快捷入口的方法和装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6590568B1 (en) * 2000-11-20 2003-07-08 Nokia Corporation Touch screen drag and drop input technique
CN101836182A (zh) * 2007-09-04 2010-09-15 苹果公司 编辑界面
CN102306080A (zh) * 2011-08-25 2012-01-04 鸿富锦精密工业(深圳)有限公司 触摸型电子装置及其图标移动的方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101770326B (zh) * 2008-12-31 2012-07-25 北京联想软件有限公司 触摸屏上移动对象的实现方法及计算设备
KR101674205B1 (ko) * 2009-10-27 2016-11-08 엘지전자 주식회사 이동 통신 단말기에서의 아이콘 표시 제어 방법 및 이를 적용한 이동 통신 단말기
US20110216095A1 (en) * 2010-03-04 2011-09-08 Tobias Rydenhag Methods, Devices, and Computer Program Products Providing Multi-Touch Drag and Drop Operations for Touch-Sensitive User Interfaces
KR101708821B1 (ko) * 2010-09-30 2017-02-21 엘지전자 주식회사 이동 단말기 및 그 제어 방법
CN202110524U (zh) * 2011-06-14 2012-01-11 上海博泰悦臻电子设备制造有限公司 终端设备及其图标位置互换装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6590568B1 (en) * 2000-11-20 2003-07-08 Nokia Corporation Touch screen drag and drop input technique
CN101836182A (zh) * 2007-09-04 2010-09-15 苹果公司 编辑界面
CN102306080A (zh) * 2011-08-25 2012-01-04 鸿富锦精密工业(深圳)有限公司 触摸型电子装置及其图标移动的方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105786713A (zh) * 2016-03-28 2016-07-20 努比亚技术有限公司 移动终端的分屏排查方法、装置及移动终端

Also Published As

Publication number Publication date
CN102722324A (zh) 2012-10-10

Similar Documents

Publication Publication Date Title
WO2013174057A1 (zh) 触控屏操作方法及装置
EP3706400B1 (en) Icon management method and device
EP2372516B1 (en) Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display
CN108509115B (zh) 页操作方法及其电子装置
JP6328947B2 (ja) マルチタスキング運用のための画面表示方法及びこれをサポートする端末機
EP3591509B1 (en) Split-screen display method and apparatus, and electronic device thereof
EP3133483B1 (en) Touchscreen apparatus and user interface processing method for the touchscreen apparatus
US20150143285A1 (en) Method for Controlling Position of Floating Window and Terminal
EP3002664B1 (en) Text processing method and touchscreen device
KR101930225B1 (ko) 터치스크린 동작모드의 제어방법 및 제어장치
US20080001928A1 (en) Driving method and input method, for touch panel
CN101996031A (zh) 具有触摸输入功能的电子装置及其触摸输入方法
WO2011026395A1 (zh) 多触点字符输入方法及系统
EP2613247B1 (en) Method and apparatus for displaying a keypad on a terminal having a touch screen
EP2677405A1 (en) Electronic apparatus, control setting method, and program
CN103076942A (zh) 便携式终端中用于改变图标的设备和方法
KR20150033508A (ko) 아이콘의 이동 방법 및 이 방법이 적용되는 터치 타입 휴대 단말기
WO2012160829A1 (ja) タッチスクリーン装置、タッチ操作入力方法及びプログラム
CN105700763A (zh) 终端界面窗口的移动方法及装置
CN103324389A (zh) 智能终端应用程序的操作方法
CN103019585A (zh) 触摸屏的单点控制方法、装置及移动终端
WO2013182141A1 (zh) 一种人机交互方法、装置及其电子设备
JP6087608B2 (ja) 携帯可能な装置、携帯可能な装置を制御する方法およびプログラム
CN103092389A (zh) 一种实现虚拟鼠标操作的触摸屏装置和方法
JP5882973B2 (ja) 情報処理装置、方法及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12877438

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12877438

Country of ref document: EP

Kind code of ref document: A1