WO2016011568A1 - Touch control method and device for a multi-touch terminal - Google Patents

Touch control method and device for a multi-touch terminal

Info

Publication number
WO2016011568A1
WO2016011568A1 (PCT/CN2014/000767)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
button
touch button
user
touch screen
Prior art date
Application number
PCT/CN2014/000767
Other languages
English (en)
French (fr)
Inventor
毛信良
周田伟
陈二喜
Original Assignee
上海逗屋网络科技有限公司
毛信良
周田伟
陈二喜
Priority date
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (source: Darts-ip global patent litigation dataset)
Application filed by 上海逗屋网络科技有限公司, 毛信良, 周田伟, 陈二喜
Priority to KR1020147028782A (KR101739782B1)
Priority to EP14193542.9A (EP2977881A1)
Publication of WO2016011568A1


Classifications

    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 GUI techniques based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 GUI techniques for the control of specific functions or operations, e.g. selecting or manipulating an object
    • G06F3/0488 GUI techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Touch-screen or digitiser input for inputting data by handwriting, e.g. gesture or text
    • G06F3/0489 GUI techniques using dedicated keyboard keys or combinations thereof
    • G06F3/04892 Arrangements for controlling cursor position based on codes indicative of cursor displacements
    • G06F3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • A63F13/2145 Input arrangements for video game devices for locating contacts on a surface that is also a display device, e.g. touch screens
    • A63F13/426 Mapping input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming
    • G09G5/22 Display of characters or indicia using display control signals derived from coded signals
    • G09G5/30 Control of display attribute
    • G06F2203/04104 Multi-touch detection in digitiser, i.e. simultaneous detection of a plurality of touching locations
    • G06F2203/04808 Several contacts: gestures triggering a specific function when the user establishes several contacts with the surface simultaneously

Definitions

  • The present invention relates to the field of computers, and in particular to a technique for touch control of a multi-touch terminal. Background technique
  • With existing devices, operation of the touch screen is usually based on single-touch operation; even when several touch points are involved, they basically constitute one operation, such as the slide-to-unlock operation on a mobile phone. This prevents complex touch operations from being performed, resulting in poor interactivity and a degraded user experience.
  • According to one aspect of the present invention, a touch control method for a multi-touch terminal includes: acquiring a first operation performed by the user on a first touch button and a second operation performed on a second touch button on the touch screen of the multi-touch terminal; and causing the operation execution object corresponding to the first operation to perform the object operation corresponding to the second operation.
  • According to another aspect of the present invention, a touch control device for a multi-touch terminal comprises: a first device configured to acquire a first operation of the first touch button and a second operation of the second touch button by the user on the touch screen of the multi-touch terminal; and a second device configured to cause the operation execution object corresponding to the first operation to perform the object operation corresponding to the second operation.
  • Compared with the prior art, the present invention acquires a first operation of a first touch button and a second operation of a second touch button on the touch screen of a multi-touch terminal, and then causes the operation execution object corresponding to the first operation to perform the object operation corresponding to the second operation. By implementing two manipulation buttons on the multi-touch screen, the user is supported in performing more complex human-computer interaction, for example selecting the orientation and movement of a character through button 1. This improves the convenience and diversity of operation of the multi-touch screen, the efficiency of human-computer interaction, and the user experience.
  • Moreover, the present invention can detect whether the user touches a target control area of the touch screen and, if so, display the first touch button or the second touch button for the user to operate, thereby improving the accuracy of interaction control as well as the convenience and diversity of operation of the multi-touch screen, the efficiency of human-computer interaction, and the user experience.
  • Moreover, the present invention may hide the first touch button or the second touch button when the user stops touching the target control area of the touch screen; or it may adapt the positions of the first touch button and the second touch button on the touch screen to various preset conditions, thereby making the interface more friendly, improving the efficiency of human-computer interaction, and improving the user experience.
  • Moreover, the first operation may be used to control movement of the operation execution object; further, the second operation may include adjusting the operation action area of the object operation and, together with the first operation, performing various combined operations. This supports complex human-computer interaction and improves the convenience and diversity of operation of the multi-touch screen, the efficiency of human-computer interaction, and the user experience.
  • Moreover, the present invention can perform a subsequent operation corresponding to the object operation after the object operation completes, making the whole interaction more complete and further improving the convenience and diversity of operation of the multi-touch screen, the efficiency of human-computer interaction, and the user experience.
  • FIG. 1 shows a schematic diagram of a touch control device for a multi-touch terminal according to one aspect of the present invention;
  • FIG. 2 shows a schematic diagram of a touch control device for a multi-touch terminal in accordance with a preferred embodiment of the present invention;
  • FIG. 3 shows a flow chart of touch control for a multi-touch terminal in accordance with another aspect of the present invention;
  • FIG. 4 shows a flow chart of touch control for a multi-touch terminal in accordance with a preferred embodiment of the present invention;
  • FIGS. 5 through 7 each show a schematic diagram of a touch screen of a multi-touch terminal in accordance with a preferred embodiment of the present invention.
  • Referring to FIG. 1, the touch control device includes a first device 1 and a second device 2. The first device 1 acquires a first operation of the first touch button and a second operation of the second touch button by the user on the touch screen of the multi-touch terminal; the second device 2 causes the operation execution object corresponding to the first operation to perform the object operation corresponding to the second operation.
  • Here, the touch control device includes, but is not limited to, user equipment, a network device, or a device formed by integrating user equipment and a network device through a network. The user equipment includes, but is not limited to, any mobile electronic product that can interact with a user through a touchpad, such as a smartphone or a PDA; the mobile electronic product can adopt any operating system, such as the Android or iOS operating system. The network device includes an electronic device capable of automatically performing numerical calculation and information processing according to pre-set or pre-stored instructions, and its hardware includes, but is not limited to, microprocessors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), digital signal processors (DSPs), embedded devices, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud composed of multiple servers; here, the cloud is composed of a large number of computers or network servers based on cloud computing, where cloud computing is a type of distributed computing: a virtual supercomputer consisting of a group of loosely coupled computers. The network includes, but is not limited to, the Internet, wide area networks, metropolitan area networks, local area networks, VPN networks, wireless ad hoc networks, and the like.
  • Here, "continuously" means that the above devices acquire the first operation and the second operation in real time, or according to set or real-time-adjusted working mode requirements.
  • Specifically, the first device 1 acquires a first operation of the first touch button and a second operation of the second touch button by the user on the touch screen of the multi-touch terminal. For example, the first device 1 acquires the user's operations on the touch screen through a touch detecting component of the multi-touch terminal's touch screen; the first device 1 then matches the position of each operation against the position of the first touch button on the touch screen, taking an operation that matches the first touch button's position as the first operation. Similarly, the first device 1 matches the position of each operation against the position of the second touch button on the touch screen, taking an operation that matches the second touch button's position as the second operation. Here, the operations include, but are not limited to, clicking, double-clicking, long-pressing, pressing, releasing, sliding (in any direction), rotating, and the like.
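The position-matching step described above can be sketched as follows. This is a minimal illustration only: the circular button geometry, class names, and return labels are assumptions for the sketch, not details given in the patent.

```python
from dataclasses import dataclass

@dataclass
class CircularButton:
    """A touch button occupying a circular region of the touch screen."""
    cx: float
    cy: float
    radius: float

    def contains(self, x: float, y: float) -> bool:
        # A touch position matches the button if it lies within the radius
        # of the button's centre (squared distances avoid a sqrt).
        return (x - self.cx) ** 2 + (y - self.cy) ** 2 <= self.radius ** 2

def classify_touch(x: float, y: float,
                   first_button: CircularButton,
                   second_button: CircularButton):
    """Match a touch position against the two buttons: an operation whose
    position matches the first button is the first operation, one matching
    the second button is the second operation, anything else is neither."""
    if first_button.contains(x, y):
        return "first_operation"
    if second_button.contains(x, y):
        return "second_operation"
    return None
```

A touch at (3, 4) against a first button of radius 5 centred at the origin would be classified as the first operation, since its distance from the centre is exactly 5.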
  • FIG. 5 shows a schematic diagram of a touch screen of a multi-touch terminal in accordance with a preferred embodiment of the present invention, in which button B is the first touch button and button E is the second touch button. The first device 1 acquires the user's first operation on the first touch button and second operation on the second touch button on the touch screen of the multi-touch terminal. Here, setting information such as the position and graphic of the first touch button and the second touch button may be determined by default settings, by user or other adjustment settings, or according to a specific operation of the user: for example, if the user double-clicks at a certain position of the touch screen, that position is taken as the centre of the second touch button, and the range corresponding to that centre may be determined by a default setting (such as a preset radius range) or based on other operations of the user.
  • Preferably, the positions of the first touch button and the second touch button on the touch screen are adapted to at least one of the following:
  • the size attribute of the touch screen, where the size attribute includes, but is not limited to, horizontal and vertical length, aspect ratio, and the like; for example, if the touch screen is larger, the corresponding ranges of the first touch button and the second touch button are also larger, and if the touch screen is smaller, the corresponding ranges of the first touch button and the second touch button are proportionally reduced; similarly, if the aspect ratio of the touch screen is 4:3, the position and size of the touch buttons are set according to that ratio, and if the aspect ratio is 16:9 (widescreen), the position and size of the touch buttons are set according to a design corresponding to the wide screen;
  • the state attribute of the user holding the multi-touch terminal, where the state attribute includes, but is not limited to, the holding state of the multi-touch terminal (one-handed grip, two-handed grip, horizontal or vertical grip, etc.); for example, the arrangement of the touch buttons is adapted to the horizontal or vertical orientation of the screen;
  • the current application scenario information of the touch screen, where the current application includes, but is not limited to, the application corresponding to the touch buttons or other applications; for example, the position of the touch buttons is adjusted according to the current page content of the application corresponding to the touch buttons, so as not to affect the display of that content; or, if other applications are currently present on the touch screen, the position of the touch buttons is kept separate from those applications to avoid affecting their operation.
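The proportional adaptation to the screen's size attribute might be sketched as below. The anchor fractions, the bottom-corner two-thumb layout, and the function name are illustrative assumptions, not values from the patent.

```python
def layout_buttons(screen_w: float, screen_h: float):
    """Scale button centres and radii proportionally to the screen size.

    Buttons are anchored as fractions of the screen: the first button near
    the bottom-left, the second near the bottom-right (an assumed layout).
    Returns {name: (centre_x, centre_y, radius)} in screen coordinates.
    """
    anchors = {
        "first": (0.15, 0.80, 0.10),   # (x fraction, y fraction, radius fraction)
        "second": (0.85, 0.80, 0.08),
    }
    scaled = {}
    for name, (fx, fy, fr) in anchors.items():
        # Centre positions scale with each axis; the radius scales with the
        # shorter side, so buttons shrink proportionally on smaller screens.
        scaled[name] = (fx * screen_w, fy * screen_h,
                        fr * min(screen_w, screen_h))
    return scaled
```

On a 1000x500 screen this places the first button at (150, 400) with radius 50; on a smaller screen all three values shrink in proportion, as the paragraph above describes.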
  • The second device 2 causes the operation execution object corresponding to the first operation to perform the object operation corresponding to the second operation. Specifically, the second device 2 determines the operation execution object corresponding to the first operation based on the setting of the first touch button: for example, an object within the position range corresponding to the first touch button is taken as the operation execution object; or, according to a predefined binding setting, whenever the first touch button is touched, a predefined object bound to that touch button is taken as the operation execution object. When a predefined binding setting is adopted, the predefined object may be located at any position on the touch screen and is not limited to the position range corresponding to the first touch button. For example, in FIG. 5, the object F is the operation execution object; the second device 2 then, according to the second operation, causes the operation execution object to perform the object operation corresponding to the second operation, i.e., the content corresponding to the second operation is performed by the operation execution object. For example, if the second operation is "moving", the object F performs the moving operation; if the second operation is "interacting with other objects", the object F performs the interactive operation.
  • Preferably, the first operation and the second operation at least partially overlap in time sequence. For example, the first operation and the second operation may be performed simultaneously, or within a predetermined threshold period. When the first operation and the second operation at least partially overlap in time sequence, the performed operation may be the same as or different from that of the first operation alone, the second operation alone, or other combinations of the first and second operations. For example, if the second operation alone moves the operation execution object corresponding to the first operation within a predetermined range, then when the first operation and the second operation at least partially overlap in time sequence, the corresponding operation may instead move the operation execution object within another predetermined range, and so on.
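The "at least partially overlap in time sequence" condition amounts to a simple interval-overlap test on the two operations' press-to-release time spans. The representation of an operation as a (start, end) timestamp pair is an assumption for illustration.

```python
def operations_overlap(first_span, second_span) -> bool:
    """Return True if two operations at least partially overlap in time.

    Each span is a (start, end) pair of timestamps for one operation,
    e.g. from touch-down to touch-up.
    """
    (s1, e1), (s2, e2) = first_span, second_span
    # Two half-open intervals overlap iff each one starts before the
    # other one ends.
    return s1 < e2 and s2 < e1
```

For example, a long press held over seconds 0-5 overlaps a tap-and-slide over seconds 3-8, but not one over seconds 6-8; a combined behaviour would be triggered only in the first case.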
  • Preferably, the first operation may be used to control movement of the operation execution object. Besides determining the operation execution object, the first operation may move it at an arbitrary angle within a predetermined range, where the predetermined range includes, but is not limited to, any range within the touch screen or the range corresponding to the first touch button. For example, the first operation first determines that the operation execution object is F, and then controls the object F to move in any direction through 360 degrees on the screen by long-press and drag operations. Since the first operation can control the movement of the operation execution object, the second touch button and the second operation are freed for other purposes, so that the second operation can perform more complex functions.
  • Preferably, the second operation includes adjusting the operation action area of the object operation; the second device 2 causes the operation execution object to perform the object operation within the operation action area based on its current position. Specifically, the second operation may further include adjusting the operation action area of the object operation, for example setting it to regions of different sizes and different shapes (such as a fan shape, a circle, a rectangle, or other shapes). For example, FIG. 5 shows an operation action area that is a circular region of radius G; FIG. 6 shows an operation action area that is a circular region of radius J; and FIG. 7 shows an operation action area that is a sector region of radius K; and so on. Then, the second device 2 determines the operation execution range of the object operation based on the operation action area and the current position of the operation execution object, and the object operation is performed by the operation execution object within that area. For example, if the operation action area is a circle of radius r and the current position of the operation execution object is x, the operation action area at the current position is the circle centred at x with radius r; if the current position is y, the operation action area is the circle centred at y with radius r. If the object operation is for the operation execution object to interact with other target objects, the corresponding operation is performed only if the scope of the object operation falls within the operation action area; otherwise it is not performed.
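The circular case in the example above (a circle of radius r centred on the operation execution object's current position x) reduces to a distance check; the function name and tuple representation of positions are illustrative.

```python
import math

def in_action_area(object_pos, target_pos, radius: float) -> bool:
    """Circular operation action area centred on the object's current position.

    The object operation on a target is permitted only if the target lies
    within the circle of the given radius around the operation execution
    object; because the area is anchored to the current position, it moves
    with the object as the first operation drags it around.
    """
    return math.dist(object_pos, target_pos) <= radius
```

A target at distance exactly r from the current position is on the boundary and counts as inside; anything farther away causes the operation to be rejected.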
  • More preferably, the second device 2 may cause the operation execution object to perform the object operation on an operation action target within the operation action area based on the current position. Specifically, the second device 2 may determine the operation action area at the current position and the operation action targets corresponding to it: for example, if the operation action area is a circle of radius r and the current position of the operation execution object is x, the operation action area at the current position is the circle centred at x with radius r, and the second device 2 takes all target objects in that area as the operation action targets. Then, the second device 2 causes the operation execution object to perform the object operation on those targets. For example, if the operation execution object is a moving target, the operation action area is a circle centred on the moving target, the operation action targets are all the vehicles in a parking lot, and the object operation is to obtain relevant introduction information of the operation action targets, then the second device 2 acquires the vehicle introduction information of all those vehicles. Here, the object operation also includes various kinds of interaction, that is, interaction between the operation execution object and the operation action target; for example, when the operation execution object is a game character, the interaction includes, but is not limited to, dialogue, attack, pickup, and the like.
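The target-gathering step (taking all target objects inside the area as operation action targets) could be sketched as below. The dictionary representation of targets and the parking-lot naming follow the example above but are assumptions for the sketch.

```python
import math

def targets_in_area(current_pos, all_targets, radius: float):
    """Collect every candidate target inside the circular operation action
    area centred on the operation execution object's current position.

    all_targets is an iterable of dicts with at least a "pos" entry,
    e.g. the vehicles in the parking-lot example; each returned target
    would then receive the object operation (query info, dialogue, etc.).
    """
    return [t for t in all_targets
            if math.dist(current_pos, t["pos"]) <= radius]
```

With a radius-5 area centred at the origin, a vehicle at (1, 0) is collected as an operation action target while one at (10, 0) is left out.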
  • more preferably, the second operation includes adjusting the operation action target of the object operation.
  • specifically, based on predefined operation logic, when the second operation matches one or more predefined operation logics for adjusting the operation action target of the object operation, the second operation is used to adjust that operation action target.
  • for example, after the operation action target of the object operation has been determined in a predetermined or other manner, if the user performs one or more predefined operation sequences, the operation action target is adjusted according to the operation logic set by that sequence; if the user performs the second operation of "tapping a blank area of the touch screen, then long-pressing and sliding up", this indicates that the second operation changes the operation action target of the object operation to targets other than the original one.
  • those skilled in the art should understand that this operation sequence is only an example and is not intended to limit the present invention; other operation sequences, as applicable to the present invention, can also be used to adjust the operation action target of the object operation.
  • more preferably, the touch control device further includes a fifth device (not shown), wherein, during adjustment of the operation action area, the fifth device synchronously displays the adjusted operation action area based on the current position of the operation execution object.
  • specifically, by displaying a range boundary or a range circle on the screen, the fifth device synchronously displays the adjusted operation action area based on the current position of the operation execution object during the adjustment. When the operation execution object operates within the operation action area, the operation succeeds; conversely, if the operation execution object operates outside the operation action area, the fifth device indicates that the operation is not allowed by displaying a corresponding prompt color, prompt mark, or similar information.
  • preferably, the touch control device further includes a sixth device (not shown), wherein the sixth device performs a subsequent operation corresponding to the object operation after the execution of the object operation is completed.
  • specifically, after the object operation has been executed, the sixth device performs the subsequent operation corresponding to the executed object operation according to one or more conditions such as a default setting or a user selection. For example, if the executed object operation is to obtain an introduction of the target object, the subsequent operation is to interact with that target object (such as a dialogue).
  • preferably, the subsequent operation may also be set to be performed within a predetermined time threshold after the execution of the object operation is completed: if the user makes a selection or confirmation within that threshold, the subsequent operation continues; conversely, if the threshold is exceeded without the user making a selection or confirmation, execution of the subsequent operation stops.
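The time-threshold gating of the subsequent operation can be sketched as a small state object; the class name, window length, and clock handling below are illustrative assumptions rather than anything specified by the patent:

```python
import time

class FollowupGate:
    """Allow a follow-up operation only within a predetermined time
    window after the object operation completes (assumed behaviour)."""
    def __init__(self, window=5.0):
        self.window = window        # assumed threshold in seconds
        self.completed_at = None    # set when the object operation ends

    def operation_completed(self, now=None):
        """Record the completion time of the object operation."""
        self.completed_at = now if now is not None else time.monotonic()

    def may_follow_up(self, now=None):
        """True iff the user's selection/confirmation arrives in time."""
        if self.completed_at is None:
            return False
        now = now if now is not None else time.monotonic()
        return (now - self.completed_at) <= self.window

gate = FollowupGate()
gate.operation_completed(now=100.0)  # injected clock for determinism
```

A user selection at t=103 (within the 5-second window) would proceed; one at t=106 would be dropped, matching the "stop execution once the threshold is exceeded" rule above.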
  • those skilled in the art should understand that applicable application scenarios of the present invention include, but are not limited to:
  • 1) intelligent agents in the field of artificial intelligence, such as an agent-based schedule secretary or travel secretary, where the agent performs the corresponding operations on behalf of its user; for example, the agent is moved to a specific area by the first touch button, and then performs operations such as hotel inquiry, reservation, and arranging interviews with other users through the second touch button;
  • 2) geographic information systems (GIS), including navigation applications, for example supporting different user-role agents such as pedestrians, public transport passengers, and motorists in performing their operations; for example, the agent is moved to a specific area by the first touch button, and then performs path query and navigation, friend inquiry, and appointment operations through the second touch button;
  • 3) game design or applications; for example, in existing multi-touch-screen-based operation, the action execution direction of most operation targets must be consistent with the orientation of the target, which limits the operation design for such objects; in contrast, the present invention allows the user character design to proceed in parallel with the user operation design, which not only improves design efficiency but also reduces design complexity and improves design robustness.
  • FIG. 2 is a schematic diagram of a touch control device for a multi-touch terminal according to a preferred embodiment of the present invention, wherein the touch control device includes a first device 1', a second device 2', and a third device 3'.
  • specifically, the third device 3' detects whether the user touches a target control area of the touch screen and, if so, displays the first touch button or the second touch button for the user to operate;
  • the first device 1' acquires the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal;
  • the second device 2' performs the object operation corresponding to the second operation through the operation execution object corresponding to the first operation.
  • the first device 1' and the second device 2' are the same as or substantially the same as the corresponding devices shown in FIG. 1, and thus are not described again here, but are incorporated herein by reference.
  • the above devices work continuously; here, "continuously" means that the devices perform the detection of touches on the target control area, the acquisition of the first and second operations, and the execution of operations, in real time or according to set or real-time-adjusted working mode requirements, until the touch control device stops detecting whether the user touches the target control area of the touch screen.
  • the third device 3' detects whether the user touches a target control area of the touch screen; if so, it displays the first touch button or the second touch button for the user to operate.
  • specifically, the third device 3' detects, via a touch detection component of the touch screen of the multi-touch terminal, whether the user touches a target control area of the touch screen; here, the target control area may be a predetermined fixed area, or an area that changes based on the current application scenario of the touch screen.
  • if the user's touch operation matches the target control area, the third device 3' displays the first touch button or the second touch button according to the matched area; for example, when the target area corresponding to the first touch button is touched, the first touch button is displayed, or the first touch button and the second touch button are displayed simultaneously, for the user to operate.
  • here, the target control area may or may not coincide with the area corresponding to the first touch button and/or the second touch button.
  • preferably, the multi-touch terminal further includes a fourth device (not shown), wherein, when the user stops touching the target control area of the touch screen, the fourth device hides the first touch button or the second touch button.
  • specifically, when the user stops touching the target control area, the fourth device hides the touch button corresponding to that target control area; for example, if the third device has displayed both the first touch button and the second touch button, and the user no longer touches the target control area corresponding to the first touch button, the fourth device hides the first touch button; similarly, if the user no longer touches the target control area corresponding to the second touch button, the fourth device hides the second touch button.
  • in step S1, the touch control device acquires the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal; in step S2, the touch control device performs the object operation corresponding to the second operation through the operation execution object corresponding to the first operation.
  • in step S1, the touch control device acquires the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal.
  • specifically, in step S1, the touch control device acquires the user's operation on the touch screen via a touch detection component of the touch screen of the multi-touch terminal; then the touch control device matches the position of the operation against the position of the first touch button on the touch screen and takes an operation matching the first touch button's position as the first operation; similarly, it matches the position of the operation against the position of the second touch button on the touch screen and takes an operation matching the second touch button's position as the second operation.
  • the operations include, but are not limited to, clicking, double-clicking, long-pressing, pressing and releasing, sliding (in various directions), rotating, and the like.
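The position-matching step above amounts to hit-testing each raw touch against the button regions. A minimal sketch, assuming circular button regions with hypothetical centers and radii (the layout values are illustrative, not from the patent):

```python
import math

# Assumed layout: each button occupies a circular region on the screen.
BUTTONS = {
    "first":  {"center": (100, 400), "radius": 60},
    "second": {"center": (500, 400), "radius": 60},
}

def classify_operation(touch_pos):
    """Map a raw touch position to the first or second operation,
    or None when it hits neither button region."""
    for name, b in BUTTONS.items():
        cx, cy = b["center"]
        if math.hypot(touch_pos[0] - cx, touch_pos[1] - cy) <= b["radius"]:
            return name
    return None
```

A touch near (100, 400) is taken as the first operation, one near (500, 400) as the second operation, and touches elsewhere are ignored by this matcher.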
  • FIG. 5 shows a schematic diagram of a touch screen for a multi-touch terminal in accordance with a preferred embodiment of the present invention.
  • as shown in FIG. 5, button B is the first touch button and button E is the second touch button. When the user performs touch operations on buttons B and E, the touch control device acquires the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal.
  • here, setting information such as the positions and graphics of the first touch button and the second touch button may be determined based on default settings, according to user or other adjustment settings, or according to a specific operation of the user; for example, if the user double-taps at a certain position of the touch screen, that position is determined as the center position of the second touch button, and the range corresponding to that center position may be determined according to a default setting (such as a preset radius) or based on other operations of the user. Those skilled in the art should understand that the position includes the center position of a touch button and its corresponding range.
  • preferably, the positions of the first touch button and the second touch button on the touch screen are adapted to at least one of the following:
  • the size attributes of the touch screen, where the size attributes include, but are not limited to, horizontal and vertical length, aspect ratio, and the like; for example, if the touch screen is larger, the corresponding ranges of the first touch button and the second touch button are also larger, and if the touch screen is smaller, the corresponding ranges are reduced proportionally; if the aspect ratio of the touch screen is 4:3, the positions and sizes of the touch buttons are set according to that ratio, and if the aspect ratio is 16:9 (widescreen), they are set according to the corresponding widescreen design;
  • the state attributes of how the user holds the multi-touch terminal, where the state attributes include, but are not limited to, the holding state of the terminal (one-handed grip, two-handed grip, horizontal or vertical grip, etc.); for example, when the user changes from holding the terminal vertically to holding it horizontally, the arrangement of the touch buttons is adapted to the horizontal or vertical orientation of the screen;
  • the current application scenario information of the touch screen, where the current application includes, but is not limited to, the application corresponding to the touch buttons or other applications; for example, the positions of the touch buttons are adjusted according to the current page content of the application corresponding to the touch buttons, so as not to affect the display of that content, or, if other applications are currently present on the touch screen, the touch buttons are positioned apart from those applications to avoid affecting their operation.
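The proportional adaptation of button ranges to screen size can be sketched as a simple rescaling from a reference design; the reference resolution and button geometry below are illustrative assumptions:

```python
# Assumed reference design: button geometry authored for a 960x640 screen.
REF_W, REF_H = 960, 640
REF_BUTTON = {"center": (100, 400), "radius": 60}

def scale_button(screen_w, screen_h, ref=REF_BUTTON):
    """Scale a button's center and radius in proportion to the actual
    screen, so larger screens get proportionally larger button ranges."""
    sx, sy = screen_w / REF_W, screen_h / REF_H
    cx, cy = ref["center"]
    return {
        "center": (cx * sx, cy * sy),
        # Use the smaller axis scale so the region stays on-screen
        # even when the aspect ratio differs from the reference.
        "radius": ref["radius"] * min(sx, sy),
    }

big = scale_button(1920, 1280)  # a screen twice as large in each axis
```

Doubling both screen dimensions doubles the button's center coordinates and radius, the "reduced or enlarged proportionally" behavior described above; a 16:9 screen would instead trigger a different reference layout in a fuller design.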
  • step S2 the touch control device performs an object operation corresponding to the second operation according to the operation execution object corresponding to the first operation.
  • specifically, the touch control device determines the operation execution object corresponding to the first operation according to the first operation and the setting of the first touch button; for example, an object within the position range corresponding to the first touch button is taken as the operation execution object, or, according to a predefined binding setting, a predefined object bound to the first touch button is taken as the operation execution object whenever the first touch button is touched.
  • here, if a predefined binding setting is adopted, the predefined object may be located at any position on the touch screen and is not limited to the position range corresponding to the first touch button.
  • taking FIG. 5 as an example, object F is the operation execution object; then, in step S2, the touch control device has the operation execution object perform the object operation corresponding to the second operation, that is, the operation execution object carries out the content corresponding to the second operation.
  • for example, if the second operation is "move", object F performs the "move" operation; if the second operation is "interact with other objects", object F performs the "interact" operation.
  • preferably, the first operation and the second operation at least partially overlap in time sequence. Specifically, the first operation and the second operation may be performed simultaneously or within a predetermined threshold period; when they at least partially overlap in time sequence, the operation performed may be the same as or different from one or more of the first operation, the second operation, and a combination of the two.
  • for example, if the second operation is to move the operation execution object corresponding to the first operation within a certain predetermined range, then, when the first operation and the second operation at least partially overlap in time sequence, the corresponding operation is to move that operation execution object within another predetermined range.
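The "overlap in time sequence, or within a predetermined threshold period" condition is an interval test; a minimal sketch, where the (start, end) span representation and the threshold semantics are assumptions for illustration:

```python
def operations_overlap(first_span, second_span, threshold=0.0):
    """True if the two operations' time spans overlap, or fall within
    an assumed threshold (seconds) of one another."""
    f_start, f_end = first_span
    s_start, s_end = second_span
    # Gap between the spans: negative or zero means they overlap.
    gap = max(f_start, s_start) - min(f_end, s_end)
    return gap <= threshold
```

With the default threshold, a first operation held over (0, 2) overlaps a second operation over (1, 3), so the combined behavior (e.g. moving within the alternate predetermined range) would apply; spans (0, 1) and (2, 3) only qualify once the threshold covers the 1-second gap.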
  • preferably, the first operation may be used to control the movement of the operation execution object.
  • specifically, in addition to determining the operation execution object, the first operation may also move the operation execution object at an arbitrary angle within a predetermined range; here, the predetermined range includes, but is not limited to, any range within the touch screen or the range corresponding to the first touch button.
  • for example, as shown in FIG. 5, the first operation first determines that the operation execution object is F, and then controls object F to move through 360 degrees within the screen by operations such as long-pressing and dragging.
  • here, those skilled in the art should understand that both the first operation and the second operation may control the movement of the operation execution object; if the first operation is used to control that movement, the second touch button and the second operation are freed up, so that the second operation can perform more complex functions.
  • more preferably, the second operation includes adjusting the operation action area of the object operation; in step S2, the touch control device performs the object operation through the operation execution object within the operation action area based on the current position of the operation execution object.
  • specifically, the second operation may also adjust the operation action area of the object operation, for example, setting the operation action area to regions of different sizes and different shapes (such as a sector, a circle, a rectangle, or other shapes). Here, for example, FIG. 5 shows the operation action area as a region of radius G; FIG. 6 shows the operation action area as a circular region of radius J; and FIG. 7 shows the operation action area as a sector region of radius K.
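Of the area shapes mentioned (circle, rectangle, sector), the sector case is the least obvious to test for membership. A minimal point-in-sector sketch; all parameter names, the degree-based angle convention, and the facing-direction parameter are assumptions for illustration:

```python
import math

def in_sector(point, center, radius, facing_deg, half_angle_deg):
    """True if `point` lies in a fan-shaped (sector) operation action
    area: within `radius` of `center` and within `half_angle_deg` of
    the facing direction `facing_deg`."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    if math.hypot(dx, dy) > radius:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    # Signed angular difference normalized to (-180, 180].
    diff = (bearing - facing_deg + 180) % 360 - 180
    return abs(diff) <= half_angle_deg
```

With a sector of radius 10 facing along the positive x-axis and opening 45 degrees to each side, a point at (5, 0) is inside, while (0, 5) is outside the angular span and (20, 0) is beyond the radius.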
  • then, in step S2, the touch control device determines the operation execution range of the object operation based on the operation action area of the operation execution object and the current position of the operation execution object, and performs the object operation through the operation execution object within that range. For example, if the operation action area is a circle of radius r and the current position of the operation execution object is x, the operation action area at that position is the circle centered at x with radius r; if the current position is y, it is the circle centered at y with radius r. If the object operation causes the operation execution object to interact with other target objects, the corresponding operation is performed only when the scope of the object operation falls within the operation action area; otherwise it is not performed.
  • more preferably, the touch control device may perform the object operation through the operation execution object on the operation action targets within the operation action area based on the current position.
  • specifically, the touch control device may acquire the operation action area at the current position and determine the operation action targets corresponding to that area; for example, if the operation action area is a circle of radius r and the current position of the operation execution object is x, the operation action area at that position is the circle centered at x with radius r, and in step S2 the touch control device acquires all target objects within that circle as the operation action targets.
  • then, in step S2, the touch control device performs the object operation of the operation execution object on those operation action targets.
  • for example, if the operation execution object is a moving target and the operation action area is a circle centered on that moving target, then when the area moves over a parking lot, the operation action targets are all the vehicles in the parking lot; if the object operation is to obtain introduction information about the operation action targets, the touch control device acquires the vehicle introduction information of all those vehicles.
  • here, those skilled in the art should understand that the object operation also covers various interactions between the operation execution object and the operation action target; for example, when the operation execution object is a game character and the operation action target is an attack target, the interaction includes, but is not limited to, dialogue, attack, pickup, and the like.
  • more preferably, the second operation includes adjusting the operation action target of the object operation.
  • specifically, based on predefined operation logic, when the second operation matches one or more predefined operation logics for adjusting the operation action target of the object operation, the second operation is used to adjust that operation action target.
  • for example, after the operation action target of the object operation has been determined in a predetermined or other manner, if the user performs one or more predefined operation sequences, the operation action target is adjusted according to the operation logic set by that sequence; if the user performs the second operation of "tapping a blank area of the touch screen, then long-pressing and sliding up", this indicates that the second operation changes the operation action target of the object operation to targets other than the original one.
  • more preferably, the method further includes a step S5 (not shown), wherein, in step S5, during adjustment of the operation action area, the touch control device synchronously displays the adjusted operation action area based on the current position of the operation execution object.
  • specifically, in step S5, by displaying a range boundary or a range circle on the screen, the touch control device synchronously displays the adjusted operation action area based on the current position of the operation execution object during the adjustment. When the operation execution object operates within the operation action area, the operation succeeds; conversely, if it operates outside the operation action area, in step S5 the touch control device indicates that the operation is not allowed by displaying a corresponding prompt color, prompt mark, or similar information.
  • preferably, the method further includes a step S6 (not shown), wherein, in step S6, the touch control device performs a subsequent operation corresponding to the object operation after the execution of the object operation is completed.
  • specifically, after the object operation has been executed, the touch control device performs the subsequent operation corresponding to the executed object operation according to one or more conditions such as a default setting or a user selection; for example, if the executed object operation is to obtain an introduction of the target object, the subsequent operation is to interact with that target object (such as a dialogue).
  • preferably, the subsequent operation may also be set to be performed within a predetermined time threshold after the execution of the object operation is completed: if the user makes a selection or confirmation within that threshold, the subsequent operation continues; conversely, if the threshold is exceeded without the user making a selection or confirmation, execution of the subsequent operation stops.
  • those skilled in the art should understand that applicable application scenarios of the present invention include, but are not limited to:
  • 1) intelligent agents in the field of artificial intelligence, such as an agent-based schedule secretary or travel secretary, where the agent performs the corresponding operations on behalf of its user; for example, the agent is moved to a specific area by the first touch button, and then performs operations such as hotel inquiry, reservation, and arranging interviews with other users through the second touch button;
  • 2) geographic information systems (GIS), including navigation applications, for example supporting different user-role agents such as pedestrians, public transport passengers, and motorists in performing their operations; for example, the agent is moved to a specific area by the first touch button, and then performs path query and navigation, friend inquiry, and appointment operations through the second touch button;
  • 3) game design or applications; for example, in existing multi-touch-screen-based operation, the action execution direction of most operation targets must be consistent with the orientation of the target, which limits the operation design for such objects; in contrast, the present invention allows the user character design to proceed in parallel with the user operation design, which not only improves design efficiency but also reduces design complexity and improves design robustness.
  • in step S3', the touch control device detects whether the user touches a target control area of the touch screen and, if so, displays the first touch button or the second touch button for the user to operate; in step S1', the touch control device acquires the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal; in step S2', the touch control device performs the object operation corresponding to the second operation through the operation execution object corresponding to the first operation.
  • the steps S1' and S2' are the same as or substantially the same as the corresponding steps shown in FIG. 3, and thus are not described again here, but are incorporated herein by reference.
  • in step S3', the touch control device detects whether the user touches a target control area of the touch screen; if so, it displays the first touch button or the second touch button for the user to operate.
  • specifically, in step S3', the touch control device detects, via a touch detection component of the touch screen of the multi-touch terminal, whether the user touches a target control area of the touch screen; here, the target control area may be a predetermined fixed area, or an area that changes based on the current application scenario of the touch screen.
  • if the user's touch operation matches the target control area, in step S3' the touch control device displays the first touch button or the second touch button according to the matched area; for example, when the target area corresponding to the first touch button is touched, the first touch button is displayed, or the first touch button and the second touch button are displayed simultaneously, for the user to operate.
  • here, the target control area may or may not coincide with the area corresponding to the first touch button and/or the second touch button.
  • preferably, the method further includes a step S4' (not shown), wherein, when the user stops touching the target control area of the touch screen, in step S4' the touch control device hides the first touch button or the second touch button.
  • specifically, when the user stops touching the target control area, in step S4' the touch control device hides the touch button corresponding to that target control area; for example, if the touch control device has displayed the first touch button and the second touch button in step S3', and the user no longer touches the target control area corresponding to the first touch button, in step S4' the touch control device hides the first touch button; similarly, if the user no longer touches the target control area corresponding to the second touch button, in step S4' the touch control device hides the second touch button.
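The show-on-touch and hide-on-release behavior of steps S3' and S4' can be sketched as a small visibility-state holder; the class and method names are illustrative assumptions, not part of the patent:

```python
class ButtonVisibility:
    """Show a touch button while the user touches its target control
    area and hide it again when the touch stops (assumed behaviour)."""
    def __init__(self):
        self.visible = {"first": False, "second": False}

    def on_touch(self, button):
        """Target control area touched: display the matching button."""
        self.visible[button] = True

    def on_release(self, button):
        """Touch on the target control area stopped: hide the button."""
        self.visible[button] = False

ui = ButtonVisibility()
ui.on_touch("first")  # user touches the first button's control area
```

Each button tracks its own control area independently, so releasing the first button's area hides only the first button while the second button's state is untouched, as described above.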

Abstract

The object of the present invention is to provide a touch control method and device for a multi-touch terminal. The touch control device acquires a user's first operation of a first touch button and second operation of a second touch button on the touch screen of a multi-touch terminal, and then performs the object operation corresponding to the second operation through the operation execution object corresponding to the first operation. Compared with the prior art, by implementing two control buttons on a multi-touch screen, the present invention supports more complex human-computer interaction by the user, for example selecting a character's orientation and movement through button 1 and selecting and executing the operations the character can perform through button 2, which improves the convenience and diversity of multi-touch-screen operation, improves the efficiency of human-computer interaction, and enhances the user experience.

Description

Touch control method and device for a multi-touch terminal

Technical Field

The present invention relates to the field of computers, and in particular to touch control techniques for multi-touch terminals.

Background

At present, operations on touch screens are usually based on single-point touch; moreover, even in existing multi-touch technology, the multiple touch points used for a touch operation essentially constitute a single operation, such as the slide-to-unlock operation of a mobile phone. Complex touch operations therefore cannot be performed, resulting in poor interactivity and an impaired user experience.

Summary of the Invention
The object of the present invention is to provide a touch control method and device for a multi-touch terminal.

According to one aspect of the present invention, a touch control method for a multi-touch terminal is provided, wherein the method includes:

a. acquiring a user's first operation of a first touch button and second operation of a second touch button on the touch screen of a multi-touch terminal;

b. performing, through the operation execution object corresponding to the first operation, the object operation corresponding to the second operation.

According to another aspect of the present invention, a touch control device for a multi-touch terminal is also provided, wherein the device includes:

a first device for acquiring a user's first operation of a first touch button and second operation of a second touch button on the touch screen of a multi-touch terminal;

a second device for performing, through the operation execution object corresponding to the first operation, the object operation corresponding to the second operation.
Compared with the prior art, the present invention acquires a user's first operation of a first touch button and second operation of a second touch button on the touch screen of a multi-touch terminal, and then performs the object operation corresponding to the second operation through the operation execution object corresponding to the first operation. By implementing two control buttons on a multi-touch screen, it supports more complex human-computer interaction by the user, for example selecting a character's orientation and movement through button 1 and selecting and executing the operations the character can perform through button 2, which improves the convenience and diversity of multi-touch-screen operation, improves the efficiency of human-computer interaction, and enhances the user experience.

Moreover, the present invention can also detect whether the user touches a target control area of the touch screen and, if so, display the first touch button or the second touch button for the user to operate, thereby improving the accuracy of interaction control and, in turn, the convenience and diversity of multi-touch-screen operation, the efficiency of human-computer interaction, and the user experience.

Moreover, the present invention can also hide the first touch button or the second touch button when the user stops touching the target control area of the touch screen; alternatively, the present invention can adapt the positions of the first touch button and the second touch button on the touch screen to various preset conditions, making the interface friendlier, improving the efficiency of human-computer interaction, and enhancing the user experience.

Moreover, the present invention can also use the first operation to control the movement of the operation execution object; further, the second operation includes adjusting the operation action area of the object operation and performing various operations together with the first operation, thereby supporting complex human-computer interaction and improving the convenience and diversity of multi-touch-screen operation, the efficiency of human-computer interaction, and the user experience.

Moreover, the present invention can also perform, after the object operation has been executed, the subsequent operation corresponding to the object operation, making the whole set of operations more complete and improving the convenience and diversity of multi-touch-screen operation, the efficiency of human-computer interaction, and the user experience.

Brief Description of the Drawings
Other features, objects, and advantages of the present invention will become more apparent by reading the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:

FIG. 1 is a schematic diagram of a touch control device for a multi-touch terminal according to one aspect of the present invention;

FIG. 2 is a schematic diagram of a touch control device for a multi-touch terminal according to a preferred embodiment of the present invention;

FIG. 3 is a flowchart of touch control for a multi-touch terminal according to another aspect of the present invention;

FIG. 4 is a flowchart of touch control for a multi-touch terminal according to a preferred embodiment of the present invention;

FIGS. 5 to 7 are schematic diagrams of a touch screen for a multi-touch terminal according to a preferred embodiment of the present invention.

The same or similar reference numerals in the drawings denote the same or similar components.

Detailed Description

The present invention is described in further detail below with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of a touch control device for a multi-touch terminal according to one aspect of the present invention, wherein the touch control device includes a first device 1 and a second device 2. Specifically, the first device 1 acquires a user's first operation of a first touch button and second operation of a second touch button on the touch screen of the multi-touch terminal; the second device 2 performs the object operation corresponding to the second operation through the operation execution object corresponding to the first operation.

Here, the touch control device includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device via a network. The user device includes, but is not limited to, any mobile electronic product capable of human-computer interaction with the user via a touch panel, such as a smartphone or a PDA, and the mobile electronic product may use any operating system, such as the android operating system or the iOS operating system. The network device includes an electronic device capable of automatically performing numerical computation and information processing according to preset or stored instructions, whose hardware includes, but is not limited to, microprocessors, application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), digital signal processors (DSP), embedded devices, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud composed of multiple servers; here, the cloud is composed of a large number of computers or network servers based on cloud computing, where cloud computing is a kind of distributed computing, a virtual supercomputer composed of a group of loosely coupled computers. The network includes, but is not limited to, the Internet, wide area networks, metropolitan area networks, local area networks, VPN networks, wireless ad hoc networks, and the like. Those skilled in the art should understand that other touch control devices are also applicable to the present invention, are also included within the scope of protection of the present invention, and are incorporated herein by reference.

The above devices work continuously; here, those skilled in the art should understand that "continuously" means that the devices perform the acquisition of the first and second operations, the execution of operations, and so on, in real time or according to set or real-time-adjusted working mode requirements, until the touch control device stops acquiring the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal.
The first device 1 acquires the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal.

Specifically, the first device 1 acquires the user's operation on the touch screen via a touch detection component of the touch screen of the multi-touch terminal; then the first device 1 matches the position of the operation against the position of the first touch button on the touch screen and takes an operation matching the first touch button's position as the first operation; similarly, the first device 1 matches the position of the operation against the position of the second touch button on the touch screen and takes an operation matching the second touch button's position as the second operation. Here, the operations include, but are not limited to, clicking, double-clicking, long-pressing, pressing and releasing, sliding (in various directions), rotating, and the like.

FIG. 5 is a schematic diagram of a touch screen for a multi-touch terminal according to a preferred embodiment of the present invention. As shown in FIG. 5, button B is the first touch button and button E is the second touch button. When the user performs touch operations on buttons B and E, the first device 1 acquires the user's first operation of the first touch button and second operation of the second touch button on the touch screen of the multi-touch terminal.

Here, setting information such as the positions and graphics of the first touch button and the second touch button may be determined based on default settings, according to user or other adjustment settings, or according to a specific operation of the user; for example, if the user double-taps at a certain position of the touch screen, that position is determined as the center position of the second touch button, and the range corresponding to that center position may be determined according to a default setting (such as a preset radius) or based on other operations of the user. Those skilled in the art should understand that the position includes the center position of a touch button and its corresponding range.
Preferably, the positions of the first touch button and the second touch button on the touch screen are adapted to at least one of the following:

- the size attributes of the touch screen, where the size attributes include, but are not limited to, horizontal and vertical length, aspect ratio, and the like; for example, if the touch screen is larger, the corresponding ranges of the first touch button and the second touch button are also larger, and if the touch screen is smaller, the corresponding ranges are reduced proportionally; if the aspect ratio of the touch screen is 4:3, the positions and sizes of the touch buttons are set according to that ratio, and if the aspect ratio is 16:9 (widescreen), they are set according to the corresponding widescreen design;

- the state attributes of how the user holds the multi-touch terminal, where the state attributes include, but are not limited to, the holding state of the terminal (one-handed grip, two-handed grip, horizontal or vertical grip, etc.); for example, when the user changes from holding the terminal vertically to holding it horizontally, the arrangement of the touch buttons is adapted to the horizontal or vertical orientation of the screen;

- the current application scenario information of the touch screen, where the current application includes, but is not limited to, the application corresponding to the touch buttons or other applications; for example, the positions of the touch buttons are adjusted according to the current page content of the application corresponding to the touch buttons, so as not to affect the display of that content, or, if other applications are currently present on the touch screen, the touch buttons are positioned apart from those applications to avoid affecting their operation.
The second device 2 performs the object operation corresponding to the second operation through the operation execution object corresponding to the first operation.

Specifically, the second device 2 determines the operation execution object corresponding to the first operation according to the first operation and the setting of the first touch button; for example, an object within the position range corresponding to the first touch button is taken as the operation execution object, or, according to a predefined binding setting, a predefined object bound to the first touch button is taken as the operation execution object whenever the first touch button is touched; here, if a predefined binding setting is adopted, the predefined object may be located at any position on the touch screen and is not limited to the position range corresponding to the first touch button.

Taking FIG. 5 as an example, object F is the operation execution object; then the second device 2 has the operation execution object perform the object operation corresponding to the second operation, that is, the operation execution object carries out the content corresponding to the second operation.

For example, if the second operation is "move", object F performs the "move" operation; if the second operation is "interact with other objects", object F performs the "interact" operation.
Preferably, the first operation and the second operation at least partially overlap in time sequence.

Specifically, the first operation and the second operation may be performed simultaneously or within a predetermined threshold period; when they at least partially overlap in time sequence, the operation performed may be the same as or different from one or more of the first operation, the second operation, and a combination of the two. For example, if the second operation is to move the operation execution object corresponding to the first operation within a certain predetermined range, then, when the first operation and the second operation at least partially overlap in time sequence, the corresponding operation is to move that operation execution object within another predetermined range.

Preferably, the first operation may be used to control the movement of the operation execution object.

Specifically, in addition to determining the operation execution object, the first operation may also move the operation execution object at an arbitrary angle within a predetermined range; here, the predetermined range includes, but is not limited to, any range within the touch screen or the range corresponding to the first touch button.

For example, as shown in FIG. 5, the first operation first determines that the operation execution object is F, and then controls object F to move through 360 degrees within the screen by operations such as long-pressing and dragging.

Here, those skilled in the art should understand that both the first operation and the second operation may control the movement of the operation execution object; if the first operation is used to control that movement, the second touch button and the second operation are freed up, so that the second operation can perform more complex functions.
More preferably, the second operation includes adjusting the operation action area of the object operation; the second device 2 performs the object operation through the operation execution object within the operation action area based on the current position of the operation execution object.

Specifically, the second operation may also adjust the operation action area of the object operation, for example, setting the operation action area to regions of different sizes and different shapes (such as a sector, a circle, a rectangle, or other shapes). Here, for example, FIG. 5 shows the operation action area as a region of radius G; FIG. 6 shows the operation action area as a circular region of radius J; and FIG. 7 shows the operation action area as a sector region of radius K.

Then, the second device 2 determines the operation execution range of the object operation based on the operation action area of the operation execution object and the current position of the operation execution object, and performs the object operation through the operation execution object within that range. For example, if the operation action area is a circle of radius r and the current position of the operation execution object is x, the operation action area at that position is the circle centered at x with radius r; if the current position is y, it is the circle centered at y with radius r. If the object operation causes the operation execution object to interact with other target objects, the corresponding operation is performed only when the scope of the object operation falls within the operation action area; otherwise it is not performed.

More preferably, the second device 2 may perform the object operation through the operation execution object on the operation action targets within the operation action area based on the current position. Specifically, the second device 2 may acquire the operation action area at the current position and determine the operation action targets corresponding to that area; for example, if the operation action area is a circle of radius r and the current position of the operation execution object is x, the operation action area at that position is the circle centered at x with radius r, and the second device 2 acquires all target objects within that circle as the operation action targets.

Then, the second device 2 performs the object operation of the operation execution object on those operation action targets. For example, if the operation execution object is a moving target and the operation action area is a circle centered on that moving target, then when the area moves over a parking lot, the operation action targets are all the vehicles in the parking lot; if the object operation is to obtain introduction information about the operation action targets, the second device 2 acquires the vehicle introduction information of all those vehicles.

Here, those skilled in the art should understand that the object operation also covers various interactions between the operation execution object and the operation action target; for example, when the operation execution object is a game character and the operation action target is an attack target, the interaction includes, but is not limited to, dialogue, attack, pickup, and the like.
更优选地, 所述第二操作包括调整所述对象操作的操作作用 目标。
具体地, 通过基于预定义的操作逻辑, 当所述第二操作符合 预定义一条或多条用于调整所述对象操作的操作作用目标的操作 逻辑时, 所述第二操作用于调整所述对象操作的操作作用目标。
例如, 当基于预定方式或基于其他方式确定了所述对象操作 的操作作用目标后, 若用户执行了预定义的一个或多个操作序列, 则按照该序列所设定的操作逻辑, 调整所述对象操作的操作作用 目标; 如用户执行了 "点击触屏空白区域一一长按并上滑" 的第 二操作, 则表示所述第二操作调整所述对象操作的操作作用目标 为不同于原有操作作用目标的其他目标。 在此, 本领域技术人员 应能理解, 所述操作序列仅为示例, 并非对本发明的限制, 其他 的操作序列如能适用于本发明, 同样可用于调整所述对象操作的 操作作用目标。
More preferably, the touch control device further includes a fifth means (not shown), wherein, while the effective operation area is being adjusted, the fifth means synchronously displays the adjusted effective operation area based on the current position of the operation-executing object.

Specifically, while the effective operation area is being adjusted, the fifth means synchronously displays the adjusted effective operation area based on the current position of the operation-executing object, for example by displaying a range boundary or a range circle on the screen. When the operation-executing object operates within the effective operation area, the operation succeeds; conversely, if the operation-executing object operates outside the effective operation area, the fifth means indicates that the operation is not allowed, for example by displaying a corresponding prompt color or prompt marker.

Preferably, the touch control device further includes a sixth means (not shown), wherein, after the object operation has been executed, the sixth means executes the follow-up operation corresponding to the object operation.

Specifically, after the object operation has been executed, the sixth means executes the follow-up operation corresponding to the completed object operation according to one or more conditions such as default settings or the user's selection. For example, if the completed object operation was to obtain a brief description of a target object, the follow-up operation is to interact with that target object (e.g., dialogue). Preferably, the follow-up operation may also be set to execute within a predetermined time threshold after the object operation has completed: if the user makes a selection or confirmation within that threshold, the follow-up operation proceeds; conversely, if the threshold elapses without the user making a selection or confirmation, execution of the follow-up operation is stopped.
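The confirm-within-a-threshold behavior above can be sketched as a timestamp comparison (a minimal illustration; the window length, function names, and the timestamp convention are assumptions):

```python
import time

CONFIRM_WINDOW = 0.5  # seconds; illustrative threshold

def run_follow_up(completed_at, confirmed_at, follow_up):
    """Run the follow-up only if the user confirmed within the window after completion."""
    if confirmed_at is not None and confirmed_at - completed_at <= CONFIRM_WINDOW:
        return follow_up()
    return None  # no confirmation in time: follow-up is cancelled

t0 = time.monotonic()
print(run_follow_up(t0, t0 + 0.2, lambda: "start dialogue"))  # start dialogue
print(run_follow_up(t0, None, lambda: "start dialogue"))      # None
```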
Those skilled in the art should understand that application scenarios to which the present invention applies include but are not limited to:

1) Intelligent agents in the field of artificial intelligence, such as an agent-based schedule secretary or travel secretary, where the agent performs the corresponding operations on behalf of its user. For example, the first touch button moves the agent to a particular area, and the second touch button has the agent perform operations such as hotel search and booking, or arranging meetings with other users.

2) Geographic information systems (GIS), including navigation applications, supporting agents for different user roles such as pedestrians, public-transport passengers, and drivers. For example, the first touch button moves the agent to a particular area, and the second touch button has the agent perform operations such as route search and navigation, or finding friends and making appointments.

3) Game design and applications. For example, in existing multi-touch-screen operations, the direction in which most operation targets perform an action must coincide with the target's facing direction, which limits the design of operations on such objects. In contrast, the present invention allows user-character design and user-operation design to proceed in parallel, which not only improves design efficiency but also reduces design complexity and improves design robustness.
Fig. 2 shows a schematic diagram of a touch control device for a multi-point touch terminal according to a preferred embodiment of the present invention, wherein the touch control device includes a first means 1', a second means 2', and a third means 3'. Specifically, the third means 3' detects whether the user touches the target control region of the touch screen; if so, the first touch button or the second touch button is displayed for the user to operate. The first means 1' obtains the user's first operation on the first touch button and second operation on the second touch button on the touch screen of the multi-point touch terminal; the second means 2' executes, via the operation-executing object corresponding to the first operation, the object operation corresponding to the second operation.

Here, the first means 1' and the second means 2' are identical or substantially identical to the corresponding means shown in Fig. 1, so they are not described again and are incorporated herein by reference.

The above means operate continuously. Here, those skilled in the art should understand that "continuously" means that the above means, in real time or according to set or real-time-adjusted operating-mode requirements, respectively perform detection of touches on the target control region, acquisition of the first and second operations, execution of operations, and so on, until the touch control device stops detecting whether the user touches the target control region of the touch screen.

The third means 3' detects whether the user touches the target control region of the touch screen; if so, the first touch button or the second touch button is displayed for the user to operate.

Specifically, the third means 3' detects, via the touch-detection component of the touch screen of the multi-point touch terminal, whether the user touches the target control region of the touch screen; here, the target control region may be a predetermined fixed region, or a region that varies with the current application scenario of the touch screen.

If the user's touch operation matches the target control region, the third means 3' displays the first touch button or the second touch button according to the matched region; for example, when the target region corresponding to the first touch button is touched, the first touch button is displayed, or the first touch button and the second touch button are displayed simultaneously, for the user to operate. Here, the target control region may or may not coincide with the regions corresponding to the first touch button and/or the second touch button.

Preferably, the multi-point touch terminal further includes a fourth means (not shown), wherein, when the user stops touching the target control region of the touch screen, the fourth means hides the first touch button or the second touch button.

Specifically, when the user stops touching the target control region of the touch screen, i.e., when the user stops touching the touch screen or the user's touch operation no longer matches the target control region, the fourth means hides the touch button corresponding to that target control region. For example, if the third means has already displayed the first touch button and the second touch button, and the user no longer touches the target control region corresponding to the first touch button, the fourth means hides the first touch button; similarly, if the user no longer touches the target control region corresponding to the second touch button, the fourth means hides the second touch button.
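The show/hide behavior above amounts to per-button visibility state driven by whether any active touch lies in each button's target control region. A minimal sketch under assumed names and rectangular regions (the patent does not prescribe region shapes):

```python
# Minimal sketch: per-button visibility driven by touches in each button's
# target control region. Region shapes and names here are illustrative.
class ButtonVisibility:
    def __init__(self, regions):
        self.regions = regions                             # name -> (x, y, w, h)
        self.visible = {name: False for name in regions}

    def _hit(self, region, pos):
        x, y, w, h = region
        return x <= pos[0] <= x + w and y <= pos[1] <= y + h

    def update(self, touches):
        # A button stays visible only while some touch is inside its region.
        for name, region in self.regions.items():
            self.visible[name] = any(self._hit(region, t) for t in touches)

ui = ButtonVisibility({"first": (0, 0, 10, 10), "second": (20, 0, 10, 10)})
ui.update([(5, 5)])
print(ui.visible)   # {'first': True, 'second': False}
ui.update([])
print(ui.visible)   # {'first': False, 'second': False}
```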
Fig. 3 shows a touch control flowchart for a multi-point touch terminal according to another aspect of the present invention. Specifically, in step S1, the touch control device obtains the user's first operation on the first touch button and second operation on the second touch button on the touch screen of the multi-point touch terminal; in step S2, the touch control device executes, via the operation-executing object corresponding to the first operation, the object operation corresponding to the second operation.

The above steps operate continuously. Here, those skilled in the art should understand that "continuously" means that the above steps, in real time or according to set or real-time-adjusted operating-mode requirements, respectively perform acquisition of the first and second operations, execution of operations, and so on, until the touch control device stops obtaining the user's first operation on the first touch button and second operation on the second touch button on the touch screen of the multi-point touch terminal.

In step S1, the touch control device obtains the user's first operation on the first touch button and second operation on the second touch button on the touch screen of the multi-point touch terminal.

Specifically, in step S1, the touch control device obtains the user's operation on the touch screen via the touch-detection component of the touch screen of the multi-point touch terminal; then, in step S1, the touch control device matches the position of the operation against the position of the first touch button on the touch screen, treating an operation that matches the position of the first touch button as the first operation; similarly, in step S1, the touch control device matches the position of the operation against the position of the second touch button on the touch screen, treating an operation that matches the position of the second touch button as the second operation. The operations include but are not limited to tapping, double-tapping, long-pressing, press-and-release, and swiping (in any direction).
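The position matching in step S1 is essentially hit-testing each raw touch against the two button regions. A minimal sketch, assuming circular buttons with made-up centers and radii (the patent leaves button geometry to the implementation):

```python
import math

# Illustrative circular buttons: name -> (center, radius).
BUTTONS = {"first": ((10.0, 10.0), 4.0), "second": ((30.0, 10.0), 4.0)}

def classify_touch(pos):
    """Map a raw touch position to a 'first'/'second' operation, or None if it hits neither."""
    for name, (center, radius) in BUTTONS.items():
        if math.hypot(pos[0] - center[0], pos[1] - center[1]) <= radius:
            return name
    return None

print(classify_touch((11.0, 9.0)))   # first
print(classify_touch((29.0, 11.0)))  # second
print(classify_touch((20.0, 20.0)))  # None
```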
Fig. 5 shows a schematic diagram of a touch screen for a multi-point touch terminal according to a preferred embodiment of the present invention. As shown in Fig. 5, button B is the first touch button and button E is the second touch button. When the user performs touch operations on buttons B and E, in step S1 the touch control device obtains the user's first operation on the first touch button and second operation on the second touch button on the touch screen of the multi-point touch terminal.

Here, configuration information such as the positions and graphics of the first touch button and the second touch button may be determined from default settings, from user or other adjustment settings, or from specific user operations. For example, if the user double-taps a position on the touch screen, that position is taken as the center of the second touch button, and the range corresponding to that center may be determined from default settings (e.g., a preset radius) or from other user operations. Those skilled in the art should understand that the position includes both the center of a touch button and its corresponding range.

Preferably, the positions of the first touch button and the second touch button on the touch screen are adapted to at least one of the following:

- the size attributes of the touch screen, where the size attributes include but are not limited to horizontal and vertical dimensions, aspect ratio, and so on. For example, if the touch screen is larger, the ranges corresponding to the first and second touch buttons are also larger; if the touch screen is smaller, the ranges corresponding to the first and second touch buttons shrink proportionally. If the aspect ratio of the touch screen is 4:3, the positions and sizes of the touch buttons are set in that proportion; if the aspect ratio is 16:9 (widescreen), the positions and sizes of the touch buttons are set according to the corresponding widescreen design;

- the state attributes of how the user holds the multi-point touch terminal, where the state attributes include but are not limited to the grip state of the terminal (one-handed, two-handed, portrait or landscape grip, etc.). For example, when the user changes from a portrait grip to a landscape grip, the layout of the touch buttons is adapted to the screen's orientation;

- the current application-scenario information of the touch screen, where the current applications include but are not limited to the application corresponding to the touch buttons or other applications. For example, the positions of the touch buttons are adjusted according to the current page content of the application corresponding to the touch buttons, so as not to obstruct the display of that content; or, if other applications are currently present on the touch screen, the touch buttons are laid out separately from those applications, so as not to interfere with operating them.
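Adapting button size and position to the screen's dimensions, as in the first item above, can be sketched as a pure function of width and height. The scale factor, margins, and anchor corners below are arbitrary assumptions chosen only to show the proportional scaling:

```python
def button_layout(width, height):
    """Scale button radius and anchor positions with the screen dimensions."""
    radius = 0.08 * min(width, height)          # buttons shrink/grow with the screen
    margin = 2 * radius
    first = (margin, height - margin)           # bottom-left: movement button
    second = (width - margin, height - margin)  # bottom-right: action button
    return {"radius": radius, "first": first, "second": second}

print(button_layout(1600, 900))  # widescreen 16:9 layout
print(button_layout(800, 600))   # 4:3 layout: proportionally smaller buttons
```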
In step S2, the touch control device executes, via the operation-executing object corresponding to the first operation, the object operation corresponding to the second operation.

Specifically, in step S2, the touch control device determines, based on the first operation and the configuration of the first touch button, the operation-executing object corresponding to the first operation. For example, an object within the position range corresponding to the first touch button may be taken as the operation-executing object; or, according to a predefined binding, whenever the first touch button is touched, a certain predefined object bound to the first touch button is taken as the operation-executing object. Here, if a predefined binding is used, the predefined object may be located anywhere on the touch screen, not only within the position range corresponding to the first touch button.

Taking Fig. 5 as an example, object F is the operation-executing object; then, in step S2, the touch control device has the operation-executing object execute the object operation corresponding to the second operation, i.e., the operation-executing object carries out the content corresponding to the second operation.

For example, if the second operation is "move", object F performs the "move" operation; if the second operation is "interact with other objects", object F performs the "interact" operation.
Preferably, the first operation and the second operation at least partially overlap in timing. Specifically, the first operation and the second operation may be performed simultaneously or within a predetermined threshold period. When the first operation and the second operation at least partially overlap in timing, the operation executed may be the same as or different from one or more of the first operation, the second operation, or the combination of the two. For example, if the second operation causes the operation-executing object corresponding to the first operation to move within a certain predetermined range, then when the first operation and the second operation at least partially overlap in timing, the corresponding operation causes that object to move within another predetermined range, and so on.
Preferably, the first operation may be used to control the movement of the operation-executing object.

Specifically, in addition to determining the operation-executing object, the first operation may also move the operation-executing object at an arbitrary angle within a predetermined range; here, the predetermined range includes but is not limited to any range within the touch screen or the range corresponding to the first touch button.

For example, as shown in Fig. 5, the first operation first determines that the operation-executing object is F, and then, through operations such as long-pressing and dragging, controls the object F to move in any direction (through 360°) within the screen.

Here, those skilled in the art should understand that both the first operation and the second operation can control the movement of the operation-executing object; if the first operation is used to control that movement, the second touch button and the second operation are further freed up, allowing the second operation to perform more complex functions.
More preferably, the second operation includes adjusting the effective operation area of the object operation; in step S2, the touch control device executes the object operation via the operation-executing object within the effective operation area based on the current position of the operation-executing object.

Specifically, the second operation also includes adjusting the effective operation area of the object operation, for example, setting the effective operation area to regions of different sizes and shapes (such as a sector, circle, rectangle, or other shapes). Here, for example, Fig. 5 shows the effective operation area as a region of radius G; Fig. 6 shows the effective operation area as a circular region of radius J; Fig. 7 shows the effective operation area as a sector-shaped region of radius K; and so on.

Then, in step S2, the touch control device determines the execution range of the object operation based on the effective operation area of the operation-executing object and its current position, and the object operation is then executed by the operation-executing object within that area. For example, if the effective operation area is a circle of radius r and the current position of the operation-executing object is x, the effective operation area at that position is the circle centered at x with radius r; if the current position is y, the effective operation area is the circle centered at y with radius r. Then, if the object operation is to operate the operation-executing object to interact with other target objects, the corresponding operation is executed if the range of the object operation lies within that effective operation area, and is not executed otherwise.
More preferably, in step S2, the touch control device may execute the object operation via the operation-executing object on the effective operation targets within the effective operation area based on the current position.

Specifically, in step S2, the touch control device may obtain the effective operation area at the current position and determine the effective operation targets corresponding to that area. For example, if the effective operation area is a circle of radius r and the current position of the operation-executing object is x, the effective operation area at that position is the circle centered at x with radius r; in step S2, the touch control device then obtains all target objects within that region as the effective operation targets.

Then, in step S2, the touch control device executes the object operation of the operation-executing object on the effective operation targets. For example, if the operation-executing object is a moving target, the effective operation area is the circle centered on that moving target. When this effective operation area moves onto a parking lot, the effective operation targets are all vehicles in that parking lot; the object operation is then to obtain introductory information about the effective operation targets, so the touch control device obtains the vehicle descriptions of all the vehicles.

Here, those skilled in the art should understand that the object operation also includes various kinds of interaction, i.e., interaction between the operation-executing object and the effective operation targets. For example, when the operation-executing object is a game character and the effective operation target is an attack target, the interaction includes but is not limited to operations such as dialogue, attacking, and picking up items.
More preferably, the second operation includes adjusting the effective operation target of the object operation.

Specifically, based on predefined operation logic, when the second operation matches one or more predefined operation-logic rules for adjusting the effective operation target of the object operation, the second operation is used to adjust the effective operation target of the object operation.

For example, after the effective operation target of the object operation has been determined in a predetermined or other manner, if the user performs one or more predefined operation sequences, the effective operation target is adjusted according to the operation logic set for that sequence. For instance, if the user performs the second operation "tap a blank area of the touch screen, then long-press and swipe upward", this indicates that the second operation changes the effective operation target of the object operation to a target different from the original one. Here, those skilled in the art should understand that this operation sequence is merely an example and not a limitation of the present invention; other operation sequences, if applicable to the present invention, may likewise be used to adjust the effective operation target.
More preferably, the method further includes step S5 (not shown), wherein, in step S5, while the effective operation area is being adjusted, the touch control device synchronously displays the adjusted effective operation area based on the current position of the operation-executing object.

Specifically, in step S5, while the effective operation area is being adjusted, the touch control device synchronously displays the adjusted effective operation area based on the current position of the operation-executing object, for example by displaying a range boundary or a range circle on the screen. When the operation-executing object operates within the effective operation area, the operation succeeds; conversely, if the operation-executing object operates outside the effective operation area, then in step S5 the touch control device indicates that the operation is not allowed, for example by displaying a corresponding prompt color or prompt marker.

Preferably, the method further includes step S6 (not shown), wherein, in step S6, after the object operation has been executed, the touch control device executes the follow-up operation corresponding to the object operation.

Specifically, after the object operation has been executed, in step S6 the touch control device executes the follow-up operation corresponding to the completed object operation according to one or more conditions such as default settings or the user's selection. For example, if the completed object operation was to obtain a brief description of a target object, the follow-up operation is to interact with that target object (e.g., dialogue). Preferably, the follow-up operation may also be set to execute within a predetermined time threshold after the object operation has completed: if the user makes a selection or confirmation within that threshold, the follow-up operation proceeds; conversely, if the threshold elapses without the user making a selection or confirmation, execution of the follow-up operation is stopped.
Those skilled in the art should understand that application scenarios to which the present invention applies include but are not limited to:

1) Intelligent agents in the field of artificial intelligence, such as an agent-based schedule secretary or travel secretary, where the agent performs the corresponding operations on behalf of its user. For example, the first touch button moves the agent to a particular area, and the second touch button has the agent perform operations such as hotel search and booking, or arranging meetings with other users.

2) Geographic information systems (GIS), including navigation applications, supporting agents for different user roles such as pedestrians, public-transport passengers, and drivers. For example, the first touch button moves the agent to a particular area, and the second touch button has the agent perform operations such as route search and navigation, or finding friends and making appointments.

3) Game design and applications. For example, in existing multi-touch-screen operations, the direction in which most operation targets perform an action must coincide with the target's facing direction, which limits the design of operations on such objects. In contrast, the present invention allows user-character design and user-operation design to proceed in parallel, which not only improves design efficiency but also reduces design complexity and improves design robustness.
Fig. 4 shows a touch control flowchart for a multi-point touch terminal according to a preferred embodiment of the present invention. Specifically, in step S3', the touch control device detects whether the user touches the target control region of the touch screen; if so, the first touch button or the second touch button is displayed for the user to operate. In step S1', the touch control device obtains the user's first operation on the first touch button and second operation on the second touch button on the touch screen of the multi-point touch terminal; in step S2', the touch control device executes, via the operation-executing object corresponding to the first operation, the object operation corresponding to the second operation.

Here, step S1' and step S2' are identical or substantially identical to the corresponding steps shown in Fig. 3, so they are not described again and are incorporated herein by reference.

The above steps operate continuously. Here, those skilled in the art should understand that "continuously" means that the above steps, in real time or according to set or real-time-adjusted operating-mode requirements, respectively perform detection of touches on the target control region, acquisition of the first and second operations, execution of operations, and so on, until the touch control device stops detecting whether the user touches the target control region of the touch screen.

In step S3', the touch control device detects whether the user touches the target control region of the touch screen; if so, the first touch button or the second touch button is displayed for the user to operate.

Specifically, in step S3', the touch control device detects, via the touch-detection component of the touch screen of the multi-point touch terminal, whether the user touches the target control region of the touch screen; here, the target control region may be a predetermined fixed region, or a region that varies with the current application scenario of the touch screen.

If the user's touch operation matches the target control region, then in step S3' the touch control device displays the first touch button or the second touch button according to the matched region; for example, when the target region corresponding to the first touch button is touched, the first touch button is displayed, or the first touch button and the second touch button are displayed simultaneously, for the user to operate. Here, the target control region may or may not coincide with the regions corresponding to the first touch button and/or the second touch button. Preferably, the method further includes step S4' (not shown), wherein, when the user stops touching the target control region of the touch screen, in step S4' the touch control device hides the first touch button or the second touch button.

Specifically, when the user stops touching the target control region of the touch screen, i.e., when the user stops touching the touch screen or the user's touch operation no longer matches the target control region, then in step S4' the touch control device hides the touch button corresponding to that target control region. For example, if in step S3' the touch control device has already displayed the first touch button and the second touch button, and the user no longer touches the target control region corresponding to the first touch button, then in step S4' the touch control device hides the first touch button; similarly, if the user no longer touches the target control region corresponding to the second touch button, then in step S4' the touch control device hides the second touch button.

It will be evident to those skilled in the art that the present invention is not limited to the details of the above exemplary embodiments, and that it can be embodied in other specific forms without departing from the spirit or essential characteristics of the invention. The embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description; all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Moreover, the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in a device claim may also be implemented by a single unit or means in software or hardware. The terms "first", "second", and so on denote names and do not denote any particular order.

Claims

1. A touch control method for a multi-point touch terminal, wherein the method comprises:

a. obtaining a user's first operation on a first touch button and a second operation on a second touch button on a touch screen of the multi-point touch terminal;

b. executing, via an operation-executing object corresponding to the first operation, an object operation corresponding to the second operation.

2. The method according to claim 1, wherein the method further comprises:

- detecting whether the user touches a target control region of the touch screen; if so, displaying the first touch button or the second touch button for the user to operate.

3. The method according to claim 1 or 2, wherein the method further comprises:

- hiding the first touch button or the second touch button when the user stops touching the target control region of the touch screen.

4. The method according to any one of claims 1 to 3, wherein the first operation is used to control movement of the operation-executing object.

5. The method according to claim 4, wherein the second operation comprises adjusting an effective operation area of the object operation;

wherein step b comprises:

- executing the object operation via the operation-executing object within the effective operation area based on the current position of the operation-executing object.

6. The method according to claim 5, wherein step b comprises:

- executing the object operation via the operation-executing object on effective operation targets within the effective operation area based on the current position.

7. The method according to claim 6, wherein the second operation comprises adjusting the effective operation target of the object operation.

8. The method according to claim 5, wherein the method further comprises:

- synchronously displaying, while the effective operation area is being adjusted, the adjusted effective operation area based on the current position of the operation-executing object.

9. The method according to any one of claims 1 to 8, wherein the first operation and the second operation at least partially overlap in timing.

10. The method according to any one of claims 1 to 9, wherein the method further comprises:

- executing, after the object operation has been executed, a follow-up operation corresponding to the object operation.

11. The method according to any one of claims 1 to 10, wherein the positions of the first touch button and the second touch button on the touch screen are adapted to at least one of the following:

- size attributes of the touch screen;
- state attributes of how the user holds the multi-point touch terminal;
- current application-scenario information of the touch screen.

12. A touch control device for a multi-point touch terminal, wherein the device comprises:

a first means for obtaining a user's first operation on a first touch button and a second operation on a second touch button on a touch screen of the multi-point touch terminal;

a second means for executing, via an operation-executing object corresponding to the first operation, an object operation corresponding to the second operation.

13. The device according to claim 12, wherein the device further comprises:

a third means for detecting whether the user touches a target control region of the touch screen and, if so, displaying the first touch button or the second touch button for the user to operate.

14. The device according to claim 12 or 13, wherein the device further comprises:

a fourth means for hiding the first touch button or the second touch button when the user stops touching the target control region of the touch screen.

15. The device according to any one of claims 12 to 14, wherein the first operation is used to control movement of the operation-executing object.

16. The device according to claim 15, wherein the second operation comprises adjusting an effective operation area of the object operation;

wherein the second means is configured to:

- execute the object operation via the operation-executing object within the effective operation area based on the current position of the operation-executing object.

17. The device according to claim 16, wherein the second means is configured to:

- execute the object operation via the operation-executing object on effective operation targets within the effective operation area based on the current position.

18. The device according to claim 17, wherein the second operation comprises adjusting the effective operation target of the object operation.

19. The device according to claim 16, wherein the device further comprises:

a fifth means for synchronously displaying, while the effective operation area is being adjusted, the adjusted effective operation area based on the current position of the operation-executing object.

20. The device according to any one of claims 12 to 19, wherein the first operation and the second operation at least partially overlap in timing.

21. The device according to any one of claims 12 to 20, wherein the device further comprises:

a sixth means for executing, after the object operation has been executed, a follow-up operation corresponding to the object operation.

22. The device according to any one of claims 12 to 21, wherein the positions of the first touch button and the second touch button on the touch screen are adapted to at least one of the following:

- size attributes of the touch screen;
- state attributes of how the user holds the multi-point touch terminal;
- current application-scenario information of the touch screen.
PCT/CN2014/000767 2014-07-25 2014-08-14 Touch control method and device for a multi-point touch terminal WO2016011568A1 (zh)
