WO2019090486A1 - Touch control method and apparatus - Google Patents

Touch control method and apparatus

Info

Publication number
WO2019090486A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
terminal
touch operation
user
target
Prior art date
Application number
PCT/CN2017/109781
Other languages
English (en)
French (fr)
Inventor
芮江 (Rui Jiang)
薛竹飙 (Xue Zhubiao)
徐博 (Xu Bo)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to CN201780082151.8A (CN110168487B)
Priority to ES17931613T (ES2947297T3)
Priority to AU2017438902A (AU2017438902B2)
Priority to CN202210308200.8A (CN114816208A)
Priority to CN202210308205.0A (CN114879893B)
Priority to PCT/CN2017/109781 (WO2019090486A1)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP17931613.8A (EP3683665B1)
Priority to US16/754,599 (US11188225B2)
Priority to EP22193002.7A (EP4163778A1)
Publication of WO2019090486A1
Priority to US17/374,133 (US11526274B2)
Priority to US17/977,804 (US11809705B2)



Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the embodiments of the present invention relate to the field of communications technologies, and in particular, to a touch control method and apparatus.
  • Various terminals (for example, mobile phones and tablet computers) are generally configured with a touch screen as an input device, which greatly improves the user's input and operation efficiency.
  • parameters such as touch sensitivity of the touch screen and response events of different touch actions are already set when the touch screen (or terminal) is shipped from the factory.
  • In different application scenarios, users often have different requirements for parameters such as the touch sensitivity of the touch screen. For example, copying text information in a web page calls for fine-grained operations, while controlling a character in a game calls for a faster response. Clearly, fixed factory-set parameters cannot satisfy all of the user's touch requirements, which reduces the input and output efficiency of the terminal.
  • Embodiments of the present application provide a touch control method and apparatus, which can implement refined and personalized control of a touch screen and improve the input and output efficiency of the terminal.
  • In a first aspect, an embodiment of the present application provides a touch control method, including: a terminal acquires a first touch operation input by a user on a touch screen; when the first touch operation acts on a first preset area of a target interface (that is, an interface presented by a target application running in the foreground), the terminal maps the first touch operation to a second touch operation, so that the target application responds to the second touch operation and implements the application function associated with the second touch operation. That is to say, the user inputs the first touch operation on the touch screen, but according to the mapping relationship preset by the user, the target application running on the terminal ultimately responds as if the second touch operation had been performed, thereby realizing refined, customized touch control of the touch screen and improving the input and output efficiency of the terminal.
  • In a possible design, the terminal maps the first touch operation to the second touch operation as follows: when the terminal detects the first touch operation on the target interface, the terminal may search for at least one preset area (including the first preset area) associated with the target application and preset by the user; then, when the touch point of the first touch operation falls into the first preset area, the terminal may further acquire the touch mapping rule preset for the first preset area, and map the first touch operation performed by the user to the second touch operation according to the touch mapping rule.
  • In a possible design, the terminal maps the first touch operation to the second touch operation by modifying the coordinate value of the touch point in the first touch operation and using the modified coordinate value as the coordinate value of the touch point in the second touch operation. The target application can then provide the corresponding visual output to the user based on the modified coordinate value.
  • In a possible design, the touch mapping rule includes a coordinate mapping parameter. The terminal maps the first touch operation to the second touch operation according to the touch mapping rule by increasing or decreasing the coordinate value of the touch point in the first touch operation according to the coordinate mapping parameter, thereby obtaining the coordinate value of the touch point in the second touch operation. Specifically, the terminal may multiply the coordinate value of the touch point in the first touch operation by the coordinate mapping parameter, where the coordinate mapping parameter is greater than 1 or less than 1.
  • In a possible design, the method further includes: if the modified coordinate value exceeds the manipulation boundary preset for the first touch operation, the terminal uses the coordinate value on the boundary closest to the modified coordinate value as the coordinate value of the touch point in the second touch operation. This prevents the application from failing to respond correctly to the first touch operation because the modified coordinate falls outside the control area associated with it.
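  • To make the coordinate mapping concrete, the following is a minimal sketch, assuming a hypothetical CoordinateMapper helper: it scales a touch point by the coordinate mapping parameter and clamps the result to a rectangular manipulation boundary. It illustrates the technique only and is not the patented implementation.

```java
/**
 * Illustrative sketch (not the patented implementation): scales a touch
 * point by a coordinate mapping parameter and clamps the result to the
 * manipulation boundary preset for the operation.
 */
public final class CoordinateMapper {
    private final float factor;                    // coordinate mapping parameter, > 1 or < 1
    private final float left, top, right, bottom;  // preset manipulation boundary

    public CoordinateMapper(float factor, float left, float top, float right, float bottom) {
        this.factor = factor;
        this.left = left; this.top = top; this.right = right; this.bottom = bottom;
    }

    /** Maps the coordinates of the first touch operation to those of the second. */
    public float[] map(float x, float y) {
        // Multiply the original coordinate value by the coordinate mapping parameter.
        float mappedX = x * factor;
        float mappedY = y * factor;
        // If the modified value exceeds the boundary, use the closest boundary
        // coordinate instead, so the application can still respond correctly.
        mappedX = Math.max(left, Math.min(right, mappedX));
        mappedY = Math.max(top, Math.min(bottom, mappedY));
        return new float[] { mappedX, mappedY };
    }
}
```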
  • In a possible design, the terminal maps the first touch operation to the second touch operation according to the touch mapping rule by mapping the first touch event, generated when the user performs the first touch operation, to the second touch event that would be generated when the user performs the second touch operation, and reporting the second touch event to the target application. In this way, the target application can present the response effect corresponding to the second touch operation according to the second touch event, implementing a personalized response to touch operations inside the preset area.
  • In a possible design, the terminal maps the first touch operation to the second touch operation so that the target application responds to the second touch operation, specifically: the terminal reports the touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation according to the touch event; the terminal then maps the determined first touch operation to the second touch operation according to the touch mapping rule and instructs the target application to respond to the second touch operation. That is, the terminal may report the first touch event generated by the first touch operation to the target application through the normal process and then, according to the touch mapping rule, invoke the function corresponding to the second touch operation to implement the application function of the second touch operation.
  • the touch mapping rule described above may be used to indicate that a click operation is mapped to a double click operation, or to indicate that a long press operation is mapped to a continuous click operation.
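  • As a hedged illustration, the event-mapping part of such a rule can be modeled as a gesture-to-gesture table. The Gesture names below are hypothetical; only the two mappings named above (click to double click, long press to continuous click) come from the text.

```java
import java.util.EnumMap;
import java.util.Map;

/** Hypothetical model of an event-mapping rule such as "click -> double click". */
public final class EventMappingRule {
    public enum Gesture { CLICK, DOUBLE_CLICK, LONG_PRESS, REPEATED_CLICK }

    private final Map<Gesture, Gesture> table = new EnumMap<>(Gesture.class);

    public EventMappingRule() {
        // The two example mappings named in the text.
        table.put(Gesture.CLICK, Gesture.DOUBLE_CLICK);
        table.put(Gesture.LONG_PRESS, Gesture.REPEATED_CLICK);
    }

    /** Returns the gesture the target application should see. */
    public Gesture map(Gesture detected) {
        return table.getOrDefault(detected, detected); // unmapped gestures pass through
    }
}
```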
  • In a second aspect, an embodiment of the present application provides a touch control method, including: in response to a first input of a user, a terminal displays a setting interface for instructing the user to customize a touch area; in response to a second input of the user, the terminal acquires a target touch area customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch area, where the touch mapping rule is used to indicate that a first touch operation acquired in the target touch area is mapped to a second touch operation.
  • In this way, when a subsequent touch operation falls in the customized touch area, the terminal can find the corresponding target touch mapping rule to respond to the touch operation, and the user obtains a customized touch experience in the customized touch area.
  • In a possible design, the terminal acquires the target touch area customized by the user on the setting interface by: receiving a target touch area drawn by the user on the setting interface using a preset area template; or receiving K boundary points marked by the user on the setting interface and connecting the K boundary points in a specified order to form the target touch area, where K > 2.
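  • Whether a later touch point falls inside an area formed from the K boundary points can be decided with a standard ray-casting test. This is generic geometry offered for illustration, not code from the application.

```java
/** Ray-casting point-in-polygon test for a target touch area defined by K boundary points (K > 2). */
public final class PolygonArea {
    private final float[] xs, ys; // boundary points, connected in the specified order

    public PolygonArea(float[] xs, float[] ys) {
        this.xs = xs; this.ys = ys;
    }

    /** True if the touch point (px, py) lies inside the polygon. */
    public boolean contains(float px, float py) {
        boolean inside = false;
        for (int i = 0, j = xs.length - 1; i < xs.length; j = i++) {
            // Count edges crossed by a horizontal ray from the touch point.
            boolean crosses = (ys[i] > py) != (ys[j] > py);
            if (crosses && px < (xs[j] - xs[i]) * (py - ys[i]) / (ys[j] - ys[i]) + xs[i]) {
                inside = !inside;
            }
        }
        return inside;
    }
}
```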
  • In a possible design, the terminal acquires the touch mapping rule customized by the user for the target touch area on the setting interface by: receiving a coordinate mapping parameter set by the user for the target touch area, where the coordinate mapping parameter indicates the mapping rule for the touch point coordinate value when the terminal responds to the first touch operation; and/or receiving an event mapping parameter set by the user for the target touch area, where the event mapping parameter indicates the mapping rule for the touch event when the terminal responds to the first touch operation.
  • That is, the user can logically divide the touch screen of the terminal into user-defined touch areas and, within each customized area, set a touch mapping rule that matches the current application scenario and the user's own operating habits. The user can thereby achieve refined and personalized control of the terminal's touch screen, improving the input and output efficiency of the terminal in different application scenarios.
  • In a possible design, the method further includes: the terminal prompting the user with the touch effect produced when the terminal responds to a touch operation in the target touch area under the current coordinate mapping parameter.
  • In a possible design, the method further includes: the terminal receiving an effective object set by the user for the touch mapping rule, where the effective object includes at least one application and/or at least one display interface. That is to say, the touch control method provided in this application can provide a customized touch experience for different application scenarios.
  • In a possible design, the method further includes: the terminal establishing an association among the target touch area, the touch mapping rule of the target touch area, and the effective object, so that when a touch operation input by the user is subsequently received, the corresponding association can be looked up to respond to the touch operation.
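  • One way to picture the stored association, reusing the hypothetical PolygonArea and EventMappingRule sketches above: each record ties a touch area to its mapping rule and to the applications for which the rule takes effect. All names are illustrative, not taken from the application.

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical record of the association among area, rule, and effective object. */
public final class TouchAreaBinding {
    public final PolygonArea area;              // the user-defined target touch area
    public final EventMappingRule rule;         // the touch mapping rule for that area
    public final List<String> effectiveObjects; // package names or interface ids the rule applies to

    public TouchAreaBinding(PolygonArea area, EventMappingRule rule, List<String> effectiveObjects) {
        this.area = area;
        this.rule = rule;
        this.effectiveObjects = new ArrayList<>(effectiveObjects);
    }

    /** True if this binding should handle a touch at (x, y) in the given app. */
    public boolean applies(String packageName, float x, float y) {
        return effectiveObjects.contains(packageName) && area.contains(x, y);
    }
}
```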
  • In a possible design, the terminal displays the setting interface for indicating the user-defined touch area by superimposing, on the display interface of the target application running in the foreground, a translucent setting interface, intuitively giving the user the ability to configure custom touch areas for the current target application.
  • In a third aspect, an embodiment of the present application provides a terminal, including a processor, a memory, and an input device connected by a bus, where the input device is configured to acquire a first touch operation input by a user on a touch screen and send the first touch operation to the processor;
  • the processor is configured to determine that the first touch operation acts on the first preset area in the target interface, and map the first touch operation to the second touch operation, so that the target application responds to the second touch operation;
  • the target interface is any interface that is presented by the target application and covers the first preset area, and the target application runs in the foreground.
  • In a possible design, the processor maps the first touch operation to the second touch operation specifically by: searching for at least one preset area associated with the target application, the at least one preset area including the first preset area; when the touch point of the first touch operation falls into the first preset area, acquiring the touch mapping rule preset for the first preset area; and mapping the first touch operation to the second touch operation according to the touch mapping rule.
  • the processor maps the first touch operation to the second touch operation, specifically: the processor modifies the coordinate value of the touch point in the first touch operation, and uses the modified coordinate value as the first The coordinate value of the touch point in the second touch operation.
  • the touch mapping rule includes a coordinate mapping parameter
  • the processor maps the first touch operation to the second touch operation according to the touch mapping rule, specifically including: the processor increases or decreases the coordinate value of the touch point in the first touch operation according to the coordinate mapping parameter to obtain the coordinate value of the touch point in the second touch operation.
  • In a possible design, the processor increases or decreases the coordinate value of the touch point in the first touch operation according to the coordinate mapping parameter specifically by multiplying the coordinate value of the touch point in the first touch operation by the coordinate mapping parameter, where the coordinate mapping parameter is greater than 1 or less than 1.
  • In a possible design, the processor is further configured to: when the modified coordinate value exceeds the manipulation boundary preset for the first touch operation, use the coordinate value on the manipulation boundary closest to the modified coordinate value as the coordinate value of the touch point in the second touch operation.
  • In a possible design, the processor maps the first touch operation to the second touch operation according to the touch mapping rule specifically by: mapping, according to the touch mapping rule, the first touch event generated when the user performs the first touch operation to the second touch event generated when the user performs the second touch operation, and reporting the second touch event to the target application.
  • In a possible design, the processor maps the first touch operation to the second touch operation so that the target application responds to the second touch operation, specifically including: the processor reports the touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation according to the touch event; and the processor maps the determined first touch operation to the second touch operation according to the touch mapping rule and instructs the target application to respond to the second touch operation.
  • In a fourth aspect, an embodiment of the present application provides a terminal, including a processor, a memory, a display, and an input device connected by a bus, where the input device is configured to receive a first input and a second input of a user; the display is configured to display, in response to the first input, a setting interface for indicating a user-defined touch area; and the processor is configured to acquire, in response to the second input, a target touch area customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch area, the touch mapping rule being used to indicate that a first touch operation acquired in the target touch area is mapped to a second touch operation.
  • the input device is further configured to: receive a target touch area drawn by the user on the setting interface by using a preset area template; or receive K boundaries marked by the user on the setting interface. Point, the K boundary points are connected in the specified order to form the target touch area, K>2.
  • the input device is further configured to: receive a coordinate mapping parameter set by the user for the target touch area, where the coordinate mapping parameter is used to indicate that the terminal responds to the touch point coordinate value when the first touch operation is performed. Mapping rules; and/or receiving an event mapping parameter set by the user for the target touch area, the event mapping parameter is used to indicate a mapping rule of the touch event when the terminal responds to the first touch operation.
  • In a possible design, the display is further configured to prompt the user with the touch effect produced when the terminal responds to a touch operation in the target touch area under the current coordinate mapping parameter.
  • the input device is further configured to: receive an effective object set by the user for the touch mapping rule, the effective object includes at least one application and/or at least one display interface.
  • In a possible design, the processor is further configured to establish an association among the target touch area, the touch mapping rule of the target touch area, and the effective object, and to store the association in the memory.
  • the display is specifically configured to superimpose and display a translucent setting interface for indicating a user-defined touch area on a display interface of a target application that is running in the foreground.
  • In a fifth aspect, an embodiment of the present application provides a terminal, including: an acquiring unit, configured to acquire a first touch operation input by a user on a touch screen; and a mapping unit, configured to: when the first touch operation acts on a first preset area of a target interface, map the first touch operation to a second touch operation so that a target application responds to the second touch operation, where the target interface is any interface presented by the target application that covers the first preset area, and the target application runs in the foreground.
  • the mapping unit is specifically configured to: when the terminal detects the first touch operation in the target interface, search for at least one preset area associated with the target application, where the at least one preset area includes a first preset area; when the touch point of the first touch operation falls into the first preset area, acquiring a touch mapping rule preset for the first preset area; mapping the first touch operation to the first according to the touch mapping rule Two touch operations.
  • the mapping unit is specifically configured to: modify the coordinate value of the touch point in the first touch operation, and use the modified coordinate value as the coordinate value of the touch point in the second touch operation.
  • the touch mapping rule includes a coordinate mapping parameter, wherein the mapping unit is specifically configured to: increase or decrease a coordinate value of the touch point in the first touch operation according to the coordinate mapping parameter, A coordinate value of the touched point in the second touch operation is obtained.
  • the mapping unit is specifically configured to: multiply the coordinate value of the touch point in the first touch operation by the coordinate mapping parameter; the coordinate mapping parameter is greater than 1, or less than 1.
  • the mapping unit is further configured to: if the modified coordinate value exceeds a control boundary preset for the first touch operation, the manipulation boundary is closest to the modified coordinate value The coordinate value is used as the coordinate value of the touch point in the second touch operation.
  • In a possible design, the mapping unit is specifically configured to: map, according to the touch mapping rule, the first touch event generated when the user performs the first touch operation to the second touch event generated when the user performs the second touch operation, and report the second touch event to the target application.
  • In a possible design, the mapping unit is specifically configured to: report the touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation according to the touch event; and map the determined first touch operation to the second touch operation according to the touch mapping rule, instructing the target application to respond to the second touch operation.
  • an embodiment of the present application provides a terminal, including: an obtaining unit, configured to acquire a first input and a second input of a user; and a display unit, configured to display a setting interface for indicating a user-defined touch area;
  • the acquiring unit is further configured to acquire a target touch area customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch area, where the touch mapping rule is used to indicate that a first touch operation acquired in the target touch area is mapped to a second touch operation.
  • the acquiring unit is specifically configured to: receive a target touch area drawn by the user on the setting interface by using a preset area template; or receive K boundary points marked by the user on the setting interface, The K boundary points are connected in a specified order to form the target touch area, K>2.
  • In a possible design, the acquiring unit is specifically configured to: receive a coordinate mapping parameter set by the user for the target touch area, where the coordinate mapping parameter is used to indicate the mapping rule of the touch point coordinate value when the terminal responds to the first touch operation; and/or receive an event mapping parameter set by the user for the target touch area, where the event mapping parameter is used to indicate the mapping rule of the touch event when the terminal responds to the first touch operation.
  • In a possible design, the display unit is further configured to prompt the user with the touch effect produced when the terminal responds to a touch operation in the target touch area under the current coordinate mapping parameter.
  • the obtaining unit is further configured to: receive an effective object set by the user for the touch mapping rule, the effective object includes at least one application and/or at least one display interface.
  • the terminal further includes a storage unit configured to establish an association relationship between the target touch area, the touch mapping rule of the target touch area, and the effective object.
  • the display unit is specifically configured to superimpose and display a semi-transparent setting interface for indicating a user-defined touch area on a display interface of a target application that is running in the foreground.
  • An embodiment of the present application further provides a terminal, including a processor, a memory, a bus, and a communication interface, where the memory is configured to store computer-executable instructions and the processor is connected to the memory through the bus; when the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal performs any of the touch control methods described above.
  • An embodiment of the present application provides a computer-readable storage medium storing instructions that, when run on any of the foregoing terminals, cause the terminal to perform any of the foregoing touch control methods.
  • An embodiment of the present application provides a computer program product including instructions that, when run on any of the foregoing terminals, cause the terminal to perform any of the foregoing touch control methods.
  • It should be noted that the names of the components of the terminal do not limit the device itself; in actual implementation, these components may appear under other names. As long as the functions of the components are similar to those in the embodiments of the present application, they fall within the scope of the claims and their equivalents.
  • FIG. 1 is a schematic structural diagram 1 of a terminal according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram 1 of an application scenario of a touch control method according to an embodiment of the present application
  • FIG. 3 is a schematic structural diagram 1 of an Android system according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic structural diagram 2 of an Android system according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic flowchart 1 of a touch control method according to an embodiment of the present disclosure
  • FIG. 6 is a second schematic diagram of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 7A is a schematic diagram 3 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 7B is a schematic diagram 4 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram 5 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram 6 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram 7 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram 8 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram 9 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram 10 of an application scenario of a touch control method according to an embodiment of the present disclosure
  • FIG. 14 is a schematic diagram 11 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 15 is a schematic diagram 12 of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 16 is a schematic diagram of interaction of a touch control method according to an embodiment of the present application.
  • FIG. 17 is a second schematic flowchart of a touch control method according to an embodiment of the present disclosure.
  • FIG. 18 is a schematic diagram of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 19 is a schematic diagram of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 20 is a schematic diagram of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 21 is a schematic diagram of an application scenario of a touch control method according to an embodiment of the present application.
  • FIG. 22 is a schematic diagram of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 23 is a schematic diagram of an application scenario of a touch control method according to an embodiment of the present disclosure.
  • FIG. 24 is a schematic structural diagram 2 of a terminal according to an embodiment of the present disclosure.
  • FIG. 25 is a schematic structural diagram 3 of a terminal according to an embodiment of the present application.
  • first and second are used for descriptive purposes only, and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, features defining “first” and “second” may include one or more of the features either explicitly or implicitly. In the description of the embodiments of the present application, “multiple” means two or more unless otherwise stated.
  • A touch control method provided by an embodiment of the present application can be applied to any terminal having a touch screen, such as a mobile phone, a wearable device, an augmented reality (AR) device, a virtual reality (VR) device, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • The terminal in the embodiment of the present application may be the mobile phone 100. The embodiment is described in detail below by taking the mobile phone 100 as an example. It should be understood that the illustrated mobile phone 100 is only one example of the above terminal; the mobile phone 100 may have more or fewer components than shown in the figure, may combine two or more components, or may have a different configuration of components.
  • The mobile phone 100 may specifically include: a processor 101, a radio frequency (RF) circuit 102, a memory 103, a touch screen 104, a Bluetooth device 105, one or more sensors 106, a wireless fidelity (Wi-Fi) device 107, a positioning device 108, an audio circuit 109, a peripheral interface 110, a power system 111, and other components.
  • These components can communicate over one or more communication buses or signal lines (not shown in FIG. 1). Those skilled in the art will understand that the hardware structure shown in FIG. 1 does not constitute a limitation on the mobile phone: the mobile phone 100 may include more or fewer components than illustrated, may combine some components, or may arrange the components differently.
  • The processor 101 is the control center of the mobile phone 100. It connects the various parts of the mobile phone 100 by using various interfaces and lines, and performs the various functions of the mobile phone 100 and processes data by running or executing applications stored in the memory 103 and invoking data stored in the memory 103.
  • the processor 101 may include one or more processing units; for example, the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd.
  • In some embodiments, the processor 101 may further include a fingerprint verification chip for verifying collected fingerprints.
  • The radio frequency circuit 102 can be used to receive and send wireless signals during the sending and receiving of information or during a call. In particular, the radio frequency circuit 102 can receive downlink data from a base station and deliver it to the processor 101 for processing, and can also send uplink data to the base station.
  • radio frequency circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency circuit 102 can also communicate with other devices through wireless communication.
  • the wireless communication can use any communication standard or protocol, including but not limited to global mobile communication systems, general packet radio services, code division multiple access, wideband code division multiple access, long term evolution, email, short message service, and the like.
  • the memory 103 is used to store applications and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running applications and data stored in the memory 103.
  • The memory 103 mainly includes a program storage area and a data storage area. The program storage area can store an operating system and the applications required for at least one function (such as a sound playing function or an image playing function); the data storage area can store data created when the mobile phone 100 is used (such as audio data and a phone book).
  • the memory 103 may include a high speed random access memory (RAM), and may also include a nonvolatile memory such as a magnetic disk storage device, a flash memory device, or other volatile solid state storage device.
  • In addition, the memory 103 can store various operating systems, for example, the iOS operating system developed by Apple Inc. and the Android operating system developed by Google Inc.
  • the above memory 103 may be independent and connected to the processor 101 via the above communication bus; the memory 103 may also be integrated with the processor 101.
  • the touch screen 104 may specifically include a touch panel 104-1 and a display 104-2.
  • The touch panel 104-1 can collect touch operations performed by the user of the mobile phone 100 on or near it (for example, operations performed by the user on or near the touch panel 104-1 using a finger or a stylus) and send the collected touch information to another device (for example, the processor 101).
  • A touch operation performed by the user near the touch panel 104-1 may be referred to as a hovering touch: the user does not need to directly touch the touch panel to select, move, or drag a target (for example, an icon), but only needs to be near the terminal to perform the desired function.
  • the touch panel 104-1 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • The display (also referred to as a display screen) 104-2 can be used to display information entered by the user or provided to the user, as well as the various menus of the mobile phone 100.
  • the display 104-2 can be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • The touch panel 104-1 may be overlaid on the display 104-2. When the touch panel 104-1 detects a touch operation on or near it, it transmits the operation to the processor 101 to determine the type of the touch operation, and the processor 101 then provides a corresponding visual output on the display 104-2 depending on that type.
  • Although in FIG. 1 the touch panel 104-1 and the display 104-2 are shown as two separate components implementing the input and output functions of the mobile phone 100, in some embodiments the touch panel 104-1 and the display screen 104-2 may be integrated to implement the input and output functions of the mobile phone 100.
  • In the embodiment of the present application, the user may set touch mapping rules for different touch areas of the touch screen 104 in different application scenarios. For example, as shown in (a) of FIG. 2, when application A is running, the touch sensitivity of the rectangular touch area 21a at the center of the touch screen 104 may be set to twice that of other areas; or, as shown in (b) of FIG. 2, when application B is running, the response events of touch actions (for example, click and long press) in the touch area 22b may be customized.
  • In this way, a user-defined touch area is obtained, in which the user can set touch mapping rules that match the current application scenario and the user's own operating habits, so that the user subsequently obtains a customized touch experience in the customized touch area. This realizes refined and personalized control of the touch screen 104 and provides a richer touch experience for a terminal equipped with the touch screen 104.
  • The touch sensitivity may reflect the ratio of the distance a displayed object moves when the terminal responds to a touch operation on the touch screen 104 to the actual sliding distance of the finger on the touch screen 104 during that operation: the larger the ratio, the higher the touch sensitivity.
  • For fine-grained operations, such as copying text, a lower touch sensitivity can improve accuracy; for real-time operations, such as attacking or running in a game, a higher touch sensitivity increases the speed of these operations and improves the user experience.
  • The response event of a touch action refers to the specific touch operation to which the mobile phone 100 maps the touch events generated when the user inputs that touch action at a specific position on the touch screen 104. For example, when the user performs a click at point C, the mobile phone 100 generates two touch events at point C: an action down event and an action up event. By calling a preset library function of the mobile phone 100, the application can determine that these two touch events correspond to a click operation, and then respond to the click operation to implement the application function bound to a click at point C.
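  • A rough sketch of how such a library function can interpret a down/up pair at point C as a click: if the finger lifts quickly and without moving far, the pair is reported as a click operation. The thresholds below are illustrative, not values from the application.

```java
/** Illustrative click detector: an action-down followed quickly by an action-up is a click. */
public final class ClickDetector {
    private static final long MAX_CLICK_MS = 300;   // illustrative time threshold
    private static final float MAX_SLOP_PX = 20f;   // illustrative movement tolerance

    private long downTime;
    private float downX, downY;

    /** Records the action down event at point C. */
    public void onActionDown(long timeMs, float x, float y) {
        downTime = timeMs; downX = x; downY = y;
    }

    /** Returns true if the down/up pair should be treated as a click operation. */
    public boolean onActionUp(long timeMs, float x, float y) {
        boolean quick = timeMs - downTime <= MAX_CLICK_MS;
        boolean still = Math.hypot(x - downX, y - downY) <= MAX_SLOP_PX;
        return quick && still;
    }
}
```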
  • It should be noted that, in addition to the touch sensitivity and the response events of touch actions described above, the touch mapping rules may include other parameters for adapting to the user's touch habits, such as touch precision, touch pressure sensitivity, and touch duration; this is not limited in the embodiment of the present application.
  • It can be understood that the touch screen 104 is formed by stacking multiple layers of material; in the embodiment of the present application, only the touch panel (layer) and the display screen (layer) are shown, and the other layers are not described.
  • In addition, the touch panel 104-1 may be disposed on the front surface of the mobile phone 100 in the form of a full panel, and the display screen 104-2 may likewise be disposed on the front surface of the mobile phone 100 in the form of a full panel, so that a bezel-less front structure can be achieved.
  • the mobile phone 100 can also include a Bluetooth device 105 for enabling data exchange between the handset 100 and other short-range terminals (eg, mobile phones, smart watches, etc.).
  • the Bluetooth device in the embodiment of the present application may be an integrated circuit or a Bluetooth chip or the like.
  • the handset 100 can also include at least one sensor 106, such as a fingerprint acquisition device 112, a light sensor, a motion sensor, and other sensors.
  • the fingerprint collection device 112 can be configured on the back of the handset 100 (eg, below the rear camera) or on the front side of the handset 100 (eg, below the touch screen 104).
  • the fingerprint collection device 112 can be configured in the touch screen 104 to implement the fingerprint recognition function, that is, the fingerprint collection device 112 can be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 100;
  • the light sensor can include an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display of the touch screen 104 according to the brightness of the ambient light, and the proximity sensor can turn off the power of the display when the mobile phone 100 moves to the ear.
  • The accelerometer sensor can detect the magnitude of acceleration in all directions (usually on three axes) and, when stationary, can detect the magnitude and direction of gravity. It can be used for applications that identify the posture of the mobile phone (such as switching between landscape and portrait modes, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tap detection). Other sensors that can be configured in the mobile phone 100, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described here.
  • The Wi-Fi device 107 is configured to provide the mobile phone 100 with network access complying with Wi-Fi-related standard protocols. The mobile phone 100 can connect to a Wi-Fi access point through the Wi-Fi device 107 to help the user send and receive e-mails, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access.
  • the Wi-Fi device 107 can also function as a Wi-Fi wireless access point, and can provide Wi-Fi network access to other terminals.
  • The positioning device 108 is configured to provide a geographic location for the mobile phone 100. It can be understood that the positioning device 108 can specifically be a receiver of a positioning system such as the global positioning system (GPS), the BeiDou navigation satellite system, or the Russian GLONASS. After receiving the geographic location sent by the positioning system, the positioning device 108 sends the information to the processor 101 for processing, or sends it to the memory 103 for storage. In still other embodiments, the positioning device 108 may also be a receiver of an assisted global positioning system (AGPS), which assists the positioning device 108 in performing ranging and positioning services by acting as an assistance server.
  • In this case, the assistance server provides positioning assistance by communicating, over a wireless communication network, with the positioning device 108 (that is, the GPS receiver) of the mobile phone 100.
  • In still other embodiments, the positioning device 108 may also use positioning technology based on Wi-Fi access points. Since every Wi-Fi access point has a globally unique media access control (MAC) address, the terminal can scan and collect the broadcast signals of surrounding Wi-Fi access points when Wi-Fi is turned on, and thus obtain the MAC addresses they broadcast. The terminal sends data identifying the Wi-Fi access points (such as the MAC addresses) to a location server over the wireless communication network; the location server retrieves the geographic location of each Wi-Fi access point, computes the geographic location of the terminal in combination with the strength of the Wi-Fi broadcast signals, and sends it to the positioning device 108 of the terminal.
  • the audio circuit 109, the speaker 113, and the microphone 114 can provide an audio interface between the user and the handset 100.
  • The audio circuit 109 can convert received audio data into an electrical signal and transmit it to the speaker 113, which converts it into a sound signal for output; conversely, the microphone 114 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data. The audio data is then output to the RF circuit 102 to be sent to, for example, another mobile phone, or output to the memory 103 for further processing.
  • The peripheral interface 110 is used to provide various interfaces for external input/output devices (such as a keyboard, a mouse, an external display, external memory, or a subscriber identity module card). For example, a mouse is connected through a universal serial bus (USB) interface, and a subscriber identity module (SIM) card provided by a telecommunications operator is connected through metal contacts in the SIM card slot. The peripheral interface 110 can be used to couple the external input/output peripherals described above to the processor 101 and the memory 103.
  • The mobile phone 100 may further include a power supply apparatus 111 (such as a battery and a power management chip) that supplies power to the components. The battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power-consumption management are implemented through the power supply apparatus 111.
  • The mobile phone 100 may further include a camera (a front camera and/or a rear camera), a flash, a micro projection device, a near field communication (NFC) device, and the like, which are not described herein.
  • The mobile phone 100 can run an operating system such as Android or iOS. The following description takes the Android operating system as an example.
  • The Android operating system can be divided into four layers, which are, from top to bottom: the application layer 201 (that is, the APP layer), the application framework layer 202 (that is, the Framework layer), the system runtime layer 203 (that is, the Libraries layer), and the Linux kernel layer 204.
  • the Linux kernel layer 204 can be used to control the security, memory management, process management, network stack, and driver model of the mobile phone 100.
  • The Linux kernel layer 204 also serves as an abstraction layer between the hardware (for example, the CPU, the network card, and the memory) and the software stack: it hides the specific hardware details and provides unified services for the upper layers (the system runtime layer 203, the application framework layer 202, and the application layer 201).
  • The system runtime layer 203 includes some C/C++ libraries, for example, a media library, a system C library, and a display management library (Surface Manager). These libraries can be used by different components of the Android system, and the system runtime layer 203 provides services to developers through the Framework layer 202.
  • The Framework layer 202 provides developers with an API framework giving full access to the facilities used by applications. Specifically, the Framework layer 202 provides a very large number of APIs for developing applications, and an APP that satisfies related business needs can be constructed by calling the corresponding APIs.
  • The application layer 201 mainly includes APPs written in the Java language. When the user operates an operation interface in an APP, the APP interacts with the system runtime layer 203 or the Linux kernel layer 204 by calling the relevant APIs in the Framework layer 202, to implement the function corresponding to that operation interface.
  • For example, when an APP running in the application layer 201 acquires a touch operation input by the user on the touch screen 104, the message is distributed layer by layer from the bottom up. Specifically, when the user's finger touches the touch screen 104 at the hardware layer, the touch screen 104 collects information about the touch operation (for example, the coordinates of the touch point) and, through the corresponding driver and in the form of an interrupt, reports the original touch event generated by the touch action to the Linux kernel layer 204.
  • The Framework layer 202 includes an event bus layer 202a that communicates with the lower layers and an input read distribution layer 202b that communicates with the upper layers. After obtaining the original touch event, the Linux kernel layer 204 encapsulates it (for example, performs coordinate encapsulation) into an advanced touch event that the upper layers can read (for example, an action down event, an action move event, or an action up event), and sends the advanced touch event to the event bus layer 202a, which then distributes it to the input read distribution layer 202b.
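  • At the application layer, these advanced touch events surface through the standard Android MotionEvent API. A minimal listener that distinguishes the three event types might look like the following; this is ordinary Android SDK usage shown for orientation only, not the mechanism claimed by the application.

```java
import android.view.MotionEvent;
import android.view.View;

public final class TouchEventDemo {
    /** Attaches a listener that sees the action down/move/up events distributed by the Framework. */
    public static void attach(View view) {
        view.setOnTouchListener((v, event) -> {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:   // action down event: finger touched the screen
                case MotionEvent.ACTION_MOVE:   // action move event: finger moved
                case MotionEvent.ACTION_UP:     // action up event: finger lifted
                    return true;                // consume the advanced touch event
                default:
                    return false;
            }
        });
    }
}
```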
  • Further, the application process of application A can call a C/C++ library function in the system runtime layer 203 to determine the operation corresponding to the advanced touch event, for example, a click operation. The library function in the system runtime layer 203 can then call back the callback function that application A registered in advance for the click operation; the callback function specifies what application A does in response to the user's click operation. For example, the callback function may be an onclick function, in which case application A executes the callback function bound to a click at the touch point position; for instance, the onclick function that application A registered for a click at that touch point may implement a video playback function.
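  • The callback registration the text alludes to follows the standard Android pattern, for example as below; this is plain SDK usage with a hypothetical startPlayback helper, not code from the application.

```java
import android.view.View;

public final class VideoCardBinder {
    /** Registers the onclick callback that application A would run for a click on this view. */
    public static void bind(View playButton) {
        playButton.setOnClickListener(v -> {
            // Hypothetical application function bound to the click operation:
            // start video playback once the library maps down + up to a click.
            startPlayback();
        });
    }

    private static void startPlayback() { /* hypothetical playback entry point */ }
}
```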
  • Correspondingly, when application A in the application layer 201 implements the function indicated by the callback function described above, the control instruction is delivered from the upper layer to the bottom layer, layer by layer, and is finally executed by the relevant hardware. For example, after the application process of application A determines, based on the touch operation reported from below, that the video playback function needs to be implemented, the generated video playback instruction is passed layer by layer through the input read distribution layer 202b and the event bus layer 202a in the Framework layer 202, is then sent by the event bus layer 202a to the Linux kernel layer 204, and the Linux kernel layer 204 finally drives the processor, the memory, and the touch screen 104 to produce the video playback output.
In this embodiment of the present application, because the user has pre-defined, on the touch screen 104, a touch mapping rule for a certain touch area used while application A runs, after the Linux kernel layer 204 obtains the touch operation reported by the touch screen 104, the Linux kernel layer 204 (or the Framework layer 202) of the terminal determines whether the location of the touch points of the touch operation falls within the user-defined touch area. If it falls within the user-defined touch area, the terminal can modify the related information carried in the touch operation according to the touch mapping rule set by the user. For example, if the user has defined that a click operation in touch area 1 is mapped to a double-click operation, then, when the touch operation reported by the touch screen 104 is determined by the terminal to be a click operation falling within touch area 1, the terminal may modify the response event of the touch action from a click operation to a double-click operation, and then call back, to the APP running in the application layer 201, the callback function corresponding to the double-click operation, to achieve the touch effect of the double-click operation. This realizes refined and customized touch control of the touch screen 104 and improves the input efficiency of the terminal.
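The interception described above can be pictured as a small dispatch step inserted before events reach the APP. The following is a minimal sketch of that step, assuming a simplified event representation; names such as TouchInterceptor, CustomArea, and TouchEvent are illustrative only, not actual Android framework APIs.

    import java.util.List;

    // Illustrative sketch of the interception step: before an advanced touch
    // event is dispatched to the APP, check whether its touch point falls
    // inside a user-defined touch area and, if so, rewrite the event
    // according to that area's touch mapping rule.
    final class TouchInterceptor {

        // A user-defined touch area paired with its touch mapping rule.
        interface CustomArea {
            boolean contains(float x, float y);     // hit test for the area
            TouchEvent apply(TouchEvent original);  // e.g. click -> double-click
        }

        // Simplified stand-in for an advanced touch event (action + coordinates).
        record TouchEvent(String action, float x, float y) {}

        private final List<CustomArea> areasForForegroundApp;

        TouchInterceptor(List<CustomArea> areasForForegroundApp) {
            this.areasForForegroundApp = areasForForegroundApp;
        }

        // Called for every advanced touch event before it reaches the APP.
        TouchEvent intercept(TouchEvent event) {
            for (CustomArea area : areasForForegroundApp) {
                if (area.contains(event.x(), event.y())) {
                    // Touch point falls inside a user-defined area: rewrite the
                    // event according to the user-set touch mapping rule.
                    return area.apply(event);
                }
            }
            return event; // outside all custom areas: dispatch unchanged
        }
    }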
The touch control method provided in the embodiments of the present application is described in detail below with reference to specific embodiments. As shown in FIG. 5, the method includes:
S501. The terminal acquires a first input of the user while a target application is running, where the first input is used to trigger the terminal to enter a setting interface for customizing a touch area.
The target application may be any application installed on the terminal, such as a video application, a game application, or a communication application; this is not limited in the embodiments of the present application.
Taking game application A as an example, while the terminal runs application A, a control for customizing the touch area can be displayed in the display interface of application A. As shown in FIG. 6, a control 600 can be displayed in the login interface of application A to prompt the user that, when running application A, the touch mapping rules of different touch areas can be customized, so as to improve the input and response efficiency while application A is running. Then, when it is detected that the user clicks the control 600, the first input of the user is obtained.
Alternatively, as shown in FIG. 7A, an option 700 of "customize touch mapping rules" may be set in the setting interface of application A. After the user selects the option 700, the user may click the "modify rules" option to customize different touch areas and the touch mapping rules of those areas. In this case, when it is detected that the user clicks the "modify rules" option, the first input of the user is obtained.
Certainly, an entry for setting customized touch areas and touch mapping rules for different applications can also be provided in the setting interface of the terminal's operating system. As shown in FIG. 7B, the terminal provides an option 701 of "customized touch" in its setting interface. After enabling the option 701, the user can set customized touch areas and touch mapping rules for different applications (for example, application A). Taking application A as an example, after the user selects application A as the application for which customized touch takes effect, as shown in FIG. 7B, the touch areas that have already been established (for example, touch area 1 and touch area 2 in FIG. 7B) can be modified: after the user clicks the button 702 of touch area 1, the size and location of touch area 1 and the touch mapping rule of touch area 1 can be modified. Of course, the user can also click the button 703 for adding a customized touch area to create a new touch area and touch mapping rule; this is not limited in this embodiment of the present application.
Certainly, the user may also input, by voice or other means, the first input used to start customizing the touch mapping rules of the touch screen; this is not limited in the embodiments of the present application.
S502. The terminal displays a semi-transparent setting interface in the display interface of the target application.
In response to the first input of the user, in step S502 the terminal may overlay a semi-transparent layer on the display interface of the current target application as the setting interface displayed on the touch screen of the terminal. In this case, as shown in FIG. 8, the terminal can prompt the user to draw a customized target touch area on the setting interface 800; the user can freely define the target touch area that he or she needs, and can set, within the customized target touch area, the touch mapping rule that takes effect for that area, to improve the input and output performance of the target application while it is running.
S503. The terminal acquires a second input of the user in the setting interface, where the second input includes a target touch area customized by the user on the touch screen and a target touch mapping rule set for the target touch area.
In some embodiments of the present application, still as shown in FIG. 8, the user may draw, at an arbitrary position on the setting interface 800 and using an area template 802 preset by the terminal (for example, a rectangular template, a triangular template, or a circular template), a target touch area 801 of a certain size. The terminal can then record the specific position and size of the target touch area 801 on the touch screen through the plane-geometry function of the area template (for example, a rectangular area function or a circular area function). For example, as shown in FIG. 8, the target touch area 801 can be expressed as Area1 = f(p, r), where p represents the coordinates of the circle center and r represents the radius of the circle.
In other embodiments of the present application, as shown in FIG. 9, the user may instead mark boundary points of the target touch area on the setting interface 800 in a certain order (for example, clockwise or counterclockwise); the lines connecting these boundary points form a target touch area 901. The terminal can record the specific position and size of the target touch area 901 on the touch screen through the coordinates of these boundary points. For example, still as shown in FIG. 9, the target touch area 901 can be expressed as Area2{A, B, C, D, E}, where A, B, C, D, and E are the coordinates of the five boundary points of the target touch area 901 in clockwise order.
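For an area defined by boundary points like Area2{A, B, C, D, E}, the terminal must later decide whether a touch point falls inside the polygon those points enclose. The sketch below uses the standard even-odd ray-casting test for that purpose; the document does not say which hit test the terminal actually uses, so this is only one plausible implementation.

    // A sketch of how a boundary-point area such as Area2{A, B, C, D, E}
    // could be stored and hit-tested.
    final class PolygonArea {
        private final float[] xs; // boundary-point abscissas, in drawing order
        private final float[] ys; // boundary-point ordinates, in drawing order

        PolygonArea(float[] xs, float[] ys) {
            if (xs.length != ys.length || xs.length < 3) {
                throw new IllegalArgumentException("need at least 3 boundary points");
            }
            this.xs = xs.clone();
            this.ys = ys.clone();
        }

        // Returns true if touch point (px, py) falls inside the polygon,
        // using the even-odd ray-casting rule.
        boolean contains(float px, float py) {
            boolean inside = false;
            for (int i = 0, j = xs.length - 1; i < xs.length; j = i++) {
                boolean crosses = (ys[i] > py) != (ys[j] > py);
                if (crosses) {
                    float xAtPy = xs[j] + (py - ys[j]) * (xs[i] - xs[j]) / (ys[i] - ys[j]);
                    if (px < xAtPy) {
                        inside = !inside;
                    }
                }
            }
            return inside;
        }
    }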
Further, after the user has customized the target touch area on the setting interface 800, the touch mapping rule of the target touch area can then be set. For example, as shown in FIG. 10, the user sets a circular area located in the lower left corner of game application A as the target touch area 801. After recording the position and size of the target touch area 801, the terminal may further prompt the user to modify the touch mapping rule of the target touch area 801, for example, the touch sensitivity 1001 of the target touch area 801 and the response events 1002 of touch actions.
The terminal may display the touch sensitivity 1001 in the form of a progress bar in the current setting interface, and the user can change the progress of the progress bar by a drag operation, thereby modifying the touch sensitivity of the target touch area 801.
Still as shown in FIG. 10, the progress bar of the touch sensitivity 1001 takes a value interval of -100 to 100 as an example. When the user sets the touch sensitivity 1001 to 0, the touch sensitivity of the target touch area 801 does not need to be modified; that is, the terminal uses its default touch sensitivity when responding to the user's touch operations in the target touch area 801. In other words, if the terminal (or the target application) pre-defines that each time the user slides 1 cm on the touch screen the display object corresponding to the operation moves 1 meter, then, when the user sets the touch sensitivity 1001 to 0, each time the user slides 1 cm on the touch screen the terminal still controls the corresponding display object to move 1 meter in response to the operation.
When the user sets the touch sensitivity 1001 to be greater than 0, it indicates that the user wants the touch sensitivity within the target touch area 801 to be higher than the current default value. Taking the value 100 of the touch sensitivity 1001 as an example, each time the user performs a touch action moving 1 cm within the target touch area 801, the terminal can control the corresponding display object to move 2 meters in response to the touch action; that is, the terminal responds to the user's touch actions within the target touch area 801 with twice the distance of the default touch sensitivity. Exemplarily, as shown in FIG. 11, when the user moves from point A(0, 0) to point B(1, 1) within the target touch area 801, taking the user-set touch sensitivity of 100 as an example, the terminal may multiply the abscissas and ordinates of points A and B by 2 to obtain A(0, 0) and B'(2, 2), and report the modified coordinate points to the target application, so that the target application considers that the user has moved from A(0, 0) to B'(2, 2), thereby responding to the user's current touch action with 2 times the distance.
Correspondingly, when the user sets the touch sensitivity 1001 to be less than 0, it indicates that the user wants the touch sensitivity within the target touch area 801 to be lower than the current default value. Taking the value -100 of the touch sensitivity 1001 as an example, each time the user performs a touch action moving 1 cm within the target touch area 801, the terminal can control the corresponding display object to move 0.5 meters in response to the touch action; that is, the terminal responds to the user's touch actions within the target touch area 801 with 1/2 the distance of the default touch sensitivity. Still as shown in FIG. 11, when the user moves from point A(0, 0) to point B(1, 1) within the target touch area 801, taking the user-set touch sensitivity of -100 as an example, the terminal may multiply the abscissas and ordinates of points A and B by 0.5 to obtain A(0, 0) and B"(0.5, 0.5), and report the modified coordinate points to the target application, so that the target application considers that the user has moved from A(0, 0) to B"(0.5, 0.5), thereby responding to the user's current touch action with 1/2 the distance.
In this way, still taking FIG. 10 as an example, since the target touch area 801 in the lower left corner of game application A is generally used to control the moving direction and moving distance of the game character, when the user raises the touch sensitivity within the target touch area 801, the game character can be controlled to move to a farther position by a touch operation with a shorter moving distance, thereby increasing the moving speed of the game character. This brings the user a better game experience and also increases the efficiency of the input and output operations of the terminal when running application A.
Optionally, to give the user a quick understanding of the touch sensitivity as a touch mapping rule, when the user adjusts the value of the touch sensitivity 1001 to different positions on the progress bar, the terminal may correspondingly prompt the user with the concrete meaning of the current touch sensitivity. As shown in FIG. 12, when the user sets the value of the touch sensitivity 1001 to 80, the terminal can prompt the user through a floating window 1101 that the moving speed of the game character will be accelerated by a factor of 1.8.
At this point, the terminal may use the touch sensitivity of 80 set by the user as the coordinate mapping parameter in the touch mapping rule; alternatively, the 1.8-times magnification corresponding to the touch sensitivity of 80 may be used as the coordinate mapping parameter in the touch mapping rule. Certainly, when the touch sensitivity set by the user is less than 0, the reduction ratio corresponding to the current touch sensitivity can be used as the coordinate mapping parameter in the touch mapping rule. Subsequently, when the terminal detects that the user inputs a first touch operation in the target touch area, the terminal may increase or decrease the coordinate values of the coordinate points of the first touch operation according to the coordinate mapping parameter, thereby mapping the first touch operation to a second touch operation.
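The relationship between the slider value and the magnification can be captured in a small helper. The sketch below assumes a linear interpolation between the values the document fixes (0 maps to 1.0x, 80 to 1.8x, 100 to 2.0x, and -100 to 0.5x); the exact curve between those points is an assumption, not taken from the text.

    // Convert a user-set sensitivity value into a coordinate mapping
    // parameter and apply it to a touch point.
    final class SensitivityMapper {

        // Convert a sensitivity value in [-100, 100] into a scale factor.
        static float toScaleFactor(int sensitivity) {
            if (sensitivity >= 0) {
                return 1.0f + sensitivity / 100.0f;  // 0 -> 1.0, 80 -> 1.8, 100 -> 2.0
            }
            return 1.0f + sensitivity / 200.0f;      // -100 -> 0.5 (assumed linear)
        }

        // Map the touch point of the first touch operation to the touch point
        // of the second touch operation, as in FIG. 11: A(0,0)-B(1,1) becomes
        // A(0,0)-B'(2,2) at sensitivity 100, or A(0,0)-B"(0.5,0.5) at -100.
        static float[] mapPoint(float x, float y, int sensitivity) {
            float k = toScaleFactor(sensitivity);
            return new float[] { x * k, y * k };
        }

        public static void main(String[] args) {
            float[] b1 = mapPoint(1.0f, 1.0f, 100);   // -> (2.0, 2.0)
            float[] b2 = mapPoint(1.0f, 1.0f, -100);  // -> (0.5, 0.5)
            System.out.printf("B'=(%.1f, %.1f), B\"=(%.1f, %.1f)%n",
                    b1[0], b1[1], b2[0], b2[1]);
        }
    }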
In other words, the terminal can provide the user with the customization function for the target touch area in the form of touch sensitivity, while internally saving the user-defined touch sensitivity in the form of a coordinate mapping parameter, so that the customized touch sensitivity can subsequently be implemented according to that coordinate mapping parameter.
It can be understood that the user can also customize multiple target touch areas in the setting interface displayed in step S502 and set the touch mapping rule of each target touch area.
Still taking game application A as an example, as shown in FIG. 13, the user customizes two target touch areas on the setting interface: one is a circular area in the lower left corner of the touch screen (that is, target touch area 1), and the other is a rectangular area in the lower right corner of the touch screen (that is, target touch area 2).
For target touch area 1, the user sets the value of the touch sensitivity in its touch mapping rule to 80, thereby increasing the moving speed of the game character in application A.
For target touch area 2, since this area is generally used in application A to implement various attack operations in the game, these operations are generally defined when application A is released or when the terminal is shipped from the factory; for example, double-clicking the attack button launches an attack. In use, however, performing a double-click operation is laborious, and the user may wish a click operation to achieve the attack effect of a double-click; likewise, some game applications use the frequency of continuous click operations to determine the input value of a certain function, but continuous clicking is laborious, and the user may wish to achieve the effect of continuous clicks through a long press.
For this, in this embodiment of the present application, the user can also customize the response events of touch actions within a target touch area. Still as shown in FIG. 13, the user may select, within target touch area 2, the options "map the click operation to a double-click operation" and "map the long-press operation to continuous click operations". The terminal saves the response events of the touch actions selected by the user; subsequently, when the terminal receives a click operation input by the user within target touch area 2, the terminal can map the click operation to a double-click operation according to the touch mapping rule set by the user in advance for target touch area 2, so as to achieve the effect of the double-click operation, while improving the efficiency of the input and output operations of the terminal when running application A.
In addition, the terminal can also provide the user with more detailed setting options for the touch mapping rule. For example, when the user sets the click operation to be mapped to a double-click operation, the time interval of the double-click operation can further be set; when the user sets the long-press operation to be mapped to continuous click operations, the duration threshold of the long-press operation (that is, how long a touch must last before it is mapped to continuous clicks) and the time interval between the adjacent click operations mapped from the long-press operation can be set. Such parameters make the subsequent touch experience of the user in the operation interface of application A better match his or her own operating habits. One possible shape for these settings is sketched below.
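Collected together, these finer-grained settings amount to a small rule object. The field names and default values in the sketch are illustrative assumptions, since the document does not define a concrete format.

    // One possible container for the event mapping settings of a target
    // touch area; all names and defaults here are hypothetical.
    final class EventMappingRule {
        boolean mapClickToDoubleClick = true;   // "map click to double-click"
        long doubleClickIntervalMs = 100;       // interval between the two synthesized clicks
        boolean mapLongPressToClicks = true;    // "map long press to continuous clicks"
        long longPressThresholdMs = 500;        // press longer than this maps to clicks
        long syntheticClickIntervalMs = 80;     // interval between adjacent synthesized clicks
    }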
Further, as shown in FIG. 14, after the terminal receives the touch mapping rule set by the user for a certain target touch area (for example, target touch area 1), it may further prompt the user to set the object for which that target touch area and its touch mapping rule take effect. For example, the touch mapping rule that raises the touch sensitivity to 80 in target touch area 1 may be set to take effect for all interfaces during the running of application A, or the touch mapping rule may be set to take effect for only one or more interfaces of application A, for example, only in the battle interface of a battle scenario. Because each application can distinguish its different interfaces at run time by interface identifiers (for example, the class names of Activities in the Android system), the terminal can associate the touch mapping rule with the Activity class names of the one or more interfaces set by the user.
For example, the terminal may pre-store the correspondence between the different types of display interfaces in application A and their Activity class names. For example, if the setting interface in application A includes Activity1 and Activity2, and the battle interface in application A includes Activity3 and Activity4, then, when the user sets the effective object of the touch mapping rule to the battle interface, the terminal may associate the touch mapping rule with Activity3 and Activity4. Subsequently, when the terminal recognizes that a touch operation input by the user occurs on one of the associated interfaces, the touch mapping rule can be used in target touch area 1 to respond to the user's touch action. Alternatively, the user can manually enter each display interface and set touch mapping rules that are valid only for the current interface; this is not limited in the embodiments of the present application.
Certainly, because identifiers such as the package name or process ID used by different applications at run time are generally also different, the user can also set the above touch mapping rule to take effect for one or more other applications. In this case, the terminal can associate the touch mapping rule with the identifiers of the one or more applications set by the user, so that when the terminal subsequently runs these one or more applications, the touch mapping rule can also be used in target touch area 1 to respond to the user's touch actions.
It should be noted that, when the user customizes multiple target touch areas in the setting interface, overlapping may occur between the target touch areas. As shown in FIG. 15, the user first customizes target touch area 3 and the touch mapping rule A of target touch area 3 in the setting interface, and then customizes target touch area 4 in the setting interface, where an overlap area exists between target touch area 4 and target touch area 3. In this case, when the user sets the touch mapping rule of target touch area 4, for example, sets its touch sensitivity to 30, the terminal may display an error prompt to the user; or, as shown in FIG. 15, the terminal may confirm with the user again whether to change the touch sensitivity previously set for target touch area 3. If the user confirms changing the touch sensitivity of target touch area 3, the terminal can set the values of the touch sensitivity of target touch area 3 and target touch area 4 to 30 at one time.
After obtaining the target touch area customized by the user and the target touch mapping rule of the target touch area, the terminal may continue to perform the following step S504, that is, save each user-defined customization with a certain data structure.
S504. The terminal saves the correspondence among the target touch area, the target touch mapping rule of the target touch area, and the target application.
Specifically, the terminal may use a preset data structure to save in memory the correspondence among the target touch area, the target touch mapping rule of the target touch area, and the application (or interface) for which the target touch mapping rule is valid, so that when a touch operation input by the user is subsequently received, the corresponding target touch mapping rule can be found to respond to the touch operation. For example, taking a target touch area as the granularity, the terminal may set up one profile file for each target touch area and associate the profile file with the corresponding one or more applications (or interfaces); each profile file records the location of the corresponding target touch area on the touch screen and the target touch mapping rule of that target touch area.
For example, the user inputs, in the setting interface of game application A, the customized target touch area Area1 and the touch sensitivity of Area1 to the terminal (that is, the second input in step S503 above). In response to the second input, the terminal generates profile1, which includes the coordinate information of Area1 and the target touch mapping rule set by the user for Area1. In the target touch mapping rule, the value of the touch sensitivity is modified to 80, and the response events of touch actions still follow the default response mechanism of the terminal without change. Further, the terminal establishes the correspondence between profile1 and application A, as shown in Table 1, so that when application A is subsequently running, the terminal can obtain all the profile files corresponding to application A by querying the mapping relationship shown in Table 1.
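One way to picture the profile store is a table keyed by the application identifier, with each entry holding the area geometry and the rule. The following sketch assumes a circular area like Area1 and a sensitivity-only rule; the concrete classes and the package-name key are illustrative, not a format specified by the document.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Sketch of the profile storage: each profile records one target touch
    // area and its target touch mapping rule; a table keyed by the
    // application identifier plays the role of Table 1.
    final class ProfileStore {

        record CircleArea(float centerX, float centerY, float radius) {
            boolean contains(float x, float y) {
                float dx = x - centerX, dy = y - centerY;
                return dx * dx + dy * dy <= radius * radius;
            }
        }

        record Profile(CircleArea area, int touchSensitivity) {}

        // Application identifier -> all profiles customized for that application.
        private final Map<String, List<Profile>> table = new HashMap<>();

        void save(String appId, Profile profile) {
            table.computeIfAbsent(appId, k -> new ArrayList<>()).add(profile);
        }

        // Find the profile whose target touch area contains the touch point,
        // or null if the point falls outside every user-defined area.
        Profile findProfile(String appId, float x, float y) {
            for (Profile p : table.getOrDefault(appId, List.of())) {
                if (p.area().contains(x, y)) {
                    return p;
                }
            }
            return null;
        }
    }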
Optionally, the terminal may also share the generated profile file with other devices, so that the same customized target area and touch control effect can be reproduced on the other devices. If display parameters such as the screen resolution of the device receiving the profile file differ from those of the terminal that sends it, the receiving device may also perform corresponding conversion on the received profile file; this is not limited in the embodiments of the present application.
Optionally, the terminal may also upload the generated profile file to a server in the cloud, so that when other terminals run the related application (for example, the above game application A), they can also download profile1 corresponding to application A from the cloud and reproduce the same customized target area and touch control effect on those devices.
Optionally, the server can also optimize the profile files corresponding to an application (or interface) through algorithms such as big data statistics. For example, when 90% of terminals running application A set the value of the touch sensitivity of Area1 to 80, the server can optimize the profile file containing Area1 that corresponds to application A by adjusting the value of the touch sensitivity in the target touch mapping rule to 80, and further deliver the optimized profile file to terminals whose touch sensitivity is lower than 80, so that those terminals achieve a higher touch sensitivity when running application A. Certainly, after receiving the optimized profile file, the terminal may also prompt the user whether to use the touch mapping rule set in the optimized profile file, to improve the user experience.
It can be seen that the user can divide the touch screen of the terminal into logical areas to obtain user-defined touch areas, and can set, in a customized touch area, a touch mapping rule that suits the current application scenario and the user's own operating habits, so that the user subsequently obtains a customized touch experience in the customized touch area. This realizes refined and personalized control of the terminal's touch screen, thereby improving the input and output efficiency of the terminal in different application scenarios.
In other embodiments of the present application, a touch control method is provided, as shown in FIG. 17, including:
S601. The terminal acquires a first touch operation input by the user on the touch screen.
Specifically, the terminal may acquire the coordinate information of the touch points through which the first touch operation passes. The touch points mentioned here may be the touch points detected by the touch screen when the user inputs the first touch operation, or may be the pixels on the display screen corresponding to the touch points detected by the touch screen.
The target application may be any application installed on the terminal, such as a video application, a game application, or a communication application; this is not limited in the embodiments of the present application. Taking game application A as an example, the terminal can display the display interface of application A on the touch screen in real time while application A is running, and the user can use the related functions provided by application A by inputting corresponding touch operations on the touch screen.
For example, as shown in FIG. 18, the user can click the analog joystick 1702 in the lower left corner area of the competition interface 1701 to control the game character to move up, down, left, and right. Then, when the user moves the analog joystick 1702 to the right (that is, the above first touch operation), the touch screen of the terminal may report the detected touch information (for example, including the touch event and the coordinate information of the touch points) in sequence to the kernel layer, the framework layer, and the application layer of the terminal. Certainly, the touch screen can also carry other detected touch information, such as the touch time of the current touch operation, in the first touch operation; this is not limited in this embodiment.
It should be noted that the coordinate information in the touch operation may be the absolute coordinate information of the touch points on the touch screen, or may be relative coordinate information converted by the terminal from the absolute coordinate information. The absolute coordinate information refers to the coordinates of the touch points in the coordinate system defined by the manufacturer of the touch screen. For example, a first coordinate system of the touch screen can be set in the IC chip of the touch screen; as shown in (a) of FIG. 19, the first coordinate system takes a corner of the touch screen (for example, the lower left corner) as the origin O(0, 0), and the coordinates of point P are determined as P(5, 0) in the first coordinate system. This P(5, 0) is the above absolute coordinate information. The coordinate system set for the touch screen may be different from the coordinate system defined by the operating system of the terminal. For example, as shown in (b) of FIG. 19, the operating system of the terminal sets a second coordinate system with the upper left corner of the touch screen as the origin O'(0, 0); the touch operation input by the user at point P of the touch screen is then mapped in the second coordinate system to a touch operation at point P'(5, 15) of the touch screen, and this P'(5, 15) is the above relative coordinate information.
The mapping process between the foregoing absolute coordinate information and relative coordinate information may be performed by the kernel layer of the terminal, or may be performed by the framework layer of the terminal; this is not limited in this embodiment.
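With the two coordinate systems of FIG. 19, the conversion reduces to flipping the ordinate against the screen height. The sketch below assumes the first coordinate system has its origin at the lower left corner and the second at the upper left corner, so that with a screen height of 15 the point P(5, 0) maps to P'(5, 15) as in the example; the corner assignment of the first coordinate system is an assumption.

    // Sketch of the mapping between the touch-screen coordinate system and
    // the operating-system coordinate system in FIG. 19.
    final class CoordinateMapper {
        private final float screenHeight;

        CoordinateMapper(float screenHeight) {
            this.screenHeight = screenHeight;
        }

        // Absolute coordinates (first coordinate system, assumed lower-left
        // origin) -> relative coordinates (second coordinate system defined
        // by the operating system, upper-left origin). Only the ordinate flips.
        float[] toRelative(float absX, float absY) {
            return new float[] { absX, screenHeight - absY };
        }

        public static void main(String[] args) {
            float[] p = new CoordinateMapper(15).toRelative(5, 0);
            System.out.printf("P' = (%.0f, %.0f)%n", p[0], p[1]); // P' = (5, 15)
        }
    }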
S602. The terminal determines whether the first touch operation acts on a first preset area of a target interface.
The target interface refers to any interface displayed by the target application running in the foreground in step S601, and the first preset area is the user-defined target touch area in the foregoing embodiment; the target interface may cover part or all of the target touch area. That the first touch operation acts on the first preset area may mean that the operation object or the operation track of the user's first touch operation detected by the touch screen falls within the first preset area. For example, when it is detected that the coordinates of the touch points of the first touch operation fall within the first preset area, it may be determined that the first touch operation acts on the first preset area.
Specifically, in step S602, after the terminal obtains the first touch operation in the target interface, in order to determine whether to use a user-defined touch mapping rule to respond to the first touch operation, the terminal may acquire the identifier of the target application running in the foreground at this time and then, according to the identifier of the target application, look up all the profile files corresponding to the target application in the correspondence shown in Table 1. Because the specific location of each user-defined target touch area is recorded in its profile file, the terminal may determine, according to the coordinate information of the touch points of the first touch operation, the target touch area of which profile file the first touch operation specifically falls into. For example, if the identifier of the currently running application obtained by the terminal is the package name of application A, the terminal can determine from Table 1 that the user has customized two profile files for application A (that is, profile1 and profile2 in Table 1). Further, the terminal may compare the coordinate information P(x, y) of the touch points of the first touch operation with Area1 in profile1 and Area2 in profile2 respectively, and determine that point P falls within the target touch area Area1 in the lower left corner of the touch screen. It should be noted that the coordinate system used by the coordinate information of the touch points should be the same as the coordinate system used by the target touch areas recorded in Table 1. For example, if the terminal records the location of Area1 according to the second coordinate system defined by the operating system, while the coordinate information P(x, y) of the touch points of the first touch operation reported by the touch screen to the terminal is recorded according to the first coordinate system, then, after receiving the first touch operation, the kernel layer may map the coordinate information P(x, y) to the coordinates P'(x', y') in the second coordinate system, and then determine whether the coordinates P'(x', y') fall within the target touch area Area1.
S603. When the first touch operation acts on the first preset area, the terminal maps the first touch operation to a second touch operation.
S604. The target application responds to the second touch operation, to implement the customized touch function while the target application is running.
Specifically, the touch screen packages the detected first touch operation as an original touch event and reports it to the kernel layer 204 of the terminal. The kernel layer 204 maps the coordinate information P(x, y) carried in the original touch event to the coordinates P'(x', y') in the second coordinate system, encapsulates the event as an advanced touch event readable by the upper layers, and reports it to the framework layer 202. The framework layer 202 queries the profile files shown in Table 1; since the coordinates P'(x', y') fall within the target touch area Area1 recorded in profile1, the terminal can look up the target touch mapping rule recorded in profile1, in which a coordinate mapping parameter reflecting the touch sensitivity is set, for example, a coordinate mapping parameter of 1.8, meaning that application A responds to the first touch operation input by the user in the target touch area Area1 with 1.8 times the distance. The framework layer 202 can then multiply the abscissa and ordinate of the coordinates P'(x', y') by 1.8 to obtain the corrected coordinates Q(1.8x', 1.8y'), and use the corrected coordinates Q(1.8x', 1.8y') as the coordinate information of the touch points of the second touch operation. Further, the framework layer 202 carries the corrected coordinates Q(1.8x', 1.8y') in the advanced touch event and reports it to the running application A in the application layer, so that application A responds to the second touch operation based on the corrected coordinates Q(1.8x', 1.8y'). That is, the user inputs the first touch operation at point P(x, y) on the touch screen, and the application in the terminal finally responds to the user with the second touch operation at point Q(1.8x', 1.8y').
For example, as shown in FIG. 20, the terminal detects the user's first touch operation at point P(1, 0). Since point P falls within the user-defined target touch area Area1 in profile1, the terminal maps, according to the target touch mapping rule in profile1, the first touch operation to a second touch operation whose coordinate value is Q(1.8, 0), and application A obtains the second touch operation with the coordinate value Q(1.8, 0). Application A therefore considers that the user's finger manipulates the analog joystick 1702 to move 1.8 cm to the right from O(0, 0) to Q(1.8, 0), whereas the user actually manipulates the analog joystick 1702 to move only 1 cm to the right from O(0, 0) to point P(1, 0). In this way, the effect of controlling the analog joystick 1702 to move 1.8 cm is achieved with a 1-cm movement, thereby increasing the moving speed of the game character.
In addition, if, after the terminal modifies the coordinates of the touch points of the first touch operation according to the value of the sensitivity, a modified coordinate (for example, the above point Q) exceeds the manipulation boundary of the manipulation area Area1 of the analog joystick 1702, then, as shown in FIG. 21, the terminal may report the coordinate point on the manipulation boundary closest to the modified coordinate to application A as the coordinate point of the touch points of the second touch operation, so as to avoid the problem that the application cannot respond correctly to the first touch operation because the modified coordinates exceed the manipulation area corresponding to the first touch operation.
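For the circular area of the example, "the closest coordinate on the manipulation boundary" amounts to projecting the out-of-range point back onto the circle. The sketch below assumes a circular manipulation area such as Area1 = f(p, r); other area shapes would need their own closest-point computation.

    // Clamp a scaled coordinate Q back onto a circular manipulation boundary.
    final class BoundaryClamp {

        // Clamp point (qx, qy) to the circle with center (cx, cy) and radius r.
        static float[] clampToCircle(float qx, float qy, float cx, float cy, float r) {
            float dx = qx - cx, dy = qy - cy;
            double dist = Math.sqrt(dx * dx + dy * dy);
            if (dist <= r || dist == 0) {
                return new float[] { qx, qy };  // already inside the boundary
            }
            float scale = (float) (r / dist);   // pull Q back onto the boundary
            return new float[] { cx + dx * scale, cy + dy * scale };
        }
    }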
The above illustrates an implementation with fixed touch sensitivity: that is, as shown in (a) of FIG. 22, after the user sets the touch sensitivity of the target touch area Area1 to 80, the terminal always responds to the user's touch operations in the target touch area Area1 with a ratio of 1.8. In other embodiments, the terminal can also modify the above touch operation in a non-linear manner and eventually reach the touch sensitivity set by the user. For example, as shown in (b) of FIG. 22, after the user sets the touch sensitivity of the target touch area Area1 to 80, the terminal can vary the touch sensitivity according to the distance the user slides within Area1, gradually increasing the touch sensitivity as the sliding distance grows, until it increases to 1.8 times the default touch sensitivity.
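The gradual variant can be expressed as a distance-dependent scale factor that saturates at the user-set value. In the sketch below, the linear ramp and the rampDistance parameter are assumptions; the document only states that the sensitivity increases gradually with the sliding distance until it reaches 1.8 times the default.

    // Sketch of the non-linear variant in (b) of FIG. 22: the scale factor
    // ramps up with the slide distance inside Area1 until it saturates.
    final class NonLinearSensitivity {
        private final float maxFactor;     // e.g. 1.8 for touch sensitivity 80
        private final float rampDistance;  // slide distance at which maxFactor is reached

        NonLinearSensitivity(float maxFactor, float rampDistance) {
            this.maxFactor = maxFactor;
            this.rampDistance = rampDistance;
        }

        // Scale factor for the current slide distance (same unit, e.g. cm).
        float factorAt(float slideDistance) {
            if (slideDistance >= rampDistance) {
                return maxFactor;  // saturate at the user-set sensitivity
            }
            // Ramp linearly from 1.0 (default sensitivity) up to maxFactor.
            return 1.0f + (maxFactor - 1.0f) * (slideDistance / rampDistance);
        }
    }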
In addition, if the user has customized the response events of touch actions in the target touch mapping rule, the mapping can be implemented as follows. After receiving the advanced touch event, application A can invoke the related library function in the system runtime library layer 203, and the library function helps application A determine, based on the parameters passed in the advanced touch event, the specific touch operation performed by the user at point P, for example, a click operation. Then, after it is determined that the current touch operation is a click operation, if the response event recorded for the click operation in the profile file corresponding to touch point P is a double-click operation, the terminal does not call back the callback function that application A wrote for the click operation, but instead calls back the callback function corresponding to the double-click operation. Alternatively, when the framework layer 202 acquires the advanced touch event 1 generated when the user performs the click operation at point P, it can determine, according to the advanced touch event 1, that the user performs a click operation at point P; the framework layer 202 can then, according to the profile file corresponding to point P, modify the advanced touch event 1 generated by the click operation into the advanced touch event 2 that would be generated when the user performs a double-click operation at point P, and report the advanced touch event 2 to application A running in the application layer. Subsequently, when application A calls the related library function in the system runtime library layer 203, it is determined that the specific touch operation performed by the user at point P is a double-click operation; the terminal can then call back the callback function written in application A for the double-click operation, so that application A responds to the double-click operation, achieving the effect of a double-click operation at touch point P.
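In the second implementation, the framework layer rewrites the event stream itself. The sketch below shows one way the down/up pair of a click could be expanded into the sequence a double-click would generate; the event representation and the interval value are illustrative assumptions.

    import java.util.List;

    // Sketch of the event rewriting by the framework layer: advanced touch
    // event 1 (a click at P, i.e. action down then action up) is replaced by
    // advanced touch event 2, the sequence a double-click at P would generate.
    final class ClickToDoubleClick {

        record Event(String action, float x, float y, long timeMs) {}

        // Rewrite the down/up pair of a click into the two down/up pairs of a
        // double-click at the same touch point P, separated by intervalMs.
        static List<Event> map(Event down, Event up, long intervalMs) {
            long pressDuration = up.timeMs() - down.timeMs();
            return List.of(
                    down,
                    up,
                    new Event("ACTION_DOWN", down.x(), down.y(), up.timeMs() + intervalMs),
                    new Event("ACTION_UP", up.x(), up.y(), up.timeMs() + intervalMs + pressDuration)
            );
        }
    }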
Similarly, for a sliding operation, the terminal may determine through the library function that the current touch operation is a sliding operation, and then modify the coordinate information of the touch operation according to the user-defined touch sensitivity in the profile file; details are not described herein again.
Certainly, if the touch operation input by the user does not fall within any user-defined target touch area, the terminal does not need to modify the touch operation after obtaining it, and the target application responds according to the related touch information actually carried in the touch operation; this is not limited in this embodiment.
It can be understood that, to implement the foregoing functions, the above terminal and the like include corresponding hardware structures and/or software modules for performing each function. A person skilled in the art should easily be aware that, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, the embodiments of the present application can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered as going beyond the scope of the embodiments of the present application.
The embodiments of the present application may divide the terminal and the like into function modules according to the foregoing method examples. For example, each function module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software function module. It should be noted that the division of the modules in the embodiments of the present application is schematic and is merely logical function division; there may be other division manners in actual implementation.
FIG. 24 is a schematic diagram of a possible structure of the terminal involved in the foregoing embodiments, where the terminal includes an obtaining unit 2401, a storage unit 2402, a display unit 2403, and a mapping unit 2404. The obtaining unit 2401 is configured to support the terminal in performing the processes S501 and S503 in FIG. 5 and the process S601 in FIG. 17; the storage unit 2402 is configured to support the terminal in performing the process S504 in FIG. 5; the display unit 2403 is configured to support the terminal in performing the process S502 in FIG. 5; and the mapping unit 2404 is configured to support the terminal in performing the processes S602 to S604 in FIG. 17. For all related content of the steps involved in the foregoing method embodiments, refer to the function descriptions of the corresponding function modules; details are not described herein again.
Among them, the mapping unit 2404 can be used as a processing module, the storage unit 2402 as a storage module, the obtaining unit 2401 as an input module, and the display unit 2403 as a display module.
As shown in FIG. 25, the processing module 2502 is configured to control and manage the actions of the terminal; the input module 2503 is configured to support interaction between the terminal and the user; the storage module 2501 is configured to save the program code and data of the terminal; and the display module 2504 is configured to display information input by the user or information provided to the user, as well as the various menus of the terminal.
For example, the terminal may acquire, through the input module 2503, a first touch operation input by the user on the touch screen; when the first touch operation acts on a first preset area in the target interface (that is, an interface of the target application running in the foreground), the processing module 2502 may map the first touch operation to a second touch operation, so that the target application responds to the second touch operation, achieving refined and personalized control of the touch screen.
The processing module 2502 may be a processor or a controller, for example, a central processing unit (CPU), a GPU, a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination implementing computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
The input module 2503 may be an input/output device such as a touch screen or a microphone, or a communication interface.
The storage module 2501 may be a memory, which may include a high-speed random access memory (RAM) and may also include a non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The display module 2504 may be a display, and the display may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. In addition, a touch panel can be integrated on the display for collecting touch operations on or near it and sending the collected touch information to another device (for example, the processor).
The terminal provided in this embodiment of the present application may be the mobile phone 100 shown in FIG. 1.
The computer program product includes one or more computer instructions. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, via a coaxial cable, an optical fiber, or a digital subscriber line (DSL)) or a wireless manner (for example, via infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid-state disk (SSD)), or the like.


Abstract

A touch control method and apparatus, relating to the field of communications technologies. The method includes: (S601) a terminal acquires a first touch operation input by a user on a touch screen; (S602, S603, S604) when the first touch operation acts on a first preset area in a target interface, the terminal maps the first touch operation to a second touch operation, so that a target application responds to the second touch operation; the target interface is any interface presented by the target application that covers the first preset area, and the target application runs in the foreground.

Description

Touch control method and apparatus
Technical Field
The embodiments of the present application relate to the field of communications technologies, and in particular, to a touch control method and apparatus.
Background
At present, various terminals (for example, mobile phones and tablet computers) generally use a touch screen as an input apparatus, which has greatly improved users' input and operation efficiency. Generally, parameters such as the touch sensitivity of the touch screen and the response events of different touch actions are already set when the touch screen (or the terminal) leaves the factory.
However, in different touch areas of different application scenarios, users' requirements on parameters such as the touch sensitivity of the touch screen often differ. For example, more refined operations are needed when copying text information on a web page, whereas a faster control experience is needed when controlling a character running in a game. Obviously, the fixed parameters set at the factory can no longer satisfy users' touch requirements, which reduces the input and output efficiency of the terminal.
Summary
Embodiments of the present application provide a touch control method and apparatus, which can implement refined and personalized control of a touch screen and improve the input and output efficiency of a terminal.
To achieve the foregoing objective, the embodiments of the present application use the following technical solutions:
According to a first aspect, an embodiment of the present application provides a touch control method, including: a terminal acquires a first touch operation input by a user on a touch screen; when the first touch operation acts on a first preset area in a target interface (that is, an interface presented by a target application running in the foreground), the terminal maps the first touch operation to a second touch operation, so that the target application responds to the second touch operation and implements an application function related to the second touch operation. In other words, what the user inputs on the touch screen is the first touch operation, but according to a mapping relationship preset by the user, what the target application running on the terminal finally responds to is the second touch operation, thereby implementing refined and customized touch control of the touch screen and improving the input and output efficiency of the terminal.
In a possible design method, when the first touch operation acts on the first preset area in the target interface, the terminal maps the first touch operation to the second touch operation as follows: when the terminal detects the first touch operation in the target interface, the terminal may look up at least one preset area (including the first preset area) that the user has associated with the target application in advance; when the touch point of the first touch operation falls within the first preset area, the terminal may further obtain the touch mapping rule preset for the first preset area, and map the first touch operation performed by the user to the second touch operation according to the touch mapping rule.
Optionally, mapping the first touch operation to the second touch operation includes: the terminal modifies the coordinate values of the touch points of the first touch operation and uses the modified coordinate values as the coordinate values of the touch points of the second touch operation. The target application can then provide corresponding visual output to the user according to the modified coordinate values of the touch points.
In a possible design method, the touch mapping rule includes a coordinate mapping parameter, and mapping the first touch operation to the second touch operation according to the touch mapping rule includes: the terminal increases or decreases the coordinate values of the touch points of the first touch operation according to the coordinate mapping parameter to obtain the coordinate values of the touch points of the second touch operation. In this way, the user can achieve a larger-scale operation effect with a smaller-scale operation in the preset area, or a smaller-scale operation effect with a larger-scale operation, thereby customizing the touch sensitivity within the preset area.
For example, increasing or decreasing the coordinate values of the touch points of the first touch operation according to the coordinate mapping parameter includes: the terminal multiplies the coordinate values of the touch points of the first touch operation by the coordinate mapping parameter, where the coordinate mapping parameter is greater than 1 or less than 1.
In a possible design method, after the terminal modifies the coordinate values of the touch points on which the first touch operation acts, the method further includes: if a modified coordinate value exceeds the manipulation boundary preset for the first touch operation, the terminal uses the coordinate value on the manipulation boundary closest to the modified coordinate value as the coordinate value of the touch point of the second touch operation, thereby avoiding the problem that the application cannot correctly respond to the first touch operation because the modified coordinates exceed the manipulation area corresponding to the first touch operation.
In a possible design method, mapping the first touch operation to the second touch operation according to the touch mapping rule includes: according to the touch mapping rule, the terminal maps the first touch event generated when the user performs the first touch operation to the second touch event generated when the user performs the second touch operation, and reports the second touch event to the target application. In other words, before the first touch event generated by the first touch operation is reported to the target application, it can be modified, according to the touch mapping rule, into the second touch event generated when the user performs the second touch operation and then reported to the target application; the target application then presents the response effect corresponding to the second touch operation according to the second touch event, implementing the personalized function of customizing touch operations within the preset area.
In a possible design method, mapping the first touch operation to the second touch operation so that the target application responds to the second touch operation specifically includes: the terminal reports the touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation according to the touch event; the terminal maps the determined first touch operation to the second touch operation according to the touch mapping rule and instructs the target application to respond to the second touch operation. That is, the terminal may report the first touch event generated by the first touch operation to the target application according to the normal procedure; after the target application determines, according to the first touch event, the specific operation performed by the user (that is, the first touch operation), the function corresponding to the second touch operation may be invoked according to the touch mapping rule to implement the application function of the second touch operation.
For example, the touch mapping rule may be used to indicate that a click operation is mapped to a double-click operation, or that a long-press operation is mapped to a continuous click operation.
According to a second aspect, an embodiment of the present application provides a touch control method, including: in response to a first input of a user, a terminal displays a setting interface used to instruct the user to customize a touch area; in response to a second input of the user, the terminal acquires a target touch area customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch area, where the touch mapping rule is used to indicate that a first touch operation acquired within the target touch area is mapped to a second touch operation. In this way, when a specific touch operation input by the user is subsequently received, the terminal can find the corresponding target touch mapping rule to respond to the touch operation, obtaining a customized touch experience in the customized touch area.
In a possible design method, acquiring the target touch area customized by the user on the setting interface includes: the terminal receives a target touch area drawn by the user on the setting interface using a preset area template; or the terminal receives K boundary points marked by the user on the setting interface, where the K boundary points, connected in a specified order, form the target touch area, K > 2.
In a possible design method, acquiring the touch mapping rule customized by the user for the target touch area on the setting interface includes: the terminal receives a coordinate mapping parameter set by the user for the target touch area, where the coordinate mapping parameter is used to indicate the mapping rule for the coordinate values of the touch points when the terminal responds to the first touch operation; and/or the terminal receives an event mapping parameter set by the user for the target touch area, where the event mapping parameter is used to indicate the mapping rule for touch events when the terminal responds to the first touch operation.
In other words, the user can divide the touch screen of the terminal into logical areas to obtain user-defined touch areas, and can set, in a customized touch area, a touch mapping rule that suits the current application scenario and the user's own operating habits, so that the user subsequently obtains a customized touch experience in the customized touch area. This implements refined and personalized control of the terminal's touch screen, thereby improving the input and output efficiency of the terminal in different application scenarios.
In a possible design method, after the terminal receives the coordinate mapping parameter set by the user for the target touch area, the method further includes: the terminal prompts the user with the touch effect that will occur when the terminal responds to a touch operation within the target touch area under the current coordinate mapping parameter, so that the user can quickly understand the currently set coordinate mapping parameter.
In a possible design method, after the terminal acquires the target touch area customized by the user on the setting interface and the touch mapping rule customized for the target touch area, the method further includes: the terminal receives an effective object set by the user for the touch mapping rule, where the effective object includes at least one application and/or at least one display interface. That is, the touch control method provided in the present application can provide a customized touch experience for different application scenarios.
In a possible design method, after the terminal receives the effective object set by the user for the touch mapping rule, the method further includes: the terminal establishes an association among the target touch area, the touch mapping rule of the target touch area, and the effective object, so that when a touch operation input by the user is subsequently received, the corresponding association can be found to respond to the touch operation.
In a possible design method, displaying the setting interface used to instruct the user to customize the touch area includes: the terminal superimposes and displays, on the display interface of the target application running in the foreground, a semi-transparent setting interface used to instruct the user to customize the touch area, thereby intuitively providing the user with the function of setting a customized touch area for the current target application.
According to a third aspect, an embodiment of the present application provides a terminal, including a processor, a memory, and an input device connected by a bus, where the input device is configured to acquire a first touch operation input by a user on a touch screen and send it to the processor; the processor is configured to determine that the first touch operation acts on a first preset area in a target interface and map the first touch operation to a second touch operation, so that a target application responds to the second touch operation; the target interface is any interface presented by the target application that covers the first preset area, and the target application runs in the foreground.
In a possible design method, when the input device detects the first touch operation in the target interface, the processor maps the first touch operation to the second touch operation, which specifically includes: the processor looks up at least one preset area associated with the target application, the at least one preset area including the first preset area; when the touch point of the first touch operation falls within the first preset area, the processor obtains the touch mapping rule preset for the first preset area and maps the first touch operation to the second touch operation according to the touch mapping rule.
In a possible design method, the processor maps the first touch operation to the second touch operation specifically by modifying the coordinate values of the touch points of the first touch operation and using the modified coordinate values as the coordinate values of the touch points of the second touch operation.
In a possible design method, the touch mapping rule includes a coordinate mapping parameter, and the processor maps the first touch operation to the second touch operation according to the touch mapping rule specifically by increasing or decreasing the coordinate values of the touch points of the first touch operation according to the coordinate mapping parameter to obtain the coordinate values of the touch points of the second touch operation.
In a possible design method, the processor increases or decreases the coordinate values of the touch points of the first touch operation according to the coordinate mapping parameter specifically by multiplying the coordinate values of the touch points of the first touch operation by the coordinate mapping parameter, where the coordinate mapping parameter is greater than 1 or less than 1.
In a possible design method, the processor is further configured to determine that a modified coordinate value exceeds the manipulation boundary preset for the first touch operation, and use the coordinate value on the manipulation boundary closest to the modified coordinate value as the coordinate value of the touch point of the second touch operation.
In a possible design method, the processor maps the first touch operation to the second touch operation according to the touch mapping rule specifically by mapping, according to the touch mapping rule, the first touch event generated when the user performs the first touch operation to the second touch event generated when the user performs the second touch operation, and reporting the second touch event to the target application.
In a possible design method, the processor maps the first touch operation to the second touch operation so that the target application responds to the second touch operation, which specifically includes: the processor reports the touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation according to the touch event; the processor maps the determined first touch operation to the second touch operation according to the touch mapping rule and instructs the target application to respond to the second touch operation.
According to a fourth aspect, an embodiment of the present application provides a terminal, including a processor, a memory, a display, and an input device connected by a bus, where the input device is configured to receive a first input and a second input of a user; the display is configured to display, in response to the first input, a setting interface used to instruct the user to customize a touch area; the processor is configured to acquire, in response to the second input, a target touch area customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch area, where the touch mapping rule is used to indicate that a first touch operation acquired within the target touch area is mapped to a second touch operation.
In a possible design method, the input device is further configured to: receive a target touch area drawn by the user on the setting interface using a preset area template; or receive K boundary points marked by the user on the setting interface, where the K boundary points, connected in a specified order, form the target touch area, K > 2.
In a possible design method, the input device is further configured to: receive a coordinate mapping parameter set by the user for the target touch area, where the coordinate mapping parameter is used to indicate the mapping rule for the coordinate values of the touch points when the terminal responds to the first touch operation; and/or receive an event mapping parameter set by the user for the target touch area, where the event mapping parameter is used to indicate the mapping rule for touch events when the terminal responds to the first touch operation.
In a possible design method, the display is further configured to prompt the user with the touch effect that will occur when the terminal responds to a touch operation within the target touch area under the current coordinate mapping parameter.
In a possible design method, the input device is further configured to receive an effective object set by the user for the touch mapping rule, where the effective object includes at least one application and/or at least one display interface.
In a possible design method, the processor is further configured to establish an association among the target touch area, the touch mapping rule of the target touch area, and the effective object, and store the association in the memory.
In a possible design method, the display is specifically configured to superimpose and display, on the display interface of the target application running in the foreground, a semi-transparent setting interface used to instruct the user to customize the touch area.
According to a fifth aspect, an embodiment of the present application provides a terminal, including: an acquiring unit configured to acquire a first touch operation input by a user on a touch screen; and a mapping unit configured to map the first touch operation to a second touch operation when the first touch operation acts on a first preset area in a target interface, so that a target application responds to the second touch operation; the target interface is any interface presented by the target application that covers the first preset area, and the target application runs in the foreground.
In a possible design method, the mapping unit is specifically configured to: when the terminal detects the first touch operation in the target interface, look up at least one preset area associated with the target application, the at least one preset area including the first preset area; when the touch point of the first touch operation falls within the first preset area, obtain the touch mapping rule preset for the first preset area; and map the first touch operation to the second touch operation according to the touch mapping rule.
In a possible design method, the mapping unit is specifically configured to modify the coordinate values of the touch points of the first touch operation and use the modified coordinate values as the coordinate values of the touch points of the second touch operation.
In a possible design method, the touch mapping rule includes a coordinate mapping parameter, and the mapping unit is specifically configured to increase or decrease the coordinate values of the touch points of the first touch operation according to the coordinate mapping parameter to obtain the coordinate values of the touch points of the second touch operation.
In a possible design method, the mapping unit is specifically configured to multiply the coordinate values of the touch points of the first touch operation by the coordinate mapping parameter, where the coordinate mapping parameter is greater than 1 or less than 1.
In a possible design method, the mapping unit is further configured to: if a modified coordinate value exceeds the manipulation boundary preset for the first touch operation, use the coordinate value on the manipulation boundary closest to the modified coordinate value as the coordinate value of the touch point of the second touch operation.
In a possible design method, the mapping unit is specifically configured to map, according to the touch mapping rule, the first touch event generated when the user performs the first touch operation to the second touch event generated when the user performs the second touch operation, and report the second touch event to the target application.
In a possible design method, the mapping unit is specifically configured to report the touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation according to the touch event; the terminal maps the determined first touch operation to the second touch operation according to the touch mapping rule and instructs the target application to respond to the second touch operation.
According to a sixth aspect, an embodiment of the present application provides a terminal, including: an acquiring unit configured to acquire a first input and a second input of a user; and a display unit configured to display a setting interface used to instruct the user to customize a touch area. The acquiring unit is further configured to acquire a target touch area customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch area, where the touch mapping rule is used to indicate that a first touch operation acquired within the target touch area is mapped to a second touch operation.
In a possible design method, the acquiring unit is specifically configured to: receive a target touch area drawn by the user on the setting interface using a preset area template; or receive K boundary points marked by the user on the setting interface, where the K boundary points, connected in a specified order, form the target touch area, K > 2.
In a possible design method, the acquiring unit is specifically configured to: receive a coordinate mapping parameter set by the user for the target touch area, where the coordinate mapping parameter is used to indicate the mapping rule for the coordinate values of the touch points when the terminal responds to the first touch operation; and/or receive an event mapping parameter set by the user for the target touch area, where the event mapping parameter is used to indicate the mapping rule for touch events when the terminal responds to the first touch operation.
In a possible design method, the display unit is further configured to prompt the user with the touch effect that will occur when the terminal responds to a touch operation within the target touch area under the current coordinate mapping parameter.
In a possible design method, the acquiring unit is further configured to receive an effective object set by the user for the touch mapping rule, where the effective object includes at least one application and/or at least one display interface.
In a possible design method, the terminal further includes a storage unit configured to establish an association among the target touch area, the touch mapping rule of the target touch area, and the effective object.
In a possible design method, the display unit is specifically configured to superimpose and display, on the display interface of the target application running in the foreground, a semi-transparent setting interface used to instruct the user to customize the touch area.
According to a seventh aspect, an embodiment of the present application provides a terminal, including a processor, a memory, a bus, and a communication interface, where the memory is configured to store computer-executable instructions, the processor is connected to the memory through the bus, and when the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal performs any one of the foregoing touch control methods.
According to an eighth aspect, an embodiment of the present application provides a computer-readable storage medium storing instructions that, when run on any one of the foregoing terminals, cause the terminal to perform any one of the foregoing touch control methods.
According to a ninth aspect, an embodiment of the present application provides a computer program product containing instructions that, when run on any one of the foregoing terminals, causes the terminal to perform any one of the foregoing touch control methods.
In the embodiments of the present application, the names of the components of the terminal do not limit the device itself; in actual implementations, these components may appear under other names. As long as the functions of the components are similar to those in the embodiments of the present application, they fall within the scope of the claims of the present application and their equivalent technologies.
In addition, for the technical effects brought by any one of the design methods of the third to ninth aspects, refer to the technical effects brought by the different design methods of the first or second aspect; details are not described herein again.
Brief Description of Drawings
FIG. 1 is a first schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 2 is a first schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 3 is a first schematic architectural diagram of an Android system according to an embodiment of the present application;
FIG. 4 is a second schematic architectural diagram of an Android system according to an embodiment of the present application;
FIG. 5 is a first schematic flowchart of a touch control method according to an embodiment of the present application;
FIG. 6 is a second schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 7A is a third schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 7B is a fourth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 8 is a fifth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 9 is a sixth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 10 is a seventh schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 11 is an eighth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 12 is a ninth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 13 is a tenth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 14 is an eleventh schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 15 is a twelfth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 16 is a schematic interaction diagram of a touch control method according to an embodiment of the present application;
FIG. 17 is a second schematic flowchart of a touch control method according to an embodiment of the present application;
FIG. 18 is a thirteenth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 19 is a fourteenth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 20 is a fifteenth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 21 is a sixteenth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 22 is a seventeenth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 23 is an eighteenth schematic diagram of an application scenario of a touch control method according to an embodiment of the present application;
FIG. 24 is a second schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 25 is a third schematic structural diagram of a terminal according to an embodiment of the present application.
Description of Embodiments
In the following, the terms "first" and "second" are used for description purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise stated, "multiple" means two or more.
The touch control method provided in the embodiments of the present application may be applied to any terminal having a touch screen, such as a mobile phone, a wearable device, an augmented reality (AR) / virtual reality (VR) device, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA). Certainly, in the following embodiments, no limitation is imposed on the specific form of the terminal.
As shown in FIG. 1, the terminal in the embodiments of the present application may be a mobile phone 100. The following describes the embodiments in detail by taking the mobile phone 100 as an example. It should be understood that the illustrated mobile phone 100 is only one example of the terminal; the mobile phone 100 may have more or fewer components than those shown in the figure, may combine two or more components, or may have a different component configuration.
As shown in FIG. 1, the mobile phone 100 may specifically include components such as a processor 101, a radio frequency (RF) circuit 102, a memory 103, a touch screen 104, a Bluetooth apparatus 105, one or more sensors 106, a wireless fidelity (Wi-Fi) apparatus 107, a positioning apparatus 108, an audio circuit 109, a peripheral interface 110, and a power supply system 111. These components may communicate through one or more communication buses or signal lines (not shown in FIG. 1). A person skilled in the art may understand that the hardware structure shown in FIG. 1 does not constitute a limitation on the mobile phone; the mobile phone 100 may include more or fewer components than those shown, combine some components, or have a different component arrangement.
The following describes the components of the mobile phone 100 in detail with reference to FIG. 1:
The processor 101 is the control center of the mobile phone 100. It connects the parts of the mobile phone 100 through various interfaces and lines, and executes various functions of the mobile phone 100 and processes data by running or executing application programs stored in the memory 103 and calling data stored in the memory 103. In some embodiments, the processor 101 may include one or more processing units; for example, the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd. In some embodiments of the present application, the processor 101 may further include a fingerprint verification chip configured to verify collected fingerprints.
The radio frequency circuit 102 may be configured to receive and send radio signals during information transmission and reception or during a call. In particular, the radio frequency circuit 102 may receive downlink data from a base station and deliver it to the processor 101 for processing, and send uplink data to the base station. Generally, the radio frequency circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency circuit 102 may also communicate with other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to the Global System for Mobile Communications, General Packet Radio Service, Code Division Multiple Access, Wideband Code Division Multiple Access, Long Term Evolution, email, and the short message service.
The memory 103 is configured to store application programs and data, and the processor 101 executes various functions of the mobile phone 100 and processes data by running the application programs and data stored in the memory 103. The memory 103 mainly includes a program storage area and a data storage area, where the program storage area may store the operating system and application programs required by at least one function (for example, a sound playing function and an image playing function), and the data storage area may store data (for example, audio data and a phone book) created when the mobile phone 100 is used. In addition, the memory 103 may include a high-speed random access memory (RAM), and may also include a non-volatile memory such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. The memory 103 may store various operating systems, for example, the iOS operating system developed by Apple Inc. and the Android operating system developed by Google Inc. The memory 103 may be independent and connected to the processor 101 through the communication bus, or may be integrated with the processor 101.
The touch screen 104 may specifically include a touch pad 104-1 and a display 104-2.
The touch pad 104-1 may collect touch operations performed by the user of the mobile phone 100 on or near it (for example, operations performed by the user on or near the touch pad 104-1 with a finger, a stylus, or any other suitable object), and send the collected touch information to another device (for example, the processor 101). A touch operation performed by the user near the touch pad 104-1 may be called floating touch; floating touch means that the user does not need to directly touch the touch pad to select, move, or drag a target (for example, an icon), but only needs to be near the terminal to perform the desired function. In addition, the touch pad 104-1 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
The display (also called a display screen) 104-2 may be configured to display information input by the user or information provided to the user and the various menus of the mobile phone 100. The display 104-2 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The touch pad 104-1 may cover the display 104-2; when the touch pad 104-1 detects a touch operation on or near it, it transmits the operation to the processor 101 to determine the type of the touch operation, and the processor 101 may then provide corresponding visual output on the display 104-2 according to the type of the touch operation. Although in FIG. 1 the touch pad 104-1 and the display screen 104-2 are two independent components implementing the input and output functions of the mobile phone 100, in some embodiments the touch pad 104-1 and the display screen 104-2 may be integrated to implement the input and output functions of the mobile phone 100.
In the embodiments of the present application, the user can set touch mapping rules for different touch areas of the touch screen 104 in different application scenarios. For example, as shown in (a) of FIG. 2, when application A runs, the touch sensitivity of a rectangular touch area 21a located at the center of the touch screen 104 can be set to twice that of the other areas; or, as shown in (b) of FIG. 2, when application B runs, the response events of touch actions (for example, click and long press) in a customized touch area 22b can be defined.
In this way, in different application scenarios, by dividing the touch screen 104 into logical areas, user-defined touch areas can be obtained, and the user can set, in a customized touch area, a touch mapping rule that suits the current application scenario and the user's own operating habits, so that the user subsequently obtains a customized touch experience in the customized touch area. This implements refined and personalized control of the touch screen 104 and provides a richer touch experience for terminals containing the touch screen 104.
The touch sensitivity may reflect the ratio of the moving distance of a display object when the terminal responds to a touch operation on the touch screen 104 to the actual sliding distance of the finger on the touch screen 104 during the touch operation: the higher the touch sensitivity, the larger the ratio; the lower the touch sensitivity, the smaller the ratio. For some refined operations, such as retouching images and marking text, lower touch sensitivity can improve the accuracy of these operations; for some operations with strong real-time requirements, such as attacking and running in games, higher touch sensitivity can improve the speed of these operations and the user experience.
The response event of a touch action refers to the specific touch operation corresponding to the touch events generated by the mobile phone 100 when the mobile phone 100 receives a touch action input by the user at a specific position on the touch screen 104. For example, when the user clicks point C on the touch screen 104, the mobile phone 100 may generate an action down event and an action up event at point C; after the mobile phone 100 reports these two touch events to the corresponding application, the application can determine, by calling a preset library function in the mobile phone 100, that these two touch events correspond to a click operation, and then respond to the click operation to implement the application function of the click operation at point C.
Certainly, in addition to the touch sensitivity and the response events of touch actions, the touch mapping rule may also include other parameters for adjusting the user's touch habits, such as touch precision, touch pressure sensing, and touch time; this is not limited in the embodiments of the present application.
It can be understood that the touch screen 104 is formed by stacking multiple layers of materials; only the touch pad (layer) and the display screen (layer) are shown in the embodiments of the present application, and the other layers are not described. In addition, the touch pad 104-1 may be configured on the front of the mobile phone 100 in the form of a full panel, and the display screen 104-2 may also be configured on the front of the mobile phone 100 in the form of a full panel, so that a bezel-less structure can be implemented on the front of the mobile phone.
The mobile phone 100 may further include the Bluetooth apparatus 105, configured to implement data exchange between the mobile phone 100 and other short-range terminals (for example, mobile phones and smartwatches). The Bluetooth apparatus in the embodiments of the present application may be an integrated circuit, a Bluetooth chip, or the like.
The mobile phone 100 may further include at least one sensor 106, such as a fingerprint collection device 112, a light sensor, a motion sensor, and other sensors. Specifically, the fingerprint collection device 112 may be configured on the back of the mobile phone 100 (for example, below the rear camera) or on the front of the mobile phone 100 (for example, below the touch screen 104). For another example, the fingerprint collection device 112 may be configured in the touch screen 104 to implement the fingerprint recognition function; that is, the fingerprint collection device 112 may be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 100. The light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display of the touch screen 104 according to the brightness of ambient light, and the proximity sensor may turn off the power of the display when the mobile phone 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor may detect the magnitude of acceleration in various directions (generally three axes), may detect the magnitude and direction of gravity when stationary, and may be used in applications for recognizing the posture of the mobile phone (such as switching between landscape and portrait modes, related games, and magnetometer posture calibration) and in functions related to vibration recognition (such as a pedometer and tapping). Other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor may also be configured on the mobile phone 100, and details are not described herein.
The Wi-Fi apparatus 107 is configured to provide the mobile phone 100 with network access complying with Wi-Fi related standard protocols. The mobile phone 100 may access a Wi-Fi access point through the Wi-Fi apparatus 107, thereby helping the user send and receive emails, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access. In some other embodiments, the Wi-Fi apparatus 107 may also serve as a Wi-Fi wireless access point and provide Wi-Fi network access for other terminals.
The positioning apparatus 108 is configured to provide a geographic location for the mobile phone 100. It can be understood that the positioning apparatus 108 may specifically be a receiver of a positioning system such as the Global Positioning System (GPS), the BeiDou Navigation Satellite System, or the Russian GLONASS. After receiving the geographic location sent by the positioning system, the positioning apparatus 108 sends the information to the processor 101 for processing or to the memory 103 for storage. In some other embodiments, the positioning apparatus 108 may also be a receiver of an assisted global positioning system (AGPS); the AGPS system assists the positioning apparatus 108 in completing ranging and positioning services by acting as an assistance server. In this case, the assisted positioning server communicates, through a wireless communication network, with the positioning apparatus 108 (that is, the GPS receiver) of a terminal such as the mobile phone 100 to provide positioning assistance. In some other embodiments, the positioning apparatus 108 may also be based on Wi-Fi access point positioning technology. Because each Wi-Fi access point has a globally unique media access control (MAC) address, the terminal can scan and collect the broadcast signals of surrounding Wi-Fi access points when Wi-Fi is enabled and thus obtain the MAC addresses broadcast by the Wi-Fi access points; the terminal sends the data that can identify the Wi-Fi access points (for example, the MAC addresses) to a location server through the wireless communication network; the location server retrieves the geographic location of each Wi-Fi access point, calculates the geographic location of the terminal with reference to the strength of the Wi-Fi broadcast signals, and sends it to the positioning apparatus 108 of the terminal.
The audio circuit 109, a speaker 113, and a microphone 114 may provide an audio interface between the user and the mobile phone 100. The audio circuit 109 may transmit an electrical signal converted from received audio data to the speaker 113, and the speaker 113 converts it into a sound signal for output; on the other hand, the microphone 114 converts collected sound signals into electrical signals, which are received by the audio circuit 109 and converted into audio data; the audio data is then output to the RF circuit 102 to be sent to, for example, another mobile phone, or output to the memory 103 for further processing.
The peripheral interface 110 is configured to provide various interfaces for external input/output devices (for example, a keyboard, a mouse, an external display, an external memory, and a subscriber identity module card). For example, it connects to a mouse through a universal serial bus (USB) interface, and connects to a subscriber identity module (SIM) card provided by a telecommunications operator through metal contacts in the SIM card slot. The peripheral interface 110 may be used to couple the above external input/output peripheral devices to the processor 101 and the memory 103.
The mobile phone 100 may further include a power supply apparatus 111 (for example, a battery and a power management chip) that supplies power to the components. The battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power consumption management are implemented through the power supply apparatus 111.
Although not shown in FIG. 1, the mobile phone 100 may further include a camera (a front camera and/or a rear camera), a flash, a micro projection apparatus, a near field communication (NFC) apparatus, and the like, and details are not described herein.
Further, the mobile phone 100 may run an operating system such as Android or iOS. Taking the Android operating system as an example, the Android operating system may be divided into four layers: as shown in FIG. 3, from top to bottom, an application layer 201 (that is, the APP layer), an application framework layer 202 (that is, the Framework layer), a system runtime library layer 203 (that is, the Libraries layer), and a Linux kernel layer 204.
The Linux kernel layer 204 may be used to control functions of the mobile phone 100 such as security, memory management, process management, the network stack, and the driver model. The Linux kernel layer 204 also serves as an abstraction layer between the hardware (for example, the CPU, the network interface card, and the memory) and the software stack; it hides specific hardware details and thus provides unified services for the upper layers (the system runtime library layer 203, the application framework layer 202, and the application layer 201).
The system runtime library layer 203 contains some C/C++ libraries, for example, the media library, the system C library, and the display management library (Surface Manager); these libraries can be used by different components of the Android system, and the system runtime library layer 203 can provide services to developers through the Framework layer 202.
The Framework layer 202 provides developers with an API framework that allows full access to what application programs use. Specifically, the Framework layer 202 provides a great number of APIs for developing application programs, and an APP satisfying related business requirements can be constructed by calling the corresponding APIs.
The application layer 201 mainly includes APPs written in the Java language. When the user operates on an operation interface of an APP, the APP interacts with the system runtime library layer 203 or the Linux kernel layer 204 by calling the related APIs in the Framework layer 202, to implement the function corresponding to the operation interface.
在本申请实施例中,应用程序层201中运行的APP(以应用A为例)获取到触摸屏104上用户输入的触摸操作是一个从底层向上层逐层分发消息的过程。
具体的,如图4所示,用户手指触摸硬件层中的触摸屏104时,触摸屏104得到这一触摸操作的相关信息(例如,触摸点的坐标等),进而,触摸屏104可通过相应的驱动以中断的形式向Linux内核层204上报该触摸动作产生的原始触摸事件。Framework层202中包括与下层通信的事件总线层202a以及与上层通信的输入读取分发层202b,Linux内核层204得到上述原始触摸事件后,可对该触摸事件进行坐标系转换等封装工作,生成上层能够读取的高级触摸事件(例如,action down事件、action move事件以及action up事件等),并将该高级触摸事件发送给事件总线层202a,再由事件总线层202a分发给输入读取分发层202b。
最终,由输入读取分发层202b将上述高级触摸事件上报至应用程序层201中正在运行的应用A的应用进程。此时,应用A的应用进程可调用系统运行库层203中的C/C++库函数确定该高级触摸事件对应的操作具体为什么操作,例如,单击操作。进而,系统运行库层203中的库函数可回调应用A预先为单击操作编写的回调函数,该回调函数规定了应用A响应于用户的单击操作所执行的功能。例如,该回调函数可以是onclick函数,使得应用A执行与触摸点位置处单击操作对应的回调函数,例如,应用A为触摸点处的单击操作编写的onclick函数用于实现视频播放功能。
与获取上述触摸操作相对应的,应用程序层201中应用A实现上述回调函数所指示的功能是一个从上层向底层逐层下发,最终由相关硬件执行该控制指令的过程。
以实现上述视频播放功能为例,应用程序层201中应用A的应用进程根据底层上报的触摸操作确定需要实现视频播放功能时,可生成视频播放指令逐层发送给Framework层202中的输入读取分发层202b以及事件总线层202a,再由事件总线层202a向Linux内核层204发送该视频播放指令,最终,由Linux内核层204通过驱动调用处理器、存储器以及触摸屏104等硬件实现视频播放这一输出。
在本申请实施例中,由于用户预先在触摸屏104中自定义了应用A运行时某个触摸区域的触摸映射规则,因此,当Linux内核层204得到触摸屏104上报的触摸操作之后,可由终端的Linux内核层204(或Framework层202)判断该触摸操作中触摸点的位置是否落入用户自定义的触摸区域内,若落入用户自定义的触摸区域内,则可按照用户设置的触摸映射规则修改该触摸操作中携带的相关信息。例如,用户预先定义在触摸区域1中的单击操作映射为双击操作,那么,当触摸屏104上报的触摸操作被终端确定为落入触摸区域1的单击操作时,终端可将该触摸动作的响应事件从单击操作修改为双击操作,进而向应用程序层201中运行的APP回调双击操作对应的回调函数,实现双击操作的触控效果,从而实现对触摸屏104的精细化、定制化触摸控制,以提高终端的输入效率。
以下,将结合具体实施例详细阐述本申请实施例提供的一种触摸控制方法,如图5所示,包括:
S501、终端在运行目标应用时获取用户的第一输入,该第一输入用于触发终端进入自定义触摸区域的设置界面。
其中,上述目标应用可以是终端内安装的视频类应用、游戏类应用、通讯类应用等任意应用,本申请实施例对此不作任何限制。
以游戏类应用A为例,在终端运行应用A的过程中,可以在应用A的显示界面中显示一个用于自定义触摸区域的控件。如图6所示,可在应用A的登录界面中显示控件600,用于提示用户在运行应用A时可自定义不同触摸区域的触摸映射规则,以提高用用A运行时的输入和响应效率。那么,当检测到用户点击该控件600时,即获取到用户的第一输入。
又或者,如图7A所示,也可以在应用A的设置界面中设置“定制触摸映射规则”的选项700,当用户点击“定制触摸映射规则”的选项700后,可点击“修改规则”的选项自定义不同触摸区域以及不同触摸区域的触摸映射规则。此时,当检测到用户点击该“修改规则”的选项时,即获取到用户的第一输入。
当然,用户也可以在终端操作系统的设置界面中提供为不同应用设置自定义触摸区域以及触摸映射规则的入口。如图7B所示,终端在设置界面中提供“定制化触摸”的选项701,用户开启该“定制化触摸”的选项701后,可选择为不同的应用(例如应用A)设置自定义的触摸区域以及触摸映射规则。以应用A为例,用户选择定制化触摸的生效应用为应用A后,如图7B所示,可修改已经建立的触摸区域(例如图7B中的触摸区域1和触摸区域2),用户点击触摸区域1的按钮702后,可对触摸区域1 的大小和位置,以及触摸区域1的触摸映射规则进行修改,当然,用户也可以点击添加自定义触摸区域的按钮703,创建新的触摸区域和触摸映射规则,本申请实施例对此不作任何限制。
当然,用户也可以通过语音等方式向终端输入用于开启自定义触摸屏内触摸映射规则的第一输入,本申请实施例对此不做任何限制。
S502、终端在上述目标应用的显示界面中显示半透明的设置界面。
响应于用户的上述第一输入,在步骤S502中,终端可在当前目标应用的显示界面上叠加绘制一层半透明的图层作为上述设置界面显示在终端的触摸屏中,此时,如图8所示,终端可提示用户在设置界面800上画出自定义的目标触摸区域,用户可自由定义自己所需的目标触摸区域,并可在自定义的目标触摸区域内设置对该目标触摸区域生效的触摸映射规则,以提升目标应用运行时的输入和输出性能。
S503、终端获取用户在上述设置界面中的第二输入,该第二输入包括用户在触摸屏上自定义的目标触摸区域,以及对该目标触摸区域设置的目标触摸映射规则。
In some embodiments of this application, still as shown in FIG. 8, the user may use a region template 802 preset by the terminal (for example, a rectangle template, a triangle template, or a circle template) to draw a target touch region 801 of a certain size anywhere on the setting interface 800. The terminal may then record the specific position and size of the target touch region 801 on the touchscreen by means of the template's plane geometry function (for example, the area function of a rectangle or of a circle). For example, as shown in FIG. 8, the target touch region 801 may be expressed as Area1 = f(p, r), where p is the coordinates of the circle's center and r is the circle's radius.

In some other embodiments of this application, as shown in FIG. 9, the user may instead draw the boundary points of the target touch region on the setting interface 800 in a certain order (for example, clockwise or counterclockwise); connecting these boundary points forms a target touch region 901. The terminal may then record the specific position and size of the target touch region 901 on the touchscreen by means of the coordinates of those boundary points. For example, still as shown in FIG. 9, the target touch region 901 may be expressed as Area2{A, B, C, D, E}, where A, B, C, D, and E are the coordinates of the five boundary points of the target touch region 901 in clockwise order.
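Both representations admit simple containment tests, which the terminal needs later to decide whether a touch point falls inside a region. The sketch below is plain Java and illustrative only: the circle test matches Area1 = f(p, r), and the polygon test for boundary-point regions such as Area2{A, B, C, D, E} uses the standard ray-casting method.

final class Regions {
    // Circular region: center (cx, cy), radius r.
    static boolean inCircle(double px, double py,
                            double cx, double cy, double r) {
        return Math.hypot(px - cx, py - cy) <= r;
    }

    // Polygon region given by boundary points (xs[i], ys[i]) in drawing
    // order; classic ray-casting point-in-polygon test.
    static boolean inPolygon(double px, double py, double[] xs, double[] ys) {
        boolean inside = false;
        for (int i = 0, j = xs.length - 1; i < xs.length; j = i++) {
            boolean crosses = (ys[i] > py) != (ys[j] > py);
            if (crosses) {
                double xAtPy = xs[j]
                        + (py - ys[j]) * (xs[i] - xs[j]) / (ys[i] - ys[j]);
                if (px < xAtPy) inside = !inside;
            }
        }
        return inside;
    }
}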
Further, after the user has customized the target touch region on the setting interface 800, the user can go on to set the touch mapping rule for that region. For example, as shown in FIG. 10, the user sets the circular region at the lower-left corner of game application A as the target touch region 801. After recording the position and size of the target touch region 801, the terminal may further prompt the user to modify the touch mapping rule of the target touch region 801, for example, the touch sensitivity 1001 of the target touch region 801 and the response events 1002 for touch actions.

The terminal may display the touch sensitivity 1001 as a progress bar on the current setting interface; by dragging, the user can change the bar's progress and thereby modify the touch sensitivity of the target touch region 801.

Still as shown in FIG. 10, the progress bar of the touch sensitivity 1001 uses a value range of -100 to 100 as an example. When the user sets the touch sensitivity 1001 to 0, the touch sensitivity of the target touch region 801 does not need to be modified; that is, the terminal uses its default touch sensitivity when responding to the user's touch operations in the target touch region 801. In other words, if the terminal (or the target application) predefines that each 1 cm the user slides on the touchscreen moves the corresponding display object 1 meter, then, when the user sets the touch sensitivity 1001 to 0, each 1 cm the user slides on the touchscreen still moves the corresponding display object 1 meter.

When the user sets the touch sensitivity 1001 to a value greater than 0, the user wants the touch sensitivity in the target touch region 801 to be higher than the current default. Taking a touch sensitivity 1001 of 100 as an example, for each 1 cm the user's touch action moves within the target touch region 801, the terminal may respond by moving the corresponding display object 2 meters; that is, the terminal responds to the user's touch action in the target touch region 801 with twice the distance of the default touch sensitivity. For example, as shown in FIG. 11, when the user moves from point A(0,0) to point B(1,1) within the target touch region 801, with the touch sensitivity set to 100 the terminal may multiply both the horizontal and vertical coordinates of points A and B by 2 to obtain A(0,0) and B'(2,2), and report the modified coordinate points to the target application. The target application then considers that the user moved from A(0,0) to B'(2,2) and responds to this touch action with twice the distance.

Correspondingly, when the user sets the touch sensitivity 1001 to a value less than 0, the user wants the touch sensitivity in the target touch region 801 to be lower than the current default. Taking a touch sensitivity 1001 of -100 as an example, for each 1 cm the user's touch action moves within the target touch region 801, the terminal may respond by moving the corresponding display object 0.5 meter; that is, the terminal responds to the user's touch action in the target touch region 801 with 1/2 the distance of the default touch sensitivity. Still as shown in FIG. 11, when the user moves from point A(0,0) to point B(1,1) within the target touch region 801, with the touch sensitivity set to -100 the terminal may multiply both the horizontal and vertical coordinates of points A and B by 0.5 to obtain A(0,0) and B''(0.5,0.5), and report the modified coordinate points to the target application. The target application then considers that the user moved from A(0,0) to B''(0.5,0.5) and responds to this touch action with 1/2 the distance.
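One mapping from the -100..100 slider to a coordinate scale factor that reproduces every figure in this description (0 maps to 1.0x, 100 to 2.0x, -100 to 0.5x, and 80 to the 1.8x mentioned below) is piecewise linear. The exact curve used by a real terminal is not specified here, so the function below is an assumption for illustration.

final class Sensitivity {
    // Piecewise-linear assumption, s in [-100, 100]:
    //   s = 0    -> 1.0 (default);  s = 100 -> 2.0 (twice the distance)
    //   s = -100 -> 0.5 (half the distance);  s = 80 -> 1.8
    static double toScale(int s) {
        return s >= 0 ? 1.0 + s / 100.0 : 1.0 + s / 200.0;
    }

    // Applies the scale to a reported touch point, as in the A -> B examples:
    // with s = 100, B(1,1) is reported to the application as B'(2,2);
    // with s = -100, B(1,1) becomes B''(0.5, 0.5).
    static double[] mapPoint(double x, double y, int s) {
        double k = toScale(s);
        return new double[] { x * k, y * k };
    }
}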
In this way, still taking FIG. 10 as an example, because the target touch region 801 at the lower-left corner of game application A is generally used to control the movement direction and distance of the game character, when the user raises the touch sensitivity in the target touch region 801, a touch operation covering only a short distance can move the game character to a distant position. This increases the game character's movement speed, giving the user a better gaming experience while also increasing the efficiency of the terminal's input and output operations when running application A.

Optionally, so that the user can quickly grasp what the touch-sensitivity mapping rule means, when the user moves the value of the touch sensitivity 1001 to different positions on the progress bar, the terminal may prompt the user with the concrete meaning of the current touch sensitivity. As shown in FIG. 12, when the user sets the value of the touch sensitivity 1001 to 80, the terminal may use a floating window 1101 to indicate that the game character's movement speed will be increased by a factor of 1.8.

In this case, the terminal may use the touch sensitivity of 80 set by the user as the coordinate mapping parameter in the touch mapping rule, or may use the 1.8x magnification corresponding to a touch sensitivity of 80 as the coordinate mapping parameter. Likewise, when the touch sensitivity set by the user is less than 0, the reduction factor corresponding to the current touch sensitivity may be used as the coordinate mapping parameter in the touch mapping rule. Later, when the terminal detects a first touch operation entered by the user in the target touch region, it can increase or decrease the coordinate values of the touch points in the first touch operation according to the coordinate mapping parameter, thereby mapping the first touch operation to a second touch operation.

In other words, the terminal can offer the user the customization of a target touch region in the form of a touch sensitivity, while storing the user-defined touch sensitivity in the form of a coordinate mapping parameter, so that the customized touch sensitivity can subsequently be realized based on that parameter.
It can be understood that the user may also customize multiple target touch regions in the setting interface displayed in step S502 and set a touch mapping rule for each of them.

Still taking game application A as an example, as shown in FIG. 13, the user customizes two target touch regions on the setting interface: a circular region at the lower-left corner of the touchscreen (target touch region 1) and a rectangular region at the lower-right corner (target touch region 2).

For target touch region 1, the user sets the touch sensitivity in its touch mapping rule to 80, thereby increasing the movement speed of application A's game character. Target touch region 2 is generally used in application A to carry out various in-game attack operations, and these operations are usually fixed when application A is released or when the terminal leaves the factory. For example, double-clicking an attack button may launch an attack. In practice, however, performing double-clicks is strenuous, and the user may want a single click to achieve the attack effect of a double-click. Likewise, some game applications use the frequency of repeated clicks to determine the input value of a certain function, but repeated clicking is strenuous too, and the user may want a long-press to achieve the effect of repeated clicks.

Therefore, in this embodiment of this application, the user can also customize the response events for touch actions within a target touch region. Still as shown in FIG. 13, in target touch region 2 the user can check the options "map click to double-click" and "map long-press to repeated clicks". The terminal stores the response events selected by the user. Later, when the terminal receives a click entered by the user in target touch region 2, the terminal can map the click to a double-click according to the touch mapping rule the user set in advance for target touch region 2, so as to achieve the effect of a double-click, while also improving the efficiency of the terminal's input and output operations when running application A.

In addition, the terminal may offer finer-grained setting options for the touch mapping rules, as sketched after this paragraph. For example, when the user sets clicks to be mapped to double-clicks, the time interval of the double-click can further be set; when the user sets long-presses to be mapped to repeated clicks, parameters such as the duration threshold of the long-press (that is, how long a touch must last before it is mapped to repeated clicks) and the time interval between the adjacent clicks that the long-press maps to can be set, so that the user's subsequent touch experience in application A's operation interface better matches the user's own operating habits.
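The parameters just described could be captured in a small configuration object; the sketch below uses invented names and default values purely to illustrate what a terminal might persist for an event-mapping rule, and none of the values are specified by this description.

// Illustrative only: invented configuration for the event mappings above.
final class EventMappingConfig {
    boolean clickToDoubleClick   = true;  // map single click to double-click
    long    doubleClickGapMs     = 100;   // gap between the two synthesized clicks
    boolean longPressToRepeats   = true;  // map long-press to repeated clicks
    long    longPressThresholdMs = 400;   // press longer than this triggers it
    long    repeatIntervalMs     = 80;    // gap between adjacent synthesized clicks
}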
Further, as shown in FIG. 14, after the terminal receives the touch mapping rule that the user has set for a target touch region (for example, target touch region 1), it may further prompt the user to set the object for which that target touch region and its touch mapping rule take effect. For example, the rule raising the touch sensitivity of target touch region 1 to 80 may be set to take effect for all interfaces while application A runs, or only for one or more interfaces of application A, for example, the battle interface in a battle scenario.

Because each application can distinguish its interfaces at run time by interface identifiers (for example, the Activity class names in the Android system), the terminal can associate the touch mapping rule with the Activity class names of the one or more interfaces set by the user.

For example, the terminal may pre-store the correspondence between the different types of display interfaces in application A and their Activity class names; for instance, the settings interfaces of application A include Activity1 and Activity2, and the battle interfaces of application A include Activity3 and Activity4. Then, when the user sets the battle interface as the object for which the touch mapping rule takes effect, the terminal can associate the rule with Activity3 and Activity4. Later, when the terminal recognizes that a touch operation entered by the user occurs on one of these associated interfaces, it can use the rule in target touch region 1 to respond to the user's touch action. Alternatively, the user may manually enter each display interface and set a touch mapping rule that takes effect only for the current interface; this is not limited in this embodiment of this application.

Certainly, because identifiers such as the package name or process ID that different applications use at run time are generally different as well, the user may also set the touch mapping rule to take effect for one or more other applications. In this case, the terminal can associate the rule with the identifiers of the one or more applications set by the user, so that when the terminal subsequently runs those applications it can likewise use the rule in target touch region 1 to respond to the user's touch actions.
It should be noted that, when the user customizes multiple target touch regions in the setting interface, the regions may overlap. As shown in FIG. 15, the user first customizes target touch region 3 and its touch mapping rule A in the setting interface, and then customizes target touch region 4, which overlaps target touch region 3.

In this case, the touch mapping rule B that the user sets for target touch region 4 may conflict with rule A. For example, if the user sets the touch sensitivity of target touch region 3 to 80 and later sets the touch sensitivity of target touch region 4 to 30, rule B conflicts with rule A. The terminal may then display an error prompt to the user, or, still as shown in FIG. 15, ask the user to confirm whether to change the touch sensitivity previously set for target touch region 3. If the user confirms the change, the terminal may set the touch sensitivity of both target touch region 3 and target touch region 4 to 30.

Of course, if rule A of target touch region 3 does not conflict with rule B of target touch region 4, the terminal can proceed to step S504 below, that is, store each user-customized target touch region and its target touch mapping rule in a certain data structure.
S504. The terminal stores the correspondence among the target touch region, the target touch mapping rule of that region, and the target application.

Specifically, after obtaining the target touch region customized by the user on the touchscreen and the target touch mapping rule set for it, the terminal may store, in the memory and in a preset data structure, the correspondence among the target touch region, its target touch mapping rule, and the application (or interface) for which the rule takes effect, so that when a touch operation entered by the user is subsequently received, the corresponding target touch mapping rule can be looked up to respond to it.

As shown in Table 1, the terminal may use one target touch region as the granularity and create one profile (configuration) file per target touch region. Each profile file is associated with one or more corresponding applications (or interfaces), and each profile file records the position of the corresponding target touch region on the touchscreen as well as that region's target touch mapping rule.

Taking profile1 as an example, the user enters, on the setting interface of game application A, the customized target touch region Area1 and Area1's touch sensitivity (that is, the second input in step S503 above). In response to this second input, the terminal generates profile1, which includes the coordinate information of Area1 and the target touch mapping rule the user set for Area1. In this rule, the touch sensitivity value is changed to 80, while the response events for touch actions keep the terminal's default response mechanism unchanged. At the same time, the terminal establishes the correspondence between profile1 and application A; subsequently, when running application A, the terminal can obtain all profile files corresponding to application A by querying the mapping shown in Table 1.
Table 1
[Table 1 is reproduced as an image (Figure PCTCN2017109781-appb-000003) in the original publication; it lists each profile file together with its associated application, the position of its target touch region, and the region's target touch mapping rule.]
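A data structure along the lines of Table 1 might look as follows. The class and field names are invented, but the content mirrors what the description says a profile records: the region's position, the coordinate mapping parameter, the event mapping, and the applications or interfaces the profile is bound to.

import java.util.List;
import java.util.Map;

// Illustrative only: one profile per user-defined target touch region.
final class Profile {
    String name;               // e.g. "profile1"
    double cx, cy, r;          // circular target touch region, Area1 = f(p, r)
    double coordinateScale;    // coordinate mapping parameter, e.g. 1.8
    String eventMapping;       // e.g. "default" or "click->double-click"
    List<String> effectiveFor; // package names or Activity class names
}

// Lookup table as in Table 1: application identifier -> its profile files.
interface ProfileStore {
    Map<String, List<Profile>> byApplication();
}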
In addition, in some embodiments of this application, as shown in FIG. 16, the terminal may share a generated profile file with other devices, so that the same customized target region and touch control effect can be reproduced on those devices. Of course, if display parameters such as the screen resolution of the receiving device differ from those of the terminal sending the profile file, the receiving device may convert the received profile file accordingly before using it; this is not limited in this embodiment of this application.

Alternatively, in some other embodiments of this application, still as shown in FIG. 16, the terminal may upload a generated profile file to a server in the cloud. When another terminal runs the related application (for example, game application A), it can download profile1 corresponding to application A from the cloud and reproduce the same customized target region and touch control effect on that device.

Of course, after receiving the profile files reported by the terminals, the server may also optimize the profile file corresponding to a given application (or interface) through algorithms such as big-data statistics. For example, if ninety percent of the terminals running application A set the touch sensitivity of Area1 to 80 or above, the server may optimize the Area1 profile file corresponding to application A by setting the touch sensitivity in its target touch mapping rule to 80, and then deliver the optimized profile file to terminals whose touch sensitivity is below 80, so that those terminals achieve a more sensitive touch control effect when running application A. Of course, after receiving the optimized profile file from the server, a terminal may also ask the user whether to use the touch mapping rule set in the optimized profile file, to improve the user experience.

Through steps S501 to S504, in different application scenarios the user can divide the terminal's touchscreen into logical regions to obtain user-customized touch regions, and in those customized regions the user can set touch mapping rules that suit the current application scenario and the user's own operating habits, so that the user later enjoys a tailored touch experience in the customized regions. This achieves refined, personalized control of the terminal's touchscreen, improving the terminal's input and output efficiency in different application scenarios.
In some other embodiments of this application, a touch control method is provided. As shown in FIG. 17, the method includes the following steps.

S601. A terminal obtains a first touch operation entered by a user on a touchscreen.

Optionally, the terminal may obtain the coordinate information of the touch points that the first touch operation passes through. A touch point here may be a touch point detected by the touchscreen when the user enters the first touch operation, or the pixel on the display that corresponds to the touch point detected by the touchscreen.

Similar to step S501, the target application may be any application installed on the terminal, for example, a video application, a game application, or a communications application; this is not limited in this embodiment of this application.

Still taking the foregoing game application A as an example, while running application A the terminal may display application A's display interface on the touchscreen in real time, and the user can invoke the related functions provided by application A by entering corresponding input operations on the touchscreen.
For example, as shown in FIG. 18, after the terminal starts application A and enters the in-game battle interface 1701, the user can operate a virtual joystick 1702 in the lower-left region of the battle interface 1701 to move the game character up, down, left, and right. Then, when the user slides the virtual joystick 1702 to the right (the first touch operation above), the terminal's touchscreen can report the detected touch information (for example, the touch event and the coordinate information of the touch points) in sequence to the terminal's kernel layer, framework layer, and application layer.

Of course, the touchscreen may also carry other detected touch information, such as the touch time of this touch operation, in the first touch operation; this is not limited in this embodiment of this application.

In addition, the coordinate information in the touch operation may be the absolute coordinate information of the touch point on the touchscreen, or the relative coordinate information obtained after the terminal converts that absolute coordinate information.

The absolute coordinate information refers to the coordinates of the touch point in the coordinate system defined by the touchscreen's manufacturer. For example, during production, the touchscreen's coordinate system can be written into the touchscreen's IC chip. As shown in (a) of FIG. 19, a first coordinate system is set with the lower-left corner of the touchscreen as the origin O(0,0); when the touchscreen detects a touch operation entered by the user at point P, it can determine in the first coordinate system that P's coordinates are P(5,0). Here, P(5,0) is the absolute coordinate information.

In some cases, however, the coordinate system set on the touchscreen may differ from the coordinate system defined by the terminal's operating system. For example, as shown in (b) of FIG. 19, the terminal's operating system sets a second coordinate system with the upper-left corner of the touchscreen as the origin O'(0,0); the touch operation entered by the user at point P on the touchscreen is then mapped to a touch operation at point P'(5,15) in the second coordinate system, and P'(5,15) is the relative coordinate information.

Optionally, the mapping between the absolute coordinate information and the relative coordinate information may be performed by the terminal's kernel layer or by its framework layer; this is not limited in this embodiment of this application.
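For the concrete numbers in FIG. 19, where the first coordinate system has its origin at the bottom-left corner (y pointing up) and the second at the top-left corner (y pointing down), the conversion only flips the vertical axis against the screen height, which in this example must be 15 since P(5,0) maps to P'(5,15). A minimal sketch:

final class Coordinates {
    // Absolute (origin bottom-left, y up) -> relative (origin top-left,
    // y down). screenHeight is in the same units as the coordinates.
    static double[] absoluteToRelative(double x, double y, double screenHeight) {
        return new double[] { x, screenHeight - y };
    }
}
// Coordinates.absoluteToRelative(5, 0, 15) yields P'(5, 15).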
S602. The terminal determines whether the first touch operation acts on a first preset region of a target interface.

The target interface refers to any interface presented by the target application running in the foreground in step S601. Taking the first preset region as the user-customized target touch region in the embodiments above, the target interface may cover part or all of the target touch region.

In addition, the first touch operation acting on the first preset region may mean that the operation object or operation trajectory of the user's first touch operation, detected by the touchscreen through hovering or touching, falls within the first preset region. For example, when the coordinates of a touch point of the first touch operation are detected to fall within the first preset region, it can be determined that the first touch operation acts on the first preset region.

Specifically, in step S602, after the terminal obtains the first touch operation in the target interface, in order to determine whether to respond to it with the user-customized touch mapping rule, the terminal may obtain the identifier of the target application currently running in the foreground, and then use that identifier to look up, in the correspondence shown in Table 1, all the profile files corresponding to the target application. Because these profile files all record the specific positions of the user-customized target touch regions, the terminal can determine, from the coordinate information of the touch points in the first touch operation, exactly which profile file's target touch region the first touch operation falls into.

For example, the terminal obtains the package name of application A as the identifier of the currently running application; Table 1 then shows that the user customized two profile files for application A (profile1 and profile2 in Table 1). The terminal can then compare the coordinate information P(x, y) of the touch point in the first touch operation with Area1 in profile1 and Area2 in profile2, and determine that point P falls within the target touch region Area1 at the lower-left corner of the touchscreen.
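Putting the pieces together, the lookup in this step amounts to fetching the profiles bound to the foreground application's identifier and testing the reported touch point against each profile's region in turn. The sketch below reuses the invented Profile class from the Table 1 sketch and is illustrative only.

import java.util.List;
import java.util.Map;

final class ProfileLookup {
    // Returns the first profile of the foreground app whose region
    // contains the touch point P(x, y), or null if none matches.
    static Profile find(Map<String, List<Profile>> table,
                        String packageName, double x, double y) {
        List<Profile> profiles = table.get(packageName);
        if (profiles == null) return null; // no custom regions for this app
        for (Profile p : profiles) {
            if (Math.hypot(x - p.cx, y - p.cy) <= p.r) {
                return p; // e.g. P(x, y) falls inside profile1's Area1
            }
        }
        return null; // outside every region: respond with default behavior
    }
}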
It should be noted that, when the terminal determines whether the coordinate information falls within a target touch region, the coordinate system used by the touch-point coordinates should be the same as the coordinate system of the target touch region recorded in Table 1. For example, the terminal records Area1's position in the second coordinate system defined by the operating system, while the touch-point coordinate information P(x, y) in the first touch operation reported by the touchscreen is recorded in the first coordinate system. In that case, when the touchscreen reports the coordinate information P(x, y) to the terminal's kernel layer, the kernel layer can map P(x, y) to the coordinates P'(x', y') in the second coordinate system, and then determine whether P'(x', y') falls within the target touch region Area1.

S603. If the first touch operation falls within the first preset region, the terminal maps the first touch operation to a second touch operation.

S604. The target application responds to the second touch operation, so as to implement a customized touch control function while the target application runs.
Still taking as an example that the touch-point coordinate information P(x, y) of the first touch operation falls within the target touch region Area1, and with reference to the Android architecture shown in FIG. 3 and FIG. 4, as shown in FIG. 20, the touchscreen uses its driver to encapsulate the detected first touch operation as a raw touch event and reports it to the terminal's kernel layer 204. The kernel layer 204 maps the coordinate information P(x, y) carried in the raw touch event to the coordinates P'(x', y') in the second coordinate system, encapsulates the raw touch event as a high-level touch event readable by the upper layers, and reports it to the framework layer 202. By querying the correspondence between profile files and applications shown in Table 1, the framework layer 202 can determine that the coordinates P'(x', y') carried in the high-level touch event fall within the user-customized target touch region Area1 in profile1.

At this point, the terminal can look up the target touch mapping rule recorded in profile1, which sets a coordinate mapping parameter reflecting the touch sensitivity, for example, a coordinate mapping parameter of 1.8; that is, application A responds to the first touch action entered by the user in the target touch region Area1 at 1.8 times the distance. The framework layer 202 can then multiply both the horizontal and vertical coordinates of P'(x', y') by 1.8 to obtain the corrected coordinates Q(1.8x', 1.8y'), and use Q(1.8x', 1.8y') as the touch-point coordinate information of the second touch operation. The framework layer 202 carries the corrected coordinates Q(1.8x', 1.8y') in the high-level touch event and reports it to the running application A at the application layer, so that application A responds to the second touch operation based on the corrected coordinates Q(1.8x', 1.8y'). In other words, what the user enters on the touchscreen is a first touch operation at point P(x, y), while what the application inside the terminal ultimately responds to for the user is a second touch operation at point Q(1.8x', 1.8y').

For example, as shown in FIG. 21, the terminal detects the user's first touch operation at point P(1,0). Because point P falls within the user-customized target touch region Area1 in profile1, the terminal modifies the coordinates P(1,0) to Q(1.8*1, 1.8*0) = Q(1.8, 0) according to the touch sensitivity value of 80 in profile1. After application A obtains the second touch operation with coordinates Q(1.8, 0), if the touch-point coordinates in the previous touch operation that application A received were O(0,0), application A considers that the user's finger moved the virtual joystick 1702 1.8 cm to the right, from O(0,0) to Q(1.8, 0), whereas the user actually moved the virtual joystick 1702 only 1 cm to the right, from O(0,0) to P(1,0). The effect of moving the virtual joystick 1702 by 1.8 cm is thus achieved, increasing the game character's movement speed.

In addition, the operable region of the virtual joystick 1702 may be fixed; for example, the user is only allowed to operate the virtual joystick 1702 within the circular region shown as Area1 in FIG. 21. In this case, if, after the terminal modifies the touch-point coordinates of the first touch operation according to the touch sensitivity value, the modified coordinates (for example, point Q above) exceed the control boundary of the virtual joystick 1702's operable region Area1, then, as shown in FIG. 21, the terminal can report to application A the point Z on the control boundary closest to point Q as the touch-point coordinate information of the mapped second touch operation. This avoids the problem of the application being unable to respond correctly to the first touch operation because the modified coordinates exceed the operable region corresponding to the first touch operation.
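For a circular control region such as Area1, the nearest boundary point Z to an out-of-range point Q is simply Q projected back onto the circle along the line through the center. A sketch under that assumption:

final class Clamp {
    // Projects Q onto the boundary of the circle (cx, cy, r) if it lies
    // outside; otherwise returns Q unchanged. The returned point is the
    // Z reported to the application instead of Q.
    static double[] toCircle(double qx, double qy,
                             double cx, double cy, double r) {
        double dx = qx - cx, dy = qy - cy;
        double d = Math.hypot(dx, dy);
        if (d <= r) {
            return new double[] { qx, qy }; // already inside the region
        }
        double k = r / d; // shrink the offset back to the boundary
        return new double[] { cx + dx * k, cy + dy * k };
    }
}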
It should be noted that the embodiments above are described using a fixed touch sensitivity as an example; that is, as shown in (a) of FIG. 22, after the user sets the touch sensitivity of the target touch region Area1 to 80, the terminal always responds to the user's touch operations within Area1 at a ratio of 1.8.

It can be understood that the terminal may also modify the touch operation in a nonlinear manner that eventually reaches the touch sensitivity set by the user. For example, as shown in (b) of FIG. 22, after the user sets the touch sensitivity of the target touch region Area1 to 80, the terminal can vary the touch sensitivity according to the distance the user slides within Area1: the larger the sliding distance, the more the sensitivity increases, until it reaches 1.8 times the default touch sensitivity.
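One way to realize the behavior in (b) of FIG. 22 is to ramp the scale factor up with the sliding distance until it saturates at the configured maximum (1.8x for a touch sensitivity of 80). The ramp length is not specified in this description, so rampDistance below is an assumed tuning parameter.

final class NonlinearSensitivity {
    // Scale grows linearly from 1.0 to maxScale over rampDistance,
    // then saturates.
    static double scaleFor(double slideDistance,
                           double maxScale, double rampDistance) {
        double t = Math.min(1.0, slideDistance / rampDistance);
        return 1.0 + (maxScale - 1.0) * t;
    }
}
// scaleFor(0, 1.8, 3) == 1.0; scaleFor(3, 1.8, 3) == 1.8 (saturated).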
In addition, as shown in FIG. 23, after the framework layer 202 reports the high-level touch event to the running application A at the application layer, application A can call the related library functions in the system runtime library layer 203, and the library functions use the parameters passed in the high-level touch event to help application A determine the specific touch operation the user performed at point P, for example, a click. Then, having determined that the current touch operation is a click, if the profile file corresponding to touch point P records that the response event for a click is a double-click, the terminal does not call back the callback function that application A wrote for the click, but instead calls back the callback function that application A wrote for the double-click. That is, the first touch operation (a click) is mapped to the second touch operation (a double-click), so that application A responds to the double-click and achieves the double-click effect at touch point P.

Of course, if the framework layer 202, after obtaining high-level touch event 1 generated when the user performs the click at point P, can determine from high-level touch event 1 that the user performed a click at P, the framework layer 202 can, according to the profile file corresponding to point P, modify high-level touch event 1 produced by the click into high-level touch event 2 that would be produced if the user performed a double-click at P, and report high-level touch event 2 to the running application A at the application layer. In this way, when application A calls the related library functions in the system runtime library layer 203, it determines that the specific touch operation the user performed at point P is a double-click, and the terminal can call back the callback function that application A wrote for the double-click, likewise making application A respond to the double-click and achieve the double-click effect at touch point P.

Further, while an application is running, users generally have differing needs for touch sensitivity only when performing sliding-type touch operations. Therefore, when the coordinate information of a touch operation falls within the target touch region, the terminal may first use the library functions to determine that the current touch operation is a slide, and only then modify the coordinate information of the touch operation according to the user-customized touch sensitivity in the profile file.

Of course, if the coordinate information of the touch operation does not fall within any target touch region preset for the target application, or the touch mapping rule set in the profile file corresponding to that coordinate information is the terminal's default touch mapping rule, the terminal does not need to modify the touch operation after obtaining it; the target application simply responds according to the touch information actually carried in the touch operation. This is not limited in this embodiment of this application.
It can be understood that, to implement the foregoing functions, the terminal and the like include corresponding hardware structures and/or software modules for performing each function. A person skilled in the art should easily be aware that, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, the embodiments of this application can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the embodiments of this application.

In the embodiments of this application, the terminal and the like may be divided into functional modules according to the foregoing method examples. For example, each functional module may be divided in correspondence with each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of this application is schematic and is merely a logical function division; there may be other division manners in actual implementation.
When each functional module is divided in correspondence with each function, FIG. 24 shows a possible schematic structural diagram of the terminal in the foregoing embodiments. The terminal includes an obtaining unit 2401, a storage unit 2402, a display unit 2403, and a mapping unit 2404.

The obtaining unit 2401 is configured to support the terminal in performing processes S501 and S503 in FIG. 5 and process S601 in FIG. 17; the storage unit 2402 is configured to support the terminal in performing process S504 in FIG. 5; the display unit 2403 is configured to support the terminal in performing process S502 in FIG. 5; and the mapping unit 2404 is configured to support the terminal in performing processes S602 to S604 in FIG. 17. All related content of the steps in the foregoing method embodiments can be cited in the function descriptions of the corresponding functional modules; details are not repeated here.

When an integrated unit is used, the mapping unit 2404 may serve as a processing module, the storage unit 2402 as a storage module, the obtaining unit 2401 as an input module, and the display unit 2403 as a display module.
In this case, FIG. 25 shows a possible schematic structural diagram of the terminal in the foregoing embodiments. The processing module 2502 is configured to control and manage the terminal's actions. The input module 2503 is configured to support interaction between the terminal and the user. The storage module 2501 is configured to store the terminal's program code and data. The display module 2504 is configured to display information entered by the user or information provided to the user, as well as the terminal's various menus.

In this embodiment of this application, the terminal may obtain, through the input module 2503, a first touch operation entered by the user on the touchscreen. When the first touch operation acts on a first preset region in a target interface (that is, an interface of the target application running in the foreground), the processing module 2502 may map the first touch operation to a second touch operation, so that the target application responds to the second touch operation, implementing refined and personalized control of the touchscreen.

All related content of the steps in the foregoing touch control methods can be cited in the related descriptions of steps S501 to S504 or S601 to S604 in the embodiments above; details are not repeated here.
For example, the processing module 2502 may be a processor or controller, for example, a central processing unit (CPU), a GPU, a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It can implement or execute the various example logical blocks, modules, and circuits described in connection with the disclosure of this application. The processor may also be a combination that implements computing functions, for example, a combination including one or more microprocessors, or a combination of a DSP and a microprocessor.

The input module 2503 may be an input/output device such as a touchscreen or a microphone, or a communications interface.

The storage module 2501 may be a memory, which may include high-speed random access memory (RAM) and may also include nonvolatile memory, for example, a magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.

The display module 2504 may be a display, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. In addition, a touchpad may be integrated on the display to collect touch operations on or near it and send the collected touch information to other components (for example, the processor).

When the processing module 2502 is a processor, the input module 2503 is a touchscreen, the storage module 2501 is a memory, and the display module 2504 is a display, the terminal provided in this embodiment of this application may be the mobile phone 100 shown in FIG. 1.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When a software program is used for implementation, the embodiments may appear entirely or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are produced entirely or partially. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.

The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (33)

  1. A touch control method, comprising:
    obtaining, by a terminal, a first touch operation entered by a user on a touchscreen;
    when the first touch operation acts on a first preset region in a target interface, mapping, by the terminal, the first touch operation to a second touch operation, so that a target application responds to the second touch operation;
    wherein the target interface is any interface presented by the target application that covers the first preset region, and the target application runs in the foreground.
  2. The method according to claim 1, wherein the mapping, by the terminal when the first touch operation acts on a first preset region in a target interface, of the first touch operation to a second touch operation comprises:
    when the terminal detects the first touch operation in the target interface, looking up, by the terminal, at least one preset region associated with the target application, wherein the at least one preset region includes the first preset region;
    when a touch point of the first touch operation falls within the first preset region, obtaining, by the terminal, a touch mapping rule preset for the first preset region;
    mapping, by the terminal, the first touch operation to the second touch operation according to the touch mapping rule.
  3. The method according to claim 1 or 2, wherein the mapping, by the terminal, of the first touch operation to a second touch operation comprises:
    modifying, by the terminal, coordinate values of touch points in the first touch operation, and using the modified coordinate values as the coordinate values of touch points in the second touch operation.
  4. The method according to claim 2, wherein the touch mapping rule includes a coordinate mapping parameter;
    wherein the mapping, by the terminal, of the first touch operation to the second touch operation according to the touch mapping rule comprises:
    increasing or decreasing, by the terminal according to the coordinate mapping parameter, the coordinate values of touch points in the first touch operation to obtain the coordinate values of touch points in the second touch operation.
  5. The method according to claim 4, wherein the increasing or decreasing, by the terminal according to the coordinate mapping parameter, of the coordinate values of touch points in the first touch operation comprises:
    multiplying, by the terminal, the coordinate values of touch points in the first touch operation by the coordinate mapping parameter, wherein the coordinate mapping parameter is greater than 1 or less than 1.
  6. The method according to any one of claims 3 to 5, further comprising, after the terminal modifies the coordinate values of the touch points on which the first touch operation acts:
    if a modified coordinate value exceeds a control boundary preset for the first touch operation, using, by the terminal, the coordinate value on the control boundary closest to the modified coordinate value as a coordinate value of a touch point in the second touch operation.
  7. The method according to any one of claims 2 to 6, wherein the mapping, by the terminal, of the first touch operation to the second touch operation according to the touch mapping rule comprises:
    mapping, by the terminal according to the touch mapping rule, a first touch event generated when the user performs the first touch operation to a second touch event generated when the user performs a second touch operation, and reporting the second touch event to the target application.
  8. The method according to any one of claims 2 to 6, wherein the mapping, by the terminal, of the first touch operation to a second touch operation so that the target application responds to the second touch operation specifically comprises:
    reporting, by the terminal, a touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation based on the touch event;
    mapping, by the terminal according to the touch mapping rule, the determined first touch operation to the second touch operation, and instructing the target application to respond to the second touch operation.
  9. The method according to claim 7 or 8, wherein the touch mapping rule is used to instruct that a click operation be mapped to a double-click operation, or that a long-press operation be mapped to repeated click operations.
  10. A touch control method, comprising:
    displaying, by a terminal in response to a first input of a user, a setting interface used to instruct the user to customize a touch region;
    obtaining, by the terminal in response to a second input of the user, a target touch region customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch region, wherein the touch mapping rule is used to instruct that a first touch operation obtained within the target touch region be mapped to a second touch operation.
  11. The method according to claim 10, wherein the obtaining, by the terminal, of the target touch region customized by the user on the setting interface comprises:
    receiving, by the terminal, a target touch region drawn by the user on the setting interface by using a preset region template; or
    receiving, by the terminal, K boundary points marked by the user on the setting interface, wherein the K boundary points, connected in a specified order, form the target touch region, and K > 2.
  12. The method according to claim 10 or 11, wherein the obtaining, by the terminal, of the touch mapping rule customized by the user on the setting interface for the target touch region comprises:
    receiving, by the terminal, a coordinate mapping parameter set by the user for the target touch region, wherein the coordinate mapping parameter is used to indicate a mapping rule for coordinate values of touch points when the terminal responds to the first touch operation; and/or
    receiving, by the terminal, an event mapping parameter set by the user for the target touch region, wherein the event mapping parameter is used to indicate a mapping rule for touch events when the terminal responds to the first touch operation.
  13. The method according to claim 12, further comprising, after the terminal receives the coordinate mapping parameter set by the user for the target touch region:
    prompting, by the terminal, the user with the touch effect with which the terminal responds to touch operations in the target touch region under the current coordinate mapping parameter.
  14. The method according to any one of claims 10 to 13, further comprising, after the terminal obtains the target touch region customized by the user on the setting interface and the touch mapping rule customized by the user for the target touch region:
    receiving, by the terminal, an effective object set by the user for the touch mapping rule, wherein the effective object includes at least one application and/or at least one display interface.
  15. The method according to claim 14, further comprising, after the terminal receives the effective object set by the user for the touch mapping rule:
    establishing, by the terminal, an association among the target touch region, the touch mapping rule of the target touch region, and the effective object.
  16. The method according to any one of claims 10 to 15, wherein the displaying, by the terminal, of the setting interface used to instruct the user to customize a touch region comprises:
    displaying, by the terminal, superimposed on the display interface of a target application running in the foreground, a semi-transparent setting interface used to instruct the user to customize a touch region.
  17. A terminal, comprising a processor, a memory, and an input device that are connected by a bus, wherein:
    the input device is configured to obtain a first touch operation entered by a user on a touchscreen;
    the processor is configured to determine that the first touch operation acts on a first preset region in a target interface, and map the first touch operation to a second touch operation, so that a target application responds to the second touch operation;
    wherein the target interface is any interface presented by the target application that covers the first preset region, and the target application runs in the foreground.
  18. The terminal according to claim 17, wherein, when the input device detects the first touch operation in the target interface, the mapping, by the processor, of the first touch operation to a second touch operation specifically comprises:
    looking up, by the processor, at least one preset region associated with the target application, wherein the at least one preset region includes the first preset region; when a touch point of the first touch operation falls within the first preset region, obtaining, by the processor, a touch mapping rule preset for the first preset region; and mapping the first touch operation to the second touch operation according to the touch mapping rule.
  19. The terminal according to claim 17 or 18, wherein the mapping, by the processor, of the first touch operation to a second touch operation specifically comprises:
    modifying, by the processor, coordinate values of touch points in the first touch operation, and using the modified coordinate values as the coordinate values of touch points in the second touch operation.
  20. The terminal according to claim 18, wherein the touch mapping rule includes a coordinate mapping parameter, and the mapping, by the processor, of the first touch operation to the second touch operation according to the touch mapping rule specifically comprises:
    increasing or decreasing, by the processor according to the coordinate mapping parameter, the coordinate values of touch points in the first touch operation to obtain the coordinate values of touch points in the second touch operation.
  21. The terminal according to claim 20, wherein the increasing or decreasing, by the processor according to the coordinate mapping parameter, of the coordinate values of touch points in the first touch operation specifically comprises:
    multiplying, by the processor, the coordinate values of touch points in the first touch operation by the coordinate mapping parameter, wherein the coordinate mapping parameter is greater than 1 or less than 1.
  22. The terminal according to any one of claims 19 to 21, wherein
    the processor is further configured to determine that a modified coordinate value exceeds a control boundary preset for the first touch operation, and use the coordinate value on the control boundary closest to the modified coordinate value as a coordinate value of a touch point in the second touch operation.
  23. The terminal according to any one of claims 18 to 22, wherein the mapping, by the processor, of the first touch operation to the second touch operation according to the touch mapping rule specifically comprises:
    mapping, by the processor according to the touch mapping rule, a first touch event generated when the user performs the first touch operation to a second touch event generated when the user performs a second touch operation, and reporting the second touch event to the target application.
  24. The terminal according to any one of claims 18 to 22, wherein the mapping, by the processor, of the first touch operation to a second touch operation so that the target application responds to the second touch operation specifically comprises:
    reporting, by the processor, a touch event generated when the user performs the first touch operation to the target application, so that the target application instructs the terminal to determine the first touch operation based on the touch event;
    mapping, by the processor according to the touch mapping rule, the determined first touch operation to the second touch operation, and instructing the target application to respond to the second touch operation.
  25. A terminal, comprising a processor, a memory, a display, and an input device that are connected by a bus, wherein:
    the input device is configured to receive a first input and a second input of a user;
    the display is configured to display, in response to the first input, a setting interface used to instruct the user to customize a touch region;
    the processor is configured to obtain, in response to the second input, a target touch region customized by the user on the setting interface and a touch mapping rule customized by the user for the target touch region, wherein the touch mapping rule is used to instruct that a first touch operation obtained within the target touch region be mapped to a second touch operation.
  26. The terminal according to claim 25, wherein
    the input device is further configured to: receive a target touch region drawn by the user on the setting interface by using a preset region template; or receive K boundary points marked by the user on the setting interface, wherein the K boundary points, connected in a specified order, form the target touch region, and K > 2.
  27. The terminal according to claim 25 or 26, wherein
    the input device is further configured to: receive a coordinate mapping parameter set by the user for the target touch region, wherein the coordinate mapping parameter is used to indicate a mapping rule for coordinate values of touch points when the terminal responds to the first touch operation; and/or receive an event mapping parameter set by the user for the target touch region, wherein the event mapping parameter is used to indicate a mapping rule for touch events when the terminal responds to the first touch operation.
  28. The terminal according to claim 27, wherein
    the display is further configured to prompt the user with the touch effect with which the terminal responds to touch operations in the target touch region under the current coordinate mapping parameter.
  29. The terminal according to any one of claims 25 to 28, wherein
    the input device is further configured to receive an effective object set by the user for the touch mapping rule, wherein the effective object includes at least one application and/or at least one display interface.
  30. The terminal according to claim 29, wherein
    the processor is further configured to establish an association among the target touch region, the touch mapping rule of the target touch region, and the effective object, and store the association in the memory.
  31. The terminal according to any one of claims 25 to 30, wherein
    the display is specifically configured to display, superimposed on the display interface of a target application running in the foreground, a semi-transparent setting interface used to instruct the user to customize a touch region.
  32. A computer-readable storage medium storing instructions, wherein, when the instructions run on a terminal, the terminal is caused to perform the touch control method according to any one of claims 1 to 9 or 10 to 16.
  33. A computer program product containing instructions, wherein, when the computer program product runs on a terminal, the terminal is caused to perform the touch control method according to any one of claims 1 to 9 or 10 to 16.
PCT/CN2017/109781 2017-11-07 2017-11-07 Touch control method and apparatus WO2019090486A1 (zh)

Priority Applications (11)

Application Number Priority Date Filing Date Title
ES17931613T ES2947297T3 (es) 2017-11-07 2017-11-07 Método y dispositivo de control táctil
AU2017438902A AU2017438902B2 (en) 2017-11-07 2017-11-07 Touch control method and apparatus
CN202210308200.8A CN114816208A (zh) 2017-11-07 2017-11-07 一种触摸控制方法及装置
CN202210308205.0A CN114879893B (zh) 2017-11-07 2017-11-07 一种触摸控制方法及装置
PCT/CN2017/109781 WO2019090486A1 (zh) 2017-11-07 2017-11-07 一种触摸控制方法及装置
CN201780082151.8A CN110168487B (zh) 2017-11-07 2017-11-07 一种触摸控制方法及装置
EP17931613.8A EP3683665B1 (en) 2017-11-07 2017-11-07 Touch control method and device
US16/754,599 US11188225B2 (en) 2017-11-07 2017-11-07 Touch control method and apparatus
EP22193002.7A EP4163778A1 (en) 2017-11-07 2017-11-07 Touch control method and apparatus
US17/374,133 US11526274B2 (en) 2017-11-07 2021-07-13 Touch control method and apparatus
US17/977,804 US11809705B2 (en) 2017-11-07 2022-10-31 Touch control method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/109781 WO2019090486A1 (zh) 2017-11-07 2017-11-07 Touch control method and apparatus

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/754,599 A-371-Of-International US11188225B2 (en) 2017-11-07 2017-11-07 Touch control method and apparatus
US17/374,133 Continuation US11526274B2 (en) 2017-11-07 2021-07-13 Touch control method and apparatus

Publications (1)

Publication Number Publication Date
WO2019090486A1 true WO2019090486A1 (zh) 2019-05-16

Family

ID=66438726

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/109781 WO2019090486A1 (zh) 2017-11-07 2017-11-07 Touch control method and apparatus

Country Status (6)

Country Link
US (3) US11188225B2 (zh)
EP (2) EP3683665B1 (zh)
CN (3) CN114879893B (zh)
AU (1) AU2017438902B2 (zh)
ES (1) ES2947297T3 (zh)
WO (1) WO2019090486A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110442263A * 2019-07-23 2019-11-12 Shenzhen Realme Mobile Telecommunications Co., Ltd. Touch display screen processing method and apparatus, storage medium, and electronic device
CN111735450A * 2020-04-08 2020-10-02 Tencent Technology (Shenzhen) Co., Ltd. Inertial navigation data transmission method and apparatus
CN114341784A * 2019-11-12 2022-04-12 Shenzhen Heytap Technology Co., Ltd. Touch event processing method and apparatus, mobile terminal, and storage medium

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114879893B (zh) * 2017-11-07 2023-04-25 Huawei Technologies Co., Ltd. Touch control method and apparatus
JP6880326B2 (ja) * 2018-06-27 2021-06-02 FUJIFILM Corporation Imaging device, imaging method, and program
US10643420B1 (en) * 2019-03-20 2020-05-05 Capital One Services, Llc Contextual tapping engine
CN112525566B (zh) * 2019-09-17 2022-12-13 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Device testing method and apparatus, and electronic device
CN110727372B (zh) * 2019-09-29 2023-04-07 Realme Chongqing Mobile Telecommunications Corp., Ltd. Accidental-touch prevention method, terminal, and storage medium
CN113127188A (zh) * 2019-12-31 2021-07-16 Huawei Technologies Co., Ltd. Performance optimization method and electronic device
CN112000247B (zh) * 2020-08-27 2024-07-23 Nubia Technology Co., Ltd. Touch signal processing method, device, and computer-readable storage medium
CN116048305B (zh) * 2023-03-27 2023-06-16 Nanjing SemiDrive Semiconductor Technology Co., Ltd. Touch display control method, apparatus, system, and chip
CN117102085B (zh) * 2023-10-24 2024-01-26 Jiangsu Institute of Quality Supervision and Inspection for Electronic Information Products (Jiangsu Information Security Evaluation Center) Integrated testing and sorting equipment for electronic components

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103713759A * 2012-09-29 2014-04-09 Hongjing Technology Co., Ltd. Touch input device, control method thereof, and method for generating a definition file
CN105045522A * 2015-09-01 2015-11-11 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Touch control method and apparatus for a handheld terminal
CN107203321A * 2017-03-27 2017-09-26 NetEase (Hangzhou) Network Co., Ltd. Display control method and apparatus for a game screen, storage medium, and electronic device
CN107305466A * 2016-04-19 2017-10-31 Xi'an Zhongxing New Software Co., Ltd. Operation method and apparatus for a mobile terminal

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5611040A (en) * 1995-04-05 1997-03-11 Microsoft Corporation Method and system for activating double click applications with a single click
WO2008052100A2 (en) 2006-10-26 2008-05-02 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
CN101676844A (zh) 2008-09-18 2010-03-24 Lenovo (Beijing) Co., Ltd. Method and apparatus for processing input information of a touchscreen
US8510665B2 (en) 2009-03-16 2013-08-13 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US9563350B2 (en) * 2009-08-11 2017-02-07 Lg Electronics Inc. Mobile terminal and method for controlling the same
US8358281B2 (en) 2009-12-15 2013-01-22 Apple Inc. Device, method, and graphical user interface for management and manipulation of user interface elements
JP4937421B2 (ja) * 2010-02-26 2012-05-23 Capcom Co., Ltd. Computer program and computer device
CN102207783A (zh) 2010-03-31 2011-10-05 Hon Hai Precision Industry (Shenzhen) Co., Ltd. Electronic device and method with customizable touch actions
KR101361214B1 (ko) * 2010-08-17 2014-02-10 Pantech Co., Ltd. Interface apparatus and method for setting a control region of a touchscreen
CN102479039A (zh) * 2010-11-30 2012-05-30 Hanvon Technology Co., Ltd. Control method for a touch device
CN102110220B (zh) * 2011-02-14 2013-01-23 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Application program monitoring method and apparatus
JP5295328B2 (ja) * 2011-07-29 2013-09-18 KDDI Corporation User interface device allowing input via a screen pad, input processing method, and program
JP5865039B2 (ja) * 2011-11-30 2016-02-17 Canon Inc. Information processing apparatus, control method for an information processing apparatus, and program
US20130249810A1 (en) * 2012-03-22 2013-09-26 Microsoft Corporation Text entry mode selection
CN102707882A (zh) * 2012-04-27 2012-10-03 Shenzhen Ruigao Information Technology Co., Ltd. Control conversion method for a virtual-icon touchscreen application, and touchscreen terminal
KR101431581B1 (ko) * 2012-06-22 2014-09-23 Research & Business Foundation, Sungkyunkwan University Virtual game controller device based on a mobile terminal and remote control system using the same
CN103272382B (zh) * 2013-05-23 2016-07-06 Shenzhen Shixun Hulian Technology Co., Ltd. Method and apparatus for a Bluetooth gamepad to emulate a smart terminal's touchscreen to control a game
US9561432B2 (en) * 2014-03-12 2017-02-07 Wargaming.Net Limited Touch control with dynamic zones
CN104492080A (zh) * 2014-11-17 2015-04-08 Shenzhen Rapoo Technology Co., Ltd. Method for adjusting the joystick sensitivity of a game controller, and game controller
CN104484111A (zh) * 2014-12-30 2015-04-01 Xiaomi Inc. Content display method and apparatus for a touchscreen
KR102211776B1 (ko) * 2015-01-02 2021-02-03 Samsung Electronics Co., Ltd. Content selection method and electronic device therefor
US9772743B1 (en) * 2015-03-30 2017-09-26 Electronic Arts Inc. Implementation of a movable control pad on a touch enabled device
US9891811B2 (en) * 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
CN105094345B (zh) * 2015-09-29 2018-07-27 Tencent Technology (Shenzhen) Co., Ltd. Information processing method, terminal, and computer storage medium
CN105335065A (zh) * 2015-10-10 2016-02-17 Tencent Technology (Shenzhen) Co., Ltd. Information processing method, terminal, and computer storage medium
CN105959530A (zh) * 2016-04-26 2016-09-21 LeTV Holding (Beijing) Co., Ltd. Method and system for invoking a camera function according to personalized attributes of an application
CN106020678A (zh) * 2016-04-29 2016-10-12 Hisense Mobile Communications Technology Co., Ltd. Method and apparatus for performing touch operations on a mobile device
CN106354418B (zh) 2016-11-16 2019-07-09 Tencent Technology (Shenzhen) Co., Ltd. Touchscreen-based control method and apparatus
CN108604158A (zh) * 2016-12-01 2018-09-28 Huawei Technologies Co., Ltd. Method for customizing an application operation area of a terminal, and terminal
CN106775259A (zh) * 2017-01-09 2017-05-31 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Information processing method and apparatus, and terminal
CN107126698B (zh) 2017-04-24 2020-06-30 NetEase (Hangzhou) Network Co., Ltd. Method and apparatus for controlling a game virtual object, electronic device, and readable medium
CN114879893B (zh) * 2017-11-07 2023-04-25 Huawei Technologies Co., Ltd. Touch control method and apparatus

Also Published As

Publication number Publication date
US11188225B2 (en) 2021-11-30
US20220004316A1 (en) 2022-01-06
EP3683665A1 (en) 2020-07-22
CN110168487B (zh) 2022-04-05
EP3683665B1 (en) 2023-03-29
US11526274B2 (en) 2022-12-13
US11809705B2 (en) 2023-11-07
CN110168487A (zh) 2019-08-23
EP4163778A1 (en) 2023-04-12
CN114816208A (zh) 2022-07-29
EP3683665A4 (en) 2020-10-21
AU2017438902B2 (en) 2022-02-03
CN114879893A (zh) 2022-08-09
CN114879893B (zh) 2023-04-25
US20200257445A1 (en) 2020-08-13
US20230112839A1 (en) 2023-04-13
AU2017438902A1 (en) 2020-04-30
ES2947297T3 (es) 2023-08-04

Similar Documents

Publication Publication Date Title
US11809705B2 (en) Touch control method and apparatus
US11902355B2 (en) Method for sharing data in local area network and electronic device
WO2021104365A1 (zh) 对象分享方法及电子设备
US11989383B2 (en) Application window display method and terminal
US12067211B2 (en) Multi-window display interface with historical task bar
AU2021209226A1 (en) Display method and apparatus
WO2018223558A1 (zh) 数据处理方法及电子设备
WO2019183997A1 (zh) 视频的预览方法及电子设备
WO2020215969A1 (zh) 内容输入方法及终端设备
WO2020000276A1 (zh) 一种快捷按键的控制方法及终端
CN110178111B (zh) 一种终端的图像处理方法及装置
CN116048334A (zh) 页面切换方法、页面切换装置、电子设备和可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17931613

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017931613

Country of ref document: EP

Effective date: 20200417

ENP Entry into the national phase

Ref document number: 2017438902

Country of ref document: AU

Date of ref document: 20171107

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE