CN111949161A - Touch method, mobile terminal and computer readable storage medium - Google Patents

Touch method, mobile terminal and computer readable storage medium

Info

Publication number
CN111949161A
CN111949161A
Authority
CN
China
Prior art keywords
touch
detecting
finger
mobile terminal
touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010817000.6A
Other languages
Chinese (zh)
Inventor
龚乾坤
王张旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Microphone Holdings Co Ltd
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Microphone Holdings Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Microphone Holdings Co Ltd
Priority to CN202010817000.6A
Publication of CN111949161A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/04162 Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 Pens or stylus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a touch method applied to a mobile terminal equipped with a stylus. The method comprises the following steps: detecting a touch operation of a finger after a stylus signal has been detected; and executing the function corresponding to the detected touch operation. The application also provides a mobile terminal and a computer-readable storage medium. Unlike approaches that implement functions solely by detecting finger touch operations, the touch method detects the finger touch operation only after a stylus signal has been detected and then executes the corresponding function, providing a new touch operation mode based on the stylus and expanding the ways in which the stylus and the touch area can interact.

Description

Touch method, mobile terminal and computer readable storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a touch method, a mobile terminal, and a computer-readable storage medium.
Background
In some implementations, when a user uses a touch screen device with a stylus, the user brings the stylus into contact with the device, and the device then implements a shortcut function corresponding to the touch operation detected on the touch screen, such as pressing a function button to open or close an application. However, because shortcut functions are configured only through stylus presses of function buttons, the available shortcut functions are single in form and the functions that can be implemented are very limited.
The above is only for the purpose of assisting understanding of the technical solutions of the present application, and does not represent an admission that the above is prior art.
Disclosure of Invention
The present application mainly aims to provide a touch method, a mobile terminal and a computer-readable storage medium, so as to solve the problem that shortcut functions configured solely through stylus click operations are limited and single in implementation.
In order to achieve the above object, the present application provides a touch method applied to a mobile terminal with a stylus, where the touch method includes:
detecting touch operation of a finger after detecting a stylus signal;
and executing the function corresponding to the detected touch operation.
Optionally, the stylus is not in contact with a touch area of the mobile terminal.
Optionally, after the step of detecting the touch operation of the finger, the touch method includes:
acquiring the number of times the finger touch operation is triggered within a preset duration after the stylus signal;
and when the trigger count reaches a preset count, executing the function corresponding to the detected preset number of touch operations.
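The trigger-count logic above can be sketched as follows. This is an illustrative, non-limiting Python sketch; the class name, thresholds, and callback are assumptions for illustration, not part of the application:

```python
class TriggerCounter:
    """Sketch: count finger touch triggers within a preset window
    after a stylus signal, and fire a callback once the preset
    count is reached. Names and thresholds are illustrative."""

    def __init__(self, preset_count=3, preset_duration=2.0, on_reached=None):
        self.preset_count = preset_count
        self.preset_duration = preset_duration  # seconds after the stylus signal
        self.on_reached = on_reached
        self.stylus_time = None
        self.triggers = 0

    def on_stylus_signal(self, now):
        # Start (or restart) the counting window at the stylus signal.
        self.stylus_time = now
        self.triggers = 0

    def on_finger_touch(self, now):
        # Ignore touches outside the preset window.
        if self.stylus_time is None or now - self.stylus_time > self.preset_duration:
            return False
        self.triggers += 1
        if self.triggers >= self.preset_count:
            if self.on_reached:
                self.on_reached(self.triggers)
            self.stylus_time = None  # consume the window
            return True
        return False
```

For example, with a preset count of 2, two finger touches within the window execute the function, and a later touch is ignored until the next stylus signal.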
Optionally, the step of executing the function corresponding to the detected touch operation includes:
acquiring a currently running application;
and executing a function corresponding to the touch operation in the application.
Optionally, after detecting the stylus signal, the step of detecting the touch operation of the finger includes:
outputting a preset touch area prompt message after detecting a stylus signal;
and detecting the touch operation of the finger in the preset touch area.
Optionally, after the step of detecting the touch operation of the finger, the touch method includes:
detecting a preset confirmation operation;
and when the preset confirmation operation is detected, executing the function corresponding to the detected touch operation.
Optionally, the touch area is located on a touch screen, or on a rear shell of the mobile terminal, or on a side of the mobile terminal.
Optionally, the touch operation includes at least one of a click operation, a slide operation, and a long press operation.
Optionally, the function includes at least one of switching a handset state or mode, turning a function on, turning a function off, editing an application, installing an application, deleting an application, and moving an application.
Optionally, the application editing function comprises at least one of copy, paste, erase, undo, save, exit, and save and exit.
In addition, to achieve the above object, the present application also provides a mobile terminal, including: the touch control system comprises a memory, a processor and a touch control program stored in the memory and capable of running on the processor, wherein the touch control program realizes the steps of the touch control method when being executed by the processor.
In addition, to achieve the above object, the present application further provides a computer-readable storage medium, on which a touch program is stored, and the touch program implements the steps of the touch method described above when executed by a processor.
Unlike approaches that implement functions solely by detecting finger touch operations, the touch method provided by the application further detects a finger touch operation after a stylus signal has been detected, and then executes the corresponding function. It should be noted that the stylus is not in contact with the touch area. This provides a new touch operation mode based on the stylus and expands the ways in which the stylus and the touch area can interact; by combining stylus signal detection with finger touch operations, it extends the usable functions of the stylus and opens the possibility of implementing further functions.
Drawings
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a communication network system architecture diagram according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a touch method according to a first embodiment of the present application;
FIG. 4 is a specific implementation process of a combined operation of first clicking and then sliding after detecting a stylus signal;
fig. 5 is a flowchart illustrating a touch method according to a second embodiment of the present application;
FIG. 6 is a specific implementation process of three simultaneous sliding operations within a preset duration after a stylus signal is detected;
FIG. 7 is a specific implementation process of sequentially triggering three sliding operations within a preset duration after detecting a stylus signal;
fig. 8 is a flowchart illustrating a touch method according to a third embodiment of the present application;
fig. 9 is a flowchart illustrating a touch method according to a fourth embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for convenience of description of the present application and have no specific meaning by themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present application may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a fixed terminal such as a Digital TV, a desktop computer, and the like.
The following description takes a mobile terminal as an example, and those skilled in the art will understand that the configuration according to the embodiments of the present application can also be applied to fixed-type terminals, except for elements used specifically for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
The radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink information from a base station and forwards it to the processor 110 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex Long Term Evolution), and TDD-LTE (Time Division Duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology; through the WiFi module 102, the mobile terminal can help a user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. The a/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present disclosure, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Specifically, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Among them, the eNodeB2021 may be connected with other eNodeB2022 through backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, and provides bearer and connection management. The HSS 2032 provides registers, such as a home location register (not shown), to manage functions, and holds subscriber-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP address assignment for the UE 201 among other functions; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
The application provides a touch method applied to a mobile terminal with a stylus, where optionally the stylus is not in contact with a touch area of the mobile terminal. Referring to fig. 3, fig. 3 is a flowchart illustrating a touch method according to a first embodiment of the present application. In this embodiment, the touch method includes the following steps:
Step S10: detecting a touch operation of the finger after a stylus signal is detected;
the detection of the stylus signal may be obtained by detecting whether the mobile terminal and the stylus are in connection communication, for example, by detecting whether the mobile terminal and the stylus are in bluetooth connection, and when the mobile terminal and the stylus are in bluetooth connection, the detection of the stylus signal may be indicated. It should be noted that the stylus is not in contact with the touch area of the mobile terminal, and the stylus may be in a floating state near the touch area, where the touch area is located on the touch screen, or on the rear case of the mobile terminal, or on the side of the mobile terminal.
The touch operation includes at least one of a click operation, a slide operation, and a long press operation. Detecting the touch operation of the finger means detecting the touch operation of the finger in the touch area of the mobile terminal. It can be understood that when a touch operation of the finger is detected in the touch area, the operation type of the touch operation can be determined from the acquired touch parameters corresponding to the touch operation. For example, a touch operation of the finger is detected and the touch parameters corresponding to the touch operation are acquired; when the touch parameters satisfy preset touch parameters, the detected touch operation can be determined to be a slide operation. Here the preset touch parameters are the touch parameters of a slide operation, including but not limited to a sliding distance, a sliding speed, and a sliding direction. Similarly, when the touch parameters of the touch operation satisfy the preset touch parameters of a long press operation, the detected touch operation can be determined to be a long press operation; these preset touch parameters include but are not limited to a pressing force and a pressing duration. Likewise, the detected touch operation is determined to be a click operation when its touch parameters satisfy the preset touch parameters of a click operation, which include but are not limited to a click pressure and a click duration. It can be understood that the correspondence between each operation type and its preset touch parameters can be set in advance, either during production of the terminal device or by the user according to the user's own operation habits, without limitation.
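The parameter-based classification described above can be sketched as follows (Python; the thresholds are illustrative assumptions only — as noted, they may be factory defaults or user-configured):

```python
def classify_touch(duration_s, distance_px,
                   slide_min_distance=24, long_press_min_duration=0.5):
    """Sketch: classify a finger touch into click / slide / long press
    from its touch parameters. Thresholds are illustrative."""
    if distance_px >= slide_min_distance:
        return "slide"          # enough movement: sliding operation
    if duration_s >= long_press_min_duration:
        return "long_press"     # little movement, held for long enough
    return "click"              # short, nearly stationary contact
```

A fuller implementation would also consider sliding speed and direction, and pressing force, as listed among the preset touch parameters above.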
Optionally, the touch operation may also be a combined operation, where the combined operation is implemented by touch operations triggered at multiple touch points detected in the touch area; the multiple touch points may be triggered simultaneously or in sequence. It should be noted that each touch point corresponds to a touch operation, and each touch operation is at least one of a click operation, a slide operation, and a long press operation, so the touch operations corresponding to the touch points may be the same or different. For example, a combined operation formed by touch operations triggered at two contacts may be two click operations triggered simultaneously, or two click operations triggered one after the other; it may also be a click operation followed by a slide operation, or a slide operation followed by a click operation, without limitation. Referring to fig. 4, fig. 4 shows a specific implementation of a combined operation of first clicking and then sliding after a stylus signal is detected, where T1, T2, and T3 are different time points: T1 is the time point when the stylus signal is detected, T2 is the time point when the click operation occurs, and T3 is the time point when the slide operation occurs; P1 is the stylus signal generated when the stylus approaches the touch area, P2 is the click operation, and P3 is the slide operation.
It can be understood that the combined operation can be implemented with one hand, either the left or the right; for example, one finger of the right hand touches the touch area to implement one touch operation, and another finger of the right hand touches the touch area to implement another. It can also be implemented with both hands; for example, the left hand touches the touch area to implement one touch operation and the right hand touches the touch area to implement another, without limitation here.
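A combined operation, as described above, can be modelled simply as the ordered sequence of the per-contact operation types. The sketch below is an illustration under that assumption; order distinguishes, for example, click-then-slide from slide-then-click, matching the distinction drawn in the text.

```python
# Illustrative model of a combined operation: the ordered tuple of the
# operation types detected at each contact. All names are assumptions.
def combined_operation(per_contact_ops):
    """Return a hashable key for a combined operation, preserving the
    order in which the contacts were triggered."""
    return tuple(per_contact_ops)

# A click followed by a slide is a different combined operation from a
# slide followed by a click; simultaneous triggers could instead be
# modelled as an unordered frozenset.
```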
Step S20, executing a function corresponding to the detected touch operation.
In actual application, the correspondence between a touch operation and the function it implements can be set in advance; the correspondence can be set by default during production of the mobile terminal, or set by the user according to the user's own operation habits, without limitation. When the touch operation of the finger is detected after the stylus signal is detected, the function corresponding to that touch operation is executed. The function includes, but is not limited to: switching the phone state or mode, such as waking the screen, unlocking, entering a conference mode, or entering a power-saving mode; turning a function on, such as an emergency call; turning a function off, such as turning off an alarm clock or Bluetooth; and application editing, which optionally comprises at least one of copy, paste, erase, undo, save, exit, and save-and-exit. It should be noted that the function implemented when the finger's touch operation is detected after the stylus signal differs from the function implemented when only the finger's touch operation is detected. For example, when the touch operation is a click operation, a click detected after the stylus signal wakes the screen, whereas a click detected on its own opens an application.
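The context-dependent dispatch above, where the same click wakes the screen after a stylus signal but opens an application on its own, can be sketched as a table keyed by the stylus context. The function names follow the example in the text; everything else is an assumption of this sketch.

```python
# Hypothetical dispatch table: the implemented function depends on
# whether a stylus signal preceded the finger's touch operation.
FUNCTION_TABLE = {
    (True,  "click"): "wake_screen",  # click after a stylus signal
    (False, "click"): "open_app",     # click on its own
}

def dispatch(stylus_detected, operation):
    """Look up the function for a touch operation in its stylus context."""
    return FUNCTION_TABLE.get((stylus_detected, operation))
```

Entries for slide, long press, and combined operations would be added to the same table, set at production time or by the user.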
Step S10 is followed by:
detecting a preset confirmation operation;
and when the preset confirmation operation is detected, executing the function corresponding to the detected touch operation.
It can be understood that the preset confirmation operation may be detected within a preset time period after step S10; when the preset confirmation operation is detected within that period, it indicates that a further confirming input was received after the stylus signal and the finger's touch operation were detected. The preset confirmation operation may be set by the user in advance, such as pressing the stylus against the touch area, or detecting the stylus approaching the touch area without contacting it. Optionally, a preset touch trajectory serves as the system's default confirmation operation; for example, if a circle-drawing touch operation is detected in the touch area, the preset confirmation operation is considered detected, and execution of the function corresponding to the touch operation can be triggered, which effectively avoids falsely triggering that function. It can be understood that the preset confirmation operation differs from the touch operation that corresponds to the implemented function. That touch operation may be a combined operation, that is, touch operations triggered at multiple contacts detected in the touch area, where the contacts may be triggered simultaneously or sequentially, such as a double-click operation; or it may be a non-combined input, such as a single click operation, slide operation, or long press operation, the touch operation comprising at least one of these operation types.
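The confirmation gate above can be sketched as follows; the window length and the circle-gesture name are illustrative assumptions (the circle gesture is the text's example of a default trajectory), and timestamps are taken in seconds.

```python
# Sketch of the confirmation gate: the function fires only if the preset
# confirmation operation arrives within a preset window after step S10.
CONFIRM_WINDOW_S = 3.0  # assumed window length

def should_execute(s10_time, confirm_time, confirm_gesture,
                   expected_gesture="circle"):
    """Return True when a valid confirmation arrives inside the window."""
    in_window = 0.0 <= confirm_time - s10_time <= CONFIRM_WINDOW_S
    return in_window and confirm_gesture == expected_gesture
```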
In the technical scheme disclosed in this embodiment, different from a mode in which a function is implemented merely by detecting a touch operation of a finger, the touch operation of the finger is detected, and its corresponding function implemented, on the basis of a stylus signal having been detected. It should be noted that the stylus is not in contact with the touch area, so a new touch operation mode is provided on the basis of the stylus, the ways in which the stylus interacts with the touch area are expanded, and combining the stylus signal with the finger's touch operation extends the usable functions of the stylus and provides the possibility of implementing further functions.
Based on the first embodiment, a second embodiment of the touch method of the present application is provided; please refer to fig. 5, which is a flowchart illustrating the second embodiment of the touch method of the present application. In this embodiment, after the touch operation of the finger is detected in step S10, the touch method includes the following steps:
Step S30, acquiring the number of times the touch operation of the finger is triggered within a preset duration after the stylus signal is detected;
Step S40, when the number of triggers reaches a preset number, executing the function corresponding to the touch operation detected the preset number of times.
The preset duration is a duration set in advance, counted from when the stylus signal is detected. It should be noted that the preset duration may be a fixed duration set by the user, such as 30 seconds or 1 minute; it may also depend on the object acted upon by the current finger touch operation, that is, on the application, so that, for example, different preset durations are set for running a game application and for running an office application, without limitation. The number of triggers can be obtained through an identification parameter used to record how many times the finger's touch operation has been triggered; each time the touch operation is detected, the identification parameter can be incremented by 1 to update the trigger count, without limitation here. It can be understood that the touch operation may be triggered once or multiple times, and includes at least one of a click operation, a slide operation, and a long press operation.
In actual application, when the touch operation is triggered multiple times, the multiple finger touch operations may be triggered simultaneously or sequentially, without limitation. For example, when the touch operation is a slide operation triggered three times, the three slide operations may be triggered simultaneously: three fingers slide in the touch area at the same time, so three contacts are detected in the touch area and each contact corresponds to one slide operation. Referring to fig. 6, fig. 6 shows a specific implementation in which three slide operations are triggered simultaneously within the preset duration after the stylus signal is detected, where T1 and T2 are different time points: T1 is the time when the stylus signal is detected, T2 is the time when the three slide operations are triggered simultaneously and falls within the preset duration after T1, P1 is the stylus signal generated by the stylus approaching the touch area, and P3, P4, and P5 are the slide operations. Alternatively, the three slide operations are triggered sequentially. Referring to fig. 7, fig. 7 shows a specific implementation in which three slide operations are triggered sequentially within the preset duration after the stylus signal is detected, where T1, T2, T3, and T4 are different time points: T1 is the time when the stylus signal is detected, T2, T3, and T4 are the times when the first, second, and third slide operations occur and all fall within the preset duration after T1, P1 is the stylus signal generated by the stylus approaching the touch area, and P3, P4, and P5 are the slide operations. Alternatively, each of the three slide operations is triggered within a preset time interval of the previous one.
Similarly, multiple click operations or multiple long press operations can be triggered simultaneously within the preset duration, or triggered sequentially within it; their specific implementation is similar to that described above and is not repeated here.
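The identification-parameter counter described above reduces to counting the touch operations whose timestamps fall within the preset duration after the stylus signal. The sketch below assumes timestamps in seconds and uses the 30-second duration mentioned in the text as a default; the function name is illustrative.

```python
# Sketch of the trigger counter: how many finger touch operations
# arrived inside the preset duration after the stylus signal.
def count_triggers(stylus_time, touch_times, preset_duration_s=30.0):
    """Return the number of touches within the preset duration."""
    return sum(1 for t in touch_times
               if stylus_time <= t <= stylus_time + preset_duration_s)
```

When the count returned here reaches the preset number, the function corresponding to the touch operation is executed (step S40).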
In the technical solution disclosed in this embodiment, different from a mode in which a function is implemented only by detecting multiple touch operations of a finger, such as a double-click operation, after the stylus signal is detected, the finger's touch operation and the number of times it is triggered within the preset duration are further detected. It is easily understood that the trigger count may be one, two, three, and so on, and that after the stylus signal is detected, the functions implemented by touch operations with different trigger counts may be the same or different, further providing the possibility of implementing additional functions.
A third embodiment of the touch method of the present application is provided based on any one of the above embodiments, please refer to fig. 8, and fig. 8 is a flowchart illustrating the third embodiment of the touch method of the present application. In this embodiment, step S20 includes:
step S21, acquiring the current running application;
step S22, executing a function corresponding to the touch operation in the application.
Acquiring the currently running application may comprise monitoring the system process interface and, when it is determined that an application process is currently started, acquiring the application information of that process. The application information may be identification information of the application, such as a process identifier and an application package name, from which the application corresponding to the currently running process can be uniquely determined. Applications include, but are not limited to, entertainment applications, such as game, shopping, video, and music applications; office applications, such as PS, PPT, and reader applications; and chat applications, such as WeChat and QQ.
A function corresponding to the touch operation is then executed in the application. The correspondence between the finger touch operation detected after the stylus signal and the function it implements can be set in advance according to the user's operation habits, or set by system default, without limitation. In actual application, this function can be set separately for different applications; for example, when the touch operation is a click operation, one application may exit the current interface upon detecting the click after the stylus signal, while another application may save the current content. Applications may also be classified by type, with applications of the same type sharing one function setting and applications of different types each having their own; of course, applications of different types may also be set to implement the same function. For convenience of operation, the same function may instead be set uniformly for all applications; for example, when the touch operation is a click operation, every application exits its current interface upon detecting the click after the stylus signal. This embodiment is not limited in this respect.
For ease of understanding, suppose the currently running application is a drawing application. If, after the stylus signal is detected, a single finger is detected sliding in the touch area, the eraser function can be invoked to erase the current canvas pattern; if two fingers are detected sliding simultaneously, the undo function can be implemented to cancel the previous stroke; if three fingers are detected sliding simultaneously, the save-and-exit function can be implemented to save the current drawing and return to the application's main interface. It can be understood that when no stylus signal is detected and only the finger's touch operation is detected, only the function corresponding to the finger's touch operation alone is executed; the function corresponding to a finger touch operation detected after a stylus signal is not executed.
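The drawing-application example above can be sketched as a small lookup: N fingers sliding after a stylus signal triggers the listed function, and without a stylus signal none of these fire. The function names follow the example in the text; all code identifiers are assumptions.

```python
# Per-application mapping for the drawing-app example: finger count of
# the simultaneous slide -> function triggered after a stylus signal.
DRAWING_APP_SLIDE_FUNCTIONS = {
    1: "eraser",         # one finger slides: erase the canvas pattern
    2: "undo",           # two fingers slide: cancel the previous stroke
    3: "save_and_exit",  # three fingers slide: save, return to main UI
}

def drawing_slide_function(finger_count, stylus_detected):
    """Return the function name for a slide, or None if it should not fire."""
    if not stylus_detected:
        return None  # the stylus signal is a precondition for these functions
    return DRAWING_APP_SLIDE_FUNCTIONS.get(finger_count)
```

Other applications would carry their own tables, keyed by the application information acquired in step S21.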
In the technical solution disclosed in this embodiment, after the stylus signal is detected, the function corresponding to the detected finger touch operation is implemented according to the currently running application; that is, the same finger touch operation detected after the stylus signal may implement the same function or different functions in different applications. By presetting, per application, the association between the finger touch operation detected after the stylus signal and the function it implements, this not only makes it possible to extend each application with additional functions but also provides a new touch operation mode for implementing application functions.
A fourth embodiment of the touch method of the present application is provided based on any one of the above embodiments, please refer to fig. 9, and fig. 9 is a flowchart illustrating the fourth embodiment of the touch method of the present application. In this embodiment, step S10 includes:
step S11, outputting a preset touch area prompt message after detecting the touch pen signal;
step S12, detecting a touch operation of the finger in the preset touch area.
It should be noted that the touch area is located on at least one of the touch screen, the rear case of the mobile terminal, and the side of the mobile terminal. The preset touch area prompt information includes, but is not limited to, voice prompt information, prompt box information, and marking information. The prompt box information is an area frame output when the preset touch area is located on the touch screen; the shape of the area frame includes, but is not limited to, a rectangle, a square, and a circle, and its boundary line may be solid or dashed. The marking information is a marking color output when the preset touch area is located on the touch screen. The preset touch area may be set according to the user's touch habits, for example as the area the user touches most frequently with the left or right hand, without limitation in this embodiment. To detect the touch operation of the finger in the preset touch area, the coordinate position of the touch operation can be acquired when the touch operation is detected, and whether the touch operation occurred in the preset touch area can then be determined by judging whether that coordinate position lies within it. It can be understood that when the finger's touch operation is detected outside the preset touch area, the touch operation may be ignored, or prompt information may be output to tell the user that the touch is outside the preset area, without limitation in this embodiment. Optionally, if no finger touch operation is detected within a preset duration after the stylus signal is detected, the preset touch area prompt information can be hidden.
In actual application, when the preset touch area is located on the touch screen, the touch screen can further be divided into multiple touch areas, one or more of which are set as preset touch areas and output as prompt box information. When there are multiple preset touch areas and the finger's touch operation is a combined operation, a touch order, or priorities, can be set for the multiple touch areas, so that the combined operation is triggered sequentially according to that order or those priorities.
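The coordinate check described above can be sketched as a point-in-region test. Modelling the preset touch area as an axis-aligned rectangle is an assumption of this sketch; the text also allows circular areas, which would need a distance test instead.

```python
# Sketch of the preset-touch-area check: a touch is accepted only if
# its (x, y) position falls inside the area, modelled here as a
# rectangle given as (left, top, right, bottom).
def in_preset_area(x, y, area):
    """Return True when the touch coordinate lies inside the area."""
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom
```

With several preset areas, the same test would be run against each area in its configured touch order or priority.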
In the technical scheme disclosed in this embodiment, after the stylus signal is detected, the range of the touch area for the finger's touch operation is indicated by outputting the preset touch area prompt information, which makes the touch operation convenient for the user and helps avoid misoperation.
The present application further provides a mobile terminal, where the mobile terminal includes a memory, a processor, and a touch program stored in the memory and capable of being executed on the processor, and the touch program is executed by the processor to implement the steps of the touch method in any of the above embodiments.
The present application further provides a computer-readable storage medium, in which a touch program is stored, and when the touch program is executed by a processor, the steps of the touch method in any of the above embodiments are implemented.
The embodiments of the mobile terminal and the computer-readable storage medium provided in the present application include all technical features of the embodiments of the touch method; their expanded description is substantially the same as that of the touch method embodiments and is not repeated here.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method as described in the above various possible embodiments.
An embodiment of the present application further provides a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method described in the above various possible embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. Further, similarly named elements, features, or items in different embodiments of the disclosure may have the same meaning or different meanings; their particular meaning is determined by their explanation in the embodiment or by the further context of the embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "upon" or "when" or "in response to determining", depending on the context. Also, as used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions, steps, or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, there is no strict restriction on the order of these steps, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or in alternation with other steps or with at least part of the sub-steps or stages of other steps.
It should be noted that step numbers such as S10 and S20 are used herein for the purpose of more clearly and briefly describing the corresponding content, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S20 first and then S10 in specific implementation, which should be within the scope of the present application.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a mobile terminal (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. A touch method is applied to a mobile terminal with a touch pen, and is characterized by comprising the following steps:
detecting touch operation of a finger after detecting a stylus signal;
and executing the function corresponding to the detected touch operation.
2. The touch method according to claim 1, wherein after the step of detecting the touch operation of the finger, the touch method comprises:
acquiring the triggering times of the touch operation of the finger within a preset time length after the touch pen signal;
and when the triggering times reach preset times, executing the function corresponding to the touch operation for the detected preset times.
3. The touch method of claim 1, wherein the step of executing the function corresponding to the detected touch operation comprises:
acquiring a currently running application;
and executing a function corresponding to the touch operation in the application.
4. The touch method of claim 1, wherein the step of detecting the touch operation of the finger after detecting the stylus signal comprises:
outputting a preset touch area prompt message after detecting a touch pen signal;
and detecting the touch operation of the finger in the preset touch area.
5. The touch method according to claim 1, wherein after the step of detecting the touch operation of the finger, the touch method comprises:
detecting a preset confirmation operation;
and when the preset confirmation operation is detected, executing the function corresponding to the detected touch operation.
6. The touch method according to any one of claims 1 to 5, wherein the touch area is located on a touch screen, or on a rear case of the mobile terminal, or on a side of the mobile terminal.
7. The touch method of claim 1, wherein the touch operation comprises at least one of a click operation, a slide operation, and a long press operation.
8. The touch method of claim 1, wherein the function comprises at least one of a cell phone state or mode switch, a function open, a function close, an application edit, an application install, an application delete, or an application move.
9. A mobile terminal, characterized in that the mobile terminal comprises: a memory, a processor and a touch program stored in the memory and executable on the processor, wherein the touch program when executed by the processor implements the steps of the touch method according to any one of claims 1 to 8.
10. A computer-readable storage medium, wherein a touch program is stored on the computer-readable storage medium, and when executed by a processor, the touch program implements the steps of the touch method according to any one of claims 1 to 8.
CN202010817000.6A 2020-08-13 2020-08-13 Touch method, mobile terminal and computer readable storage medium Pending CN111949161A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010817000.6A CN111949161A (en) 2020-08-13 2020-08-13 Touch method, mobile terminal and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN111949161A true CN111949161A (en) 2020-11-17

Family

ID=73342239

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010817000.6A Pending CN111949161A (en) 2020-08-13 2020-08-13 Touch method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111949161A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023045560A1 (en) * 2021-09-22 2023-03-30 惠州Tcl移动通信有限公司 Touch screen adjusting method, storage medium, and terminal


Similar Documents

Publication Publication Date Title
CN107222613B (en) Display method and terminal
CN107885448B (en) Control method for application touch operation, mobile terminal and readable storage medium
CN109697008B (en) Content sharing method, terminal and computer readable storage medium
CN109407957A (en) Screen touch control method, terminal and computer readable storage medium
CN107809534B (en) Control method, terminal and computer storage medium
CN108845711B (en) Screen touch method, terminal and computer readable storage medium
CN112181233B (en) Message processing method, intelligent terminal and computer readable storage medium
CN112230823A (en) Control method of mobile terminal, mobile terminal and storage medium
CN108563388B (en) Screen operation method, mobile terminal and computer-readable storage medium
CN112764704A (en) Screen projection method and system, screen projection equipment, mobile equipment and storage medium
CN110058767B (en) Interface operation method, wearable terminal and computer-readable storage medium
CN107566613A (en) A kind of application switch control method, mobile terminal and computer-readable recording medium
CN108810262B (en) Application configuration method, terminal and computer readable storage medium
CN109495643B (en) Object multi-chat frame setting method and terminal
CN110083294B (en) Screen capturing method, terminal and computer readable storage medium
CN109710149B (en) Interactive display method, equipment and computer readable storage medium
CN109683796B (en) Interaction control method, equipment and computer readable storage medium
CN112199141A (en) Message processing method, terminal and computer readable storage medium
CN109710168B (en) Screen touch method and device and computer readable storage medium
CN111949161A (en) Touch method, mobile terminal and computer readable storage medium
CN107807876B (en) Split screen display method, mobile terminal and storage medium
CN107580106B (en) Call control method, mobile terminal and computer readable storage medium
CN107613108B (en) Operation processing method and device and computer readable storage medium
CN107315523B (en) Split screen processing method, mobile terminal and computer readable storage medium
CN112306328B (en) Control method of mobile terminal, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination