WO2018112803A1 - Method and apparatus for touch screen gesture recognition - Google Patents

Method and apparatus for touch screen gesture recognition

Info

Publication number
WO2018112803A1
WO2018112803A1 (Application PCT/CN2016/111350)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
touch
touch screen
operation instruction
palm
Prior art date
Application number
PCT/CN2016/111350
Other languages
English (en)
French (fr)
Inventor
袁博
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2016/111350 (WO2018112803A1)
Priority to CN201680057377.8A (CN108604160A)
Publication of WO2018112803A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Embodiments of the present invention relate to the field of touch screen technologies, and in particular, to a touch screen gesture recognition method and apparatus.
  • touch screens can detect a change in position of a finger or a touch pen on the touch screen, such as clicking, sliding, etc., thereby executing a corresponding instruction in response to a user operation.
  • the types of touch screen gesture recognition are not rich enough, and the response of the touch screen is not intuitive enough for complex user commands.
  • the application will consume system memory and generate cache garbage during the running process, causing the mobile phone to run slower.
  • In one prior art, the user can release memory and clear the cache through a settings menu, but the operation is cumbersome; in another prior art, the user can trigger the clearing of cache garbage by shaking the mobile phone, but this method is prone to accidental triggering.
  • the embodiment of the invention provides a touch screen gesture recognition method and device, and provides a new touch screen gesture application, which simplifies user operations and enriches the human-computer interaction mode of the mobile terminal.
  • an embodiment of the present invention provides a method for touch screen gesture recognition.
  • The method includes: detecting a gesture acting on the touch screen; parsing a feature parameter of the gesture; and responding to the first operation instruction when the feature parameter of the gesture matches the first operation instruction; wherein the first operation instruction is a palm touch.
  • The gesture acting on the touch screen may be detected when the mobile terminal is in the unlocked state, the locked state, or the screen-off state.
  • Parsing the feature parameter of the gesture includes: detecting a sensing signal of the gesture on the touch screen; detecting whether the sensing signal has multiple peaks; when the sensing signal has multiple peaks, detecting the sensing signal in the area enclosed by the peaks; and determining that the gesture is a palm touch when the sensing signal in the enclosed area is 0.
  • When the sensing signal is a single peak, the gesture is determined to be a finger touch.
  • The area enclosed by the multiple peaks, i.e., the area enclosed by the sampling points at the fingertips and the palm edge, corresponds to the center of the palm.
  • Because the palm center is slightly concave and barely contacts the touch screen, the sensing signal it generates is approximately 0. It can be understood that this signal may merely be very weak, so a threshold may be set for the sensing signal generated by the palm center; if the sensing signal is below the threshold, the gesture is determined to be a palm touch.
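  • Purely as an illustration of the multi-peak test described above (not part of the original disclosure), the following Java sketch counts peaks in a frame of sensing values and checks whether the signal inside the area enclosed by the peaks is near zero; the grid representation, the thresholds, and the use of the peak centroid as a stand-in for the enclosed area are simplifying assumptions:

```java
/**
 * Minimal sketch (assumptions, not the disclosed implementation) of the
 * palm-touch test: count peaks in a frame of sensing values and check whether
 * the signal inside the area enclosed by the peaks is approximately zero.
 */
public final class PalmTouchClassifier {
    private static final double PEAK_THRESHOLD = 50.0;   // hypothetical peak height
    private static final double NEAR_ZERO = 1.0;         // hypothetical "approximately 0"

    public enum Result { FINGER_TOUCH, PALM_TOUCH, UNKNOWN }

    public static Result classify(double[][] signal) {
        int peakCount = 0;
        long sumX = 0, sumY = 0;
        for (int x = 1; x < signal.length - 1; x++) {
            for (int y = 1; y < signal[x].length - 1; y++) {
                double v = signal[x][y];
                boolean isPeak = v > PEAK_THRESHOLD
                        && v >= signal[x - 1][y] && v >= signal[x + 1][y]
                        && v >= signal[x][y - 1] && v >= signal[x][y + 1];
                if (isPeak) {
                    peakCount++;
                    sumX += x;
                    sumY += y;
                }
            }
        }
        if (peakCount == 1) {
            return Result.FINGER_TOUCH;          // single peak: finger touch
        }
        if (peakCount > 1) {
            // Sample the signal at the centroid of the peaks as a simple proxy
            // for the area enclosed by the peaks (the palm-center region).
            int cx = (int) (sumX / peakCount);
            int cy = (int) (sumY / peakCount);
            if (signal[cx][cy] < NEAR_ZERO) {
                return Result.PALM_TOUCH;        // enclosed area is ~0: palm touch
            }
        }
        return Result.UNKNOWN;
    }
}
```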
  • Alternatively, parsing the feature parameter of the gesture includes: detecting a pressure value signal of the gesture on the touch screen; detecting whether the pressure value signal has multiple peaks; when the pressure value signal has multiple peaks, detecting the pressure value signal in the area enclosed by the peaks; and determining that the gesture is a palm touch when the pressure value signal in the enclosed area is 0.
  • Alternatively, parsing the feature parameter of the gesture includes: obtaining the pressure value and movement trajectory of at least one sampling point of the gesture.
  • The sampling points may be the fingertips and a number of points on the palm edge, or a number of points on the side edge of the palm.
  • The movement trajectory is determined by detecting the start and end points during the movement. When more than a predetermined number of the sampling points move in the same direction, it can be determined that the palm is performing a palm touch action.
  • In addition, the moving speeds of the sampling points are substantially the same; the moving speed of a sampling point can be obtained by measuring its speed components in the vertical and horizontal directions of the touch screen.
  • Before responding to the first operation instruction, the method further includes: detecting whether the gesture acts on a designated area of the touch screen; and responding to the palm touch when the gesture is a palm touch and acts on the designated area of the touch screen.
  • For example, the touch screen may be partitioned: the touch screen is divided into four equal-sized areas (upper left, lower left, upper right, and lower right), and the designated area may be one or several of these four areas.
  • The touch screen can also be divided into three areas (left, center, and right), and the areas of these three regions may be the same or different.
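  • As an illustration of the partitioning described above, the following sketch checks whether a gesture point falls in a designated quadrant of the screen; the class name, screen dimensions, and choice of quadrant are assumptions, not part of the disclosure:

```java
/**
 * Minimal sketch (illustrative only) of the designated-area check, using the
 * four equal quadrants as an example partition.
 */
public final class DesignatedAreaChecker {
    public enum Quadrant { UPPER_LEFT, UPPER_RIGHT, LOWER_LEFT, LOWER_RIGHT }

    private final int screenWidth;
    private final int screenHeight;
    private final Quadrant designated;

    public DesignatedAreaChecker(int screenWidth, int screenHeight, Quadrant designated) {
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
        this.designated = designated;
    }

    /** Returns true if the gesture point (x, y) lies in the designated quadrant. */
    public boolean isInDesignatedArea(float x, float y) {
        boolean left = x < screenWidth / 2f;
        boolean top = y < screenHeight / 2f;
        Quadrant q = top ? (left ? Quadrant.UPPER_LEFT : Quadrant.UPPER_RIGHT)
                         : (left ? Quadrant.LOWER_LEFT : Quadrant.LOWER_RIGHT);
        return q == designated;
    }
}
```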
  • Alternatively, before responding to the first operation instruction, the method further includes: detecting whether the gesture is a first palm touch; when the gesture is the first palm touch, detecting whether there is a second palm touch after a preset time; and responding to the second palm touch when the second palm touch is detected after the preset time. When the palm touch is detected again, its position may coincide with the position where the palm touch was detected the first time, or may be offset from it.
  • the responding to the first operation instruction is specifically: releasing memory or clearing a cache.
  • the first operation instructions mentioned herein include, but are not limited to, releasing memory, clearing the cache, opening other applications, or changing system settings and the like.
  • the first operation instruction may be to open a camera application to take a photo, open a flashlight application, open a recording application, turn Bluetooth on or off, turn Wi-Fi on or off, and the like.
  • the responding to the first operation instruction further includes: presenting, on the touch screen, a prompt box that responds to the first operation instruction.
  • the mobile terminal may present the prompt information to the user, and the presentation manner may be in the form of text, picture, audio, video, or the like.
  • the user can choose whether to respond to the first operation instruction.
  • An embodiment of the present invention provides an apparatus, including a touch screen, one or more processors, a memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for: detecting a gesture acting on the touch screen; parsing a feature parameter of the gesture; and responding to the first operation instruction when the feature parameter of the gesture matches the first operation instruction; wherein the first operation instruction is a palm touch.
  • the parsing the feature parameter of the gesture includes: obtaining a pressure value and a movement trajectory of at least one sampling point on the gesture.
  • the responding to the first operation instruction is specifically: releasing memory or clearing a cache.
  • An embodiment of the present invention provides a device, including a touch screen, a processor, and a memory. The touch screen detects a gesture acting on the touch screen; the processor is configured to parse the feature parameter of the gesture according to the rule, stored in the memory, for parsing the feature parameter of the gesture, and to respond to the first operation instruction when the feature parameter of the gesture matches the first operation instruction; the memory is configured to store the rule for parsing the feature parameter of the gesture; wherein the first operation instruction is a palm touch.
  • The rule for parsing the feature parameter of the gesture may specifically be: detecting the sensing signal of the gesture on the touch screen; detecting whether the sensing signal has multiple peaks; when the sensing signal has multiple peaks, detecting the sensing signal of the area enclosed by the peaks; and when the sensing signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • Alternatively, the rule for parsing the feature parameter of the gesture may specifically be: detecting the pressure value signal of the gesture on the touch screen; detecting whether the pressure value signal has multiple peaks; when the pressure value signal has multiple peaks, detecting the pressure value signal of the area enclosed by the peaks; and when the pressure value signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • Alternatively, the rule for parsing the feature parameter of the gesture may specifically be: obtaining the pressure value and movement trajectory of at least one sampling point of the gesture.
  • the responding to the first operation instruction is specifically: releasing memory or clearing a cache.
  • an embodiment of the present invention provides a device for recognizing a touch screen gesture, the device comprising: a detecting unit, a parsing unit, a judging unit, and an executing unit; wherein the detecting unit is configured to detect a gesture acting on the touch screen; The parsing unit is configured to parse the feature parameter of the gesture; the determining unit is configured to determine whether the feature parameter of the gesture matches the first operation instruction; and the execution unit is configured to: when the feature parameter of the gesture matches the first operation instruction, Responding to the first operation instruction; wherein the first operation instruction is a palm touch, and in response to the first operation instruction, releasing the memory or clearing the cache.
  • An embodiment of the present invention provides a computer storage medium storing one or more programs, the one or more programs including instructions for: detecting a gesture acting on a touch screen; parsing a feature parameter of the gesture; and responding to the first operation instruction when the feature parameter of the gesture matches the first operation instruction; wherein the first operation instruction is a palm touch.
  • the responding to the first operation instruction is specifically: releasing the memory or clearing the cache.
  • FIG. 1 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart of a method for gesture recognition of a touch screen according to an embodiment of the present invention
  • FIG. 3 is a flowchart of a method for detecting a palm touch according to an embodiment of the present invention
  • FIG. 4 is a flowchart of another method for detecting a palm touch according to an embodiment of the present invention.
  • FIG. 5 is a flowchart of a method for gesture recognition of a touch screen according to an embodiment of the present invention
  • FIG. 6 is a flowchart of another method for gesture recognition of a touch screen according to an embodiment of the present invention.
  • FIG. 7A is a schematic diagram of selecting a sampling point according to an embodiment of the present invention.
  • FIG. 7B is a schematic diagram of another selection of sampling points according to an embodiment of the present invention.
  • FIG. 7C is a schematic diagram of sampling point movement according to an embodiment of the present invention.
  • FIG. 8A is a three-dimensional coordinate diagram showing the intensity of a sensing signal according to an embodiment of the present invention.
  • FIG. 8B is another three-dimensional coordinate diagram showing the intensity of a sensing signal according to an embodiment of the present invention.
  • FIG. 8C is a schematic diagram of a palm sampling point movement according to an embodiment of the present invention.
  • FIG. 9A is a schematic diagram of a touch screen partition according to an embodiment of the present invention.
  • FIG. 9B is a schematic diagram of another touch screen partition according to an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a device according to an embodiment of the present invention.
  • FIG. 11 is a schematic diagram of a device for gesture recognition of a touch screen according to an embodiment of the present invention.
  • The terminal involved in the embodiments of the present invention may be a terminal with a touch screen, including but not limited to a mobile phone, a tablet computer, a personal digital assistant (PDA), a wireless handheld device, a wireless netbook, a portable computer, a media player device, a smart watch, and the like.
  • The operating system running on the terminal includes, but is not limited to, DOS, Unix, Linux, or another operating system. Some embodiments of the present invention take a mobile phone as an example; those skilled in the art should understand that these embodiments are also applicable to other terminals having a touch screen.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile phone 100. It should be understood that the handset 100 is just one example of the above-described mobile terminal with a touch screen, and that the handset 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of these components.
  • the various components shown in FIG. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • The mobile phone 100 includes a processor 110, a memory 120, an input device 130, a display device 140, a sensor 150, an audio circuit 160, a radio frequency circuit 170, and a power source 180. These components communicate via one or more communication buses or signal lines.
  • The processor 110 is the control center of the handset 100; it connects the various parts of the handset 100 using various interfaces and lines, and by running or executing software programs or instruction sets stored in the memory 120 and invoking data stored in the memory 120, it executes the various functions of the mobile phone 100 and processes data, thereby monitoring the handset 100 as a whole.
  • The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem, where the application processor mainly handles the operating system, user interface, applications, and so on, and the modem mainly handles wireless communication. It can be understood that the modem may also not be integrated into the processor 110.
  • the processor 110 can include an image signal processor and a dual core/multicore processor.
  • the memory 120 can be used to store software programs and function modules, and the processor 110 executes various functional applications and data processing of the mobile phone 100 by running software programs and function modules stored in the memory 120.
  • the memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may be stored according to The data created by the use of the mobile phone 100 (such as audio data, phone book, etc.) and the like.
  • the memory 120 includes a high speed random access memory, and also includes a nonvolatile memory such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
  • the memory 120 can store various operating systems, such as the Windows operating system of Microsoft Corporation, or the Android operating system developed by Google Inc., or the IOS operating system of Apple Inc., and the like.
  • the input device 130 can be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function controls of the handset 100.
  • the input device 130 may include a touch panel 131 and other input devices 132.
  • The touch panel 131, also referred to as a touch screen, can collect touch operations by the user on or near it (for example, operations performed by the user on or near the touch panel 131 with a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connected device according to a preset program.
  • the touch panel 131 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 110, and it can also receive commands sent by the processor 110 and execute them.
  • The touch panel 131 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • the input device 130 includes other input devices 132.
  • The other input devices 132 include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and switch buttons), a Home button, a trackball, a mouse, a joystick, and the like.
  • The touch panel 131 includes a touch-sensitive surface for performing various operations related to contact detection, such as determining whether contact has occurred (e.g., detecting a finger-down event), obtaining the pressure value and coordinate information of the touch, determining whether the contact has moved and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-drag events), and determining whether the contact has ended (e.g., detecting a finger-up event or an interruption of the contact). Determining the movement of the contact point may include determining the speed (magnitude), velocity (magnitude and direction), and/or acceleration (change in magnitude and/or direction) of the contact point; the movement of the contact point is represented by a series of contact data.
  • Touch detection techniques include, but are not limited to, capacitive, resistive, infrared, surface acoustic wave technology, and the like.
  • The touch panel 131 should be understood as a generalized touch input device; the touch-sensitive surface may be integrated with the display screen, or may be separate and connected to the system as an independent touch input device. For example, mouse movement and mouse button presses (with or without single or multiple keyboard presses or holds), user taps, drags, scrolls, and other movements on a touchpad, and stylus input can all be used as touch input.
  • One or more of the finger inputs (e.g., a single-finger contact, a single-finger tap gesture, or a single-finger swipe gesture) may be replaced by input from another touch input device (e.g., stylus input).
  • the user's gestures are flexible and can be click, double click, circle, draw line, single finger touch, or multi finger touch, and so on.
  • the selection of a particular gesture is flexible as long as substantially the same effect is achieved.
  • The position or area on the touch-sensitive surface on which the user's gesture acts is flexible: it may be the area of an application interface element displayed on the display screen or a nearby area, a blank area of the display where no application interface element is displayed, an area of the display showing a function setting, and so on.
  • the specific location or area of the gesture acting on the touch-sensitive surface can be flexibly set as long as substantially the same effect can be achieved.
  • the display device 140 can be used to display information input by the user or information provided to the user and various menus of the mobile phone 100.
  • The display device 140 may include a display panel 141, which may optionally be configured in the form of an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
  • the touch panel 131 can cover the display panel 141.
  • After the touch panel 131 detects a touch operation on or near it, it transmits the touch to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 141 according to the type of the touch event.
  • Visual output includes text, graphics, icons, videos, and any combination thereof.
  • some visual output or all of the visual output may correspond to a user interface object.
  • Although in FIG. 1 the touch panel 131 and the display panel 141 are shown as two independent components implementing the input and output functions of the mobile phone 100, in some embodiments the touch panel 131 and the display panel 141 are integrated to implement the input and output functions of the mobile phone 100.
  • the handset 100 can also include sensors 150, such as light sensors, motion sensors, and other sensors.
  • The light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the mobile phone 100 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes). When it is stationary, it can detect the magnitude and direction of gravity.
  • The mobile phone 100 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described in detail here.
  • The handset 100 can also include an audio circuit 160; a speaker 161 and a microphone 162 can provide an audio interface between the user and the handset 100.
  • The audio circuit 160 can convert received audio data into an electrical signal and transmit it to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. The audio data is then output to the processor 110 for processing and sent to another mobile phone via the radio frequency circuit 170, or output to the memory 120 for further processing.
  • The audio circuit 160 may also include a headset jack that provides an interface between the audio circuit 160 and a removable audio input/output peripheral, such as an output-only headset or a headset with both an output (e.g., a monaural or binaural earphone) and an input (e.g., a microphone).
  • A radio frequency (RF) circuit 170 can be used for receiving and sending information, or for receiving and sending signals during a call; it converts an electrical signal into an electromagnetic signal or an electromagnetic signal into an electrical signal, and communicates with communication networks and other communication devices via the electromagnetic signal.
  • In particular, after downlink information from a base station is received, it is passed to the processor 110 for processing; in addition, uplink data is sent to the base station.
  • Well-known circuitry for performing these functions may be included, including but not limited to antenna systems, radio frequency transceivers, one or more amplifiers, tuners, one or more oscillators, digital signal processors, codec chipsets, SIM (Subscriber Identity Module) card and so on.
  • the radio frequency circuit 170 can communicate with the network and other devices via wireless communication, such as the Internet, an intranet, and/or a wireless network (such as a cellular telephone network, a wireless local area network, and/or a metropolitan area network).
  • Wireless communication can use any of a variety of communication standards, protocols, and technologies, including but not limited to the global system for mobile communications, enhanced data GSM environment, high-speed downlink packet access, high-speed uplink packet access, wideband code division multiple access, code division multiple access, time division multiple access, Bluetooth, wireless fidelity (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), voice over Internet Protocol, Wi-MAX, email protocols (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), and instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP)).
  • the power source 180 is used to power and manage the mobile phone 100.
  • The power source 180 can include a power management system, one or more power sources (e.g., a battery or alternating current), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode), and any other components associated with the generation, management, and distribution of power in the handset.
  • the mobile phone 100 may further include a camera, a Bluetooth module, and the like, and details are not described herein.
  • the touch screen 131 of the mobile phone 100 can detect user input on the screen, such as finger taps, slides, and the like. Taking a capacitive touch screen as an example, the touch screen 131 detects a capacitance change point when a finger touches, and the position of the finger touch can be known by the coordinates of the point. Among them, the finger click input can be regarded as the coordinates of a point on the screen, and the finger swipe input can be regarded as the displacement between the coordinates of the starting point on the screen and the coordinates of the ending point.
  • the processor 110 can receive user input coordinates detected by the touch screen 131 and can execute corresponding user instructions. Optionally, when the touch screen 131 detects a touch operation on or near it, it is transmitted to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display according to the type of the touch event.
  • the touch screen 131 can detect the pressure value input by the user.
  • the touch screen 131 can provide feedback to the user according to the magnitude of the pressure value.
  • The feedback may be tactile feedback, for example vibration, so that only the user currently making the input perceives it.
  • the touch screen 131 integrates a capacitive sensor to know the change in touch pressure based on a change in capacitance value due to screen deformation upon touch.
  • the touch screen 131 of the mobile phone is equipped with pressure sensors that specifically sense the pressure values. The pressure sensors can be disposed at the four corners of the touch screen 131. When a touch event occurs, the sensor can sense the pressure value.
  • the processor will respond to the user's pressing.
  • For example, the processor can call the getPressure method of the MotionEvent class to obtain the pressure value of the user input. It can be understood that the device can also call other interface functions to detect the pressure value, which is not limited here.
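  • For illustration, a minimal sketch of reading the pressure value through MotionEvent.getPressure in a custom Android view follows; the view class name and the threshold are hypothetical and not part of the disclosure:

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

// Minimal sketch: a custom view that reads the pressure of each touch event
// via MotionEvent.getPressure(). The threshold value is illustrative only.
public class PressureAwareView extends View {
    private static final float PRESS_THRESHOLD = 0.8f; // hypothetical threshold

    public PressureAwareView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_MOVE:
                // Pressure of the first pointer, typically normalized around 0..1.
                float pressure = event.getPressure();
                if (pressure > PRESS_THRESHOLD) {
                    // React to a "hard" press here (e.g., trigger haptic feedback).
                }
                return true;
            case MotionEvent.ACTION_UP:
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}
```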
  • the mobile phone 100 generates cache garbage during the running process, and the running system processes and applications consume memory, causing the mobile phone 100 to run slower.
  • The Android system itself has no explicit notion of closing a process. Whenever the user wants to exit a process and return to the desktop or open another program, the back key can be pressed. When the user presses the back key, the process is not actually closed; it is still kept in memory and runs in the background so that the program can be opened more quickly the next time it is called. A process that is already open is really closed usually only when the Android system decides that there is not enough memory to run a new process and therefore closes some processes that are open but no longer useful, which requires a mechanism for ranking the various processes.
  • ActivityManagerService.java records the priority of each process.
  • the priority of a process is represented by the oom_adj value, and the higher the oom_adj value, the lower the priority of the process.
  • a process in use has an oom_adj value of 0.
  • When a process is no longer in use, it gets a higher oom_adj value and a lower priority; the exact oom_adj value usually depends on the position of the process in the LRU (least recently used) list.
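  • As a rough, illustrative model of this LRU-based ranking (not the actual ActivityManagerService implementation), the following sketch assigns higher oom_adj-like values to processes the further back they sit in an LRU list; all class and method names are assumptions:

```java
import java.util.ArrayDeque;
import java.util.Deque;

/**
 * Simplified, illustrative model of the ranking described above: the process
 * in use gets adj 0, and background processes get larger adj values the
 * further back they sit in the LRU list, making them the first candidates to
 * be reclaimed under memory pressure.
 */
public final class ProcessLruModel {
    private final Deque<String> lru = new ArrayDeque<>(); // front = most recently used

    public void onProcessUsed(String name) {
        lru.remove(name);
        lru.addFirst(name);
    }

    /** Returns an oom_adj-like value: 0 for the process in use, higher = lower priority. */
    public int adjOf(String name) {
        int position = 0;
        for (String p : lru) {
            if (p.equals(name)) {
                return position;
            }
            position++;
        }
        return Integer.MAX_VALUE; // unknown process: lowest priority
    }

    /** Under memory pressure, the least recently used process would be reclaimed first. */
    public String nextToReclaim() {
        return lru.peekLast();
    }
}
```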
  • the above memory management mechanism can not completely eliminate the problem of memory shortage, and there may be a situation in which the management mechanism fails, so it is necessary for the user to manually release the memory.
  • Releasing memory here may also include cleaning up background processes and defragmenting memory.
  • The mobile phone system accumulates a large amount of cache while running. For example, an application used for news browsing will generate a large amount of image cache, and listening to music will also generate a large amount of cache. These caches consume the storage space of the mobile phone, and the user needs to clear the cache manually.
  • users can free up memory and clear the cache by setting menus or specific applications. This operation requires manual operation by the user, which is cumbersome.
  • Some specific applications can also perform similar functions, such as clearing the cache after the application is opened, or cleaning the cache by shaking the phone; however, this approach requires a specific application to be installed, which is inconvenient for the user.
  • an embodiment of the present invention provides a method for gesture recognition of a touch screen, which specifically includes the following steps:
  • Gesture recognition as an intuitive and natural input method, frees people from traditional contact input devices and interacts with computers in a more natural way, making computer interfaces easier.
  • Gestures are mainly divided into static gestures and dynamic gestures.
  • Dynamic gestures can be regarded as continuous static gesture sequences.
  • Dynamic gestures are rich and intuitive, combined with static gestures to create richer semantics.
  • Using dynamic gesture recognition to construct a new interactive interface meets the requirement for naturalness that a new generation of human-computer interaction interfaces places on input methods, and can make up for the shortcomings of traditional interaction methods.
  • the touch screen of the mobile terminal can detect the gesture input of the user, for example, the touch-sensitive surface of the touch screen can detect the gesture input of the user.
  • the touch screen can detect the gesture input through the pressure sensing function.
  • The gesture acting on the touch screen may be detected when the mobile terminal is in the unlocked state, the locked state, or the screen-off state.
  • the interface in which the mobile terminal is in the locked state is called a lock screen interface.
  • The lock screen interface refers to the user interface displayed when the mobile terminal is in the locked state and its display screen is lit.
  • The lock screen interface includes multiple menu controls (such as an unlock module or a clock) or application control icons (such as the switch controls of a music application).
  • the lock screen interface can have one or more.
  • For example, the mobile terminal includes three lock screen interfaces (i.e., lock screen interface 1, lock screen interface 2, and lock screen interface 3), and lock screen interface 2 is displayed when the display screen of the mobile terminal is lit.
  • When the user inputs a leftward slide on the touch screen while the mobile terminal is in the locked state, the mobile terminal responds to the leftward slide and switches the currently displayed lock screen interface 2 to lock screen interface 1.
  • When the user inputs a rightward slide through the touch screen, the mobile terminal responds to the rightward slide and switches the currently displayed lock screen interface 2 to lock screen interface 3.
  • the home screen page refers to the page displayed after the mobile terminal is unlocked.
  • The home screen pages in the mobile terminal contain the control icons, menu icons, and the like of applications. There may be one or more home screen pages in the mobile terminal, and as the number of control icons increases, home screen pages may be added automatically. Of course, the mobile terminal can also add or delete a home screen page according to a user's input instruction.
  • A non-home screen page refers to a page that is not displayed immediately after the mobile terminal is unlocked and that requires a user command or a system command to be displayed. For example, after the mobile terminal is unlocked, the mobile terminal detects that the user's finger swipes to the left and then displays another, non-home screen page.
  • The screen-off state means that the screen backlight is turned off and the screen is off.
  • The mobile terminal may turn off the screen when no user operation is detected within a certain time, or the user may manually turn off the screen backlight to turn off the screen. In the screen-off state, the mobile terminal may shut down some system processes to save power.
  • the touch screen can obtain the feature parameters of at least one sampling point on the gesture.
  • the characteristic parameters of the at least one sampling point may be pressure values of the points, or may be capacitance change values of the points.
  • the touch screen can also acquire the movement track of the gesture, for example, the touch screen can acquire the start point and the end point of the at least one sampling point on the gesture during the movement.
  • For some touch screens, the pressure value of a sampling point can be obtained by measuring the contact area between the gesture and the touch screen; for other touch screens, pressing deforms a piezoelectric film on the touch screen, which generates charge according to the piezoelectric effect, so the pressure value of the sampling point can be obtained by measuring the amount of charge; for still other touch screens, the pressure of the gesture changes the capacitance at the sampling point, from which the pressure value of the sampling point can be measured.
  • the touch screen can detect the shape of the palm.
  • the principle of the infrared touch screen is that the infrared emitting tube and the receiving tube are arranged on the frame of the touch screen to form a dense infrared matrix.
  • the finger blocks the two infrared rays passing through the touch point to determine the coordinates of the touch point.
  • In the case of a palm touch, the infrared rays in the area covered by the hand are blocked.
  • the mobile terminal determines whether the blocked infrared region matches the pre-stored palm shape, thereby determining whether the user input is a palm touch.
  • the first operation instruction may be a palm touch; in response to the first operation instruction, the mobile terminal may release the memory and clear the cache.
  • the first operation instruction is responded to, for example, triggering the mobile terminal to execute an instruction to release the memory and clear the cache.
  • The mobile terminal may present prompt information to the user, for example in the form of text, a picture, audio, or video; for instance, a prompt box asking whether to respond to the first operation instruction may be presented on the touch screen, or the user may be asked in audio form whether to respond to the first operation instruction.
  • the user may select whether to respond to the first operation instruction, for example, the user may select whether to respond to the first operation instruction in the prompt box, or select whether to respond to the first operation instruction by using a voice command.
  • the mobile terminal may present the user with a prompt message that has responded to the first operation instruction, and the presentation manner may be in the form of text, picture, audio, video, etc., for example, the first operation instruction may be presented on the touch screen. Tip box.
  • the first operation instructions mentioned herein include, but are not limited to, releasing memory, clearing the cache, opening other applications, or changing system settings and the like.
  • the first operation instruction may be to open a camera application to take a photo, open a flashlight application, open a recording application, turn Bluetooth on or off, turn Wi-Fi on or off, etc., without limitation.
  • the palm touch mentioned here can be the movement of the palm on the touch screen.
  • The palm in the embodiments of the present invention may be the portion of the hand between the fingers and the wrist, or may include both the fingers and the portion between the fingers and the wrist.
  • the moving action on the touch screen may be moving from the left side to the right side, moving from the right side to the left side, moving from the top to the bottom, or moving from the bottom to the top, or Move along the diagonal of the screen and move along other tracks.
  • The feature parameter of the gesture matching the first operation instruction may mean that the feature parameter of the gesture measured by the touch screen conforms to a palm touch action.
  • For example, the feature parameter of the at least one sampling point may be compared with the preset feature parameter of a palm touch to determine whether they are consistent.
  • the rule of comparison may be that the characteristic parameters of the two are completely consistent, or the similarity of the two may exceed a certain threshold, such as 70% similarity, or 90% similarity.
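  • A minimal sketch of such a similarity comparison follows, assuming the feature parameters are represented as numeric arrays; the representation, the tolerance, and the method names are illustrative and not from the disclosure:

```java
/**
 * Minimal sketch (assumed representation) of the matching rule: the measured
 * feature parameters are compared element-by-element with a preset palm-touch
 * template, and the gesture matches when the similarity exceeds a threshold
 * such as 0.7 or 0.9.
 */
public final class FeatureMatcher {
    public static boolean matchesPalmTemplate(double[] measured, double[] template,
                                              double tolerance, double similarityThreshold) {
        if (measured.length != template.length || measured.length == 0) {
            return false;
        }
        int agreeing = 0;
        for (int i = 0; i < measured.length; i++) {
            // A feature "agrees" when it is within the tolerance of the template value.
            if (Math.abs(measured[i] - template[i]) <= tolerance) {
                agreeing++;
            }
        }
        double similarity = (double) agreeing / measured.length;
        return similarity >= similarityThreshold;
    }
}
```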
  • determining whether the palm touch action is performed can be divided into two steps.
  • the touch screen determines whether the user gesture is the shape of the palm.
  • the touch screen determines whether the shape of the palm moves in a specific direction.
  • the touch screen determines whether the user gesture is a shape of a palm, and selects at least one sampling point on the user gesture to detect a pressure value or a capacitance value of at least one sampling point.
  • For example, the touch screen may select the fingertips as sampling points on the user gesture, or select several points at the fingertips and the palm edge as sampling points. It will be appreciated that the sampling points may be continuous (expressed as a curve) or discrete.
  • Depending on the relative size of the palm and the touch screen, the contact between the palm and the touch screen also differs.
  • When the ratio of the palm area to the touch screen area is small, the palm can be laid flat over the touch screen, with the fingers and the palm edge in contact with the touch screen.
  • A small palm-to-screen ratio means that the user's palm is small or the touch screen area is large (for example, the touch screen of a tablet), which ensures that the user's palm can be laid over the touch screen.
  • the sampling point can be a fingertip of the finger and several points of the palm edge. Referring to FIG. 7A, a total of nine sampling points are selected.
  • the sampling points at the fingertips are a, b, c, d, and e, and the sampling points selected at the palm edge are f, g, h, and i.
  • When the ratio of the palm area to the touch screen area is large, the user's palm cannot be laid over the touch screen, or the palm can cover the touch screen but there is no room to move.
  • In this case, the user can wipe the screen with the side edge of the palm.
  • The sampling points can then be a number of points on the side edge. Referring to FIG. 7B, when the user wipes the screen with the side of the palm facing the touch screen, four sampling points j, k, l, and m on the side of the palm are selected.
  • the touch screen can also acquire the movement track of the gesture.
  • the touch screen can acquire the start point and the end point of the at least one sampling point on the gesture during the movement.
  • the touch screen tracks the start and end points of each sample point movement.
  • When the sampling points move in a certain direction, it can be determined that the palm is performing a palm touch action.
  • Optionally, when more than a predetermined number of the at least one sampling point move in the same direction, it may be determined that the palm is performing a palm touch action.
  • more than a predetermined number may be more than 2 sampling points, may be more than 4 sampling points, may be more than half of the sampling points, and may be more than two-thirds of the sampling points.
  • Moving in the same direction means that the movement trajectories are substantially the same; the movement trajectories of the respective sampling points are not required to be strictly parallel.
  • In addition, the moving speeds of the sampling points are substantially the same; the moving speed of a sampling point can be obtained by measuring its speed components in the vertical and horizontal directions of the touch screen. Referring to FIG. 7C, sampling points a, c, e, and g are selected from the sampling points; their moving start points are a1, c1, e1, and g1, and their moving end points are a2, c2, e2, and g2, respectively. If it is detected that the four sampling points move in the same direction, or that their moving speeds are the same, it can be determined that the palm is performing a palm touch.
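  • The following is a minimal, illustrative sketch (not the disclosed implementation) of this trajectory check: each sampling point contributes a start and end point, and a palm swipe is reported when more than a predetermined fraction of the points move in roughly the same direction; the record types, the angle tolerance, and the fraction are assumptions:

```java
import java.util.List;

/**
 * Minimal sketch of the trajectory test: a palm swipe is reported when more
 * than a predetermined fraction of the sampling points move in roughly the
 * same direction.
 */
public final class PalmSwipeDetector {
    public record Point(float x, float y) {}
    public record Track(Point start, Point end) {}

    private static final double ANGLE_TOLERANCE_RAD = Math.toRadians(20); // hypothetical
    private static final double REQUIRED_FRACTION = 0.5;                  // "more than half"

    public static boolean isPalmSwipe(List<Track> tracks) {
        if (tracks.isEmpty()) {
            return false;
        }
        // Use the first sampling point's direction as the reference direction.
        double refAngle = angleOf(tracks.get(0));
        int aligned = 0;
        for (Track t : tracks) {
            double diff = Math.abs(normalize(angleOf(t) - refAngle));
            if (diff <= ANGLE_TOLERANCE_RAD) {
                aligned++;
            }
        }
        return aligned > tracks.size() * REQUIRED_FRACTION;
    }

    private static double angleOf(Track t) {
        return Math.atan2(t.end().y() - t.start().y(), t.end().x() - t.start().x());
    }

    private static double normalize(double angle) {
        while (angle > Math.PI) angle -= 2 * Math.PI;
        while (angle < -Math.PI) angle += 2 * Math.PI;
        return angle;
    }
}
```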
  • an embodiment of the present invention provides a method for detecting a palm touch, including:
  • S301 Detect a sensing signal of the gesture on the touch screen.
  • the sensing signal can be understood as the value of the capacitance change generated by the gesture.
  • the capacitive touch screen can detect the change in capacitance generated by the user's gesture on the touch screen.
  • S302 Detect whether the sensing signal is a multi-peak.
  • When the sensing signal is a single peak, the gesture is determined to be a finger touch. Referring to FIG. 8A, in the three-dimensional coordinate system the x-axis and the y-axis represent the vertical and horizontal directions of the touch screen, and the z-axis represents the intensity of the sensing signal. A single peak appears in FIG. 8A, indicating that the gesture is a finger touch.
  • When the gesture is a palm touch, the sensing signal has multiple peaks.
  • The area enclosed by the multiple peaks, i.e., the area enclosed by the sampling points at the fingertips and the palm edge, corresponds to the center of the palm.
  • Because the palm center is slightly concave and barely touches the screen, the sensing signal it generates is approximately 0; it is understood that this signal may merely be very weak.
  • a threshold may be set for the sensing signal generated by the palm portion, and if the sensing signal is less than the threshold, the gesture is determined to be a palm touch.
  • In the three-dimensional coordinate system, the x-axis and the y-axis represent the vertical and horizontal directions of the touch screen, and the z-axis represents the sensing signal strength.
  • Multiple peaks appear, and within the region enclosed by the peaks there is an area where the sensing signal is approximately 0, which is the palm center region.
  • When the palm moves, the touch screen detects that the peaks corresponding to the sampling points also move.
  • a plurality of sampling points formed by the fingertip and the palm edge can be approximated as a plurality of points on one ellipse (the four points o, p, q, r in FIG. 8C). Then the movement of the palm on the screen can be approximated as an ellipse moving on the screen.
  • an embodiment of the present invention provides another method for detecting a palm touch, including:
  • S402 Detect whether the pressure value signal is a multi-peak.
  • the z-axis represents the magnitude of the pressure value.
  • The area enclosed by the multiple peaks, i.e., the area enclosed by the sampling points at the fingertips and the palm edge, corresponds to the center of the palm.
  • The pressure value signal generated by the palm center is approximately 0. It is understood that this signal may merely be very weak; a threshold can be set for the pressure value signal generated by the palm center, and if the pressure value signal is smaller than the threshold, the gesture is determined to be a palm touch.
  • another embodiment of the present invention provides a method for gesture recognition of a touch screen, including:
  • S501 Detect a gesture acting on the touch screen. Similar to S201, it will not be described here.
  • S502 Detect whether a gesture acts on a designated area of the touch screen.
  • the touch screen detects whether the user's gesture input acts in a designated area of the touch screen.
  • The touch screen may be partitioned; for example, the touch screen is divided into four equal-sized areas (upper left, lower left, upper right, and lower right), and the designated area may be one or several of these four areas.
  • the touch screen may be divided into three areas of left, center, and right, and the areas of the three areas may be different or the same.
  • S503 Detect whether the gesture is a palm touch.
  • S502 may be: detecting whether a gesture acts on a designated area of the touch screen; S503 may be: detecting whether the gesture is a palm touch when the gesture acts on a designated area of the touch screen. In another case, S502 may be: detecting whether the gesture is a palm touch; S503 may be: detecting whether the gesture acts on a designated area of the touch screen when the gesture is a palm touch.
  • S504 Respond to the palm touch when the gesture is a palm touch and acts on a designated area of the touch screen.
  • the mobile terminal can release the memory, clear the cache, and the like.
  • the specific method refer to the manner of responding to the first operation instruction in S203, and details are not described herein again.
  • another embodiment of the present invention provides another method for touch screen gesture recognition, including:
  • S601 Detect a gesture acting on the touch screen. Similar to S201.
  • S602 Parse the feature parameter of the gesture. Similar to S202.
  • the contact area of the gesture with the touch screen may be detected.
  • When the contact area between the palm and the touch screen is greater than a preset value, the user gesture is determined to be a palm touch.
  • The contact area between the user gesture and the touch screen can also be understood as the area of the touch screen covered by the user's gesture; the preset value can be, for example, more than one third of the touch screen area, more than one half of the touch screen area, and so on.
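  • A minimal sketch of this contact-area test follows, with the contact area, screen area, and preset fraction passed in as parameters; the names are illustrative:

```java
/**
 * Minimal sketch (illustrative) of the contact-area test: the gesture is
 * treated as a palm touch when it covers more than a preset fraction of the
 * touch screen, such as one third or one half.
 */
public final class ContactAreaClassifier {
    public static boolean isPalmTouch(double contactAreaPx, double screenAreaPx,
                                      double presetFraction) {
        // Guard against a zero-sized screen, then apply the preset fraction.
        return screenAreaPx > 0 && contactAreaPx > screenAreaPx * presetFraction;
    }
}
```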
  • The preset time begins after the palm touch is detected for the first time; for example, after the touch screen detects a palm touch, it waits an interval of 1 second and then detects whether a palm touch is present again. It can be understood that the preset time here can also be 2 seconds, 3 seconds, and the like.
  • When the palm touch is detected again, its position may coincide with the position where the palm touch was detected the first time, or it may be offset. If the second palm touch coincides with the position of the first, it means that the user's palm did not move during the preset time; if there is a deviation, the user's palm moved within the preset time.
  • The terms “first”, “second”, and so on may be used herein to describe various elements, but these elements should not be limited by these terms. These terms are only used to distinguish one element from another.
  • the first palm touch can be named a second palm touch, and similarly, the second palm touch can be named the first palm touch without departing from the scope of the present invention.
  • Both the first palm touch and the second palm touch are touches, but they may not be the same touch, and in some scenarios they may be the same touch.
  • S605 Respond to the second palm touch when detecting the second palm touch after the preset time.
  • When the touch screen detects that a palm touch is still present after the preset time, it can respond to the user's palm touch. In response to the user's palm touch, the mobile terminal can release memory, clear the cache, and so on; for the specific method, refer to the manner of responding to the first operation instruction in S203, and details are not described herein again. It is worth noting that if a palm touch is present throughout the preset time, whether or not the palm moves, it can be determined that the palm touch is not an accidental operation. Optionally, if no second palm touch is detected after the preset time, the first palm touch is regarded as an accidental operation, and the mobile terminal does not respond to it and ignores the touch event.
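  • A minimal sketch of this two-step confirmation follows, assuming the detection logic simply arms a timer on the first palm touch and confirms the operation only if a palm touch is still reported after the preset time; the class and method names are illustrative, and the 1-second interval matches the example in the text:

```java
/**
 * Minimal sketch (an assumption, not the disclosed implementation) of the
 * two-step confirmation: the first palm touch only arms a timer, and the
 * operation is triggered only if a palm touch is still detected after the
 * preset time; otherwise the first touch is treated as accidental.
 */
public final class PalmTouchConfirmer {
    private static final long PRESET_TIME_MS = 1_000L;
    private long firstTouchTimeMs = -1L;

    /** Call whenever a palm touch is detected; returns true when confirmed. */
    public boolean onPalmTouchDetected(long nowMs) {
        if (firstTouchTimeMs < 0) {
            firstTouchTimeMs = nowMs;          // first palm touch: start waiting
            return false;
        }
        if (nowMs - firstTouchTimeMs >= PRESET_TIME_MS) {
            firstTouchTimeMs = -1L;            // confirmed: trigger the operation
            return true;
        }
        return false;                           // still within the preset time
    }

    /** Call when the palm leaves the screen before the preset time elapses. */
    public void onPalmTouchLost() {
        firstTouchTimeMs = -1L;                 // treat the first touch as accidental
    }
}
```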
  • an embodiment of the present invention provides an apparatus, including a touch screen 101, a processor 102, and a memory 103.
  • the touch screen 101 detects a gesture acting on the touch screen.
  • The processor 102 is configured to parse the feature parameter of the gesture according to the rule, stored in the memory, for parsing the feature parameter of the gesture, and to respond to the first operation instruction when the feature parameter of the gesture matches the first operation instruction.
  • the first operation instruction may be a palm touch; in response to the first operation instruction, the mobile terminal may release the memory and clear the cache.
  • The memory 103 is configured to store rules for parsing the feature parameter of the gesture. The specific rules include: detecting the sensing signal of the gesture on the touch screen; detecting whether the sensing signal has multiple peaks; when the sensing signal has multiple peaks, detecting the sensing signal of the area enclosed by the peaks; and when the sensing signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • Alternatively, the specific rules include: detecting the pressure value signal of the gesture on the touch screen; detecting whether the pressure value signal has multiple peaks; when the pressure value signal has multiple peaks, detecting the pressure value signal of the area enclosed by the peaks; and when the pressure value signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • An embodiment of the present invention provides an apparatus, including a touch screen, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for: detecting a gesture acting on the touch screen; parsing a feature parameter of the gesture; and responding to the first operation instruction when the feature parameter of the gesture matches the first operation instruction.
  • the first operation instruction may be a palm touch; in response to the first operation instruction, the mobile terminal may release the memory and clear the cache.
  • Parsing the feature parameter of the gesture may specifically include: detecting the sensing signal of the gesture on the touch screen; detecting whether the sensing signal has multiple peaks; when the sensing signal has multiple peaks, detecting the sensing signal of the area enclosed by the peaks; and when the sensing signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • Alternatively, parsing the feature parameter of the gesture may specifically include: detecting the pressure value signal of the gesture on the touch screen; detecting whether the pressure value signal has multiple peaks; when the pressure value signal has multiple peaks, detecting the pressure value signal of the area enclosed by the peaks; and when the pressure value signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • an embodiment of the present invention further provides a device for recognizing a touch screen gesture, including a detecting unit 111, a parsing unit 112, a judging unit 113, and an executing unit 114.
  • the detecting unit 111 is configured to detect a gesture acting on the touch screen.
  • the parsing unit 112 is configured to parse the feature parameters of the gesture.
  • the determining unit 113 is configured to determine whether the feature parameter of the gesture matches the first operation instruction.
  • the executing unit 114 is configured to respond to the first operation instruction when the feature parameter of the gesture matches the first operation instruction.
  • the first operation instruction may be a palm touch; in response to the first operation instruction, the mobile terminal may release the memory and clear the cache.
  • Parsing the feature parameter of the gesture may specifically include: detecting the sensing signal of the gesture on the touch screen; detecting whether the sensing signal has multiple peaks; when the sensing signal has multiple peaks, detecting the sensing signal of the area enclosed by the peaks; and when the sensing signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • Alternatively, parsing the feature parameter of the gesture may specifically include: detecting the pressure value signal of the gesture on the touch screen; detecting whether the pressure value signal has multiple peaks; when the pressure value signal has multiple peaks, detecting the pressure value signal of the area enclosed by the peaks; and when the pressure value signal of the enclosed area is 0, determining that the gesture is a palm touch.
  • an embodiment of the present invention further provides a storage medium for storing computer software instructions for: detecting a gesture acting on a touch screen; parsing the characteristic parameters of the gesture; and responding to a first operation instruction when the characteristic parameters of the gesture match the first operation instruction.
  • the first operation instruction may be a palm touch; in response to the first operation instruction, the mobile terminal may release memory and clear the cache.
  • parsing the characteristic parameters of the gesture may specifically include: detecting the sensing signal of the gesture on the touch screen; detecting whether the sensing signal has multiple peaks; when the sensing signal has multiple peaks, detecting the sensing signal in the area enclosed by the multiple peaks; and when the sensing signal in the enclosed area is 0, determining that the gesture is a palm touch.
  • alternatively, parsing the characteristic parameters of the gesture may specifically include: detecting the pressure value signal of the gesture on the touch screen; detecting whether the pressure value signal has multiple peaks; when the pressure value signal has multiple peaks, detecting the pressure value signal in the area enclosed by the multiple peaks; and when the pressure value signal in the enclosed area is 0, determining that the gesture is a palm touch.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present invention.
  • the functional units in the various embodiments of the present invention may be integrated in one processing unit, or each unit may exist physically on its own, or two or more units may be integrated in one unit.
  • a person of ordinary skill in the art will understand that all or part of the steps of the foregoing method embodiments may be implemented by hardware related to program instructions; the foregoing program may be stored in a computer-readable storage medium, and when the program is executed, the steps of the foregoing method embodiments are performed; the foregoing storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Abstract

A method for touch screen gesture recognition, the method comprising: detecting a gesture acting on a touch screen (S201); parsing characteristic parameters of the gesture (S202); and responding to a first operation instruction when the characteristic parameters of the gesture match the first operation instruction (S203), where the first operation instruction is a palm touch. The method simplifies user operations.

Description

触摸屏手势识别的方法及装置 技术领域
本发明实施例涉及触摸屏技术领域,具体涉及一种触摸屏手势识别的方法及装置。
背景技术
触摸屏的应用越来越广泛,给人机交互带来了多样化的体验。现有技术中,触摸屏可以检测手指或触摸笔在触摸屏上的位置变化,比如点击、滑动等,从而响应用户操作执行相应指令。然而,触摸屏手势识别的种类还不够丰富,对于复杂的用户指令,触摸屏的响应方式不够直观。
应用程序在运行过程中会占用系统内存、产生缓存垃圾,导致手机运行速度变慢。现有技术中,用户可以通过设置菜单释放内存、清理缓存,但操作较为繁琐;另一现有技术中,用户可以通过摇晃手机触发清理缓存垃圾的操作,但该方式容易误触。
发明内容
本发明实施例提供了一种触摸屏手势识别的方法及装置,提供了一种新的触摸屏手势应用,简化了用户操作,丰富了移动终端的人机交互方式。
第一方面,本发明实施例提供了一种触摸屏手势识别的方法。该方法包括:检测作用在触摸屏上的手势;解析所述手势的特征参数;当所述手势的特征参数与第一操作指令匹配时,响应所述第一操作指令;其中,所述第一操作指令是手掌触摸。其中,所述检测作用在触摸屏上的手势,可以是在移动终端处于解锁状态下,也可以是在移动终端处于锁定状态下,也可以是在移动终端处于熄屏状态下。通过上述技术方案,可以为用户提供新的人机交互方式。
结合第一方面,在第一方面的第一种实现方式中,所述解析所述手势的特征参数,具体包括:检测所述手势在所述触摸屏上的感应信号;检测所述感应信号是否是多波峰;当所述感应信号是多波峰时,检测所述多波峰围成区域的感应信号;当所述多波峰围成区域的感应信号为0时,确定所述手势为手掌触摸。当感应信号是单一波峰时,确定手势为手指触摸。多波峰围成区域,即指 尖及手掌边缘的多个采样点围成的区域,对应于手掌的掌心部分。由于掌心部分略微凹陷,使得掌心部分难以接触到触摸屏,所以掌心部分产生的感应信号近似为0.可以理解的是,掌心部分产生的感应信号可能十分微弱,可以给掌心部分产生的感应信号设置一个阈值,如果感应信号小于该阈值,则确定手势为手掌触摸。
结合第一方面,在第一方面的第二种实现方式中,所述解析所述手势的特征参数,具体包括:检测所述手势在所述触摸屏上的压力值信号;检测所述压力值信号是否是多波峰;当所述压力值信号是多波峰时,检测所述多波峰围成区域的感应信号;当所述多波峰围成区域的压力值信号为0时,确定所述手势为手掌触摸。
结合第一方面和第一方面的前两种实现方式,在第一方面的第三种实现方式中,所述解析所述手势的特征参数,具体包括:获得手势上的至少一个采样点的压力值和移动轨迹。采样点可以是手指的指尖和手掌边缘的若干个点,也可以是该侧边缘上的若干个点。移动轨迹通过检测移动过程中的起点和终点确定。当至少一个采样点中有超过预定个数的采样点沿着相同的方向移动时,可以判断手掌在进行手掌触摸动作。可选的,至少一个采样点的移动速度大致相同,可以通过测量采样点在触摸屏纵横方向上的移动速度得知采样点的移动速度。
结合第一方面和第一方面的前三种实现方式,在第一方面的第四种实现方式中,在响应所述第一操作指令前,还包括:检测手势是否作用在触摸屏的指定区域;当手势是手掌触摸且作用在触摸屏的指定区域时,响应所述手掌触摸。可选的,可以把触摸屏分区,例如把触摸屏分为左上、左下、右上、右下四个面积相同的区域,指定区域可以是这四个区域中的某个区域或者某几个区域。又如,也可以把触摸屏分为左中右三个区域,这三个区域的面积可以不同,也可以相同。
结合第一方面和第一方面的前三种实现方式,在第一方面的第五种实现方式中,在响应所述第一操作指令前,还包括:检测手势是否是第一手掌触摸;当手势是第一手掌触摸时,在预设时间后检测是否有第二手掌触摸;当在预设时间后检测有第二手掌触摸时,响应第二手掌触摸。当再次检测到手掌触摸时,可以与第一次检测到手掌触摸的位置重合,也可以有所偏差。
结合第一方面和第一方面的前五种实现方式,在第一方面的第六种实现方式中,所述响应所述第一操作指令,具体为:释放内存或清空缓存。可以理解的是,这里所说的第一操作指令包括但不限于释放内存、清理缓存,还可以是打开其他应用程序,或者是更改系统设置等等。例如,第一操作指令可以是打开相机应用程序拍照、打开手电筒应用程序、打开录音应用程序、打开或者关闭蓝牙功能、打开或者关闭Wi-Fi功能等等。
结合第一方面的第六种实现方式,在第一方面的第七种实现方式中,所述响应所述第一操作指令,还包括:在触摸屏上呈现是否响应第一操作指令的提示框。可选的,在响应第一操作指令时,移动终端可以给用户呈现提示信息,呈现方式可以是文字、图片、音频、视频等形式。可选的,用户可以选择是否响应第一操作指令。
通过上述技术方案,可以提供一种新的人机交互方式,极大的简化了用户操作的步骤。
第二方面,本发明实施例提供一种装置,包括包括触摸屏,一个或多个处理器,存储器,一个或多个程序;所述一个或多个程序被存储在所述存储器中并被配置为被所述一个或多个处理器执行,所述一个或多个程序包括指令,所述指令用于:检测作用在触摸屏上的手势;解析所述手势的特征参数;当所述手势的特征参数与第一操作指令匹配时,响应所述第一操作指令;其中,所述第一操作指令是手掌触摸。
结合第二方面,在第二方面的第一种实现方式中,所述解析所述手势的特征参数,具体包括:获得手势上的至少一个采样点的压力值和移动轨迹。
结合第二方面,在第二方面的第二种实现方式中,所述响应所述第一操作指令,具体为:释放内存或清空缓存。
第三方面,本发明的实施例提供一种装置,包括触摸屏,处理器,存储器;所述触摸屏检测作用在触摸屏上的手势;所述处理器用于根据存储器存储的解析手势的特征参数的规则,解析手势的特征参数,当手势的特征参数与第一操作指令匹配时,响应第一操作指令;所述存储器用于存储解析手势的特征参数的规则;其中,第一操作指令是手掌触摸。
结合第三方面,在第三方面的第一种实现方式中,所述解析手势的特征参数的规则,具体为:检测所述手势在所述触摸屏上的感应信号;检测所述感应 信号是否是多波峰;当所述感应信号是多波峰时,检测所述多波峰围成区域的感应信号;当所述多波峰围成区域的感应信号为0时,确定所述手势为手掌触摸。
结合第三方面,在第三方面的第二种实现方式中,所述解析手势的特征参数的规则,具体为:检测所述手势在所述触摸屏上的压力值信号;检测所述压力值信号是否是多波峰;当所述压力值信号是多波峰时,检测所述多波峰围成区域的感应信号;当所述多波峰围成区域的压力值信号为0时,确定所述手势为手掌触摸。
结合第三方面和第三方面的前两种实现方式,在第三方面的第三种实现方式中,所述解析手势的特征参数的规则,具体为:获得手势上的至少一个采样点的压力值和移动轨迹。
结合第三方面和第三方面的前三种实现方式,在第三方面的第四种实现方式中,所述响应所述第一操作指令,具体为:释放内存或清空缓存。
第四方面,本发明实施例提供一种触摸屏手势识别的装置,所述装置包括包括检测单元,解析单元,判断单元,执行单元;其中,所述检测单元用于检测作用在触摸屏上的手势;所述解析单元用于解析手势的特征参数;所述判断单元用于判断手势的特征参数与第一操作指令是否匹配;所述执行单元用于当手势的特征参数与第一操作指令匹配时,响应第一操作指令;其中,第一操作指令是手掌触摸,响应第一操作指令是释放内存或清空缓存。
第五方面,本发明的实施例提供一种存储一个或多个程序的计算机存储介质,所述一个或多个程序包括指令,所述指令用于:检测作用在触摸屏上的手势;解析所述手势的特征参数;当所述手势的特征参数与第一操作指令匹配时,响应所述第一操作指令;其中,所述第一操作指令是手掌触摸。
根据第五方面,在第五方面的第一种实现方式中,所述响应第一操作指令,具体为:释放内存或清空缓存。
附图说明
图1为本发明实施例提供的一种移动终端的结构示意图;
图2为本发明实施例提供的一种触摸屏手势识别的方法流程图;
图3为本发明实施例提供的一种检测手掌触摸的方法流程图;
图4为本发明实施例提供的另一种检测手掌触摸的方法流程图;
图5为本发明实施例提供的一种触摸屏手势识别的方法流程图;
图6为本发明实施例提供的另一种触摸屏手势识别的方法流程图;
图7A为本发明实施例提供的一种选择采样点的示意图;
图7B为本发明实施例提供的另一种选择采样点的示意图;
图7C为本发明实施例提供的一种采样点移动的示意图;
图8A为本发明实施例提供的一种表示感应信号强度的三维坐标图;
图8B为本发明实施例提供的另一种表示感应信号强度的三维坐标图;
图8C为本发明实施例提供的一种手掌采样点移动的示意图;
图9A为本发明实施例提供的一种触摸屏分区的示意图;
图9B为本发明实施例提供的另一种触摸屏分区的示意图;
图10为本发明实施例提供的一种装置示意图;
图11为本发明实施例提供的一种触摸屏手势识别的装置示意图。
具体实施方式
下面将结合实施例的附图,对本发明实施例的技术方案进行详细的描述。
本发明实施例中所涉及的终端可以是具有触摸屏的终端,包括但不限于手机、平板电脑、个人数字助理(Personal Digital Assistant,简称:PDA)、无线手持设备、无线上网本、便携电脑、媒体播放器、智能手表等。所述终端搭载的操作系统包括但不限于
DOS、Unix、Linux或者其他操作系统。本发明的一些实施例以手机为例,本领域技术人员应当理解的是,这些实施例也适用于其他具有触摸屏的终端。
图1所示为一种手机100的硬件结构示意图。应当理解的是,手机100只是上述具有触摸屏的移动终端的一个示例,并且手机100可具有比所示出的更多或更少的部件,可组合两个或更多个部件,或者可具有这些部件的不同配置或布置。图1中所示的各种部件可以硬件、软件方式或软硬件组合来实现,包括一个或多个信号处理和/或专用集成电路。
手机100包括处理器110,存储器120,输入装置130,显示装置140,传感器150,音频电路160,射频电路170,电源180。这些部件通过一个或多个通信 总线或信号线来通信。
处理器110是手机100的控制中心,利用各种接口和线路连接手机100的各个部分,通过运行或执行存储在存储器120内的软件程序或指令集,以及调用存储在存储器120内的数据,执行手机100的各种功能和处理数据,从而对手机100进行整体监控。可选的,处理器110可包括一个或多个处理单元;可选的,处理器110可集成应用处理器和调制解调器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器110中。在一些实施例中,该处理器110可以包括图像信号处理器和双核/多核处理器。
存储器120可用于存储软件程序以及功能模块,处理器110通过运行存储在存储器120的软件程序以及功能模块,从而执行手机100的各种功能应用以及数据处理。存储器120可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机100的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器120包括高速随机存取存储器,还包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。存储器120可以存储各种操作系统,例如微软公司的Windows操作系统,或谷歌公司开发的Android操作系统,或苹果公司的IOS操作系统等等。
输入装置130可用于接收输入的数字或字符信息,以及产生与手机100的用户设置以及功能控制有关的键信号输入。具体地,输入装置130可包括触控面板131以及其他输入设备132。触控面板131,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触控笔等任何适合的部位或物件在触控面板131上或在触控面板131附近的操作),并根据预先预设的程式驱动相应的连接装置。可选的,触控面板131可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器110,并能接收处理器110发来的命令并加以执行。此外,采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板131。除了触控面板131,输入装置130还包括其他输入设备132。具体地,其他输入设备132包括但不限于物理键盘、功能键(比如音量控制按键、开关 按键、Home键等)、轨迹球、鼠标、操作杆等中的一种或多种。
具体的,触控面板131包括触敏表面(touch-sensitive surface),触敏表面用于执行与接触检测相关的各种操作,诸如确定是否已经发生了接触(例如,检测手指按下事件)、触摸的压力值和坐标信息,确定是否存在接触的移动并在整个触敏表面上跟踪该移动(例如,检测一个或多个手指拖动事件)、以及确定接触是否已经终止(例如,检测手指抬起事件或者接触中断)。确定接触点的移动可以包括确定接触点的速率(量值)、速度(量值和方向)、和/或加速度(量值和/或方向的改变),接触点的移动由一系列接触数据来表示。这些操作可被应用于单点接触(例如,一个手指接触)或者多点同时接触(例如,“多点触摸”/多个手指接触)。触摸检测技术包括但不限于电容式、电阻式、红外线式、表面声波技术等。
需要注意的是,触控面板131应理解为广义的触摸输入设备,触敏表面可以与显示屏集成在一起,也可以分开从而作为单独的触控输入设备与系统连接,例如协调鼠标移动和鼠标按钮按压(具有或没有单个或多个键盘按压或保持)、触控板上的用户移动轻击、拖动、滚动等、触控笔输入、设备的移动、口头指令、检测到的眼睛移动、生物特征输入、和/或其任意组合,它们都可以被用作为触摸输入设备。以下实施例虽然主要是参考手指输入(例如,单指接触、单指轻击手势、单指轻扫手势)来给出,但是应当理解的是,在一些实施例中,这些手指输入中的一个或多个可以由来自另一触摸输入设备的输入(例如,触控笔输入)替代。
在本文中,除非特别说明,用户的手势是灵活的,可以是点击,双击,画圈,画线,单指触碰,或多指触碰,等等。本领域普通技术人员可以理解,只要能达到基本相同的效果,具体手势的选择是灵活的。在本文中,除非特别说明,用户的手势作用于触敏表面的位置或区域也是灵活的,可以是显示屏显示的某个应用接口元素的区域或附近区域,显示屏不显示应用接口元素的空白区域,显示屏显示的某个功能设置的区域,等等。本领域普通技术人员可以理解,只要能达到基本相同的效果,手势作用于触敏表面的具体位置或区域是可以灵活设置的。
显示装置140可用于显示由用户输入的信息或提供给用户的信息以及手机100的各种菜单。显示装置140可包括显示面板141,可选的,采用LCD(Liquid  Crystal Display,液晶显示器)、OLED(Organic Light-Emitting Diode,有机发光二极管)等形式来配置显示面板141。进一步的,触控面板131可覆盖显示面板141,当触控面板131检测到在其上或附近的触摸操作后,传送给处理器110以确定触摸事件的类型,随后处理器110根据触摸事件的类型在显示面板141上提供相应的视觉输出。视觉输出包括文本、图形、图标、视频及其任意组合。在一些实施例中,一些视觉输出或全部的视觉输出可对应于用户界面对象。虽然在图1中,触控面板131与显示面板141是作为两个独立的部件来实现手机100的输入和输出功能,但是在某些实施例中,将触控面板131与显示面板141集成而实现手机100的输入和输出功能。
手机100还可包括传感器150,比如光传感器、运动传感器以及其他传感器。具体地,光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板141的亮度,接近传感器可在手机100移动到耳边时,关闭显示面板141和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;可选地,手机100还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
手机100还可包括音频电路160,其中扬声器161,传声器162可提供用户与手机100之间的音频接口。音频电路160可将接收到的音频数据转换后的电信号,传输到扬声器161,由扬声器161转换为声音信号输出;另一方面,传声器162将收集的声音信号转换为电信号,由音频电路160接收后转换为音频数据,再将音频数据输出处理器110处理后,经射频电路180以发送给另一手机,或者将音频数据输出至存储器120以便进一步处理。在一些实施例中,音频电路160还包括耳麦插孔,耳麦插孔提供音频电路160与可移除的音频输入/输出外围设备之间的接口,该外围设备诸如仅输出的耳机或者具有输出(例如,单耳或双耳耳机)和输入(例如,麦克风)二者的耳麦。
射频(RF,Radio Freqency)电路170可用于收发信息或通话过程中信号的接收和发送,将电信号转换为电磁信号或将电磁信号转换为电信号,并且经由电磁信号与通信网络及其他通信设备通信,特别地,将基站的下行信息接收后, 给处理器110处理;另外,将设计上行的数据发送给基站。可包括用于执行这些功能的众所周知的电路系统,包括但不限于天线系统、射频收发器、一个或多个放大器、调谐器、一个或多个振荡器、数字信号处理器、编解码芯片组、SIM(Subscriber Identity Module,用户身份模块)卡等等。射频电路170可通过无线通信与网络以及其他设备通信,网络诸如是互联网、内联网和/或无线网络(诸如蜂窝电话网络、无线局域网和/或城域网)。无线通信可使用多种通信标准、协议和技术中的任何类型,包括但不限于全球移动通信系统、增强数据GSM环境、高速下行链路分组接入、高速上行链路分组接入、宽带码分多址、码分多址、时分多址、蓝牙、无线保真(例如,IEEE 802.11a、IEEE 802.11b、IEEE802.11g和/或IEEE 802.11n)、因特网语音协议、Wi-MAX、电子邮件协议(例如,因特网消息访问协议(IMAP)和/或邮局协议(POP))、即时消息(例如,可扩展消息处理现场协议(XMPP)、用于即时消息和现场利用扩展的会话发起协议(SIMPLE)、即时消息和到场服务(IMPS))、和/或短消息服务(SMS)、或者其他任何适当的通信协议,包括在本文献提交日还未开发出的通信协议。
电源180用于给手机100进行供电及电源管理。具体的,电源180可包括电源管理系统、一个或多个电源(例如,电池、交流电)、再充电系统、电力故障检测电路、功率变换器或逆变器、电力状态指示器(例如,发光二极管)和任何其他与手机中电力的生成、管理和分配相关联的部件。
尽管未示出,手机100还可包括摄像头、蓝牙模块等,在此不再赘述。
手机100的触摸屏131可以检测用户在屏幕上的输入,例如手指点击、滑动等。以电容式触摸屏为例,触摸屏131检测手指触摸时的电容变化点,通过该点的坐标可以得知手指触摸的位置。其中,手指点击输入可以被视为屏幕上的一个点的坐标,手指滑动输入可以被视为从屏幕上的起始点坐标到终止点坐标之间的位移。处理器110可以接收触摸屏131检测到的用户输入坐标,并能执行相应的用户指令。可选的,当触摸屏131检测到在其上或附近的触摸操作后,传送给处理器110以确定触摸事件的类型,随后处理器110根据触摸事件的类型在显示器上提供相应的视觉输出。
触摸屏131可以检测出用户输入的压力值。可选的,在用户按压触摸屏131时,触摸屏131可以根据压力值的大小给用户提供按压轻重程度的反馈。可选的,该反馈可以是一个触觉反馈,以确保只有当前进行输入的用户才能感知到, 例如振动等。在一种实现方式中,触摸屏131集成了电容式传感器,根据触摸时由于屏幕形变引起的电容值变化,从而得知触摸压力的变化。在另一种实现方式中,手机触摸屏131配备有有专门感应压力值的压力传感器,这些压力传感器可以设置在触摸屏131的四个角上,当一个触摸事件发生时,传感器可以对压力值进行感知,从而区分轻点、轻按、重按这三种动作,为人机交互开拓出了全新的方式。触摸屏131的压力传感器在检测到用户的按压后,处理器将会对用户的按压进行响应,以Android操作系统为例,处理器可以调用MotionEvent Class里的getPressure方法,从而获得用户输入压力值的大小数值。可以理解的是,设备也可以调用其他接口函数对压力值进行检测,此处不做限制。
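As an illustrative, non-limiting sketch of the pressure readout mentioned above (the description refers to the getPressure method of Android's MotionEvent class): the minimal Java view below simply logs the normalized pressure and contact size reported for each pointer of a touch event. The class name and the logging are illustrative assumptions, not part of the disclosed method.

    // Minimal sketch (Android/Java), assuming a custom View; names are illustrative.
    public class PalmTouchView extends android.view.View {
        public PalmTouchView(android.content.Context context) { super(context); }

        @Override
        public boolean onTouchEvent(android.view.MotionEvent event) {
            // One MotionEvent may carry several pointers (multi-touch).
            for (int i = 0; i < event.getPointerCount(); i++) {
                float pressure = event.getPressure(i); // normalized pressure of pointer i
                float size = event.getSize(i);         // approximate contact size
                android.util.Log.d("PalmTouch",
                        "pointer=" + i + " pressure=" + pressure + " size=" + size);
            }
            return true; // consume the event
        }
    }

Whether the reported pressure reflects measured force or an estimate derived from contact area depends on the device hardware, as the description notes.
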
手机100在运行过程中会产生缓存垃圾,而且运行中的系统进程和应用程序会占用内存,导致手机100运行速度变慢。以Android系统的内存回收机制为例,Android系统自身并没有关闭进程的说法,每当用户要退出一个进程回到桌面或打开另一个程序的时候,可以按返回键。而当用户按下返回键后,该进程并没有真正的关闭,仍然保存在内存中,在后台运行,以便在下次调用的时候可以更快的打开该程序。要想真正的关闭一个已打开的进程,一般是当Android系统认为已经没有足够的内存来运行新的进程,从而关闭一些虽然已经开着、但是没有用了的进程,这就需要一个对各种进程进行鉴别的机制。
在Android系统中,ActivityManagerService.java记录着每一个进程的优先级。一个进程的优先级用oom_adj值表示,oom_adj值越高代表该进程优先级越低。例如,一个正在使用的进程的oom_adj值为0,一旦按下返回键,这个进程就会得到一个更高的oom_adj值和更低的优先级;具体oom_adj值的多少通常取决于该进程在LRU(last recently used)list的位置。
上述内存管理机制并不能完全消除内存紧张的问题,而且可能存在管理机制失灵的情况,所以用户手动释放内存就显得很有必要。需要注意的是,此处的释放内存也可以是清理系统后台、整理内存碎片。另外,手机系统在运行过程中会积累大量的缓存,比如使用浏览新闻的应用程序,会产生大量图片缓存,又如听音乐也会产生大量缓存,这些缓存消耗了手机的存储空间,用户有必要手动清除缓存。一般来说,用户可以通过设置菜单或特定的应用程序来释放内存、清理缓存,这种操作需要用户手动操作,较为繁琐。有些特定的应用程序 也可以用来执行类似的功能,比如打开特定的应用程序后一键清理缓存,或者采用摇晃手机的方式清理缓存等,但这种方式需要安装特定的应用程序,给用户造成了不便。
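A minimal sketch, assuming an ordinary Android application context, of how "releasing memory and clearing the cache" might be approximated with public Android APIs (ActivityManager.getMemoryInfo, killBackgroundProcesses, and the application cache directory). The package name passed to killBackgroundProcesses and the low-memory policy are assumptions for illustration; a system-level implementation as described here would have broader privileges.

    // Hedged sketch (Android/Java); requires the KILL_BACKGROUND_PROCESSES permission.
    public final class CleanupHelper {
        private CleanupHelper() {}

        /** Rough approximation of "release memory and clear the cache" for one app. */
        public static void releaseMemoryAndClearCache(android.content.Context context) {
            android.app.ActivityManager am = (android.app.ActivityManager)
                    context.getSystemService(android.content.Context.ACTIVITY_SERVICE);

            android.app.ActivityManager.MemoryInfo info =
                    new android.app.ActivityManager.MemoryInfo();
            am.getMemoryInfo(info);                         // current available memory

            if (info.lowMemory) {
                // Illustrative only: the package name is a hypothetical example.
                am.killBackgroundProcesses("com.example.someapp");
            }

            // Clear this application's own cache directory.
            java.io.File[] entries = context.getCacheDir().listFiles();
            if (entries != null) {
                for (java.io.File entry : entries) {
                    deleteRecursively(entry);
                }
            }
        }

        private static void deleteRecursively(java.io.File file) {
            java.io.File[] children = file.listFiles();     // null for plain files
            if (children != null) {
                for (java.io.File child : children) {
                    deleteRecursively(child);
                }
            }
            file.delete();
        }
    }
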
如图2所示,本发明实施例提供一种触摸屏手势识别的方法,具体包括以下步骤:
S201,检测作用在触摸屏上的手势。
手势识别作为一种直观自然的输入方式,把人们从传统接触性的输入装置中解放出来,可以以一种更自然的方式与计算机交互,使计算机界面变得更加容易。手势主要分为静态手势和动态手势两种,动态手势可以看作是连续的静态手势序列。动态手势具有丰富和直观的表达能力,与静态手势结合在一起,能创造出更丰富的语义。利用动态手势识别构建新型的交互界面,是新一代的人机交互界面对输入方式自然性的要求,可以弥补传统交互方式的不足。
移动终端的触摸屏可以检测用户的手势输入,例如,触摸屏的触敏表面可以检测到用户的手势输入。可选的,触摸屏可以通过压力感应功能实现对手势输入的检测。所述检测作用在触摸屏上的手势,可以是在移动终端处于解锁状态下,也可以是在移动终端处于锁定状态下,也可以是在移动终端处于熄屏状态下。
其中,移动终端处于锁定状态下的界面叫锁屏界面。锁屏界面是指,移动终端被锁定后,点亮移动终端的显示器幕时所显示的用户界面。锁屏界面上包括多个菜单控件(如解锁模块、时钟)或应用程序的控件图标(如音乐应用程序的开关控件)。锁屏界面可以有一个或多个。例如,移动终端包括三个锁屏界面(即锁屏界面1、锁屏界面2、锁屏界面3)。移动终端的显示器幕被点亮时显示锁屏界面2,在保持移动终端的显示器幕的锁定状态下,用户通过触摸屏输入向左滑动的动作时,移动终端响应该左滑动作,移动终端将当前显示的锁屏界面2切换为锁屏界面1。当用户通过触摸屏输入向右滑动的动作时,移动终端响应该右滑动作,移动终端将当前显示的锁屏界面2切换为锁屏界面3。
其中,在移动终端处于解锁状态下,包括主屏幕页面和非主屏幕页面。主屏幕页面是指,移动终端解锁后所显示的页面。移动终端中的主屏幕页面包含应用程序的控件图标、菜单图标等。移动终端中的主屏幕页面可以有一个或多个,且随着控件图标的增加,移动终端中的主屏幕页面可以自动增加。当然, 移动终端还可以根据用户的输入指令而增加或删除主屏幕页面。非主屏幕页面,是指移动终端解锁后并不显示、需要用户指令或者系统指令才能显示的页面,例如,在移动终端解锁后,移动终端检测到用户手指向左划动的动作,则会显示其他非主屏幕页面。
其中,熄屏状态是关闭屏幕背光、熄灭屏幕,可以是移动终端判断特定时间内没有检测到用户操作而熄灭屏幕,也可以是用户手动关闭屏幕背光而熄灭屏幕。在熄屏状态下,移动终端可能会关闭一些系统进程以节省电量。
S202,解析手势的特征参数。
在解析手势的特征参数时,触摸屏可以获得手势上的至少一个采样点的特征参数。所述的至少一个采样点的特征参数,可以是这些点的压力值,或者可以是这些点的电容变化值。触摸屏还可以获取手势的移动轨迹,比如触摸屏可以获取手势上的至少一个采样点在移动过程中的起点和终点。值得注意的是,对于一些触摸屏,采样点的压力值大小可以通过测量手势与触摸屏的接触面积获得;对于另一些触摸屏,手势按压使触摸屏上的压电薄膜产生形变,根据压电效应,可以因而产生电荷,通过测量电荷量的大小可以获知采样点的压力值;对于另外一些触摸屏,手势按压使得触摸屏采样点处的电容发生变化,从而测得采样点的压力值大小。
另外,对于红外线式触摸屏,触摸屏可以检测到手掌的形状。红外线式触摸屏的原理是在触摸屏的边框排列着红外线发射管及接收管,形成一个密集的红外线矩阵。当用户触摸屏幕时,手指会挡住经过该触摸点的横竖两条红外线,从而确定该触摸点的坐标。当有手掌触摸时,会挡住手形区域内的红外线。移动终端判断挡住的红外线区域是否和预存的手掌形状相吻合,从而确定该用户输入是不是手掌触摸。
S203,当手势的特征参数与第一操作指令匹配时,响应第一操作指令。其中,第一操作指令可以是手掌触摸;响应第一操作指令,可以是移动终端释放内存、清空缓存。
当手势的特征参数与第一操作指令匹配时,会响应第一操作指令,例如触发移动终端执行释放内存、清理缓存的指令。可选的,在响应第一操作指令时,移动终端可以给用户呈现提示信息,呈现方式可以是文字、图片、音频、视频等形式,例如可以在触摸屏上呈现是否响应第一操作指令的提示框,或者用音 频形式问询用户是否响应第一操作指令。可选的,用户可以选择是否响应第一操作指令,例如,用户可以在提示框中选择是否响应第一操作指令,也可以用语音命令选择是否响应第一操作指令。移动终端在响应第一操作指令之后,可以给用户呈现已响应第一操作指令的提示消息,呈现方式可以是文字、图片、音频、视频等形式,例如可以在触摸屏上呈现已经响应第一操作指令的提示框。可以理解的是,这里所说的第一操作指令包括但不限于释放内存、清理缓存,还可以是打开其他应用程序,或者是更改系统设置等等。例如,第一操作指令可以是打开相机应用程序拍照、打开手电筒应用程序、打开录音应用程序、打开或者关闭蓝牙功能、打开或者关闭Wi-Fi功能等等,此处不做限制。
这里所说的手掌触摸,可以是手掌在触摸屏上的移动动作。值得注意的是,本发明实施例中的手掌,可以是手指与手腕之间的部分,也可以包括手指以及手指与手腕之间的部分。所述在触摸屏上的移动动作,可以是从左侧移动到右侧,也可以是从右侧移动到左侧,也可以是从顶部移动到底部,也可以是从底部移动到顶部,也可以沿着屏幕对角线移动,也可以沿着其他轨迹移动。
所述手势的特征参数与第一操作指令匹配,可以是触摸屏测得的手势的特征参数符合手掌触摸的动作。在判断用户手势是不是符合手掌触摸时,可以根据至少一个采样点的特征参数与预设的手掌触摸的特征参数进行比较,从而判断是否符合。比较的规则可以是两者的特征参数完全符合,也可以是两者的相似度超过某个阈值,例如70%的相似度、或者90%的相似度。
可选的,判断是否是手掌触摸动作可以分为两步。第一步,触摸屏判断用户手势是不是手掌的形状。第二步,触摸屏判断该手掌形状是不是沿着特定的方向移动。在第一步中,触摸屏要判断用户手势是不是手掌的形状,可以在用户手势上选取至少一个采样点,检测至少一个采样点的压力值或电容值。例如,触摸屏可以在用户手势上选择手指的指尖作为采样点,或者选择指尖和手掌边缘的若干个点作为采样点。可以理解的是,采样点可以是连续的点(表现为一条曲线),或者不连续的分散的点。
根据不同用户的不同操作习惯,手掌与触摸屏的接触方式也有不同。例如,当手掌面积相对于触摸屏的比例较小时,手掌可以覆盖在触摸屏上,此时手指之间和手掌边缘都会与触摸屏接触。这里所说的手掌面积相对于触摸屏的比例较小,可以是用户手掌较小,或者触摸屏面积较大(例如平板电脑的触摸屏), 这样可以保证用户手掌覆盖在触摸屏上。当手掌可以覆盖在触摸屏上时,采样点可以是手指的指尖和手掌边缘的若干个点。参见图7A,共选取了9个采样点,在指尖的采样点为a、b、c、d、e,在手掌边缘选取的采样点为f、g、h、i。又如,当手掌面积相对于触摸屏的比例较大时,用户手掌不足以全部覆盖在触摸屏上,或者手掌可以覆盖在触摸屏上但没有移动的空间,此时用户可以用手掌的一侧边缘擦拭屏幕。当手掌的一侧边缘擦拭屏幕时,取样点可以是该侧边缘上的若干个点。参见图7B,当用户把手掌的小拇指侧朝向触摸屏擦拭屏幕时,选取该侧手掌上的j、k、l、m这4个采样点。
在第二步中,触摸屏还可以获取手势的移动轨迹,比如触摸屏可以获取手势上的至少一个采样点在移动过程中的起点和终点。在此过程中,触摸屏可以跟踪每个采样点移动的起点和终点。可选的,当其中一个采样点沿着某个方向移动时,可以判断手掌在进行手掌触摸动作。可选的,当至少一个采样点中有超过预定个数的采样点沿着相同的方向移动时,可以判断手掌在进行手掌触摸动作。其中,超过预定个数,可以是超过2个采样点,可以是超过4个采样点,可以是超过半数的采样点,可以是超过三分之二的采样点。其中,沿着相同的方向,可以是同时沿着大致相同的方向移动,比如同时从左向右移动,同时从上到下移动,同时从右到左移动,同时从下到上移动;大致相同的方向,是指移动轨迹大致相同,不要求各个采样点的移动轨迹完全平行。可选的,至少一个采样点的移动速度大致相同,可以通过测量采样点在触摸屏纵横方向上的移动速度得知采样点的移动速度。参见图7C,在采样点中选取a、c、e、g这4个采样点,其移动起点分别为a1、c1、e1、g1,移动终端分别为a2、c2、e2、g2,如果检测到这4个采样点的移动方向相同,或者移动速度相同,则可以判断手掌在进行手掌触摸动作。
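A minimal sketch of the direction test described above: given the recorded start and end positions of the sampling points (for example a, c, e, g in FIG. 7C), it checks whether at least a preset number of points moved in roughly the same direction. The SamplePoint type, the angle tolerance and the minimum count are illustrative assumptions rather than values fixed by this description.

    // Hedged sketch (Java): do enough sampling points move in roughly the same direction?
    public final class PalmSwipeCheck {
        public static final class SamplePoint {
            public final float startX, startY, endX, endY;
            public SamplePoint(float sx, float sy, float ex, float ey) {
                startX = sx; startY = sy; endX = ex; endY = ey;
            }
        }

        /** True if at least minCount points moved within maxAngleDeg of a common direction. */
        public static boolean isPalmSwipe(java.util.List<SamplePoint> points,
                                          int minCount, double maxAngleDeg) {
            int best = 0;
            for (SamplePoint ref : points) {
                double refAngle = Math.atan2(ref.endY - ref.startY, ref.endX - ref.startX);
                int matching = 0;
                for (SamplePoint p : points) {
                    double angle = Math.atan2(p.endY - p.startY, p.endX - p.startX);
                    double diff = Math.toDegrees(Math.abs(angle - refAngle));
                    if (diff > 180) diff = 360 - diff;   // handle wrap-around
                    if (diff <= maxAngleDeg) matching++;
                }
                best = Math.max(best, matching);
            }
            return best >= minCount;
        }
    }

For example, isPalmSwipe(points, 3, 30.0) would require at least three of the tracked points to move within 30 degrees of a common direction before the gesture is treated as a palm swipe.
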
参见图3,为了区分手指点击触摸屏幕和手掌触摸屏幕,本发明实施例提供一种检测手掌触摸的方法,包括:
S301:检测手势在触摸屏上的感应信号。
感应信号可以理解为手势产生的电容变化值。电容触摸屏可以检测用户手势作用在触摸屏上产生的电容变化值。
S302:检测感应信号是否是多波峰。
当感应信号是单一波峰时,确定手势为手指触摸。参见图8A,在三维坐 标系中,x轴和y轴表示触摸屏的纵横方向,z轴表示感应信号强度。图8A出现了单一波峰,则说明手势是手指触摸。
S303:当感应信号是多波峰时,检测多波峰围成区域的感应信号。
相较于手指,手掌接触触摸屏的面积较大,包括指尖及手掌边缘的多个采样点会接触到触摸屏,多个采样点都会在触摸屏上形成感应信号,所以感应信号是多波峰。多波峰围成区域,即指尖及手掌边缘的多个采样点围成的区域,对应于手掌的掌心部分。
S304:当多波峰围成区域的感应信号为0时,确定手势为手掌触摸。
当用户的手掌覆盖在触摸屏上时,由于掌心部分略微凹陷,使得掌心部分难以接触到触摸屏,所以掌心部分产生的感应信号近似为0.可以理解的是,掌心部分产生的感应信号可能十分微弱,可以给掌心部分产生的感应信号设置一个阈值,如果感应信号小于该阈值,则确定手势为手掌触摸。参见图8B,在三维坐标系中,x轴和y轴表示触摸屏的纵横方向,z轴表示感应信号强度。图8B中出现了多波峰,且多波峰围成的区域出现了感应信号近似为0的区域,此区域正是掌心区域。
可选的,当手势为手掌触摸且在触摸屏上移动时,触摸屏检测到采样点对应的波峰也会移动。可选的,参见图8C,由指尖和手掌边缘构成的多个采样点可以被近似看做一个椭圆上的几个点(图8C中为o、p、q、r这4个点),则手掌在屏幕上的移动可以被近似看做一个椭圆在屏幕上移动。
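A minimal sketch, in Java, of the multi-peak rule of steps S301–S304, assuming the sensing data is available as a two-dimensional array of values per frame: local peaks above a peak threshold are collected, and if several peaks are found, the signal near their centroid, used here as a rough stand-in for the enclosed "palm center" region of FIG. 8B, must stay near zero. The thresholds and the centroid approximation are assumptions, not part of the disclosure.

    // Hedged sketch (Java) of the S301–S304 multi-peak palm test.
    public final class PalmDetector {

        public static boolean isPalmTouch(float[][] frame,
                                          float peakThreshold,
                                          float nearZeroThreshold) {
            java.util.List<int[]> peaks = findPeaks(frame, peakThreshold);
            if (peaks.size() < 2) {
                return false;                      // fewer than two peaks: not a palm
            }
            // Centroid of the peaks approximates the centre of the enclosed region.
            int cy = 0, cx = 0;
            for (int[] p : peaks) { cy += p[0]; cx += p[1]; }
            cy /= peaks.size();
            cx /= peaks.size();
            return frame[cy][cx] <= nearZeroThreshold;  // hollow centre => palm touch
        }

        private static java.util.List<int[]> findPeaks(float[][] frame, float threshold) {
            java.util.List<int[]> peaks = new java.util.ArrayList<>();
            for (int y = 1; y < frame.length - 1; y++) {
                for (int x = 1; x < frame[y].length - 1; x++) {
                    float v = frame[y][x];
                    if (v >= threshold
                            && v >= frame[y - 1][x] && v >= frame[y + 1][x]
                            && v >= frame[y][x - 1] && v >= frame[y][x + 1]) {
                        peaks.add(new int[]{y, x});  // local maximum above threshold
                    }
                }
            }
            return peaks;
        }
    }
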
参见图4,对于具有压力触控功能的触摸屏,本发明实施例提供另一种检测手掌触摸的方法,包括:
S401:检测手势在触摸屏上的压力值信号。
S402:检测压力值信号是否是多波峰。
当压力值信号是单一波峰时,确定手势为手指触摸。参见图8A,z轴表示压力值大小。
S403:当压力值信号是多波峰时,检测多波峰围成区域的压力值信号。
多波峰围成区域,即指尖及手掌边缘的多个采样点围成的区域,对应于手掌的掌心部分。
S404:当多波峰围成区域的压力值信号为0时,确定手势为手掌触摸。参见图8B,z轴表示压力值大小。
当用户的手掌覆盖在触摸屏上时,由于掌心部分略微凹陷,使得掌心部分难以接触到触摸屏,所以掌心部分产生的压力值信号近似为0.可以理解的是,掌心部分产生的压力值信号可能十分微弱,可以给掌心部分产生的压力值信号设置一个阈值,如果压力值信号小于该阈值,则确定手势为手掌触摸。
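Because the pressure-value rule of steps S401–S404 has the same structure as the capacitive rule of FIG. 3, the PalmDetector sketch given after FIG. 3 above can be reused on a frame of pressure values. The small synthetic frame below, with four pressure peaks around a near-zero centre, is test data invented purely for illustration.

    // Usage sketch: feed the earlier PalmDetector sketch a frame of pressure values.
    public final class PressurePalmDemo {
        public static void main(String[] args) {
            float[][] pressureFrame = new float[][] {
                {0.0f, 0.0f, 0.0f, 0.0f, 0.0f},
                {0.0f, 0.9f, 0.0f, 0.8f, 0.0f},
                {0.0f, 0.0f, 0.0f, 0.0f, 0.0f},
                {0.0f, 0.8f, 0.0f, 0.9f, 0.0f},
                {0.0f, 0.0f, 0.0f, 0.0f, 0.0f},
            };
            boolean palm = PalmDetector.isPalmTouch(pressureFrame, 0.5f, 0.05f);
            System.out.println("palm touch: " + palm);   // prints "palm touch: true"
        }
    }
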
参见图5,本发明另一实施例提供一种触摸屏手势识别的方法,包括:
S501:检测作用在触摸屏上的手势。与S201类似,此处不再赘述。
S502:检测手势是否作用在触摸屏的指定区域。
触摸屏检测用户的手势输入是否作用在触摸屏的指定区域内。参见图9A,可选的,可以把触摸屏分区,例如把触摸屏分为左上、左下、右上、右下四个面积相同的区域,指定区域可以是这四个区域中的某个区域或者某几个区域。参见图9B,又如,也可以把触摸屏分为左中右三个区域,这三个区域的面积可以不同,也可以相同。
S503:检测手势是否是手掌触摸。
检测方法可以参考S203,图3,图4或下文的S603,此处不再赘述。值得注意的是,S503和S502的顺序可以交换。在一种情况下,S502可以为:检测手势是否作用在触摸屏的指定区域;S503可以为:当手势作用在触摸屏的指定区域时,检测手势是否是手掌触摸。在另一种情况下,S502可以为:检测手势是否是手掌触摸;S503可以为:当手势是手掌触摸时,检测手势是否作用在触摸屏的指定区域。
S504:当手势是手掌触摸且作用在触摸屏的指定区域时,响应所述手掌触摸。
响应用户的手掌触摸,可以是移动终端释放内存、清理缓存等,具体方法可以参考S203中的响应第一操作指令的方式,此处不再赘述。
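A minimal sketch of the designated-region check of S502/S504, assuming the four equal quadrants of FIG. 9A: the touch coordinates are mapped to a quadrant, and the palm touch is acted on only when it lands in the designated one. The Region names and the single-designated-region policy are illustrative assumptions.

    // Hedged sketch (Java): map a touch position to a screen quadrant (FIG. 9A layout).
    public final class ScreenRegions {
        public enum Region { UPPER_LEFT, UPPER_RIGHT, LOWER_LEFT, LOWER_RIGHT }

        public static Region regionOf(float x, float y, float screenWidth, float screenHeight) {
            boolean left = x < screenWidth / 2f;
            boolean upper = y < screenHeight / 2f;
            if (left)  return upper ? Region.UPPER_LEFT  : Region.LOWER_LEFT;
            else       return upper ? Region.UPPER_RIGHT : Region.LOWER_RIGHT;
        }

        /** Respond to the palm touch only when it lands in the designated region. */
        public static boolean shouldRespond(float x, float y,
                                            float screenWidth, float screenHeight,
                                            Region designated) {
            return regionOf(x, y, screenWidth, screenHeight) == designated;
        }
    }
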
参见图6,本发明另一实施例提供另一种触摸屏手势识别的方法,包括:
S601:检测作用在触摸屏上的手势。与S201类似。
S602:解析手势的特征参数。与S202类似。
S603:检测手势是否是第一手掌触摸。
检测手势是否是手掌触摸时,可以检测手势与触摸屏的接触面积,当手掌与触摸屏的接触面积大于预设值时,确定用户手势为手掌触摸。用户手势与触摸屏的接触面积,也可以理解为触摸屏被用户手势的覆盖面积;该接触面积可 以是预设的,例如超过触摸屏面积的三分之一,或者超过触摸屏面积的二分之一,等等。
S604:当手势是第一手掌触摸时,在预设时间后检测是否有第二手掌触摸。
所述的预设时间后,可以是在第一次检测到手掌触摸之后的预设时间之后,例如,触摸屏检测到手掌触摸之后,间隔1秒,再次检测到手掌触摸。可以理解的是,此处的预设时间还可以是2秒,3秒,等等。当再次检测到手掌触摸时,可以与第一次检测到手掌触摸的位置重合,也可以有所偏差。如果再次检测到的手掌触摸与第一次检测到手掌触摸的位置重合,则说明在此预设时间内用户的手掌并没有移动;如果有所偏差,则说明在此预设时间内用户的手掌有了移动。
可以理解的是,“第一”、“第二”等可能在本文中用来描述各种元素,但是这些元素不应当被这些术语限定。这些术语只是用来将一个元素与另一元素区分开。例如,第一手掌触摸可以被命名为第二手掌触摸,并且类似地,第二手掌触摸可以被命名为第一手掌触摸,而不背离本发明的范围。第一手掌触摸和第二手掌触摸二者都是触摸,但是它们可以不是同一触摸,在某些场景下也可以是同一触摸。
S605:当在预设时间后检测有第二手掌触摸时,响应第二手掌触摸。
触摸屏检测到在预设时间内一直有手掌触摸,则可以响应用户的手掌触摸。响应用户的手掌触摸,可以是移动终端释放内存、清理缓存等,具体方法可以参考S203中的响应第一操作指令的方式,此处不再赘述。值得注意的是,在预设时间内一直有手掌触摸,不管手掌是否移动,都可以判断该手掌触摸不是误操作。可选的,如果在预设时间后检测到没有第二手掌触摸,则说明第一手掌触摸时误操作,移动终端不响应此误操作,忽略此触摸事件。
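A minimal sketch, assuming an Android main-thread Handler, of the confirmation flow of S603–S605: after the first palm touch, wait a preset time and respond only if a palm touch is still (or again) detected; otherwise the first touch is treated as accidental. The one-second delay and the callback interface are illustrative assumptions.

    // Hedged sketch (Android/Java) of the "second palm touch after a preset time" check.
    public final class PalmTouchConfirmer {
        public interface PalmCallbacks {
            boolean isPalmCurrentlyDetected();
            void respondToPalmTouch();
        }

        private static final long PRESET_DELAY_MS = 1000; // "preset time", e.g. 1 second

        private final android.os.Handler handler =
                new android.os.Handler(android.os.Looper.getMainLooper());
        private final PalmCallbacks callbacks;
        private boolean waiting;

        public PalmTouchConfirmer(PalmCallbacks callbacks) {
            this.callbacks = callbacks;
        }

        /** Call this when the first palm touch is detected. */
        public void onFirstPalmTouch() {
            if (waiting) return;                    // already waiting for confirmation
            waiting = true;
            handler.postDelayed(() -> {
                waiting = false;
                if (callbacks.isPalmCurrentlyDetected()) {
                    callbacks.respondToPalmTouch(); // second palm touch confirmed
                }                                   // otherwise treat as accidental
            }, PRESET_DELAY_MS);
        }
    }
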
另外,参见图10,本发明实施例提供一种装置,包括触摸屏101,处理器102,存储器103。其中,触摸屏101检测作用在触摸屏上的手势。处理器102用于根据存储器存储的解析手势的特征参数的规则,解析手势的特征参数,当手势的特征参数与第一操作指令匹配时,响应第一操作指令。其中,第一操作指令可以是手掌触摸;响应第一操作指令,可以是移动终端释放内存、清空缓存。存储器103用于存储解析手势的特征参数的规则,具体规则包括:检测手 势在触摸屏上的感应信号;检测感应信号是否是多波峰;当感应信号是多波峰时,检测多波峰围成区域的感应信号;当多波峰围成区域的感应信号为0时,确定手势为手掌触摸。或者,具体规则包括:检测手势在触摸屏上的压力值信号;检测压力值信号是否是多波峰;当压力值信号是多波峰时,检测多波峰围成区域的压力值信号;当多波峰围成区域的压力值信号为0时,确定手势为手掌触摸。
另外,本发明实施例提供一种装置,包括触摸屏,一个或多个处理器,存储器,一个或多个程序;其中,一个或多个程序被存储在存储器中并被配置为被一个或多个处理器执行,这一个或多个程序包括指令,该指令用于:检测作用在触摸屏上的手势;解析手势的特征参数;当手势的特征参数与第一操作指令匹配时,响应第一操作指令。其中,第一操作指令可以是手掌触摸;响应第一操作指令,可以是移动终端释放内存、清空缓存。其中,解析手势的特征参数,具体可以为:检测手势在触摸屏上的感应信号;检测感应信号是否是多波峰;当感应信号是多波峰时,检测多波峰围成区域的感应信号;当多波峰围成区域的感应信号为0时,确定手势为手掌触摸。或者,解析手势的特征参数,具体可以为:检测手势在触摸屏上的压力值信号;检测压力值信号是否是多波峰;当压力值信号是多波峰时,检测多波峰围成区域的压力值信号;当多波峰围成区域的压力值信号为0时,确定手势为手掌触摸。
另外,参见图11,本发明实施例还提供一种触摸屏手势识别的装置,包括检测单元111,解析单元112,判断单元113,执行单元114。其中,检测单元111用于检测作用在触摸屏上的手势。解析单元112用于解析手势的特征参数。判断单元113用于判断手势的特征参数与第一操作指令是否匹配。执行单元114用于当手势的特征参数与第一操作指令匹配时,响应第一操作指令。其中,第一操作指令可以是手掌触摸;响应第一操作指令,可以是移动终端释放内存、清空缓存。其中,解析手势的特征参数,具体可以为:检测手势在触摸屏上的感应信号;检测感应信号是否是多波峰;当感应信号是多波峰时,检测多波峰围成区域的感应信号;当多波峰围成区域的感应信号为0时,确定手势为手掌触摸。或者,解析手势的特征参数,具体可以为:检测手势在触摸屏上的压力值信号;检测压力值信号是否是多波峰;当压力值信号是多波峰时,检测多波峰围成区域的压力值信号;当多波峰围成区域的压力值信号为0时,确定手势 为手掌触摸。
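A skeleton sketch of the unit structure of FIG. 11, with one Java interface per unit (detecting, parsing, judging, executing) wired together in the order described. The GestureData and GestureFeatures placeholder types are assumptions; they stand in for whatever representation a concrete implementation would use.

    // Hedged sketch (Java): skeleton of the detecting/parsing/judging/executing units.
    public final class GestureRecognizerSkeleton {
        public static final class GestureData {}
        public static final class GestureFeatures {}

        public interface DetectingUnit { GestureData detect(); }
        public interface ParsingUnit   { GestureFeatures parse(GestureData data); }
        public interface JudgingUnit   { boolean matchesFirstInstruction(GestureFeatures f); }
        public interface ExecutingUnit { void respondToFirstInstruction(); }

        public static void run(DetectingUnit d, ParsingUnit p,
                               JudgingUnit j, ExecutingUnit e) {
            GestureFeatures features = p.parse(d.detect());
            if (j.matchesFirstInstruction(features)) {
                e.respondToFirstInstruction();      // e.g. release memory / clear the cache
            }
        }
    }
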
另外,本发明实施例还提供一种存储介质,用于存储计算机软件指令,该指令用于:检测作用在触摸屏上的手势;解析手势的特征参数;当手势的特征参数与第一操作指令匹配时,响应第一操作指令。其中,第一操作指令可以是手掌触摸;响应第一操作指令,可以是移动终端释放内存、清空缓存。其中,解析手势的特征参数,具体可以为:检测手势在触摸屏上的感应信号;检测感应信号是否是多波峰;当感应信号是多波峰时,检测多波峰围成区域的感应信号;当多波峰围成区域的感应信号为0时,确定手势为手掌触摸。或者,解析手势的特征参数,具体可以为:检测手势在触摸屏上的压力值信号;检测压力值信号是否是多波峰;当压力值信号是多波峰时,检测多波峰围成区域的压力值信号;当多波峰围成区域的压力值信号为0时,确定手势为手掌触摸。
本领域普通技术人员应当理解,结合本文中所公开的实施例描述的各示例的单元、规则及方法步骤,能够以计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本发明的范围。
应当理解,本文中所使用的术语“和/或”是指并且涵盖相关联地列出的项目中一个或多个项目的任何和全部可能的组合。还将理解的是,术语“包括”和/或“包含”当在本说明书中使用时是指定存在所陈述的特征、整数、步骤、操作、元素和/或部件,但是并不排除存在或添加一个或多个其他特征、整数、步骤、操作、元素、部件和/或其分组。
在本文中对本发明的描述中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本发明的限制。如本在发明的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”和“这一”旨在也包括复数表达形式,除非其上下文中明确地有相反指示。
所述作为分离部件说明的单元是或者也不是物理上分开的,作为单元显示的部件是或者也不是物理单元,即位于一个地方,或者也分布到多个网络单元上。根据实际的需要选择其中的部分或者全部单元来实现本发明实施例方案的目的。在本发明各个实施例中的各功能单元集成在一个处理单元中,也是各个单元单独物理存在,也两个或两个以上单元集成在一个单元中。
本领域普通技术人员理解:实现上述方法实施例的全部或部分步骤通过程序指令相关的硬件来完成,前述的程序存储于一计算机可读取存储介质中,该程序在执行时,执行包括上述方法实施例的步骤;而前述的存储介质包括:ROM、RAM、磁碟或者光盘等各种存储程序代码的介质。
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以所述权利要求的保护范围为准。

Claims (19)

  1. 一种触摸屏手势识别的方法,其特征在于,所述方法包括:
    检测作用在触摸屏上的手势;
    解析所述手势的特征参数;
    当所述手势的特征参数与第一操作指令匹配时,响应所述第一操作指令;
    其中,所述第一操作指令是手掌触摸。
  2. 根据权利要求1所述的方法,其特征在于,所述解析所述手势的特征参数,具体包括:
    检测所述手势在所述触摸屏上的感应信号;
    检测所述感应信号是否是多波峰;
    当所述感应信号是多波峰时,检测所述多波峰围成区域的感应信号;
    当所述多波峰围成区域的感应信号为0时,确定所述手势为手掌触摸。
  3. 根据权利要求1所述的方法,其特征在于,所述解析所述手势的特征参数,具体包括:
    检测所述手势在所述触摸屏上的压力值信号;
    检测所述压力值信号是否是多波峰;
    当所述压力值信号是多波峰时,检测所述多波峰围成区域的感应信号;
    当所述多波峰围成区域的压力值信号为0时,确定所述手势为手掌触摸。
  4. 根据权利要求1-3所述的方法,其特征在于,所述解析所述手势的特征参数,具体包括:
    获得手势上的至少一个采样点的压力值和移动轨迹。
  5. 根据权利要求1-4所述的方法,其特征在于,在响应所述第一操作指令前,还包括:
    检测手势是否作用在触摸屏的指定区域;
    当手势是手掌触摸且作用在触摸屏的指定区域时,响应所述手掌触摸。
  6. 根据权利要求1-4所述的方法,其特征在于,在响应所述第一操作指令前,还包括:
    检测手势是否是第一手掌触摸;
    当手势是第一手掌触摸时,在预设时间后检测是否有第二手掌触摸;
    当在预设时间后检测有第二手掌触摸时,响应第二手掌触摸。
  7. 根据权利要求1-6所述的方法,其特征在于,所述响应所述第一操作指令,具体为:释放内存或清空缓存。
  8. 根据权利要求7所述的方法,其特征在于,所述响应所述第一操作指令,还包括:在触摸屏上呈现是否响应第一操作指令的提示框。
  9. 一种装置,包括触摸屏,一个或多个处理器,存储器,一个或多个程序;所述一个或多个程序被存储在所述存储器中并被配置为被所述一个或多个处理器执行,所述一个或多个程序包括指令,所述指令用于:检测作用在触摸屏上的手势;解析所述手势的特征参数;当所述手势的特征参数与第一操作指令匹配时,响应所述第一操作指令;其中,所述第一操作指令是手掌触摸。
  10. 根据权利要求9所述的装置,其特征在于,所述解析所述手势的特征参数,具体包括:获得手势上的至少一个采样点的压力值和移动轨迹。
  11. 根据权利要求9所述的装置,其特征在于,所述响应所述第一操作指令,具体为:释放内存或清空缓存。
  12. 一种装置,包括触摸屏,处理器,存储器;所述触摸屏检测作用在触摸屏上的手势;所述处理器用于根据存储器存储的解析手势的特征参数的规则,解析手势的特征参数,当手势的特征参数与第一操作指令匹配时,响应第一操作指令;所述存储器用于存储解析手势的特征参数的规则;其中,第一操作指令是手掌触摸。
  13. 根据权利要求12所述的装置,其特征在于,所述解析手势的特征参数的规则,具体为:
    检测所述手势在所述触摸屏上的感应信号;
    检测所述感应信号是否是多波峰;
    当所述感应信号是多波峰时,检测所述多波峰围成区域的感应信号;
    当所述多波峰围成区域的感应信号为0时,确定所述手势为手掌触摸。
  14. 根据权利要求12所述的装置,其特征在于,所述解析手势的特征参数的规则,具体为:
    检测所述手势在所述触摸屏上的压力值信号;
    检测所述压力值信号是否是多波峰;
    当所述压力值信号是多波峰时,检测所述多波峰围成区域的感应信号;
    当所述多波峰围成区域的压力值信号为0时,确定所述手势为手掌触摸。
  15. 根据权利要求12-14所述的装置,其特征在于,所述解析手势的特征参数的规则,具体为:获得手势上的至少一个采样点的压力值和移动轨迹。
  16. 根据权利要求12-15所述的装置,其特征在于,所述响应所述第一操作指令,具体为:释放内存或清空缓存。
  17. 一种触摸屏手势识别的装置,其特征在于,所述装置包括包括检测单元,解析单元,判断单元,执行单元;
    其中,所述检测单元用于检测作用在触摸屏上的手势;
    所述解析单元用于解析手势的特征参数;
    所述判断单元用于判断手势的特征参数与第一操作指令是否匹配;
    所述执行单元用于当手势的特征参数与第一操作指令匹配时,响应第一操作指令;
    其中,第一操作指令是手掌触摸,响应第一操作指令是释放内存或清空缓存。
  18. 一种存储一个或多个程序的计算机存储介质,所述一个或多个程序包括指令,所述指令用于:
    检测作用在触摸屏上的手势;解析所述手势的特征参数;当所述手势的特征参数与第一操作指令匹配时,响应所述第一操作指令;其中,所述第一操作指令是手掌触摸。
  19. 根据权利要求18所述的计算机存储介质,所述响应第一操作指令,具体为:释放内存或清空缓存。
PCT/CN2016/111350 2016-12-21 2016-12-21 触摸屏手势识别的方法及装置 WO2018112803A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2016/111350 WO2018112803A1 (zh) 2016-12-21 2016-12-21 触摸屏手势识别的方法及装置
CN201680057377.8A CN108604160A (zh) 2016-12-21 2016-12-21 触摸屏手势识别的方法及装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/111350 WO2018112803A1 (zh) 2016-12-21 2016-12-21 触摸屏手势识别的方法及装置

Publications (1)

Publication Number Publication Date
WO2018112803A1 true WO2018112803A1 (zh) 2018-06-28

Family

ID=62624148

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/111350 WO2018112803A1 (zh) 2016-12-21 2016-12-21 触摸屏手势识别的方法及装置

Country Status (2)

Country Link
CN (1) CN108604160A (zh)
WO (1) WO2018112803A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110851014A (zh) * 2019-10-29 2020-02-28 Tcl移动通信科技(宁波)有限公司 触摸识别方法、装置、存储介质及终端设备
CN111273815A (zh) * 2020-01-16 2020-06-12 业成科技(成都)有限公司 手势触控方法和手势触控系统
CN112527097A (zh) * 2019-09-19 2021-03-19 义隆电子股份有限公司 触控装置及其操作方法
CN117555442A (zh) * 2024-01-12 2024-02-13 上海海栎创科技股份有限公司 一种多芯片级联触摸屏的手掌识别方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102736845A (zh) * 2011-03-22 2012-10-17 奥多比公司 针对支持多点触摸的设备提供局部坐标系用户界面的方法和装置
CN103246836A (zh) * 2013-04-03 2013-08-14 李健 触摸屏手指滑动身份识别解锁方法
CN105302452A (zh) * 2014-07-22 2016-02-03 腾讯科技(深圳)有限公司 一种基于手势交互的操作方法及装置
CN105659522A (zh) * 2013-09-09 2016-06-08 苹果公司 用于基于指纹传感器输入来操纵用户界面的设备、方法和图形用户界面

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101887332B (zh) * 2009-05-12 2013-03-06 联阳半导体股份有限公司 触控面板的定位方法及定位装置
CN101882042B (zh) * 2010-06-08 2011-10-12 苏州瀚瑞微电子有限公司 电容式触摸屏手掌判别方法
CN105335086B (zh) * 2014-08-12 2018-12-11 苏宁易购集团股份有限公司 触摸屏操作方法和装置
CN105867916A (zh) * 2016-03-25 2016-08-17 乐视控股(北京)有限公司 一种终端的控制方法及装置
CN106055404B (zh) * 2016-05-18 2020-05-01 Oppo广东移动通信有限公司 一种清理后台应用程序的方法和装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102736845A (zh) * 2011-03-22 2012-10-17 奥多比公司 针对支持多点触摸的设备提供局部坐标系用户界面的方法和装置
CN103246836A (zh) * 2013-04-03 2013-08-14 李健 触摸屏手指滑动身份识别解锁方法
CN105659522A (zh) * 2013-09-09 2016-06-08 苹果公司 用于基于指纹传感器输入来操纵用户界面的设备、方法和图形用户界面
CN105302452A (zh) * 2014-07-22 2016-02-03 腾讯科技(深圳)有限公司 一种基于手势交互的操作方法及装置

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112527097A (zh) * 2019-09-19 2021-03-19 义隆电子股份有限公司 触控装置及其操作方法
CN110851014A (zh) * 2019-10-29 2020-02-28 Tcl移动通信科技(宁波)有限公司 触摸识别方法、装置、存储介质及终端设备
CN110851014B (zh) * 2019-10-29 2023-08-18 Tcl移动通信科技(宁波)有限公司 触摸识别方法、装置、存储介质及终端设备
CN111273815A (zh) * 2020-01-16 2020-06-12 业成科技(成都)有限公司 手势触控方法和手势触控系统
CN111273815B (zh) * 2020-01-16 2022-12-02 业成科技(成都)有限公司 手势触控方法和手势触控系统
CN117555442A (zh) * 2024-01-12 2024-02-13 上海海栎创科技股份有限公司 一种多芯片级联触摸屏的手掌识别方法及系统
CN117555442B (zh) * 2024-01-12 2024-04-09 上海海栎创科技股份有限公司 一种多芯片级联触摸屏的手掌识别方法及系统

Also Published As

Publication number Publication date
CN108604160A (zh) 2018-09-28

Similar Documents

Publication Publication Date Title
US11269575B2 (en) Devices, methods, and graphical user interfaces for wireless pairing with peripheral devices and displaying status information concerning the peripheral devices
KR102120930B1 (ko) 포터블 디바이스의 사용자 입력 방법 및 상기 사용자 입력 방법이 수행되는 포터블 디바이스
EP2820511B1 (en) Classifying the intent of user input
CN108121457B (zh) 提供字符输入界面的方法和设备
CN107479737B (zh) 便携式电子设备及其控制方法
WO2019014859A1 (zh) 一种多任务操作方法及电子设备
WO2018082269A1 (zh) 菜单显示方法及终端
US20140152585A1 (en) Scroll jump interface for touchscreen input/output device
WO2019062910A1 (zh) 一种复制和粘贴的方法、数据处理装置和用户设备
EP2770423A2 (en) Method and apparatus for operating object in user device
KR20130052151A (ko) 터치스크린을 구비한 휴대 단말기의 데이터 입력 방법 및 장치
TWI659353B (zh) 電子設備以及電子設備的工作方法
WO2018068207A1 (zh) 识别操作的方法、装置及移动终端
WO2017197636A1 (zh) 识别误触摸操作的方法和电子设备
WO2013135169A1 (zh) 一种输入法键盘的调整方法及其移动终端
WO2018112803A1 (zh) 触摸屏手势识别的方法及装置
EP2613247A2 (en) Method and apparatus for displaying keypad in terminal having touch screen
CN108475161A (zh) 显示方法及终端
KR20140106801A (ko) 시각 장애인들을 위한 휴대 단말기의 음성 서비스 지원 방법 및 장치
CN109074124B (zh) 数据处理的方法及移动设备
WO2018039914A1 (zh) 一种数据复制方法及用户终端
EP2741194A1 (en) Scroll jump interface for touchscreen input/output device
JP2011243157A (ja) 電子機器、ボタンサイズ制御方法、及びプログラム
CN108700990A (zh) 一种锁屏方法、终端及锁屏装置
US20130069881A1 (en) Electronic device and method of character entry

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924757

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16924757

Country of ref document: EP

Kind code of ref document: A1