WO2016165066A1 - Gesture control method, apparatus, terminal device, and storage medium - Google Patents

Gesture control method, apparatus, terminal device, and storage medium

Info

Publication number
WO2016165066A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
touch action
joint
touch
terminal device
Prior art date
Application number
PCT/CN2015/076536
Other languages
English (en)
French (fr)
Inventor
王伟
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to US15/566,582 priority Critical patent/US10802704B2/en
Priority to EP15888771.1A priority patent/EP3276480A4/en
Priority to PCT/CN2015/076536 priority patent/WO2016165066A1/zh
Priority to JP2017553945A priority patent/JP6598089B2/ja
Priority to CN201580029659.2A priority patent/CN106415472B/zh
Publication of WO2016165066A1 publication Critical patent/WO2016165066A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1636Sensing arrangement for detection of a tap gesture on the housing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • Embodiments of the present invention relate to the field of touch control, and in particular, to a gesture control method, apparatus, terminal device, and storage medium.
  • touch screens have an increasingly wide range of applications.
  • the user performs various basic touch actions such as pressing, moving, or lifting on the touch screen to generate various types of touch gestures.
  • the most used touch gesture on existing terminal devices is the click gesture.
  • when the user performs a click gesture, the terminal device obtains the location of the click gesture and determines whether an application corresponds to that location; if so, it triggers the application to execute the corresponding operation; otherwise, no action is triggered.
  • the recognition of the touch gesture by the terminal device is based on the position and the trajectory.
  • the touch gesture based on the two-dimensional plane is increasingly unable to meet the interaction requirements of the terminal device.
  • the technical problem to be solved by the embodiments of the present invention is to provide a gesture control method, device, terminal device, and storage medium, which can enrich the interaction mode of the terminal device.
  • a gesture control method including:
  • the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
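The joint-touch criterion above (contact area above a preset area AND z-axis acceleration above a preset acceleration) can be sketched as a simple threshold check. The function name, units, and the two preset values below are illustrative assumptions, not values given by the specification:

```python
def is_joint_touch(contact_area_mm2, z_accel_ms2,
                   preset_area_mm2=80.0, preset_accel_ms2=15.0):
    """Classify a touch action as a joint (knuckle) touch when BOTH
    the contact area and the z-axis acceleration exceed their presets.
    The default preset values are illustrative placeholders."""
    return contact_area_mm2 > preset_area_mm2 and z_accel_ms2 > preset_accel_ms2
```

A touch that exceeds only one threshold (a large, slow fingerpad press, or a sharp stylus tap with a tiny contact patch) stays an ordinary touch action, which is the distinction the two-condition test is meant to capture.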
  • the identifying a gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type includes:
  • the identifying a gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type includes:
  • if the interface in which the joint touch action occurs is a system desktop of the terminal device, query an application associated with the gesture type, and start or close the application.
  • the identifying a gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type includes:
  • the gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture; the sliding track of the swipe gesture includes at least one of a closed track and a non-closed track.
  • the obtaining a contact area of the touch action on the touch screen and a z-axis acceleration generated when contacting the touch screen includes:
  • the z-axis acceleration of the touch action is acquired by a gravity acceleration sensor that is provided by the terminal device.
  • before the detecting of the touch action on the touch screen of the terminal device, the method further includes:
  • a mapping relationship between the gesture type corresponding to the joint touch action and the preset function is customized, and the mapping relationship is saved in a mapping relationship library.
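The customizable mapping relationship library described above can be sketched as a small keyed store from gesture type to preset function; the gesture names and the callable-based representation are illustrative assumptions:

```python
class GestureMappingLibrary:
    """A minimal mapping relationship library: the user customizes a
    mapping from a joint-gesture type to a preset function, and the
    library later invokes that function when the gesture is recognized."""

    def __init__(self):
        self._mappings = {}  # gesture type -> preset function (callable)

    def customize(self, gesture_type, preset_function):
        # Save (or overwrite) the mapping relationship in the library.
        self._mappings[gesture_type] = preset_function

    def invoke(self, gesture_type):
        # Call the preset function for a recognized gesture, if mapped.
        func = self._mappings.get(gesture_type)
        return func() if func else None


library = GestureMappingLibrary()
library.customize("two_click", lambda: "partial_screenshot")
```

An unmapped gesture type simply invokes nothing, which matches the idea that only user-customized mappings trigger preset functions.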
  • a second aspect of the embodiments of the present invention provides a gesture control apparatus, including:
  • a detecting module configured to detect a touch action on a touch screen of the terminal device
  • An acquiring module configured to acquire a contact area of the touch action on the touch screen and a z-axis acceleration generated when contacting the touch screen;
  • a determining module configured to determine that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration
  • the calling module is configured to identify a gesture type corresponding to the joint touch action, and invoke a preset function of the terminal device according to the gesture type.
  • the calling module includes:
  • a first determining unit configured to determine an interface in which the joint touch action occurs, and an application to which the interface belongs
  • the first calling unit is configured to identify a gesture type corresponding to the joint touch action, and invoke a preset function corresponding to the application according to the gesture type.
  • the calling module includes:
  • a second determining unit configured to determine an interface where the joint touch action occurs
  • a second calling unit configured to: if the interface where the joint touch action occurs is a system desktop of the terminal device, query an application associated with the gesture type, and start or close the application.
  • the calling module includes:
  • a third determining unit configured to determine an interface at which the joint touch action occurs
  • a third calling unit configured to identify a gesture type corresponding to the joint touch action, and operate the interface according to the gesture type; wherein the operation includes a screen capture, an icon arrangement, or a theme replacement.
  • the gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture; the sliding track of the swipe gesture includes at least one of a closed track and a non-closed track.
  • the acquiring module includes:
  • a first acquiring unit configured to acquire the number m of pixel points in the contact area of the touch action on the touch screen, the total number n of pixel points of the touch screen, and the area s of the touch screen;
  • a calculating unit configured to calculate a contact area of the touch action on the touch screen as s*(m/n);
  • the second acquiring unit is configured to acquire the z-axis acceleration of the touch action by using a gravity acceleration sensor that is provided by the terminal device.
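The contact-area formula in the unit descriptions above, s * (m / n), can be written out directly; the parameter names are illustrative, and the result is in whatever unit the screen area s is given in:

```python
def contact_area(m_contact_pixels, n_screen_pixels, screen_area):
    """Contact area of a touch = screen area * (number of pixel points
    in the contact region / total pixel points of the screen), i.e.
    s * (m / n) as in the text above."""
    if n_screen_pixels <= 0:
        raise ValueError("total pixel count must be positive")
    return screen_area * (m_contact_pixels / n_screen_pixels)
```

For example, on a screen of 2,073,600 pixels (1080 x 1920) with an area of 100 cm², a contact patch covering 20,736 pixels yields an area of about 1 cm².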
  • a mapping module configured to customize a mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and save the mapping relationship to the mapping relationship library.
  • a third aspect of the embodiments of the present invention provides a gesture recognition apparatus, including a processor and a memory, where the memory stores a set of program codes, and the processor calls the program code stored in the memory to perform the following operations:
  • the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
  • the processor performs the identifying of a gesture type corresponding to the joint touch action and the invoking of a preset function of the terminal device according to the gesture type, including:
  • the processor performs the identifying of a gesture type corresponding to the joint touch action and the invoking of a preset function of the terminal device according to the gesture type, including:
  • if the interface in which the joint touch action occurs is a system desktop of the terminal device, query an application associated with the gesture type, and start or close the application.
  • the calling module includes:
  • a third determining unit configured to determine an interface at which the joint touch action occurs
  • a third calling unit configured to identify a gesture type corresponding to the joint touch action, and operate the interface according to the gesture type; wherein the operation includes a screen capture, an icon arrangement, or a theme replacement.
  • the gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture; the sliding track of the swipe gesture includes at least one of a closed track and a non-closed track.
  • the processor performs the acquiring of the contact area of the touch action on the touch screen and of the z-axis acceleration generated when contacting the touch screen, including:
  • the z-axis acceleration of the touch action is acquired by a gravity acceleration sensor that is provided by the terminal device.
  • the processor is further configured to:
  • mapping relationship between the gesture type corresponding to the joint touch action and the preset function is customized, and the mapping relationship is saved in the mapping relationship library.
  • a fifth aspect of the embodiments of the present invention provides a terminal device, including any one of the above gesture control devices, a touch screen, and a gravity sensor.
  • a sixth aspect of the embodiments of the present invention provides a method for controlling a computer device to perform gesture control, where the method includes the following steps:
  • the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
  • Identifying a gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type.
  • when the joint touch action is detected, the gesture type corresponding to the joint touch action is recognized, and the preset function of the terminal device is invoked according to the gesture type; this adds an interaction method based on touch gestures characterized by contact area and z-axis acceleration, making the interaction methods of the terminal device richer.
  • FIG. 1 is a schematic flowchart of a gesture control method according to an embodiment of the present invention
  • FIG. 2 is another schematic flowchart of a gesture control method according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
  • FIG. 4 is another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
  • Figure 5a is a schematic structural view of the calling module of Figure 4.
  • Figure 5b is another schematic structural view of the calling module of Figure 4.
  • Figure 5c is a schematic diagram of still another structure of the calling module of Figure 4.
  • Figure 6 is a schematic structural view of the acquisition module of Figure 4.
  • FIG. 7 is still another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
  • FIG. 1 is a schematic flowchart of a gesture control method according to an embodiment of the present invention.
  • the method comprises S101-S104.
  • the touch action of the finger on the touch screen of the terminal device can be detected in a bright screen state or a black screen state, and a touch signal is generated.
  • the bright screen state refers to a situation in which the backlight of the touch screen is lit, and the bright screen state includes a lock screen bright screen state or a non-lock screen bright screen state, and the black screen state refers to a case where the backlight of the touch screen is turned off.
  • the touch screen can be a dedicated touch screen or a display device with a touch function.
  • a touch screen may receive a touch action of one or more touch points through a touch event processing function and generate a corresponding gesture, different gestures indicating different operations.
  • the recognition and acquisition of the touch gesture may be different depending on the working principle of the touch technology, which is not limited by the present invention.
  • the basic touch actions of the finger may include actions such as pressing down, moving, and lifting up; different basic touch actions are combined into different types of gestures.
  • a tap gesture consists of pressing and lifting two basic touch actions
  • the swipe gesture consists of three basic touch actions of pressing, moving, and lifting.
  • the present invention is not limited to the touch action of the finger on the touch screen, but also the touch action of other objects on the touch screen, and the touch screen can touch the touch action of different types of objects by different touch technologies.
  • the contact area refers to the area of the contact area generated when the finger is in contact with the touch screen.
  • because the touch screen is rigid, the velocity of the finger decays to zero in a very short time, thereby generating an acceleration in the z-axis direction; the greater the finger's velocity along the z-axis, the greater the acceleration generated in the z-axis direction.
  • the touch action is an action when the finger joint of the finger comes into contact with the touch screen.
  • the z-axis in the embodiment of the present invention refers to the direction perpendicular to the touch screen. It can be understood that when the finger touches the touch screen at a certain speed, the direction of the velocity is not exactly perpendicular to the touch screen, so the acquired z-axis acceleration can be corrected by the cosine of the angle; when the angle cannot be directly measured, an empirical value is set.
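The cosine correction described above can be sketched as a projection onto the screen normal; the measured acceleration magnitude and the empirical angle are treated as assumed inputs:

```python
import math

def z_axis_acceleration(accel_magnitude, angle_from_normal_deg=0.0):
    """Project the finger's impact acceleration onto the z-axis (the
    direction perpendicular to the touch screen). When the true angle
    cannot be measured, an empirical value is substituted, as the
    text suggests."""
    return accel_magnitude * math.cos(math.radians(angle_from_normal_deg))
```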
  • the joint touch action in the embodiment of the present invention is a newly defined touch action, which is not necessarily triggered by a finger joint; an action triggered by another object hitting the touch screen at high speed and satisfying the above conditions can also be referred to as a joint touch action in the embodiment of the present invention.
  • the touch action is composed of at least one basic touch action; the gesture type corresponding to the joint touch action is identified, and the preset function of the terminal device is invoked according to the gesture type. The preset function of the terminal device includes: starting/closing an application on the terminal device, or calling a function within an application, for example, entering the settings menu, starting the camera, starting a screenshot, or changing the theme; or calling the reply function in an SMS application, calling the video call function in an instant messaging application, calling the screen capture function of a browser application, and so on.
  • Different functions corresponding to different gesture types can be customized by the user in advance.
  • the application program may be a system application or a third-party application.
  • the system application refers to an application that is provided by the terminal device operating system.
  • the functions corresponding to the system application include a dialing function, a setting function, and a short message interface function.
  • a third-party application refers to an application installed on the terminal device.
  • the identifying the gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type includes:
  • the application presents a plurality of different interfaces on the terminal device, and the application includes multiple preset functions.
  • the interface where the joint touch action occurs is determined, along with the application program to which the interface belongs; the gesture type corresponding to the joint touch action is recognized, the gesture type including, but not limited to, a click gesture or a swipe gesture; and the preset function within the application to which the interface belongs is invoked according to the gesture type.
  • For example, the interface where the joint touch action occurs is the interface of instant messaging software, and the gesture type corresponding to the joint touch action is a two-click gesture.
  • the two-click gesture corresponds to the partial screen capture function in the instant messaging software.
  • the screen capture function is activated, and an adjustable capture area is displayed on the interface where the joint touch action occurs.
  • the captured image is saved to the specified location of the terminal device.
  • the interface where the joint touch action occurs is the interface of the instant messaging software, and the gesture type corresponding to the joint touch action is a three-click gesture.
  • the three-click gesture corresponds to the full-screen capture function in the instant messaging software; the entire screen is captured to generate a picture, and the generated picture is saved to a specified location of the terminal device.
  • the interface where the joint touch action occurs is the interface of the camera application software
  • the gesture type corresponding to the joint touch action is the S track gesture
  • the corresponding function of the S track gesture in the camera application software is the camera function
  • the camera function is activated.
  • the M track gesture corresponding to the preset function in the camera application is the recording function.
  • the preset functions corresponding to the same gesture type may be the same or different, and the preset functions corresponding to the gesture type may be customized as needed.
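The examples above, where the same joint gesture can trigger different preset functions depending on which application's interface it occurs in, amount to a two-level lookup. All application and function names below are illustrative, not taken from the specification:

```python
# Per-application gesture mappings: the same gesture type may map to
# different preset functions in different applications, and the
# mappings can be customized as needed (names here are illustrative).
APP_GESTURE_MAP = {
    "instant_messaging": {
        "two_click": "partial_screen_capture",
        "three_click": "full_screen_capture",
    },
    "camera": {
        "s_track": "take_photo",
        "m_track": "record_video",
    },
}

def preset_function_for(app, gesture_type):
    """Return the preset function name for a joint gesture in the
    application the current interface belongs to, or None if the
    gesture has no mapping there."""
    return APP_GESTURE_MAP.get(app, {}).get(gesture_type)
```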
  • the identifying the gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type includes:
  • the interface in which the joint action occurs is a system desktop of the terminal device, query an application associated with the gesture type, and start or close the application.
  • when a joint touch action occurs on the terminal device, the interface where the joint touch action occurs is determined, along with the type of the interface; if the interface is the system desktop of the terminal device, the application associated with the gesture type is queried, and the running state of the associated application is obtained.
  • if the associated application is not started, the associated application is started and its startup interface is displayed on the terminal device; if the associated application is in the background running state, the associated application is closed.
  • the application program in the embodiment of the present invention may be a system application or a third-party application, which is not limited herein.
  • the system desktop of the mobile terminal is generally divided into multiple sub-interfaces; when the interface where the joint touch action occurs is any one of the sub-interfaces, it can be determined that the interface where the joint touch action occurs is the system desktop.
  • For example, the interface where the joint touch action occurs is the system desktop
  • the gesture type corresponding to the joint touch action is the C track gesture
  • the preset function corresponding to the C track gesture is to call the camera application: the running state of the camera application is obtained; if the camera application is not started, the camera application is started and the camera interface is displayed on the terminal device; if the camera application is running in the background state, the camera application is closed.
  • Or, the gesture type corresponding to the joint touch action is the S track gesture, and the preset function corresponding to the S track gesture is to call the short message application: the running state of the short message application is obtained; if the short message application is not started, the short message application is started and the short message editing interface is displayed on the terminal device; if the short message application is running in the background state, the short message application is closed.
  • Or, the joint touch action is a single-point three-click gesture, and the function corresponding to the single-point three-click gesture is to call the music player application: the running state of the music player application is obtained; if the music player application is not started, it is started and the music player's song list interface is displayed on the terminal device; if the music player application is in the background running state, the music player application is closed.
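The desktop examples above all follow one toggle rule: start the associated application if it is not running, close it if it is running in the background. A minimal sketch, with the three-state model assumed for illustration:

```python
def toggle_associated_application(running_state):
    """Decide what to do with the application associated with a joint
    gesture performed on the system desktop. The state names are an
    assumed simplification of the running states in the text."""
    if running_state == "not_started":
        return "start"   # launch it and display its interface
    if running_state == "background":
        return "close"   # shut the background application down
    return "none"        # e.g. already in the foreground
```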
  • the gesture type in the embodiment of the present invention may be a click gesture or a swipe gesture, and the association relationship between the gesture type and the application may be customized according to the needs of the user.
  • the identifying the gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type includes:
  • the interface where the joint touch action occurs is determined, the gesture type corresponding to the joint touch action is recognized, and the interface is operated according to the gesture type.
  • the different applications present different interfaces on the terminal device, and the same application presents a plurality of different interfaces on the terminal device.
  • the embodiment of the present invention only operates on the interface where the joint touch action occurs, for example, arranging icons, replacing the theme, or a screen capture operation; the same gesture type may correspond to the same operation in different interfaces, or to different operations.
  • the interface where the joint touch action occurs is the setting interface of the instant communication application
  • the gesture type corresponding to the joint touch action is a two-click gesture
  • the two-click gesture corresponds to the screen capture function
  • the screenshot function is called to perform a screen capture operation on the current settings interface; or the interface where the joint touch action occurs is the system desktop of the terminal device, the gesture type corresponding to the joint touch action is a two-click gesture, the two-click gesture corresponds to the screen capture function, and the screenshot function is called to perform a screen capture operation on the current system desktop.
  • or the gesture type corresponding to the joint touch action is a '1' track gesture, the '1' track gesture corresponds to the icon arrangement function, and the icon arrangement function is called to rearrange the icons on the current system desktop.
  • when the joint touch action is detected, the gesture type corresponding to the joint touch action is recognized, and the preset function of the terminal device is invoked according to the gesture type, thereby adding an interaction method based on touch gestures characterized by contact area and z-axis acceleration and making the interaction methods of the terminal device richer.
  • FIG. 2 is another schematic flowchart of a method for controlling a gesture according to an embodiment of the present invention.
  • the method includes S201-S209.
  • the terminal device presets gesture types corresponding to multiple joint touch actions, and the user can associate different functions with different gesture types by selecting them on the terminal device. At the same time, the terminal device can also learn new gesture types and save the new gesture types locally.
  • the preset gesture types of the terminal device include: a single-point single-click gesture, a single-point double-click gesture, a two-point click gesture, an upward swipe gesture, and a downward swipe gesture, where these gestures are all joint touch actions.
  • the user can enter a mapping relationship editing interface in which the above gestures are displayed, and select one of the gestures to associate with a required function: for example, associate the single-click gesture with the operation instruction of entering the settings option, associate the double-click gesture with the operation instruction of starting a screen capture, associate the two-point click gesture with the operation instruction of moving the target application to the background, associate the upward swipe gesture with the operation instruction of starting the browser, and associate the downward swipe gesture with the operation instruction of switching the theme.
  • the user can enter the gesture learning interface and draw a custom gesture on the gesture learning interface at least twice. If the custom gesture is a swipe gesture, learning the new gesture succeeds when the similarity of the sliding trajectories of the drawings is greater than the preset threshold; if the custom gesture is a click gesture, learning the new gesture succeeds when the number of clicks and the number of touch points are the same each time. The learned new gesture is saved to the local gesture library; when the user enters the mapping relationship editing interface, the new gesture is displayed in the mapping relationship editing interface for the user to associate an operation instruction with the new gesture.
  • For example, the new gesture to be learned is drawing the letter C: the user enters the gesture learning interface and draws the letter C twice on the gesture learning interface, obtaining a sliding track from each drawing.
  • each sliding trajectory is fitted into a plane figure, and a similarity algorithm is used to compare the similarity between the two; the greater the similarity, the closer the shapes of the two sliding trajectories. If the calculated similarity is greater than the preset threshold, learning the new gesture is successful.
  • the gesture of drawing the letter C that needs to be learned is a three-click gesture
  • a three-click gesture is performed twice on the gesture learning interface, and the touch points are compared twice. The number is 3 times and the number of clicks on each touch point is 1 time, indicating that the three-click gesture is successful.
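The tap-gesture learning rule described above (same touch points and same per-point tap count in every drawing) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the per-point dictionary representation is an assumption, and the swipe-similarity branch is omitted.

```python
def learn_tap_gesture(samples):
    """Validate a custom tap gesture drawn at least twice.

    samples: list of drawings; each drawing maps a touch point to its
    tap count, e.g. {'p1': 1, 'p2': 1, 'p3': 1} for a three-point tap.
    Learning succeeds when every drawing has the same touch points and
    the same tap count per point.
    """
    return len(samples) >= 2 and all(s == samples[0] for s in samples[1:])

# Two identical three-point-tap drawings: learning succeeds
ok = learn_tap_gesture([{'p1': 1, 'p2': 1, 'p3': 1},
                        {'p1': 1, 'p2': 1, 'p3': 1}])
```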
  • It should be noted that the preset function may be launching/closing an application on the terminal device or starting a function within an application.
  • The touch action of a finger on the touch screen of the terminal device can be detected in the bright-screen state, and a touch signal is generated; the bright-screen state includes the locked bright-screen state and the unlocked bright-screen state.
  • the touch screen can be a dedicated touch screen or a display device with a touch function.
  • In multi-touch technology, a touch screen may receive the touch actions of one or more touch points through a touch-event handling function and generate corresponding gestures, with different gestures indicating different operations. The recognition and acquisition of touch gestures may differ depending on the working principle of the touch technology, which is not limited by the present invention.
  • The basic touch actions of a finger may include actions such as press (down), move, and lift (up); different combinations of basic touch actions form different types of gestures. For example, a tap gesture consists of the two basic touch actions press and lift, and a swipe gesture consists of the three basic touch actions press, move, and lift.
  • Illustratively, the touch screen of the terminal device is a capacitive touch screen whose four corners are provided with four electrodes, and a high-frequency signal is present on the surface of the capacitive touch screen. When a finger contacts the touch screen, a coupling capacitance is generated at the contact region, producing a small current flowing from the contact region toward the finger; the currents on the four electrodes then change, so the touch action on the terminal device can be detected by detecting the changes in the currents on the electrodes.
  • It should be understood that the present invention is not limited to touch actions of a finger on the touch screen; the touch action may also be performed by another object, and touch screens using different touch technologies can sense touch actions of different types of objects.
  • The contact area refers to the area of the contact region generated when the finger contacts the touch screen; when the finger is first pressed down, the area of the first-press contact region is acquired. When the finger first touches the touch screen it has a certain velocity in the z-axis direction; since the touch screen is rigid, this velocity decays to zero within a very short time, producing an acceleration in the z-axis direction, and the greater the finger's z-axis velocity, the greater the acceleration produced. The z-axis in the embodiments of the present invention refers to the direction perpendicular to the touch screen.
  • The contact area may be calculated as follows: acquire the number m of pixels in the contact region of the touch action on the touch screen and the number n of pixels of the touch screen, where n is fixed and determined by the resolution of the touch screen; and acquire the area s of the touch screen, which is likewise fixed and can be pre-stored in the ROM of the terminal device. The area of the contact region is then calculated by the formula s*(m/n).
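The area formula s*(m/n) above can be expressed directly in code. A minimal sketch; the example screen resolution and physical area are assumptions for illustration only.

```python
def contact_area(m, n, s):
    """Estimate the contact area of a touch.

    m: number of pixels inside the contact region
    n: total number of pixels of the touch screen (fixed by resolution)
    s: physical area of the touch screen (fixed, here in cm^2)
    """
    return s * (m / n)

# Hypothetical example: a 1080x1920 screen of about 83.6 cm^2,
# with 5000 pixels inside the contact region
area = contact_area(m=5000, n=1080 * 1920, s=83.6)
```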
  • The z-axis acceleration of the touch action is acquired through the gravity acceleration sensor built into the terminal device; such sensors are now commonly built into terminal devices such as smartphones and tablet computers.
  • It should be noted that the z-axis acceleration in the embodiments of the present invention refers to the absolute value of the change in acceleration relative to the gravitational acceleration. As is known to those skilled in the art, the gravitational acceleration is 9.8 m/s²; the z-axis acceleration is the absolute value of the change relative to this 9.8 m/s² baseline, and the change may be in the positive or the negative direction.
  • The number m of pixels in the contact region of the touch action on the touch screen may be acquired as follows, taking a capacitive touch screen as an example. According to the principle of the capacitive touch screen, when the user's finger touches the screen, the human-body electric field forms a coupling capacitance between the finger and the contact region, on which a high-frequency signal is present; the finger therefore draws a small current, which flows out of the electrodes at the four corners of the capacitive touch screen, and theoretically the current through each of the four electrodes is inversely proportional to the distance from the finger to the corresponding corner. The contact region of the touch action can be computed from the currents on the four electrodes; after denoising and curve-fitting the contact region into a plane figure, the number m of pixels within the figure can be counted.
  • The contact area calculated in S203 is compared with the preset area, and the z-axis acceleration obtained in S203 is compared with the preset acceleration. If both comparison results satisfy the greater-than condition, proceed to S205; otherwise execute S206. The preset area and the preset acceleration may be set as needed, which is not limited by the present invention.
  • It can be understood that the joint touch action in the embodiments of the present invention is a newly defined touch action, not necessarily triggered by a finger joint; it may also be triggered by another object striking the touch screen at high speed. Any action satisfying the above constraints on contact area and z-axis acceleration may be called a joint touch action in the embodiments of the present invention.
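The determination step above reduces to a two-threshold test: a touch qualifies as a joint touch action only when both the contact area and the z-axis acceleration exceed their presets. A minimal sketch; the threshold values are placeholders, as the text leaves them to be set as needed.

```python
def is_joint_touch(contact_area, z_accel, preset_area, preset_accel):
    # A joint touch action requires BOTH constraints to hold:
    # contact area above the preset area AND z-axis acceleration
    # above the preset acceleration.
    return contact_area > preset_area and z_accel > preset_accel
```

A touch failing either test is handled as an ordinary touch, dispatched by position as in the prior art.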
  • Invoking a preset function of the terminal device by the position of a touch action has been disclosed in detail in the prior art and is not described again here.
  • During the touch operation, the corresponding gesture type is identified from the basic touch actions contained in the joint touch action. For example, a single-point single-tap gesture is one press action followed by one lift action at a single touch point; a single-point double-tap gesture is two press-and-lift pairs at a single touch point, with the interval between them shorter than a preset duration; a swipe gesture is a press action at one touch point and a lift action at another touch point, with a sliding trajectory formed during the movement.
  • For example, if the sliding trajectory of the swipe gesture is a closed figure and the operation instruction associated with closed figures in the mapping relationship library of the terminal device is starting a screen capture, the screen-capture operation is performed: a resizable screenshot frame is displayed on the user interface, and when the user confirms saving the screenshot, the image inside the frame is saved to the specified location of the terminal device. If the sliding trajectory is a non-closed figure and the associated operation instruction in the mapping relationship library is theme switching, the theme-switching operation is performed: an interface containing multiple theme thumbnails is displayed, and after the user selects the desired theme, the current theme is switched to the selected one.
  • If the sliding trajectory of the swipe gesture is a circle and the operation instruction associated with it in the mapping relationship library of the terminal device is capturing part of the screen, a screenshot is taken of the area enclosed by the circle and the captured image is saved to the specified location of the terminal device.
  • It can be understood that, to support more operation instructions, the shape of the sliding trajectory can be subdivided further, with different shapes corresponding to different operation instructions, for example trajectories in the shape of letters or digits, which is not limited by the present invention.
  • For example, if the tap gesture is a single-point single-tap gesture and the operation instruction associated with it in the mapping relationship library of the terminal device is entering the application settings, the operation of entering the application settings is performed; if the tap gesture is a single-point double-tap gesture and the associated operation instruction is launching the browser, the operation of launching the browser is performed; if the tap gesture is a two-point tap gesture and the associated operation is starting the camera, the operation of starting the camera is performed; if the tap gesture is a single-point triple-tap gesture and the associated operation instruction is capturing the full screen, the full-screen capture is performed and the captured image is saved to the specified location of the terminal device.
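The mapping-relationship-library lookups above amount to a table from gesture type to operation instruction. A minimal sketch; the dictionary contents mirror the examples in the text and are user-customizable, and an unmapped gesture triggers no operation.

```python
# Hypothetical mapping relationship library: gesture type -> operation
mapping_library = {
    'single-point single-tap': 'enter application settings',
    'single-point double-tap': 'launch browser',
    'two-point tap':           'start camera',
    'single-point triple-tap': 'capture full screen',
}

def dispatch(gesture_type):
    # Query the operation instruction associated with the gesture type;
    # returns None (no operation) when the gesture is not in the library.
    return mapping_library.get(gesture_type)

op = dispatch('two-point tap')
```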
  • By implementing the embodiments of the present invention, the joint touch action is recognized from the contact area and the z-axis acceleration occurring on the touch screen, the gesture type corresponding to the joint touch action is identified, and a preset function of the terminal device is invoked according to the gesture type. This adds an interaction method for terminal devices based on touch gestures with contact area and z-axis acceleration, making the interaction methods of terminal devices richer.
  • FIG. 3 is a schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
  • the gesture control apparatus includes: a detection module 30, an acquisition module 31, a determination module 32, and a calling module 33.
  • the detecting module 30 is configured to detect a touch action on the touch screen of the terminal device.
  • the acquiring module 31 is configured to acquire a contact area of the touch action on the touch screen and a z-axis acceleration generated when contacting the touch screen.
  • the determining module 32 is configured to determine that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration.
  • The calling module 33 is configured to identify the gesture type corresponding to the joint touch action and invoke a preset function of the terminal device according to the gesture type.
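The detection, acquisition, determination, and calling modules (30 to 33) form a pipeline that the following sketch illustrates in miniature. The class name, threshold values, and in-memory mapping library are assumptions for illustration, not the apparatus itself.

```python
class GestureControlApparatus:
    """Sketch of the pipeline formed by modules 30-33:
    detect -> acquire (area, z-accel) -> determine joint touch -> call."""

    def __init__(self, preset_area, preset_accel, mapping_library):
        self.preset_area = preset_area
        self.preset_accel = preset_accel
        self.mapping_library = mapping_library

    def handle_touch(self, contact_area, z_accel, gesture_type):
        # Determination module: joint touch only if both thresholds exceeded
        if contact_area > self.preset_area and z_accel > self.preset_accel:
            # Calling module: invoke the preset function for the gesture type
            return self.mapping_library.get(gesture_type)
        return None  # ordinary touch: dispatched by position (not shown)

apparatus = GestureControlApparatus(
    preset_area=1.5, preset_accel=10.0,
    mapping_library={'upward swipe': 'launch browser'})
result = apparatus.handle_touch(contact_area=2.0, z_accel=12.0,
                                gesture_type='upward swipe')
```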
  • The gesture control apparatus of this embodiment of the present invention is used to perform the gesture control method of method embodiment 1. It is based on the same concept as that method embodiment and produces the same technical effects; for the specific process, refer to the description of method embodiment 1, which is not repeated here.
  • FIG. 4 is a schematic diagram of another structure of a gesture control apparatus according to an embodiment of the present invention.
  • The gesture control apparatus includes, in addition to the detection module 30, the acquisition module 31, the determination module 32, and the calling module 33, a mapping module 34.
  • the mapping module 34 is configured to customize a mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and save the mapping relationship to the mapping relationship library.
  • Optionally, the calling module 33 includes: a first determining unit 331 and a first calling unit 332.
  • the first determining unit 331 is configured to determine an interface where the joint touch action occurs, and an application to which the interface belongs.
  • the first invoking unit 332 is configured to identify a gesture type corresponding to the joint touch action, and invoke a preset function corresponding to the application according to the gesture type.
  • the calling module 33 includes: a second determining unit 333 and a second calling unit 334.
  • the second determining unit 333 is configured to determine an interface where the joint touch action occurs.
  • the second invoking unit 334 is configured to: if the interface where the joint action occurs is a system desktop of the terminal device, query an application associated with the gesture type, and start or close the application.
  • the calling module 33 includes: a third determining unit 335 and a third calling unit 336.
  • The third determining unit 335 is configured to determine the interface on which the joint touch action occurs.
  • The third calling unit 336 is configured to identify the gesture type corresponding to the joint touch action and operate on the interface according to the gesture type, where the operation includes screen capture, icon arrangement, or theme replacement.
  • Optionally, the gesture type corresponding to the touch action includes a tap gesture or a swipe gesture; the tap gesture includes at least one of a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; the sliding trajectory includes at least one of a closed trajectory and a non-closed trajectory.
  • the obtaining module 31 includes: a first obtaining unit 311, a calculating unit 312, and a second acquiring unit 313.
  • The first acquiring unit 311 is configured to acquire the number m of pixels in the contact region of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen.
  • the calculating unit 312 is configured to calculate that the contact area of the touch action on the touch screen is s*(m/n).
  • the second acquiring unit 313 is configured to acquire the z-axis acceleration of the touch action by using a gravity acceleration sensor that is provided by the terminal device.
  • The gesture control apparatus of this embodiment of the present invention is used to perform the gesture control method of method embodiment 2. It is based on the same concept as method embodiments 1 and 2 and produces the same technical effects; for the specific process, refer to the descriptions of method embodiments 1 and 2, which are not repeated here.
  • FIG. 7 is another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
  • a gesture control apparatus is used to implement a gesture control method according to an embodiment of the present invention.
  • The gesture control apparatus includes a processor 71, a memory 72, and a communication interface 73; the number of processors 71 in the gesture control apparatus may be one or more, and FIG. 7 takes one processor as an example. In some embodiments of the present invention, the processor 71, the memory 72, and the communication interface 73 may be connected by a bus or in other ways, and FIG. 7 takes a bus connection as an example.
  • The memory 72 stores a set of program code, and the processor 71 is configured to call the program code stored in the memory 72 to perform the following operations: detecting a touch action on the touch screen of the terminal device; acquiring the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen; if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action; and identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type.
  • In some embodiments of the present invention, the processor 71 performing the identifying of the gesture type corresponding to the joint touch action and invoking the corresponding preset function of the terminal device according to the gesture type includes: determining the interface on which the joint touch action occurs and the application to which the interface belongs; and identifying the gesture type corresponding to the joint touch action and invoking a preset function of the application according to the gesture type.
  • In some embodiments of the present invention, the processor 71 performing the identifying of the gesture type corresponding to the joint touch action and invoking the corresponding preset function of the terminal device according to the gesture type includes: determining the interface on which the joint touch action occurs; and, if the interface on which the joint action occurs is the system desktop of the terminal device, querying the application associated with the gesture type and launching or closing the application.
  • In some embodiments of the present invention, the processor 71 performing the identifying of the gesture type corresponding to the joint touch action and invoking the corresponding preset function of the terminal device according to the gesture type includes: determining the interface on which the joint touch action occurs; and identifying the gesture type corresponding to the joint touch action and operating on the interface according to the gesture type, where the operation includes screen capture, icon arrangement, or theme replacement.
  • In some embodiments of the present invention, the gesture type corresponding to the touch action includes a tap gesture or a swipe gesture; the tap gesture includes at least one of a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; the sliding trajectory includes at least one of a closed trajectory and a non-closed trajectory.
  • In some embodiments of the present invention, the processor 71 performing the acquiring of the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen includes: acquiring the number m of pixels in the contact region of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen; calculating the contact area of the touch action on the touch screen as s*(m/n); and acquiring the z-axis acceleration of the touch action through the gravity acceleration sensor built into the terminal device.
  • In some embodiments of the present invention, the processor 71 is further configured to perform: customizing the mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and saving the mapping relationship to the mapping relationship library.
  • By implementing the embodiments of the present invention, the joint touch action is recognized from the contact area and the z-axis acceleration occurring on the touch screen, the gesture type corresponding to the joint touch action is identified, and a preset function of the terminal device is invoked according to the gesture type. This adds an interaction method for terminal devices based on touch gestures with contact area and z-axis acceleration, making the interaction methods of terminal devices richer.
  • the embodiment of the present invention further provides a terminal device, including a touch screen 83, a gravity sensor 82, and a gesture control device 81.
  • the touch screen 83, the gravity sensor 82, and the gesture control device 81 can be connected through a bus, or can be connected by other means.
  • The gesture control device 81 is the gesture control apparatus described in apparatus embodiments 1 and 2; the gravity sensor 82 is configured to acquire the z-axis acceleration generated when a touch action contacts the touch screen.
  • The terminal device may be a smartphone, a tablet computer, or another device with a touch screen. For the specific working process of the terminal device, refer to the descriptions of method embodiments 1 and 2, which are not repeated here.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

Embodiments of the present invention disclose a gesture control method, including: detecting a touch action on a touch screen of a terminal device; acquiring the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen; if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action; and identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type. Correspondingly, embodiments of the present invention further provide a gesture control apparatus and a terminal device, which can enrich the interaction modes of terminal devices.

Description

Gesture control method, apparatus, terminal device, and storage medium

Technical Field
Embodiments of the present invention relate to the field of touch control, and in particular to a gesture control method, apparatus, terminal device, and storage medium.
Background Art
As a human-computer interaction device, the touch screen is used ever more widely. A user performs basic touch actions such as press, move, or lift on the touch screen to generate various types of touch gestures. Because of its convenience, the most-used touch gesture on existing terminal devices is the tap gesture: for an input tap gesture, the terminal device determines, from the position of the input tap gesture, the application corresponding to that position and triggers the application to perform the corresponding operation; if the position corresponds to no application, no operation is triggered.
At present, terminal devices recognize touch gestures based on position and trajectory. As the number of functions and applications on terminal devices grows, such two-dimensional, plane-based touch gestures increasingly fail to meet the interaction needs of terminal devices, and a new interaction mode is urgently needed.
Summary of the Invention
The technical problem to be solved by embodiments of the present invention is to provide a gesture control method, apparatus, terminal device, and storage medium that can enrich the interaction modes of terminal devices.
To solve the above technical problem, a first aspect of the embodiments of the present invention provides a gesture control method, including:
detecting a touch action on a touch screen of a terminal device;
acquiring the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen;
if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action;
identifying the gesture type corresponding to the joint touch action, and invoking a preset function of the terminal device according to the gesture type.
With reference to the first aspect, in a first possible implementation, the identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs, and the application to which the interface belongs;
identifying the gesture type corresponding to the joint touch action, and invoking a preset function of the application according to the gesture type.
With reference to the first aspect, in a second possible implementation, the identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs;
if the interface on which the joint action occurs is the system desktop of the terminal device, querying the application associated with the gesture type, and launching or closing the application.
With reference to the first aspect, in a third possible implementation, the identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs;
identifying the gesture type corresponding to the joint touch action, and operating on the interface according to the gesture type, where the operation includes screen capture, icon arrangement, or theme replacement.
With reference to the first aspect or any one of the first to third possible implementations, in a fourth possible implementation, the gesture type corresponding to the touch action includes a tap gesture or a swipe gesture; the tap gesture includes at least one of a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; the sliding trajectory includes at least one of a closed trajectory and a non-closed trajectory.
With reference to the first aspect or any one of the first to fourth possible implementations, in a fifth possible implementation, the acquiring the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen includes:
acquiring the number m of pixels in the contact region of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen;
calculating the contact area of the touch action on the touch screen as s*(m/n);
acquiring the z-axis acceleration of the touch action through the gravity acceleration sensor built into the terminal device.
With reference to the first aspect or any one of the first to fourth possible implementations, in a sixth possible implementation, before the detecting a touch action on the touch screen of the terminal device, the method further includes:
customizing the mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and saving the mapping relationship to the mapping relationship library.
A second aspect of the embodiments of the present invention provides a gesture control apparatus, including:
a detection module, configured to detect a touch action on a touch screen of a terminal device;
an acquisition module, configured to acquire the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen;
a determination module, configured to determine that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
a calling module, configured to identify the gesture type corresponding to the joint touch action and invoke a preset function of the terminal device according to the gesture type.
With reference to the second aspect, in a first possible implementation, the calling module includes:
a first determining unit, configured to determine the interface on which the joint touch action occurs and the application to which the interface belongs;
a first calling unit, configured to identify the gesture type corresponding to the joint touch action and invoke a preset function of the application according to the gesture type.
With reference to the second aspect, in a second possible implementation, the calling module includes:
a second determining unit, configured to determine the interface on which the joint touch action occurs;
a second calling unit, configured to, if the interface on which the joint action occurs is the system desktop of the terminal device, query the application associated with the gesture type and launch or close the application.
With reference to the second aspect, in a third possible implementation, the calling module includes:
a third determining unit, configured to determine the interface on which the joint touch action occurs;
a third calling unit, configured to identify the gesture type corresponding to the joint touch action and operate on the interface according to the gesture type, where the operation includes screen capture, icon arrangement, or theme replacement.
With reference to the second aspect or any one of the first to third possible implementations, in a fourth possible implementation, the gesture type corresponding to the touch action includes a tap gesture or a swipe gesture; the tap gesture includes at least one of a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; the sliding trajectory includes at least one of a closed trajectory and a non-closed trajectory.
With reference to the second aspect or any one of the first to fourth possible implementations, in a fifth possible implementation, the acquisition module includes:
a first acquiring unit, configured to acquire the number m of pixels in the contact region of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen;
a calculating unit, configured to calculate the contact area of the touch action on the touch screen as s*(m/n);
a second acquiring unit, configured to acquire the z-axis acceleration of the touch action through the gravity acceleration sensor built into the terminal device.
With reference to the second aspect or any one of the first to fifth possible implementations, in a sixth possible implementation, the apparatus further includes a mapping module, configured to customize the mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and save the mapping relationship to the mapping relationship library.
A third aspect of the embodiments of the present invention provides a gesture recognition apparatus, including a processor and a memory, where the memory stores a set of program code, and the processor calls the program code stored in the memory to perform the following operations:
detecting a touch action on a touch screen of a terminal device;
acquiring the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen;
if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action;
identifying the gesture type corresponding to the joint touch action, and invoking a preset function of the terminal device according to the gesture type.
With reference to the third aspect, in a first possible implementation, the processor performing the identifying the gesture type corresponding to the joint touch action and invoking the corresponding preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs, and the application to which the interface belongs;
identifying the gesture type corresponding to the joint touch action, and invoking a preset function of the application according to the gesture type.
With reference to the third aspect, in a second possible implementation, the processor performing the identifying the gesture type corresponding to the joint touch action and invoking the corresponding preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs;
if the interface on which the joint action occurs is the system desktop of the terminal device, querying the application associated with the gesture type, and launching or closing the application.
With reference to the third aspect, in a third possible implementation, the processor performing the identifying the gesture type corresponding to the joint touch action and invoking the corresponding preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs;
identifying the gesture type corresponding to the joint touch action, and operating on the interface according to the gesture type, where the operation includes screen capture, icon arrangement, or theme replacement.
With reference to the third aspect or any one of the first to third possible implementations, in a fourth possible implementation, the gesture type corresponding to the touch action includes a tap gesture or a swipe gesture; the tap gesture includes at least one of a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; the sliding trajectory includes at least one of a closed trajectory and a non-closed trajectory.
With reference to the third aspect or any one of the first to fourth possible implementations, in a fifth possible implementation, the processor performing the acquiring the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen includes:
acquiring the number m of pixels in the contact region of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen;
calculating the contact area of the touch action on the touch screen as s*(m/n);
acquiring the z-axis acceleration of the touch action through the gravity acceleration sensor built into the terminal device.
With reference to the third aspect or any one of the first to fifth possible implementations, in a sixth possible implementation, the processor is further configured to perform:
customizing the mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and saving the mapping relationship to the mapping relationship library.
A fifth aspect of the embodiments of the present invention provides a terminal device, including any one of the above gesture control apparatuses, a touch screen, and a gravity sensor.
A sixth aspect of the embodiments of the present invention provides a storage medium for controlling a computer device to perform a gesture control method, the method including the following steps:
detecting a touch action on a touch screen of a terminal device;
acquiring the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen;
if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action;
identifying the gesture type corresponding to the joint touch action, and invoking a preset function of the terminal device according to the gesture type.
Implementing the present invention has the following beneficial effects:
By acquiring the contact area and the z-axis acceleration occurring on the touch screen, the joint touch action is recognized, the gesture type corresponding to the joint touch action is identified, and a preset function of the terminal device is invoked according to the gesture type. This adds an interaction method for terminal devices based on touch gestures with contact area and z-axis acceleration, making the interaction methods of terminal devices richer.
Brief Description of the Drawings
To illustrate the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of a gesture control method according to an embodiment of the present invention;
FIG. 2 is another schematic flowchart of a gesture control method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention;
FIG. 4 is another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention;
FIG. 5a is a schematic structural diagram of the calling module in FIG. 4;
FIG. 5b is another schematic structural diagram of the calling module in FIG. 4;
FIG. 5c is yet another schematic structural diagram of the calling module in FIG. 4;
FIG. 6 is a schematic structural diagram of the acquisition module in FIG. 4;
FIG. 7 is yet another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to FIG. 1, which is a schematic flowchart of a gesture control method according to an embodiment of the present invention, in this embodiment the method includes S101-S104.

S101. Detect a touch action on the touch screen of a terminal device.

Specifically, the touch action of a finger on the touch screen of the terminal device can be detected in the bright-screen state or the black-screen state, and a touch signal is generated. The bright-screen state is the case where the backlight of the touch screen is lit, and includes the locked bright-screen state and the unlocked bright-screen state; the black-screen state is the case where the backlight of the touch screen is off. The touch screen can be a dedicated touch screen or a display device with a touch function. In multi-touch technology, the touch screen can receive the touch actions of one or more touch points through a touch-event handling function and generate corresponding gestures, with different gestures indicating different operations. The recognition and acquisition of touch gestures may differ depending on the working principle of the touch technology, which is not limited by the present invention. The basic touch actions of a finger may include actions such as press (down), move, and lift (up); different combinations of basic touch actions form different types of gestures. For example, a tap gesture consists of the two basic touch actions press and lift, and a swipe gesture consists of the three basic touch actions press, move, and lift. When a finger contacts the touch screen, a touch signal is generated on the touch screen, and the touch action on the touch screen of the terminal device is detected according to the touch signal.

It should be understood that the present invention is not limited to touch actions of a finger on the touch screen; the touch action may also be performed by another object, and touch screens using different touch technologies can sense touch actions of different types of objects.

S102. Acquire the contact area of the touch action on the touch screen and the z-axis acceleration generated when contacting the touch screen.

Specifically, the contact area is the area of the contact region generated when the finger contacts the touch screen; when the finger is first pressed down, the area of the first-press contact region is acquired. When the finger first touches the touch screen it has a certain velocity in the z-axis direction; since the touch screen is rigid, this velocity decays to zero within a very short time, producing an acceleration in the z-axis direction, and the greater the finger's z-axis velocity, the greater the acceleration produced. Preferably, the touch action is the action of a finger joint contacting the touch screen.

The z-axis in the embodiments of the present invention refers to the direction perpendicular to the touch screen. It can be understood that when the finger contacts the touch screen at a certain velocity, the direction of this velocity is not exactly perpendicular to the screen, so the acquired z-axis acceleration can be corrected using the cosine of the angle; the angle cannot be measured directly and can be set according to empirical values.
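The cosine correction mentioned above can be sketched as follows. This is one plausible reading of the text (projecting the measured value by the cosine of an empirically set angle between the finger's motion and the screen normal); the function name, the direction of the correction, and the example angle are all assumptions.

```python
import math

def corrected_z_accel(measured_accel, angle_deg):
    """Correct a measured acceleration by the cosine of the assumed
    angle between the finger's velocity and the screen normal; the
    angle is set empirically, since it cannot be measured directly."""
    return measured_accel * math.cos(math.radians(angle_deg))

# With the finger perpendicular to the screen the value is unchanged
a_perpendicular = corrected_z_accel(10.0, 0.0)
```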
S103. If the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determine that the touch action is a joint touch action.

Specifically, if the contact area acquired in S102 is greater than the preset area and the z-axis acceleration is greater than the preset acceleration, the touch action on the touch screen is determined to be a joint touch action. It can be understood that the joint touch action in the embodiments of the present invention is a newly defined touch action, not necessarily triggered by a finger joint; it may also be triggered by another object striking the touch screen at high speed. Any action satisfying the above constraints on contact area and z-axis acceleration may be called a joint touch action in the embodiments of the present invention.

S104. Identify the gesture type corresponding to the joint touch action, and invoke a preset function of the terminal device according to the gesture type.

Specifically, a touch action consists of at least one basic touch action. According to the gesture type corresponding to the joint touch action, the corresponding preset function of the terminal device is invoked. The preset functions of the terminal device include launching/closing an application on the terminal device or invoking a function within an application, for example opening a corresponding application on the terminal device (including entering the settings menu, starting the camera, starting a screenshot, or changing the theme), or invoking the reply function in an SMS application, the video-call function in an instant-messaging application, the screenshot function in a browser application, and so on. The functions corresponding to different gesture types can be customized in advance by the user. An application may be a system application or a third-party application; a system application is an application shipped with the operating system of the terminal device (for example, functions such as dialing, settings, or entering the SMS interface), and a third-party application is an application installed on the terminal device.

Optionally, the identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type includes:

determining the interface on which the joint touch action occurs, and the application to which the interface belongs;

identifying the gesture type corresponding to the joint touch action, and invoking a preset function of the application according to the gesture type.

Specifically, an application presents multiple different interfaces on the terminal device, and an application includes multiple preset functions. When a joint touch action occurs on the terminal device, the interface on which it occurs and the application to which that interface belongs are determined; the gesture type corresponding to the joint touch action is identified, the gesture type including a tap gesture or a swipe gesture, and a preset function within the application is invoked according to the gesture type.

For example, the interface on which the joint touch action occurs is the interface of instant-messaging software and the corresponding gesture type is a two-point tap gesture, whose preset function within the instant-messaging software is partial screen capture: the screen-capture function is started, a resizable capture region is displayed on the interface where the joint touch action occurred, and after the user adjusts the region as required, the captured image is saved to the specified location of the terminal device. Or the interface is the interface of instant-messaging software and the corresponding gesture type is a three-point tap gesture, whose preset function within the instant-messaging software is full-screen capture: the entire screen is captured into an image, which is saved to the specified location of the terminal device.

As another example, the interface on which the joint touch action occurs is the interface of camera application software and the corresponding gesture type is an S-trajectory gesture, whose preset function within the camera software is taking a photograph, so the photograph function is started; or an M-trajectory gesture corresponds, within the camera software, to the video-recording function.

It should be noted that in different types of applications the preset functions corresponding to the same gesture type may or may not be the same, and the preset function corresponding to a gesture type can be customized as needed.

Optionally, the identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type includes:

determining the interface on which the joint touch action occurs;

if the interface on which the joint action occurs is the system desktop of the terminal device, querying the application associated with the gesture type, and launching or closing the application.

Specifically, when a joint touch action occurs on the terminal device, the interface on which it occurs is determined and its type is judged. If the interface is the system desktop of the terminal device, the application associated with the gesture type is queried and its running state is acquired: if the associated application has not been started, it is started and its interface is displayed on the terminal device; if the associated application is running in the background, it is closed. The application in the embodiments of the present invention may be a system application or a third-party application, which is not limited here. It can be understood that on a mobile terminal, because of the limited size of the touch screen, the system desktop is generally divided into multiple sub-interfaces; when the interface on which the joint touch action occurs is any one of these sub-interfaces, the interface can be determined to be the system desktop.

For example, when a joint touch action is detected on the terminal device and the interface on which it occurs is determined to be the system desktop: if the corresponding gesture type is identified as a C-trajectory gesture whose preset function is queried to be invoking the camera application, the running state of the camera application is acquired; if it has not been started, it is started and the camera interface is displayed on the terminal device, and if it is running in the background, it is closed. Or the corresponding gesture type is identified as an S-trajectory gesture whose preset function is invoking the SMS application: if the SMS application has not been started, it is started and the SMS editing interface is displayed on the terminal device, and if it is running in the background, it is closed. Or the corresponding gesture type is identified as a single-point triple-tap gesture whose associated application is the music player: if the music player application has not been started, it is started and the song-list interface of the music player is displayed on the terminal device, and if it is running in the background, it is closed. The gesture type in the embodiments of the present invention may be a tap gesture or a swipe gesture, and the association between gesture types and applications can be customized according to the user's needs.
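The launch-or-close behavior on the system desktop described above reduces to a toggle on the application's running state. A minimal sketch; the state labels are hypothetical names for "not started" and "running in the background".

```python
def toggle_app(app_state):
    """On the system desktop, a joint gesture toggles its associated
    application: launch it when not started, close it when running
    in the background."""
    if app_state == 'not-started':
        return 'launch'
    if app_state == 'background':
        return 'close'
    return 'no-op'  # e.g. already in the foreground: nothing specified
```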
Optionally, the identifying the gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type includes:

determining the interface on which the joint touch action occurs;

identifying the gesture type corresponding to the joint touch action, and operating on the interface according to the gesture type.

Specifically, when a joint touch action occurs on the terminal device, the interface on which it occurs is determined, the corresponding gesture type is identified, and the interface is operated on according to the gesture type. Different applications present different interfaces on the terminal device, and the same application presents multiple different interfaces; this embodiment of the present invention operates only on the interface on which the joint touch action occurs, for example arranging icons, changing the theme, or taking a screenshot. The operation corresponding to the same gesture type may or may not be the same on different interfaces.

For example, the interface on which the joint touch action occurs is the settings interface of an instant-messaging application and the corresponding gesture type is a two-point tap gesture associated with the screenshot function; the screenshot function is then invoked to capture the current settings interface. Or the interface is the system desktop of the terminal device and the corresponding gesture type is a two-point tap gesture associated with the screenshot function; the screenshot function is invoked to capture the current system desktop. Or the interface is the system desktop and the corresponding gesture type is a 1-trajectory gesture associated with icon arrangement; the icon-arrangement function is then invoked to rearrange the current system desktop.

By implementing the embodiments of the present invention, the joint touch action is recognized from the contact area and the z-axis acceleration occurring on the touch screen, the gesture type corresponding to the joint touch action is identified, and a preset function of the terminal device is invoked according to the gesture type. This adds an interaction method for terminal devices based on touch gestures with contact area and z-axis acceleration, making the interaction methods of terminal devices richer.
Referring to FIG. 2, which is another schematic flowchart of a gesture control method according to an embodiment of the present invention, in this embodiment the method includes S201-S209.

S201. Customize the mapping relationships between the gesture types corresponding to joint touch actions and preset functions, and save the mapping relationships to the mapping relationship library.

Specifically, the terminal device presets gesture types corresponding to multiple joint touch actions, and the user can associate different functions on the terminal device with different gesture types by selection. The terminal device can also learn new gesture types and save them locally.

Illustratively, the preset gesture types of the terminal device include: a single-point single-tap gesture, a single-point double-tap gesture, a two-point tap gesture, an upward swipe gesture, and a downward swipe gesture, all of which are joint touch actions. The user can enter the mapping relationship editing interface, in which the above gestures are displayed, and select one of them to associate with the required function: for example, associate the single-point single-tap gesture with the operation instruction for entering the settings options, the single-point double-tap gesture with the operation instruction for starting a screen capture, the two-point tap gesture with the operation instruction for moving the target application to the background, the upward swipe gesture with the operation instruction for launching the browser, and the downward swipe gesture with the operation instruction for theme switching.

When a new gesture type needs to be learned, the user can enter the gesture learning interface and draw the custom gesture there at least twice. If the custom gesture is a swipe gesture, learning the new gesture succeeds when the similarity between the sliding trajectories of the drawings is greater than a preset threshold; if the custom gesture is a tap gesture, learning succeeds when the number of taps and the number of touch points are the same each time. The learned new gesture is saved to the local gesture library; when the user enters the mapping relationship editing interface, the new gesture is displayed there so that the user can associate an operation instruction with it.

Illustratively, the new gesture to be learned is drawing the letter C. The user enters the gesture learning interface and, following the wizard prompts, draws the letter C twice; the sliding trajectories obtained from the two drawings are fitted into plane figures, and a similarity algorithm compares the two: the greater the similarity, the closer the shapes of the two trajectories, and vice versa. If the computed similarity is greater than the preset threshold, the new gesture is learned successfully. If the new gesture to be learned is a three-point tap gesture, the user performs the three-point tap twice on the gesture learning interface as prompted; if both times the number of touch points is 3 and each touch point is tapped once, the three-point tap gesture is learned successfully.

It should be noted that the preset function may be launching/closing an application on the terminal device or starting a function within an application.

S202. Detect a touch action on the touch screen of the terminal device.

Specifically, the touch action of a finger on the touch screen of the terminal device can be detected in the bright-screen state, and a touch signal is generated; the bright-screen state includes the locked bright-screen state and the unlocked bright-screen state. The touch screen can be a dedicated touch screen or a display device with a touch function. In multi-touch technology, the touch screen can receive the touch actions of one or more touch points through a touch-event handling function and generate corresponding gestures, with different gestures indicating different operations. The recognition and acquisition of touch gestures may differ depending on the working principle of the touch technology, which is not limited by the present invention. The basic touch actions of a finger may include actions such as press (down), move, and lift (up); different combinations of basic touch actions form different types of gestures. For example, a tap gesture consists of the two basic touch actions press and lift, and a swipe gesture consists of the three basic touch actions press, move, and lift. When a finger contacts the touch screen, a touch signal is generated on the touch screen, and the touch action on the touch screen of the terminal device is detected according to the touch signal.

Illustratively, the touch screen of the terminal device is a capacitive touch screen whose four corners are provided with four electrodes, and a high-frequency signal is present on the surface of the capacitive touch screen. When a finger contacts the touch screen, a coupling capacitance is generated at the contact region, producing a small current flowing from the contact region toward the finger; the currents on the four electrodes then change, so the touch action on the terminal device can be detected by detecting the changes in the currents on the electrodes.

It should be understood that the present invention is not limited to touch actions of a finger on the touch screen; the touch action may also be performed by another object, and touch screens using different touch technologies can sense touch actions of different types of objects.

S203. Acquire the number m of pixels in the contact region of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen; calculate the contact area of the touch action on the touch screen as s*(m/n); and acquire the z-axis acceleration of the touch action through the gravity acceleration sensor built into the terminal device.

Specifically, the contact area is the area of the contact region generated when the finger contacts the touch screen; when the finger is first pressed down, the area of the first-press contact region is acquired. When the finger first touches the touch screen it has a certain velocity in the z-axis direction; since the touch screen is rigid, this velocity decays to zero within a very short time, producing an acceleration in the z-axis direction, and the greater the finger's z-axis velocity, the greater the acceleration produced. The z-axis in the embodiments of the present invention refers to the direction perpendicular to the touch screen. The contact area may be calculated as follows: acquire the number m of pixels in the contact region of the touch action on the touch screen and the number n of pixels of the touch screen, where n is fixed and determined by the resolution of the touch screen; and acquire the area s of the touch screen, which is likewise fixed and can be pre-stored in the ROM of the terminal device. The area of the contact region is then obtained by the formula s*(m/n). The z-axis acceleration of the touch action is acquired through the gravity acceleration sensor built into the terminal device; such sensors are now commonly built into terminal devices such as smartphones and tablet computers.

It should be noted that the z-axis acceleration in the embodiments of the present invention refers to the absolute value of the change in acceleration relative to the gravitational acceleration. As is known to those skilled in the art, the gravitational acceleration is 9.8 m/s²; the z-axis acceleration is the absolute value of the change relative to this baseline, and the change may be in the positive or the negative direction.

The number m of pixels in the contact region of the touch action on the touch screen may be acquired as follows, taking a capacitive touch screen as an example. According to the principle of the capacitive touch screen described above, when the user's finger touches the capacitive touch screen, the human-body electric field forms a coupling capacitance between the finger and the contact region, on which a high-frequency signal is present; the finger therefore draws a small current, which flows out of the electrodes at the four corners of the capacitive touch screen, and theoretically the current through each of the four electrodes is inversely proportional to the distance from the finger to the corresponding corner. The contact region of the touch action can be computed from the currents on the four electrodes; after denoising and curve-fitting the contact region into a plane figure, the number m of pixels within the figure can be counted.

S204. Determine whether the contact area is greater than the preset area and, at the same time, the z-axis acceleration is greater than the preset acceleration.

Specifically, the contact area calculated in S203 is compared with the preset area, and the z-axis acceleration obtained in S203 is compared with the preset acceleration. If both comparison results satisfy the greater-than condition, execute S205; otherwise execute S206. The preset area and the preset acceleration may be set as needed, which is not limited by the present invention.

S205. Determine that the touch action is a joint touch action.

The joint touch action in the embodiments of the present invention is a newly defined touch action, not necessarily triggered by a finger joint; it may also be triggered by another object striking the touch screen at high speed. Any action satisfying the above constraints on contact area and z-axis acceleration may be called a joint touch action in the embodiments of the present invention.

S206. Determine that the touch action is not a joint touch action, acquire the position of the touch action, and invoke a function of the terminal device according to the position.

Invoking a preset function of the terminal device by the position of a touch action has been disclosed in detail in the prior art and is not described again here.

S207. Identify the gesture type corresponding to the joint touch action.

During the touch operation, the corresponding gesture type is identified from the basic touch actions contained in the joint touch action. For example, a single-point single-tap gesture is one press action followed by one lift action at a single touch point; a single-point double-tap gesture is two press-and-lift pairs at a single touch point, with the interval between them shorter than a preset duration; a swipe gesture is a press action at one touch point and a lift action at another touch point, with a sliding trajectory formed during the movement.

S208. Acquire the sliding trajectory of the swipe gesture, query the operation instruction associated with the sliding trajectory in the mapping relationship library of the terminal device, and perform the operation indicated by the queried operation instruction.

For example, if the sliding trajectory of the swipe gesture is a closed figure and the operation instruction associated with closed figures in the mapping relationship library of the terminal device is starting a screen capture, the screen-capture operation is performed: a resizable screenshot frame is displayed on the user interface, and when the user confirms saving the screenshot, the image inside the frame is saved to the specified location of the terminal device. If the sliding trajectory is a non-closed figure and the associated operation instruction in the mapping relationship library is theme switching, the theme-switching operation is performed: an interface containing multiple theme thumbnails is displayed, and after the user selects the desired theme, the current theme is switched to the selected one. If the sliding trajectory is a circle and the associated operation instruction in the mapping relationship library is capturing part of the screen, a screenshot is taken of the area enclosed by the circle and the captured image is saved to the specified location of the terminal device.

It can be understood that, to support more operation instructions, the shape of the sliding trajectory can be subdivided further, with different shapes corresponding to different operation instructions, for example trajectories in the shape of letters or digits, which is not limited by the present invention.

S209. Query the operation instruction associated with the tap gesture in the mapping relationship library of the terminal device, and perform the operation indicated by the queried operation instruction.

For example, if the tap gesture is a single-point single-tap gesture and the associated operation instruction in the mapping relationship library of the terminal device is entering the application settings, the operation of entering the application settings is performed; if the tap gesture is a single-point double-tap gesture and the associated operation instruction is launching the browser, the operation of launching the browser is performed; if the tap gesture is a two-point tap gesture and the associated operation is starting the camera, the operation of starting the camera is performed; if the tap gesture is a single-point triple-tap gesture and the associated operation instruction is capturing the full screen, the full-screen capture is performed and the captured image is saved to the specified location of the terminal device.

By implementing the embodiments of the present invention, the joint touch action is recognized from the contact area and the z-axis acceleration occurring on the touch screen, the gesture type corresponding to the joint touch action is identified, and a preset function of the terminal device is invoked according to the gesture type. This adds an interaction method for terminal devices based on touch gestures with contact area and z-axis acceleration, making the interaction methods of terminal devices richer.
参见图3,为本发明实施例提供的一种手势控制装置的结构示意图,在本发明实施例中,所述手势控制装置包括:检测模块30、获取模块31、确定模块32和调用模块33。
检测模块30,用于检测对终端设备的触摸屏的触摸动作。
获取模块31,用于获取所述触摸动作在所述触摸屏上的接触面积和接触所述触摸屏时产生的z轴加速度。
确定模块32,用于若所述接触面积大于预设面积且所述z轴加速度大于预设加速度,确定所述触摸动作为关节触摸动作。
调用模块33,用于识别所述关节触摸动作对应的手势类型,并根据所述 手势类型调用所述终端设备的预设功能。
本发明实施例的手势控制装置用于执行方法实施例一的手势控制方法,与方法实施例一基于同一构思,其带来的技术效果也相同,具体过程请参照方法实施例一的描述,此处不再赘述。
参见图4-图6,为本发明实施例提供的一种手势控制装置的另一结构示意图,在本发明实施例中,所述手势控制装置除包括检测模块30、获取模块31、确定模块32和调用模块33,还包括映射模块34。
映射模块34,用于自定义所述关节触摸动作对应的手势类型与预设功能的映射关系,并将所述映射关系保存至所述映射关系库中。
Optionally, the invoking module 33 includes: a first determining unit 331 and a first invoking unit 332.
The first determining unit 331 is configured to determine the interface on which the joint touch action occurs, and the application program to which the interface belongs.
The first invoking unit 332 is configured to identify the gesture type corresponding to the joint touch action, and invoke a preset function corresponding to the application program according to the gesture type.
Optionally, the invoking module 33 includes: a second determining unit 333 and a second invoking unit 334.
The second determining unit 333 is configured to determine the interface on which the joint touch action occurs.
The second invoking unit 334 is configured to: if the interface on which the joint touch action occurs is the system desktop of the terminal device, query the application program associated with the gesture type, and start or close the application program.
Optionally, the invoking module 33 includes: a third determining unit 335 and a third invoking unit 336.
The third determining unit 335 is configured to determine the interface on which the joint touch action occurs.
The third invoking unit 336 is configured to identify the gesture type corresponding to the joint touch action, and operate on the interface according to the gesture type, where the operation includes taking a screenshot, arranging icons, or changing the theme.
Optionally, the gesture type corresponding to the touch action includes a tap gesture or a slide gesture. The tap gesture includes at least one of: a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture. The slide trajectory of the slide gesture includes at least one of a closed trajectory and a non-closed trajectory.
Optionally, the obtaining module 31 includes: a first obtaining unit 311, a calculation unit 312, and a second obtaining unit 313.
The first obtaining unit 311 is configured to obtain the quantity m of pixels in the contact region of the touch action on the touchscreen, the quantity n of pixels of the touchscreen, and the area s of the touchscreen.
The calculation unit 312 is configured to calculate the contact area of the touch action on the touchscreen as s*(m/n).
The second obtaining unit 313 is configured to obtain the z-axis acceleration of the touch action by using a gravity acceleration sensor built into the terminal device.
The gesture control apparatus of this embodiment of the present invention is configured to perform the gesture control method of method embodiment 2. It is based on the same concept as method embodiments 1 and 2 and brings the same technical effects. For the specific process, refer to the descriptions of method embodiments 1 and 2, which are not repeated here.
Referring to FIG. 7, still another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention, in this embodiment the gesture control apparatus is configured to implement the gesture control method of the embodiments of the present invention, and includes a processor 71, a memory 72, and a communications interface 73. There may be one or more processors 71 in the gesture control apparatus; FIG. 7 uses one processor as an example. In some embodiments of the present invention, the processor 71, the memory 72, and the communications interface 73 may be connected by a bus or in another manner; FIG. 7 uses a bus connection as an example.
The memory 72 stores a set of program code, and the processor 71 is configured to call the program code stored in the memory 72 to perform the following operations:
detecting a touch action on a touchscreen of a terminal device;
obtaining a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen;
if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action; and
identifying a gesture type corresponding to the joint touch action, and invoking a preset function of the terminal device according to the gesture type.
In some embodiments of the present invention, the identifying, by the processor 71, a gesture type corresponding to the joint touch action and invoking a corresponding preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs, and the application program to which the interface belongs; and
identifying the gesture type corresponding to the joint touch action, and invoking a preset function corresponding to the application program according to the gesture type.
In some embodiments of the present invention, the identifying, by the processor 71, a gesture type corresponding to the joint touch action and invoking a corresponding preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs; and
if the interface on which the joint touch action occurs is the system desktop of the terminal device, querying the application program associated with the gesture type, and starting or closing the application program.
In some embodiments of the present invention, the identifying, by the processor 71, a gesture type corresponding to the joint touch action and invoking a corresponding preset function of the terminal device according to the gesture type includes:
determining the interface on which the joint touch action occurs; and
identifying the gesture type corresponding to the joint touch action, and operating on the interface according to the gesture type, where the operation includes taking a screenshot, arranging icons, or changing the theme.
In some embodiments of the present invention, the gesture type corresponding to the touch action includes a tap gesture or a slide gesture; the tap gesture includes at least one of: a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; and the slide trajectory of the slide gesture includes at least one of a closed trajectory and a non-closed trajectory.
In some embodiments of the present invention, the obtaining, by the processor 71, a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen includes:
obtaining the quantity m of pixels in the contact region of the touch action on the touchscreen, the quantity n of pixels of the touchscreen, and the area s of the touchscreen;
calculating the contact area of the touch action on the touchscreen as s*(m/n); and
obtaining the z-axis acceleration of the touch action by using a gravity acceleration sensor built into the terminal device.
In some embodiments of the present invention, the processor 71 is further configured to perform:
customizing a mapping relationship between the gesture type corresponding to the joint touch action and a preset function, and saving the mapping relationship to a mapping relationship library.
By implementing this embodiment of the present invention, the contact area and z-axis acceleration produced on the touchscreen are obtained, a joint touch action is identified, the gesture type corresponding to the joint touch action is identified, and a preset function of the terminal device is invoked according to the gesture type. This adds an interaction method based on touch gestures characterized by contact area and z-axis acceleration, enriching the interaction methods of the terminal device.
An embodiment of the present invention further provides a terminal device, including a touchscreen 83, a gravity sensor 82, and a gesture control apparatus 81. The touchscreen 83, the gravity sensor 82, and the gesture control apparatus 81 may be connected by a bus or in another manner. The gesture control apparatus 81 is the gesture control apparatus described in apparatus embodiments 1 and 2, and the gravity sensor is configured to obtain the z-axis acceleration generated when a touch action contacts the touchscreen. The terminal device may be a smartphone, a tablet computer, or another device with a touchscreen. For the specific working process of the terminal device, refer to the descriptions of method embodiments 1 and 2, which are not repeated here.
A person of ordinary skill in the art may understand that all or part of the processes of the methods in the foregoing embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the foregoing method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
What is disclosed above is merely a preferred embodiment of the present invention and certainly cannot be used to limit the scope of the claims of the present invention. A person of ordinary skill in the art may understand all or part of the processes for implementing the foregoing embodiments, and equivalent variations made in accordance with the claims of the present invention shall still fall within the scope of the present invention.

Claims (23)

  1. A gesture control method, comprising:
    detecting a touch action on a touchscreen of a terminal device;
    obtaining a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen;
    if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action; and
    identifying a gesture type corresponding to the joint touch action, and invoking a preset function of the terminal device according to the gesture type.
  2. The method according to claim 1, wherein the identifying a gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type comprises:
    determining the interface on which the joint touch action occurs, and the application program to which the interface belongs; and
    identifying the gesture type corresponding to the joint touch action, and invoking a preset function corresponding to the application program according to the gesture type.
  3. The method according to claim 1, wherein the identifying a gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type comprises:
    determining the interface on which the joint touch action occurs; and
    if the interface on which the joint touch action occurs is the system desktop of the terminal device, querying the application program associated with the gesture type, and starting or closing the application program.
  4. The method according to claim 1, wherein the identifying a gesture type corresponding to the joint touch action and invoking a preset function of the terminal device according to the gesture type comprises:
    determining the interface on which the joint touch action occurs; and
    identifying the gesture type corresponding to the joint touch action, and operating on the interface according to the gesture type, wherein the operation comprises taking a screenshot, arranging icons, or changing the theme.
  5. The method according to any one of claims 1 to 4, wherein the gesture type corresponding to the touch action comprises a tap gesture or a slide gesture; the tap gesture comprises at least one of: a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; and the slide trajectory of the slide gesture comprises at least one of a closed trajectory and a non-closed trajectory.
  6. The method according to any one of claims 1 to 5, wherein the obtaining a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen comprises:
    obtaining the quantity m of pixels in the contact region of the touch action on the touchscreen, the quantity n of pixels of the touchscreen, and the area s of the touchscreen;
    calculating the contact area of the touch action on the touchscreen as s*(m/n); and
    obtaining the z-axis acceleration of the touch action by using a gravity acceleration sensor built into the terminal device.
  7. The method according to any one of claims 1 to 6, further comprising, before the detecting a touch action on a touchscreen of a terminal device:
    customizing a mapping relationship between the gesture type corresponding to the joint touch action and a preset function, and saving the mapping relationship to a mapping relationship library.
  8. A gesture control apparatus, comprising:
    a detection module, configured to detect a touch action on a touchscreen of a terminal device;
    an obtaining module, configured to obtain a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen;
    a determining module, configured to determine that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration; and
    an invoking module, configured to identify a gesture type corresponding to the joint touch action, and invoke a preset function of the terminal device according to the gesture type.
  9. The apparatus according to claim 8, wherein the invoking module comprises:
    a first determining unit, configured to determine the interface on which the joint touch action occurs, and the application program to which the interface belongs; and
    a first invoking unit, configured to identify the gesture type corresponding to the joint touch action, and invoke a preset function corresponding to the application program according to the gesture type.
  10. The apparatus according to claim 8, wherein the invoking module comprises:
    a second determining unit, configured to determine the interface on which the joint touch action occurs; and
    a second invoking unit, configured to: if the interface on which the joint touch action occurs is the system desktop of the terminal device, query the application program associated with the gesture type, and start or close the application program.
  11. The apparatus according to claim 8, wherein the invoking module comprises:
    a third determining unit, configured to determine the interface on which the joint touch action occurs; and
    a third invoking unit, configured to identify the gesture type corresponding to the joint touch action, and operate on the interface according to the gesture type, wherein the operation comprises taking a screenshot, arranging icons, or changing the theme.
  12. The apparatus according to any one of claims 8 to 11, wherein the gesture type corresponding to the touch action comprises a tap gesture or a slide gesture; the tap gesture comprises at least one of: a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; and the slide trajectory of the slide gesture comprises at least one of a closed trajectory and a non-closed trajectory.
  13. The apparatus according to any one of claims 8 to 12, wherein the obtaining module comprises:
    a first obtaining unit, configured to obtain the quantity m of pixels in the contact region of the touch action on the touchscreen, the quantity n of pixels of the touchscreen, and the area s of the touchscreen;
    a calculation unit, configured to calculate the contact area of the touch action on the touchscreen as s*(m/n); and
    a second obtaining unit, configured to obtain the z-axis acceleration of the touch action by using a gravity acceleration sensor built into the terminal device.
  14. The apparatus according to any one of claims 8 to 13, further comprising:
    a mapping module, configured to customize a mapping relationship between the gesture type corresponding to the joint touch action and a preset function, and save the mapping relationship to a mapping relationship library.
  15. A gesture recognition apparatus, comprising a processor and a memory, wherein the memory stores a set of program code, and the processor calls the program code stored in the memory to perform the following operations:
    detecting a touch action on a touchscreen of a terminal device;
    obtaining a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen;
    if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action; and
    identifying a gesture type corresponding to the joint touch action, and invoking a preset function of the terminal device according to the gesture type.
  16. The apparatus according to claim 15, wherein the identifying, by the processor, a gesture type corresponding to the joint touch action and invoking a corresponding preset function of the terminal device according to the gesture type comprises:
    determining the interface on which the joint touch action occurs, and the application program to which the interface belongs; and
    identifying the gesture type corresponding to the joint touch action, and invoking a preset function corresponding to the application program according to the gesture type.
  17. The apparatus according to claim 15, wherein the identifying, by the processor, a gesture type corresponding to the joint touch action and invoking a corresponding preset function of the terminal device according to the gesture type comprises:
    determining the interface on which the joint touch action occurs; and
    if the interface on which the joint touch action occurs is the system desktop of the terminal device, querying the application program associated with the gesture type, and starting or closing the application program.
  18. The apparatus according to claim 15, wherein the identifying, by the processor, a gesture type corresponding to the joint touch action and invoking a corresponding preset function of the terminal device according to the gesture type comprises:
    determining the interface on which the joint touch action occurs; and
    identifying the gesture type corresponding to the joint touch action, and operating on the interface according to the gesture type, wherein the operation comprises taking a screenshot, arranging icons, or changing the theme.
  19. The apparatus according to any one of claims 15 to 18, wherein the gesture type corresponding to the touch action comprises a tap gesture or a slide gesture; the tap gesture comprises at least one of: a single-point single-tap gesture, a single-point multi-tap gesture, and a multi-point tap gesture; and the slide trajectory of the slide gesture comprises at least one of a closed trajectory and a non-closed trajectory.
  20. The apparatus according to any one of claims 15 to 19, wherein the obtaining, by the processor, a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen comprises:
    obtaining the quantity m of pixels in the contact region of the touch action on the touchscreen, the quantity n of pixels of the touchscreen, and the area s of the touchscreen;
    calculating the contact area of the touch action on the touchscreen as s*(m/n); and
    obtaining the z-axis acceleration of the touch action by using a gravity acceleration sensor built into the terminal device.
  21. The apparatus according to any one of claims 15 to 20, wherein the processor is further configured to perform:
    customizing a mapping relationship between the gesture type corresponding to the joint touch action and a preset function, and saving the mapping relationship to a mapping relationship library.
  22. A terminal device, comprising the gesture control apparatus according to any one of claims 8 to 14, a touchscreen, and a gravity sensor.
  23. A storage medium for controlling a computer device to perform a gesture control method, the method comprising the following steps:
    detecting a touch action on a touchscreen of a terminal device;
    obtaining a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen;
    if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration, determining that the touch action is a joint touch action; and
    identifying a gesture type corresponding to the joint touch action, and invoking a preset function of the terminal device according to the gesture type.
PCT/CN2015/076536 2015-04-14 2015-04-14 Gesture control method, apparatus, terminal device and storage medium WO2016165066A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US15/566,582 US10802704B2 (en) 2015-04-14 2015-04-14 Gesture control method, apparatus, terminal device, and storage medium
EP15888771.1A EP3276480A4 (en) 2015-04-14 2015-04-14 Gesture control method, device, terminal apparatus and storage medium
PCT/CN2015/076536 WO2016165066A1 (zh) 2015-04-14 2015-04-14 Gesture control method, apparatus, terminal device and storage medium
JP2017553945A JP6598089B2 (ja) 2015-04-14 2015-04-14 Gesture control method, apparatus, terminal device, and storage medium
CN201580029659.2A CN106415472B (zh) 2015-04-14 2015-04-14 Gesture control method, apparatus, terminal device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/076536 WO2016165066A1 (zh) 2015-04-14 2015-04-14 Gesture control method, apparatus, terminal device and storage medium

Publications (1)

Publication Number Publication Date
WO2016165066A1 true WO2016165066A1 (zh) 2016-10-20

Family

ID=57125459

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/076536 WO2016165066A1 (zh) 2015-04-14 2015-04-14 Gesture control method, apparatus, terminal device and storage medium

Country Status (5)

Country Link
US (1) US10802704B2 (zh)
EP (1) EP3276480A4 (zh)
JP (1) JP6598089B2 (zh)
CN (1) CN106415472B (zh)
WO (1) WO2016165066A1 (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105320436A (zh) * 2015-07-07 2016-02-10 崔景城 Method for triggering a screenshot by tapping the screen with a finger joint
JP6819423B2 (ja) * 2017-04-04 2021-01-27 富士ゼロックス株式会社 Wireless communication device
CN109408263A (zh) * 2018-09-27 2019-03-01 惠州Tcl移动通信有限公司 Method for stable interface switching, storage medium, and intelligent terminal
CN110275665A (zh) * 2019-05-23 2019-09-24 深圳龙图腾创新设计有限公司 Touchscreen operation method, electronic device, and storage medium
CN110502153B (zh) * 2019-08-30 2022-11-11 Oppo(重庆)智能科技有限公司 Method for adjusting the touch report rate of a touchscreen, touchscreen, electronic device, and storage medium
CN113805487B (zh) * 2020-07-23 2022-09-23 荣耀终端有限公司 Control instruction generation method and apparatus, terminal device, and readable storage medium
CN112532226B (zh) * 2020-11-05 2024-05-10 广东瑞德智能科技股份有限公司 Novel touch human-computer interaction processing method
CN112445410B (zh) * 2020-12-07 2023-04-18 北京小米移动软件有限公司 Touch event recognition method and apparatus, and computer-readable storage medium
CN114911401B (zh) * 2021-02-08 2024-06-25 华为技术有限公司 Electronic device, method for analyzing touch operations thereof, and readable medium
KR102436970B1 (ko) * 2021-02-16 2022-08-26 서울과학기술대학교 산학협력단 Gesture detection apparatus and method based on sound machine learning
CN117157613A (zh) * 2021-04-06 2023-12-01 三星电子株式会社 Electronic device for performing a capture function and method for operating the electronic device
KR20230015785A (ko) * 2021-07-23 2023-01-31 삼성전자주식회사 Electronic device and control method therefor
CN116225274A (zh) * 2023-04-28 2023-06-06 荣耀终端有限公司 Touch operation recognition method and apparatus, electronic device, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102799292A (zh) * 2011-05-24 2012-11-28 联想(北京)有限公司 Touch control method and apparatus, and electronic device
CN103809882A (zh) * 2012-11-07 2014-05-21 联想(北京)有限公司 Information processing method, electronic device, and touch input apparatus
CN104049759A (zh) * 2014-06-25 2014-09-17 华东理工大学 Instruction input and protection method combining touchscreen and behavior sensing

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008192092A (ja) * 2007-02-08 2008-08-21 Fuji Xerox Co Ltd Touch panel device, information processing device, and program
JP5160337B2 (ja) 2008-08-11 2013-03-13 ソニーモバイルコミュニケーションズ, エービー Input processing device, input processing method, input processing program, and portable terminal device
US8633901B2 (en) 2009-01-30 2014-01-21 Blackberry Limited Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device
JP5091180B2 (ja) * 2009-03-27 2012-12-05 ソニーモバイルコミュニケーションズ, エービー Portable terminal device
US8836648B2 (en) * 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
JP2011014044A (ja) 2009-07-03 2011-01-20 Sony Corp Operation control device, operation control method, and computer program
JP2012058856A (ja) 2010-09-06 2012-03-22 Sony Corp Information processing device, information processing method, and information processing program
JP5352619B2 (ja) 2011-04-13 2013-11-27 株式会社日本自動車部品総合研究所 Operation input device
KR101858608B1 (ko) * 2011-10-28 2018-05-17 엘지전자 주식회사 Mobile terminal and control method of the mobile terminal
JP6021335B2 (ja) * 2011-12-28 2016-11-09 任天堂株式会社 Information processing program, information processing device, information processing system, and information processing method
US20130342468A1 (en) * 2012-06-20 2013-12-26 Chimei Innolux Corporation Method for determining touch location on a touch panel and touch panel module
JP5460793B2 (ja) * 2012-08-21 2014-04-02 シャープ株式会社 Display device, display method, television receiver, and display control device
KR20140113119A (ko) * 2013-03-15 2014-09-24 엘지전자 주식회사 Electronic device and control method thereof
JP6132644B2 (ja) * 2013-04-24 2017-05-24 キヤノン株式会社 Information processing device, display control method, computer program, and storage medium
KR101474467B1 (ko) * 2013-07-09 2014-12-19 엘지전자 주식회사 Mobile terminal and control method thereof
JP2015060455A (ja) 2013-09-19 2015-03-30 シャープ株式会社 Electronic device, control method, and program
WO2015085526A1 (en) * 2013-12-12 2015-06-18 Empire Technology Development Llc Visualization of size of objects

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3276480A4 *

Also Published As

Publication number Publication date
EP3276480A4 (en) 2018-05-02
EP3276480A1 (en) 2018-01-31
US10802704B2 (en) 2020-10-13
CN106415472A (zh) 2017-02-15
JP2018511892A (ja) 2018-04-26
CN106415472B (zh) 2020-10-09
JP6598089B2 (ja) 2019-10-30
US20180095657A1 (en) 2018-04-05

Similar Documents

Publication Publication Date Title
WO2016165066A1 (zh) Gesture control method, apparatus, terminal device and storage medium
US11809702B2 (en) Modeless augmentations to a virtual trackpad on a multiple screen computing device
US10542205B2 (en) Movable user interface shutter button for camera
US8390577B2 (en) Continuous recognition of multi-touch gestures
CN107025019B (zh) Interaction method for virtual keys and terminal device
US20230021260A1 (en) Gesture instruction execution method and apparatus, system, and storage medium
US8581869B2 (en) Information processing apparatus, information processing method, and computer program
US20130212541A1 (en) Method, a device and a system for receiving user input
US9317171B2 (en) Systems and methods for implementing and using gesture based user interface widgets with camera input
CN103869947B (zh) Method for controlling an electronic device, and electronic device
CN106415471A (zh) Method for processing a user interface of a terminal, user interface, and terminal
WO2021092768A1 (zh) Touch event processing method and apparatus, mobile terminal, and storage medium
WO2022007544A1 (zh) Device control method and apparatus, storage medium, and electronic device
US10345932B2 (en) Disambiguation of indirect input
US20180210597A1 (en) Information processing device, information processing method, and program
CN111078087A (zh) Mobile terminal, control mode switching method, and computer-readable storage medium
US10241671B2 (en) Gesture response method and device
US20150153925A1 (en) Method for operating gestures and method for calling cursor
CN111198644A (zh) Method and system for recognizing screen operations of an intelligent terminal
CN103543824A (zh) Gesture input system and method
WO2023016193A1 (zh) Device control method and apparatus, electronic device, and storage medium
US20140035876A1 (en) Command of a Computing Device
US9778822B2 (en) Touch input method and electronic apparatus thereof
WO2022127063A1 (zh) Input method, input apparatus, and apparatus for input
CN115469786A (zh) Display device and drawing object selection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15888771

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017553945

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 15566582

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE