WO2016165066A1 - Gesture control method, apparatus, terminal device and storage medium - Google Patents
Gesture control method, apparatus, terminal device and storage medium
- Publication number
- WO2016165066A1 (PCT application PCT/CN2015/076536)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- touch action
- joint
- touch
- terminal device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Definitions
- Embodiments of the present invention relate to the field of touch control, and in particular, to a gesture control method, apparatus, terminal device, and storage medium.
- Touch screens have an increasingly wide range of applications.
- The user performs various basic touch actions, such as pressing, moving, or lifting, on the touch screen to generate various types of touch gestures.
- The most commonly used touch gesture on existing terminal devices is the click gesture.
- When a click gesture is detected, the terminal device determines the location of the gesture and the application corresponding to that location, and triggers the application to perform the corresponding operation; if the location does not correspond to any application, no operation is triggered.
- Recognition of touch gestures by the terminal device is therefore based on position and trajectory.
- Touch gestures confined to the two-dimensional plane are increasingly unable to meet the interaction requirements of terminal devices.
- The technical problem to be solved by the embodiments of the present invention is to provide a gesture control method, apparatus, terminal device, and storage medium that can enrich the interaction modes of the terminal device.
- a gesture control method including:
- determining that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
- In a possible implementation, the identifying of a gesture type corresponding to the joint touch action and the calling of the preset function of the terminal device according to the gesture type includes: determining the interface in which the joint touch action occurs and the application to which the interface belongs, identifying the gesture type corresponding to the joint touch action, and invoking a preset function corresponding to the application according to the gesture type.
- In another possible implementation, the identifying of a gesture type corresponding to the joint touch action and the calling of the preset function of the terminal device according to the gesture type includes: if the interface in which the joint touch action occurs is a system desktop of the terminal device, querying an application associated with the gesture type, and starting or closing the application.
- In a further possible implementation, the identifying of a gesture type corresponding to the joint touch action and the calling of the preset function of the terminal device according to the gesture type includes: determining the interface where the joint touch action occurs, identifying the gesture type corresponding to the joint touch action, and operating the interface according to the gesture type, where the operation includes a screen capture, an icon arrangement, or a theme replacement.
- The gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture, and the sliding track of the swipe gesture includes at least one of a closed track and a non-closed track.
- The obtaining of the contact area of the touch action on the touch screen and of the z-axis acceleration generated when contacting the touch screen includes: acquiring the z-axis acceleration of the touch action by means of a gravity acceleration sensor provided in the terminal device.
- Before the detecting of the touch action on the touch screen of the terminal device, the method further includes: customizing a mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and saving the mapping relationship in a mapping relationship library.
- a second aspect of the embodiments of the present invention provides a gesture control apparatus, including:
- a detecting module configured to detect a touch action on a touch screen of the terminal device;
- an acquiring module configured to acquire a contact area of the touch action on the touch screen and a z-axis acceleration generated when contacting the touch screen;
- a determining module configured to determine that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration; and
- a calling module configured to identify a gesture type corresponding to the joint touch action and invoke a preset function of the terminal device according to the gesture type.
- the calling module includes:
- a first determining unit configured to determine an interface in which the joint touch action occurs, and an application to which the interface belongs
- a first calling unit configured to identify a gesture type corresponding to the joint touch action and invoke a preset function corresponding to the application according to the gesture type.
- the calling module includes:
- a second determining unit configured to determine an interface where the joint touch action occurs
- a second calling unit configured to: if the interface where the joint touch action occurs is a system desktop of the terminal device, query an application associated with the gesture type, and start or close the application.
- the calling module includes:
- a third determining unit configured to determine an interface at which the joint touch action occurs
- a third calling unit configured to identify a gesture type corresponding to the joint touch action and operate the interface according to the gesture type, where the operation includes a screen capture, an icon arrangement, or a theme replacement.
- The gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture, and the sliding track of the swipe gesture includes at least one of a closed track and a non-closed track.
- the acquiring module includes:
- a first acquiring unit configured to acquire the number m of pixels in the contact area of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen;
- a calculating unit configured to calculate the contact area of the touch action on the touch screen as s*(m/n);
- a second acquiring unit configured to acquire the z-axis acceleration of the touch action by means of a gravity acceleration sensor provided in the terminal device.
- a mapping module configured to customize a mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and save the mapping relationship to the mapping relationship library.
- A third aspect of the embodiments of the present invention provides a gesture recognition apparatus, including a processor and a memory, where the memory stores a set of program codes and the processor calls the program code stored in the memory to perform the following operations:
- determining that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
- The processor's identifying of the gesture type corresponding to the joint touch action and invoking of the preset function of the terminal device according to the gesture type includes: determining the interface in which the joint touch action occurs and the application to which the interface belongs, identifying the gesture type corresponding to the joint touch action, and invoking a preset function corresponding to the application according to the gesture type.
- The processor's identifying of the gesture type corresponding to the joint touch action and invoking of the preset function of the terminal device according to the gesture type may also include: if the interface in which the joint touch action occurs is a system desktop of the terminal device, querying an application associated with the gesture type, and starting or closing the application.
- The processor's identifying of the gesture type corresponding to the joint touch action and invoking of the preset function of the terminal device according to the gesture type may further include: determining the interface at which the joint touch action occurs, identifying the gesture type corresponding to the joint touch action, and operating the interface according to the gesture type, where the operation includes a screen capture, an icon arrangement, or a theme replacement.
- The gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture, and the sliding track of the swipe gesture includes at least one of a closed track and a non-closed track.
- The processor's acquiring of the contact area of the touch action on the touch screen and of the z-axis acceleration generated when contacting the touch screen includes: acquiring the z-axis acceleration of the touch action by means of a gravity acceleration sensor provided in the terminal device.
- the processor is further configured to:
- customizing a mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and saving the mapping relationship in the mapping relationship library.
- a fifth aspect of the embodiments of the present invention provides a terminal device, including any one of the above gesture control devices, a touch screen, and a gravity sensor.
- a sixth aspect of the embodiments of the present invention provides a method for controlling a computer device to perform gesture control, where the method includes the following steps:
- determining that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
- identifying a gesture type corresponding to the joint touch action, and calling the preset function of the terminal device according to the gesture type.
- When the joint touch action is detected, the gesture type corresponding to the joint touch action is recognized and the preset function of the terminal device is invoked according to that gesture type. This adds to the terminal device an interaction mode for touch gestures based on contact area and z-axis acceleration, making the interaction modes of the terminal device richer.
- FIG. 1 is a schematic flowchart of a gesture control method according to an embodiment of the present invention
- FIG. 2 is another schematic flowchart of a gesture control method according to an embodiment of the present invention.
- FIG. 3 is a schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
- FIG. 4 is another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
- Figure 5a is a schematic structural view of the calling module of Figure 4.
- Figure 5b is another schematic structural view of the calling module of Figure 4.
- Figure 5c is a schematic diagram of still another structure of the calling module of Figure 4.
- Figure 6 is a schematic structural view of the acquisition module of Figure 4.
- FIG. 7 is still another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
- FIG. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
- FIG. 1 is a schematic flowchart of a gesture control method according to an embodiment of the present invention.
- the method comprises S101-S104.
- The touch action of a finger on the touch screen of the terminal device can be detected in a bright-screen state or a black-screen state, generating a touch signal.
- The bright-screen state refers to the situation in which the backlight of the touch screen is lit, and includes the lock-screen bright-screen state and the non-lock-screen bright-screen state; the black-screen state refers to the situation in which the backlight of the touch screen is turned off.
- the touch screen can be a dedicated touch screen or a display device with a touch function.
- a touch screen may receive a touch action of one or more touch points through a touch event processing function and generate a corresponding gesture, different gestures indicating different operations.
- the recognition and acquisition of the touch gesture may be different depending on the working principle of the touch technology, which is not limited by the present invention.
- The basic touch actions of a finger include pressing down, moving, and lifting up; combining different basic touch actions yields different types of gestures. For example, a tap gesture consists of the two basic touch actions of pressing and lifting, while a swipe gesture consists of the three basic touch actions of pressing, moving, and lifting.
- The present invention is not limited to touch actions of a finger on the touch screen; the touch action may also come from other objects, and touch screens using different touch technologies can sense the touch actions of different types of objects.
- The contact area refers to the area of the contact region generated when the finger is in contact with the touch screen.
- Because the touch screen is rigid, the finger's velocity decays to zero in a very short time, producing an acceleration in the z-axis direction; the greater the finger's velocity along the z-axis, the greater the acceleration generated in the z-axis direction.
- For example, such a touch action is produced when a finger joint (knuckle) comes into contact with the touch screen.
- The z-axis in the embodiments of the present invention refers to the direction perpendicular to the touch screen. It can be understood that when the finger strikes the touch screen at a certain speed, the direction of that speed is not exactly perpendicular to the touch screen, so the acquired z-axis acceleration can be corrected by the cosine of the contact angle; since the angle cannot be measured directly, an empirical value may be set for it.
- The joint touch action in the embodiments of the present invention is a newly defined touch action. It is not necessarily triggered by a finger joint; it may also be triggered by another object striking the touch screen at high speed, and any touch action satisfying the above conditions may be regarded as a joint touch action in the sense of the embodiments of the present invention.
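- In concrete terms, the decision above is a simple two-threshold test. The following is a minimal Java sketch of that test; the class name, method signature, and threshold values are illustrative assumptions, not values taken from the patent.

```java
// Minimal sketch of the joint-touch decision: a touch qualifies as a
// "joint touch action" only if BOTH the contact area and the z-axis
// acceleration exceed their preset thresholds. Threshold values here
// are assumed for illustration.
public final class JointTouchDetector {
    private static final double PRESET_AREA = 25.0;         // assumed threshold, mm^2
    private static final double PRESET_ACCELERATION = 3.0;  // assumed threshold, m/s^2

    /** Returns true if the touch should be treated as a joint touch action. */
    public static boolean isJointTouch(double contactArea, double zAcceleration) {
        return contactArea > PRESET_AREA && zAcceleration > PRESET_ACCELERATION;
    }
}
```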
- The touch action is composed of at least one basic touch action. The gesture type corresponding to the joint touch action is identified, and the preset function of the terminal device is invoked according to that gesture type. The preset functions of the terminal device include starting or closing an application on the terminal device and calling a function within an application: for example, opening a corresponding application on the terminal device, such as entering the settings menu, starting the camera, starting a screenshot, or changing the theme; or calling the reply function in an SMS application, the video call function in an instant messaging application, the screen capture function in a browser application, and so on.
- Different functions corresponding to different gesture types can be customized by the user in advance.
- the application program may be a system application or a third-party application.
- the system application refers to an application that is provided by the terminal device operating system.
- the functions corresponding to the system application include a dialing function, a setting function, and a short message interface function.
- A third-party application refers to an application additionally installed on the terminal device.
- The identifying of the gesture type corresponding to the joint touch action and the calling of the preset function of the terminal device according to the gesture type includes the following.
- An application presents a plurality of different interfaces on the terminal device and contains multiple preset functions.
- The interface where the joint touch action occurs, and the application to which the interface belongs, are determined; the gesture type corresponding to the joint touch action is identified, where the gesture type includes a click gesture or a swipe gesture; and the preset function within the application is invoked according to the gesture type.
- For example, the interface where the joint touch action occurs is the interface of instant messaging software, and the gesture type corresponding to the joint touch action is a two-click gesture that corresponds to the partial screen capture function in the instant messaging software. The screen capture function is therefore activated, an adjustable-size screenshot frame is displayed on the interface where the joint touch action occurred, and the captured image is saved to a specified location of the terminal device.
- In another example, the interface where the joint touch action occurs is the interface of instant messaging software, and the gesture type corresponding to the joint touch action is a three-click gesture that corresponds to the full-screen capture function in the instant messaging software. The entire screen is captured as a picture, and the generated picture is saved to a specified location of the terminal device.
- In a further example, the interface where the joint touch action occurs is the interface of camera application software, the gesture type corresponding to the joint touch action is an S-track gesture, and the function corresponding to the S-track gesture in the camera application software is the camera function, so the camera function is activated. Similarly, an M-track gesture corresponds to the preset recording function in the camera application.
- In different interfaces, the preset functions corresponding to the same gesture type may be the same or different, and the preset function corresponding to each gesture type can be customized as needed.
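- To illustrate how such per-interface, per-application associations might be stored and dispatched, here is a small Java sketch of a mapping relationship library; the `GestureAction` interface, the string keys, and the registration example are assumptions for illustration, not structures defined by the patent.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a mapping relationship library: application -> (gesture type -> preset function).
public final class GestureMappingLibrary {
    /** A preset function invoked when its gesture fires. */
    public interface GestureAction { void invoke(); }

    private final Map<String, Map<String, GestureAction>> mappings = new HashMap<>();

    /** Customizes (or overwrites) the mapping for one app and gesture type. */
    public void register(String appId, String gestureType, GestureAction action) {
        mappings.computeIfAbsent(appId, k -> new HashMap<>()).put(gestureType, action);
    }

    /** Looks up and invokes the preset function; returns false if none is mapped. */
    public boolean dispatch(String appId, String gestureType) {
        GestureAction action = mappings.getOrDefault(appId, Map.of()).get(gestureType);
        if (action == null) return false;
        action.invoke();
        return true;
    }
}
```

- Usage would mirror the examples above, e.g. `register("com.example.im", "two-click", () -> startPartialScreenCapture())` for the instant messaging interface (both names hypothetical).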
- The identifying of the gesture type corresponding to the joint touch action and the calling of the preset function of the terminal device according to the gesture type may also include: if the interface in which the joint touch action occurs is a system desktop of the terminal device, querying an application associated with the gesture type, and starting or closing the application.
- When a joint touch action occurs on the terminal device, the interface where the joint touch action occurs is determined along with its type. If the interface is the system desktop of the terminal device, the application associated with the gesture type is queried and its running state is obtained: if the associated application has not been started, it is started and its interface is displayed on the terminal device; if the associated application is running in the background, it is closed.
- the application program in the embodiment of the present invention may be a system application or a third-party application, which is not limited herein.
- The system desktop of a mobile terminal is generally divided into multiple sub-interfaces; when the interface where the joint touch action occurs is any one of these sub-interfaces, it can be determined that the interface where the joint touch action occurred is the system desktop.
- For example, the interface where the joint touch action occurs is the system desktop and the gesture type corresponding to the joint touch action is a C-track gesture, whose preset function is to call the camera application. The running state of the camera application is obtained: if the camera application has not been started, it is started and the camera interface is displayed on the terminal device; if the camera application is running in the background, it is closed. Alternatively, the gesture type corresponding to the joint touch action is an S-track gesture, whose queried preset function is to call the short message application: if the short message application has not been started, it is started and the short message editing interface is displayed on the terminal device; if the short message application is running in the background, it is closed. As a further example, the joint touch action is a single-point three-click gesture whose associated application is the music player: if the music player application has not been started, it is started and its song list interface is displayed on the terminal device; if the music player application is running in the background, it is closed.
- The gesture type in the embodiments of the present invention may be a click gesture or a swipe gesture, and the association between gesture types and applications can be customized according to the user's needs.
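- The start-or-close behaviour on the system desktop amounts to a toggle keyed by the gesture's associated application. Below is a hedged Java sketch of that logic; `AppController` and its methods are hypothetical stand-ins for the platform-specific APIs that query running state and start or close an application.

```java
import java.util.Map;

// Sketch of the desktop behaviour: a joint gesture on the system desktop
// toggles the application associated with that gesture type.
public final class DesktopGestureHandler {
    /** Hypothetical platform hooks for app lifecycle control. */
    public interface AppController {
        boolean isRunningInBackground(String appId);
        void start(String appId);   // start and show the app's interface
        void close(String appId);   // close a background app
    }

    private final Map<String, String> gestureToApp; // e.g. "C-track" -> camera app id
    private final AppController controller;

    public DesktopGestureHandler(Map<String, String> gestureToApp, AppController controller) {
        this.gestureToApp = gestureToApp;
        this.controller = controller;
    }

    public void onJointGesture(String gestureType) {
        String appId = gestureToApp.get(gestureType);
        if (appId == null) return;                      // no association: do nothing
        if (controller.isRunningInBackground(appId)) {
            controller.close(appId);                    // running in background -> close
        } else {
            controller.start(appId);                    // not started -> start and display
        }
    }
}
```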
- The identifying of the gesture type corresponding to the joint touch action and the calling of the preset function of the terminal device according to the gesture type may further include: determining the interface where the joint touch action occurs, identifying the gesture type corresponding to the joint touch action, and operating on the interface according to the gesture type.
- Different applications present different interfaces on the terminal device, and the same application may present a plurality of different interfaces. In this case the embodiment operates only on the interface where the joint touch action occurs, for example by arranging icons, replacing the theme, or performing a screen capture; the same gesture type may correspond to the same operation in different interfaces, or to different operations.
- For example, the interface where the joint touch action occurs is the settings interface of an instant messaging application and the gesture type corresponding to the joint touch action is a two-click gesture that corresponds to the screen capture function, so the screenshot function is called to capture the current settings interface. Or the interface where the joint touch action occurs is the system desktop of the terminal device and the gesture type corresponding to the joint touch action is a two-click gesture, so the screenshot function is called to capture the current system desktop. Or the gesture type corresponding to the joint touch action is a 1-track gesture that corresponds to icon arrangement, so the icon arrangement function is called to rearrange the icons on the current system desktop.
- When the joint touch action is detected, the gesture type corresponding to the joint touch action is recognized and the preset function of the terminal device is invoked according to that gesture type. This adds to the terminal device an interaction mode for touch gestures based on contact area and z-axis acceleration, making the interaction modes of the terminal device richer.
- FIG. 2 is another schematic flowchart of a gesture control method according to an embodiment of the present invention.
- the method includes S201-S209.
- The terminal device presets the gesture types corresponding to multiple joint touch actions, and the user can associate different functions on the terminal device with different gesture types. The terminal device can also learn new gesture types and save them locally.
- For example, the preset gesture types of the terminal device include a single-point single-click gesture, a single-point double-click gesture, a two-click gesture, an upward swipe gesture, and a downward swipe gesture, all of which are joint touch actions.
- The user can enter a mapping relationship editing interface in which the above gestures are displayed, and select one of the gestures to associate with a required function: for example, associating the single-point single-click gesture with the operation instruction for entering the settings options, the single-point double-click gesture with the operation instruction for starting a screen capture, the two-click gesture with the operation instruction for moving the target application to the background, the upward swipe gesture with the operation instruction for starting the browser, and the downward swipe gesture with the operation instruction for switching the theme.
- The user can also enter a gesture learning interface and draw a custom gesture at least twice. If the custom gesture is a sliding gesture, learning the new gesture succeeds when the similarity of the sliding trajectories of the repeated drawings is greater than a preset threshold; if the custom gesture is a click gesture, learning succeeds when the number of clicks and the number of touch points are the same each time. The learned new gesture is saved to the local gesture library, and when the user enters the mapping relationship editing interface, the new gesture is displayed there so that the user can associate an operation instruction with it.
- For example, the new gesture to be learned is drawing the letter C. The user enters the gesture learning interface and draws the letter C twice; the sliding trajectories obtained from the two drawings are each fitted into a plane figure, and a similarity algorithm is used to compare the two. The greater the similarity, the closer the shapes of the two sliding trajectories; if the calculated similarity is greater than the preset threshold, learning the new gesture succeeds.
- As another example, the gesture to be learned is a three-click gesture. The user performs the three-click gesture twice on the gesture learning interface; if in both passes the number of touch points is 3 and the number of clicks on each touch point is 1, learning the three-click gesture succeeds.
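- The patent does not specify which similarity algorithm is used, so the following Java sketch shows one plausible choice: resample both trajectories to the same number of points and turn the mean point-to-point distance into a similarity score. All names and the scoring formula are illustrative assumptions.

```java
import java.util.List;

// One plausible trajectory-similarity measure for the gesture-learning step:
// resample both polylines to `samples` points and convert the mean
// point-to-point distance into a score in (0, 1], where 1 means identical.
public final class TrajectorySimilarity {
    public record Point(double x, double y) {}

    public static double similarity(List<Point> a, List<Point> b, int samples) {
        double total = 0;
        for (int i = 0; i < samples; i++) {
            double t = i / (double) (samples - 1);      // samples >= 2 assumed
            Point pa = sample(a, t);
            Point pb = sample(b, t);
            total += Math.hypot(pa.x() - pb.x(), pa.y() - pb.y());
        }
        return 1.0 / (1.0 + total / samples);           // larger distance -> lower score
    }

    // Linear interpolation by point index; a production version would
    // resample by arc length and normalize for scale and position.
    private static Point sample(List<Point> pts, double t) {
        double pos = t * (pts.size() - 1);
        int i = (int) Math.floor(pos);
        int j = Math.min(i + 1, pts.size() - 1);
        double f = pos - i;
        return new Point(pts.get(i).x() + f * (pts.get(j).x() - pts.get(i).x()),
                         pts.get(i).y() + f * (pts.get(j).y() - pts.get(i).y()));
    }
}
```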
- The preset function may be starting or closing an application on the terminal device, or starting a function within an application.
- the touch action of the finger on the touch screen of the terminal device can be detected in a bright screen state, and a touch signal is generated, and the bright screen state includes a lock screen bright screen state and a non-lock screen bright screen state.
- the touch screen can be a dedicated touch screen or a display device with a touch function.
- a touch screen may receive a touch action of one or more touch points through a touch event processing function and generate a corresponding gesture, different gestures indicating different operations.
- the recognition and acquisition of the touch gesture may be different depending on the working principle of the touch technology, which is not limited by the present invention.
- The basic touch actions of a finger include pressing down, moving, and lifting up; combining different basic touch actions yields different types of gestures. For example, a tap gesture consists of the two basic touch actions of pressing and lifting, while a swipe gesture consists of the three basic touch actions of pressing, moving, and lifting.
- For example, the touch screen of the terminal device is a capacitive touch screen with electrodes at its four corners and a high-frequency signal on its surface. When a finger touches the screen, a coupling capacitance forms at the contact area and a small current flows from the contact area toward the finger, changing the currents on the four electrodes; the touch action on the terminal device can therefore be detected by detecting the change in the current on the electrodes.
- The present invention is not limited to touch actions of a finger on the touch screen; the touch action may also come from other objects, and touch screens using different touch technologies can sense the touch actions of different types of objects.
- The contact area refers to the area of the contact region generated when the finger is in contact with the touch screen; the area of the contact region at the first press is acquired. When the finger strikes the touch screen, it has a certain velocity in the z-axis direction. Because the touch screen is rigid, this velocity decays to zero in a very short time, producing an acceleration in the z-axis direction; the greater the finger's z-axis velocity, the greater the acceleration generated in the z-axis direction.
- The z-axis in the embodiments of the present invention refers to the direction perpendicular to the touch screen.
- The contact area may be calculated as follows: obtain the number m of pixels in the contact area of the touch action on the touch screen and the number n of pixels of the touch screen, where n is fixed and determined by the resolution of the touch screen. The area s of the touch screen is likewise fixed and can be pre-stored in the ROM of the terminal device. The area of the contact region is then calculated by the formula s*(m/n).
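- The formula is a direct proportion: the touched fraction of all pixels, m/n, scaled by the physical screen area s. A minimal Java transcription, with illustrative numbers in `main`:

```java
// Contact area = s * (m / n): m touched pixels, n total pixels,
// s the physical area of the screen (here in mm^2).
public final class ContactArea {
    public static double contactArea(long m, long n, double screenAreaS) {
        return screenAreaS * ((double) m / (double) n);
    }

    public static void main(String[] args) {
        // Assumed example: 2000 touched pixels on a 1080x2340 panel of ~9000 mm^2.
        System.out.println(contactArea(2000, 1080L * 2340L, 9000.0)); // ~7.1 mm^2
    }
}
```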
- The z-axis acceleration of the touch action is obtained by the gravity acceleration sensor provided in the terminal device; such sensors are now commonly built into terminal devices such as smartphones and tablet computers. The z-axis acceleration in the embodiments of the present invention refers to the absolute value of the change in acceleration relative to the gravitational acceleration: taking the gravitational acceleration as 9.8 m/s², the z-axis acceleration is the absolute value of the change relative to that 9.8 m/s² baseline, and the change may be in either the positive or the negative direction.
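- On an Android terminal device (an assumption; the patent does not prescribe a platform), this baseline-relative reading could be obtained with the standard accelerometer API, as in the sketch below.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: read the z-axis acceleration and report its absolute deviation
// from the gravitational baseline (~9.8 m/s^2), matching the "absolute
// value of the change" described above.
public final class ZAxisReader implements SensorEventListener {
    private static final float GRAVITY = 9.8f; // m/s^2 baseline
    private volatile float lastZDeviation;

    public void register(SensorManager sm) {
        // May be null on devices without an accelerometer; omitted for brevity.
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override public void onSensorChanged(SensorEvent event) {
        // values[2] is the acceleration along the z axis (perpendicular to the screen)
        lastZDeviation = Math.abs(event.values[2] - GRAVITY);
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }

    /** Most recent |z - 9.8| reading, for comparison against the preset acceleration. */
    public float lastZDeviation() { return lastZDeviation; }
}
```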
- The number m of pixels in the contact area of the touch action may be obtained as follows, taking a capacitive touch screen as an example. According to the working principle of the capacitive touch screen, when the user's finger touches the screen, the human-body electric field causes a coupling capacitance to form between the finger and the contact area. Because a high-frequency signal is present on the contact area, the finger absorbs a small current, which flows out from the electrodes at the four corners of the screen; in theory, the current flowing through each of the four electrodes is inversely proportional to the distance from the finger to the corresponding corner. The contact area of the touch action can therefore be calculated from the currents on the four electrodes; the contact area is then denoised and fitted into a plane figure, and the number m of pixels within that figure is obtained.
- The contact area calculated in S203 is compared with the preset area, and the z-axis acceleration acquired in S203 is compared with the preset acceleration. If both comparison results satisfy the greater-than condition, the process proceeds to S205; otherwise, S206 is executed. The preset area and the preset acceleration may be set as needed, and the present invention does not limit them.
- The joint touch action in the embodiments of the present invention is a newly defined touch action. It is not necessarily triggered by a finger joint; it may also be triggered by another object striking the touch screen at high speed, and any touch action satisfying the above constraints on contact area and z-axis acceleration may be regarded as a joint touch action in the sense of the embodiments of the present invention.
- Otherwise, the preset function of the terminal device is invoked according to the location of the touch action in the conventional manner; this has been disclosed in detail in the prior art and is not described again here.
- The gesture type is identified according to the basic touch actions that make up the joint touch action. For example, a single-point single-click gesture is a press action followed by a lift action on one touch point; a single-point double-click gesture is two presses and lifts on one touch point with the time interval between the two shorter than a preset duration; and a swipe gesture is a press action at one touch point followed by a lift action at another touch point, forming a sliding trajectory during the movement.
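- A toy Java classifier for these compositions is sketched below; the thresholds, method names, and returned labels are assumptions, and a real implementation would also handle multi-point gestures and defer the single-click decision until the double-tap window expires.

```java
// Simplified classifier driven by basic touch actions (press down / lift up).
// Movement beyond a threshold makes the gesture a swipe; otherwise clicks
// are counted against a double-tap time window. Values are illustrative.
public final class GestureClassifier {
    private static final long DOUBLE_TAP_WINDOW_MS = 300; // assumed preset duration
    private static final float MOVE_THRESHOLD_PX = 20f;   // movement that makes a swipe

    private long lastTapTimeMs;
    private float downX, downY;

    public void onDown(float x, float y) { downX = x; downY = y; }

    public String onUp(float x, float y, long timeMs) {
        boolean moved = Math.hypot(x - downX, y - downY) > MOVE_THRESHOLD_PX;
        if (moved) return "swipe";                         // press, move, lift
        if (timeMs - lastTapTimeMs < DOUBLE_TAP_WINDOW_MS) {
            lastTapTimeMs = 0;
            return "single-point double-click";            // two taps within the window
        }
        lastTapTimeMs = timeMs;
        return "single-point single-click";                // may later be upgraded
    }
}
```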
- If the sliding trajectory of the swipe gesture is a closed figure and the operation instruction associated with closed figures in the mapping relationship library of the terminal device is to start a screen capture, the screen capture operation is performed: a resizable screenshot frame is displayed on the user interface, and the image in the screenshot frame is saved to a specified location of the terminal device. If the sliding trajectory of the swipe gesture is a non-closed figure and the operation instruction associated with non-closed figures in the mapping relationship library of the terminal device is theme switching, the theme switching operation is performed: an interface containing multiple theme thumbnails is displayed, and after the user selects the desired theme, the current theme is switched to the selected one.
- If the sliding track of the swipe gesture is a circle and the operation instruction associated with it in the mapping relationship library of the terminal device is to capture part of the screen, a screenshot is taken of the area enclosed by the circle, and the captured picture is saved to a specified location of the terminal device.
- The shapes of sliding tracks can be further subdivided to support more operation instructions, with different shapes corresponding to different operation instructions; for example, the sliding track may be a letter or a number, which the present invention does not limit.
- If the click gesture is a single-point single-click gesture and the operation instruction associated with it in the mapping relationship library of the terminal device is to enter the application settings, the operation of entering the application settings is performed. If the click gesture is a single-point double-click gesture and the associated operation instruction is to start the browser, the browser is started. If the click gesture is a two-point click gesture and the associated operation instruction is to start the camera, the camera is started. If the click gesture is a single-point three-click gesture and the associated operation instruction is to capture the full screen, a full-screen capture is performed and the captured image is saved to a specified location of the terminal device.
- When the joint touch action is detected, the gesture type corresponding to the joint touch action is recognized and the preset function of the terminal device is invoked according to that gesture type. This adds to the terminal device an interaction mode for touch gestures based on contact area and z-axis acceleration, making the interaction modes of the terminal device richer.
- FIG. 3 is a schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
- the gesture control apparatus includes: a detection module 30, an acquisition module 31, a determination module 32, and a calling module 33.
- the detecting module 30 is configured to detect a touch action on the touch screen of the terminal device.
- the acquiring module 31 is configured to acquire a contact area of the touch action on the touch screen and a z-axis acceleration generated when contacting the touch screen.
- the determining module 32 is configured to determine that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration.
- The calling module 33 is configured to identify a gesture type corresponding to the joint touch action and invoke a preset function of the terminal device according to the gesture type.
- The gesture control apparatus of this embodiment of the present invention performs the gesture control method of method embodiment 1. It is based on the same concept as the method embodiment and has the same technical effects; for the specific process, refer to the description of method embodiment 1, which is not repeated here.
- FIG. 4 is a schematic diagram of another structure of a gesture control apparatus according to an embodiment of the present invention.
- The gesture control apparatus includes a detecting module 30, an acquiring module 31, a determining module 32, and a calling module 33, and further includes a mapping module 34.
- the mapping module 34 is configured to customize a mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and save the mapping relationship to the mapping relationship library.
- The calling module 33 includes: a first determining unit 331 and a first calling unit 332.
- the first determining unit 331 is configured to determine an interface where the joint touch action occurs, and an application to which the interface belongs.
- The first calling unit 332 is configured to identify a gesture type corresponding to the joint touch action and invoke a preset function corresponding to the application according to the gesture type.
- the calling module 33 includes: a second determining unit 333 and a second calling unit 334.
- the second determining unit 333 is configured to determine an interface where the joint touch action occurs.
- The second calling unit 334 is configured to: if the interface where the joint touch action occurs is a system desktop of the terminal device, query an application associated with the gesture type, and start or close the application.
- the calling module 33 includes: a third determining unit 335 and a third calling unit 336.
- The third determining unit 335 is configured to determine the interface at which the joint touch action occurs.
- The third calling unit 336 is configured to identify a gesture type corresponding to the joint touch action and operate the interface according to the gesture type, where the operation includes a screen capture, an icon arrangement, or a theme replacement.
- The gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture, and the sliding trajectory of the swipe gesture includes at least one of a closed trajectory and a non-closed trajectory.
- the obtaining module 31 includes: a first obtaining unit 311, a calculating unit 312, and a second acquiring unit 313.
- The first acquiring unit 311 is configured to acquire the number m of pixels in the contact area of the touch action on the touch screen, the number n of pixels of the touch screen, and the area s of the touch screen.
- the calculating unit 312 is configured to calculate that the contact area of the touch action on the touch screen is s*(m/n).
- the second acquiring unit 313 is configured to acquire the z-axis acceleration of the touch action by using a gravity acceleration sensor that is provided by the terminal device.
- The gesture control apparatus of this embodiment of the present invention performs the gesture control method of method embodiment 2. It is based on the same concept as the method embodiments and has the same technical effects; for the specific process, refer to the descriptions of method embodiments 1 and 2, which are not repeated here.
- FIG. 7 is another schematic structural diagram of a gesture control apparatus according to an embodiment of the present invention.
- a gesture control apparatus is used to implement a gesture control method according to an embodiment of the present invention.
- The gesture control apparatus includes a processor 71, a memory 72, and a communication interface 73. The number of processors 71 in the gesture control apparatus may be one or more; FIG. 7 takes one processor as an example. The processor 71, the memory 72, and the communication interface 73 may be connected by a bus or in other ways; FIG. 7 takes the bus connection as an example.
- the memory 72 stores a set of program codes
- the processor 71 is configured to call the program code stored in the memory 72 for performing the following operations:
- determining that the touch action is a joint touch action if the contact area is greater than a preset area and the z-axis acceleration is greater than a preset acceleration;
- The processor 71's identifying of the gesture type corresponding to the joint touch action and invoking of the preset function of the terminal device according to the gesture type includes: determining the interface in which the joint touch action occurs and the application to which the interface belongs, identifying the gesture type corresponding to the joint touch action, and invoking a preset function corresponding to the application according to the gesture type.
- The processor 71's identifying of the gesture type corresponding to the joint touch action and invoking of the preset function of the terminal device according to the gesture type may also include: if the interface in which the joint touch action occurs is a system desktop of the terminal device, querying an application associated with the gesture type, and starting or closing the application.
- The processor 71's identifying of the gesture type corresponding to the joint touch action and invoking of the preset function of the terminal device according to the gesture type may further include: determining the interface at which the joint touch action occurs, identifying the gesture type corresponding to the joint touch action, and operating the interface according to the gesture type, where the operation includes a screen capture, an icon arrangement, or a theme replacement.
- The gesture type corresponding to the touch action includes a click gesture or a swipe gesture, where the click gesture includes at least one of a single-point single-click gesture, a single-point multiple-click gesture, and a multi-point click gesture, and the sliding track of the swipe gesture includes at least one of a closed track and a non-closed track.
- The processor 71's acquiring of the contact area of the touch action on the touch screen and of the z-axis acceleration generated when contacting the touch screen includes: acquiring the z-axis acceleration of the touch action by means of a gravity acceleration sensor provided in the terminal device.
- the processor 71 is further configured to:
- customizing a mapping relationship between the gesture type corresponding to the joint touch action and the preset function, and saving the mapping relationship in the mapping relationship library.
- When the joint touch action is detected, the gesture type corresponding to the joint touch action is recognized and the preset function of the terminal device is invoked according to that gesture type. This adds to the terminal device an interaction mode for touch gestures based on contact area and z-axis acceleration, making the interaction modes of the terminal device richer.
- the embodiment of the present invention further provides a terminal device, including a touch screen 83, a gravity sensor 82, and a gesture control device 81.
- the touch screen 83, the gravity sensor 82, and the gesture control device 81 can be connected through a bus, or can be connected by other means.
- The gesture control device 81 is the gesture control device described in apparatus embodiments 1 and 2, and the gravity sensor 82 is configured to acquire the z-axis acceleration generated when a touch action contacts the touch screen.
- The terminal device can be a smartphone, a tablet computer, or another device with a touch screen. For the specific working process of the terminal device, refer to the descriptions of method embodiments 1 and 2, which are not repeated here.
- the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2015/076536 WO2016165066A1 (zh) | 2015-04-14 | 2015-04-14 | Gesture control method, apparatus, terminal device and storage medium |
| CN201580029659.2A CN106415472B (zh) | 2015-04-14 | 2015-04-14 | Gesture control method, apparatus, terminal device and storage medium |
| EP15888771.1A EP3276480A4 (en) | 2015-04-14 | 2015-04-14 | Gesture control method, device, terminal apparatus and storage medium |
| US15/566,582 US10802704B2 (en) | 2015-04-14 | 2015-04-14 | Gesture control method, apparatus, terminal device, and storage medium |
| JP2017553945A JP6598089B2 (ja) | 2015-04-14 | 2015-04-14 | Gesture control method, apparatus, terminal device, and storage medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2015/076536 WO2016165066A1 (zh) | 2015-04-14 | 2015-04-14 | Gesture control method, apparatus, terminal device and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016165066A1 true WO2016165066A1 (zh) | 2016-10-20 |
Family
ID=57125459
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2015/076536 Ceased WO2016165066A1 (zh) | Gesture control method, apparatus, terminal device and storage medium | 2015-04-14 | 2015-04-14 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US10802704B2 (en) |
| EP (1) | EP3276480A4 (en) |
| JP (1) | JP6598089B2 (en) |
| CN (1) | CN106415472B (en) |
| WO (1) | WO2016165066A1 (en) |
Families Citing this family (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105320436A (zh) * | 2015-07-07 | 2016-02-10 | 崔景城 | Method for triggering a screenshot by tapping the screen with a finger joint |
| JP6819423B2 (ja) * | 2017-04-04 | 2021-01-27 | 富士ゼロックス株式会社 | Wireless communication device |
| CN109408263A (zh) * | 2018-09-27 | 2019-03-01 | 惠州Tcl移动通信有限公司 | Method for stable interface switching, storage medium, and intelligent terminal |
| CN110275665A (zh) * | 2019-05-23 | 2019-09-24 | 深圳龙图腾创新设计有限公司 | Touch screen operation method, electronic device, and storage medium |
| CN110502153B (zh) * | 2019-08-30 | 2022-11-11 | Oppo(重庆)智能科技有限公司 | Method for adjusting the touch report rate of a touch screen, touch screen, electronic device, and storage medium |
| CN111311489B (zh) * | 2020-01-17 | 2023-07-04 | 维沃移动通信有限公司 | Image cropping method and electronic device |
| CN113805487B (zh) * | 2020-07-23 | 2022-09-23 | 荣耀终端有限公司 | Control instruction generation method and apparatus, terminal device, and readable storage medium |
| CN112532226B (zh) * | 2020-11-05 | 2024-05-10 | 广东瑞德智能科技股份有限公司 | Novel touch human-computer interaction processing method |
| CN112445410B (zh) * | 2020-12-07 | 2023-04-18 | 北京小米移动软件有限公司 | Touch event recognition method and apparatus, and computer-readable storage medium |
| CN114911401B (zh) * | 2021-02-08 | 2024-06-25 | 华为技术有限公司 | Electronic device, method for analyzing its touch operations, and readable medium |
| KR102436970B1 (ko) * | 2021-02-16 | 2022-08-26 | 서울과학기술대학교 산학협력단 | Gesture detection apparatus and method based on sound machine learning |
| EP4287005A4 (en) * | 2021-04-06 | 2024-08-07 | Samsung Electronics Co., Ltd. | ELECTRONIC DEVICE FOR IMPLEMENTING A CAPTURE FUNCTION AND METHOD FOR OPERATING AN ELECTRONIC DEVICE |
| KR20230015785A (ko) * | 2021-07-23 | 2023-01-31 | 삼성전자주식회사 | Electronic device and control method therefor |
| CN116225274A (zh) * | 2023-04-28 | 2023-06-06 | 荣耀终端有限公司 | Touch operation recognition method and apparatus, electronic device, and storage medium |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102799292A (zh) * | 2011-05-24 | 2012-11-28 | 联想(北京)有限公司 | Touch control method and apparatus, and electronic device |
| CN103809882A (zh) * | 2012-11-07 | 2014-05-21 | 联想(北京)有限公司 | Information processing method, electronic device, and touch input apparatus |
| CN104049759A (zh) * | 2014-06-25 | 2014-09-17 | 华东理工大学 | Instruction input and protection method combining touch screen and behavior sensing |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008192092A (ja) | 2007-02-08 | 2008-08-21 | Fuji Xerox Co Ltd | Touch panel device, information processing device, and program |
| JP5160337B2 (ja) * | 2008-08-11 | 2013-03-13 | ソニーモバイルコミュニケーションズ, エービー | Input processing device, input processing method, input processing program, and portable terminal device |
| US8633901B2 (en) * | 2009-01-30 | 2014-01-21 | Blackberry Limited | Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device |
| JP5091180B2 (ja) * | 2009-03-27 | 2012-12-05 | ソニーモバイルコミュニケーションズ, エービー | Portable terminal device |
| US8836648B2 (en) * | 2009-05-27 | 2014-09-16 | Microsoft Corporation | Touch pull-in gesture |
| JP2011014044A (ja) | 2009-07-03 | 2011-01-20 | Sony Corp | Operation control device, operation control method, and computer program |
| JP2012058856A (ja) * | 2010-09-06 | 2012-03-22 | Sony Corp | Information processing device, information processing method, and information processing program |
| JP5352619B2 (ja) | 2011-04-13 | 2013-11-27 | 株式会社日本自動車部品総合研究所 | Operation input device |
| KR101858608B1 (ko) * | 2011-10-28 | 2018-05-17 | 엘지전자 주식회사 | Mobile terminal and control method of the mobile terminal |
| JP6021335B2 (ja) | 2011-12-28 | 2016-11-09 | 任天堂株式会社 | Information processing program, information processing device, information processing system, and information processing method |
| US20130342468A1 (en) * | 2012-06-20 | 2013-12-26 | Chimei Innolux Corporation | Method for determining touch location on a touch panel and touch panel module |
| JP5460793B2 (ja) * | 2012-08-21 | 2014-04-02 | シャープ株式会社 | Display device, display method, television receiver, and display control device |
| KR20140113119A (ko) * | 2013-03-15 | 2014-09-24 | 엘지전자 주식회사 | Electronic device and control method thereof |
| JP6132644B2 (ja) * | 2013-04-24 | 2017-05-24 | キヤノン株式会社 | Information processing device, display control method, computer program, and storage medium |
| KR101474467B1 (ko) * | 2013-07-09 | 2014-12-19 | 엘지전자 주식회사 | Mobile terminal and control method therefor |
| JP2015060455A (ja) * | 2013-09-19 | 2015-03-30 | シャープ株式会社 | Electronic device, control method, and program |
| KR101804082B1 (ko) * | 2013-12-12 | 2017-12-28 | 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 | Visualization of object sizes |
-
2015
- 2015-04-14 WO PCT/CN2015/076536 patent/WO2016165066A1/zh not_active Ceased
- 2015-04-14 JP JP2017553945A patent/JP6598089B2/ja active Active
- 2015-04-14 EP EP15888771.1A patent/EP3276480A4/en not_active Ceased
- 2015-04-14 CN CN201580029659.2A patent/CN106415472B/zh active Active
- 2015-04-14 US US15/566,582 patent/US10802704B2/en active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102799292A (zh) * | 2011-05-24 | 2012-11-28 | 联想(北京)有限公司 | Touch control method and apparatus, and electronic device |
| CN103809882A (zh) * | 2012-11-07 | 2014-05-21 | 联想(北京)有限公司 | Information processing method, electronic device, and touch input apparatus |
| CN104049759A (zh) * | 2014-06-25 | 2014-09-17 | 华东理工大学 | Instruction input and protection method combining touch screen and behavior sensing |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3276480A4 * |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6598089B2 (ja) | 2019-10-30 |
| CN106415472A (zh) | 2017-02-15 |
| EP3276480A1 (en) | 2018-01-31 |
| US10802704B2 (en) | 2020-10-13 |
| EP3276480A4 (en) | 2018-05-02 |
| CN106415472B (zh) | 2020-10-09 |
| JP2018511892A (ja) | 2018-04-26 |
| US20180095657A1 (en) | 2018-04-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2016165066A1 (zh) | Gesture control method, apparatus, terminal device and storage medium | |
| US11809702B2 (en) | Modeless augmentations to a virtual trackpad on a multiple screen computing device | |
| US8390577B2 (en) | Continuous recognition of multi-touch gestures | |
| US8581869B2 (en) | Information processing apparatus, information processing method, and computer program | |
| US20170257559A1 (en) | Movable User Interface Shutter Button for Camera | |
| US9317171B2 (en) | Systems and methods for implementing and using gesture based user interface widgets with camera input | |
| CN103869947B (zh) | Method for controlling an electronic device, and electronic device | |
| WO2021092768A1 (zh) | Touch event processing method and apparatus, mobile terminal, and storage medium | |
| CN108958627A (zh) | Touch operation method and apparatus, storage medium, and electronic device | |
| US20180210597A1 (en) | Information processing device, information processing method, and program | |
| WO2014200550A1 (en) | Disambiguation of indirect input | |
| WO2022007544A1 (zh) | Device control method and apparatus, storage medium, and electronic device | |
| CN106293430A (zh) | Virtual mouse control system and control method thereof | |
| CN108984097B (zh) | Touch operation method and apparatus, storage medium, and electronic device | |
| CN115469786A (zh) | Display device and drawing object selection method | |
| CN116048370A (zh) | Display device and operation switching method | |
| CN111078087A (zh) | Mobile terminal, control mode switching method, and computer-readable storage medium | |
| CN110333780A (zh) | Function triggering method, apparatus, device, and storage medium | |
| US20150153925A1 (en) | Method for operating gestures and method for calling cursor | |
| CN106371595A (zh) | Method for calling up a message notification bar, and mobile terminal | |
| US10241671B2 (en) | Gesture response method and device | |
| CN111930296A (zh) | Electronic device control method and apparatus, and electronic device | |
| US20140035876A1 (en) | Command of a Computing Device | |
| CN111782381A (zh) | Task management method and apparatus, mobile terminal, and storage medium | |
| US9778822B2 (en) | Touch input method and electronic apparatus thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15888771; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2017553945; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 15566582; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |