CN111142775A - Gesture interaction method and device - Google Patents

Gesture interaction method and device

Info

Publication number
CN111142775A
CN111142775A (Application CN201911387227.5A)
Authority
CN
China
Prior art keywords
touch screen
touch
screen
touch point
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911387227.5A
Other languages
Chinese (zh)
Inventor
王友位
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201911387227.5A priority Critical patent/CN111142775A/en
Publication of CN111142775A publication Critical patent/CN111142775A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a gesture interaction method and device, relating to the technical field of electronic equipment. The specific implementation scheme is as follows. Step A: when a pointer object is detected performing a touch operation in a target area of a touch screen, generate a virtual touch point at a set position in the touch screen. Step B: while the pointer object slides on the touch screen, control the virtual touch point to move along with the pointer object. Step C: when the pointer object is detected leaving the touch screen, perform a single click operation at the current position of the virtual touch point. The method enables a user holding the device with one hand to click most of the screen, or any position on it, with a single press-slide-lift operation, making one-handed operation more convenient.

Description

Gesture interaction method and device
Technical Field
The invention relates to the technical field of electronic equipment, in particular to a gesture interaction method and device.
Background Art
When a touch screen device has a large screen and is held in one hand, it is difficult for the user to click areas of the screen far from the fingers, such as the upper part of the screen. The user must shift the device back and forth, or use both hands, to click buttons, icons, links, and other elements in those areas, which limits the usage scenarios of the touch screen device.
Disclosure of Invention
The application provides a gesture interaction method and device for solving the problem that a user is inconvenient to operate with one hand.
The embodiment of the application provides a gesture interaction method, which comprises the following steps:
Step A: when a pointer object is detected performing a touch operation in a target area of the touch screen, generating a virtual touch point at a set position in the touch screen;
Step B: when the pointer object slides on the touch screen, controlling the virtual touch point to move along with the movement of the pointer object;
Step C: when the pointer object is detected leaving the touch screen, performing a single click operation at the position of the virtual touch point at that moment.
In one possible design, the pointer object may be a body part such as a finger, or a touch-sensitive implement such as a stylus.
In one possible design, the target area may be the entire touch screen or one or more partial areas of the touch screen.
In one possible design, the touch operation may be one or more of a short press, a long press, a double tap, a light press, a heavy press, a swipe inward from an edge of the touch screen, or a swipe inward from a corner of the touch screen.
In one possible design, the set position may be a fixed position on the touch screen, such as its midpoint; the position where the pointer object contacts the touch screen; or a position derived from the contact position, such as 3 centimeters above the point of contact.
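As an illustration only, the three kinds of set position described above can be sketched as follows. This is a hypothetical sketch, not from the patent: it assumes screen coordinates in which Y increases downward (so "above" means a smaller Y value), and the names `mode`, `contact`, and `offset_cm` are invented.

```python
def choose_set_position(mode, contact, screen_size, offset_cm=3.0):
    """Return the set position for the virtual touch point.

    mode: "fixed"   -> a fixed position, here the screen midpoint;
          "contact" -> the position where the pointer touched the screen;
          "offset"  -> a position relative to the contact point,
                       here offset_cm above it (Y grows downward).
    """
    if mode == "fixed":
        return (screen_size[0] / 2.0, screen_size[1] / 2.0)
    if mode == "contact":
        return contact
    if mode == "offset":
        return (contact[0], contact[1] - offset_cm)
    raise ValueError("unknown mode: %s" % mode)
```

Any of the three strategies yields a starting position for the virtual touch point; which one is used is a design choice of the implementation.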
In one possible design, the virtual touch point moves as follows: its moving distance along the X axis equals the moving distance of the pointer object along the X axis multiplied by a set first coefficient, and its moving distance along the Y axis equals the moving distance of the pointer object along the Y axis multiplied by a set second coefficient. A moving distance may be positive, representing movement in the positive direction, or negative, representing movement in the negative direction.
In one possible design, the first coefficient and the second coefficient may be equal or unequal.
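The scaled movement described in the two paragraphs above can be sketched as follows; this is an illustrative sketch, not from the patent, and the function name and default coefficients are invented.

```python
def move_virtual_point(point, dx, dy, kx=2.0, ky=2.0):
    """Move the virtual touch point when the pointer object moves by
    (dx, dy): each axis is scaled by its own coefficient, and a negative
    distance represents movement in the negative direction."""
    x, y = point
    return (x + kx * dx, y + ky * dy)
```

For example, with both coefficients set to 2, a pointer movement of 2 cm by 3 cm moves the virtual touch point 4 cm by 6 cm; unequal coefficients simply scale the two axes differently.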
Preferably, so that the user can determine the exact position of the virtual touch point, a visual cursor is displayed at that position.
Preferably, when a clickable element is detected at the position of the virtual touch point in the screen, the clickable element displays a floating state visualization effect.
Optionally, the device is controlled to generate vibration feedback when the virtual touch point moves to the position of a clickable element on the screen.
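The hover effect and vibration feedback described above presuppose a hit test of the virtual touch point against clickable elements. A minimal hypothetical sketch (the element representation and the callback names are invented; a real implementation would use the platform's view hierarchy and haptics API):

```python
def hit_test(elements, point):
    """Return the first clickable element whose rectangle contains the
    point, or None. Each element is a dict with "x", "y", "w", "h"
    in the same units as the point coordinates."""
    px, py = point
    for el in elements:
        if el["x"] <= px <= el["x"] + el["w"] and el["y"] <= py <= el["y"] + el["h"]:
            return el
    return None

def on_virtual_point_moved(elements, point, show_hover, vibrate):
    """When the virtual touch point enters a clickable element, show the
    floating-state effect and trigger vibration feedback (both callbacks
    are platform-specific and hypothetical here)."""
    el = hit_test(elements, point)
    if el is not None:
        show_hover(el)
        vibrate()
    return el
```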
The embodiment of the present application further provides a gesture interaction device, including:
the detection unit is used for detecting the touch operation of the pointer object on the touch screen and the touch position of the pointer object on the screen;
and the processing unit is used for calculating the position of the virtual touch point and executing click operation.
The embodiment of the present invention further provides an electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the gesture interaction method according to the present invention when executing the program.
One embodiment in the above application has the following advantage or benefit: while holding the device with one hand, the user can click most of the screen, or any position on it, with a single press-slide-lift operation, making one-handed operation more convenient.
Drawings
FIG. 1 is a flow chart of a gesture interaction method according to an embodiment of the present application;
FIG. 2 is a diagram of an application scenario of a possible gesture interaction provided by an embodiment of the present application;
FIG. 3 is a diagram of an application scenario illustrating another possible gesture interaction provided by an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating the correspondence between the movement of the virtual touch point and the movement of the pointer object in the embodiment of the present application;
FIG. 5 is a schematic diagram of a composition structure of a gesture interaction apparatus according to the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of those embodiments to aid understanding; these details are to be considered exemplary only. Those of ordinary skill in the art will accordingly recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present application. Descriptions of well-known functions and constructions are likewise omitted for clarity and conciseness.
The application provides a gesture interaction method, as shown in fig. 1, including:
s101: when a pointer object is detected to perform touch operation in a target area in a touch screen, generating a virtual touch point at a set position in the touch screen;
s102: when the pointer object slides on the touch screen, controlling the virtual touch point to move along with the movement of the pointer object;
s103: and when the pointer object is detected to be separated from the touch screen, executing one-time clicking operation at the position of the virtual touch point at the moment.
As a specific example of the present invention, FIG. 2 shows a possible application scenario, where 201 is the pointer object, in this specific example a finger; 202 is the touch screen; 204 is the position where the finger touches the screen, and in this specific example the virtual touch point is generated at that position; a visual cursor is displayed at the virtual touch point, in this specific example a circle; 205 is a clickable element, including but not limited to a button, an icon, a switch, a text link, or a picture link, in this specific example an icon.
When the finger moves from position 204 to a new position 208, the virtual touch point moves from position 204 to a new position 207. The finger moves 2 cm along the X axis and 3 cm along the Y axis; in this specific example the first and second coefficients are both 2, so the virtual touch point moves 4 cm along the X axis and 6 cm along the Y axis.
The clickable element 206 at position 207 of the moved virtual touch point displays a floating state visualization effect, in this specific example changing its color to white.
When the finger leaves the screen, a click operation is performed on the clickable element 206.
FIG. 3 shows another possible application scenario: when a finger touches the screen at position 301, the virtual touch point is generated at a fixed screen position 302;
when the finger moves from position 301 to position 303, the virtual touch point moves from position 302 to position 304. The finger moves -1 cm along the X axis and 2 cm along the Y axis; in this specific example the first and second coefficients are both 1.5, so the virtual touch point moves -1.5 cm along the X axis and 3 cm along the Y axis.
After the movement, a clickable element is located at position 304 of the virtual touch point; the clickable element displays a floating state visualization effect, in this specific example changing its color to white, and the device simultaneously generates vibration feedback.
When the finger leaves the screen, a click operation is performed on the clickable element.
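The arithmetic in the two scenarios above can be checked directly; this is an illustrative sketch using the coefficients stated in each example.

```python
# FIG. 2 scenario: the finger moves (2, 3) cm and both coefficients are 2,
# so the virtual touch point moves (4, 6) cm.
kx = ky = 2.0
assert (kx * 2.0, ky * 3.0) == (4.0, 6.0)

# FIG. 3 scenario: the finger moves (-1, 2) cm and both coefficients are 1.5,
# so the virtual touch point moves (-1.5, 3) cm; the negative X distance
# represents movement in the negative X direction.
kx = ky = 1.5
assert (kx * -1.0, ky * 2.0) == (-1.5, 3.0)
```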
The correspondence between the moving distance of the virtual touch point and that of the pointer object is shown in FIG. 4. When the pointer object moves from 403 to 404, the virtual touch point moves from 407 to 408. The distance the pointer object moves along the X axis is 401, denoted ΔX1; the distance it moves along the Y axis is 402, denoted ΔY1; the distance the virtual touch point moves along the X axis is 405, denoted ΔX2; the distance it moves along the Y axis is 406, denoted ΔY2. The first coefficient is denoted kx and the second coefficient ky. These quantities satisfy:
ΔX2 = kx × ΔX1
ΔY2 = ky × ΔY1
Given the coordinates of 407 together with ΔX2 and ΔY2, the coordinates of position 408 are obtained.
An embodiment of the present application provides a gesture interaction apparatus, as shown in FIG. 5. The gesture interaction apparatus 501 can be used to execute the gesture interaction method described in FIG. 1, and includes:
a detecting unit 502, configured to detect the touch operation of the pointer on the touch screen and a touch position of the pointer on the screen;
and the processing unit 503 is configured to calculate a position of the virtual touch point and perform a click operation.
The apparatus in the above embodiments may be a terminal device, or may be a chip applied in the terminal device, or other combined devices and components having the above terminal function.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another via a wired link (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or a wireless link (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium accessible to a computer, or a data storage device such as a server or data center that incorporates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., digital video disc (DVD)), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
Those of ordinary skill in the art will appreciate that the various illustrative elements and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (10)

1. A gesture interaction method, comprising:
Step A: when a pointer object is detected performing a touch operation in a target area of the touch screen, generating a virtual touch point at a set position in the touch screen;
Step B: when the pointer object slides on the touch screen, controlling the virtual touch point to move along with the movement of the pointer object;
Step C: when the pointer object is detected leaving the touch screen, performing a single click operation at the position of the virtual touch point at that moment.
2. The method of claim 1, wherein: the pointer object may be a body part such as a finger, or a touch-sensitive implement such as a stylus.
3. The method of claim 1, wherein: the target area may be the entire touch screen or one or more partial areas of the touch screen.
4. The method of claim 1, wherein: the touch operation may be one or more of a short press, a long press, a double tap, a light press, a heavy press, a swipe inward from an edge of the touch screen, or a swipe inward from a corner of the touch screen.
5. The method of claim 1, wherein: the set position may be a fixed position on the touch screen; the position where the pointer object contacts the touch screen; or a position derived from the position where the pointer object contacts the touch screen.
6. The method of claim 1, comprising:
the virtual touch point moves as follows: its moving distance along the X axis equals the moving distance of the pointer object along the X axis multiplied by a set first coefficient, and its moving distance along the Y axis equals the moving distance of the pointer object along the Y axis multiplied by a set second coefficient; a moving distance may be positive, representing movement in the positive direction, or negative, representing movement in the negative direction.
7. The method of claim 1, wherein: and displaying a visual cursor at the position of the virtual touch point.
8. The method of claim 1, wherein: when a clickable element is detected at the position of the virtual touch point in the screen, the clickable element displays a floating state visualization effect.
9. A gesture interaction apparatus, comprising:
the detection unit is used for detecting the touch operation of the pointer object on the touch screen and the touch position of the pointer object on the screen;
and the processing unit is used for calculating the position of the virtual touch point and executing click operation.
10. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 8 when executing the program.
CN201911387227.5A 2019-12-27 2019-12-27 Gesture interaction method and device Pending CN111142775A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911387227.5A CN111142775A (en) 2019-12-27 2019-12-27 Gesture interaction method and device

Publications (1)

Publication Number Publication Date
CN111142775A (en) 2020-05-12

Family

ID=70521369

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911387227.5A Pending CN111142775A (en) 2019-12-27 2019-12-27 Gesture interaction method and device

Country Status (1)

Country Link
CN (1) CN111142775A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022156213A1 (en) * 2021-01-22 2022-07-28 歌尔股份有限公司 Gesture-based display interface control method and apparatus, device and storage medium
CN115665313A (en) * 2021-07-09 2023-01-31 华为技术有限公司 Device control method and electronic device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092518A (en) * 2013-01-24 2013-05-08 福建升腾资讯有限公司 Moving cloud desktop accurate touch method based on remote desktop protocol (RDP)
CN104281395A (en) * 2013-07-08 2015-01-14 联想(北京)有限公司 Mutual information processing method and electronic device
US20160239172A1 (en) * 2015-02-13 2016-08-18 Here Global B.V. Method, apparatus and computer program product for calculating a virtual touch position
CN108491152A (en) * 2018-02-11 2018-09-04 李帆 Touch screen terminal control method, terminal and medium based on virtual cursor

Similar Documents

Publication Publication Date Title
KR101328202B1 (en) Method and apparatus for running commands performing functions through gestures
US8370772B2 (en) Touchpad controlling method and touch device using such method
JP4372188B2 (en) Information processing apparatus and display control method
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
EP2508972A2 (en) Portable electronic device and method of controlling same
US9423953B2 (en) Emulating pressure sensitivity on multi-touch devices
JP2014241139A (en) Virtual touchpad
JPWO2013094371A1 (en) Display control apparatus, display control method, and computer program
KR20140038568A (en) Multi-touch uses, gestures, and implementation
KR102228335B1 (en) Method of selection of a portion of a graphical user interface
KR20150092672A (en) Apparatus and Method for displaying plural windows
TWI482064B (en) Portable device and operating method thereof
WO2014006806A1 (en) Information processing device
JP2015043135A (en) Information processor
JP5991320B2 (en) Input device, image display method and program
CN111142775A (en) Gesture interaction method and device
JP2014191560A (en) Input device, input method, and recording medium
JP6411067B2 (en) Information processing apparatus and input method
EP3433713B1 (en) Selecting first digital input behavior based on presence of a second, concurrent, input
KR20130102670A (en) For detailed operation of the touchscreen handset user-specific finger and touch pen point contact location method and system for setting
TWI413920B (en) Computer cursor control system
US20150091831A1 (en) Display device and display control method
JP6106973B2 (en) Information processing apparatus and program
KR101165388B1 (en) Method for controlling screen using different kind of input devices and terminal unit thereof
JP6344355B2 (en) Electronic terminal, and control method and program thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination