CN110727345B - Method and system for realizing man-machine interaction through finger intersection movement - Google Patents


Info

Publication number
CN110727345B
CN110727345B (application CN201910886587.3A)
Authority
CN
China
Prior art keywords
initial
critical path
screen
target
gesture image
Prior art date
Legal status
Active
Application number
CN201910886587.3A
Other languages
Chinese (zh)
Other versions
CN110727345A (en)
Inventor
辛承才
李健
段家喜
Current Assignee
Beijing Ned+ Ar Display Technology Co ltd
Original Assignee
Beijing Ned+ Ar Display Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Ned+ Ar Display Technology Co ltd
Priority to CN201910886587.3A
Publication of CN110727345A
Application granted
Publication of CN110727345B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention provides a method and a system for realizing man-machine interaction through finger intersection movement. The method comprises the following steps: acquiring an initial gesture image and a target gesture image, both of which contain a preset gesture of the user; performing feature point analysis on the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial gesture image to the target gesture image; and obtaining the moving distance of the cursor on the screen according to the screen resolution and the change rate, and controlling the cursor to move on the screen accordingly. With this method, AR glasses can be operated through gestures alone: only a small crossing movement of the thumb and index finger is needed, without large-amplitude hand waving or five-finger shape changes.

Description

Method and system for realizing man-machine interaction through finger intersection movement
Technical Field
The invention relates to the technical field of computers, in particular to a method and a system for realizing man-machine interaction through finger intersection movement.
Background
AR/VR glasses have no traditional physical touchpad. To keep AR/VR devices convenient for long sessions, avoiding additional interaction hardware is a necessary direction, so head control and gesture operation are the most direct and natural interaction modes.
However, traditional head control and gesture operation involve relatively large motions and are therefore unsuited to prolonged use. If a user performs head control or large-amplitude gestures in a public place, the motions can intrude on other people's space and affect public safety; in addition, because users' motions vary widely, they can easily make bystanders uncomfortable.
Disclosure of Invention
In view of these problems, an embodiment of the present invention provides a method and a system for realizing man-machine interaction through finger intersection movement.
In a first aspect, an embodiment of the present invention provides a method for implementing human-computer interaction by moving a finger intersection, including:
acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
performing feature point analysis on the initial gesture image and the target gesture image, and acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image;
and obtaining the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen.
Preferably, the preset gesture is that any two fingers of the user intersect.
Preferably, the acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image specifically includes:
performing feature point analysis on the initial gesture image to obtain initial critical path lengths of any two fingers, and performing feature point analysis on the target gesture image to obtain target critical path lengths of any two fingers;
acquiring a first change rate according to a first initial critical path length in the initial critical path lengths, a second initial critical path length in the initial critical path lengths, a first target critical path length in the target critical path lengths and a second target critical path length in the target critical path lengths;
and acquiring a second change rate according to a third initial critical path length in the initial critical path lengths, a fourth initial critical path length in the initial critical path lengths, a third target critical path length in the target critical path lengths and a fourth target critical path length in the target critical path lengths.
Preferably, the first rate of change is specifically obtained by:
Δrx = a₁p₁/a₁b₁ - a₂p₂/a₂b₂,
wherein Δrx represents the first rate of change, a₁p₁ represents the first initial critical path length, a₁b₁ represents the second initial critical path length, a₂p₂ represents the first target critical path length, and a₂b₂ represents the second target critical path length.
Preferably, the second rate of change is specifically obtained by:
Δry = m₁p₁/m₁n₁ - m₂p₂/m₂n₂,
wherein Δry represents the second rate of change, m₁p₁ represents the third initial critical path length, m₁n₁ represents the fourth initial critical path length, m₂p₂ represents the third target critical path length, and m₂n₂ represents the fourth target critical path length.
Preferably, the obtaining the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen, specifically includes:
acquiring the moving distance of a cursor on the screen in the X-axis direction according to the resolution of the screen and the first change rate;
and acquiring the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second change rate.
Preferably, the acquiring, according to the resolution of the screen and the first rate of change, a moving distance of a cursor on the screen in an X-axis direction specifically includes:
Δx = R_x*Δrx,
wherein Δx represents the moving distance of the cursor on the screen in the X-axis direction, R_x represents the lateral resolution of the screen, and Δrx represents the first rate of change;
the step of obtaining the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second change rate specifically includes:
Δy = R_y*Δry,
wherein Δy represents the moving distance of the cursor on the screen in the Y-axis direction, R_y represents the longitudinal resolution of the screen, and Δry represents the second rate of change.
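The two change-rate formulas and the resolution scaling above can be sketched in code as follows. This is a minimal illustration; the `CriticalPaths` type and the function names are ours, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class CriticalPaths:
    """The four critical path lengths measured in one gesture image.
    Names follow the patent's notation: a-p and a-b along one finger,
    m-p and m-n along the other (p is the crossing point)."""
    ap: float  # first critical path length (a to crossing point p)
    ab: float  # second critical path length (whole a-b path)
    mp: float  # third critical path length (m to crossing point p)
    mn: float  # fourth critical path length (whole m-n path)

def change_rates(initial: CriticalPaths, target: CriticalPaths):
    """Return (Δrx, Δry) as defined by the two formulas above."""
    drx = initial.ap / initial.ab - target.ap / target.ab
    dry = initial.mp / initial.mn - target.mp / target.mn
    return drx, dry

def cursor_delta(drx: float, dry: float, r_x: int = 1920, r_y: int = 1080):
    """Scale the change rates by the screen resolution:
    Δx = R_x*Δrx and Δy = R_y*Δry."""
    return r_x * drx, r_y * dry
```

A small crossing-point shift thus translates directly into a proportional, resolution-scaled cursor movement.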
In a second aspect, an embodiment of the present invention provides a system for implementing human-computer interaction by moving a finger intersection, including:
the gesture detection device comprises an acquisition module, a gesture detection module and a gesture detection module, wherein the acquisition module is used for acquiring an initial gesture image and a target gesture image, and the initial gesture image and the target gesture image both comprise preset gestures of a user;
the change rate module is used for analyzing characteristic points of the initial gesture image and the target gesture image and acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image;
and the moving module is used for acquiring the moving distance of the cursor on the screen according to the resolution ratio and the change rate of the screen and controlling the cursor to move on the screen.
In a third aspect, an embodiment of the present invention provides an interaction device, which is an AR device or a VR device comprising the system for implementing human-computer interaction by finger intersection movement provided in the second aspect.
In a fourth aspect, an embodiment of the present invention provides an electronic device, including:
at least one processor, at least one memory, a communication interface, and a bus; wherein,
the processor, the memory and the communication interface complete the communication with each other through the bus;
the communication interface is used for information transmission between the test equipment and the communication equipment of the display device;
the memory stores program instructions executable by the processor, and the processor invokes the program instructions to perform a method for implementing human-computer interaction through finger cross point movement provided in the first aspect.
According to the method for realizing man-machine interaction through finger intersection movement provided by the embodiment of the invention, AR/VR glasses interaction can be realized through gesture operation alone: only the crossing movement of the thumb and index finger is needed, without large-amplitude hand waving or five-finger shape changes. The movement range of the arms and fingers is small, so the method is suitable for long-time operation, and high-precision interaction control can be realized through an accurate algorithm.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention or of the prior art, the drawings needed for the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present invention; a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flowchart of a method for implementing man-machine interaction by finger cross point movement according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a gesture of crossing a thumb and an index finger as a preset gesture in an embodiment of the present invention;
FIG. 3 is a schematic diagram of an initial gesture image according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a target gesture image according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of on-screen cursor movement in an embodiment of the present invention;
FIG. 6 is a schematic diagram of a system for implementing man-machine interaction by finger cross point movement according to an embodiment of the present invention;
fig. 7 is a schematic entity structure diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are some, but not all, embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without inventive effort fall within the scope of the present invention.
Fig. 1 is a flowchart of a method for implementing man-machine interaction by finger intersection movement according to an embodiment of the present invention, where, as shown in fig. 1, the method includes:
s1, acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
s2, analyzing feature points of the initial gesture image and the target gesture image, and acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image;
s3, according to the resolution ratio and the change rate of the screen, obtaining the moving distance of the cursor on the screen, and controlling the cursor to move on the screen.
Specifically, the executing body in the embodiment of the present invention is AR glasses or VR glasses; AR glasses are used as the example below. The preset gesture is a gesture specified in advance in which any two fingers intersect. Fig. 2 is a schematic diagram of the gesture that uses the intersection of the thumb and the index finger as the preset gesture.
When the AR glasses are used for human-computer interaction, the user wears the AR glasses, on one of whose frames a miniature camera is mounted. The miniature camera, which can be an RGB camera, an infrared camera or a depth camera, captures images of the user's moving gesture. In the embodiment of the invention, the user's gesture moves from an initial position to a target position; the miniature camera captures the initial gesture image at the initial position and the target gesture image at the target position, and the change rate of the user's gesture from the initial position to the target position is obtained by performing feature point analysis on the two captured images. The change rate can be the change of the gesture in the X-axis and Y-axis directions, or a change of angle and length. From this change rate, combined with the resolution of the screen, the moving distance of the cursor can be obtained, and the cursor can be controlled to move on the screen.
According to the method for realizing man-machine interaction through finger intersection movement, AR glasses interaction can be realized through gesture operation, only the cross movement of the thumb and the index finger is needed, large-amplitude gesture swing and shape transformation of the five fingers are not needed, the moving range of the arms and the fingers is small, the method is suitable for long-time operation, and high-precision interaction control can be realized through an accurate algorithm.
On the basis of the foregoing embodiment, preferably, the acquiring a rate of change of the preset gesture from the initial gesture image to the target gesture image specifically includes:
performing feature point analysis on the initial gesture image to obtain initial critical path lengths of any two fingers, and performing feature point analysis on the target gesture image to obtain target critical path lengths of any two fingers;
acquiring the first change rate according to a first initial critical path length in the initial critical path lengths, a second initial critical path length in the initial critical path lengths, a first target critical path length in the target critical path lengths and a second target critical path length in the target critical path lengths;
and acquiring the second change rate according to a third initial critical path length in the initial critical path lengths, a fourth initial critical path length in the initial critical path lengths, a third target critical path length in the target critical path lengths and a fourth target critical path length in the target critical path lengths.
In the embodiment of the present invention, the rate of change comprises a first rate of change of the user gesture in the X-axis direction and a second rate of change in the Y-axis direction.
The AR glasses generally project the picture or video to be displayed onto a screen. When the user's gesture moves from the initial position to the target position, the distance the cursor on the screen should move must be determined correspondingly: from the resolution of the AR glasses' display screen and the first rate of change, the moving distance of the cursor in the X-axis direction can be obtained; from the resolution and the second rate of change, the moving distance in the Y-axis direction can be obtained; and the cursor can then be controlled to move the corresponding distance from its initial position on the screen.
FIG. 3 is a schematic diagram of an initial gesture image according to an embodiment of the present invention. As shown in FIG. 3, by performing feature point analysis on the initial gesture image, the two critical path lengths and the position of the crossing point in the initial gesture image can be obtained; the two critical paths are a₁b₁ and m₁n₁, and the crossing point is p₁. From these two critical paths and the position of the crossing point, the first initial critical path length a₁p₁, the second initial critical path length a₁b₁, the third initial critical path length m₁p₁ and the fourth initial critical path length m₁n₁ can be obtained.
Similarly, FIG. 4 is a schematic diagram of a target gesture image in the embodiment of the present invention. As shown in FIG. 4, by performing feature point analysis on the target gesture image, the two critical path lengths and the position of the crossing point in the target gesture image can be obtained; the two critical paths are a₂b₂ and m₂n₂, and the crossing point is p₂. From these two critical paths and the position of the crossing point, the first target critical path length a₂p₂, the second target critical path length a₂b₂, the third target critical path length m₂p₂ and the fourth target critical path length m₂n₂ can be obtained.
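Assuming the feature point analysis has already produced the endpoints a, b, m, n of the two critical paths and the crossing point p as 2-D pixel coordinates (the detection step itself is outside this sketch, and the function name is ours), the four path lengths are plain Euclidean distances:

```python
import math

def path_lengths(a, b, m, n, p):
    """Given the endpoints of the two critical paths (a-b and m-n) and the
    crossing point p, all as (x, y) pixel coordinates, return the four
    critical path lengths (ap, ab, mp, mn)."""
    return (math.dist(a, p), math.dist(a, b),
            math.dist(m, p), math.dist(m, n))
```

Applying this once to the initial image and once to the target image yields the inputs for the Δrx and Δry formulas.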
Specifically, the obtaining, according to the resolution of the screen and the first rate of change, the moving distance of the cursor on the screen in the X-axis direction specifically includes:
Δx = R_x*Δrx,
wherein Δx represents the moving distance of the cursor on the screen in the X-axis direction, R_x represents the lateral resolution of the screen, and Δrx represents the first rate of change.
Specifically, the obtaining, according to the resolution of the display screen and the second rate of change, a movement distance of a cursor on the screen in a Y-axis direction specifically includes:
Δy = R_y*Δry,
wherein Δy represents the moving distance of the cursor on the screen in the Y-axis direction, R_y represents the longitudinal resolution of the screen, and Δry represents the second rate of change.
Fig. 5 is a schematic diagram of cursor movement on a screen according to an embodiment of the present invention. As shown in Fig. 5, assume the resolution of the AR glasses screen is 1920×1080 pixels: 1920 pixels horizontally and 1080 pixels vertically.
The process of the preset gesture moving from the initial gesture image to the target gesture image controls the cursor on the screen to move from S₀ to S₁. Let S denote the coordinate of the cursor on the screen, S = (x, y), with 0 ≤ x ≤ 1920 and 0 ≤ y ≤ 1080. S₀(x₀, y₀) is the initial position of the cursor; S₀ can be any point, corresponding to the initial position of the user's preset gesture. S₁(x₁, y₁) is the point where the cursor is located when the user's gesture is at the target position.
Defining ap/ab as the scaling factor rx of point p on ab, and mp/mn as the scaling factor ry of point p on mn, the change from the initial position to the target position can be described by the difference of the two scaling factors:
Δrx = a₁p₁/a₁b₁ - a₂p₂/a₂b₂;
Δry = m₁p₁/m₁n₁ - m₂p₂/m₂n₂;
then, the moving amount (deltax, deltay) of the cursor on the screen and the changing rate (deltarx, deltary) of the preset gesture are in one-to-one correspondence through an algorithm, and the relation between the two is as follows:
Δx=1920*Δrx;
Δy=1080*Δry;
then, the cursor is moved from S on the screen 0 (x 0 ,y 0 ) Move to S 1 (x 1 ,y 1 ) The coordinate correspondence relationship is as follows:
x 1 =x 0 +Δx;
y 1 =y 0 +Δy;
S 1 =(x 0 +Δx,y 0 +Δy);
the variation of defining the S point on the screen can be expressed by the variation of the coordinates x and y in the two points, where x is a variation Δx and y is a variation Δy.
Thus S 1 =(x 0 +1920*(a 1 p 1 /a 1 b 1 -a 2 p 2 /a 2 b 2 ),y 0 +1080*(m 1 p 1 /m 1 n 1 -m 2 p 2 /m 2 n 2 ))。
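Putting the pieces together, the full mapping from measured path lengths to the new cursor position on the 1920×1080 screen can be sketched as follows. The function name is ours, and clamping the result to the screen bounds is our assumption; the patent does not specify what happens when S₁ would leave the screen.

```python
def next_cursor(s0, initial, target, r_x=1920, r_y=1080):
    """Map one gesture step to the next cursor position.

    s0      -- (x0, y0) current cursor position
    initial -- (a1p1, a1b1, m1p1, m1n1) initial critical path lengths
    target  -- (a2p2, a2b2, m2p2, m2n2) target critical path lengths
    """
    a1p1, a1b1, m1p1, m1n1 = initial
    a2p2, a2b2, m2p2, m2n2 = target
    drx = a1p1 / a1b1 - a2p2 / a2b2   # first rate of change
    dry = m1p1 / m1n1 - m2p2 / m2n2   # second rate of change
    x0, y0 = s0
    # Clamp to [0, resolution] so the cursor stays on screen (our assumption).
    x1 = min(max(x0 + r_x * drx, 0), r_x)
    y1 = min(max(y0 + r_y * dry, 0), r_y)
    return x1, y1
```

Calling this once per captured frame pair moves the cursor incrementally, matching the S₁ expression above.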
In addition, in the embodiment of the present invention the movement of the cursor is controlled by the movement of the user's gesture. When the cursor reaches the desired position, other operations can be triggered in further ways, for example clicking: keeping the cursor at a position for a certain time can be interpreted as a click, or extending an additional finger can start a click and retracting it can end the click. After a click is started, moving the finger intersection can drag the whole image or the target. Such operations can be combined according to actual requirements.
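The dwell-to-click behaviour described above could be realized with a small state holder like the sketch below. The class name, the one-second dwell time and the ten-pixel jitter radius are illustrative assumptions, not values from the patent.

```python
import math
import time

class DwellClicker:
    """Report a click when the cursor stays within `radius` pixels of an
    anchor point for at least `dwell_s` seconds (both values assumed)."""

    def __init__(self, dwell_s: float = 1.0, radius: float = 10.0):
        self.dwell_s = dwell_s
        self.radius = radius
        self._anchor = None   # position where the current dwell began
        self._since = 0.0     # timestamp of the dwell start

    def update(self, pos, now=None) -> bool:
        """Feed the latest cursor position; returns True when a click fires."""
        now = time.monotonic() if now is None else now
        if self._anchor is None or math.dist(pos, self._anchor) > self.radius:
            # Cursor moved away: restart the dwell timer at the new anchor.
            self._anchor, self._since = pos, now
            return False
        if now - self._since >= self.dwell_s:
            self._since = now  # re-arm so a continued dwell can click again
            return True
        return False
```

Feeding each new cursor position into `update` then yields click events without any extra hardware, in the spirit of the matching operations mentioned above.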
Fig. 6 is a schematic structural diagram of a system for implementing man-machine interaction by finger intersection movement according to an embodiment of the present invention, as shown in fig. 6, the system includes an obtaining module 601, a rate of change module 602, and a moving module 603, where:
the acquiring module 601 is configured to acquire an initial gesture image and a target gesture image, where the initial gesture image and the target gesture image each include a preset gesture of a user;
the change rate module 602 is configured to perform feature point analysis on the initial gesture image and the target gesture image, and obtain a change rate of the preset gesture from the initial gesture image to the target gesture image;
the moving module 603 is configured to obtain a moving distance of a cursor on the screen according to a resolution of the screen and the change rate, and control the cursor to move on the screen.
The acquiring module 601 acquires the initial gesture image and the target gesture image of the user; the change rate module 602 performs feature point analysis on the two images to obtain the change rate of the preset gesture from the initial position to the target position; and the moving module 603 obtains the moving distance of the cursor according to the change rate and the screen resolution, and controls the cursor to move on the screen.
The implementation process of this system embodiment is the same as that of the method embodiment; please refer to the method embodiment for details, which are not repeated here.
The embodiment of the invention also provides an interaction device, which can be an AR device or a VR device, each comprising the above system for realizing man-machine interaction through finger intersection movement.
Fig. 7 is a schematic physical structure diagram of an electronic device according to an embodiment of the present invention. As shown in Fig. 7, the electronic device may include: a processor 710, a communication interface (Communications Interface) 720, a memory 730, and a bus 740, where the processor 710, the communication interface 720 and the memory 730 communicate with each other via the bus 740. The processor 710 may call logic instructions in the memory 730 to perform the following method:
acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
performing feature point analysis on the initial gesture image and the target gesture image, and acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image;
and obtaining the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen.
Further, the logic instructions in the memory 730 described above may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement it without inventive effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A method for implementing human-machine interaction by finger intersection movement, comprising:
acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
performing feature point analysis on the initial gesture image and the target gesture image, and acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image;
acquiring the moving distance of a cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen;
the preset gesture is that any two fingers of the user cross; and acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image in the same coordinate system.
2. The method according to claim 1, wherein the obtaining the rate of change of the preset gesture from the initial gesture image to the target gesture image specifically comprises:
performing feature point analysis on the initial gesture image to obtain initial critical path lengths of any two fingers, and performing feature point analysis on the target gesture image to obtain target critical path lengths of any two fingers;
acquiring a first change rate according to a first initial critical path length in the initial critical path lengths, a second initial critical path length in the initial critical path lengths, a first target critical path length in the target critical path lengths and a second target critical path length in the target critical path lengths;
and acquiring a second change rate according to a third initial critical path length in the initial critical path lengths, a fourth initial critical path length in the initial critical path lengths, a third target critical path length in the target critical path lengths and a fourth target critical path length in the target critical path lengths.
3. The method according to claim 2, characterized in that the first rate of change is specifically obtained by:
Δrx = a₁p₁/a₁b₁ - a₂p₂/a₂b₂,
wherein Δrx represents the first rate of change, a₁p₁ represents the first initial critical path length, a₁b₁ represents the second initial critical path length, a₂p₂ represents the first target critical path length, and a₂b₂ represents the second target critical path length.
4. The method according to claim 2, characterized in that the second rate of change is specifically obtained by:
Δry = m₁p₁/m₁n₁ - m₂p₂/m₂n₂,
wherein Δry represents the second rate of change, m₁p₁ represents the third initial critical path length, m₁n₁ represents the fourth initial critical path length, m₂p₂ represents the third target critical path length, and m₂n₂ represents the fourth target critical path length.
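The ratio arithmetic of claims 3 and 4 can be sketched directly. This is an illustrative, non-authoritative sketch: the function and parameter names (`first_rate_of_change`, `a1p1`, etc.) simply mirror the claim notation and are not from the patent, and the feature-point analysis that measures the critical path lengths is outside its scope.

```python
def first_rate_of_change(a1p1: float, a1b1: float, a2p2: float, a2b2: float) -> float:
    """Delta_rx per claim 3: the ratio a1p1/a1b1 measured in the initial
    gesture image minus the ratio a2p2/a2b2 measured in the target image."""
    return a1p1 / a1b1 - a2p2 / a2b2

def second_rate_of_change(m1p1: float, m1n1: float, m2p2: float, m2n2: float) -> float:
    """Delta_ry per claim 4, built from the third/fourth critical path lengths."""
    return m1p1 / m1n1 - m2p2 / m2n2
```

Because each term is a ratio of lengths measured within a single image, the result is presumably insensitive to the absolute distance between the hand and the camera, which would vary the raw lengths but not their ratios.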
5. The method according to claim 4, wherein the step of obtaining the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen specifically comprises:
acquiring the moving distance of a cursor on the screen in the X-axis direction according to the resolution of the screen and the first change rate;
and acquiring the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second change rate.
6. The method according to claim 5, wherein the step of obtaining the moving distance of the cursor on the screen in the X-axis direction according to the resolution of the screen and the first rate of change comprises:
Δx = Rx * Δrx,
wherein Δx represents the moving distance of the cursor on the screen in the X-axis direction, Rx represents the lateral (horizontal) resolution of the screen, and Δrx represents the first rate of change;
the step of obtaining the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second change rate specifically includes:
Δy = Ry * Δry,
wherein Δy represents the moving distance of the cursor on the screen in the Y-axis direction, Ry represents the longitudinal (vertical) resolution of the screen, and Δry represents the second rate of change.
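The scaling step of claims 5 and 6 can be sketched as a one-line mapping from change rates to pixel displacement. Names here (`cursor_displacement`, `res_x`, `res_y`) are illustrative assumptions, not patent terminology.

```python
def cursor_displacement(res_x: int, res_y: int, d_rx: float, d_ry: float) -> tuple:
    """Claim 6: Delta_x = Rx * Delta_rx and Delta_y = Ry * Delta_ry,
    i.e. a change rate of 1.0 spans the full screen resolution on that axis."""
    return res_x * d_rx, res_y * d_ry
```

For a 1920x1080 screen, change rates of 0.25 horizontally and 0.5 vertically would move the cursor 480 px along the X axis and 540 px along the Y axis.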
7. A system for realizing human-machine interaction through finger intersection movement, comprising:
an acquisition module, configured to acquire an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise a preset gesture of a user;
a change rate module, configured to perform feature point analysis on the initial gesture image and the target gesture image, and to acquire the change rate of the preset gesture from the initial gesture image to the target gesture image;
a moving module, configured to acquire the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and to control the cursor to move on the screen;
wherein the preset gesture is a crossing of any two fingers of the user; and the change rate of the preset gesture from the initial gesture image to the target gesture image is acquired in the same coordinate system.
8. An interactive device, characterized in that the interactive device is an AR device or a VR device comprising the system for realizing human-machine interaction through finger intersection movement according to claim 7.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the method for realizing human-machine interaction through finger intersection movement according to any one of claims 1 to 6.
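Reading claims 1 through 6 together, the full pipeline from measured critical path lengths to cursor displacement can be sketched end to end. This is a hypothetical sketch under stated assumptions: the dictionary keys mirror the claim notation, and the feature-point analysis that produces the four lengths per image is left outside the sketch.

```python
def cursor_update(initial: dict, target: dict, res_x: int, res_y: int) -> tuple:
    """End-to-end sketch of claims 1-6.

    `initial` holds the four critical path lengths measured in the initial
    gesture image, `target` those measured in the target gesture image;
    returns the cursor displacement (Delta_x, Delta_y) in pixels.
    """
    # Claims 3 and 4: change rates as differences of within-image ratios.
    d_rx = initial["a1p1"] / initial["a1b1"] - target["a2p2"] / target["a2b2"]
    d_ry = initial["m1p1"] / initial["m1n1"] - target["m2p2"] / target["m2n2"]
    # Claims 5 and 6: scale each rate by the screen resolution on that axis.
    return res_x * d_rx, res_y * d_ry
```

A change rate of zero on both axes (the crossing point did not move between the two frames) leaves the cursor in place, which matches the claimed behaviour of driving the cursor only by movement of the finger intersection.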
CN201910886587.3A 2019-09-19 2019-09-19 Method and system for realizing man-machine interaction through finger intersection movement Active CN110727345B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910886587.3A CN110727345B (en) 2019-09-19 2019-09-19 Method and system for realizing man-machine interaction through finger intersection movement

Publications (2)

Publication Number Publication Date
CN110727345A CN110727345A (en) 2020-01-24
CN110727345B true CN110727345B (en) 2023-12-26

Family

ID=69219225

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910886587.3A Active CN110727345B (en) 2019-09-19 2019-09-19 Method and system for realizing man-machine interaction through finger intersection movement

Country Status (1)

Country Link
CN (1) CN110727345B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104898972A (en) * 2015-05-19 2015-09-09 青岛海信移动通信技术股份有限公司 Method and equipment for regulating electronic image
CN107077169A (en) * 2014-11-14 2017-08-18 高通股份有限公司 Spatial interaction in augmented reality
CN109828660A (en) * 2018-12-29 2019-05-31 深圳云天励飞技术有限公司 A kind of method and device of the control application operating based on augmented reality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160224123A1 (en) * 2015-02-02 2016-08-04 Augumenta Ltd Method and system to control electronic devices through gestures


Similar Documents

Publication Publication Date Title
US20220084279A1 (en) Methods for manipulating objects in an environment
US10761612B2 (en) Gesture recognition techniques
EP2919104B1 (en) Information processing device, information processing method, and computer-readable recording medium
US20150035746A1 (en) User Interface Device
CN115443445A (en) Hand gesture input for wearable systems
JP6631541B2 (en) Method and system for touch input
CN105612478A (en) User interface programmatic scaling
CN110968187B (en) Remote touch detection enabled by a peripheral device
EP2814000A1 (en) Image processing apparatus, image processing method, and program
WO2012082971A1 (en) Systems and methods for a gaze and gesture interface
EP4127879A1 (en) Method and device for adjusting the control-display gain of a gesture controlled electronic device
KR20140100547A (en) Full 3d interaction on mobile devices
US20180032139A1 (en) Interactive system control apparatus and method
US10359906B2 (en) Haptic interface for population of a three-dimensional virtual environment
JP2014211858A (en) System, method and program for providing user interface based on gesture
Matulic et al. Phonetroller: Visual representations of fingers for precise touch input with mobile phones in vr
CN113138670B (en) Touch screen interaction gesture control method and device, touch screen and storage medium
CN110941337A (en) Control method of avatar, terminal device and computer readable storage medium
Zhang et al. A novel human-3DTV interaction system based on free hand gestures and a touch-based virtual interface
CN110727345B (en) Method and system for realizing man-machine interaction through finger intersection movement
US20230316634A1 (en) Methods for displaying and repositioning objects in an environment
CN116360589A (en) Method and medium for inputting information by virtual keyboard and electronic equipment
Siam et al. Human computer interaction using marker based hand gesture recognition
CN114116106A (en) Chart display method and device, electronic equipment and storage medium
CN112068699A (en) Interaction method, interaction device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant