CN110727345A - Method and system for realizing man-machine interaction through finger intersection point movement - Google Patents
Method and system for realizing man-machine interaction through finger intersection point movement
- Publication number
- CN110727345A (application CN201910886587.3A)
- Authority
- CN
- China
- Prior art keywords
- screen
- initial
- target
- critical path
- gesture image
- Prior art date
- 2019-09-19
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
The embodiment of the invention provides a method and a system for realizing human-computer interaction through finger intersection point movement, wherein the method comprises the following steps: acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise a preset gesture of a user; performing feature point analysis on the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial gesture image to the target gesture image; and acquiring the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen. According to the method provided by the embodiment of the invention, interaction with AR glasses can be realized through gesture operation: only the crossed movement of the thumb and index finger is required, without large gesture swings or shape changes of all five fingers.
Description
Technical Field
The invention relates to the technical field of computers, in particular to a method and a system for realizing human-computer interaction through finger intersection point movement.
Background
In the interaction technology of AR/VR glasses, no traditional physical touchpad is available. If the AR/VR equipment is to be used conveniently, quickly, and for long periods without adding extra interaction hardware, head control and gesture operation are the most direct and natural interaction modes.
However, traditional head control and gesture operation involve a large range of motion and are therefore unsuitable for long sessions. In public places, a user relying on head control or large gestures intrudes on other people's space and can affect public safety; in addition, the operation looks odd and easily makes bystanders uncomfortable.
Disclosure of Invention
In view of the foregoing problems, embodiments of the present invention provide a method and system for implementing human-computer interaction through finger intersection movement.
In a first aspect, an embodiment of the present invention provides a method for implementing human-computer interaction through finger intersection movement, including:
acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
analyzing feature points of the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial gesture image to the target gesture image;
and acquiring the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen.
Preferably, the preset gesture is that any two fingers of the user intersect.
Preferably, the obtaining a change rate of the preset gesture from the initial gesture image to the target gesture image specifically includes:
performing feature point analysis on the initial gesture image to obtain initial critical path lengths of any two fingers, and performing feature point analysis on the target gesture image to obtain target critical path lengths of any two fingers;
acquiring a first change rate according to a first initial critical path length in the initial critical path lengths, a second initial critical path length in the initial critical path lengths, a first target critical path length in the target critical path lengths and a second target critical path length in the target critical path lengths;
and acquiring a second change rate according to a third initial critical path length in the initial critical path lengths, a fourth initial critical path length in the initial critical path lengths, a third target critical path length in the target critical path lengths and a fourth target critical path length in the target critical path lengths.
Preferably, the first change rate is specifically obtained by:
Δrx=a1p1/a1b1-a2p2/a2b2,
wherein Δrx represents the first change rate, a1p1 represents the first initial critical path length, a1b1 represents the second initial critical path length, a2p2 represents the first target critical path length, and a2b2 represents the second target critical path length.
Preferably, the second change rate is specifically obtained by:
Δry=m1p1/m1n1-m2p2/m2n2,
wherein Δry represents the second change rate, m1p1 represents the third initial critical path length, m1n1 represents the fourth initial critical path length, m2p2 represents the third target critical path length, and m2n2 represents the fourth target critical path length.
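Expressed as code, each change rate reduces to two divisions and a subtraction. The following is a minimal Python sketch of the two formulas above, not the patented implementation; the eight arguments are the critical path lengths named in this section.

```python
# Minimal sketch of the two change-rate formulas above; the eight arguments
# are the initial and target critical path lengths defined in the text.
def change_rates(a1p1, a1b1, m1p1, m1n1, a2p2, a2b2, m2p2, m2n2):
    drx = a1p1 / a1b1 - a2p2 / a2b2  # first change rate (X direction)
    dry = m1p1 / m1n1 - m2p2 / m2n2  # second change rate (Y direction)
    return drx, dry
```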
Preferably, the obtaining a moving distance of a cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen specifically includes:
acquiring the moving distance of a cursor on the screen in the X-axis direction according to the resolution of the screen and the first change rate;
and acquiring the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second change rate.
Preferably, the obtaining a moving distance of a cursor on the screen in the X-axis direction according to the resolution of the screen and the first change rate specifically includes:
Δx=Rx*Δrx,
where Δx represents the moving distance of the cursor on the screen in the X-axis direction, Rx represents the lateral resolution of the screen, and Δrx represents the first change rate;
the obtaining of the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second rate of change specifically includes:
Δy=Ry*Δry,
where Δy represents the moving distance of the cursor on the screen in the Y-axis direction, Ry represents the longitudinal resolution of the screen, and Δry represents the second change rate.
In a second aspect, an embodiment of the present invention provides a system for implementing human-computer interaction through finger intersection movement, including:
the device comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring an initial gesture image and a target gesture image, and the initial gesture image and the target gesture image both comprise preset gestures of a user;
the change rate module is used for analyzing the characteristic points of the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial gesture image to the target gesture image;
and the moving module is used for acquiring the moving distance of the cursor on the screen according to the resolution of the screen and the change rate and controlling the cursor to move on the screen.
In a third aspect, an embodiment of the present invention provides an interaction device, where the interaction device is an AR device or a VR device including the system for implementing human-computer interaction through finger intersection movement provided in the second aspect.
In a fourth aspect, an embodiment of the present invention provides an electronic device, including:
at least one processor, at least one memory, a communication interface, and a bus; wherein,
the processor, the memory and the communication interface complete mutual communication through the bus;
the communication interface is used for information transmission between the electronic device and the communication device of a display apparatus;
the memory stores program instructions executable by the processor, and the processor calls the program instructions to execute the method for realizing human-computer interaction through finger cross point movement provided by the first aspect.
According to the method for realizing human-computer interaction through finger intersection point movement provided by the embodiment of the invention, interaction with AR glasses can be realized through gesture operation: only the crossed movement of a thumb and an index finger is required, without large gesture swings or shape changes of all five fingers. The movement range of the arm and fingers is small, so the method is suitable for long-time operation, and high-precision interactive control can be realized through a precise algorithm.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention or in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are obviously only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for implementing human-computer interaction through finger intersection movement according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a gesture crossing a thumb and a forefinger as a preset gesture according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating an initial gesture image according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a target gesture image according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating on-screen cursor movement in an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a system for implementing human-computer interaction through finger intersection movement according to an embodiment of the present invention;
fig. 7 is a schematic physical structure diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart of a method for implementing human-computer interaction through finger intersection movement according to an embodiment of the present invention, as shown in fig. 1, the method includes:
s1, acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
s2, analyzing the characteristic points of the initial gesture image and the target gesture image, and acquiring the change rate of the preset gesture from the initial gesture image to the target gesture image;
and S3, acquiring the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen.
Specifically, in the embodiment of the present invention, the execution subject is AR glasses or VR glasses; AR glasses are taken as the example in the following description. The preset gesture in the embodiment of the present invention is a predefined gesture in which any two fingers intersect. Fig. 2 is a schematic diagram in which a gesture with the thumb and index finger crossed is taken as the preset gesture in the embodiment of the present invention.
When the AR glasses are used for human-computer interaction, the user wears the AR glasses, and a miniature camera mounted on one of the frames captures images of the user's moving gesture; the miniature camera can be an RGB camera, an infrared camera, or a depth camera. In the embodiment of the invention, the user's gesture moves from an initial position to a target position: the miniature camera captures an initial gesture image at the initial position and a target gesture image at the target position, and feature point analysis of the two captured images yields the change rate of the user's gesture from the initial position to the target position. The change rate can be the change of the user's gesture in the X-axis and Y-axis directions, or a change of angle and length; from this change rate, combined with the resolution of the screen, the moving distance of the cursor can be obtained, so that the cursor can be controlled to move on the screen.
According to the method for realizing human-computer interaction through finger intersection point movement provided by the embodiment of the invention, interaction with AR glasses can be realized through gesture operation: only the crossed movement of the thumb and index finger is required, without large gesture swings or shape changes of all five fingers. The movement range of the arm and fingers is small, so the method is suitable for long-time operation, and high-precision interactive control can be realized through a precise algorithm.
On the basis of the foregoing embodiment, preferably, the acquiring a change rate of the preset gesture from the initial gesture image to the target gesture image specifically includes:
performing feature point analysis on the initial gesture image to obtain initial critical path lengths of any two fingers, and performing feature point analysis on the target gesture image to obtain target critical path lengths of any two fingers;
acquiring the first change rate according to a first initial critical path length in the initial critical path lengths, a second initial critical path length in the initial critical path lengths, a first target critical path length in the target critical path lengths and a second target critical path length in the target critical path lengths;
and acquiring the second change rate according to a third initial critical path length in the initial critical path lengths, a fourth initial critical path length in the initial critical path lengths, a third target critical path length in the target critical path lengths and a fourth target critical path length in the target critical path lengths.
In the embodiment of the invention, the change rate consists of a first change rate of the user gesture in the X-axis direction and a second change rate in the Y-axis direction.
The AR glasses usually project the picture or video to be displayed onto a screen. When the gesture of the user moves from the initial position to the target position, the distance that the cursor on the screen needs to move must be determined correspondingly: according to the resolution of the display screen of the AR glasses and the first change rate, the moving distance of the cursor in the X-axis direction can be obtained, and according to the resolution and the second change rate, the moving distance of the cursor in the Y-axis direction can be obtained. Moving the cursor by the corresponding distance from its initial position on the screen controls the cursor to move correspondingly.
Fig. 3 is a schematic diagram of an initial gesture image in an embodiment of the present invention. As shown in Fig. 3, feature point analysis of the initial gesture image yields the lengths of two critical paths and the position of the intersection point: the two critical paths are a1b1 and m1n1, and the intersection point is p1 in the figure. From the two critical path lengths and the position of the intersection point, the first initial critical path length a1p1, the second initial critical path length a1b1, the third initial critical path length m1p1, and the fourth initial critical path length m1n1 are obtained.
Similarly, Fig. 4 is a schematic diagram of a target gesture image in the embodiment of the present invention. As shown in Fig. 4, feature point analysis of the target gesture image yields the lengths of two critical paths and the position of the intersection point: the two critical paths are a2b2 and m2n2, and the intersection point is p2 in the figure. From the two critical path lengths and the position of the intersection point, the first target critical path length a2p2, the second target critical path length a2b2, the third target critical path length m2p2, and the fourth target critical path length m2n2 can be obtained.
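As an illustration of how the eight lengths can be computed once feature point analysis has located the fingertip endpoints and the intersection point, the sketch below derives them as Euclidean distances. The point-coordinate interface and the numeric values are assumptions for illustration; the patent does not specify the feature point analysis itself.

```python
# Sketch: derive the four critical path lengths of one gesture image from
# feature point coordinates. a and b are the endpoints of one finger's
# critical path, m and n of the other, and p is the intersection point.
from math import dist  # Euclidean distance, Python 3.8+

def critical_paths(a, b, m, n, p):
    """Return (ap, ab, mp, mn) for one gesture image."""
    return dist(a, p), dist(a, b), dist(m, p), dist(m, n)

# Initial image (Fig. 3) with invented coordinates: yields a1p1, a1b1, m1p1, m1n1.
a1p1, a1b1, m1p1, m1n1 = critical_paths(a=(0, 0), b=(10, 0),
                                        m=(4, -3), n=(4, 7), p=(4, 0))
print(a1p1, a1b1, m1p1, m1n1)  # 4.0 10.0 3.0 10.0
```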
Specifically, the obtaining of the moving distance of the cursor on the screen in the X-axis direction according to the resolution of the screen and the first change rate specifically includes:
Δx=Rx*Δrx,
where Δx represents the moving distance of the cursor on the screen in the X-axis direction, Rx represents the lateral resolution of the screen, and Δrx represents the first change rate.
Specifically, the obtaining of the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the display screen and the second rate of change specifically includes:
Δy=Ry*Δry,
where Δy represents the moving distance of the cursor on the screen in the Y-axis direction, Ry represents the longitudinal resolution of the screen, and Δry represents the second change rate.
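In code, the mapping from change rates to pixel distances is one multiplication per axis. This is a sketch of the two formulas above; the default resolution matches the 1920 × 1080 example used below.

```python
# Sketch of the rate-to-pixel mapping Δx = Rx*Δrx, Δy = Ry*Δry.
def cursor_delta(drx, dry, rx=1920, ry=1080):
    """Map the gesture change rates to a cursor move in pixels."""
    return rx * drx, ry * dry
```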
Fig. 5 is a schematic diagram of cursor movement on the screen in an embodiment of the present invention. As shown in Fig. 5, it is assumed that the resolution of the AR glasses screen in the embodiment of the present invention is 1920 × 1080 pixels: 1920 pixels horizontally and 1080 pixels vertically.
The process of the preset gesture moving from the initial gesture image to the target gesture image corresponds to controlling the cursor on the screen to move from S0 to S1. Let S denote the coordinate of the cursor on the screen, S = (x, y), with 0 ≤ x ≤ 1920 and 0 ≤ y ≤ 1080. S0(x0, y0) is the initial position of the cursor and may be any point; it corresponds to the initial position of the user's preset gesture. S1(x1, y1) is the point at which the cursor is located when the user's gesture is at the target position.
Define ap/ab as the scaling factor rx of point p with respect to ab, and mp/mn as the scaling factor ry of point p with respect to mn. The change from the initial position to the target position can then be described by the difference of the two scaling factors:
Δrx=a1p1/a1b1-a2p2/a2b2;
Δry=m1p1/m1n1-m2p2/m2n2;
Then, the moving amount (Δx, Δy) of the cursor on the screen and the change rate (Δrx, Δry) of the preset gesture are placed in one-to-one correspondence by the algorithm, and the relationship between the two is as follows:
Δx=1920*Δrx;
Δy=1080*Δry;
Then, as the cursor moves on the screen from S0(x0, y0) to S1(x1, y1), the coordinate correspondence is as follows:
x1=x0+Δx;
y1=y0+Δy;
S1=(x0+Δx,y0+Δy);
the amount of change defining the S point on the screen can be represented by the amount of transformation of coordinates x and y in two points, the amount of change of x being Δ x, where the amount of change of y is Δ y.
Thus, S1=(x0+1920*(a1p1/a1b1-a2p2/a2b2),y0+1080*(m1p1/m1n1-m2p2/m2n2))。
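A short numeric walk-through ties the steps together. The critical path lengths below are invented values chosen to keep the arithmetic easy to follow; only the formulas come from the text.

```python
# Worked example of S1 = (x0 + 1920*Δrx, y0 + 1080*Δry) with invented
# critical path lengths given as (ap, ab, mp, mn) per image.
def next_cursor_position(s0, init, targ, rx=1920, ry=1080):
    drx = init[0] / init[1] - targ[0] / targ[1]  # Δrx
    dry = init[2] / init[3] - targ[2] / targ[3]  # Δry
    return s0[0] + rx * drx, s0[1] + ry * dry    # S1

s1 = next_cursor_position(s0=(960, 540),
                          init=(4.0, 10.0, 3.0, 10.0),   # a1p1, a1b1, m1p1, m1n1
                          targ=(5.0, 10.0, 4.5, 10.0))   # a2p2, a2b2, m2p2, m2n2
print(s1)  # (768.0, 378.0): Δrx = -0.1 so Δx = -192; Δry = -0.15 so Δy = -162
```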
In addition, in the embodiment of the present invention, the movement of the cursor is controlled by the movement of the user's gesture, and when the cursor moves to the corresponding position, operations can be performed in other ways. For example, clicking: pausing the cursor at a position for a certain time produces a click operation, or extending an idle finger produces a click and retracting the extended finger ends the click. Dragging: after a finger is extended, moving the finger intersection point drags the whole image or a target. These operations can be combined according to actual demand.
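As one possible realization of the dwell-to-click behavior just described, the sketch below fires a click when the cursor stays within a small radius for a set time. The radius and dwell thresholds are assumptions, not values from the patent.

```python
# Sketch of dwell-to-click: if the cursor stays within radius_px of the point
# where it stopped for dwell_s seconds, report a click. Thresholds are assumed.
import time
from math import dist

class DwellClicker:
    def __init__(self, radius_px=10.0, dwell_s=1.0):
        self.radius_px = radius_px
        self.dwell_s = dwell_s
        self.anchor = None  # position where the dwell timer started
        self.since = 0.0

    def update(self, pos):
        """Feed the latest cursor position; return True when a click fires."""
        now = time.monotonic()
        if self.anchor is None or dist(pos, self.anchor) > self.radius_px:
            self.anchor, self.since = pos, now  # cursor moved: restart dwell
            return False
        if now - self.since >= self.dwell_s:
            self.anchor = None                  # re-arm after firing once
            return True
        return False
```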
Fig. 6 is a schematic structural diagram of a system for implementing human-computer interaction through finger intersection movement according to an embodiment of the present invention. As shown in Fig. 6, the system includes an obtaining module 601, a change rate module 602, and a moving module 603, where:
the acquiring module 601 is configured to acquire an initial gesture image and a target gesture image, where both the initial gesture image and the target gesture image include a preset gesture of a user;
the change rate module 602 is configured to perform feature point analysis on the initial gesture image and the target gesture image, and obtain a change rate of the preset gesture from the initial gesture image to the target gesture image;
the moving module 603 is configured to obtain a moving distance of a cursor on the screen according to the resolution of the screen and the change rate, and control the cursor to move on the screen.
The obtaining module 601 obtains an initial gesture image and a target gesture image of the user; the change rate module 602 performs feature point analysis on the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial position to the target position; and the moving module 603 obtains the moving distance of the cursor according to the change rate and the screen resolution obtained in the previous step, so as to control the cursor to move on the screen.
The specific implementation process of the embodiment of the system is the same as that of the embodiment of the method described above, and please refer to the embodiment of the method for details, which is not described herein again.
The embodiment of the invention also provides an interaction device, which can be an AR device or a VR device, where both the AR device and the VR device comprise the above system for realizing human-computer interaction through finger intersection point movement.
Fig. 7 is a schematic entity structure diagram of an electronic device according to an embodiment of the present invention. As shown in Fig. 7, the electronic device may include: a processor 710, a communication interface 720, a memory 730, and a bus 740, where the processor 710, the communication interface 720, and the memory 730 communicate with each other via the bus 740. The processor 710 may call logic instructions in the memory 730 to perform the following method:
acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
analyzing feature points of the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial gesture image to the target gesture image;
and acquiring the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen.
In addition, the logic instructions in the memory 730 can be implemented in the form of software functional units and, when sold or used as an independent product, can be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (10)
1. A method for realizing human-computer interaction through finger intersection point movement is characterized by comprising the following steps:
acquiring an initial gesture image and a target gesture image, wherein the initial gesture image and the target gesture image both comprise preset gestures of a user;
analyzing feature points of the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial gesture image to the target gesture image;
and acquiring the moving distance of the cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen.
2. The method of claim 1, wherein the preset gesture is any two fingers of the user crossing.
3. The method according to claim 2, wherein the obtaining of the change rate of the preset gesture from the initial gesture image to the target gesture image specifically includes:
performing feature point analysis on the initial gesture image to obtain initial critical path lengths of any two fingers, and performing feature point analysis on the target gesture image to obtain target critical path lengths of any two fingers;
acquiring a first change rate according to a first initial critical path length in the initial critical path lengths, a second initial critical path length in the initial critical path lengths, a first target critical path length in the target critical path lengths and a second target critical path length in the target critical path lengths;
and acquiring a second change rate according to a third initial critical path length in the initial critical path lengths, a fourth initial critical path length in the initial critical path lengths, a third target critical path length in the target critical path lengths and a fourth target critical path length in the target critical path lengths.
4. The method according to claim 3, characterized in that said first rate of change is obtained in particular by:
Δrx=a1p1/a1b1-a2p2/a2b2,
wherein Δrx represents the first rate of change, a1p1 represents the first initial critical path length, a1b1 represents the second initial critical path length, a2p2 represents the first target critical path length, and a2b2 represents the second target critical path length.
5. The method according to claim 3, wherein the second rate of change is obtained by:
Δry=m1p1/m1n1-m2p2/m2n2,
wherein Δry represents the second rate of change, m1p1 represents the third initial critical path length, m1n1 represents the fourth initial critical path length, m2p2 represents the third target critical path length, and m2n2 represents the fourth target critical path length.
6. The method according to claim 5, wherein the obtaining a moving distance of a cursor on the screen according to the resolution of the screen and the change rate, and controlling the cursor to move on the screen specifically includes:
acquiring the moving distance of a cursor on the screen in the X-axis direction according to the resolution of the screen and the first change rate;
and acquiring the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second change rate.
7. The method according to claim 6, wherein the obtaining a moving distance of a cursor on the screen in an X-axis direction according to the resolution of the screen and the first rate of change specifically comprises:
Δx=Rx*Δrx,
where Δx represents the moving distance of the cursor on the screen in the X-axis direction, Rx represents the lateral resolution of the screen, and Δrx represents the first rate of change;
the obtaining of the moving distance of the cursor on the screen in the Y-axis direction according to the resolution of the screen and the second rate of change specifically includes:
Δy=Ry*Δry,
where Δy represents the moving distance of the cursor on the screen in the Y-axis direction, Ry represents the longitudinal resolution of the screen, and Δry represents the second rate of change.
8. A system for realizing human-computer interaction through finger intersection point movement is characterized by comprising:
the device comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring an initial gesture image and a target gesture image, and the initial gesture image and the target gesture image both comprise preset gestures of a user;
the change rate module is used for analyzing the characteristic points of the initial gesture image and the target gesture image to obtain the change rate of the preset gesture from the initial gesture image to the target gesture image;
and the moving module is used for acquiring the moving distance of the cursor on the screen according to the resolution of the screen and the change rate and controlling the cursor to move on the screen.
9. An interaction device, characterized in that the interaction device is an AR device or a VR device comprising a system for human-computer interaction by finger cross point movement according to claim 8.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program performs the steps of the method for human-computer interaction by finger cross point movement as claimed in any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910886587.3A (CN110727345B) | 2019-09-19 | 2019-09-19 | Method and system for realizing man-machine interaction through finger intersection movement
Publications (2)
Publication Number | Publication Date |
---|---|
CN110727345A | 2020-01-24
CN110727345B | 2023-12-26
Family
ID=69219225
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910886587.3A (CN110727345B, Active) | Method and system for realizing man-machine interaction through finger intersection movement | 2019-09-19 | 2019-09-19
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110727345B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104898972A (en) * | 2015-05-19 | 2015-09-09 | 青岛海信移动通信技术股份有限公司 | Method and equipment for regulating electronic image |
US20160224123A1 (en) * | 2015-02-02 | 2016-08-04 | Augumenta Ltd | Method and system to control electronic devices through gestures |
CN107077169A (en) * | 2014-11-14 | 2017-08-18 | 高通股份有限公司 | Spatial interaction in augmented reality |
CN109828660A (en) * | 2018-12-29 | 2019-05-31 | 深圳云天励飞技术有限公司 | A kind of method and device of the control application operating based on augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |