CN116755565A - Method, system and storage medium for realizing virtual interaction by transmitting signal by finger - Google Patents


Info

Publication number
CN116755565A
CN116755565A
Authority
CN
China
Prior art keywords
signal
interaction
finger
vibration
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311052692.XA
Other languages
Chinese (zh)
Inventor
齐本铁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Weisaike Network Technology Co ltd
Original Assignee
Nanjing Weisaike Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Weisaike Network Technology Co ltd filed Critical Nanjing Weisaike Network Technology Co ltd
Priority to CN202311052692.XA priority Critical patent/CN116755565A/en
Publication of CN116755565A publication Critical patent/CN116755565A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves

Abstract

The invention discloses a method, a system and a storage medium for realizing virtual interaction by transmitting signals with a finger, and belongs to the technical field of gesture control. The method comprises: acquiring a first signal of a first sensor, the first signal being a vibration signal generated when a first finger wearing the first sensor is touched; acquiring a second signal of a second sensor, the second signal being a vibration signal generated when a second finger wearing the second sensor is touched; detecting whether the touch type is a single click, a double click or friction, and, according to the touch type, outputting the first signal as signal A1, A2 or A3 and the second signal as signal B1, B2 or B3; calculating an interaction signal from the received first signal and/or second signal; and sending the interaction signal to a client, so that the client executes an interaction instruction according to the received interaction signal. Because the interaction signal is calculated from the signals generated by finger vibration, no visual algorithm is needed and the accuracy is higher.

Description

Method, system and storage medium for realizing virtual interaction by transmitting signal by finger
Technical Field
The invention relates to the technical field of gesture control, and in particular to a method, a system and a storage medium for realizing virtual interaction by transmitting signals with a finger.
Background
With the development of VR/AR technology, the interactive scenes it is used in have gradually increased. To improve realism, VR/AR systems generally use gesture control to send interaction instructions. This approach breaks away from the traditional keyboard and mouse, improves flexibility of use, and strengthens the user's sense of immersion. Gesture control is a technology that controls interaction and operation in a virtual environment by recognizing and interpreting the user's hand movements. It can provide a more intuitive, immersive experience, enabling users to interact with virtual objects naturally and smoothly.
Currently, conventional gesture control mostly uses a camera to detect and recognize user gestures, such as making a fist, nodding, or waving. These gestures may be mapped to specific interactive functions, such as selection, movement, and zooming in or out. However, this approach relies mainly on a visual algorithm, which is easily disturbed and produces calculation deviations, for example when a gesture is blocked, or under the influence of ambient light and other external factors, so control errors occur and the user experience suffers.
Disclosure of Invention
The invention aims to solve the problems of errors and low precision in vision-based gesture calculation, and provides a method, a system and a storage medium for realizing virtual interaction by transmitting signals with a finger.
In a first aspect, the present invention achieves the above object by the following technical solution: a method for realizing virtual interaction by transmitting signals with a finger, applied to a wearable terminal, where the wearable terminal includes a first sensor and a second sensor, the method comprising the following steps:
acquiring a first signal of the first sensor, wherein the first signal is a vibration signal generated when a first finger wearing the first sensor is touched;
acquiring a second signal of the second sensor, wherein the second signal is a vibration signal generated when a second finger wearing the second sensor is touched;
detecting whether the touch type is a single click, a double click or friction, and, according to the touch type, outputting the first signal as signal A1, signal A2 or signal A3 and the second signal as signal B1, signal B2 or signal B3;
calculating an interaction signal according to the received first signal and/or second signal;
and sending the interaction signal to a client for the client to execute an interaction instruction according to the received interaction signal.
Preferably, the method for calculating the interaction signal according to the received first signal and/or second signal comprises the following steps:
if only one of signal A1, signal A2, signal A3, signal B1, signal B2 and signal B3 is received, the calculated interaction signal is, correspondingly, interaction signal 1, interaction signal 2, interaction signal 3, interaction signal 4, interaction signal 5 or interaction signal 6;
if signal A1 and then signal B1 are received within time T1, the calculated interaction signal is interaction signal 7;
if signal B1 and then signal A1 are received within time T1, the calculated interaction signal is interaction signal 8.
Preferably, the method for detecting the touch type includes:
setting amplitude thresholds M and N, where M is larger than N, and setting detection times T2 and T3, where T2 is larger than T3;
if the amplitude X of the vibration signal is detected to exceed M once within time T2, outputting a single click; if it is detected to exceed M twice, outputting a double click;
if a vibration signal whose amplitude Y satisfies N < Y < M is detected within time T3, outputting friction.
Preferably, the client executes the interaction instruction according to the received interaction signal as follows: a comparison table of interaction signals and interaction instructions is preset in the client; after receiving an interaction signal, the client looks it up in the comparison table and executes the corresponding interaction instruction.
In a second aspect, the present invention achieves the above object by a method for realizing virtual interaction by transmitting signals with a finger, applied to a client, the method comprising the following steps:
receiving an interaction signal sent by a wearable terminal, wherein the interaction signal is calculated from a first signal and/or a second signal; the first signal is a vibration signal generated when a first finger wearing a first sensor of the wearable terminal is touched, and the second signal is a vibration signal generated when a second finger wearing a second sensor is touched; according to whether the wearable terminal detects the touch type as a single click, a double click or friction, the first signal is output as signal A1, signal A2 or signal A3, and the second signal is output as signal B1, signal B2 or signal B3;
and executing an interaction instruction according to the interaction signal.
Preferably, the interaction signal is calculated from the first signal and/or the second signal as follows:
if the wearable terminal receives only one of signal A1, A2, A3, B1, B2 and B3, the calculated interaction signal is, correspondingly, interaction signal 1, 2, 3, 4, 5 or 6;
if the wearable terminal receives signal A1 and then signal B1 within time T1, the calculated interaction signal is interaction signal 7;
if the wearable terminal receives signal B1 and then signal A1 within time T1, the calculated interaction signal is interaction signal 8.
Preferably, the method by which the wearable terminal detects the touch type includes:
setting amplitude thresholds M and N, where M is larger than N, and setting detection times T2 and T3, where T2 is larger than T3;
if the amplitude X of the vibration signal is detected to exceed M once within time T2, outputting a single click; if it is detected to exceed M twice, outputting a double click;
if a vibration signal whose amplitude Y satisfies N < Y < M is detected within time T3, outputting friction.
In a third aspect, the present invention achieves the above object by a system for achieving virtual interaction by transmitting a signal with a finger, the system comprising:
the wearable terminal, provided with a first sensor for acquiring a first signal, wherein the first signal is a vibration signal generated when a first finger wearing the first sensor is touched;
a second sensor for acquiring a second signal, wherein the second signal is a vibration signal generated when a second finger wearing the second sensor is touched;
a touch detection unit for detecting whether the touch type is a single click, a double click or friction, wherein, according to the touch type, the first signal is output as signal A1, signal A2 or signal A3, and the second signal is output as signal B1, signal B2 or signal B3;
a signal calculation unit for calculating an interaction signal from the received first signal and/or second signal and sending the interaction signal to the client; and
the client, used for running the virtual scene, in which a comparison table of interaction signals and interaction instructions is preset; the client executes the corresponding interaction instruction according to the received interaction signal.
Preferably, the calculation method of the signal calculation unit is as follows:
if only one of signal A1, signal A2, signal A3, signal B1, signal B2 and signal B3 is received, the calculated interaction signal is, correspondingly, interaction signal 1, interaction signal 2, interaction signal 3, interaction signal 4, interaction signal 5 or interaction signal 6;
if signal A1 and then signal B1 are received within time T1, the calculated interaction signal is interaction signal 7;
if signal B1 and then signal A1 are received within time T1, the calculated interaction signal is interaction signal 8;
the detection method of the touch detection unit is as follows:
setting amplitude thresholds M and N, where M is larger than N, and setting detection times T2 and T3, where T2 is larger than T3;
if the amplitude X of the vibration signal is detected to exceed M once within time T2, outputting a single click; if it is detected to exceed M twice, outputting a double click;
if a vibration signal whose amplitude Y satisfies N < Y < M is detected within time T3, outputting friction.
In a fourth aspect, the present invention achieves the above object by a storage medium having stored thereon a computer program which, when executed by a processor, implements the method for realizing virtual interaction by transmitting signals with a finger according to the first or second aspect.
Compared with the prior art, the invention has the following beneficial effects: the sensors acquire the signals generated by finger vibration to calculate interaction signals, and the client executes interaction instructions according to those interaction signals to realize instruction operations in the virtual scene. Because the method needs no visual algorithm for gesture recognition, the error is small, the precision is higher and the data analysis difficulty is low; moreover, it fits natural human finger gestures and is easy to learn and use.
Drawings
FIG. 1 is a flow chart of a method for implementing virtual interactions using finger-transferred signals in accordance with the present invention.
FIG. 2 is a diagram illustrating vibration waveforms generated by different touch types according to the present invention.
FIG. 3 is a schematic diagram of the interaction signal calculated from the sensor signal according to the present invention.
FIG. 4 is a schematic diagram of the comparison table of interaction signals and interaction instructions of the present invention.
FIG. 5 is a flowchart of a method for detecting touch categories according to the present invention.
FIG. 6 is a schematic diagram of a system for implementing virtual interactions using finger-transferred signals in accordance with the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Example 1
A method for realizing virtual interaction by transmitting signals with a finger. When the method runs, the hardware comprises a wearable terminal and a client that runs the virtual scene. The wearable terminal is worn on a limb and can be arranged on the palm or the arm; it comprises a first sensor and a second sensor, worn on a first finger and a second finger respectively. The sensors collect vibration signals, and interaction instructions are recognized through the vibration response of the fingers.
As shown in FIG. 1, when applied to the wearable terminal, the method comprises the following steps:
step S11, a first signal of the first sensor is obtained, the first signal is a vibration signal generated when a first finger wearing the first sensor is touched, the signal waveform collected by the first sensor approaches to a straight line in a state that the first finger is not touched, vibration is collected by the first sensor when the first finger is touched, vibration waves occur, the signal waveform breaks the state of approaching to the straight line, and therefore the vibration waves are defined as vibration signals, and the first sensor is used for collecting the vibration signals.
Step S12: acquire a second signal of the second sensor, the second signal being a vibration signal generated when the second finger wearing the second sensor is touched. Step S12 and step S11 are parallel steps: because the first sensor and the second sensor are worn independently on the first finger and the second finger, it does not matter which finger is touched first. The second sensor works on the same principle as the first sensor, except that it collects the vibration waves occurring on the second finger.
It should be noted that the touches described in steps S11 and S12 mostly refer to touches of the first finger and the second finger by another finger. For example, if the first finger and the second finger are the index finger and the middle finger of the same hand, both can easily be touched by the thumb, so the touch action is performed by the thumb.
Step S13: detect whether the touch type is a single click, a double click or friction; according to the touch type, the first signal is output as signal A1, signal A2 or signal A3, and the second signal as signal B1, signal B2 or signal B3. Touch actions are divided into three types (single click, double click and friction), so the first sensor can output three kinds of signals according to the three touch situations, and the second sensor likewise. The signals output by the two sensors are used to execute step S14. Following steps S11 and S12, the touch type is classified by its effect on the waveform, as shown in FIG. 2: on the near-straight-line waveform, the X axis is time and the Y axis is the vibration amplitude. A single vibration wave indicates that the sensor detected a single click; two vibration waves occurring very close together are defined as a double click; and a dense succession of vibration waves of small amplitude is defined as friction.
Step S14: calculate an interaction signal from the received first signal and/or second signal. The calculation method is as follows:
if only one of signal A1, signal A2, signal A3, signal B1, signal B2 and signal B3 is received, the calculated interaction signal is, correspondingly, interaction signal 1, 2, 3, 4, 5 or 6; these interaction signals are all obtained from a single signal;
if signal A1 and then signal B1 are received within time T1, the calculated interaction signal is interaction signal 7;
if signal B1 and then signal A1 are received within time T1, the calculated interaction signal is interaction signal 8. Interaction signal 7 and interaction signal 8 are both calculated from two signals; the time T1 provides the user with reaction time to touch the second sensor, and is generally set between 300 ms and 700 ms.
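The combination logic above can be sketched in code. This is a minimal illustration under stated assumptions, not the patent's implementation: the function name `calc_interaction_signal`, the string signal labels, and the concrete 0.5 s value for T1 are all assumptions.

```python
from typing import Optional

# Signals produced in step S13; only one signal, or an ordered A1/B1
# pair inside the T1 window, maps to an interaction signal (step S14).
SINGLE_SIGNAL_MAP = {
    "A1": 1, "A2": 2, "A3": 3,
    "B1": 4, "B2": 5, "B3": 6,
}

T1 = 0.5  # seconds; the description suggests 300-700 ms


def calc_interaction_signal(first: str, first_time: float,
                            second: Optional[str] = None,
                            second_time: Optional[float] = None) -> int:
    """Map one or two sensor signals to an interaction signal (1-8)."""
    if second is None:
        return SINGLE_SIGNAL_MAP[first]
    # Two signals: the order of arrival within T1 decides the result.
    if second_time is not None and abs(second_time - first_time) <= T1:
        if first == "A1" and second == "B1":
            return 7  # A1 then B1 -> interaction signal 7
        if first == "B1" and second == "A1":
            return 8  # B1 then A1 -> interaction signal 8
    # Outside the window (or any other pair): fall back to the first signal.
    return SINGLE_SIGNAL_MAP[first]
```

For example, `calc_interaction_signal("A1", 0.0, "B1", 0.3)` yields interaction signal 7, while the same pair separated by more than T1 is treated as signal A1 alone.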
As shown in FIG. 3, eight interaction signals are calculated from the sensor signals: signal A1 corresponds to interaction signal 1, signal A2 to interaction signal 2, signal A3 to interaction signal 3, signal B1 to interaction signal 4, signal B2 to interaction signal 5, and signal B3 to interaction signal 6. These are the outputs when only one signal is received. When two signals arrive within time T1, different interaction signals are output according to the order of arrival: if signal A1 is received first and then signal B1, interaction signal 7 is output; if signal B1 is received first and then signal A1, interaction signal 8 is output. The client executes interaction instructions according to the received interaction signal as follows: a comparison table of interaction signals and interaction instructions is preset in the client; after receiving an interaction signal, the client looks it up in the table and executes the corresponding instruction. Taking the eight interaction signals in FIG. 3 as an example, the comparison table is shown in FIG. 4: interaction signal 1 corresponds to selecting an area, interaction signal 2 to opening/running a target (which may be a file or a program), interaction signal 3 to enlarging the selected area, interaction signal 4 to opening the target's menu bar, interaction signal 5 to deleting the target, interaction signal 6 to shrinking the selected area, interaction signal 7 to sliding down, and interaction signal 8 to sliding up. With the wearable terminal we can thus break away from the mouse: a view-angle camera replaces the mouse cursor, and finger touches replace the left and right mouse buttons, enhancing the user's interaction experience in the virtual scene.
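The comparison-table lookup described for FIG. 4 amounts to a dictionary lookup on the client. The sketch below is illustrative only: the instruction strings paraphrase the description, and the names `INSTRUCTION_TABLE` and `execute_interaction`, as well as the fallback for unknown signals, are assumptions.

```python
# Preset comparison table of interaction signals and interaction
# instructions, following the FIG. 4 mapping described in the text.
INSTRUCTION_TABLE = {
    1: "select area",
    2: "open/run target",
    3: "enlarge selected area",
    4: "open target menu bar",
    5: "delete target",
    6: "shrink selected area",
    7: "slide down",
    8: "slide up",
}


def execute_interaction(interaction_signal: int) -> str:
    """Look up the received interaction signal and return the instruction."""
    # Unknown signals are ignored here; the patent does not specify this case.
    return INSTRUCTION_TABLE.get(interaction_signal, "ignore")
```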
Step S15: send the interaction signal to the client, and the client executes the interaction instruction according to the received interaction signal, following the comparison table shown in FIG. 4.
Although step S13 states that different touch types generate different vibration waveforms, the touch type must still be obtained by detecting those waveforms. The specific detection method is shown in FIG. 5 and is as follows:
Set amplitude thresholds M and N, with M larger than N. The amplitude threshold distinguishes friction from single/double clicks: a vibration waveform whose amplitude exceeds M indicates a vibration signal generated by a single or double click, while one whose amplitude does not exceed M indicates a vibration signal generated by friction. Set detection times T2 and T3, with T2 larger than T3. The detection time also helps distinguish friction from single/double clicks: friction generates a dense, continuous vibration signal, so only a short detection time is needed, whereas single and double clicks need a longer detection time to give the user enough reaction time.
If the amplitude X of the vibration signal is detected to exceed M once within time T2, output a single click; if it is detected to exceed M twice, output a double click. T2 can be set the same as T1, i.e. 300-700 ms.
If a vibration signal whose amplitude Y satisfies N < Y < M is detected within time T3, output friction. T3 can be set to 100-300 ms. The amplitude threshold N is set during friction detection to filter out noise, so that slight shaking of the human body does not affect the touch-type judgment: only vibration signals whose waveform amplitude falls in the interval [N, M] are treated as valid.
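The threshold-and-window classification above can be sketched as follows. This is a minimal sketch under stated assumptions: the vibration data is represented as a list of (time, amplitude) peaks, and the concrete values of M, N, T2 and T3 are chosen within the ranges given in the text; none of these names or values come from the patent itself.

```python
M = 1.0   # upper amplitude threshold: above M means a click
N = 0.2   # lower amplitude threshold (noise floor), N < M
T2 = 0.5  # click detection window, within the suggested 300-700 ms
T3 = 0.2  # friction detection window, within the suggested 100-300 ms


def detect_touch_type(peaks):
    """Classify a list of (time_s, amplitude) vibration peaks.

    Returns "single", "double", "friction", or None if only noise was seen.
    Peaks with amplitude <= N are treated as body-shake noise and ignored.
    """
    valid = [(t, a) for t, a in peaks if a > N]
    if not valid:
        return None
    start = valid[0][0]
    # Single/double click: amplitude above M, counted within the T2 window.
    clicks = [t for t, a in valid if a > M and t - start <= T2]
    if len(clicks) >= 2:
        return "double"
    if len(clicks) == 1:
        return "single"
    # Friction: dense peaks with amplitude in (N, M) within the T3 window.
    rubs = [t for t, a in valid if N < a < M and t - start <= T3]
    if rubs:
        return "friction"
    return None
```

A peak list such as `[(0.0, 1.5), (0.2, 1.4)]` classifies as a double click, while a run of sub-M peaks inside T3 classifies as friction.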
When applied to the client, the method for realizing virtual interaction by transmitting signals with a finger comprises the following steps:
receiving an interaction signal sent by a wearable terminal, wherein the interaction signal is calculated from a first signal and/or a second signal; the first signal is a vibration signal generated when a first finger wearing a first sensor of the wearable terminal is touched, and the second signal is a vibration signal generated when a second finger wearing a second sensor is touched; according to whether the wearable terminal detects the touch type as a single click, a double click or friction, the first signal is output as signal A1, signal A2 or signal A3, and the second signal is output as signal B1, signal B2 or signal B3;
and executing an interaction instruction according to the interaction signal.
The interaction signal is calculated from the first signal and/or the second signal as follows:
if the wearable terminal receives only one of signal A1, A2, A3, B1, B2 and B3, the calculated interaction signal is, correspondingly, interaction signal 1, 2, 3, 4, 5 or 6;
if the wearable terminal receives signal A1 and then signal B1 within time T1, the calculated interaction signal is interaction signal 7;
if the wearable terminal receives signal B1 and then signal A1 within time T1, the calculated interaction signal is interaction signal 8.
The method by which the wearable terminal detects whether the touch type is a single click, a double click or friction is as follows:
setting amplitude thresholds M and N, where M is larger than N, and setting detection times T2 and T3, where T2 is larger than T3;
if the amplitude X of the vibration signal is detected to exceed M once within time T2, outputting a single click; if it is detected to exceed M twice, outputting a double click;
if a vibration signal whose amplitude Y satisfies N < Y < M is detected within time T3, outputting friction.
Since the steps of the method applied to the client are substantially the same as those of the method applied to the wearable terminal, its specific principles are not described in detail again.
Example 2
As shown in fig. 6, a system for implementing virtual interactions using finger-transferred signals, the system comprising:
the wearable terminal, provided with a first sensor for acquiring a first signal, wherein the first signal is a vibration signal generated when a first finger wearing the first sensor is touched;
a second sensor for acquiring a second signal, wherein the second signal is a vibration signal generated when a second finger wearing the second sensor is touched;
a touch detection unit for detecting whether the touch type is a single click, a double click or friction, wherein, according to the touch type, the first signal is output as signal A1, signal A2 or signal A3, and the second signal is output as signal B1, signal B2 or signal B3; the detection method of the touch detection unit is as follows:
setting amplitude thresholds M and N, where M is larger than N, and setting detection times T2 and T3, where T2 is larger than T3;
if the amplitude X of the vibration signal is detected to exceed M once within time T2, outputting a single click; if it is detected to exceed M twice, outputting a double click;
if a vibration signal whose amplitude Y satisfies N < Y < M is detected within time T3, outputting friction.
a signal calculation unit for calculating an interaction signal from the received first signal and/or second signal and sending the interaction signal to the client, the calculation method of the signal calculation unit being as follows:
if only one of signal A1, signal A2, signal A3, signal B1, signal B2 and signal B3 is received, the calculated interaction signal is, correspondingly, interaction signal 1, interaction signal 2, interaction signal 3, interaction signal 4, interaction signal 5 or interaction signal 6;
if signal A1 and then signal B1 are received within time T1, the calculated interaction signal is interaction signal 7;
if signal B1 and then signal A1 are received within time T1, the calculated interaction signal is interaction signal 8.
The client is used for running the virtual scene; a comparison table of interaction signals and interaction instructions is preset in it, and the client executes the corresponding interaction instruction according to the received interaction signal.
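The four units described above can be wired together in a minimal end-to-end sketch (single-signal path only). All class, method and instruction names here are invented for illustration; the patent does not prescribe this structure.

```python
class TouchDetectionUnit:
    """Turns a sensor id and a detected touch type into a named signal."""

    SUFFIX = {"single": "1", "double": "2", "friction": "3"}

    def output(self, sensor: str, touch_type: str) -> str:
        # First sensor emits A1/A2/A3; second sensor emits B1/B2/B3.
        prefix = "A" if sensor == "first" else "B"
        return prefix + self.SUFFIX[touch_type]


class SignalCalculationUnit:
    """Maps one named signal to interaction signals 1-6 (single-signal case)."""

    TABLE = {"A1": 1, "A2": 2, "A3": 3, "B1": 4, "B2": 5, "B3": 6}

    def calculate(self, signal: str) -> int:
        return self.TABLE[signal]


class Client:
    """Executes the interaction instruction found in its preset table."""

    def __init__(self, table):
        self.table = table

    def execute(self, interaction_signal: int) -> str:
        return self.table.get(interaction_signal, "ignore")


# A double click on the first finger flowing through the whole pipeline:
detector = TouchDetectionUnit()
calculator = SignalCalculationUnit()
client = Client({1: "select area", 2: "open/run target"})

signal = detector.output("first", "double")   # named signal for the touch
interaction = calculator.calculate(signal)    # interaction signal number
instruction = client.execute(interaction)     # instruction from the table
```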
The working method of this system is substantially the same as that of Embodiment 1, so the specific operating principle of each unit is not described in detail again.
Example 3
This embodiment provides a storage medium comprising a program storage area and a data storage area. The program storage area can store an operating system and the programs required to run an instant-messaging function; the data storage area can store various instant-messaging information, operation instruction sets, and the like. A computer program is stored in the program storage area which, when executed by a processor, implements the method for realizing virtual interaction by transmitting signals with a finger described in Embodiment 1. The processor may include one or more central processing units (CPUs), a digital processing unit, or the like.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution. This manner of description is adopted only for clarity; the specification should be taken as a whole, and the technical solutions in the embodiments may be combined appropriately to form other implementations understandable to those skilled in the art.

Claims (10)

1. A method for realizing virtual interaction by transmitting signals with a finger, applied to a wearable terminal, the wearable terminal comprising a first sensor and a second sensor, characterized in that the method comprises the following steps:
acquiring a first signal of the first sensor, wherein the first signal is a vibration signal generated when a first finger wearing the first sensor is touched;
acquiring a second signal of the second sensor, wherein the second signal is a vibration signal generated when a second finger wearing the second sensor is touched;
detecting whether the touch type is a single click, a double click or friction, and, according to the touch type, outputting the first signal as signal A1, signal A2 or signal A3 and the second signal as signal B1, signal B2 or signal B3;
calculating an interaction signal according to the received first signal and/or second signal;
and sending the interaction signal to a client for the client to execute an interaction instruction according to the received interaction signal.
2. The method for realizing virtual interaction by transmitting signals with a finger according to claim 1, wherein calculating the interaction signal according to the received first signal and/or second signal comprises:
if only one of signal A1, signal A2, signal A3, signal B1, signal B2 or signal B3 is received, the calculated interaction signal corresponds to interaction signal 1, interaction signal 2, interaction signal 3, interaction signal 4, interaction signal 5 or interaction signal 6, respectively;
if signal A1 and then signal B1 are received in succession within time T1, the calculated interaction signal corresponds to interaction signal 7;
if signal B1 and then signal A1 are received in succession within time T1, the calculated interaction signal corresponds to interaction signal 8.
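The mapping rules of claim 2 can be sketched in a few lines. The following Python sketch is illustrative only: the event representation, the function name, and the T1 value of 0.5 seconds are assumptions, since the claim leaves T1 unspecified.

```python
# Hypothetical signal codes; the names A1..B3 follow the claim wording.
SINGLE_SIGNAL_MAP = {
    "A1": 1, "A2": 2, "A3": 3,
    "B1": 4, "B2": 5, "B3": 6,
}

T1 = 0.5  # assumed sequence window in seconds


def calculate_interaction(events):
    """Map a list of (timestamp, code) events to an interaction signal number.

    A lone signal maps to interaction signals 1-6; the ordered pairs
    A1 -> B1 and B1 -> A1 received within T1 map to 7 and 8 respectively.
    """
    if len(events) == 1:
        return SINGLE_SIGNAL_MAP[events[0][1]]
    if len(events) == 2:
        (t_a, first), (t_b, second) = events
        if abs(t_b - t_a) <= T1:
            if (first, second) == ("A1", "B1"):
                return 7
            if (first, second) == ("B1", "A1"):
                return 8
    return None  # no interaction recognized
```

For example, a lone A2 event yields interaction signal 2, while A1 followed by B1 within the window yields interaction signal 7.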
3. The method for realizing virtual interaction by using finger transmission signals according to claim 1 or 2, wherein the method for detecting the touch category comprises:
setting amplitude thresholds M and N, wherein M is larger than N, and setting detection windows T2 and T3, wherein T2 is larger than T3;
if one occurrence of a vibration-signal amplitude X > M is detected within time T2, outputting a single click; if two occurrences are detected within time T2, outputting a double click;
if a vibration-signal amplitude Y satisfying N < Y < M is detected within time T3, outputting friction.
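The threshold-based detection of claim 3 amounts to counting amplitude peaks in a short window. The sketch below assumes concrete values for M, N, T2 and T3 (the patent only requires M > N and T2 > T3) and a simple (timestamp, amplitude) sample format; all names are illustrative.

```python
def classify_touch(samples, M=2.0, N=0.5, T2=0.4, T3=0.2):
    """Classify a touch from vibration samples given as (timestamp, amplitude).

    Within window T2, one amplitude above M is a single click and two are a
    double click; within the shorter window T3, amplitudes between N and M
    indicate friction. Threshold and window values are assumed examples.
    """
    if not samples:
        return None
    t0 = samples[0][0]
    # Count strong peaks (X > M) inside the click-detection window T2.
    peaks = [a for t, a in samples if t - t0 <= T2 and a > M]
    if len(peaks) == 1:
        return "single-click"
    if len(peaks) >= 2:
        return "double-click"
    # Mid-range amplitudes (N < Y < M) inside window T3 suggest rubbing.
    mids = [a for t, a in samples if t - t0 <= T3 and N < a < M]
    if mids:
        return "friction"
    return None
```

A burst with two strong peaks therefore classifies as a double click, while a sustained mid-range vibration classifies as friction.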
4. The method for realizing virtual interaction by transmitting signals with a finger according to claim 1, wherein a lookup table of interaction signals and interaction instructions is preset in the client; after receiving an interaction signal, the client searches the lookup table and executes the corresponding interaction instruction.
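The preset lookup table of claim 4 is a plain signal-to-instruction mapping. In this Python sketch both the table contents and the instruction names are invented for illustration; the patent does not specify them.

```python
# Hypothetical table pairing interaction signals with instructions,
# in the manner claim 4 describes; entries are illustrative only.
INSTRUCTION_TABLE = {
    1: "select",
    2: "confirm",
    3: "scroll",
    7: "grab",
    8: "release",
}


def execute_interaction(signal, table=INSTRUCTION_TABLE):
    """Look up a received interaction signal and return its instruction,
    or None when the signal has no entry in the table."""
    return table.get(signal)
```

So interaction signal 7 would dispatch the "grab" instruction, and an unknown signal is simply ignored.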
5. A method for realizing virtual interaction by transmitting signals with a finger, applied to a client running a virtual scene, characterized in that the method comprises the following steps:
receiving an interaction signal sent by a wearable terminal, wherein the interaction signal is output by calculating a first signal and/or a second signal; the first signal is a vibration signal generated when a first finger wearing a first sensor is touched, and the second signal is a vibration signal generated when a second finger wearing a second sensor is touched; the wearable terminal detects whether the touch type is a single click, a double click or friction, and according to the touch type correspondingly outputs the first signal as signal A1, signal A2 or signal A3 and the second signal as signal B1, signal B2 or signal B3;
and executing an interaction instruction according to the interaction signal.
6. The method for realizing virtual interaction by transmitting signals with a finger according to claim 5, wherein calculating the signal output from the first signal and/or the second signal comprises:
if the wearable terminal receives only one of signal A1, signal A2, signal A3, signal B1, signal B2 or signal B3, the calculated interaction signal corresponds to interaction signal 1, interaction signal 2, interaction signal 3, interaction signal 4, interaction signal 5 or interaction signal 6, respectively;
if the wearable terminal receives signal A1 and then signal B1 in succession within time T1, the calculated interaction signal corresponds to interaction signal 7;
if the wearable terminal receives signal B1 and then signal A1 in succession within time T1, the calculated interaction signal corresponds to interaction signal 8.
7. The method for realizing virtual interaction by transmitting signals with a finger according to claim 6, wherein detecting whether the touch type is a single click, a double click or friction comprises:
setting amplitude thresholds M and N, wherein M is larger than N, and setting detection windows T2 and T3, wherein T2 is larger than T3;
if one occurrence of a vibration-signal amplitude X > M is detected within time T2, outputting a single click; if two occurrences are detected within time T2, outputting a double click;
if a vibration-signal amplitude Y satisfying N < Y < M is detected within time T3, outputting friction.
8. A system for realizing virtual interaction by transmitting signals with a finger, characterized in that the system comprises:
a wearable terminal provided with a first sensor, the first sensor being used for acquiring a first signal, wherein the first signal is a vibration signal generated when a first finger wearing the first sensor is touched;
a second sensor, used for acquiring a second signal, wherein the second signal is a vibration signal generated when a second finger wearing the second sensor is touched;
a touch detection unit, used for detecting whether the touch type is a single click, a double click or friction, and for correspondingly outputting the first signal as signal A1, signal A2 or signal A3 and the second signal as signal B1, signal B2 or signal B3 according to the touch type;
a signal calculation unit, used for calculating an interaction signal according to the received first signal and/or second signal and sending the interaction signal to a client; and
the client, used for running the virtual scene, wherein a lookup table of interaction signals and interaction instructions is preset in the client, and the client executes the corresponding interaction instruction according to the received interaction signal.
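The four components of claim 8 can be wired together as cooperating units. In the Python sketch below, the class names, the instruction table, and the single-signal path are all illustrative assumptions; the sequence rules and touch classification are sketched separately above the respective claims.

```python
class TouchDetectionUnit:
    """Maps a classified touch on a finger to one of the claimed signal codes."""
    CODES = {("A", "single"): "A1", ("A", "double"): "A2", ("A", "friction"): "A3",
             ("B", "single"): "B1", ("B", "double"): "B2", ("B", "friction"): "B3"}

    def output(self, finger, touch_type):
        return self.CODES[(finger, touch_type)]


class SignalCalculationUnit:
    """Maps a lone signal code to an interaction signal number (single-signal case)."""
    MAP = {"A1": 1, "A2": 2, "A3": 3, "B1": 4, "B2": 5, "B3": 6}

    def calculate(self, code):
        return self.MAP[code]


class Client:
    """Runs the virtual scene and dispatches instructions from a preset table."""
    def __init__(self, table):
        self.table = table

    def execute(self, interaction_signal):
        return self.table.get(interaction_signal)


# Wiring: a double tap on the first finger ("A") becomes interaction signal 2,
# which the client resolves through its (here, toy) instruction table.
detection = TouchDetectionUnit()
calc = SignalCalculationUnit()
client = Client({2: "confirm"})
code = detection.output("A", "double")
result = client.execute(calc.calculate(code))
```

This keeps the claimed unit boundaries explicit: detection produces signal codes, calculation produces interaction signals, and only the client knows the instruction table.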
9. The system for realizing virtual interaction by transmitting signals with a finger according to claim 8, wherein the calculation method of the signal calculation unit is as follows:
if only one of signal A1, signal A2, signal A3, signal B1, signal B2 or signal B3 is received, the calculated interaction signal corresponds to interaction signal 1, interaction signal 2, interaction signal 3, interaction signal 4, interaction signal 5 or interaction signal 6, respectively;
if signal A1 and then signal B1 are received in succession within time T1, the calculated interaction signal corresponds to interaction signal 7;
if signal B1 and then signal A1 are received in succession within time T1, the calculated interaction signal corresponds to interaction signal 8;
and the detection method of the touch detection unit comprises:
setting amplitude thresholds M and N, wherein M is larger than N, and setting detection windows T2 and T3, wherein T2 is larger than T3;
if one occurrence of a vibration-signal amplitude X > M is detected within time T2, outputting a single click; if two occurrences are detected within time T2, outputting a double click;
if a vibration-signal amplitude Y satisfying N < Y < M is detected within time T3, outputting friction.
10. A storage medium having stored thereon a computer program which, when executed by a processor, implements the method for realizing virtual interaction by transmitting signals with a finger according to any one of claims 1 to 4 or claims 5 to 7.
CN202311052692.XA 2023-08-21 2023-08-21 Method, system and storage medium for realizing virtual interaction by transmitting signal by finger Pending CN116755565A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311052692.XA CN116755565A (en) 2023-08-21 2023-08-21 Method, system and storage medium for realizing virtual interaction by transmitting signal by finger


Publications (1)

Publication Number Publication Date
CN116755565A true CN116755565A (en) 2023-09-15

Family

ID=87955546

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311052692.XA Pending CN116755565A (en) 2023-08-21 2023-08-21 Method, system and storage medium for realizing virtual interaction by transmitting signal by finger

Country Status (1)

Country Link
CN (1) CN116755565A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150370326A1 (en) * 2014-06-23 2015-12-24 Thalmic Labs Inc. Systems, articles, and methods for wearable human-electronics interface devices
CN110096213A (en) * 2019-04-30 2019-08-06 努比亚技术有限公司 Terminal operation method, mobile terminal and readable storage medium storing program for executing based on gesture
CN110096131A (en) * 2018-01-29 2019-08-06 华为技术有限公司 Sense of touch exchange method, device and sense of touch wearable device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination