CN111552383A - Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment

Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment

Info

Publication number
CN111552383A
Authority
CN
China
Prior art keywords
gesture recognition
information
recognition model
information acquired
pressure
Prior art date
Legal status
Pending
Application number
CN202010335492.5A
Other languages
Chinese (zh)
Inventor
史杰
王西颖
Current Assignee
Nanjing Iqiyi Intelligent Technology Co Ltd
Original Assignee
Nanjing Iqiyi Intelligent Technology Co Ltd
Priority date
2020-04-24
Filing date
2020-04-24
Publication date
2020-08-18
Application filed by Nanjing Iqiyi Intelligent Technology Co Ltd filed Critical Nanjing Iqiyi Intelligent Technology Co Ltd
Priority to CN202010335492.5A priority Critical patent/CN111552383A/en
Publication of CN111552383A publication Critical patent/CN111552383A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

The invention discloses a finger identification method and system for a virtual augmented reality interaction device, and the interaction device itself. The method comprises the following steps: acquiring the user's hand contact information at the current moment, the pressure information of the user's hand on the handle body at the current moment, and the motion information of the handle body at the current moment; obtaining the hand contact information acquired by a sensor matrix, the pressure information acquired by a pressure sensor, and the motion information acquired by an IMU inertial sensor module, filtering the information from the three sensors, and transmitting it to a trained gesture recognition model for gesture recognition. The gesture recognition model takes the hand contact information, pressure information, and motion information as input, performs gesture recognition on that input, and recognizes and outputs the gesture recognition result applied to the handle body by the user at the current moment. Compared with the prior art, the technical scheme of the invention offers significant advantages, including rapid identification and high identification accuracy.

Description

Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment
Technical Field
The invention relates to the technical field of AR/VR (augmented reality/virtual reality), in particular to a finger identification method and system of virtual augmented reality interaction equipment and the interaction equipment.
Background
AR (augmented reality) is a technology that seamlessly integrates real-world and virtual-world information: entity information that would otherwise be difficult to experience within a certain time and space range of the real world is simulated by computer technology, overlaid onto the real world, and perceived by the human senses, achieving a sensory experience beyond reality. VR (virtual reality) is a computer simulation technology for creating and experiencing virtual worlds: a computer generates a simulation environment, an interactive three-dimensional dynamic view and entity-behavior system with multi-source information fusion, and immerses the user in that environment.
An AR or VR device can provide a virtual environment for interaction with a user. The carrier of such a device is generally an interaction handle: when performing finger interaction recognition, the interaction device senses and recognizes the corresponding finger control operation after the user's finger applies it to the interaction handle. The gesture recognition technology of currently common interactive devices falls into two main categories: contact and non-contact methods.
In non-contact gesture recognition schemes, the gestures and motions of the user's hand are photographed by a visual sensor. The captured image is segmented to separate the user's hand from the background, and the segmented hand image is passed to a hand recognition algorithm that recognizes the user's gestures and motions and completes the corresponding follow-up actions. This non-contact approach has several technical problems. First, it is limited by key parameters such as the FOV (field of view) of the camera: outside the FOV the system cannot capture an image of the user's hand, so hand motion and posture cannot be recognized there. Second, it is easily disturbed by ambient light and by occlusion of the hands, so the robustness of the gesture recognition algorithm is poor. Third, configuring an external camera is costly, the algorithm is complex, the recognition speed is low, and the power consumption of the whole recognition system is high.
In existing contact gesture recognition schemes, devices such as inertial sensors, distance sensors, and pressure sensors collect the electrical signals generated by the gestures and motions of the user's hand. After corresponding processing, these signals are output to a hand recognition algorithm that recognizes the user's gestures and motions and completes the follow-up actions. Although this contact approach has no FOV limitation and needs no external camera, the recognition algorithms used are relatively simple, which limits recognition accuracy.
Therefore, how to overcome these problems of existing contact gesture recognition is a question to be solved by those skilled in the art.
Disclosure of Invention
In view of this, the embodiments of the present invention provide a finger recognition method and system for a virtual augmented reality interaction device, and the interaction device itself (that is, the interaction device is a virtual augmented reality interaction device).
An embodiment of the present invention provides a finger recognition method for a virtual augmented reality interaction device, including the following steps:
acquiring hand contact information of a user at the current moment, pressure information of the hand of the user on the handle body at the current moment and motion information of the handle body at the current moment;
acquiring hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filtering the information acquired by the three sensors, and transmitting the information to a trained gesture recognition model for gesture recognition operation;
the trained gesture recognition model takes hand contact information, pressure information and motion information as input information, performs gesture recognition action according to the input information, and recognizes and outputs a gesture recognition result (including gestures and postures) applied to the handle body by the user at the current moment.
Preferably, as one possible embodiment: in the above finger recognition method for a virtual augmented reality interaction device, the gesture recognition model is constructed and trained before information is transmitted to it;
The operation of constructing and training the gesture recognition model comprises the following steps:
the method comprises the steps that the hand contact information of a user is collected by a sensor matrix, the pressure sensor collects the pressure information of the hand of the user on a handle body, and the IMU inertial sensor module collects the motion information of the handle body;
the controller module acquires a large amount of hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filters the information acquired by the three sensors, and transmits the information to a pre-constructed gesture recognition model for gesture recognition training;
after the gesture recognition model is trained on a large number of data sets, a gesture recognition model with high recognition accuracy is obtained.
Preferably, as one possible embodiment: the gesture recognition model mentioned in the above finger recognition method for a virtual augmented reality interaction device is a gesture recognition model obtained based on a neural network or machine learning.
Another embodiment of the present invention provides a finger recognition system of a virtual augmented reality interaction device, comprising an acquisition module and an operation recognition module, wherein:
the acquisition module is used for acquiring hand contact information acquired by the sensor matrix, pressure information acquired by the pressure sensor and motion information acquired by the IMU inertial sensor module, filtering the information acquired by the three sensors and transmitting the information to a trained gesture recognition model for gesture recognition operation;
and the operation recognition module is used for controlling the trained gesture recognition model to take hand contact information, pressure information, and motion information as input information, perform gesture recognition according to the input information, and recognize and output the gesture recognition result applied to the handle body by the user at the current moment.
Preferably, as one possible embodiment: the finger recognition system of the virtual augmented reality interaction device further comprises a gesture recognition model training module;
the gesture recognition model training module is used for obtaining a large amount of hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filtering the information acquired by the three sensors, transmitting the information to a pre-constructed gesture recognition model and performing gesture recognition training; after a large number of data sets are trained, a gesture recognition model with high recognition accuracy can be trained.
Another embodiment of the present invention provides an interactive device (specifically, an interactive handle in an embodiment of the present invention), including: the virtual augmented reality interaction device comprises a handle body, a controller module and a memory, wherein the memory stores a computer program, and the controller module is used for executing the computer program to implement the finger identification method of the virtual augmented reality interaction device;
the interactive device further comprises a proximity sensor matrix, a pressure sensor, and an IMU inertial sensor module;
the proximity sensor matrix is used for identifying the contact action of the user hand on the handle body at the current moment so as to acquire the hand contact information of the user;
the pressure sensor is used for identifying the pressing action of the user on the handle body at the current moment so as to acquire the pressure information of the hand of the user on the handle body;
the IMU inertial sensor module is used for identifying the spatial motion action of the handle body at the current moment so as to acquire the motion information of the handle body in real time;
the controller module is used for acquiring hand contact information acquired by the sensor matrix, pressure information acquired by the pressure sensor and motion information acquired by the IMU inertial sensor module, filtering the information acquired by the three sensors and transmitting the information to a trained gesture recognition model for gesture recognition operation;
the trained gesture recognition model is used for taking hand contact information, pressure information and motion information as input information, performing gesture recognition action according to the input information, and recognizing and outputting a gesture recognition result (including gestures and postures) applied to the handle body by a user at the current moment.
Preferably, as one possible embodiment: the proximity sensor matrix includes 12 capacitive sensor cells, outputting a 1 x 12 matrix of values.
Preferably, as one possible embodiment: the pressure sensor includes at least one pressure sensor matrix that outputs a 1 x 3 matrix of values.
Preferably, as one possible embodiment: the IMU inertial sensor module comprises a three-axis accelerometer and a three-axis gyroscope for measuring acceleration and angular velocity; the IMU inertial sensor module is used for detecting and generating a set of 6-dimensional value matrices, in which 3 values represent the linear acceleration along the IMU's three axes (x, y, z) and the other 3 values represent the angular velocity about those three axes.
The embodiment of the invention has at least the following technical advantages:
the embodiment of the invention provides a finger identification method and a finger identification system of virtual augmented reality interaction equipment and a technical scheme of the interaction equipment; the finger identification method of the virtual augmented reality interaction equipment integrates detection information of various sensors; in the specific implementation process, the hand contact information of the user at the current moment is acquired by using the proximity sensor matrix, the pressure information of the hand of the user on the handle body at the current moment is acquired by using the pressure sensor, and the motion information of the handle body at the current moment is acquired by using the IMU inertial sensor module; the controller module acquires hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filters the information acquired by the three sensors and transmits the information to a trained gesture recognition model to perform gesture recognition operation (a gesture recognition model with higher recognition accuracy is trained by using a neural network or machine learning based mode); and then, using the trained gesture recognition model to take hand contact information, pressure information and motion information as input information, performing gesture recognition action according to the input information, and recognizing and outputting a gesture recognition result (including gestures and postures) applied to the handle body by the user at the current moment.
The finger recognition method of the virtual augmented reality interaction device uses non-visual sensors and takes the data of several sensors as model input (avoiding the data errors caused by relying on a single detection source) to construct the gesture recognition model, which finally completes the recognition of the user's gestures. The method can be widely applied to VR devices and AR devices.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings required to be used in the embodiments will be briefly described below, and it should be understood that the following drawings only illustrate some embodiments of the present invention, and therefore should not be considered as limiting the scope of the present invention. Like components are numbered similarly in the various figures.
Fig. 1 illustrates a main flow diagram of a recognition operation in a finger recognition method of a virtual augmented reality interaction device according to an embodiment of the present invention;
fig. 2 is a schematic main flow chart illustrating the construction of a gesture recognition model in the finger recognition method of the virtual augmented reality interaction device according to the embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a control principle structure of a finger recognition system of a virtual augmented reality interaction device according to an embodiment of the present invention;
fig. 4 shows a schematic structural diagram of a system architecture of a virtual augmented reality interaction device according to an embodiment of the present invention.
Reference numbers: 100-controller module; 101-acquisition module; 102-operation identification module; 103-gesture recognition model training module; 200-memory; 300-handle body; 400-proximity sensor matrix; 500-pressure sensor; 600-IMU inertial sensor module; 700-clock module; 800-motor drive; 900-linear motor; 1000-power module; 1100-antenna module; 1200-input module; 1210-key; 1220-capacitive screen; 1230-joystick.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Hereinafter, the terms "including", "having", and their derivatives, as used in various embodiments of the present invention, are only intended to indicate specific features, numbers, steps, operations, elements, components, or combinations of the foregoing, and should not be construed as excluding the existence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations of the foregoing.
Furthermore, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which various embodiments of the present invention belong. The terms (such as those defined in commonly used dictionaries) should be interpreted as having a meaning that is consistent with their contextual meaning in the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in various embodiments of the present invention.
Example one
The first embodiment provides a finger recognition method for a virtual augmented reality interaction device that can be applied to various virtual reality interaction devices; the interaction device in this embodiment refers in particular to a virtual reality interaction handle. The finger recognition method fuses the detection information of several sensors and constructs a gesture recognition model that performs comprehensive recognition over their combined data; compared with existing recognition methods, it greatly improves precision, achieves higher recognition accuracy, and avoids many false recognition problems.
As shown in fig. 1, a finger recognition method of a virtual augmented reality interaction device in a first embodiment of the present invention is described in detail below. The invention provides a finger identification method of virtual augmented reality interaction equipment, which comprises the following operation steps:
step S100: acquiring hand contact information of a user at the current moment, pressure information of the hand of the user on the handle body at the current moment and motion information of the handle body at the current moment;
step S200: acquiring hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filtering the information acquired by the three sensors, and transmitting the information to a trained gesture recognition model for gesture recognition operation;
step S300: the trained gesture recognition model takes hand contact information, pressure information, and motion information as input information, performs gesture recognition according to the input information, and recognizes and outputs the gesture recognition result (including gestures and postures) applied to the handle body by the user at the current moment.
It should be noted that a trained gesture recognition model must be constructed in advance, before finger recognition at the current moment can be performed; the trained gesture recognition model judges the user's gesture and posture from the input information so that the corresponding follow-up action can be completed. Specifically, the trained gesture recognition model outputs the user's gesture result at the current moment. Steps S100 to S300 describe the prediction and recognition process of the model: the controller module acquires the hand contact information, pressure information, and motion information at the current moment, filters the information from the three sensors, and transmits it to the trained gesture recognition model for gesture recognition (a model with high recognition accuracy trained using a neural network or machine learning); the trained model then takes this information as input, performs gesture recognition on it, and recognizes and outputs the gesture recognition result applied to the handle body by the user at the current moment.
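To make the data flow of steps S100 to S300 concrete, the following is a minimal sketch of one prediction cycle in Python. It is illustrative only: the classifier interface (a scikit-learn-style predict method) and the per-channel median filtering over a short window are assumptions standing in for whatever trained model and filter the handle firmware actually uses.

    from collections import deque

    import numpy as np

    class GestureRecognizer:
        # One prediction cycle per call: fuse 12 touch values, 3 pressure
        # values, and 6 IMU values, filter them over a short time window,
        # and classify the result with a pre-trained model.
        def __init__(self, model, window=5):
            self.model = model                  # any classifier with .predict()
            self.frames = deque(maxlen=window)  # most recent raw frames

        def step(self, touch, pressure, imu):
            # 12 + 3 + 6 = 21 values describing the hand at this moment
            frame = np.concatenate([touch, pressure, imu]).astype(float)
            self.frames.append(frame)
            # Per-channel median over the window suppresses one-sample
            # glitches (a simple stand-in for the EKF/median filtering
            # described for the training data)
            filtered = np.median(np.stack(self.frames), axis=0)
            return self.model.predict(filtered.reshape(1, -1))[0]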
According to the finger recognition method of the virtual augmented reality interaction device, a gesture recognition model is built by taking the data of several sensors as model input (avoiding the data errors caused by a single detection source), and the recognition of the user's gestures is finally completed through that gesture recognition model.
As shown in fig. 2, in the finger recognition method for virtual augmented reality interaction equipment, before the transmission to the trained gesture recognition model, the method further includes building and training the gesture recognition model;
The operation of constructing and training the gesture recognition model comprises the following steps:
step S10: the sensor matrix collects the user's hand contact information, the pressure sensor collects the pressure information of the user's hand on the handle body, and the IMU inertial sensor module collects the motion information of the handle body;
step S20: the controller module acquires a large amount of hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filters the information acquired by the three sensors, and transmits the information to a pre-constructed gesture recognition model for gesture recognition training;
step S30: after the gesture recognition model is trained on a large number of data sets, a gesture recognition model with high recognition accuracy is obtained.
The above steps S10-S30 illustrate the process of constructing and training the gesture recognition model.
In the above technical solution, the correct gesture posture is known during training. The 21 values collected from the 3 types of sensors (12 proximity values, 3 pressure values, and 6 IMU values) are processed by corresponding filtering, for example EKF (extended Kalman filter) filtering or median filtering, and the filtered data constitute one set of training data. Each set of training data is paired with its true-value label (the known correct gesture) to form the training set for the gesture recognition model. The gesture recognition model may be trained based on neural network or machine learning principles, for example using SVM (support vector machine) principles. If the recognition result of the gesture recognition model is not good, the training data set can be enlarged or the training model changed to improve recognition accuracy, until a satisfactory gesture recognition model is trained.
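As a hedged sketch of this training procedure, assuming the filtered 21-value samples have already been collected into an array X with ground-truth gesture labels y (both hypothetical names), the SVM option named above could be trained as follows:

    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    def train_gesture_model(X, y):
        # X: N x 21 array (12 proximity + 3 pressure + 6 IMU values per sample)
        # y: N ground-truth gesture labels recorded while the pose was known
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.2, random_state=0)
        model = SVC(kernel="rbf")  # the SVM option named in the text
        model.fit(X_train, y_train)
        # If this score is unsatisfactory, enlarge the data set or change
        # the model, as the text suggests
        accuracy = model.score(X_test, y_test)
        return model, accuracy

A model trained this way can be passed directly to the GestureRecognizer sketch above, since both assume the same 21-value feature layout.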
The gesture recognition model mentioned in the above finger recognition method for virtual augmented reality interaction device is a gesture recognition model obtained based on a neural network or a machine learning manner.
Example two
As shown in fig. 3, another embodiment of the present invention provides a finger recognition system of a virtual augmented reality interaction device (the gesture recognition system refers to the controller module, and the controller module 100 is composed of a plurality of program module units), comprising an acquisition module 101 and an operation recognition module 102, wherein:
the acquisition module 101 is configured to acquire hand contact information acquired by the proximity sensor matrix, pressure information acquired by the pressure sensor, and motion information acquired by the IMU inertial sensor module, filter the acquired information of the three sensors, and transmit the information to a trained gesture recognition model for gesture recognition operation;
the operation recognition module 102 is configured to control the trained gesture recognition model to use hand contact information, pressure information, and motion information as input information, perform gesture recognition according to the input information, and recognize and output a gesture recognition result (including a gesture and a posture) applied to the handle body by the user at the current moment.
Preferably, as one possible embodiment: the finger recognition system of the virtual augmented reality interaction device further comprises a gesture recognition model training module 103;
the gesture recognition model training module 103 is used for obtaining a large amount of hand contact information acquired by the sensor matrix, pressure information acquired by the pressure sensor and motion information acquired by the IMU inertial sensor module, filtering the information acquired by the three sensors, transmitting the information to a pre-constructed gesture recognition model and performing gesture recognition training; after a large number of data sets are trained, a gesture recognition model with high recognition accuracy can be trained.
EXAMPLE III
As shown in fig. 4, the third embodiment of the present invention includes the program module units of the controller module 100 from the second embodiment, and also includes other hardware structures, such as a power module, an antenna module, a clock module, a motor driver, a linear motor, and an input module;
an embodiment of the present invention provides an interactive device (specifically, an interactive handle in an embodiment of the present invention), including: a controller module 100, a memory 200 and a handle body 300, wherein the memory 200 stores a computer program, and the controller module 100 (i.e. a processor) is used for executing the computer program to implement the finger recognition method of the virtual augmented reality interaction device according to the first embodiment;
the interactive device further comprises a proximity sensor matrix 400, pressure sensors 500 and an IMU inertial sensor module 600;
the proximity sensor matrix 400 is used for identifying the contact action of the user hand on the handle body at the current moment, so as to collect the hand contact information of the user; the proximity sensor matrix 400 is disposed on the back of the grip of the handle body 300, collects touch information of the user's handle, and identifies the contact position and area between the user's hand and the handle.
The pressure sensor 500 is used to identify the pressing action applied by the user to the handle body at the current moment, so as to collect the pressure information of the user's hand on the handle body; the pressure sensor module is placed on the back of the handle body 300 to collect that pressure information.
The IMU inertial sensor module 600 is configured to identify a spatial motion of the handle body at the current time, so as to acquire motion information of the handle body in real time;
the controller module 100 is configured to obtain hand contact information acquired by the sensor matrix 400, pressure information acquired by the pressure sensor 500, and motion information acquired by the IMU inertial sensor module 600, filter the information acquired by the three sensors, and transmit the information to a trained gesture recognition model for gesture recognition;
the trained gesture recognition model is used for taking hand contact information, pressure information and motion information as input information, performing gesture recognition action according to the input information, and recognizing and outputting a gesture recognition result (including gestures and postures) applied to the handle body by a user at the current moment.
It should be noted that the controller module 100 (i.e., the processor) is configured to obtain the detection information collected by the three different sensors, filter it, and transmit it to a neural-network- or machine-learning-based training model for gesture recognition training. After training on a large number of data sets, a gesture recognition model with high recognition accuracy is obtained; once the model has been trained, gesture recognition at the current moment only requires the trained gesture recognition model to execute the judgment and recognition operation.
The interactive device of the third embodiment mainly comprises the following modules: a controller module 100, a memory 200, a handle body 300, a proximity sensor matrix 400, a pressure sensor 500, an IMU inertial sensor module 600, a clock module 700, a motor drive 800, a linear motor 900, a power module 1000, an antenna module 1100, and an input module 1200 (the input module includes keys 1210, a capacitive screen 1220, and a joystick 1230, implementing user interaction with the handle);
in a specific structure, the controller module 100: and the signals of all the modules are controlled and processed, so that the handle works in a normal state. IMU inertial sensor module 600: the module measures the motion information of the handle in real time, transmits the collected motion information of the handle to the controller module 100, and supplies the motion information to the controller module 100 for analysis and judgment. The controller module 100 can calculate the motion information of the handle through a corresponding algorithm, and can correct the data collected by the pressure sensor 500 and the like by using the motion information. The power module 1000 may power each of the other modules. The antenna module 1100: the handle body can be in wireless communication with other equipment by using the antenna module. The motor drive 800 described above: the linear motor is driven to work normally. The linear motor 900: the handle has a vibration function. The input module 1200 described above: the button, capacitive screen and control rod realize the interactive function of user and handle.
Preferably, as one possible embodiment: the proximity sensor matrix 400 has 12 capacitive sensor cells in total, outputting a 1 x 12 matrix of values.
It should be noted that, regarding the proximity sensor matrix 400, each sensor unit in the matrix is a capacitive sensor. There are 12 capacitive sensor cells in the proximity sensor matrix, outputting a 1 x 12 matrix of values. When the user's hand approaches or touches a capacitive sensor cell, that cell detects the change in capacitance caused by the hand's movement or contact. After corresponding digital processing of the capacitance value, the sensor outputs digital information; for example, the digital output range of each capacitive sensor cell is 0-127. The value matrix output by the proximity sensor matrix with its 12 capacitive sensor cells can therefore be represented as (c1, c2, c3, c4, c5, c6, c7, c8, c9, c10, c11, c12), where each c is in the range 0-127 and a smaller value indicates that the user's hand is farther away from that capacitive sensor cell.
For example, as the user's hand approaches the proximity sensor matrix, the matrix may output (20, 32, 64, 96, 72, 11, 13, 56, 37, 23, 86, 110). From these data, the user's hand is closest to the c12 capacitive sensor cell and farthest from the c6 capacitive sensor cell.
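A short sketch of reading such a frame in code (purely illustrative; which cell corresponds to which part of the grip depends on the physical layout of the handle):

    import numpy as np

    # Example 1 x 12 proximity reading from the text, values in 0-127;
    # a larger value means the hand is closer to that capacitive cell
    reading = np.array([20, 32, 64, 96, 72, 11, 13, 56, 37, 23, 86, 110])

    closest = int(np.argmax(reading)) + 1   # 1-based cell index -> c12
    farthest = int(np.argmin(reading)) + 1  # -> c6
    print(f"hand closest to c{closest}, farthest from c{farthest}")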
Preferably, as one possible embodiment: the pressure sensor 500 comprises at least one pressure sensor matrix outputting a 1 x 3 matrix of values.
It should be noted that, regarding the pressure sensor 500, the pressure sensor in the embodiment of the present invention is based on the piezoelectric principle and generates a pressure value. A pressure sensor matrix may also be used: each pressure sensor unit in the matrix is a piezoelectric sensor that senses changes in the pressure value and, after corresponding digital processing, outputs the pressure value as digital information for the subsequent algorithm. For example, with the 1 x 3 pressure sensor matrix used here, the output value matrix can be represented as (p1, p2, p3), where each p value varies with the contact and movement of the user's hand; the value range of each pressure sensor unit currently used is 0-32.
When the handle is stationary and the user presses a hand against the pressure sensor matrix, the matrix may output the values (10, 20, 30), where smaller values indicate less force applied by the user to the handle. From these data, the force exerted by the user on the p1 sensor is the smallest and the force exerted on the p3 sensor is the largest.
The pressure sensor matrix also produces data changes when the handle accelerates, even without any external force applied to the handle's pressure matrix. For example, if the handle is in free fall, accelerating at 9.8 m/s², the pressure sensor matrix will output the values (16, 16, 16).
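This inertial sensitivity is why, in the third embodiment, the controller module corrects the pressure data with the IMU's motion information. The sketch below assumes the simple proportional model implied by the free-fall example above (9.8 m/s² of acceleration contributes about 16 counts per cell); the constant and the axis mapping are assumptions for illustration, not values given by the text.

    import numpy as np

    G = 9.8                # m/s^2
    COUNTS_PER_G = 16.0    # from the free-fall example: 9.8 m/s^2 -> 16 counts/cell

    def compensate_pressure(raw_counts, accel_mps2):
        # Subtract the acceleration-induced component from a 1 x 3 pressure
        # reading, leaving (approximately) the user's grip force.
        #   raw_counts: (p1, p2, p3) values in 0-32
        #   accel_mps2: handle acceleration along the sensors' axis, taken
        #               from the IMU (axis mapping is hardware-specific)
        inertial = COUNTS_PER_G * abs(accel_mps2) / G
        return np.clip(np.asarray(raw_counts, dtype=float) - inertial, 0.0, 32.0)

    # Free-falling handle with nothing touching it: the inertial part cancels
    print(compensate_pressure((16, 16, 16), 9.8))  # -> [0. 0. 0.]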
Preferably, as one possible embodiment: the IMU inertial sensor module 600 comprises a three-axis accelerometer and a three-axis gyroscope for measuring acceleration and angular velocity; the IMU inertial sensor module 600 is configured to generate a set of 6-dimensional value matrices, in which 3 values represent the linear acceleration along the IMU's three axes (x, y, z) and the other 3 values represent the angular velocity about those three axes.
It should be noted that, regarding the IMU inertial sensor module 600, the IMU in the embodiment of the present invention comprises a three-axis accelerometer and a three-axis gyroscope for measuring acceleration and angular velocity. The IMU generates a set of 6-dimensional value matrices, in which 3 values represent the linear acceleration along the IMU's three axes (x, y, z) and the other 3 values represent the angular velocity about those axes. The data output by the IMU can be represented as (ax, ay, az, gx, gy, gz), where ax, ay, and az represent the linear acceleration on the IMU's x, y, and z axes respectively, with an output range of -2048 to +2048. Likewise, gx, gy, and gz represent the angular velocity about the x, y, and z axes respectively, also with an output range of -2048 to +2048.
When the IMU inertial sensor module 600 detects contact by the user's hand or a corresponding movement of the handle, each value output by the IMU changes accordingly. For example, when the handle is stationary the IMU may output (0, 0, 128, 0, 0, 0), the z value reflecting the 1 g reaction to gravity. When the handle is in free fall, accelerating at 9.8 m/s² along the direction of gravity, the IMU may output (0, 0, 0, 0, 0, 0).
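A small sketch of decoding such a 6-value frame follows. The raw range of ±2048 comes from the text; the full-scale assumption below (±16 g, chosen so that the stationary z reading of 128 equals exactly 1 g) is an illustrative assumption, not a value the text specifies.

    import numpy as np

    # Assumed +/-16 g accelerometer full scale: 2048 counts = 16 g, so the
    # stationary z reading of 128 from the text maps to exactly 1 g
    ACCEL_SCALE = 16 * 9.8 / 2048.0  # m/s^2 per count

    def decode_imu(frame):
        # Split a raw 6-value IMU frame into acceleration and angular rate
        ax, ay, az, gx, gy, gz = frame
        accel = np.array([ax, ay, az], dtype=float) * ACCEL_SCALE  # m/s^2
        # Gyro values stay in raw counts; their scale depends on the
        # configured angular-rate range, which the text does not give
        gyro = np.array([gx, gy, gz], dtype=float)
        return accel, gyro

    accel, _ = decode_imu((0, 0, 128, 0, 0, 0))  # stationary handle example
    print(accel)  # -> [0.  0.  9.8]: the 1 g reaction to gravity on the z axis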
Generally speaking, the IMU inertial sensor module perceives large movements of the hand well, but its ability to recognize the small movements of individual fingers is poor. The finger recognition method of the virtual augmented reality interaction device provided by this embodiment therefore fuses the detection of the proximity sensor matrix and the pressure sensor with it (the proximity sensor matrix in particular recognizes micro-motion information well), which compensates for that deficiency; the gesture recognition model built on the three combined kinds of data information has stronger recognition capability and higher recognition accuracy.
In summary, the finger recognition method and system for the virtual augmented reality interaction device and the interaction device provided by the embodiment have the technical advantages of higher recognition processing speed, higher accuracy, wide application range and the like compared with the prior art.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.

Claims (9)

1. A finger recognition method of a virtual augmented reality interaction device is characterized by comprising the following operation steps:
acquiring hand contact information of a user at the current moment, pressure information of the hand of the user on the handle body at the current moment and motion information of the handle body at the current moment;
acquiring hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filtering the information acquired by the three sensors, and transmitting the information to a trained gesture recognition model for gesture recognition operation;
the trained gesture recognition model takes hand contact information, pressure information and motion information as input information, performs gesture recognition action according to the input information, and recognizes and outputs a gesture recognition result applied to the handle body by the user at the current moment.
2. The method of claim 1, further comprising constructing and training the gesture recognition model before transmitting information to the trained gesture recognition model;
the operation of constructing and training the gesture recognition model comprising the following steps:
the sensor matrix collects the user's hand contact information, the pressure sensor collects the pressure information of the user's hand on the handle body, and the IMU inertial sensor module collects the motion information of the handle body;
the controller module acquires a large amount of hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor and motion information acquired by an IMU inertial sensor module, filters the information acquired by the three sensors, and transmits the information to a pre-constructed gesture recognition model for gesture recognition training;
after the gesture recognition model is trained on a large number of data sets, a gesture recognition model with high recognition accuracy is obtained.
3. The method as claimed in claim 2, wherein the gesture recognition model is a gesture recognition model obtained based on a neural network or machine learning manner.
4. A finger recognition system of a virtual augmented reality interaction device, characterized by comprising an acquisition module and an operation recognition module, wherein:
the acquisition module is used for acquiring hand contact information acquired by the sensor matrix, pressure information acquired by the pressure sensor and motion information acquired by the IMU inertial sensor module, filtering the information acquired by the three sensors and transmitting the information to a trained gesture recognition model for gesture recognition operation;
and the operation recognition module is used for controlling the trained gesture recognition model to take hand contact information, pressure information, and motion information as input information, perform gesture recognition according to the input information, and recognize and output the gesture recognition result applied to the handle body by the user at the current moment.
5. The finger recognition system of the virtual augmented reality interaction device of claim 4, further comprising a gesture recognition model training module;
the gesture recognition model training module is used for obtaining a large amount of hand contact information acquired by a sensor matrix, pressure information acquired by a pressure sensor, and motion information acquired by an IMU inertial sensor module, filtering the information from the three sensors, and transmitting it to a pre-constructed gesture recognition model for gesture recognition training; after training on a large number of data sets, a gesture recognition model with high recognition accuracy is obtained.
6. An interactive device, comprising: a handle body, a controller module and a memory, the memory storing a computer program, the controller module being configured to execute the computer program to implement the finger recognition method of the virtual augmented reality interaction device of any one of the preceding claims 1 to 3;
the interactive device further comprises a proximity sensor matrix, a pressure sensor, and an IMU inertial sensor module;
the proximity sensor matrix is used for identifying the contact action of the user hand on the handle body at the current moment so as to acquire the hand contact information of the user;
the pressure sensor is used for identifying the pressing action of the user on the handle body at the current moment so as to acquire the pressure information of the hand of the user on the handle body;
the IMU inertial sensor module is used for identifying the spatial motion action of the handle body at the current moment so as to acquire the motion information of the handle body in real time;
the controller module is used for acquiring hand contact information acquired by the sensor matrix, pressure information acquired by the pressure sensor and motion information acquired by the IMU inertial sensor module, filtering the information acquired by the three sensors and transmitting the information to a trained gesture recognition model for gesture recognition operation;
the trained gesture recognition model is used for taking hand contact information, pressure information and motion information as input information, performing gesture recognition action according to the input information, and recognizing and outputting a gesture recognition result applied to the handle body by a user at the current moment.
7. The interaction device of claim 6, wherein the proximity sensor matrix comprises 12 capacitive sensor cells outputting a 1 x 12 matrix of values.
8. The interaction device of claim 6, wherein the pressure sensors comprise at least one pressure sensor matrix that outputs a 1 x 3 matrix of values.
9. The interaction device of claim 6, wherein the IMU inertial sensor module includes a three-axis accelerometer and a three-axis gyroscope to enable measurement of acceleration and angular velocity; the IMU inertial sensor module is used for detecting and generating a set of 6-dimensional value matrices, in which 3 values represent the linear acceleration along the IMU's three axes (x, y, z) and the other 3 values represent the angular velocity about those three axes.
CN202010335492.5A 2020-04-24 2020-04-24 Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment Pending CN111552383A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010335492.5A CN111552383A (en) 2020-04-24 2020-04-24 Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010335492.5A CN111552383A (en) 2020-04-24 2020-04-24 Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment

Publications (1)

Publication Number Publication Date
CN111552383A 2020-08-18

Family

ID=72003985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010335492.5A Pending CN111552383A (en) 2020-04-24 2020-04-24 Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment

Country Status (1)

Country Link
CN (1) CN111552383A (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102945362A (en) * 2012-10-18 2013-02-27 中国科学院计算技术研究所 Isomerous data fusion based coordinated gesture recognition method and system of sensor
CN105117016A (en) * 2015-09-07 2015-12-02 众景视界(北京)科技有限公司 Interaction handle used in interaction control of virtual reality and augmented reality
CN105975072A (en) * 2016-04-29 2016-09-28 乐视控股(北京)有限公司 Method, device and system for identifying gesture movement
CN108983979A (en) * 2018-07-25 2018-12-11 北京因时机器人科技有限公司 A kind of gesture tracking recognition methods, device and smart machine

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112328154A (en) * 2020-11-11 2021-02-05 维沃移动通信有限公司 Equipment control method and device and electronic equipment
CN112685919A (en) * 2021-03-12 2021-04-20 南京爱奇艺智能科技有限公司 Handle tracking effect evaluation method
CN113687714A (en) * 2021-07-16 2021-11-23 北京理工大学 Fingertip interaction system and method of active flexible pressure sensor
CN117251058A (en) * 2023-11-14 2023-12-19 中国海洋大学 Control method of multi-information somatosensory interaction system
CN117251058B (en) * 2023-11-14 2024-01-30 中国海洋大学 Control method of multi-information somatosensory interaction system

Similar Documents

Publication Publication Date Title
CN111552383A (en) Finger identification method and system of virtual augmented reality interaction equipment and interaction equipment
Wang et al. Controlling object hand-over in human–robot collaboration via natural wearable sensing
CN110262664B (en) Intelligent interactive glove with cognitive ability
KR100630806B1 (en) Command input method using motion recognition device
Lu et al. Gesture recognition using data glove: An extreme learning machine method
KR100777107B1 (en) apparatus and method for handwriting recognition using acceleration sensor
Jingqiu et al. An ARM-based embedded gesture recognition system using a data glove
CN111966217A (en) Unmanned aerial vehicle control method and system based on gestures and eye movements
CN106970705A (en) Motion capture method, device and electronic equipment
Prasad et al. A wireless dynamic gesture user interface for HCI using hand data glove
Hsu et al. Drift modeling and compensation for MEMS-based gyroscope using a Wiener-type recurrent neural network
CN111158476B (en) Key recognition method, system, equipment and storage medium of virtual keyboard
CN109960404B (en) Data processing method and device
CN112527104A (en) Method, device and equipment for determining parameters and storage medium
CN108089710A (en) A kind of electronic equipment control method, device and electronic equipment
Sung et al. Motion quaternion-based motion estimation method of MYO using K-means algorithm and Bayesian probability
CN110236560A (en) Six axis attitude detecting methods of intelligent wearable device, system
Vu et al. Hand pose detection in hmd environments by sensor fusion using multi-layer perceptron
CN115079684A (en) Feedback method of robot and robot
Khan et al. Gesture recognition using Open-CV
Mahajan et al. Digital pen for handwritten digit and gesture recognition using trajectory recognition algorithm based on triaxial accelerometer
Lee et al. A hand gesture recognition method using inertial sensor for rapid operation on embedded device
US10156907B2 (en) Device for analyzing the movement of a moving element and associated method
CN117290773B (en) Amphibious personalized gesture recognition method and recognition system based on intelligent data glove
US11782522B1 (en) Methods and systems for multimodal hand state prediction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination