CN114652331A - System for testing and evaluating coordination between accurate grasping muscles based on mixed reality - Google Patents
- Publication number
- CN114652331A (application CN202210227558.8A)
- Authority
- CN
- China
- Prior art keywords
- mixed reality
- muscles
- evaluating
- coordination
- network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/389—Electromyography [EMG]
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
Abstract
The invention discloses a system for testing and evaluating coordination among muscles during precise grasping, based on mixed reality, comprising: a real scene acquisition device for acquiring real grasping information during the grasping process; an electromyographic sensing device for acquiring multi-channel electromyographic signals during the grasping process; and a mixed reality platform comprising a processor and a mixed reality device. The processor performs three-dimensional reconstruction of the real scene from the real grasping information and feeds it back to the mixed reality device to construct a virtual scene, into which a time delay is introduced. From the multi-channel electromyographic signals, a multilayer novel weighted complex network is constructed, with electromyographic sampling points as nodes and directed connections between nodes as edges, and network structure indices for evaluating inter-muscle coordination are extracted from the phase synchronization relationship of any two layer networks in the multilayer novel weighted complex network. Introducing the time delay desynchronizes visual feedback information from somatosensory information, and the proposed multilayer novel weighted complex network quantitatively evaluates the coordination among muscles.
Description
Technical Field
The invention relates to the technical field of grasping tests, and in particular to a system for testing and evaluating coordination between muscles during precise grasping based on mixed reality.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
Precise grasping is a very common and important hand activity in daily life. When the human hand grasps, multiple muscles act together under the control of the central nervous system, producing coordinated muscle-group synergies. As the power source of the motor control system, this coordinated activity among multiple muscles plays a key role in the grasping movement. It is generally accepted that skilled object manipulation requires the combined action of feedforward and feedback control in the neural control mechanism. Perceptual information from multiple modalities, including vision, touch, proprioception and hearing, plays an important role in sensory feedback; it is closely coupled with motor intention to form a sensorimotor fusion mechanism and strongly influences precise grasping and other fine operations. Testing and evaluating multi-muscle coordination under sensorimotor fusion not only reflects the function of the neuromuscular system, but is also significant for the integrated development of perception and movement, for the mechanisms of learning and cognition of operational skills, and for the diagnosis, evaluation and rehabilitation of various clinical neuromuscular diseases.
Virtual Reality (VR) uses a computer to create a virtual environment that provides simulated visual, tactile and auditory information, making the user feel present in and able to interact with that environment. However, VR technology completely isolates the user from the surrounding real environment: when the user completes an action in the virtual environment, the real touch of an actual object is lacking, so effective tactile feedback cannot be obtained during grasping, vision and touch cannot be desynchronized during precise grasping, and the multi-muscle coordination process cannot be precisely measured and quantitatively evaluated.
Disclosure of Invention
To solve these problems, the invention provides a system for testing and evaluating coordination between muscles during precise grasping based on mixed reality, in which a time delay is introduced between the real scene and the virtual scene to desynchronize visual feedback information from somatosensory information. A multilayer novel weighted complex network is also provided, which calculates the interaction relationships among the multi-channel electromyographic signals and quantitatively evaluates the coordination among muscles.
In order to achieve the purpose, the invention adopts the following technical scheme:
in a first aspect, the present invention provides a system for testing and evaluating coordination between precise grasping muscles based on mixed reality, comprising: the system comprises a real scene acquisition device, a myoelectricity sensing device and a mixed reality platform;
the real scene acquisition device is used for acquiring real gripping information in the gripping process;
the myoelectric sensing device is used for acquiring a multi-channel myoelectric signal in the gripping process;
the mixed reality platform comprises a processor and mixed reality equipment, wherein the processor receives real gripping information, carries out three-dimensional reconstruction on a real scene according to the real gripping information and feeds back the real scene to the mixed reality equipment so as to construct a virtual scene, and introduces time delay into the virtual scene so as to realize evaluation of the coordination among muscles under different time delays;
the processor receives the multi-channel electromyographic signals and evaluates inter-muscle coordination from them; specifically, a multilayer novel weighted complex network is constructed from the multi-channel electromyographic signals, with electromyographic sampling points as nodes and directed connections between nodes as edges, and network structure indices for evaluating inter-muscle coordination are extracted from the phase synchronization relationship of any two layer networks in the multilayer novel weighted complex network.
As an alternative embodiment, a time delay is introduced into the virtual scene to desynchronize the somatosensory feedback information and the visual feedback information, so that the two become inconsistent, and the influence of the visual-somatosensory perception conflict on inter-muscle coordination is quantitatively evaluated.
As an alternative embodiment, the process of constructing the multilayer novel weighted complex network includes: mapping the electromyographic signal of each channel into a novel weighted complex network, and finally constructing the multilayer novel weighted complex network from the natural correspondence of the electromyographic sampling points across the synchronously acquired multi-channel electromyographic signals.
As an alternative implementation, the phase synchronization relationship of any two layer networks in the multilayer novel weighted complex network is represented by a phase-locking value: when the phase-locking value is 0, there is no phase synchronization; when the phase-locking value is 1, there is phase synchronization.
As an alternative embodiment, the phase-locking value is obtained as follows: the average weighted degree feature of each layer of the novel weighted complex network is extracted, and the phase-locking value is computed from the phase difference of the average weighted degree sequences of any two layer networks.
In an alternative embodiment, the average weighted degree feature is the average weight of all edges of each node, and the average weighted degree sequence is formed from the average weighted degree features of all nodes in each layer network.
As an alternative embodiment, the network structure indices for evaluating inter-muscle coordination include the clustering coefficient and the characteristic path length.
As an alternative embodiment, the clustering coefficient is:

CC = (1/M) Σ_{p=1}^{M} [ Σ_{q≠p} Σ_{r≠p,q} (PLV_pq · PLV_rp · PLV_rq)^{1/3} ] / [ k_p (k_p − 1) ]

where CC is the clustering coefficient, M is the number of network layers, k_p is the degree of the p-th layer network, and PLV_pq, PLV_rp, PLV_rq are the elements at positions (p, q), (r, p), (r, q) of the matrix PLV.
As an alternative embodiment, the characteristic path length is:

CPL = (1/M) Σ_{p=1}^{M} L_p,  L_p = (1/(M − 1)) Σ_{q≠p} l_pq

where L_p is the average characteristic path length of the p-th layer network, l_pq is the shortest path length between the p-th and q-th layer networks, M is the number of network layers, and CPL is the characteristic path length.
In a second aspect, the invention provides a method for testing and evaluating coordination between muscles during precise grasping based on mixed reality, comprising the following steps:
acquiring real gripping information in the gripping process and multi-channel electromyographic signals in the gripping process;
performing three-dimensional reconstruction of the real scene from the real grasping information and feeding it back to the mixed reality device to construct a virtual scene, with a time delay introduced into the virtual scene to evaluate inter-muscle coordination under different time delays;
the process of evaluating inter-muscle coordination includes: constructing, from the multi-channel electromyographic signals, a multilayer novel weighted complex network with electromyographic sampling points as nodes and directed connections between nodes as edges, and extracting network structure indices for evaluating inter-muscle coordination from the phase synchronization relationship of any two layer networks in the multilayer novel weighted complex network.
Compared with the prior art, the invention has the beneficial effects that:
the invention provides a system for testing and evaluating coordination between accurate grasping muscles based on mixed reality. Muscle activity parameters at different time delays were compared to the no-delay condition to quantify the effect of visual-somatosensory perception conflicts on precise grasping of multi-muscle coordination.
The system records, over the whole grasping activity, the electromyographic signals of multiple muscles of the hand and forearm; proposes a multilayer novel weighted complex network for constructing complex networks from multi-channel electromyographic signals; calculates the interaction relationships among the multi-channel electromyographic signals; and quantitatively evaluates the coordination among muscles. It thereby enables functional evaluation of precise grasping and accurate quantitative evaluation of the degree of injury to, and rehabilitation of, grasping-related muscle function, with important application value for sensorimotor function testing of the nervous system, evaluation of multi-muscle coordination in precise grasping, and early diagnosis and rehabilitation evaluation of various neuromuscular diseases.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it.
Fig. 1 is a schematic structural diagram of a gripping device provided in embodiment 1 of the present invention;
fig. 2 is a schematic view of a mixed reality gripping platform provided in embodiment 1 of the present invention;
fig. 3 is a schematic diagram of a mixed reality interface provided in embodiment 1 of the present invention;
FIG. 4 is a flowchart of the test provided in example 1 of the present invention;
fig. 5 is a schematic diagram of a multilayer novel weighted complex network provided in embodiment 1 of the present invention;
the device comprises a stand column 1, a stand column 2, a force/torque sensor 3, a force/torque sensor 4, a force/torque sensor 5, a force/torque sensor 6, a force/torque sensor 7, a base 8, a gripping device 9, a myoelectricity sensing device 10, a mixed reality helmet 11, a depth camera 12, an upper computer platform 13, a test platform table 14, a hand 15 and a virtual circular table.
Detailed Description
The invention is further described with reference to the following figures and examples.
It is to be understood that the following detailed description is exemplary and is intended to provide further explanation of the invention as claimed. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for describing particular embodiments only and is not intended to limit exemplary embodiments according to the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should further be understood that the terms "comprises" and "comprising", and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article or apparatus.
The embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
Example 1
Mixed Reality (MR) is an important branch of VR; the key difference from VR is that MR blends real images with virtual scenes rather than completely isolating the user from the surrounding real environment. Using MR technology, vision and touch can be desynchronized during precise grasping: a visual time delay creates visual interference, yielding representations of motor behavior when visual information is inconsistent with tactile and other somatosensory information, and forming an accurate quantitative evaluation of sensorimotor function while preserving participants' engagement and sense of presence.
Accordingly, this embodiment proposes a mixed reality-based system for testing and evaluating coordination between muscles during precise grasping, comprising: a real scene acquisition device, a myoelectric sensing device and a mixed reality platform;
the real scene acquisition device is used for acquiring real gripping information in the gripping process;
the myoelectric sensing device is used for acquiring a multi-channel myoelectric signal in the gripping process;
the mixed reality platform comprises a processor and mixed reality equipment, wherein the processor receives real gripping information, carries out three-dimensional reconstruction on a real scene according to the real gripping information and feeds back the real scene to the mixed reality equipment so as to construct a virtual scene, and introduces time delay into the virtual scene so as to realize evaluation of the coordination among muscles under different time delays;
the processor receives the multi-channel electromyographic signals and evaluates inter-muscle coordination from them; specifically, a multilayer novel weighted complex network is constructed from the multi-channel electromyographic signals, with electromyographic sampling points as nodes and directed connections between nodes as edges, and network structure indices for evaluating inter-muscle coordination are extracted from the phase synchronization relationship of any two layer networks in the multilayer novel weighted complex network.
In the present embodiment, the gripping device shown in fig. 1 comprises an upright column 1, force/torque sensors corresponding to the respective fingers, and a base 7; the upright column 1 is mounted on the base 7, and the force/torque sensors are mounted on the upright column 1.
Specifically, the positions of the force/torque sensors are determined by the natural placement of the fingers when the human hand grasps; the opposed arrangement of the thumb and the four fingers is based on human physiology and the natural five-finger grasping posture. The force/torque sensor 2 corresponds to the radial (thumb) position, and the force/torque sensors 3-6 are located on the side opposite sensor 2, corresponding respectively to the four ulnar-side fingers (index, middle, ring and little fingers), for recording real-time fingertip force/torque signals.
The upright column is cylindrical, its diameter determined by the grasping aperture.
The base is a disc whose diameter is slightly larger than the column; it adds load and improves the stability of the gripping device, preventing it from toppling due to misoperation during testing.
In this embodiment, the electromyographic sensing devices are placed at the corresponding positions of the hand and forearm muscles and are used to synchronously record surface electromyographic signals of multiple muscles of the hand and the front of the arm during grasping. For example, muscles that play an important role in grasping may be selected for electromyographic recording, such as brachioradialis (BR), flexor carpi ulnaris (FCU), flexor carpi radialis (FCR), extensor digitorum communis (EDC), flexor digitorum superficialis (FDS), abductor pollicis brevis (APB) and the first dorsal interosseous (FDI).
In this embodiment, the real scene acquisition device uses depth cameras to acquire images of the human hand grasping process in real time, so as to reconstruct the grasping process in three dimensions and build a mixed reality virtual scene into which a visual time delay is introduced, yielding a highly immersive visual mixed reality scene that interacts in real time with the gripping device in the real world.
As shown in fig. 2, the mixed reality grasping platform provided in this embodiment includes a test platform table 13, two depth cameras 11 and an upper computer platform 12; the gripping device 8 and the two depth cameras 11 are placed on the test platform table 13, and the upper computer platform 12 is arranged at one side of the test platform table 13.
In this embodiment, two depth cameras 11 at different positions, operating on the parallax principle, acquire color and depth images of the hand grasping process in real time from different viewpoints. The upper computer platform 12 receives, through the processor, the grasping information of the real scene acquired in real time; after preprocessing the color and depth images, it obtains the three-dimensional geometry, position and posture of the object by computing the positional offsets between corresponding image points, performs three-dimensional reconstruction of the real scene, and feeds the reconstructed three-dimensional scene back to the mixed reality device for presentation to the subject.
Meanwhile, in this embodiment, two levels of time delay are introduced into the virtual scene along the dynamic time dimension of the grasping activity, desynchronizing somatosensory information from visual information so that the visual feedback becomes inconsistent with the somatosensory feedback. Muscle activity parameters under the different time delays are compared with the no-delay condition to quantify the influence of visual-somatosensory perception conflict on multi-muscle coordination in precise grasping.
This embodiment introduces three different levels of visual time delay: no delay (the observed event in the MR scene is synchronized with the actual event), and visually observed events delayed by 100 ms and 200 ms, respectively, relative to the actual event.
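The delay conditions above can be sketched as a fixed-length frame buffer between rendering and display. This is only an illustrative sketch; the frame rate and the buffering scheme are assumptions, not details from the patent.

```python
from collections import deque

class DelayedVisualFeed:
    """Delay a stream of rendered frames by a fixed number of milliseconds.

    Illustrative sketch only: frame_rate_hz and the buffering scheme are
    assumptions, not taken from the patent text.
    """
    def __init__(self, delay_ms, frame_rate_hz=60):
        n = round(delay_ms * frame_rate_hz / 1000)  # frames held back
        # Pre-fill with None so the first n outputs mean "no frame yet".
        self.buffer = deque([None] * n, maxlen=n + 1)

    def push(self, frame):
        self.buffer.append(frame)
        return self.buffer.popleft()  # frame emitted delay_ms late

# The three conditions used in the test: 0, 100 and 200 ms of visual delay.
feeds = {ms: DelayedVisualFeed(ms) for ms in (0, 100, 200)}
```

At 60 Hz, a 100 ms condition holds frames back by 6 frames and the 200 ms condition by 12, while the no-delay condition passes frames straight through.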
Fig. 3 shows the mixed reality interface observed by the subject: the test platform table 13 is a mapping of the real scene into the mixed reality scene, and the virtual circular table 15 provides a target-height reference for the grasping activity, its surface lying at a vertical height of 15 cm above the test platform table 13.
In the test procedure shown in fig. 4, the gripping device 8 is first placed 30 cm from the edge of the test platform table 13, the subject wears the mixed reality helmet 10, the myoelectric sensing devices 9 are placed at the corresponding positions of the hand and forearm muscles, and the gripping device 8 is aligned with the subject's right shoulder. Before the formal test, each subject performs one practice run to become familiar with the MR environment and the test process.
The specific test execution is as follows: the subject gets ready with the right arm placed on the table. On hearing the first voice prompt "stop resting", the five fingers are placed at the five protuberances of the gripping device 8 with the help of the researcher; on the second voice prompt "ready", all fingers briefly leave the gripping device 8; on the third voice prompt "start", the five fingers contact the gripping device 8 again, lift it to the height of the virtual circular table 15 in the MR scene, and hold it stable for 30 s until the fourth voice prompt "end", after which the gripping device 8 is placed back on the test platform table 13. The entire process is repeated 9 times, with 3 repetitions per time-delay level. In this task, the subject is required to lift the gripping device 8 vertically to the same height as the table in the field of view and to remain as stable as possible during the holding phase.
During the test, the multi-channel electromyographic signals of the grasping process are acquired by the electromyographic sensing device at a sampling rate of 2000 Hz. The acquired multi-channel electromyographic signals are defined as {X_m(t)}, m = 1, ..., M, where M = 7 is the number of channels.
For any electromyographic time series with K sampling points, it is expressed as:

X_m(t_k), k = 1, ..., K (1)
In this embodiment, the phase coupling relationships among multiple muscles are analyzed using a multilayer complex network: each electromyographic channel is mapped into one layer of the network. To avoid losing useful electromyographic information during the mapping, all sampling points of the electromyographic signals are retained, and each electromyographic sampling point is a node of the single-layer complex network formed by that electromyographic time series. Thus, the multilayer complex network formed by the multi-channel electromyographic signals is represented as:
{G_m}, m = 1, ..., M (2)

G_m = (K_m, E_m) (3)

where G_m denotes a single-layer complex network with K nodes and E edges; the node set K_m and edge set E_m are respectively expressed as:

K_m = {k_k}, k = 1, 2, ..., K (4)

E_m = {e_k}, k = 1, 2, ..., K (5)
To determine the edges between the nodes, directed connections of all nodes to each other are considered, i.e., if there are K nodes, each node is connected to the K − 1 nodes other than itself. For example, for a time series X_1(t_k), k = 1, 2, ..., 10 with 10 nodes, the complex network is denoted G_1 = (10, 45).
In order to obtain more reliable and robust information, the present embodiment calculates the weight coefficient of each edge by the following formula:
where k_i, k_j are any two nodes of the complex network, t_i, t_j are the time points corresponding to the two nodes, and w_ij is the weight coefficient of the edge between nodes k_i and k_j; only the absolute value of the weight coefficient is considered in the algorithm.
After the weight coefficient of each edge is added to each layer of the complex network, each myoelectric channel is mapped into one layer of a novel weighted complex network, yielding a multilayer weighted complex network. Because the sampling points of the synchronously acquired multi-channel myoelectric signals naturally correspond across layers, a Multilayer Novel Weighted Complex Network (MNWCN) is finally obtained.
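The construction above can be sketched in code. Since the patent's weight formula is not reproduced in this text, the weight function below (absolute arctangent of the slope between the two samples, which involves both nodes' values and time points and is sign-insensitive, as the text requires) is an ASSUMED stand-in, not the patent's equation.

```python
import numpy as np

def weighted_layer(x, t):
    """Map one EMG time series onto a fully connected weighted network.

    Each sampling point is a node; every ordered node pair (i, j), i != j,
    gets an edge. The weight function (|arctan| of the slope between the
    two samples) is an ASSUMPTION standing in for the patent's formula.
    """
    K = len(x)
    w = np.zeros((K, K))
    for i in range(K):
        for j in range(K):
            if i != j:
                w[i, j] = abs(np.arctan((x[i] - x[j]) / (t[i] - t[j])))
    return w

def multilayer_network(X, fs=2000.0):
    """Stack one weighted layer per EMG channel (rows of X) into an
    M-layer network; layers are aligned by the shared sampling instants."""
    t = np.arange(X.shape[1]) / fs  # sampling times at rate fs
    return np.stack([weighted_layer(ch, t) for ch in X])
```

With this sign-insensitive weight the layer matrices come out symmetric, and a 7-channel recording of K samples yields a (7, K, K) array, one layer per muscle.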
Fig. 5 shows a multilayer novel weighted complex network with M = 2 layers and K = 4 nodes: two multi-channel EMG time series X_1(t_k), X_2(t_k), k = 1, 2, 3, 4, form a network structure {G_1, G_2} with two layers of complex networks.
To retain most of the information of the complex network at the cost of little information loss, this embodiment extracts an Average Weighted Degree (AWD) feature for each layer of the novel weighted complex network, representing the average weight of all edges of each node:

WD_i = Σ_{j∈B(i)} w_ij (7)

AWD_i = WD_i / |B(i)| (8)

where B(i) is the neighborhood of node i, w_ij is the weight of the edge between nodes i and j, WD_i is the total weight of all edges connected to node i, and AWD_i is the average weight of all edges connected to node i.
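The AWD feature can be sketched as follows. The snippet assumes the layer is fully connected, so the neighborhood B(i) of node i contains the K − 1 other nodes.

```python
import numpy as np

def average_weighted_degree(w):
    """Average weighted degree (AWD) of every node in one layer.

    w is a K x K edge-weight matrix with zero diagonal. In a fully
    connected layer, |B(i)| = K - 1, so AWD_i is simply the mean weight
    of the edges incident to node i.
    """
    K = w.shape[0]
    wd = w.sum(axis=1)   # WD_i: total weight of edges at node i
    return wd / (K - 1)  # AWD_i: average over the |B(i)| = K - 1 edges
```

Applying this to each layer of the multilayer network gives, per layer, the length-K average weighted degree sequence used in the phase-locking analysis.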
In each layer network, the average weighting degree features of all nodes are arranged to form an average weighting degree sequence. For any two different layers, the phase synchronization relationship between the degree sequences {AWD_m} and {AWD_n} is expressed by the Phase Locking Value (PLV):

PLV_mn = |(1/K) Σ_{k=1}^{K} exp(j Δφ_mn(k))|
where Δφ_mn(k) = φ_m(k) − φ_n(k) is the phase difference of the degree sequences {AWD_m} and {AWD_n}, and φ_m, φ_n are the phases of the two degree sequences. When PLV_mn is 0, the two degree sequences are not phase-synchronized; when PLV_mn is 1, there is a stable phase difference, i.e., phase synchronization.
For the time series of the electromyographic signals of the M channels, an M × M adjacency matrix P composed of phase-locked values is finally obtained:
P = (PLV_mn)_{M×M} (11)
The element PLV_mn, m ∈ [1, M], n ∈ [1, M], represents the phase locking value between the average weighting degree sequences of the corresponding two layers of networks; P is a symmetric matrix, i.e., PLV_mn = PLV_nm for m ≠ n, m ∈ [1, M], n ∈ [1, M].
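The PLV adjacency matrix described above can be sketched as follows. This is an illustrative implementation under common assumptions, not the patent's exact code: instantaneous phases are obtained with an FFT-based Hilbert transform, and the function names are my own.

```python
import numpy as np

def instantaneous_phase(x: np.ndarray) -> np.ndarray:
    """Instantaneous phase of a real sequence via an FFT-based
    Hilbert transform (analytic signal)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.angle(np.fft.ifft(spectrum * h))

def phase_locking_value(a: np.ndarray, b: np.ndarray) -> float:
    """PLV = |mean(exp(j * phase difference))|, bounded in [0, 1]."""
    dphi = instantaneous_phase(a) - instantaneous_phase(b)
    return float(np.abs(np.mean(np.exp(1j * dphi))))

def plv_adjacency(sequences) -> np.ndarray:
    """Symmetric M x M matrix P = (PLV_mn) over per-layer sequences,
    with PLV_mm = 1 on the diagonal."""
    m_layers = len(sequences)
    P = np.eye(m_layers)
    for m in range(m_layers):
        for n in range(m + 1, m_layers):
            P[m, n] = P[n, m] = phase_locking_value(sequences[m], sequences[n])
    return P
```

Identical sequences give PLV = 1 (zero phase difference everywhere), and the returned matrix is symmetric, matching the properties stated in the text.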
In this embodiment, network structure indices for evaluating the coordination between muscles are extracted, including the Clustering Coefficient (CC) and the Characteristic Path Length (CPL), which measure the information separation and integration capacity of the muscle network, respectively;
wherein, the clustering coefficient is:
where M is the number of layers of the MNWCN, CC_p is the clustering coefficient of the p-th layer network, and k_p is the degree of the p-th layer network;
where PLV_pq, PLV_rp, PLV_rq are the elements at positions (p, q), (r, p), (r, q) of the matrix P.
The characteristic path length is:
where L_p is the average characteristic path length of the p-th layer network, defined as:
where l_pq is the shortest path length between the p-th layer and the q-th layer networks.
By analyzing changes in CC and CPL, the information interaction capacity between muscle networks is reflected, so that the coordination between muscles in the precise grasping process can be accurately and quantitatively evaluated.
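The two indices can be sketched from the PLV matrix P as follows. Since the patent's equation images were not extracted, this uses standard conventions as assumptions: an Onnela-style triangle formula for the per-layer clustering coefficient, and edge length 1/PLV for shortest paths (stronger synchronization means a shorter path).

```python
import itertools
import numpy as np

def clustering_coefficients(P: np.ndarray) -> np.ndarray:
    """Per-layer weighted clustering coefficient CC_p from the PLV matrix.

    Onnela-style geometric-mean triangle intensity; the patent's exact
    normalization may differ.
    """
    M = P.shape[0]
    cc = np.zeros(M)
    for p in range(M):
        others = [q for q in range(M) if q != p]
        k_p = len(others)                  # fully connected across layers
        if k_p < 2:
            continue
        triangles = sum((P[p, q] * P[r, p] * P[r, q]) ** (1.0 / 3.0)
                        for q, r in itertools.combinations(others, 2))
        cc[p] = 2.0 * triangles / (k_p * (k_p - 1))
    return cc

def characteristic_path_length(P: np.ndarray, eps: float = 1e-12) -> float:
    """CPL: mean over layers of the average shortest-path length L_p,
    with edge length taken as 1/PLV (an assumed convention)."""
    M = P.shape[0]
    dist = 1.0 / np.maximum(P, eps)
    np.fill_diagonal(dist, 0.0)
    for k in range(M):                     # Floyd-Warshall relaxation
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])
    L_p = dist.sum(axis=1) / (M - 1)       # average path length per layer
    return float(L_p.mean())
```

For a fully phase-locked network (all PLV = 1), every clustering coefficient is 1 and the CPL is 1; weaker locking lowers CC and raises CPL, which is the contrast the evaluation relies on.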
In further embodiments, an evaluation method for the mixed-reality-based system for testing and evaluating the coordination between precision grasping muscles is also provided, which includes:
acquiring real gripping information in the gripping process and multi-channel electromyographic signals in the gripping process;
performing three-dimensional reconstruction of the real scene according to the real grasping information and feeding it back to the mixed reality device so as to construct a virtual scene, and introducing time delays into the virtual scene to evaluate the coordination between muscles under different time delays;
the evaluation of the inter-muscle coordination comprises: constructing, according to the multi-channel electromyographic signals, a multilayer novel weighted complex network that takes electromyographic sampling points as nodes and the directed connections between the nodes as edges, and extracting network structure indices for evaluating the coordination between muscles according to the phase synchronization relationship of any two layers of networks in the multilayer novel weighted complex network.
It should be understood that an invention that merely modifies the geometric configuration of the gripping means is regarded as the same invention. The invention is likewise regarded as the same if the small circular tables providing the target height in mixed reality are replaced with other objects, if other types of sensors are arranged to obtain the electromyographic signals, or if the same type of electromyographic sensing device is used to measure the surface electromyographic signals of different muscles.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, it is not intended to limit the scope of the present invention, and it should be understood by those skilled in the art that various modifications and variations can be made without inventive efforts by those skilled in the art based on the technical solution of the present invention.
Claims (10)
1. A system for testing and evaluating the coordination between accurate grasping muscles based on mixed reality, comprising: the system comprises a real scene acquisition device, a myoelectricity sensing device and a mixed reality platform;
the real scene acquisition device is used for acquiring real gripping information in the gripping process;
the myoelectric sensing device is used for acquiring a multi-channel myoelectric signal in the gripping process;
the mixed reality platform comprises a processor and mixed reality equipment, wherein the processor receives real gripping information, carries out three-dimensional reconstruction on a real scene according to the real gripping information and feeds back the real scene to the mixed reality equipment so as to construct a virtual scene, and introduces time delay into the virtual scene so as to realize evaluation of the coordination among muscles under different time delays;
the processor receives the multi-channel electromyographic signals and evaluates the coordination between muscles according to them; specifically, a multilayer novel weighted complex network that takes electromyographic sampling points as nodes and the directed connections between the nodes as edges is constructed according to the multi-channel electromyographic signals, and network structure indices for evaluating the coordination between the muscles are extracted according to the phase synchronization relationship of any two layers of networks in the multilayer novel weighted complex network.
2. The system for testing and evaluating the coordination among the precise grasping muscles based on the mixed reality as claimed in claim 1, wherein a time delay is introduced into the virtual scene to realize desynchronization of the somatosensory feedback information and the visual feedback information, so that the somatosensory feedback information is inconsistent with the visual feedback information, and the influence of the visual-somatosensory perception conflict on the coordination among the muscles is quantitatively evaluated.
3. The system for testing and evaluating the coordination between the precision grasping muscles based on mixed reality as claimed in claim 1, wherein the construction process of the multilayer novel weighted complex network comprises: mapping the myoelectric signal of each channel into one layer of a novel weighted complex network, and finally constructing the multilayer novel weighted complex network according to the natural correspondence of the myoelectric sampling points of the synchronously acquired multi-channel myoelectric signals.
4. The system for testing and evaluating the coordination between the precise grasping muscles based on the mixed reality as claimed in claim 1, wherein the phase synchronization relationship of any two layers of networks in the multilayer novel weighted complex network is characterized by a phase-locked value, and when the phase-locked value is zero, no phase synchronization exists, and when the phase-locked value is 1, phase synchronization exists.
5. The system for testing and evaluating the coordination between the precision grasping muscles based on mixed reality as claimed in claim 4, wherein the phase-locked value is obtained by: extracting the average weighting degree feature of each layer of the novel weighted complex network, and obtaining the phase-locked value according to the phase difference of the average weighting degree sequences of any two layers of networks.
6. The system for testing and evaluating the coordination between the precision grasping muscles based on mixed reality as claimed in claim 5, wherein the average weighting degree feature is the average weight of all edges of each node, and the average weighting degree sequence is obtained from the average weighting degree features of all nodes in each layer network.
7. The system of claim 1, wherein the network structure indices for evaluating the coordination between muscles comprise clustering coefficients and characteristic path lengths.
8. The system for evaluating the coordination test between the precise grasping muscles based on the mixed reality as claimed in claim 7, wherein the clustering coefficient is:
wherein CC is the clustering coefficient, M is the number of network layers, and k_p is the degree of the p-th layer network; PLV_pq, PLV_rp, PLV_rq are the elements at positions (p, q), (r, p), (r, q) of the matrix P.
9. The system for evaluating the coordination test between the precise grasping muscles based on the mixed reality as claimed in claim 7, wherein the characteristic path length is:
wherein L_p is the average characteristic path length of the p-th layer network, l_pq is the shortest path length between the p-th layer and the q-th layer networks, M is the number of network layers, and CPL is the characteristic path length.
10. A method for testing and evaluating coordination among accurate grasping muscles based on mixed reality is characterized by comprising the following steps:
acquiring real gripping information in the gripping process and multi-channel electromyographic signals in the gripping process;
performing three-dimensional reconstruction of the real scene according to the real grasping information and feeding it back to the mixed reality device so as to construct a virtual scene, and introducing time delays into the virtual scene to evaluate the coordination between muscles under different time delays;
the evaluation of the inter-muscle coordination comprises: constructing, according to the multi-channel electromyographic signals, a multilayer novel weighted complex network that takes electromyographic sampling points as nodes and the directed connections between the nodes as edges, and extracting network structure indices for evaluating the coordination between muscles according to the phase synchronization relationship of any two layers of networks in the multilayer novel weighted complex network.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210227558.8A CN114652331A (en) | 2022-03-08 | 2022-03-08 | System for testing and evaluating coordination between accurate grasping muscles based on mixed reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114652331A true CN114652331A (en) | 2022-06-24 |
Family
ID=82030148
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210227558.8A Pending CN114652331A (en) | 2022-03-08 | 2022-03-08 | System for testing and evaluating coordination between accurate grasping muscles based on mixed reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114652331A (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104207793A (en) * | 2014-07-03 | 2014-12-17 | 中山大学 | Gripping function evaluating and training system |
CN104382595A (en) * | 2014-10-27 | 2015-03-04 | 燕山大学 | Upper limb rehabilitation system and method based on myoelectric signal and virtual reality interaction technology |
CN105095865A (en) * | 2015-07-17 | 2015-11-25 | 广西师范大学 | Directed-weighted-complex-network-based cervical cell recognition method and a cervical cell recognition apparatus |
CN109674445A (en) * | 2018-11-06 | 2019-04-26 | 杭州电子科技大学 | Coupling analytical method between a kind of combination Non-negative Matrix Factorization and the flesh of complex network |
CN111227829A (en) * | 2020-02-14 | 2020-06-05 | 广东司法警官职业学院 | Electroencephalogram signal analysis method based on complex network characteristic indexes |
CN111513735A (en) * | 2020-05-31 | 2020-08-11 | 天津大学 | Major depressive disorder identification system based on brain-computer interface and deep learning and application |
CN111742282A (en) * | 2018-02-19 | 2020-10-02 | 瓦尔基里工业有限公司 | Haptic feedback for virtual reality |
CN112232301A (en) * | 2020-11-16 | 2021-01-15 | 杭州电子科技大学 | Inter-muscle coupling network analysis method based on multi-scale Copula mutual information |
CN113632176A (en) * | 2019-03-29 | 2021-11-09 | 脸谱科技有限责任公司 | Method and apparatus for low latency body state prediction based on neuromuscular data |
CN114021604A (en) * | 2021-10-26 | 2022-02-08 | 汕头大学 | Motion imagery training system based on real-time feedback of 3D virtual reality technology |
CN114098765A (en) * | 2021-11-23 | 2022-03-01 | 燕山大学 | Method and device for extracting parameters and features of multi-channel high-frequency brain wave coupled brain network |
Non-Patent Citations (3)
Title |
---|
LIU MENGJIE: "Research on Coordinated Control of Multi-Finger Precision Grasping under Visual Delay and Torque Perception", China Master's Theses Full-text Database, Medicine and Health Sciences, no. 12, 15 December 2021 (2021-12-15), pages 1 - 10 *
SHI HONGDA; LI XIUFEI; ZHANG LIZHONG: "Posture and Force Control Algorithm and Simulation for a Virtual Reality Dexterous Hand", Journal of Changchun University of Science and Technology (Natural Science Edition), no. 05, 15 October 2016 (2016-10-15) *
XIE PING; LIU HUAN; WANG LEILEI; CHENG SHENGCUI; CHEN WEI: "Design of a Virtual Rehabilitation Training System Based on EEG and EMG Feedback", Chinese Journal of Scientific Instrument, no. 01, 15 January 2018 (2018-01-15) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
de Rugy et al. | Are muscle synergies useful for neural control? | |
Dawson et al. | Myoelectric training systems | |
Esfahlani et al. | Validity of the Kinect and Myo armband in a serious game for assessing upper limb movement | |
CN109350923A (en) | A kind of rehabilitation training of upper limbs system based on VR and more body position sensors | |
CN104778894A (en) | Virtual simulation bone-setting manipulation training system and establishment method thereof | |
Lockery et al. | Store-and-feedforward adaptive gaming system for hand-finger motion tracking in telerehabilitation | |
Everard et al. | New technologies promoting active upper limb rehabilitation after stroke: an overview and network meta-analysis | |
CN116312947B (en) | Immersive ankle and foot rehabilitation training method based on upper limb movement signals and electronic equipment | |
Bhattacharjya et al. | Harnessing smartphone technology and three dimensional printing to create a mobile rehabilitation system, mRehab: assessment of usability and consistency in measurement | |
Ison et al. | Simultaneous myoelectric control of a robot arm using muscle synergy-inspired inputs from high-density electrode grids | |
Song et al. | Proposal of a wearable multimodal sensing-based serious games approach for hand movement training after stroke | |
Holmes et al. | Usability and performance of Leap Motion and Oculus Rift for upper arm virtual reality stroke rehabilitation | |
Furmanek et al. | A kinematic and EMG dataset of online adjustment of reach-to-grasp movements to visual perturbations | |
Hu et al. | Intuitive environmental perception assistance for blind amputees using spatial audio rendering | |
Popescu et al. | PC-based telerehabilitation system with force feedback | |
CN114652331A (en) | System for testing and evaluating coordination between accurate grasping muscles based on mixed reality | |
Han | A virtual reality algorithm for the study of clinical efficacy of sports injury rehabilitation training | |
Auccahuasi et al. | Analysis of a mechanism to evaluate upper limb muscle activity based on surface electromyography using the MYO-EMG device | |
Ponto et al. | Virtual Exertions: a user interface combining visual information, kinesthetics and biofeedback for virtual object manipulation | |
Krammer et al. | Sensing form-finger gaiting as key to tactile object exploration-a data glove analysis of a prototypical daily task | |
Sun | Virtual and Augmented Reality-Based Assistive Interfaces for Upper-limb Prosthesis Control and Rehabilitation | |
TWI682765B (en) | A remote human pulse measurement and reproducing system | |
Gourlay et al. | Telemedicinal virtual reality for cognitive rehabilitation | |
Rahman et al. | Gear: A mobile game-assisted rehabilitation system | |
Dechenaud et al. | Development of adapted guitar to improve motor function after stroke: Feasibility study in young adults |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||