CN113190114B - Virtual scene experience system and method with haptic simulation and emotional perception

Info

Publication number
CN113190114B
CN113190114B
Authority
CN
China
Prior art keywords
user
touch
simulation
scene
skin
Prior art date
Legal status
Active
Application number
CN202110400345.6A
Other languages
Chinese (zh)
Other versions
CN113190114A (en)
Inventor
董元发
梁成
刘文戎
蒋磊
曾涛
严华兵
Current Assignee
China Three Gorges University CTGU
Original Assignee
China Three Gorges University CTGU
Priority date
Filing date
Publication date
Application filed by China Three Gorges University (CTGU)
Priority to CN202110400345.6A (CN113190114B)
Priority to CN202210380616.0A (CN114779930A)
Publication of CN113190114A
Application granted
Publication of CN113190114B

Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 18/2411: Classification techniques based on the proximity to a decision surface, e.g. support vector machines
    • G06F 18/253: Fusion techniques of extracted features
    • G06F 2218/04: Pattern recognition for signal processing: preprocessing, denoising
    • G06F 2218/08: Pattern recognition for signal processing: feature extraction

Abstract

The invention relates to a virtual scene experience system with touch simulation and emotion perception, comprising a scene computer, a display terminal, and a touch simulation and feedback device, the latter two each in communication connection with the scene computer. A skin touch simulation module of the device generates varying pressure and shearing force on the skin it contacts, according to the scene displayed by the display terminal and the position of the user's limb, simulating the touch of objects in the scene. The system presents virtual scenes to the user, accurately tracks the motion trajectory and posture of the user's hand, and simulates a more lifelike tactile experience; it can be widely used for virtual simulation of various scenes and greatly improves the user's experience.

Description

Virtual scene experience system and method with haptic simulation and emotional perception
Technical Field
The invention belongs to the field of intelligent wearable devices, and particularly relates to a virtual scene experience system with touch simulation and emotion perception.
Background
Currently, virtual reality technology is widely applied in the product experience industry; a typical application scenario is the driving experience in virtual reality: the user wears a VR headset and, in cooperation with external equipment such as force-feedback gloves and a car seat, can feel a near-real driving experience in the virtual reality scene. However, the perception provided by such external equipment has certain limitations. For example, the tactile sensation the user feels is generated by directly touching the steering wheel and the seat; when the user touches other surfaces with the hands in the virtual reality scene, no tactile sensation is produced, which reduces the user experience to some extent.
At present, most existing touch simulation devices have complex structures and low simulation fidelity. For example, when force feedback is provided by electromagnetic force or miniature vibration motors, the large components make the whole device heavy; worn on the hand, it easily causes pressure and fatigue, reducing the user's experience to a certain extent. In addition, when tactile feedback is simulated with electromagnetic force or a vibration motor, force can be applied to the skin only in the vertical direction, whereas the skin's perception of force in the real world is by no means confined to a single plane; such devices therefore cannot fully simulate force feedback in different dimensions, which lowers the simulation fidelity.
In addition, when the user interacts with a virtual reality scene, the user's emotional and physiological signals change constantly. For example, when a user touches a sharp-looking object in a virtual reality scene, an inner feeling of resistance arises, which manifests physiologically as a quickened heartbeat, muscle contraction, and so on. By collecting and analyzing these physiological signals, the changes in the user's consciousness and emotion when interacting with different objects in the virtual scene can be obtained. At present, physiological signal acquisition is widely used in wearable devices such as smart watches but is rarely applied in the field of virtual reality, and emotion analysis based on physiological signals is also difficult to integrate into existing virtual reality somatosensory device systems.
Applying emotion analysis in virtual reality requires, on the one hand, a portable and accurate sensory simulator, to ensure that the user's experience when interacting with a virtual scene is as close to reality as possible; on the other hand, only when the fidelity of such interaction is high can the generated physiological signals adequately reflect the emotion the same interaction would produce in the real world. A system capable of collecting and analyzing the user's physiological signals in real time is therefore needed.
Disclosure of Invention
The virtual scene experience system of the invention is worn on the user's hand and accurately tracks the motion trajectory and posture of the hand. When the user performs a touch action in the virtual scene, the system can apply pressure in the vertical direction and shearing force in the horizontal direction to the user's hand, simulating a more lifelike tactile experience. At the same time, the system collects the user's electroencephalogram (EEG), electromyogram (EMG) and blood pressure signals, analyzes them to obtain the user's emotion and degree of preference for the touched object, and thereby gathers the user's feedback on the experienced object; the method can be used for virtual experience of all kinds of new things in real life.
The technical scheme of the invention is a virtual scene experience system with touch simulation and emotion perception, comprising a scene computer, a display terminal, and a touch simulation and feedback device, the latter two each in communication connection with the scene computer. A skin touch simulation module of the device generates varying pressure and shearing force on the skin it contacts, according to the scene displayed by the display terminal and the position of the user's limb, simulating the touch of objects in the scene.
The touch simulation and feedback device further comprises a physiological signal module, and the physiological signal module comprises a blood pressure sensor, an electromyographic sensor and an electroencephalographic sensor which are respectively connected with the microprocessor.
The skin touch simulation module comprises a fiber cloth worn on the user's limb and touch sheets arranged in holes of the fiber cloth. Outside each touch sheet is a limiting ring fixedly connected to the fiber cloth; several connection points around the periphery of the touch sheet are connected to the ends of fiber ropes evenly distributed along the limiting ring, and the end of each fiber rope away from the touch sheet is fixedly connected to a micro piston inside a thin tube. The thin tubes are connected to a hydraulic mechanism and filled with hydraulic fluid. Driven by the hydraulic mechanism, the hydraulic fluid in tubes in different directions around the touch sheet is at different pressures, so the fiber ropes connected to the pistons in those tubes produce different pulling forces; under pulling forces of different directions and magnitudes, the touch sheet generates pressure and shearing force on the skin of the user's body in contact with it.
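The force balance implied by this arrangement can be illustrated with a short numerical sketch. The geometry below (four ropes at 90-degree spacing, a fixed out-of-plane pull angle, and all function names) is an assumption made for illustration, not a specification from the patent: equal rope tensions yield pure normal pressure, while unequal tensions add an in-plane shearing component.

```python
import numpy as np

def sheet_forces(tensions, pull_angle_deg=30.0):
    """Resolve four rope tensions into normal pressure and in-plane shear
    on the touch sheet. Hypothetical model: ropes anchored at 0, 90, 180
    and 270 degrees around the limiting ring, each pulling outward toward
    its tube and toward the skin at `pull_angle_deg`.

    tensions: iterable of 4 tension magnitudes (N), one per rope.
    Returns (normal_force, shear_vector) in newtons.
    """
    tensions = np.asarray(tensions, dtype=float)
    angles = np.deg2rad([0.0, 90.0, 180.0, 270.0])  # rope directions in sheet plane
    pull = np.deg2rad(pull_angle_deg)               # out-of-plane pull angle

    # In-plane (shear) components: opposing ropes cancel when tensions are equal.
    shear = np.array([
        np.sum(tensions * np.cos(angles)) * np.cos(pull),
        np.sum(tensions * np.sin(angles)) * np.cos(pull),
    ])
    # Out-of-plane components all press the sheet toward the skin.
    normal = np.sum(tensions) * np.sin(pull)
    return normal, shear

# Equal tensions -> pure pressure; unequal -> pressure plus shear.
print(sheet_forces([1.0, 1.0, 1.0, 1.0]))  # shear ~ [0, 0]
print(sheet_forces([1.5, 1.0, 0.5, 1.0]))  # shear along +x
```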
The touch simulation and feedback device further comprises a micro hydraulic module, and the thin tube is connected with a thin tube joint of the micro hydraulic module.
The tactile sensation simulation and feedback device further comprises a gyroscope sensor connected to the microprocessor.
Preferably, the display terminal is a VR headset.
The method of the virtual scene experience system comprises the following steps:
Step 1: the scene computer displays a virtual scene to the user through the display terminal, and the user wears the touch simulation and feedback device to interact with the scene computer;
Step 2: the scene computer captures the user's touch behavior in the virtual scene;
Step 3: according to the object the user touches, the scene computer controls the touch simulation and feedback device to generate a tactile experience on the user's body;
Step 4: the physiological signal module of the touch simulation and feedback device acquires the user's physiological signals;
Step 5: the user's emotion is judged from the acquired physiological signals.
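For orientation, the five steps can be pictured as a single interaction loop. The sketch below is purely illustrative; every class and method name in it is a hypothetical stand-in, not an API defined by the patent.

```python
def experience_loop(scene, display, device, emotion_model):
    """Hypothetical orchestration of steps 1-5."""
    display.render(scene)                                  # step 1: show the scene
    while scene.running:
        touch = scene.detect_touch(device.hand_pose())     # step 2: capture touch
        if touch is not None:
            device.apply(touch.normal_force, touch.shear)  # step 3: drive touch sheet
        signals = device.read_physiology()                 # step 4: EEG/EMG/blood pressure
        emotion = emotion_model.classify(signals)          # step 5: judge emotion
        scene.log_feedback(touch, emotion)
```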
Further, step 5 comprises the following substeps:
Step 5.1: collect EEG, EMG and blood pressure signals of different users under different emotions; extract the EEG feature vector, EMG feature vector and blood pressure feature vector of the physiological signal features to obtain the user's physiological signal feature vector group; manually mark an emotion label for the physiological signal feature vectors of each trial;
Step 5.2: initialize a fusion coefficient matrix and fuse the EEG, EMG and blood pressure feature vectors of each trial to obtain a fused feature vector matrix; taking the values of the fusion coefficient matrix as the particles of a particle swarm, optimize the fusion coefficient matrix with the particle swarm optimization algorithm so that the fused values of the physiological signal feature vector groups computed with the matrix fit their marked emotion labels, obtaining a preliminarily optimized fusion coefficient matrix;
Step 5.3: fuse the user physiological signal features collected in step 5.1 with the fusion coefficient matrix to obtain the fused feature vector matrix of each trial, and construct a training data set together with the manually marked emotion labels of the fused feature vector matrices;
Step 5.4: train on the training data set with one-versus-rest support vector machines, computing the fitness and updating the fusion coefficients; stop training once the preset fitness is reached, obtaining several trained network models with emotion labels;
Step 5.5: extract the feature vector group from the user's physiological signals collected in real time, obtain the fused feature matrix with the fusion coefficient matrix, classify it with the trained one-versus-rest support vector machine models, match the classification result against the emotion labels, and take the best-matching emotion label as the judgment of the user's emotion.
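Steps 5.2 to 5.4 amount to wrapping a one-versus-rest SVM inside a particle swarm search over the three fusion coefficients. The sketch below shows one plausible shape of that pipeline in Python; the feature layout, the PSO hyperparameters, and the use of scikit-learn are assumptions for illustration, not the patent's implementation.

```python
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def fuse(A, x_eeg, x_emg, x_bp):
    """v = [a1*x1, a2*x2, a3*x3]: scale each modality's feature vector
    by its fusion coefficient and concatenate."""
    return np.hstack([A[0] * x_eeg, A[1] * x_emg, A[2] * x_bp])

def fitness(A, feats, labels):
    """Fitness of one coefficient vector: cross-validated accuracy of a
    one-vs-rest SVM trained on the fused features."""
    V = np.array([fuse(A, *f) for f in feats])
    return cross_val_score(OneVsRestClassifier(SVC()), V, labels, cv=3).mean()

def pso_optimize(feats, labels, n_particles=10, n_iter=20, seed=0):
    """Vanilla particle swarm over the three fusion coefficients."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.1, 1.0, (n_particles, 3))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, feats, labels) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.01, 2.0)      # keep coefficients positive
        for i, p in enumerate(pos):
            f = fitness(p, feats, labels)
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = p.copy(), f
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest

# feats: list of (x_eeg, x_emg, x_bp) tuples, one per trial;
# labels: integer emotion labels. A_opt = pso_optimize(feats, labels).
```

The returned coefficient vector plays the role of the preliminarily optimized fusion coefficient matrix A; a final one-versus-rest SVM would then be refit on all fused training data.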
Preferably, step 5.1 adopts the common spatial pattern (CSP) algorithm to extract the EEG feature vector from the user's EEG signals.
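For reference, CSP reduces to a generalized eigendecomposition of the two classes' spatial covariance matrices. A minimal two-class sketch follows, assuming the standard log-variance features; the array shapes and names are illustrative, not taken from the patent.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=2):
    """Common spatial pattern (CSP) filters for two-class EEG.
    trials_a, trials_b: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2*n_pairs, n_channels) matrix of spatial filters that
    maximize the variance ratio between the two classes."""
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)
    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenproblem Ca w = lambda (Ca + Cb) w; eigenvalues near
    # 1 favor class a, eigenvalues near 0 favor class b.
    vals, vecs = eigh(Ca, Ca + Cb)
    order = np.argsort(vals)
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]  # both extremes
    return vecs[:, picks].T

def csp_features(trial, W):
    """Normalized log-variance of the CSP-filtered signals, the usual
    feature vector fed to a classifier."""
    z = W @ trial                    # (n_filters, n_samples)
    var = z.var(axis=1)
    return np.log(var / var.sum())
```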
Compared with the prior art, the invention has the following beneficial effects:
1) the system presents a virtual scene to the user, accurately tracks the motion trajectory and posture of the user's hand, and, when the user performs a touch action in the virtual scene, can apply vertical pressure and horizontal shearing force to the hand, simulating a more lifelike tactile experience; it can be widely used for virtual simulation of various scenes and greatly improves the user's experience;
2) the system collects the user's EEG, EMG, blood pressure and other physiological signals in real time, analyzes the user's emotion, and obtains the user's degree of preference for the touched object;
3) the method uses the common spatial pattern algorithm to extract highly discriminative EEG features from the user's EEG signals, removing irrelevant components and giving good noise immunity;
4) after optimizing the fusion coefficients of the physiological signal feature vectors with the particle swarm algorithm, the method performs classification training on the emotion labels with one-versus-rest support vector machines, improving emotion recognition accuracy;
5) the method can analyze in real time the emotion produced by the user's touch behavior in the virtual scene and obtain the user's feedback on the experienced object; it is accurate, can be widely used for virtual experience of all kinds of new things, and offers a rich experience while saving time, labor and cost.
Drawings
The invention is further illustrated by the following figures and examples.
Fig. 1 is a schematic structural diagram of a virtual scene experience system according to an embodiment of the present invention.
FIG. 2 is a diagram of a skin touch simulation module according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a skin-touch simulation module worn on a hand of a user according to an embodiment of the present invention.
Fig. 4 is a schematic flow chart illustrating a process of analyzing and obtaining a user emotion according to an embodiment of the present invention.
Detailed Description
Example one
As shown in fig. 1, the virtual scene experience system with tactile simulation and emotion perception comprises a scene computer 1 and, each in communication connection with it, a display terminal 3 and a tactile simulation and feedback device 2. The tactile simulation and feedback device 2 is worn on the user's hand and comprises a microprocessor 201 and, each connected to the microprocessor, a second communication module 202, a positioning module 205, a skin tactile simulation module 203, a memory 206 and a gyroscope sensor 207. A physiological signal module 208 comprises a blood pressure sensor, an electromyography sensor and an electroencephalography sensor, each connected to the microprocessor 201. The second communication module 202 is in communication connection with the communication module of the scene computer 1 via a wireless network. The scene computer 1 accurately tracks the motion trajectory and posture of the user's hand using the positioning module 205 and the gyroscope sensor 207 of the tactile simulation and feedback device 2.
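Hand tracking from these two sensors could, for example, dead-reckon orientation from the gyroscope and smooth the positioning-module fixes. The patent does not specify a fusion algorithm, so the toy tracker below is only an assumed illustration.

```python
import numpy as np

def track_hand(gyro_rates, position_fixes, dt=0.01, alpha=0.9):
    """Toy tracker: integrate gyroscope angular rates for hand attitude
    (small-angle Euler integration) and exponentially smooth the
    positioning-module fixes for the trajectory.
    gyro_rates, position_fixes: (N, 3) arrays in rad/s and meters."""
    attitude = np.zeros(3)
    pos = np.asarray(position_fixes[0], dtype=float)
    trajectory = []
    for w, p in zip(gyro_rates, position_fixes):
        attitude = attitude + np.asarray(w) * dt          # dead-reckon attitude
        pos = alpha * pos + (1.0 - alpha) * np.asarray(p) # smooth jittery fixes
        trajectory.append((attitude.copy(), pos.copy()))
    return trajectory
```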
As shown in figs. 2 and 3, the skin tactile simulation module 203 comprises a fiber cloth 2031 worn on the user's limb and a circular tactile sheet 2036 arranged in a hole of the fiber cloth. Outside the tactile sheet 2036 is a circular limiting ring 2032 fixedly connected to the fiber cloth. Four connection points evenly distributed on the outer edge of the tactile sheet 2036 are each connected to the end of a fiber rope 2033, the ropes being evenly distributed along the limiting ring 2032; the end of each fiber rope 2033 away from the tactile sheet is fixedly connected to a micro piston 2034 inside a thin tube 2035, and each thin tube 2035 is connected to a thin tube joint 2041 of the micro hydraulic module 204 and filled with hydraulic fluid. Driven by the micro hydraulic module, the hydraulic fluid in the tubes in different directions around the tactile sheet 2036 is at different pressures, so the fiber ropes connected to the pistons in those tubes produce different pulling forces; under pulling forces of different magnitudes from the ropes in different directions, the tactile sheet 2036 generates vertical pressure and horizontal shearing force on the skin of the user's body in contact with it. The skin tactile simulation module 203 generates varying pressure and shearing force on the skin it contacts according to the scene displayed by the display terminal 3 and the position of the user's limb, simulating the touch of objects in the scene.
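Driving the module means solving the inverse problem: given a desired normal force and shear from the scene, find nonnegative rope tensions that realize them. Below is a sketch under the same illustrative four-rope geometry as the earlier force-balance example; the function names and the use of nonnegative least squares are assumptions, not the patent's control law.

```python
import numpy as np
from scipy.optimize import nnls

def tensions_for(normal, shear_xy, pull_angle_deg=30.0):
    """Invert the four-rope model: nonnegative rope tensions realizing a
    desired normal force and in-plane shear on the tactile sheet."""
    pull = np.deg2rad(pull_angle_deg)
    angles = np.deg2rad([0.0, 90.0, 180.0, 270.0])
    # Rows: shear_x, shear_y, normal; one column per rope.
    M = np.vstack([
        np.cos(angles) * np.cos(pull),
        np.sin(angles) * np.cos(pull),
        np.full(4, np.sin(pull)),
    ])
    target = np.array([shear_xy[0], shear_xy[1], normal])
    tensions, _residual = nnls(M, target)   # ropes can only pull
    return tensions

# Example: 2 N of pressure plus a small shear toward +x.
print(tensions_for(normal=2.0, shear_xy=(0.3, 0.0)))
```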
In the embodiment, the display terminal 3 employs a VR headset.
In the method of the virtual scene experience system, step 5 comprises the following substeps:
Step 5.1: collect EEG, EMG and blood pressure signals of different users under different emotions; extract from the physiological signal features the EMG feature vector x2 and the blood pressure feature vector x3, and extract the EEG feature vector x1 with the common spatial pattern algorithm, obtaining the user's physiological signal feature vector group X = [x1, x2, x3]; manually mark an emotion label for the physiological signal feature vectors of each trial;
Step 5.2: initialize the fusion coefficient matrix A = [a1, a2, a3]; for each trial, fuse the EEG feature vector x1, the EMG feature vector x2 and the blood pressure feature vector x3 to obtain the fused feature vector matrix v = [a1·x1, a2·x2, a3·x3]; taking the values of the fusion coefficient matrix as the particles of a particle swarm, optimize the fusion coefficient matrix with the particle swarm optimization algorithm so that the fused values of the physiological signal feature vector groups computed with it fit their marked emotion labels, obtaining a preliminarily optimized fusion coefficient matrix;
Step 5.3: fuse the user physiological signal features collected in step 5.1 with the fusion coefficient matrix A to obtain the fused feature vector matrix of each trial, and construct a training data set together with the manually marked emotion labels of the fused feature vector matrices;
Step 5.4: train on the training data set with one-versus-rest support vector machines, computing the fitness and updating the fusion coefficient matrix A; stop training once the preset fitness is reached, obtaining several trained network models with emotion labels;
Step 5.5: extract the feature vector group from the user's physiological signals collected in real time, obtain the fused feature vector matrix with the fusion coefficient matrix A, classify it with the trained one-versus-rest support vector machine models, match the classification result against the emotion labels, and take the best-matching emotion label as the judgment of the user's emotion.
The common spatial pattern algorithm of step 5.1 refers to the algorithm disclosed in the paper "Spatial patterns underlying population differences in the background EEG" by Zoltan J. Koles et al., Brain Topography, vol. 2, 1990.
The particle swarm optimization algorithm of step 5.2 refers to the algorithm disclosed in the paper "Particle swarm optimization" from the 2007 IEEE Swarm Intelligence Symposium.
The one-versus-rest support vector machine of step 5.4 refers to the one disclosed in "A comparison of methods for multiclass support vector machines" by Hsu Chih-Wei et al., IEEE Transactions on Neural Networks, vol. 13, 2002.
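Putting the cited pieces together, real-time classification in step 5.5 could look like the following, reusing the hypothetical fuse/PSO helpers sketched earlier; the label set and the integer-label convention are illustrative assumptions.

```python
import numpy as np

EMOTIONS = ["pleasure", "neutral", "aversion"]   # illustrative label set

def classify_live(A_opt, clf, x_eeg, x_emg, x_bp):
    """Step 5.5 analogue: fuse a live feature vector group with the
    optimized coefficients A_opt and pick the best-matching emotion.
    clf is a fitted one-vs-rest SVM trained on integer labels
    0..len(EMOTIONS)-1 (assumed convention)."""
    v = np.hstack([A_opt[0] * x_eeg, A_opt[1] * x_emg, A_opt[2] * x_bp])
    idx = int(clf.predict(v.reshape(1, -1))[0])
    return EMOTIONS[idx]
```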
Example two
The skin touch simulation device comprises a fiber cloth 2031 worn on the user's limb and a tactile sheet 2036 arranged in a hole of the fiber cloth. Outside the tactile sheet 2036 is a limiting ring 2032 fixedly connected to the fiber cloth; several connection points around the tactile sheet 2036 are connected to the ends of fiber ropes 2033 evenly distributed along the limiting ring 2032, and the end of each fiber rope 2033 away from the tactile sheet is fixedly connected to a micro piston 2034 inside a thin tube. Each thin tube 2035 is connected to a hydraulic mechanism and filled with hydraulic fluid. Driven by the hydraulic mechanism, the hydraulic fluid in the tubes in different directions around the tactile sheet 2036 is at different pressures, so the fiber ropes connected to the pistons in those tubes produce different pulling forces; under pulling forces of different magnitudes from the ropes in different directions, the tactile sheet 2036 generates pressure and shearing force on the skin of the user's body in contact with it.
This embodiment is applied in the automotive field. When the user interacts with the virtual reality scene, the skin touch simulation device provides accurate tactile simulation, giving the user an immersive experience very close to really touching the object in reality; by collecting physiological signals during the experience in real time, the user's visual and tactile impression of any object can be reflected. For example, when a user wants to experience the texture and feel of a car seat or interior trim, the user only needs to wear a terminal display platform such as a VR headset together with the skin touch simulation device provided by the invention, and the experience can be completed in the virtual scene without visiting the site in person; with the physiological signal module provided by the invention, the changes in the user's physiological signals during the experience can be analyzed to obtain the user's degree of preference for the seat and for the texture and feel of the interior trim. For example, if the car seat material built into the virtual scene is fabric while the user prefers leather, the user will feel a certain resistance after the tactile feedback device simulates the touch of fabric; collecting and analyzing the emotion of the user's experience can help manufacturers improve the design.

Claims (7)

1. A virtual scene experience system with tactile simulation and emotion perception, characterized in that it comprises a scene computer (1) and, each in communication connection with it, a display terminal (3) and a tactile simulation and feedback device (2); the tactile simulation and feedback device (2) is worn on the user's limb and comprises a microprocessor (201) and, each connected to it, a second communication module (202), a positioning module (205) and a skin tactile simulation module (203); the skin tactile simulation module (203) generates varying pressure and shearing force on the skin it contacts according to the scene displayed by the display terminal and the position of the user's limb, simulating the touch of objects in the scene;
the skin touch simulation module (203) comprises fiber cloth (2031) worn on a user limb and a touch sheet (2036) arranged in a fiber cloth hole, a limiting ring (2032) fixedly connected with the fiber cloth is arranged outside the touch sheet (2036), a plurality of connecting points around the touch sheet (2036) are connected with the tail ends of fiber ropes (2033) uniformly distributed along the limiting ring (2032), the end of the fiber rope (2033) far away from the touch sheet is fixedly connected with a micro piston (2034) in a tubule, the tubule (2035) is connected with a hydraulic mechanism, the tubule (2035) is filled with hydraulic fluid, the hydraulic fluid in tubules in different directions around the touch sheet (2036) has different pressures under the drive of the hydraulic mechanism, so that the fiber ropes connected with the pistons in the tubules in different directions generate different pulling forces, and the touch sheet (2036) is under the action of the pulling forces of different sizes of the fiber ropes in different directions, creating pressure and shear forces on the skin of the user's body in contact therewith.
2. Virtual scene experience system with tactile simulation and emotional perception according to claim 1, characterized in that said tactile simulation and feedback means (2) further comprise a physiological signal module (208), the physiological signal module (208) comprising a blood pressure sensor, an electromyography sensor and an electroencephalography sensor, respectively connected to a microprocessor.
3. Virtual scene experience system with haptic simulation and emotional perception according to claim 1, characterized in that the haptic simulation and feedback means (2) further comprise a micro hydraulic module (204), the thin tube (2035) being connected to a thin tube joint (2041) of the micro hydraulic module.
4. Virtual scene experience system with tactile simulation and emotional perception according to claim 1, characterized in that said tactile simulation and feedback means (2) further comprise a gyroscopic sensor (207) connected to a microprocessor.
5. A virtual scene experience system with tactile simulation and emotional perception according to claim 1, wherein the display terminal is a VR headset.
6. The method of the virtual scene experience system according to any one of claims 2-5, comprising the steps of:
step 1: the scene computer displays a virtual scene to the user through the display terminal, and the user wears the touch simulation and feedback device to interact with the scene computer;
step 2: the scene computer captures the user's touch behavior in the virtual scene;
step 3: according to the object the user touches, the scene computer controls the touch simulation and feedback device to generate a tactile experience on the user's body;
step 4: the physiological signal module of the touch simulation and feedback device acquires the user's physiological signals;
step 5: the user's emotion is judged from the user's physiological signals acquired in real time.
7. A skin touch simulation device, characterized in that it comprises a fiber cloth (2031) worn on the user's limb and a tactile sheet (2036) arranged in a hole of the fiber cloth; outside the tactile sheet (2036) is a limiting ring (2032) fixedly connected to the fiber cloth; several connection points around the tactile sheet (2036) are connected to the ends of fiber ropes (2033) evenly distributed along the limiting ring (2032); the end of each fiber rope (2033) away from the tactile sheet is fixedly connected to a micro piston (2034) inside a thin tube; each thin tube (2035) is connected to a hydraulic mechanism and filled with hydraulic fluid; driven by the hydraulic mechanism, the hydraulic fluid in the tubes in different directions around the tactile sheet (2036) is at different pressures, so the fiber ropes connected to the pistons in those tubes produce different pulling forces; under the pulling forces of the ropes in different directions, the tactile sheet (2036) generates pressure and shearing force on the skin of the user's body in contact with it.
CN202110400345.6A 2021-04-14 2021-04-14 Virtual scene experience system and method with haptic simulation and emotional perception Active CN113190114B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110400345.6A CN113190114B (en) 2021-04-14 2021-04-14 Virtual scene experience system and method with haptic simulation and emotional perception
CN202210380616.0A CN114779930A (en) 2021-04-14 2021-04-14 Emotion recognition method for VR user touch experience based on one-to-many support vector machines

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110400345.6A CN113190114B (en) 2021-04-14 2021-04-14 Virtual scene experience system and method with haptic simulation and emotional perception

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210380616.0A Division CN114779930A (en) 2021-04-14 2021-04-14 Emotion recognition method for VR user touch experience based on one-to-many support vector machines

Publications (2)

Publication Number Publication Date
CN113190114A (en) 2021-07-30
CN113190114B (en) 2022-05-20

Family ID: 76974082

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210380616.0A Pending CN114779930A (en) 2021-04-14 2021-04-14 Emotion recognition method for VR user touch experience based on one-to-many support vector machines
CN202110400345.6A Active CN113190114B (en) 2021-04-14 2021-04-14 Virtual scene experience system and method with haptic simulation and emotional perception

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210380616.0A Pending CN114779930A (en) 2021-04-14 2021-04-14 Emotion recognition method for VR user touch experience based on one-to-many support vector machines

Country Status (1)

Country Link
CN (2) CN114779930A (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115202471A (en) * 2022-06-21 2022-10-18 京东方科技集团股份有限公司 Whole body posture tracking and touch equipment and virtual reality system
CN116909392B (en) * 2023-06-30 2024-01-26 青岛登云智上科技有限公司 Wearable and perceivable interaction system and interaction method
CN117289796A (en) * 2023-09-22 2023-12-26 中山大学 High-interaction mixed reality system and method for complex equipment based on haptic glove

Citations (1)

Publication number Priority date Publication date Assignee Title
CN211577846U (en) * 2019-10-11 2020-09-25 安徽建筑大学 Tactile feedback glove and VR (virtual reality) equipment assembly with same

Family Cites Families (17)

Publication number Priority date Publication date Assignee Title
JP3574554B2 (en) * 1997-11-18 2004-10-06 独立行政法人 科学技術振興機構 Tactile presentation method and device
KR100900310B1 (en) * 2006-12-05 2009-06-02 한국전자통신연구원 Tactile and Visual Display Device
WO2009018330A2 (en) * 2007-07-30 2009-02-05 University Of Utah Research Foundation Shear tactile display system for communicating direction and other tactile cues
CN102713546B (en) * 2009-10-14 2014-07-09 国立大学法人东北大学 Sheet-like tactile sensor system
KR20170060696A (en) * 2015-11-25 2017-06-02 인하대학교 산학협력단 A method for producing transparent film tactile device based on nanocellulose
CN106108894A (en) * 2016-07-18 2016-11-16 天津大学 A kind of emotion electroencephalogramrecognition recognition method improving Emotion identification model time robustness
US9741216B1 (en) * 2016-10-14 2017-08-22 Oculus Vr, Llc Skin stretch instrument
CN106448339A (en) * 2016-10-19 2017-02-22 华南理工大学 Driving training system based on enhanced reality and biofeedback
CN107422841B (en) * 2017-03-03 2020-03-20 杭州市第一人民医院 Man-machine interaction method based on non-contact emotion recognition
KR20230074849A (en) * 2017-06-29 2023-05-31 애플 인크. Finger-mounted device with sensors and haptics
US10915174B1 (en) * 2017-07-20 2021-02-09 Apple Inc. Electronic devices with directional haptic output
US10617942B2 (en) * 2017-12-29 2020-04-14 Microsoft Technology Licensing, Llc Controller with haptic feedback
CN112154401A (en) * 2018-05-09 2020-12-29 苹果公司 Finger attachment device with fabric
CN109145513B (en) * 2018-09-30 2023-11-07 南京航空航天大学 Non-contact type force touch sense reproduction system and method based on electromagnetic field combined excitation control
US11550395B2 (en) * 2019-01-04 2023-01-10 Ultrahaptics Ip Ltd Mid-air haptic textures
US20210081048A1 (en) * 2019-09-11 2021-03-18 Facebook Technologies, Llc Artificial reality devices, including haptic devices and coupling sensors
CN111782034A (en) * 2020-06-10 2020-10-16 华南理工大学 Novel electromagnetic touch simulation feedback device and method based on linear motor

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN211577846U (en) * 2019-10-11 2020-09-25 安徽建筑大学 Tactile feedback glove and VR (virtual reality) equipment assembly with same

Also Published As

Publication number Publication date
CN113190114A (en) 2021-07-30
CN114779930A (en) 2022-07-22

Similar Documents

Publication Publication Date Title
CN113190114B (en) Virtual scene experience system and method with haptic simulation and emotional perception
CN110070944B (en) Social function assessment training system based on virtual environment and virtual roles
Yang et al. Gesture interaction in virtual reality
CA2882968C (en) Facilitating generation of autonomous control information
Zacharatos et al. Automatic emotion recognition based on body movement analysis: a survey
Rezazadeh et al. Using affective human–machine interface to increase the operation performance in virtual construction crane training system: A novel approach
JP2022501718A (en) Human / computer interface with fast and accurate tracking of user interactions
KR101548156B1 (en) A wireless exoskeleton haptic interface device for simultaneously delivering tactile and joint resistance and the method for comprising the same
CN106445168A (en) Intelligent gloves and using method thereof
JP2021511567A (en) Brain-computer interface with adaptation for fast, accurate, and intuitive user interaction
CN107589782A (en) Method and apparatus for the ability of posture control interface of wearable device
CN103955269A (en) Intelligent glass brain-computer interface method based on virtual real environment
CN104997581A (en) Artificial hand control method and apparatus for driving EEG signals on the basis of facial expressions
Rechy-Ramirez et al. Impact of commercial sensors in human computer interaction: a review
Abbasi-Asl et al. Brain-computer interface in virtual reality
Cutipa-Puma et al. A low-cost robotic hand prosthesis with apparent haptic sense controlled by electroencephalographic signals
Hu et al. Navigation in virtual and real environment using brain computer interface: a progress report
Kwon et al. Electromyography-based decoding of dexterous, in-hand manipulation of objects: Comparing task execution in real world and virtual reality
Wu et al. Omnidirectional mobile robot control based on mixed reality and semg signals
CN113035000A (en) Virtual reality training system for central integrated rehabilitation therapy technology
Longo et al. Using brain-computer interface to control an avatar in a virtual reality environment
CN111459276A (en) Motion capture glove of virtual human hand reality system and virtual reality system
CN116301335A (en) Haptic feedback system integrating physiological signal acquisition and operation method
Grandi et al. Digital Manufacturing and Virtual Reality for Tractors' Human-Centred Design
CN113138668B (en) Automatic driving wheelchair destination selection method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant