CN113268139A - Virtual experiment-oriented natural interaction method and system for virtual and real objects - Google Patents


Info

Publication number
CN113268139A
Authority
CN
China
Prior art keywords
virtual
hand
target object
force
elastic unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110402087.5A
Other languages
Chinese (zh)
Inventor
许胡宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Foshan University
Original Assignee
Foshan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foshan University
Priority to CN202110402087.5A
Publication of CN113268139A
Legal status: Pending (Current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00: Simulators for teaching or training purposes

Abstract

The invention discloses a virtual experiment-oriented natural interaction method and system for virtual and real objects. The method comprises the following steps: constructing a virtual model in which a virtual hand, a target object and a virtual elastic unit are arranged, the virtual elastic unit generating an elastic force when the virtual hand moves towards the target object, the elastic force increasing as the distance between the virtual hand and the target object decreases; acquiring the collision point of the virtual hand with the target object and the elastic force generated by the virtual elastic unit at the moment of collision, and calculating from that elastic force the gripping force and the friction force with which the virtual hand grasps the target object; and feeding the gripping force and the friction force back to a handheld device. The method provides a more realistic and natural interaction technique through physical simulation: the virtual hand generated in the virtual world simulates friction and gripping force for the student, increasing the student's sense of immersion and realism during the virtual experiment.

Description

Virtual experiment-oriented natural interaction method and system for virtual and real objects
Technical Field
The invention relates to the technical field of virtual reality, in particular to a virtual-real object natural interaction method and system for virtual experiments.
Background
With the rapid development of computer hardware and software, the combination of virtual reality technology and natural interaction technology has become an important research direction, drawing on many technologies such as computer graphics, human-machine interface technology, multimedia technology and sensing technology. Virtual reality (VR) technology mainly involves the simulated environment, perception, natural skills and sensing devices. The simulated environment is a realistic, real-time dynamic three-dimensional image generated by a computer. Perception means that an ideal VR system should provide all the senses a person has: in addition to the visual perception generated by computer graphics, there are auditory, tactile, force and motion perception, and even smell and taste, which together are called multi-perception. As virtual reality technology develops it is being combined with many fields; the virtual experiment is its product in the field of education, and within a virtual experiment the natural interaction technology provides interaction that approaches the real world, so combining natural interaction technology with virtual experiments is very important. However, some existing interaction techniques only guarantee a grasping effect; they do not truly reproduce how the hand grasps an object and therefore lack realism.
Disclosure of Invention
The present invention aims to solve at least one of the technical problems existing in the prior art. To this end, the invention provides a virtual-real object natural interaction method and system for virtual experiments.
In a first aspect, the invention provides a virtual experiment-oriented natural interaction method for virtual and real objects, applied to a computer terminal and comprising the following steps:
constructing a virtual model, wherein a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged in the virtual model; the virtual hand is formed by mapping the motion trail of a real hand acquired by a handheld device, the virtual elastic unit generates an elastic force when the virtual hand moves towards the target object, and the elastic force generated by the virtual elastic unit increases as the distance between the virtual hand and the target object decreases;
acquiring the collision point of the virtual hand with the target object, and calculating, from the elastic force generated by the virtual elastic unit at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object;
and feeding back the gripping force and the friction force to the handheld device.
According to the embodiment of the invention, at least the following technical effects are achieved:
the method comprises the steps that a virtual model is built, a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged in the virtual model, and the virtual elastic unit can obtain elastic force generated when the virtual hand moves to a collision point towards the target object; then calculating the grabbing force and the friction force of the virtual hand for grabbing the target object through the elastic force generated at the collision point; and finally, feeding back the gripping force and the friction force to the handheld device. Compared with other natural interaction methods, the method provides a more real and natural interaction technology in a physical simulation mode; according to the method, in a virtual-real fusion mode, the student controls the object in the virtual experiment in the real world through the real hand of the student, and the virtual hand generated in the virtual world can simulate the friction force and the holding force, so that the immersion and the reality of the student in the virtual experiment are improved.
According to some embodiments of the invention, the virtual model is built by Unity 3D.
According to some embodiments of the invention, the handheld device is a Leap Motion somatosensory recognizer.
According to some embodiments of the invention, the virtual elastic unit is a virtual spring.
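For concreteness, the behaviour described above can be written as a simple relation (an assumed formulation, not stated in the application): if the virtual spring obeys Hooke's law with stiffness k and a rest length d0 equal to the initial distance between the virtual hand and the target object, then at a hand-object distance d the elastic force is F(d) = k * (d0 - d) for 0 <= d <= d0, which increases as d decreases and reaches its maximum when the virtual hand touches the target object.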
In a second aspect, the invention provides a virtual experiment-oriented natural interaction method for virtual and real objects, applied to a handheld device and comprising the following steps:
acquiring the motion trail of a real hand and sending the motion trail to a computer terminal, so that the computer terminal constructs a virtual model in which a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged; the virtual hand is mapped from the motion trail, the virtual elastic unit generates an elastic force when the virtual hand moves towards the target object, and the elastic force generated by the virtual elastic unit increases as the distance between the virtual hand and the target object decreases; the computer terminal obtains the collision point of the virtual hand with the target object and calculates, from the elastic force generated by the virtual elastic unit at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object;
and receiving the gripping force and the friction force fed back by the computer terminal.
According to the embodiment of the invention, at least the following technical effects are achieved:
the method comprises the steps that a virtual model is built, a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged in the virtual model, and the virtual elastic unit can obtain elastic force generated when the virtual hand moves to a collision point towards the target object; then calculating the grabbing force and the friction force of the virtual hand for grabbing the target object through the elastic force generated at the collision point; and finally, feeding back the gripping force and the friction force to the handheld device. Compared with other natural interaction methods, the method provides a more real and natural interaction technology in a physical simulation mode; according to the method, in a virtual-real fusion mode, the student controls the object in the virtual experiment in the real world through the real hand of the student, and the virtual hand generated in the virtual world can simulate the friction force and the holding force, so that the immersion and the reality of the student in the virtual experiment are improved.
In a third aspect of the present invention, a virtual-real object natural interaction system for virtual experiments is provided, which includes:
the computer terminal is used for constructing a virtual model, wherein a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged in the virtual model; the virtual hand is formed by mapping the motion trail of a real hand acquired by a handheld device, the virtual elastic unit generates an elastic force when the virtual hand moves towards the target object, and the elastic force generated by the virtual elastic unit increases as the distance between the virtual hand and the target object decreases; the computer terminal is further used for acquiring the collision point of the virtual hand with the target object, calculating, from the elastic force generated by the virtual elastic unit at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object, and feeding the gripping force and the friction force back to the handheld device;
and the handheld device is used for acquiring the motion trail of the real hand and receiving the gripping force and the friction force fed back by the computer terminal.
According to the embodiment of the invention, at least the following technical effects are achieved:
the system comprises a virtual model, wherein a virtual hand, a target object and a virtual elastic unit positioned between the virtual hand and the target object are arranged in the virtual model, and the virtual elastic unit can obtain the elastic force generated when the virtual hand moves to a collision point towards the target object; then calculating the grabbing force and the friction force of the virtual hand for grabbing the target object through the elastic force generated at the collision point; and finally, feeding back the gripping force and the friction force to the handheld device. Compared with other natural interaction systems, the system provides a more real and natural interaction technology in a physical simulation mode; this system is through the mode that virtual reality fuses, and the student controls the object in the virtual experiment in the real world through the real hand of oneself, and the virtual hand that generates in the virtual world can also carry out frictional force and hold the emulation of holding power moreover, has increased the sense of immersion and the sense of reality of student when carrying out the virtual experiment.
According to some embodiments of the invention, the computer terminal constructs the virtual model by Unity 3D.
According to some embodiments of the invention, the handheld device is a Leap Motion somatosensory recognizer.
According to some embodiments of the invention, the virtual elastic unit is a virtual spring.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of a virtual-real object natural interaction method for a virtual experiment according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "axial", "radial", "circumferential", and the like, indicate orientations and positional relationships based on the orientations and positional relationships shown in the drawings, and are used merely for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the device or element so referred to must have a particular orientation, be constructed and operated in a particular orientation, and therefore, should not be construed as limiting the present invention. Furthermore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; a mechanical or an electrical connection; a direct connection or an indirect connection through intervening media; or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
Referring to fig. 1, an embodiment of the present invention provides a virtual-experiment-oriented natural interaction method for virtual and real objects, including the following steps:
s100, a computer terminal constructs a virtual model, wherein a virtual hand, a target object and a virtual elastic unit positioned between the virtual hand and the target object are arranged in the virtual model; the virtual hand is formed by mapping the motion trail of the real hand acquired by the handheld device, and the virtual elastic unit can generate elastic force when the virtual hand moves towards the target object, wherein the elastic force generated by the virtual elastic unit is increased along with the reduction of the distance between the virtual hand and the target object.
The method comprises the steps of firstly constructing a virtual model through a computer terminal, then acquiring the motion trail of a real hand of a human body through a Leap motion somatosensory recognizer, sending the acquired motion trail to the computer terminal to construct the virtual model, and mapping the motion trail of the virtual hand and the motion trail of the virtual hand in a virtual space.
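As an illustration only (the application discloses no code), the mapping from the tracked real-hand position to the virtual hand can be sketched as a simple scale-and-offset transform. The function name, the calibration values and the assumption that the tracker reports palm positions in millimetres are all hypothetical, not taken from the application or from any SDK.

```python
import numpy as np

def map_real_to_virtual(palm_position_mm, scale=0.001, offset=(0.0, 0.0, 0.0)):
    """Map a tracked palm position (sensor frame, millimetres) into the
    virtual-scene frame (metres) with an assumed scale-and-offset calibration."""
    p = np.asarray(palm_position_mm, dtype=float)
    return p * scale + np.asarray(offset, dtype=float)

# Example: a palm detected 250 mm above the sensor maps to 0.25 m in the scene.
virtual_palm = map_real_to_virtual([0.0, 250.0, 0.0])
print(virtual_palm)  # the mapped position, [0, 0.25, 0] in metres
```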
S200, the computer terminal obtains the collision point of the virtual hand with the target object, and calculates, from the elastic force generated by the virtual elastic unit at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object.
When the user, viewing the space between the virtual hand and the target object through VR glasses (VR glasses are common knowledge in the art and are not described in detail here), moves the hand gesture towards the target object, the virtual elastic unit produces the elastic force generated as the virtual hand moves towards the target object; this elastic force increases as the distance between the virtual hand and the target object decreases, and when the virtual hand collides with the target object the elastic force of the virtual elastic unit reaches its maximum while the total energy in the virtual model is close to its minimum. The gripping force and the friction force are calculated from the components of the elastic force generated by the virtual elastic unit. Throughout the movement, the elastic force is also fed back to the user's real hand through the handheld device, so the real hand receives realistic force feedback during the whole motion.
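A minimal numeric sketch of the force model described in S200, assuming a Hooke's-law virtual spring and a Coulomb friction model; the stiffness k, the friction coefficient mu and the function names are illustrative choices, not values or interfaces given in the application.

```python
import numpy as np

def spring_force(rest_distance, current_distance, k=200.0):
    """Elastic force magnitude of the virtual spring (Hooke's law): it grows as the
    hand-object distance shrinks and peaks when the hand reaches the collision point."""
    compression = max(rest_distance - current_distance, 0.0)
    return k * compression

def grip_and_friction(force_magnitude, approach_dir, contact_normal, mu=0.5):
    """Decompose the elastic force at the collision point: the gripping force is the
    component along the contact normal, and the friction it can sustain is bounded by
    the Coulomb model (friction <= mu * normal force)."""
    a = np.asarray(approach_dir, dtype=float)
    n = np.asarray(contact_normal, dtype=float)
    a /= np.linalg.norm(a)
    n /= np.linalg.norm(n)
    grip = force_magnitude * abs(float(np.dot(a, n)))
    friction = mu * grip
    return grip, friction

# Example: the hand started 0.10 m from the object and has just touched it.
f = spring_force(rest_distance=0.10, current_distance=0.0)      # 20.0 N
grip, friction = grip_and_friction(f, [0, 0, -1], [0, 0, 1])    # 20.0 N, 10.0 N
```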
S300, the computer terminal feeds the gripping force and the friction force back to the handheld device.
Finally, the gripping force and the friction force are fed back to the handheld device, which in turn applies them to the user's real hand; in this way the friction force and the gripping force on the real hand are simulated, improving the student's sense of immersion and realism in the virtual experiment.
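The feedback step can then be closed once per frame, as sketched below; the HapticDevice class and its send method are placeholders invented for illustration, since the application does not name the interface the handheld device exposes.

```python
class HapticDevice:
    """Placeholder for the handheld device's force-feedback channel (assumed interface)."""
    def send(self, grip_force, friction_force):
        print(f"feedback: grip={grip_force:.1f} N, friction={friction_force:.2f} N")

def feedback_loop(device, frames):
    """Each simulation frame, push the computed gripping force and friction force
    back to the handheld device so that the user's real hand feels them."""
    for grip, friction in frames:
        device.send(grip, friction)

# Example with two frames of precomputed forces.
feedback_loop(HapticDevice(), frames=[(20.0, 10.0), (18.5, 9.25)])
```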
In summary, a virtual model is constructed, in which a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged; the virtual elastic unit provides the elastic force generated as the virtual hand moves towards the target object up to the collision point; the gripping force and the friction force with which the virtual hand grasps the target object are then calculated from the elastic force generated at the collision point; finally, the gripping force and the friction force are fed back to the handheld device. Compared with other natural interaction methods, this method provides a more realistic and natural interaction technique through physical simulation. Through virtual-real fusion, the student controls the object in the virtual experiment with his or her own real hand in the real world, while the virtual hand generated in the virtual world also simulates friction and gripping force, which increases the student's sense of immersion and realism during the virtual experiment.
As an alternative embodiment, the computer terminal constructs the virtual model through Unity 3D, the handheld device is a Leap Motion somatosensory recognizer, and the virtual elastic unit is a virtual spring, which reduces the cost of constructing the virtual model.
One embodiment of the present invention provides a virtual-real object natural interaction system for virtual experiments, which includes a computer terminal and a handheld device, wherein:
the computer terminal is used for constructing a virtual model through Unity 3D, in which a virtual hand, a target object and a virtual spring located between the virtual hand and the target object are arranged; the virtual hand is formed by mapping the motion trail of a real hand acquired by the Leap Motion somatosensory recognizer, and the virtual spring generates an elastic force when the virtual hand moves towards the target object, the elastic force increasing as the distance between the virtual hand and the target object decreases; the computer terminal is further used for acquiring the collision point of the virtual hand with the target object, calculating, from the elastic force generated by the virtual spring at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object, and feeding the gripping force and the friction force back to the Leap Motion somatosensory recognizer;
the Leap Motion somatosensory recognizer is used for collecting the motion trail of the real hand and receiving the gripping force and the friction force fed back by the computer terminal.
This system constructs a virtual model through the computer terminal, in which a virtual hand, a target object and a virtual spring located between the virtual hand and the target object are arranged; the virtual spring provides the elastic force generated as the virtual hand moves towards the target object up to the collision point; the gripping force and the friction force with which the virtual hand grasps the target object are then calculated from the elastic force generated at the collision point; finally, the gripping force and the friction force are fed back to the Leap Motion somatosensory recognizer. Compared with other natural interaction systems, this system provides a more realistic and natural interaction technique through physical simulation. Through virtual-real fusion, the student controls the object in the virtual experiment with his or her own real hand in the real world, while the virtual hand generated in the virtual world also simulates friction and gripping force, which increases the student's sense of immersion and realism during the virtual experiment.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (9)

1. A virtual-real object natural interaction method for virtual experiments is characterized by being applied to a computer terminal and comprising the following steps:
constructing a virtual model, wherein a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged in the virtual model; the virtual hand is formed by mapping a motion trail of a real hand acquired by a handheld device, the virtual elastic unit generates an elastic force when the virtual hand moves towards the target object, and the elastic force generated by the virtual elastic unit increases as the distance between the virtual hand and the target object decreases;
acquiring the collision point of the virtual hand with the target object, and calculating, from the elastic force generated by the virtual elastic unit at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object;
and feeding back the gripping force and the friction force to the handheld device.
2. The virtual experiment-oriented natural interaction method for virtual and real objects as claimed in claim 1, wherein the virtual model is constructed by Unity 3D.
3. The virtual experiment-oriented natural interaction method for virtual and real objects as claimed in claim 1, wherein the handheld device is a Leap Motion somatosensory recognizer.
4. The virtual experiment-oriented natural interaction method for virtual and real objects as claimed in claim 1, wherein the virtual elastic unit is a virtual spring.
5. A virtual-real object natural interaction method for virtual experiments is characterized by being applied to a handheld device and comprising the following steps:
acquiring the motion trail of a real hand, and sending the motion trail to a computer terminal so that the computer terminal constructs a virtual model, wherein a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged in the virtual model; the virtual hand is mapped from the motion trail, the virtual elastic unit generates an elastic force when the virtual hand moves towards the target object, and the elastic force generated by the virtual elastic unit increases as the distance between the virtual hand and the target object decreases; the computer terminal obtains the collision point of the virtual hand with the target object and calculates, from the elastic force generated by the virtual elastic unit at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object;
and receiving the gripping force and the friction force fed back by the computer terminal.
6. A virtual-real object natural interaction system for virtual experiments is characterized by comprising:
the computer terminal, used for constructing a virtual model, wherein a virtual hand, a target object and a virtual elastic unit located between the virtual hand and the target object are arranged in the virtual model; the virtual hand is formed by mapping a motion trail of a real hand acquired by a handheld device, the virtual elastic unit generates an elastic force when the virtual hand moves towards the target object, and the elastic force generated by the virtual elastic unit increases as the distance between the virtual hand and the target object decreases; the computer terminal is further used for acquiring the collision point of the virtual hand with the target object, calculating, from the elastic force generated by the virtual elastic unit at the collision point, the gripping force and the friction force with which the virtual hand grasps the target object, and feeding the gripping force and the friction force back to the handheld device;
and the handheld device, used for acquiring the motion trail of the real hand and receiving the gripping force and the friction force fed back by the computer terminal.
7. The virtual experiment-oriented virtual-real object natural interaction system of claim 6, wherein the computer terminal constructs the virtual model through Unity 3D.
8. The virtual-experiment-oriented virtual-real object natural interaction system of claim 6, wherein the handheld device is a Leap Motion somatosensory recognizer.
9. The virtual-experiment-oriented virtual-real object natural interaction system of claim 6, wherein the virtual elastic unit is a virtual spring.
CN202110402087.5A 2021-04-14 2021-04-14 Virtual experiment-oriented natural interaction method and system for virtual and real objects Pending CN113268139A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110402087.5A CN113268139A (en) 2021-04-14 2021-04-14 Virtual experiment-oriented natural interaction method and system for virtual and real objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110402087.5A CN113268139A (en) 2021-04-14 2021-04-14 Virtual experiment-oriented natural interaction method and system for virtual and real objects

Publications (1)

Publication Number Publication Date
CN113268139A true CN113268139A (en) 2021-08-17

Family

ID=77228904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110402087.5A Pending CN113268139A (en) 2021-04-14 2021-04-14 Virtual experiment-oriented natural interaction method and system for virtual and real objects

Country Status (1)

Country Link
CN (1) CN113268139A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102663197A (en) * 2012-04-18 2012-09-12 天津大学 Virtual hand grasp simulating method based on motion capture
WO2015102484A1 (en) * 2013-12-31 2015-07-09 Vrest B.V. Method for generating a real-time haptic feedback signal for a haptic device of a virtual surgery simulator and virtual surgery simulator
CN105913718A (en) * 2016-07-08 2016-08-31 哈尔滨理工大学 Thread lift plastic surgery simulation system
CN109471521A (en) * 2018-09-05 2019-03-15 华东计算技术研究所(中国电子科技集团公司第三十二研究所) Virtual and real shielding interaction method and system in AR environment
CN110931121A (en) * 2019-11-29 2020-03-27 重庆邮电大学 Remote operation guiding device based on Hololens and operation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHOU Yuefei; ZHANG Xiushan; YUAN Kai: "A Visual Feedback Method of Acting Force Based on a Spring Model", Computer Technology and Development, no. 10, 10 October 2009 (2009-10-10), pages 230 - 233 *

Similar Documents

Publication Publication Date Title
Mihelj et al. Virtual reality technology and applications
Tzafestas Intelligent Systems, Control and Automation: Science and Engineering
Tzovaras et al. Design and implementation of haptic virtual environments for the training of the visually impaired
Zhang et al. The application of virtual reality technology in physical education teaching and training
CN107403566A (en) Educational system using virtual robot
CN105159448A (en) Multi-person same-scene immersion type virtual reality apparatus
Vengust et al. NERVteh compact motion based driving simulator
Huang et al. Expressive body animation pipeline for virtual agent
Jacobson et al. Multi-modal virtual reality for presenting geographic information
WO2023019376A1 (en) Tactile sensing system and method for using same
Field et al. Generalised algorithms for redirected walking in virtual environments
Serrano et al. Insertion of real agents behaviors in CARLA autonomous driving simulator
CN113268139A (en) Virtual experiment-oriented natural interaction method and system for virtual and real objects
Zhao et al. Demonstration of enabling people with visual impairments to navigate virtual reality with a haptic and auditory cane simulation
Devine et al. "HapticVive" - a point contact encounter haptic solution with the HTC VIVE and Baxter robot
Lê et al. A concept for a virtual reality driving simulation in combination with a real car
Liang et al. A VR-based calligraphy writing system with force reflection
CN106371574A (en) Tactile feedback method and apparatus, and virtual reality interaction system
CN112989449B (en) Haptic force feedback simulation interaction method and device for optimizing motion stiffness
Krishnaswamy et al. Do you see what I see? effects of pov on spatial relation specifications
Suzuki et al. A proposal on a haptic learning-science simulator for visually impaired students
Esen et al. Bone drilling medical training system
Gunn Using haptics in a networked immersive 3D environment
Esen et al. A multi-user virtual training system concept and objective assessment of trainings
Murai et al. An indoor-walk-guide simulator using a haptic interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination