CN113761947A - Virtual simulation multi-person interactive system - Google Patents
- Publication number
- CN113761947A (application CN202010509995.XA)
- Authority
- CN
- China
- Prior art keywords
- module
- electrically connected
- output end
- input end
- person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/40—Processing or translation of natural language
- G06F40/58—Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/28—Constructional details of speech recognition systems
- G10L15/30—Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/02—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
- H04L67/025—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP] for remote control or remote monitoring of applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/1095—Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
Abstract
The invention discloses a virtual simulation multi-person interaction system comprising a single-person client system, a server, a room system and a CS (client-server) architecture, wherein the output end of the single-person client system is electrically connected with the input end of the server. By arranging simulation equipment in the single-person client system, the system collects the user's actions; a computing system calculates the motion of the skeleton and hands, which is sent through a receiver to a PC and finally signalled to the Hmd. A Tracker and a base station module then obtain position information, so that the model in the virtual simulation system acquires world coordinates and can move through the world. Finally, the data obtained in the single-person client system is transmitted via the server to the user receiving module units in the room system, so that other users see the changes of the user's model, enabling interaction among multiple people and synchronization of their actions.
Description
Technical Field
The invention relates to the technical field of interactive systems, in particular to a virtual simulation multi-person interactive system.
Background
Virtual simulation is, in essence, a computer system that can create and let a person experience a virtual world. That world does not exist physically, but exists in the sensory world of the experiencer, where it can be perceived by vision, hearing, touch and force sense; it may model things that can actually be realized, things that are difficult to realize in practice, or things that cannot be realized at all. "Virtual" here means computer-generated, so virtual reality refers to a special environment generated by a computer, into which a person can "project" himself using various special devices and then operate and control that environment to achieve a particular purpose; that is, the person is the master of the environment.
The existing virtual simulation system is basically operated by one person; the operations and actions of other people cannot be observed, so the system is quite limited: interaction among multiple people cannot take place, and communication among multiple users within the virtual simulation system is difficult, which makes the practicability of the virtual simulation system low.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a virtual simulation multi-person interaction system, which solves the problems that in a conventional virtual simulation system the operations and actions of other people cannot be observed, and that multiple people find it difficult to communicate within the virtual simulation system.
(II) technical scheme
In order to achieve the above purpose, the invention is realized by the following technical scheme: a virtual simulation multi-person interaction system comprises a single-person client system, a server, a room system and a CS architecture, wherein the output end of the single-person client system is electrically connected with the input end of the server, and the output end of the server is electrically connected with the input end of the room system. The single-person client system comprises a Tracker, a base station module, an assignment system, an Hmd, a PC, a voice system, a simulation device, a computing system and a receiver; the Tracker and the base station module are bidirectionally connected, the base station module and the assignment system are bidirectionally connected, the assignment system and the Hmd are bidirectionally connected, and the PC and the Hmd are bidirectionally connected; the output end of the simulation device is electrically connected with the input end of the computing system, the output end of the computing system is bidirectionally connected with the input end of the receiver, the output end of the receiver is electrically connected with the input end of the PC, and the output end of the voice system is electrically connected with the input end of the PC.
Preferably, the simulation device comprises a motion capture device and a data glove, and the base station module comprises a first base station and a second base station.
Preferably, the computing system comprises an application program, a central processing unit, a data assignment module and an IK computing module, wherein an output end of the application program is electrically connected with an input end of the central processing unit, an output end of the central processing unit is electrically connected with an input end of the data assignment module, and an output end of the data assignment module is electrically connected with an input end of the IK computing module.
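The IK computing module described above drives a model's skeleton from computed bone data, but the patent does not disclose the actual algorithm. As a minimal, illustrative sketch only (all names are hypothetical), an analytic two-bone planar IK solver of the kind commonly used for limbs looks like this:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone planar IK: return (shoulder, elbow) joint angles
    in radians that place the chain's end effector at target (tx, ty)."""
    d = math.hypot(tx, ty)
    # Clamp the target distance onto the reachable annulus [|l1-l2|, l1+l2].
    d = max(abs(l1 - l2), min(l1 + l2, d))
    # Law of cosines gives the elbow's bend angle.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to target, corrected for the bent elbow.
    shoulder = math.atan2(ty, tx) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(l1, l2, shoulder, elbow):
    """Forward kinematics: end-effector position for given joint angles."""
    ex = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    ey = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return ex, ey
```

A production system would solve full-body IK in 3D, but the law-of-cosines step above is the core of many two-bone limb solvers.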
Preferably, the voice system comprises a microphone module, a voice recognition module, a selection module, a voice conversion module and a translation module, wherein the output end of the microphone module is electrically connected with the input end of the selection module, the output end of the selection module is electrically connected with the input end of the voice recognition module, and the output end of the voice conversion module is electrically connected with the input end of the translation module.
Preferably, the output end of the speech recognition module is electrically connected with the input end of the PC, and the output end of the translation module is electrically connected with the input end of the PC.
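The voice chain described in the two paragraphs above (microphone to selection to recognition, and conversion to translation to the PC) can be sketched as a routing function. `recognize`, `to_text` and `translate` below are hypothetical stubs standing in for the respective modules, not APIs disclosed by the patent:

```python
def recognize(audio):
    # Stub for the voice recognition module: forward the audio as voice.
    return {"kind": "voice", "payload": audio}

def to_text(audio):
    # Stub for the voice conversion (speech-to-text) module.
    return audio.decode("utf-8") if isinstance(audio, bytes) else str(audio)

def translate(text, target="en"):
    # Stub for the translation module; a real system would call an MT service.
    return {"kind": "text", "payload": text, "lang": target}

def route_speech(audio, mode):
    """Selection module: forward the utterance as raw voice, or convert
    and translate it to text, before handing the result to the PC side."""
    if mode == "voice":
        return recognize(audio)
    if mode == "text":
        return translate(to_text(audio))
    raise ValueError(f"unknown mode: {mode!r}")
```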
Preferably, the CS architecture includes a client, a plurality of clients are disposed inside the client, and an output end of the client is electrically connected to an input end of the server.
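The relay role of the CS architecture (many clients feeding one server, which mirrors each client's state to the other users in a room) can be sketched in a few lines; the class and method names below are illustrative, not from the patent:

```python
class RoomServer:
    """Minimal client-server relay: each client joins a room, and state
    updates from one client are mirrored to every other room member."""

    def __init__(self):
        self.rooms = {}  # room id -> {client id: inbox list}

    def join(self, room, client):
        self.rooms.setdefault(room, {})[client] = []

    def publish(self, room, sender, state):
        # Relay the sender's model state to all other members of the room.
        for client, inbox in self.rooms.get(room, {}).items():
            if client != sender:
                inbox.append((sender, state))

    def poll(self, room, client):
        # Drain and return this client's pending updates.
        inbox = self.rooms[room][client]
        msgs, inbox[:] = list(inbox), []
        return msgs
```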
Preferably, the room system includes a user receiving module, and the user receiving module is provided in plurality.
Preferably, the assignment system comprises a processing module, a conveying module and a feedback module, wherein the output end of the processing module is electrically connected with the input end of the conveying module, and the output end of the conveying module is electrically connected with the input end of the feedback module.
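The processing, conveying and feedback chain of the assignment system can be illustrated as a three-stage pipeline; the function names and data shapes below are assumptions made for this sketch, not details from the patent:

```python
def process(sample):
    # Processing module: turn a raw base-station sample into world coords.
    return {"world": tuple(round(v, 3) for v in sample)}

def convey(packet, sink):
    # Conveying module: deliver the processed packet to the HMD-side sink.
    sink.append(packet)
    return True

def assignment_pipeline(sample, sink):
    """Processing -> conveying -> feedback, mirroring the assignment
    system's three modules; the return value plays the feedback role."""
    packet = process(sample)
    delivered = convey(packet, sink)
    return {"ack": delivered, "packet": packet}
```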
(III) advantageous effects
The invention provides a virtual simulation multi-person interactive system. Compared with the prior art, it has the following beneficial effects:
(1) In the virtual simulation multi-person interactive system, the output end of the single-person client system is electrically connected with the input end of the server, and the output end of the server with the input end of the room system. The single-person client system comprises a Tracker, a base station module, an assignment system, an Hmd, a PC, a voice system, a simulation device, a computing system and a receiver: the Tracker is bidirectionally connected with the base station module, the base station module with the assignment system, the assignment system with the Hmd, and the PC with the Hmd; the output end of the simulation device is electrically connected with the input end of the computing system, the output end of the computing system is bidirectionally connected with the input end of the receiver, the output end of the receiver is electrically connected with the input end of the PC, and the output end of the voice system with the input end of the PC. By arranging the simulation device in the single-person client system, the user's actions can be collected; the computing system calculates the motion of the skeleton and hands, which is sent through the receiver to the PC and finally signalled to the Hmd. The Tracker and the base station module then obtain position information, so that the model in the virtual simulation system acquires world coordinates and can move in the world. Finally, the data obtained in the single-person client system is transmitted via the server to the user receiving module units in the room system, so that other users see the changes of the user's model, enabling interaction among multiple people and synchronization of their actions.
(2) In the virtual simulation multi-person interactive system, the voice system comprises a microphone module, a voice recognition module, a selection module, a voice conversion module and a translation module; the output end of the microphone module is electrically connected with the input end of the selection module, the output end of the selection module with the input end of the voice recognition module, and the output end of the voice conversion module with the input end of the translation module. Using the microphone module together with the voice recognition module, a user's voice information can be recorded and transmitted to the PC; the voice conversion module and the translation module can convert the voice into text, and dialects or foreign languages can be translated by the translation module and transmitted to the PC. Finally, the text is transmitted by the server to the user receiving module units in the room system. This structure further enhances communication among multiple users, thereby enhancing the practicability of the system.
Drawings
FIG. 1 is a functional block diagram of a single-person client system of the present invention;
FIG. 2 is a functional block diagram of the interactive system architecture of the present invention;
FIG. 3 is a block diagram of the operation of the speech system of the present invention;
FIG. 4 is a functional block diagram of the computing system of the present invention;
FIG. 5 is a functional block diagram of the assignment system of the present invention;
fig. 6 is a block diagram illustrating the operation of the CS architecture of the present invention.
In the figure, 1-single person client system, 11-Tracker, 12-base station module, 121-first base station, 122-second base station, 13-assignment system, 131-processing module, 132-delivery module, 133-feedback module, 14-Hmd, 15-PC, 16-voice system, 161-microphone module, 162-voice recognition module, 163-selection module, 164-voice conversion module, 165-translation module, 17-simulation device, 171-motion capture device, 172-data glove, 18-computing system, 181-application, 182-central processor, 183-IK computing module, 184-data assignment module, 19-receiver, 2-server, 3-room system, 4-CS architecture, 41-client.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to figs. 1-6, an embodiment of the present invention provides a technical solution: a virtual simulation multi-person interactive system comprises a single-person client system 1, a server 2, a room system 3 and a CS architecture 4, wherein the output end of the single-person client system 1 is electrically connected with the input end of the server 2, and the output end of the server 2 is electrically connected with the input end of the room system 3. The single-person client system 1 comprises a Tracker 11, a base station module 12, an assignment system 13, an Hmd 14, a PC 15, a voice system 16, a simulation device 17, a computing system 18 and a receiver 19. Data obtained in the receiver 19 is rendered by the PC 15 and presented in the Hmd 14; the Hmd 14 is a head-mounted display of the prior art and is not described further here. The Tracker 11 adopts an HTC VIVE Tracker 2.0, which establishes a wireless, low-latency connection between the accessory to be tracked and the VIVE VR system. The Tracker 11 is positioned by the base station module 12 to obtain position information, which is assigned to a model so that the model obtains world coordinates and can move in the world. The Tracker 11 is bidirectionally connected with the base station module 12, the base station module 12 with the assignment system 13, the assignment system 13 with the Hmd 14, and the PC 15 with the Hmd 14; the output end of the simulation device 17 is electrically connected with the input end of the computing system 18, the output end of the computing system 18 is bidirectionally connected with the input end of the receiver 19, the output end of the receiver 19 is electrically connected with the input end of the PC 15, and the output end of the voice system 16 is electrically connected with the input end of the PC 15.
The simulation device 17 comprises a motion capture device 171 and a data glove 172. The motion capture device 171 adopts a posture sensor containing a triaxial accelerometer, a gyroscope and a triaxial magnetometer; using an autonomous nine-axis data fusion algorithm, it outputs high-precision posture data, provides raw data such as acceleration, angular velocity and magnetic values, and supports multi-window display, data-curve display, multiple calibration modes and data forwarding. The data glove 172 adopts high-performance nine-axis MEMS inertial sensors to collect the motion data of each hand joint in real time; skeletal motion is restored through inverse kinematics, so that real hand motion can be reproduced in the virtual scene. A vibration feedback module arranged in the palm triggers vibration effects according to different scenes, making the experience more immersive. The glove uses 2.4 GHz wireless transmission, achieving low-latency transmission within 10 ms at a frame rate above 120 Hz per hand.
The base station module 12 comprises a first base station 121 and a second base station 122, both HTC Vive base stations using SteamVR Tracking 2.0 and supporting a play area of up to 6 meters with a 150-degree horizontal and 110-degree vertical field of view.
The computing system 18 comprises an application program 181, a central processing unit 182, a data assignment module 184 and an IK calculation module 183. The simulation device 17 obtains motion information and transmits it to the application program 181, which calculates bone motion data; the data is then assigned to the bones of the corresponding model, and the IK calculation module 183 drives the whole skeleton on the model to move. The output end of the application program 181 is electrically connected with the input end of the central processing unit 182, the output end of the central processing unit 182 with the input end of the data assignment module 184, and the output end of the data assignment module 184 with the input end of the IK calculation module 183.
The voice system 16 comprises a microphone module 161, a voice recognition module 162, a selection module 163, a voice conversion module 164 and a translation module 165. Using the microphone module 161 together with the voice recognition module 162, a user's voice information can be recorded and transmitted to the PC 15; the voice conversion module 164 and the translation module 165 can convert the voice into text, and dialects or foreign languages can be translated by the translation module 165 and transmitted to the PC 15. Finally, the text is transmitted by the server 2 to the user receiving module units in the room system, further enhancing communication among multiple users and thus the practicability of the system. The output end of the microphone module 161 is electrically connected with the input end of the selection module 163, the output end of the selection module 163 with the input end of the voice recognition module 162, the output end of the voice conversion module 164 with the input end of the translation module 165, and the output ends of the voice recognition module 162 and the translation module 165 with the input end of the PC 15.
The CS architecture 4 includes a client 41; a plurality of clients are disposed inside the client 41, and the output end of the client 41 is electrically connected to the input end of the server 2. The room system 3 includes a plurality of user receiving modules. The assignment system 13 comprises a processing module 131, a conveying module 132 and a feedback module 133, wherein the output end of the processing module 131 is electrically connected with the input end of the conveying module 132, and the output end of the conveying module 132 with the input end of the feedback module 133. Content not described in detail in this specification belongs to the prior art known to those skilled in the art.
When the system is used, a user wears the motion capture device 171 and the data glove 172 and then starts the virtual simulation system. The simulation device 17 obtains the user's motion information and transmits it to the application program 181 in the computing system 18, which calculates the motion information of the bones; the data is assigned to the corresponding bones by the data assignment module 184, and the IK calculation module 183 drives the whole-body skeleton to move. The data is transmitted through the receiver 19 to the PC 15, which renders a model with body motion, hand motion and position information and displays it in the Hmd 14. The Tracker 11 is located by the first base station 121 and the second base station 122 in the base station module 12; the position information is obtained by the processing module 131 in the assignment system 13 and transmitted to the Hmd 14 by the conveying module 132. The user's voice is recorded by the microphone module 161, and the selection module 163 selects whether to transmit it in voice or text form: when voice form is selected, the data is transmitted to the PC 15 through the voice recognition module 162; when text form is selected, the data is converted into text by the voice conversion module 164, translated into the desired language by the translation module 165, and transmitted to the PC 15. Finally, the data is transmitted through the server 2 to the plurality of user receiving units in the room system 3.
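The per-frame behavior described above can be condensed into a single update tick; all function names here are illustrative placeholders, not components named by the patent:

```python
def client_frame(capture, tracker, network, room):
    """One update tick of the single-person client: read motion and
    position, bundle them into a state, and send the state to the
    server for mirroring to the other users in the room."""
    pose = capture()      # body/hand motion from the gloves and mocap
    position = tracker()  # world coordinates from the base stations
    state = {"pose": pose, "position": position}
    network(room, state)  # the server relays this to the room's users
    return state
```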
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (8)
1. A virtual simulation multi-person interaction system comprises a single-person client system (1), a server (2), a room system (3) and a CS framework (4), and is characterized in that: the output end of the single client system (1) is electrically connected with the input end of the server (2), the output end of the server (2) is electrically connected with the input end of the room system (3), the single client system (1) comprises a Tracker (11), a base station module (12), an assignment system (13), Hmd (14), a PC (15), a voice system (16), a simulation device (17), a computing system (18) and a receiver (19), the Tracker (11) and the base station module (12) are in bidirectional connection, the base station module (12) and the assignment system (13) are in bidirectional connection, the assignment system (13) and the Hmd (14) are in bidirectional connection, the PC (15) and the Hmd (14) are in bidirectional connection, the output end of the simulation device (17) is electrically connected with the input end of the computing system (18), the output end of the computing system (18) and the input end of the receiver (19) are in bidirectional connection, the output end of the receiver (19) is electrically connected with the input end of the PC (15), and the output end of the voice system (16) is electrically connected with the input end of the PC (15).
2. The virtual simulation multi-person interaction system of claim 1, wherein: the simulation device (17) comprises a motion capture device (171) and a data glove (172), and the base station module (12) comprises a first base station (121) and a second base station (122).
3. The virtual simulation multi-person interaction system of claim 1, wherein: the computing system (18) comprises an application program (181), a central processing unit (182), a data assignment module (184) and an IK computing module (183), wherein the output end of the application program (181) is electrically connected with the input end of the central processing unit (182), the output end of the central processing unit (182) is electrically connected with the input end of the data assignment module (184), and the output end of the data assignment module (184) is electrically connected with the input end of the IK computing module (183).
4. The virtual simulation multi-person interaction system of claim 1, wherein: the voice system (16) comprises a microphone module (161), a voice recognition module (162), a selection module (163), a voice conversion module (164) and a translation module (165), wherein the output end of the microphone module (161) is electrically connected with the input end of the selection module (163), the output end of the selection module (163) is electrically connected with the input end of the voice recognition module (162), and the output end of the voice conversion module (164) is electrically connected with the input end of the translation module (165).
5. The virtual simulation multi-person interaction system of claim 4, wherein: the output end of the voice recognition module (162) is electrically connected with the input end of the PC (15), and the output end of the translation module (165) is electrically connected with the input end of the PC (15).
6. The virtual simulation multi-person interaction system of claim 1, wherein: the CS architecture (4) comprises a client (41), a plurality of clients are arranged inside the client (41), and the output end of the client (41) is electrically connected with the input end of the server (2).
7. The virtual simulation multi-person interaction system of claim 1, wherein: the room system (3) includes a user receiving module, and the user receiving module is provided in plurality.
8. The virtual simulation multi-person interaction system of claim 1, wherein: the assignment system (13) comprises a processing module (131), a conveying module (132) and a feedback module (133), wherein the output end of the processing module (131) is electrically connected with the input end of the conveying module (132), and the output end of the conveying module (132) is electrically connected with the input end of the feedback module (133).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010509995.XA CN113761947A (en) | 2020-06-04 | 2020-06-04 | Virtual simulation multi-person interactive system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010509995.XA CN113761947A (en) | 2020-06-04 | 2020-06-04 | Virtual simulation multi-person interactive system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113761947A true CN113761947A (en) | 2021-12-07 |
Family
ID=78785269
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010509995.XA Pending CN113761947A (en) | 2020-06-04 | 2020-06-04 | Virtual simulation multi-person interactive system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113761947A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117115861A (en) * | 2023-10-19 | 2023-11-24 | 四川弘和数智集团有限公司 | Glove detection method and device, electronic equipment and storage medium |
CN117115861B (en) * | 2023-10-19 | 2024-01-26 | 四川弘和数智集团有限公司 | Glove detection method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |