CN116679834A - Large-space multi-person VR interactive experience system and method - Google Patents

Large-space multi-person VR interactive experience system and method

Info

Publication number
CN116679834A
Authority
CN
China
Prior art keywords
user
interactive
positioning
real
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310964245.5A
Other languages
Chinese (zh)
Other versions
CN116679834B (en)
Inventor
罗龙辉
黄小捷
余小强
宁原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang Dazang Technology Co ltd
Original Assignee
Nanchang Dazang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang Dazang Technology Co ltd filed Critical Nanchang Dazang Technology Co ltd
Priority to CN202310964245.5A priority Critical patent/CN116679834B/en
Publication of CN116679834A publication Critical patent/CN116679834A/en
Application granted granted Critical
Publication of CN116679834B publication Critical patent/CN116679834B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/24Constructional details thereof, e.g. game controllers with detachable joystick handles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/06Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/02Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/10Small scale networks; Flat hierarchical networks
    • H04W84/12WLAN [Wireless Local Area Networks]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks


Abstract

The application relates to the technical field of multi-person VR interaction, and discloses a large-space multi-person VR interaction experience system and method.

Description

Large-space multi-person VR interactive experience system and method
Technical Field
The application relates to the technical field of multi-person VR interaction, in particular to a large-space multi-person VR interaction experience system and method.
Background
In a virtual reality game, the user wears virtual reality equipment to enter an interactive virtual game scene, so that the current scene can be virtualized, virtual interaction can be realized, and the user can enjoy an immersive game experience.
VR interaction generally involves wearing a VR head display and holding a VR interaction handle. The handle can simulate many objects, such as a gun, sword, fan, or racket in a game. However, handle interaction is usually achieved by coordinating the handle itself with cameras, which suffers from large latency and low simulation precision, and the interactions of multiple users easily interfere with one another. Precisely locating a user within the experience space is inconvenient and inaccurate: with imaging-based positioning, hardware or software causes deviations, interaction is not accurate enough, users or handles easily occlude one another, and a large number of difficult algorithms is needed to assist positioning. This increases the difficulty of developing VR interaction and the computational load of the system, so more people cannot participate simultaneously.
To solve these problems, the application provides a large-space multi-person VR interactive experience system and method.
Disclosure of Invention
The application aims to provide a large-space multi-person VR interactive experience system and method to solve the prior-art problems of inconvenient interaction tracking and inaccurate positioning.
In order to achieve the above purpose, the present application provides the following technical solutions:
a large-space multi-person VR interactive experience system comprises a VR equipment unit, an interaction tracking unit, a multi-person real-time positioning unit and a computing unit;
the VR equipment unit comprises a VR head display device and a WiFi receiving device;
the interaction tracking unit comprises a plurality of VR interaction handles and a plurality of photoelectric distance sensing tracking devices uniformly arranged at the top of the experience space; the uniformly arranged photoelectric distance sensing tracking devices scan the interaction handles used by different users in real time, user actions are captured by following the positioning of the interaction handles in real time, and data on the users' real-time interaction actions are obtained and transmitted to the computing unit;
the multi-person real-time positioning unit performs real-time in-venue positioning of the classified users, acquires the users' real-time position information and transmits it to the computing unit;
the computing unit, according to the multi-person real-time positioning information and the interaction tracking information, computes the user's motion and interaction in the VR scene and displays them, via the WiFi transmission device, in the VR head display worn by the target user.
Further, the photoelectric distance sensing tracking device comprises a plurality of diffuse reflection photoelectric switches, a plurality of positioning patches and a fixed suspended ceiling.
further, a plurality of the positioning patches are uniformly adhered to each interaction handle and are grouped, a plurality of the diffuse reflection photoelectric switches are closely arranged and are fixed in the fixed suspended ceiling in an ordered mode, the sensing direction is vertical to the ground of the experience space, the diffuse reflection photoelectric switches detect the positioning of the patches on the handles so as to calculate and simulate the azimuth and the interaction angle of the interaction handles, and a plurality of the diffuse reflection photoelectric switches detect different patches on the handles so as to simulate the inclination, the vertical, the horizontal and the overturning situations of the interaction handles.
Further, the diffuse reflection photoelectric switches form a two-dimensional positioning grid through their arrangement and ordering; their laying directions can be divided into an X axis and a Y axis, so that once a plurality of diffuse reflection photoelectric switches are arranged, each diffuse reflection photoelectric switch forms a coordinate point for interaction positioning.
Further, the computing unit is also provided with a WiFi transmission device. As the user moves the hand-held interaction handle, the diffuse reflection photoelectric switches detect the movement; the computing unit receives the signals of the diffuse reflection photoelectric switches and transmits the result to the VR head display device over WiFi, where the interaction actions are displayed to the user.
Further, the multi-person real-time positioning unit comprises a plurality of proximity switches uniformly embedded in the ground of the experience space, together with an ultrasonic sensor and an inertial sensor installed in combination with the proximity switches. The plurality of proximity switches are closely arranged and ordered on the ground of the experience space. When the combined positioning of the proximity switches and the ultrasonic sensor is temporarily interrupted, position tracking continues through the inertial sensor positioning system until the proximity-switch and ultrasonic positioning is restored, and the two systems are dynamically balanced by a fusion algorithm.
Furthermore, the proximity switches form a two-dimensional positioning grid through their arrangement and ordering; their laying directions can be divided into an X axis and a Y axis, so that once a plurality of proximity switches are arranged, each proximity switch forms a coordinate point for user positioning.
Further, the ultrasonic sensor comprises a transmitting end and several groups of reflecting ends. The groups of reflecting ends are detachably mounted on the users' soles, and their bottoms carry protrusions and recesses of different shapes used to distinguish users: after the transmitting end emits ultrasonic waves, reflecting ends with different bottom shapes return different signals, so users are identified and classified by shape. When a user steps on a proximity switch, the user is positioned according to that switch's place in the ordering, and the transmitting end paired with the proximity switch detects which specific user is present, achieving user-specific positioning.
Furthermore, when a user steps on a proximity switch, the computing unit receives the signals of the proximity switch and of the ultrasonic sensor, which distinguishes users by their ultrasonic reflections; the computing unit transmits the result to the VR head display device over WiFi, and the user's movement and position are displayed in the VR head display.
A large-space multi-person VR interactive experience method is also provided, comprising the following steps:
the user wears the VR equipment and holds an interaction handle, and information on the user's interaction actions is acquired and transmitted; each user wears two identical reflecting ends of the ultrasonic sensor on the soles, and the user's real-time position and position changes during the experience are acquired and transmitted;
according to the information of the user's interaction, the generated VR virtual scene is transmitted through the WiFi transmission device to the WiFi receiving device on the target user's VR head display equipment and displayed to the user in the VR head display; after the different reflection patches on the interaction handles are classified, the different groups of reflection patches feed the interactions of different users in the experience space back into the simulation of the VR scene in real time;
according to the user's real-time movement, the generated VR virtual scene is transmitted through the WiFi transmission device to the WiFi receiving device on the target user's VR head display equipment and displayed to the user in the VR head display; the combined positioning of the proximity switches and the ultrasonic sensor, together with the supporting tracking of the inertial sensor, forms real-time movement tracking of multiple users, and the different groups of reflecting ends of different users reflect different signals, so each target user receives and views the VR virtual scene data intended for them;
according to the interaction tracking and real-time positioning of the users in the experience space, the collected and organized data are transmitted through the WiFi transmission device to the WiFi receiving device on the target user's VR head display equipment and displayed to the user in the VR head display, realizing the interaction of multiple users in the experience space.
Compared with the prior art, the application has the beneficial effects that:
the application provides a large-space multi-person VR interactive experience system and method, wherein the interaction action delay of an interaction handle is small, the simulation precision is high, the interaction of multiple users cannot be interfered, the specific positioning of users in the experience space is convenient and accurate, the positioning deviation is reduced, the precision of the interaction is improved, the users or handles cannot be blocked with each other, a large number of high-difficulty algorithms are not needed for matching the positioning, the development difficulty of VR interaction is reduced, the load of system operation is reduced, more people can participate simultaneously conveniently, the VR experience of the users is improved, various actions and interactions in the experience space are more accurate, the mutual interaction of multiple users can be improved, and the user obtains more immersive VR experience.
Drawings
FIG. 1 is a schematic diagram of a large-space multi-person VR interactive experience system;
FIG. 2 is a schematic diagram of the whole assembly of a large-space multi-person VR interactive experience system of the present application;
FIG. 3 is a schematic workflow diagram of a large-space multi-person VR interactive experience system in accordance with the present application;
in the figure:
1. VR equipment unit; 11. VR head display equipment; 12. WiFi receiving device; 2. interaction tracking unit; 21. interaction handle; 22. photoelectric distance sensing tracking device; 221. diffuse reflection photoelectric switch; 222. positioning patch; 3. multi-person real-time positioning unit; 31. proximity switch; 32. ultrasonic sensor; 4. computing unit.
Detailed Description
The following is a clear and complete description of the embodiments of the present application with reference to the accompanying drawings; obviously, the described embodiments are only some, not all, of the embodiments of the present application.
Example 1
Referring to figs. 1 to 3, the present application provides a technical solution: a large-space multi-person VR interactive experience system comprising a VR equipment unit 1, an interaction tracking unit 2, a multi-person real-time positioning unit 3 and a computing unit 4.
The VR equipment unit 1 includes VR head display equipment 11 and a WiFi receiving device 12. The VR equipment unit 1 comprises VR head displays suitable for different users, with head-mounted VR glasses as the core; images are displayed within the range of human vision through a liquid-crystal panel and lenses, and depth perception is simulated through visual disparity to enhance immersion. The WiFi receiving device 12 receives the data collected and processed by the computing unit 4 and fed back through the WiFi transmission device into the VR simulation scene.
The interaction tracking unit 2 comprises a plurality of VR interaction handles 21 and a plurality of photoelectric distance sensing tracking devices 22 uniformly arranged at the top of the experience space. The uniformly arranged photoelectric distance sensing tracking devices 22 scan the interaction handles 21 used by different users in real time; user actions are captured by following the positioning of the interaction handles 21 in real time, and data on the users' real-time interaction actions are obtained and transmitted to the computing unit 4. The interaction tracking unit 2 consists of the interaction handles 21 and their tracking components: users perform interaction actions by holding the interaction handles 21, and the tracking system on each handle allows the object simulated by the handle (such as a racket, sword, fan or umbrella in a game) and the interaction actions of the handles 21 of different users in the VR simulation scene to be tracked and positioned more accurately.
The multi-person real-time positioning unit 3 performs real-time in-venue positioning of the classified users, acquires the users' real-time position information and transmits it to the computing unit 4. Through the multi-person real-time positioning unit 3, users can be positioned more quickly, which facilitates movement and user-to-user interaction. Because positioning is triggered directly, accurate positioning is formed, the precise matching between the user's position in the experience space and the VR simulation scene is improved, and interaction accuracy increases. Positioning by direct triggering removes complex computation, improves the stability and efficiency of system operation, reduces software development difficulty, and lowers the computational complexity between the real user and the VR scene.
The computing unit 4, according to the multi-person real-time positioning information and the interaction tracking information, computes the user's motion and interaction in the VR scene and displays them, through the WiFi transmission device, in the VR head display worn by the target user. The WiFi transmission device is connected with the WiFi receiving device 12 through an intranet and transmits over a 5G signal, reducing the latency of user movement and interaction.
The photoelectric distance sensing tracking device 22 includes a number of diffuse reflection photoelectric switches 221, a number of positioning patches 222, and a fixed suspended ceiling.
A plurality of positioning patches 222 are uniformly adhered to each interaction handle 21 and grouped; a plurality of diffuse reflection photoelectric switches 221 are closely arranged and fixed in order within the suspended ceiling, with their sensing direction perpendicular to the ground of the experience space. The diffuse reflection photoelectric switches 221 detect the positions of the patches on the handles to calculate and simulate the azimuth and interaction angle of the interaction handles 21, and several switches 221 detecting different patches on a handle allow the tilting, vertical, horizontal, overturning and similar states of the interaction handle 21 to be simulated.
The diffuse reflection photoelectric switches 221 form a two-dimensional positioning grid through their arrangement and ordering; their laying directions can be divided into an X axis and a Y axis, so that once a plurality of switches 221 are arranged, each diffuse reflection photoelectric switch 221 forms a coordinate point for interaction positioning.
In combination with the above, the laying directions of the diffuse reflection photoelectric switches 221 can be divided into an X axis and a Y axis, so that the fixed coordinate points formed by the switches 221 are (0,0), (0,1), (1,1), ..., continuing until the suspended ceiling fully covers the whole experience space. When a held interaction handle 21 performs interaction actions, a diffuse reflection photoelectric switch 221 emits a signal, the positioning patch 222 reflects it back to the switch 221, and the signal is transmitted to the computing unit 4 through the signal line. The specific position and angle of the interaction handle 21 are thus located; the computing unit 4, in cooperation with the interaction handle 21, performs the specific positioning, capture and simulation of the interaction actions and feeds them back into the VR head display. This improves interaction accuracy, facilitates interaction between the user and the VR scene as well as between users, and improves the interactive experience of the device.
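The grid logic above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the switch spacing `CELL`, the function names, and the idea of reading handle orientation from the first and last triggered patch are all hypothetical.

```python
import math

CELL = 0.10  # assumed spacing between adjacent ceiling switches, in metres


def switch_coord(ix, iy):
    """Map a switch's grid indices (0,0), (0,1), (1,1), ... to planar coordinates."""
    return (ix * CELL, iy * CELL)


def handle_pose(triggered):
    """Estimate handle position and yaw from the set of triggered ceiling switches.

    `triggered` is a list of (ix, iy) grid indices whose beams were reflected by
    the positioning patches on one handle. The centroid approximates the handle's
    planar position; the vector from the first to the last triggered patch gives
    a rough orientation (an assumption for illustration only).
    """
    pts = [switch_coord(ix, iy) for ix, iy in triggered]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    (x0, y0), (x1, y1) = pts[0], pts[-1]
    yaw = math.degrees(math.atan2(y1 - y0, x1 - x0))
    return (cx, cy), yaw
```

With two patches reflected along the X axis, the sketch reports the midpoint between them and a yaw of zero, matching the intuition that the handle lies flat along X.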
It should be noted that, because each user occupies physical space while holding a handle, the handles of different users do not collide, so the interaction tracking of different users does not interfere; the system can therefore track as many users as its computing capability and hardware performance allow.
The computing unit 4 is also provided with a WiFi transmission device. As the user moves the hand-held interaction handle 21, the diffuse reflection photoelectric switches 221 detect the movement; the computing unit 4 receives the signals of the switches 221 and transmits the result to the VR head display equipment 11 over WiFi, where the interaction actions are displayed to the user. The WiFi transmission device makes it convenient for the computing unit 4 to transmit the collected, computed, organized and simulated data of the multi-person real-time positioning unit 3 and the interaction tracking unit 2; the VR data sent through the WiFi transmission device are received by the target user through the WiFi receiving device 12, facilitating intranet and 5G transmission and improving transmission efficiency.
The multi-person real-time positioning unit 3 comprises a plurality of proximity switches 31 uniformly embedded in the ground of the experience space, together with an ultrasonic sensor 32 and an inertial sensor installed in combination with the proximity switches 31. The plurality of proximity switches 31 are closely arranged and ordered on the ground of the experience space. When the combined positioning of the proximity switches 31 and the ultrasonic sensor 32 is temporarily interrupted, position tracking continues through the inertial sensor positioning system until the positioning of the proximity switches 31 and the ultrasonic sensor 32 is restored, and the two systems are dynamically balanced by a fusion algorithm.
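The fallback behaviour described above can be sketched as a simple complementary filter. This is a hedged illustration: the patent only says the two systems are "dynamically balanced through a fusion algorithm", so the blending weight `ALPHA`, the class name and the update interface are assumptions.

```python
class FusedTracker:
    """Sketch of the described fallback: trust the floor-grid/ultrasonic fix
    when it is available, dead-reckon with the inertial sensor when it drops
    out, and blend the two when both are live."""

    ALPHA = 0.8  # assumed weight given to the absolute (switch/ultrasonic) fix

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

    def update(self, grid_fix, imu_delta):
        """grid_fix: (x, y) absolute fix, or None while positioning is interrupted.
        imu_delta: (dx, dy) displacement integrated from the inertial sensor."""
        # always propagate with the inertial estimate first
        self.x += imu_delta[0]
        self.y += imu_delta[1]
        if grid_fix is not None:
            # blend back toward the absolute fix once it is restored
            self.x = self.ALPHA * grid_fix[0] + (1 - self.ALPHA) * self.x
            self.y = self.ALPHA * grid_fix[1] + (1 - self.ALPHA) * self.y
        return (self.x, self.y)
```

While `grid_fix` is `None` the tracker follows pure dead reckoning, which drifts; each restored fix pulls the estimate back toward the absolute grid position.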
The proximity switches 31 form a two-dimensional positioning grid through their arrangement and ordering; their laying directions can be divided into an X axis and a Y axis, so that once a plurality of switches 31 are arranged, each proximity switch 31 forms a coordinate point for user positioning.
The ultrasonic sensor 32 comprises a transmitting end and several groups of reflecting ends. The groups of reflecting ends are detachably mounted on the users' soles, and their bottom surfaces carry protrusions and recesses of different shapes used to distinguish users: after the transmitting end emits ultrasonic waves, reflecting ends with different bottom shapes return different signals, so users are identified and classified by shape. When a user steps on a proximity switch 31, the user is positioned according to that switch's place in the ordering, and the transmitting end paired with the proximity switch 31 detects which specific user is present, achieving user-specific positioning.
The laying directions of the proximity switches 31 can be divided into an X axis and a Y axis, so that the fixed coordinate points formed by the switches 31 are (0,0), (0,1), (1,1), ..., continuing until the whole ground of the experience space is covered. The same user wears the same group of reflecting ends; when a reflecting end treads on the ground of the experience space, the proximity switch 31 nearest to it is triggered first, and the user is positioned by combining the coordinate points. If positioning drops out while the user moves, position tracking continues through the inertial sensor positioning system until the positioning of the proximity switches 31 and the ultrasonic sensor 32 is restored, and the two systems are dynamically balanced by a fusion algorithm. The ultrasonic transmitting end emits signals that are reflected by the reflecting ends to distinguish users: the bottom surfaces of different groups of reflecting ends carry different protrusions and recesses, identical within a group and differing between groups. The protrusions and recesses may be triangular, circular, square and so on, and their sizes can also be varied for further distinction. The signals reflected by different reflecting ends therefore differ; the transmitting end passes the received reflections to the computing unit 4, which distinguishes the different users. This allows the number of users in the experience space to be expanded, further improving the VR and interactive experience.
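The shape-based user discrimination above can be sketched as a nearest-signature match. The feature vectors, user labels and least-squares matching rule are all hypothetical stand-ins: the patent does not specify how the computing unit 4 actually discriminates the reflected waveforms.

```python
# Hypothetical echo signatures, one per reflector group. The patent
# distinguishes groups by the shape (triangular, circular, square) and size
# of the bumps and recesses on the reflector sole; here each group is
# modelled as a short feature vector extracted from its echo.
SIGNATURES = {
    "user_a": (0.9, 0.1, 0.3),  # e.g. triangular protrusions
    "user_b": (0.2, 0.8, 0.4),  # e.g. circular protrusions
    "user_c": (0.5, 0.5, 0.9),  # e.g. square protrusions
}


def classify_echo(echo):
    """Return the user whose registered signature is closest (least squares)
    to the measured echo vector."""
    def dist(sig):
        return sum((a - b) ** 2 for a, b in zip(echo, sig))
    return min(SIGNATURES, key=lambda u: dist(SIGNATURES[u]))
```

A noisy echo close to a registered pattern still resolves to the right group, which is the property the varied bump shapes and sizes are meant to guarantee.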
When a user steps on a proximity switch 31, the computing unit 4 receives the signals of the proximity switch 31 and of the ultrasonic sensor 32, which distinguishes users by their ultrasonic reflections; the computing unit 4 transmits the result to the VR head display equipment 11 over WiFi, and the user's movement and position are displayed in the VR head display.
Referring to fig. 1-3, corresponding to the above embodiment 1, this embodiment further provides a large-space multi-person VR interactive experience method, which includes:
The user wears the VR equipment and holds an interaction handle 21, and information on the user's interaction actions is acquired and transmitted; each user wears two identical reflecting ends of the ultrasonic sensor 32 on the soles, and the user's real-time position and position changes during the experience are acquired and transmitted;
according to the information of the user's interaction, the generated VR virtual scene is transmitted through the WiFi transmission device to the WiFi receiving device 12 on the target user's VR head display equipment 11 and displayed to the user in the VR head display; after the different reflection patches on the interaction handle 21 are classified, the different groups of reflection patches feed the interactions of different users in the experience space back into the simulation of the VR scene in real time;
according to the user's real-time movement, the generated VR virtual scene is transmitted through the WiFi transmission device to the WiFi receiving device 12 on the target user's VR head display equipment 11 and displayed to the user in the VR head display; the combined positioning of the proximity switches 31 and the ultrasonic sensor 32, together with the supporting tracking of the inertial sensor, forms real-time movement tracking of multiple users, and the different groups of reflecting ends of different users reflect different signals, so each target user receives and views the VR virtual scene data intended for them;
according to the interaction tracking and real-time positioning of the users in the experience space, the collected and organized data are transmitted through the WiFi transmission device to the WiFi receiving device 12 on the target user's VR head display equipment 11 and displayed to the user in the VR head display, realizing the interaction of multiple users in the experience space.
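The method steps above can be sketched as one per-frame loop that gathers handle tracking and floor positioning for every user, renders each user's view, and pushes it over the venue WiFi. All four callables are placeholders for hardware and engine interfaces the patent does not name.

```python
def frame_update(users, read_handle_grid, read_floor_grid, render, send):
    """One frame of the method, as a sketch.

    read_handle_grid(user) -> handle pose from the ceiling photoelectric grid
    read_floor_grid(user)  -> position from proximity switches + ultrasound
    render(user, pose, position) -> that user's frame of the shared VR scene
    send(user, frame)      -> WiFi transmitter to the user's head display
    """
    frames = {}
    for user in users:
        handle_pose = read_handle_grid(user)
        position = read_floor_grid(user)
        frames[user] = render(user, handle_pose, position)
    for user, frame in frames.items():
        send(user, frame)
    return frames
```

Rendering every user's frame before sending keeps each head display consistent with the same snapshot of the shared scene within a frame.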
All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.

Claims (10)

1. A large-space multi-person VR interactive experience system, characterized by comprising a VR equipment unit, an interaction tracking unit, a multi-user real-time positioning unit and a computing unit;
the VR equipment unit comprises a VR head-mounted display device and a WiFi receiving device;
the interaction tracking unit comprises a plurality of interactive handles for VR interaction and a plurality of photoelectric distance-sensing tracking devices uniformly arranged at the top of the experience space; the uniformly arranged photoelectric distance-sensing tracking devices scan the interactive handles used by different users in real time, the users' actions are captured by following the positioning of the interactive handles in real time, and data on the users' real-time interactive actions are acquired and transmitted to the computing unit;
the multi-user real-time positioning unit is used for positioning the classified users within the venue in real time, acquiring the users' real-time position information and transmitting it to the computing unit;
the computing unit is used for computing the users' movement and interaction in the VR scene from the multi-user real-time positioning information and the interaction tracking information, and displaying the result, via the WiFi transmission device, in the VR head-mounted display worn by the target user.
2. The large-space multi-person VR interactive experience system according to claim 1, characterized in that the photoelectric distance-sensing tracking device comprises a plurality of diffuse-reflection photoelectric switches, a plurality of positioning patches and a fixed suspended ceiling.
3. The large-space multi-person VR interactive experience system according to claim 2, characterized in that the plurality of positioning patches are uniformly adhered to each interactive handle and divided into groups; the plurality of diffuse-reflection photoelectric switches are closely arranged and fixed in sequence in the fixed suspended ceiling, with their sensing direction perpendicular to the floor of the experience space; the diffuse-reflection photoelectric switches detect the positions of the patches on a handle so as to calculate and simulate the orientation and interaction angle of the interactive handle, and the plurality of diffuse-reflection photoelectric switches detect different patches on the handle so as to simulate tilted, vertical, horizontal and turned states of the interactive handle.
4. The large-space multi-person VR interactive experience system according to claim 3, characterized in that the diffuse-reflection photoelectric switches are arranged in sequence to form two-dimensional positioning, their laying directions being divided into an X axis and a Y axis, so that after the plurality of diffuse-reflection photoelectric switches are arranged, each diffuse-reflection photoelectric switch forms a coordinate point for interaction positioning.
5. The large-space multi-person VR interactive experience system according to claim 4, characterized in that the computing unit is further provided with a WiFi transmission device; as a user moves the hand-held interactive handle, the diffuse-reflection photoelectric switches detect the movement, and the computing unit receives the signals of the diffuse-reflection photoelectric switches and transmits them via WiFi to the VR head-mounted display device, which displays the interactive actions to the user.
6. The large-space multi-person VR interactive experience system according to claim 1, characterized in that the multi-user real-time positioning unit comprises a plurality of proximity switches uniformly arranged in the floor of the experience space, together with ultrasonic sensors and inertial sensors installed in combination with the corresponding proximity switches; the plurality of proximity switches are closely arranged and fixed in sequence in the floor of the experience space; when the positioning system formed by the proximity switches and ultrasonic sensors is temporarily interrupted, position tracking continues through the inertial-sensor positioning system until the proximity-switch and ultrasonic positioning is restored, the two systems being dynamically balanced by a fusion algorithm.
7. The large-space multi-person VR interactive experience system according to claim 6, characterized in that the proximity switches are arranged in sequence to form two-dimensional positioning, their laying directions being divided into an X axis and a Y axis, so that after the plurality of proximity switches are arranged, each proximity switch forms a coordinate point for positioning a user.
8. The large-space multi-person VR interactive experience system according to claim 7, characterized in that the ultrasonic sensor comprises a transmitting end and a plurality of groups of reflecting ends; the groups of reflecting ends are detachably mounted on the soles of the users' shoes, and their bottoms carry protrusions and depressions of different shapes for distinguishing the users; after the transmitting end emits ultrasonic waves, reflecting ends with different bottom shapes return different signals, from which the users are identified and classified; a user stepping on a proximity switch is positioned according to the sequence of that switch, and the transmitting end matched with the proximity switch detects the specific user, thereby realizing user-specific positioning.
9. The large-space multi-person VR interactive experience system according to claim 8, characterized in that the computing unit is used for receiving the signals of the proximity switches and the ultrasonic sensors when a user steps on a proximity switch and is distinguished by ultrasound, and the computing unit transmits these signals via WiFi to the VR head-mounted display device, which displays the user's movement and positioning.
10. A large-space multi-person VR interactive experience method, characterized by comprising the following steps:
the user wears the VR equipment and holds an interactive handle, which acquires and transmits information on the user's interactive actions; each user wears two identical reflecting ends of the ultrasonic sensor on the soles of the feet, which acquire and transmit the user's real-time position and position changes within the experience space;
according to the user's interaction information, the generated VR virtual scene is transmitted by the WiFi transmission device to the WiFi receiving device on the VR head-mounted display device of the target user and displayed to the user in the VR head-mounted display; after the reflective patches on the interactive handles are classified into groups, each group of patches feeds back the interactions of a different user in the experience space to the real-time simulation of the VR scene;
according to the user's real-time movement, the generated VR virtual scene is likewise transmitted by the WiFi transmission device to the WiFi receiving device on the VR head-mounted display device of the target user and displayed to the user; the combined positioning of the proximity switches and ultrasonic sensors, together with the complementary tracking of the inertial sensors, provides real-time movement tracking of multiple users, and the differently grouped reflecting ends of different users reflect different signals, so that each target user receives and is shown the VR virtual scene data intended for them;
according to the interaction tracking and real-time positioning of the users in the experience space, the collected and collated data are transmitted by the WiFi transmission device to the WiFi receiving device on the VR head-mounted display device of the target user and displayed to the user in the VR head-mounted display, thereby realizing the interaction of multiple users in the experience space.
CN202310964245.5A 2023-08-02 2023-08-02 Large-space multi-person VR interactive experience system and method Active CN116679834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310964245.5A CN116679834B (en) 2023-08-02 2023-08-02 Large-space multi-person VR interactive experience system and method


Publications (2)

Publication Number Publication Date
CN116679834A true CN116679834A (en) 2023-09-01
CN116679834B CN116679834B (en) 2023-10-24

Family

ID=87787673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310964245.5A Active CN116679834B (en) 2023-08-02 2023-08-02 Large-space multi-person VR interactive experience system and method

Country Status (1)

Country Link
CN (1) CN116679834B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6256016B1 (en) * 1997-06-05 2001-07-03 Logitech, Inc. Optical detection system, device, and method utilizing optical matching
WO2008035827A1 (en) * 2006-09-21 2008-03-27 Jongchul Kim Pedestrian navigation method and apparatus for using geographic information system
US20130158397A1 (en) * 2010-12-27 2013-06-20 Srinivas K. Method and system to automatically load user settings to wireless ultrasound probe
CN205581785U (en) * 2016-04-15 2016-09-14 向京晶 Multi-person indoor virtual reality interactive system
CN106621324A (en) * 2016-12-30 2017-05-10 当家移动绿色互联网技术集团有限公司 Interactive operation method of VR game
WO2017122194A1 (en) * 2016-01-12 2017-07-20 Karma Touch 2016 Ltd. Method and system for transmitter parameter reconfiguration based on receiver spatial information
CN206391542U (en) * 2017-01-12 2017-08-11 深圳游视虚拟现实技术有限公司 A virtual reality game rocket launcher
US20190005848A1 (en) * 2017-06-29 2019-01-03 Verb Surgical Inc. Virtual reality training, simulation, and collaboration in a robotic surgical system
US20190253700A1 (en) * 2018-02-15 2019-08-15 Tobii Ab Systems and methods for calibrating image sensors in wearable apparatuses
US11073906B1 (en) * 2020-01-27 2021-07-27 Valve Corporation Eye tracking system for head-mounted display devices
CN115097939A (en) * 2022-06-30 2022-09-23 沉浸(北京)科技有限公司 Large-space multi-user VR interactive experience system and method
CN116440490A (en) * 2023-04-03 2023-07-18 维沃移动通信有限公司 Interactive control terminal


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HU ET AL.: "Impact of virtual reality anatomy training on ultrasound competency development: A randomized controlled trial", PLOS ONE, vol. 15, no. 11, pages 1 - 12 *
LOU YAN: "Practical Tutorial on Virtual Reality and Augmented Reality", Beijing: China Machine Press, pages 65-68 *
ZHANG MENGHUAN: "Gesture Recognition and Auxiliary Identity Authentication Based on CNN and Ultrasonic Sensing", Transducer and Microsystem Technologies, vol. 41, no. 05, pages 110-113 *

Also Published As

Publication number Publication date
CN116679834B (en) 2023-10-24

Similar Documents

Publication Publication Date Title
US9132342B2 (en) Dynamic environment and location based augmented reality (AR) systems
CN109754471A (en) Image processing method and device, storage medium, electronic equipment in augmented reality
US20180196116A1 (en) Method and apparatus for using gestures to control a measurement device
US8553935B2 (en) Computer interface employing a manipulated object with absolute pose detection component and a display
KR101229283B1 (en) Method and system for visualising virtual three-dimensional objects
CN105608746A Method for realizing virtual reality
WO2015180497A1 (en) Motion collection and feedback method and system based on stereoscopic vision
CN105027030A (en) Wireless wrist computing and control device and method for 3d imaging, mapping, networking and interfacing
JP2001075726A (en) Computer input device with six degrees of freedom for controlling movement of three-dimensional object
CN105824416A (en) Method for combining virtual reality technique with cloud service technique
CN105797378A (en) Game video realizing method based on virtual reality technology
CN110928404B (en) Tracking system and related tracking method thereof
CN105824417A (en) Method for combining people and objects through virtual reality technology
CN105100952B (en) Screen-picture method of adjustment, device and equipment
CN206819290U A virtual reality multi-person interaction system
JPH10198506A (en) System for detecting coordinate
CN205102796U Hand-held laser 3D scanner requiring no pasted target points
CN116679834B (en) Large-space multi-person VR interactive experience system and method
JP5658618B2 (en) Operation input device, program
CN109643182A (en) Information processing method and device, cloud processing equipment and computer program product
CN105797379A (en) Game video processing method based on virtual reality technology
CN105718066B (en) It is a kind of can flexible combination real-time optical alignment system
Yang 3D Sensing and Tracking of Human Gait
Hui et al. Research on Graphics Assisted Stereotactic Orientation System Based on Virtual Reality
KR101230950B1 A haptic apparatus and method of controlling the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant