CN116679834B - Large-space multi-person VR interactive experience system and method

Large-space multi-person VR interactive experience system and method

Info

Publication number
CN116679834B
Authority
CN
China
Prior art keywords
user
positioning
interactive
real
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310964245.5A
Other languages
Chinese (zh)
Other versions
CN116679834A (en)
Inventor
罗龙辉
黄小捷
余小强
宁原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang Dazang Technology Co ltd
Original Assignee
Nanchang Dazang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang Dazang Technology Co ltd filed Critical Nanchang Dazang Technology Co ltd
Priority to CN202310964245.5A
Publication of CN116679834A
Application granted
Publication of CN116679834B
Legal status: Active

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/24: Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/01: Protocols
    • H04L67/06: Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W84/00: Network topologies
    • H04W84/02: Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/10: Small scale networks; Flat hierarchical networks
    • H04W84/12: WLAN [Wireless Local Area Networks]
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of multi-person VR interaction, and discloses a large-space multi-person VR interaction experience system and method.

Description

Large-space multi-person VR interactive experience system and method
Technical Field
The application relates to the technical field of multi-person VR interaction, in particular to a large-space multi-person VR interaction experience system and method.
Background
In a virtual reality game, the user wears virtual reality equipment to enter an interactive virtual game scene, so that the current scene is virtualized and virtual interaction becomes possible, giving the user an immersive game experience.
VR interaction generally involves wearing a VR head-mounted display and holding a VR interactive handle. The handle can simulate many objects, for example a gun, a sword, a fan, or a racket in a game. However, handle interaction is usually achieved by the handle cooperating with a camera, which brings high latency and low simulation accuracy; interactions among multiple users easily interfere with one another; precisely locating a user within the experience space is inconvenient and inaccurate; imaging-based positioning deviates because of hardware or software limitations, so interaction is not accurate enough; users or handles easily occlude one another; and a large number of difficult algorithms must cooperate in the positioning, which raises the difficulty of developing VR interaction and increases the computational load of the system, so that more people cannot participate at the same time.
To solve these problems, the application provides a large-space multi-person VR interactive experience system and method.
Disclosure of Invention
The application aims to provide a large-space multi-person VR interactive experience system and method that solve the problems of inconvenient interaction tracking and inconvenient positioning in the prior art.
In order to achieve the above purpose, the present application provides the following technical solutions:
a large-space multi-person VR interactive experience system comprises a VR equipment unit, an interactive tracking unit, a multi-person real-time positioning unit and a computing unit;
the VR equipment unit comprises VR head display equipment and a WiFi receiving device;
the interaction tracking unit comprises interactive handles for the multiple VR users and a plurality of photoelectric distance-sensing tracking devices uniformly arranged at the top of the experience space; each photoelectric distance-sensing tracking device comprises a plurality of diffuse-reflection photoelectric switches, a plurality of positioning patches and a fixed suspended ceiling; the positioning patches are uniformly adhered to each interactive handle and grouped; the diffuse-reflection photoelectric switches are closely arranged and fixed in sequence in the suspended ceiling, with their sensing directions all perpendicular to the floor of the experience space, so that through their arrangement and ordering the switches form a two-dimensional positioning grid; the uniformly arranged tracking devices scan the interactive handles used by the different users in real time and follow the handles' positions to capture the users' actions, thereby acquiring data on the users' real-time interactions and transmitting it to the computing unit;
the multi-person real-time positioning unit performs in-venue real-time positioning of the sorted and classified users, acquires the users' real-time position information and transmits it to the computing unit; it comprises a plurality of proximity switches uniformly embedded in the floor of the experience space, together with ultrasonic sensors and inertial sensors installed in combination with the corresponding proximity switches; each ultrasonic sensor comprises a transmitting end and multiple groups of reflecting ends, the reflecting ends being detachably mounted on the users' soles; the bottoms of the different groups of reflecting ends carry protrusions and recesses of different shapes to distinguish users, so that after the transmitting end emits ultrasonic waves, reflecting ends with differently shaped bottoms return different echoes, from which users are identified and classified;
the computing unit computes the user's motion and interaction in the VR scene from the multi-person real-time positioning information and the interaction tracking information, and transmits the result through the WiFi transmission device for display in the VR head display worn by the target user.
Further, the diffuse-reflection photoelectric switches detect the positioning patches on the handle so as to calculate and simulate the orientation and interaction angle of the interactive handle, and by detecting different patches on the handle, the plurality of switches can simulate the tilted, vertical, horizontal and inverted states of the interactive handle.
Furthermore, the layout of the diffuse-reflection photoelectric switches can be divided along an X axis and a Y axis, so that after the switches are arranged, each diffuse-reflection photoelectric switch forms a coordinate point for interactive positioning.
Further, the computing unit is also provided with a WiFi transmission device; as the user moves the handheld interactive handle, the diffuse-reflection photoelectric switches detect the movement, and the computing unit receives the switches' signals and transmits them over WiFi to the VR head display device, where the interaction is displayed to the user.
Further, the plurality of proximity switches are closely arranged and ordered on the floor of the experience space; when the combined proximity-switch and ultrasonic-sensor positioning is temporarily lost, the inertial-sensor positioning system continues to track the position until proximity-switch and ultrasonic positioning are restored, the two systems being dynamically balanced by a fusion algorithm.
Furthermore, the proximity switches form two-dimensional positioning through their arrangement and ordering; their layout can be divided along an X axis and a Y axis, so that after the switches are arranged, each proximity switch forms a coordinate point for positioning a user.
Furthermore, when a user steps on a proximity switch, positioning follows the ordering of the switches, and the transmitting end paired with the switch detects which specific user is present, realizing user-specific positioning.
Furthermore, when a user steps on a proximity switch, the computing unit receives the signals of the proximity switch and the ultrasonic sensor, distinguishes the user by the ultrasonic echo, and transmits the result over WiFi to the VR head display device, where the user's movement and positioning are displayed.
A large-space multi-person VR interactive experience method is also provided, comprising the following steps:
the user wears the VR equipment and holds an interactive handle, and information on the user's interactive actions is acquired and transmitted; two identical reflecting ends of the ultrasonic sensor are worn on each user's soles, and the user's real-time position and position changes within the experience space are acquired and transmitted;
according to the user interaction information, the generated VR virtual scene is transmitted by the WiFi transmission device to the WiFi receiving device on the target user's VR head display device and displayed to the user in the VR head display; with the different reflective patches on the interactive handles classified into groups, each group feeds the interactions of its user in the experience space back into the simulation of the VR scene in real time;
according to the user's real-time movement, the generated VR virtual scene is transmitted by the WiFi transmission device to the WiFi receiving device on the target user's VR head display device and displayed to the user in the VR head display; the combined positioning of the proximity switches and ultrasonic sensors, together with the coordinated tracking of the inertial sensors, forms real-time movement tracking of multiple users, and the different groups of reflecting ends worn by different users reflect different signals, so each target user receives and is shown the VR virtual scene data intended for them;
according to the interaction tracking and real-time positioning of the users in the experience space, the collected and organized data are transmitted by the WiFi transmission device to the WiFi receiving device on the target user's VR head display device and displayed to the user in the VR head display, thereby realizing the interaction of multiple users in the experience space.
Compared with the prior art, the application has the beneficial effects that:
the application provides a large-space multi-person VR interactive experience system and method, wherein the interaction action delay of an interaction handle is small, the simulation precision is high, the interaction of multiple users cannot be interfered, the specific positioning of users in the experience space is convenient and accurate, the positioning deviation is reduced, the precision of the interaction is improved, the users or handles cannot be blocked with each other, a large number of high-difficulty algorithms are not needed for matching the positioning, the development difficulty of VR interaction is reduced, the load of system operation is reduced, more people can participate simultaneously conveniently, the VR experience of the users is improved, various actions and interactions in the experience space are more accurate, the mutual interaction of multiple users can be improved, and the user obtains more immersive VR experience.
Drawings
FIG. 1 is a schematic diagram of a large-space multi-person VR interactive experience system;
FIG. 2 is a schematic diagram of the whole assembly of a large-space multi-person VR interactive experience system of the present application;
FIG. 3 is a schematic workflow diagram of a large-space multi-person VR interactive experience system in accordance with the present application;
in the figure:
1. VR equipment unit; 11. VR head display device; 12. WiFi receiving device; 2. interaction tracking unit; 21. interactive handle; 22. photoelectric distance-sensing tracking device; 221. diffuse-reflection photoelectric switch; 222. positioning patch; 3. multi-person real-time positioning unit; 31. proximity switch; 32. ultrasonic sensor; 4. computing unit.
Detailed Description
The following describes the embodiments of the present application clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present application, not all of them.
Example 1
Referring to fig. 1 to 3, the present application provides a technical solution: a large-space multi-person VR interactive experience system comprising a VR equipment unit 1, an interaction tracking unit 2, a multi-person real-time positioning unit 3 and a computing unit 4.
The VR equipment unit 1 includes a VR head display device 11 and a WiFi receiving device 12. The unit provides VR head displays suitable for different users, with head-mounted VR glasses at the core: a liquid crystal panel and lenses place the image within the range of human sight, and depth is simulated through visual disparity to enhance immersion. The WiFi receiving device 12 receives the data collected and processed by the computing unit 4, sent via the WiFi transmission device, and feeds it back into the simulated VR scene.
The interaction tracking unit 2 includes the interactive handles 21 used for the multiple VR interactions and a plurality of photoelectric distance-sensing tracking devices 22 uniformly arranged at the top of the experience space. Each photoelectric distance-sensing tracking device 22 comprises a plurality of diffuse-reflection photoelectric switches 221, a plurality of positioning patches 222 and a suspended ceiling. The positioning patches 222 are uniformly adhered to each interactive handle 21 and grouped; the diffuse-reflection photoelectric switches 221 are closely arranged and fixed in sequence in the suspended ceiling, with their sensing directions all perpendicular to the floor of the experience space, so that through their arrangement and ordering they form a two-dimensional positioning grid. The uniformly arranged tracking devices 22 scan the interactive handles 21 used by different users in real time and follow the handles' positions to capture user actions, acquiring data on the users' real-time interactions and transmitting it to the computing unit 4. Through this tracking system, the position and actions of the interactive handle 21 can be located more accurately, so the handle can simulate objects such as an umbrella, a racket, a sword or a gun in a game, and the user's interactive actions are mapped more accurately into the virtual scene in real time.
The multi-person real-time positioning unit 3 performs in-venue real-time positioning of the sorted and classified users, acquires the users' real-time position information and transmits it to the computing unit 4. This unit positions users more quickly, facilitates user movement and interactive operation, and is triggered directly, forming accurate positioning; it improves the accurate matching between the user's position in the experience space and the VR simulation scene, improves the accuracy of interactive operation, and, by positioning through a trigger mechanism rather than heavy computation, improves the stability and efficiency of system operation, reduces software development difficulty, and reduces the computational complexity between the real user and the VR scene. The multi-person real-time positioning unit 3 comprises a plurality of proximity switches 31, together with ultrasonic sensors 32 and inertial sensors installed in combination with the corresponding proximity switches 31. Each ultrasonic sensor 32 comprises a transmitting end and multiple groups of reflecting ends; the groups of reflecting ends are detachably mounted on the users' soles, and their bottom surfaces carry protrusions and recesses of different shapes, so that after the transmitting end emits ultrasonic waves, reflecting ends with differently shaped bottoms return different echoes, from which users are identified and classified.
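As an illustration of the echo-based user identification described above, the following is a minimal sketch in Python. The patent states only that differently shaped reflector bottoms return distinguishable echoes; the template-matching approach, the synthetic waveforms and all names here are assumptions added for illustration, with per-user echo templates presumed to be recorded during a calibration step.

```python
import numpy as np

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two equal-length echo waveforms."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.mean(a * b))

def classify_echo(echo: np.ndarray, templates: dict[int, np.ndarray]) -> int:
    """Return the user ID whose calibration template best matches `echo`."""
    return max(templates, key=lambda uid: normalized_correlation(echo, templates[uid]))

# Synthetic stand-ins for echoes recorded during calibration from reflector
# bottoms with differently shaped protrusions and recesses:
t = np.linspace(0.0, 1.0, 256)
templates = {
    1: np.sin(2 * np.pi * 40 * t),            # "triangle" group
    2: np.sign(np.sin(2 * np.pi * 25 * t)),   # "square" group
    3: np.sin(2 * np.pi * 60 * t) ** 2,       # "circle" group
}
received = templates[2] + 0.1 * np.random.randn(256)  # noisy echo from user 2
print(classify_echo(received, templates))             # -> 2
```

A real deployment would presumably match on features of the actual ultrasonic return (delay spread, amplitude pattern) rather than raw waveforms, but the grouping logic is the same.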
the computing unit 4 is configured to conduct motion and interaction of a user in a VR scene through a WiFi transmission device according to real-time positioning information and interaction tracking information of multiple persons, display the motion and interaction in a VR head display worn by a target user to the user through calculation, connect the WiFi transmission device with the WiFi receiving device 12 through an intranet, and transmit the motion and interaction through a 5G signal, so that delay of user movement and interaction is reduced.
The diffuse-reflection photoelectric switches 221 detect the positioning patches on the handle so as to calculate and simulate the orientation and interaction angle of the interactive handle 21, and by detecting different patches on the handle, the plurality of switches 221 can simulate states of the interactive handle 21 such as tilted, vertical, horizontal and inverted.
The layout of the diffuse-reflection photoelectric switches 221 may be divided along an X axis and a Y axis, so that after the switches 221 are arranged, each diffuse-reflection photoelectric switch 221 forms a coordinate point for interactive positioning.
In combination with the above, since the layout of the diffuse-reflection photoelectric switches 221 can be divided along an X axis and a Y axis, the fixed coordinate points formed by the switches 221 are (0, 0), (0, 1), (1, 1) and so on, continuing until the suspended ceiling fully covers the whole experience space. When the interactive handle 21 is held and an interactive action is performed, a diffuse-reflection photoelectric switch 221 emits a signal, the positioning patch 222 reflects it back to the switch 221, and the signal is passed to the computing unit 4 through the signal line. In this way the specific position and angle of the interactive handle 21 are located, and the computing unit 4 performs the specific positioning and the capture and simulation of the interactive action in cooperation with the handle 21, feeding the result back into the VR head display. This improves the accuracy of interaction, facilitates interaction between the user and the VR scene as well as between users, and improves the interactive experience of the device.
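To make the grid scheme concrete, here is a minimal Python sketch of converting a set of triggered ceiling switches into a handle position and tilt estimate. The grid pitch CELL, the per-switch distance readings and the plane-fit method are illustrative assumptions; the patent specifies only that each switch is a fixed coordinate point and that detections of different patches are combined to recover the handle's orientation.

```python
import numpy as np

CELL = 0.10  # assumed grid pitch (metres) between adjacent ceiling switches

def handle_pose(triggered: list[tuple[int, int, float]]) -> tuple[np.ndarray, float]:
    """Estimate handle position and tilt from triggered ceiling switches.

    Each entry of `triggered` is (ix, iy, distance): the grid index of a
    diffuse-reflection photoelectric switch that currently sees a
    positioning patch, plus its measured range to that patch in metres.
    """
    pts = np.array([(ix * CELL, iy * CELL, d) for ix, iy, d in triggered])
    position = pts.mean(axis=0)  # centroid: (x, y) under the ceiling + mean range
    # Fit a plane z = a*x + b*y + c through the detections; the slope of
    # that plane gives the handle's tilt relative to horizontal.
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    (a, b, _), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    tilt_deg = float(np.degrees(np.arctan(np.hypot(a, b))))
    return position, tilt_deg

# Three patches of one handle seen by three neighbouring switches:
pos, tilt = handle_pose([(4, 7, 1.60), (5, 7, 1.55), (5, 8, 1.50)])
print(pos, tilt)  # approximate handle centre and tilt angle in degrees
```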
It should be noted that, because each person occupies volume while holding a handle, a user carrying a handle does not collide with other users and their handles, so the interaction tracking of different users does not suffer mutual interference; the system can therefore track as many users as its computing capability and hardware performance allow.
The computing unit 4 is also provided with a WiFi transmission device. As the user moves the handheld interactive handle 21, the diffuse-reflection photoelectric switches 221 detect the movement; the computing unit 4 receives the switches' signals and transmits them over WiFi to the VR head display device 11, where the interaction is displayed to the user. The WiFi transmission device lets the computing unit 4 conveniently send out the collected, computed, organized and simulated data of the multi-person real-time positioning unit 3 and the interaction tracking unit 2; the VR data are transmitted through the WiFi transmission device and received by the WiFi receiving device 12 of the target user's head display, with intranet and 5G transmission both supported, improving transmission efficiency.
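A minimal sketch of this transmission path, assuming each head display's WiFi receiving device 12 listens on a known LAN address and that updates travel as small JSON packets over UDP; the patent does not specify a wire format or addressing scheme, so all names and addresses here are hypothetical.

```python
import json
import socket

# Hypothetical address book: each user's VR head display (its WiFi
# receiving device) listens on a fixed LAN address and port.
HMD_ADDRESSES = {1: ("192.168.1.101", 9000), 2: ("192.168.1.102", 9000)}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP keeps latency low

def push_update(user_id: int, position, handle_pose) -> None:
    """Send one positioning/interaction update to the target user's HMD."""
    packet = json.dumps({
        "user": user_id,
        "position": position,     # (x, y) fix from the floor grid
        "handle": handle_pose,    # position + tilt from the ceiling grid
    }).encode("utf-8")
    sock.sendto(packet, HMD_ADDRESSES[user_id])

push_update(1, (2.4, 3.1), {"pos": [2.5, 3.0, 1.6], "tilt_deg": 12.0})
```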
The proximity switches 31 are closely arranged and ordered on the floor of the experience space. When the positioning system combining the proximity switches 31 and the ultrasonic sensors 32 is temporarily interrupted, the inertial-sensor positioning system continues the position tracking until proximity-switch and ultrasonic positioning are restored, and the two systems are dynamically balanced by a fusion algorithm.
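The fusion algorithm is not named in the patent; the sketch below uses a simple complementary filter as one plausible realization. The blending gain alpha and the velocity integration are assumptions: the tracker coasts on inertial dead reckoning whenever the floor-grid fix is lost and blends back toward the absolute fix once it returns.

```python
import numpy as np

class FusedTracker:
    """Complementary-filter sketch of the proximity/ultrasonic + inertial
    fusion: coast on the IMU when the absolute fix drops out, blend back
    toward the fix when it is available."""

    def __init__(self, alpha: float = 0.8):
        self.alpha = alpha        # trust placed in the absolute grid fix
        self.pos = np.zeros(2)    # estimated (x, y) on the floor, metres
        self.vel = np.zeros(2)    # planar velocity integrated from the IMU

    def step(self, dt: float, accel: np.ndarray, grid_fix=None) -> np.ndarray:
        self.vel += accel * dt                 # dead-reckon from acceleration
        predicted = self.pos + self.vel * dt
        if grid_fix is None:                   # fix temporarily lost:
            self.pos = predicted               # continue on inertial tracking
        else:                                  # fix available: dynamic balance
            self.pos = self.alpha * np.asarray(grid_fix) + (1 - self.alpha) * predicted
        return self.pos

tracker = FusedTracker()
tracker.step(0.02, np.array([0.0, 0.5]), grid_fix=np.array([1.0, 2.0]))
print(tracker.step(0.02, np.array([0.0, 0.5]), grid_fix=None))  # coasting
```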
The proximity switches 31 form two-dimensional positioning through their arrangement and ordering; their layout can be divided along an X axis and a Y axis, so that after the switches 31 are arranged, each proximity switch 31 forms a coordinate point for positioning a user.
When a user steps on a proximity switch 31, positioning follows the ordering of the switches 31, and the transmitting end paired with the switch 31 detects which specific user is present, realizing user-specific positioning.
The layout of the proximity switches 31 can be divided along an X axis and a Y axis, so that the fixed coordinate points formed by the switches 31 are (0, 0), (0, 1), (1, 1) and so on, continuing until the whole floor of the experience space is covered. The same user wears the reflecting ends of the same group. When a reflecting end treads on the floor of the experience space, the proximity switch 31 closest to it is triggered first, and the user is positioned by combining the triggered coordinate points. While the user moves, if the fix is lost, position tracking continues through the inertial-sensor positioning system until proximity-switch and ultrasonic positioning are restored, with the two systems dynamically balanced by the fusion algorithm. Meanwhile the ultrasonic transmitting end emits signals that are reflected by the reflecting ends to distinguish users: the bottom surfaces of different groups of reflecting ends carry different protrusions and recesses (identical within a group), which may be triangular, circular, square and so on, and may also differ in size. The echo reflected by each group therefore differs, and the transmitting end passes the differing echoes to the computing unit 4, which distinguishes the users. The number of users in the experience space can thus be expanded, further improving the VR interactive experience.
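Tying the sketches above together, one plausible handler for a single floor trigger is shown below: classify the echo to identify which user stepped on the switch, convert the switch's grid index to metres, and feed the result to that user's tracker as an absolute fix. It reuses classify_echo, CELL and FusedTracker from the earlier sketches, assumes the floor grid shares the ceiling grid's pitch, and the 50 Hz update interval is an assumption.

```python
import numpy as np

def on_floor_trigger(ix, iy, echo, templates, trackers, accel=None, dt=0.02):
    """Resolve one proximity-switch trigger (grid index ix, iy) into a
    per-user position update, reusing the helpers sketched above."""
    user_id = classify_echo(echo, templates)   # who stepped on this switch?
    fix = np.array([ix * CELL, iy * CELL])     # switch grid index -> metres
    if accel is None:
        accel = np.zeros(2)                    # no IMU sample on this tick
    return user_id, trackers[user_id].step(dt, accel, grid_fix=fix)
```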
When a user steps on a proximity switch 31, the computing unit 4 receives the signals of the proximity switch 31 and the ultrasonic sensor 32, distinguishes the user by the ultrasonic echo, and transmits the result over WiFi to the VR head display device 11, where the user's movement and positioning are displayed.
Referring to fig. 1-3, corresponding to the above embodiment 1, this embodiment further provides a large-space multi-person VR interactive experience method, which includes:
the user wears the VR equipment and holds the interactive handle 21, and information on the user's interactive actions is acquired and transmitted; two identical reflecting ends of the ultrasonic sensor 32 are worn on each user's soles, and the user's real-time position and position changes within the experience space are acquired and transmitted;
according to the user interaction information, the generated VR virtual scene is transmitted by the WiFi transmission device to the WiFi receiving device 12 on the target user's VR head display device 11 and displayed to the user in the VR head display; with the different reflective patches on the interactive handles 21 classified into groups, each group feeds the interactions of its user in the experience space back into the simulation of the VR scene in real time;
according to the user's real-time movement, the generated VR virtual scene is transmitted by the WiFi transmission device to the WiFi receiving device 12 on the target user's VR head display device 11 and displayed to the user in the VR head display; the combined positioning of the proximity switches 31 and ultrasonic sensors 32, together with the coordinated tracking of the inertial sensors, forms real-time movement tracking of multiple users, and the different groups of reflecting ends worn by different users reflect different signals, so each target user receives and is shown the VR virtual scene data intended for them;
according to the interaction tracking and real-time positioning of the users in the experience space, the collected and organized data are transmitted by the WiFi transmission device to the WiFi receiving device 12 on the target user's VR head display device 11 and displayed to the user in the VR head display, thereby realizing the interaction of multiple users in the experience space.
All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.

Claims (9)

1. A large-space multi-person VR interactive experience system, characterized by comprising a VR equipment unit, an interaction tracking unit, a multi-person real-time positioning unit and a computing unit;
the VR equipment unit comprises VR head display equipment and a WiFi receiving device;
the interaction tracking unit comprises interactive handles for the multiple VR users and a plurality of photoelectric distance-sensing tracking devices uniformly arranged at the top of the experience space; each photoelectric distance-sensing tracking device comprises a plurality of diffuse-reflection photoelectric switches, a plurality of positioning patches and a fixed suspended ceiling; the diffuse-reflection photoelectric switches are closely arranged and fixed in sequence in the suspended ceiling, with their sensing directions all perpendicular to the floor of the experience space, so that through their arrangement and ordering the switches form a two-dimensional positioning grid; the uniformly arranged tracking devices scan the interactive handles used by the different users in real time and follow the handles' positions to capture the users' actions, thereby acquiring data on the users' real-time interactions and transmitting it to the computing unit;
the multi-person real-time positioning unit performs in-venue real-time positioning of the sorted and classified users, acquires the users' real-time position information and transmits it to the computing unit; it comprises a plurality of proximity switches uniformly embedded in the floor of the experience space, together with ultrasonic sensors and inertial sensors installed in combination with the corresponding proximity switches; each ultrasonic sensor comprises a transmitting end and multiple groups of reflecting ends, the reflecting ends being detachably mounted on the users' soles; the bottoms of the different groups of reflecting ends carry protrusions and recesses of different shapes to distinguish users, so that after the transmitting end emits ultrasonic waves, reflecting ends with differently shaped bottoms return different echoes, from which users are identified and classified;
the computing unit computes the user's motion and interaction in the VR scene from the multi-person real-time positioning information and the interaction tracking information, and transmits the result through the WiFi transmission device for display in the VR head display worn by the target user.
2. The large-space multi-person VR interactive experience system of claim 1, wherein the diffuse-reflection photoelectric switches detect the positioning patches on the handle so as to calculate and simulate the orientation and interaction angle of the interactive handle, and the plurality of diffuse-reflection photoelectric switches detect different patches on the handle so as to simulate the tilted, vertical, horizontal and inverted states of the interactive handle.
3. The large-space multi-person VR interactive experience system of claim 2, wherein the layout of the diffuse-reflection photoelectric switches can be divided along an X axis and a Y axis, so that after the switches are arranged, each diffuse-reflection photoelectric switch forms a coordinate point for interactive positioning.
4. The large-space multi-person VR interactive experience system of claim 3, wherein the computing unit is further provided with a WiFi transmission device; as the user moves the handheld interactive handle, the diffuse-reflection photoelectric switches detect the movement, and the computing unit receives the switches' signals and transmits them over WiFi to the VR head display device, where the interaction is displayed to the user.
5. The large-space multi-person VR interactive experience system of claim 1, wherein the plurality of proximity switches are closely arranged and ordered on the floor of the experience space, and when the combined proximity-switch and ultrasonic-sensor positioning is temporarily lost, the inertial-sensor positioning system continues the position tracking until proximity-switch and ultrasonic positioning are restored, the two systems being dynamically balanced by a fusion algorithm.
6. The large-space multi-person VR interactive experience system of claim 5, wherein the proximity switches form two-dimensional positioning through their arrangement and ordering, and their layout is divided along an X axis and a Y axis, so that after the plurality of proximity switches are arranged, each proximity switch forms a coordinate point for positioning a user.
7. The large-space multi-person VR interactive experience system of claim 6, wherein, when a user steps on a proximity switch, positioning follows the ordering of the switches, and the transmitting end paired with the switch detects which specific user is present, realizing user-specific positioning.
8. The large-space multi-person VR interactive experience system of claim 7, wherein, when a user steps on a proximity switch, the computing unit receives the signals of the proximity switch and the ultrasonic sensor, distinguishes the user by the ultrasonic echo, and transmits the result over WiFi to the VR head display device, where the user's movement and positioning are displayed.
9. A large-space multi-person VR interactive experience method, characterized in that the method applies the large-space multi-person VR interactive experience system of any one of claims 1 to 8, the method comprising the following steps:
the user wears the VR equipment and holds an interactive handle, and information on the user's interactive actions is acquired and transmitted; two identical reflecting ends of the ultrasonic sensor are worn on each user's soles, and the user's real-time position and position changes within the experience space are acquired and transmitted;
according to the user interaction information, the generated VR virtual scene is transmitted by the WiFi transmission device to the WiFi receiving device on the target user's VR head display device and displayed to the user in the VR head display; with the different reflective patches on the interactive handles classified into groups, each group feeds the interactions of its user in the experience space back into the simulation of the VR scene in real time;
according to the user's real-time movement, the generated VR virtual scene is transmitted by the WiFi transmission device to the WiFi receiving device on the target user's VR head display device and displayed to the user in the VR head display; the combined positioning of the proximity switches and ultrasonic sensors, together with the coordinated tracking of the inertial sensors, forms real-time movement tracking of multiple users, and the different groups of reflecting ends worn by different users reflect different signals, so each target user receives and is shown the VR virtual scene data intended for them;
according to the interaction tracking and real-time positioning of the users in the experience space, the collected and organized data are transmitted by the WiFi transmission device to the WiFi receiving device on the target user's VR head display device and displayed to the user in the VR head display, thereby realizing the interaction of multiple users in the experience space.
CN202310964245.5A 2023-08-02 2023-08-02 Large-space multi-person VR interactive experience system and method Active CN116679834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310964245.5A CN116679834B (en) 2023-08-02 2023-08-02 Large-space multi-person VR interactive experience system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310964245.5A CN116679834B (en) 2023-08-02 2023-08-02 Large-space multi-person VR interactive experience system and method

Publications (2)

Publication Number Publication Date
CN116679834A CN116679834A (en) 2023-09-01
CN116679834B (en) 2023-10-24

Family

ID=87787673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310964245.5A Active CN116679834B (en) 2023-08-02 2023-08-02 Large-space multi-person VR interactive experience system and method

Country Status (1)

Country Link
CN (1) CN116679834B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6256016B1 (en) * 1997-06-05 2001-07-03 Logitech, Inc. Optical detection system, device, and method utilizing optical matching
WO2008035827A1 (en) * 2006-09-21 2008-03-27 Jongchul Kim Pedestrian navigation method and apparatus for using geographic information system
CN205581785U (en) * 2016-04-15 2016-09-14 向京晶 Indoor virtual reality interactive system of many people
CN106621324A (en) * 2016-12-30 2017-05-10 当家移动绿色互联网技术集团有限公司 Interactive operation method of VR game
WO2017122194A1 (en) * 2016-01-12 2017-07-20 Karma Touch 2016 Ltd. Method and system for transmitter parameter reconfiguration based on receiver spatial information
CN206391542U (en) * 2017-01-12 2017-08-11 深圳游视虚拟现实技术有限公司 A kind of reality-virtualizing game rocket launcher
US11073906B1 (en) * 2020-01-27 2021-07-27 Valve Corporation Eye tracking system for head-mounted display devices
CN115097939A (en) * 2022-06-30 2022-09-23 沉浸(北京)科技有限公司 Large-space multi-user VR interactive experience system and method
CN116440490A (en) * 2023-04-03 2023-07-18 维沃移动通信有限公司 Interactive control terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9218452B2 (en) * 2010-12-27 2015-12-22 General Electric Company Method and system to automatically load user settings to wireless ultrasound probe
US11011077B2 (en) * 2017-06-29 2021-05-18 Verb Surgical Inc. Virtual reality training, simulation, and collaboration in a robotic surgical system
US20190253700A1 (en) * 2018-02-15 2019-08-15 Tobii Ab Systems and methods for calibrating image sensors in wearable apparatuses

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6256016B1 (en) * 1997-06-05 2001-07-03 Logitech, Inc. Optical detection system, device, and method utilizing optical matching
WO2008035827A1 (en) * 2006-09-21 2008-03-27 Jongchul Kim Pedestrian navigation method and apparatus for using geographic information system
WO2017122194A1 (en) * 2016-01-12 2017-07-20 Karma Touch 2016 Ltd. Method and system for transmitter parameter reconfiguration based on receiver spatial information
CN205581785U (en) * 2016-04-15 2016-09-14 向京晶 Indoor virtual reality interactive system of many people
CN106621324A (en) * 2016-12-30 2017-05-10 当家移动绿色互联网技术集团有限公司 Interactive operation method of VR game
CN206391542U (en) * 2017-01-12 2017-08-11 深圳游视虚拟现实技术有限公司 A kind of reality-virtualizing game rocket launcher
US11073906B1 (en) * 2020-01-27 2021-07-27 Valve Corporation Eye tracking system for head-mounted display devices
CN115097939A (en) * 2022-06-30 2022-09-23 沉浸(北京)科技有限公司 Large-space multi-user VR interactive experience system and method
CN116440490A (en) * 2023-04-03 2023-07-18 维沃移动通信有限公司 Interactive control terminal

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Hu et al. Impact of virtual reality anatomy training on ultrasound competency development: A randomized controlled trial. PLOS ONE, Vol. 15, No. 11, pp. 1-12 *
Zhang Menghuan. Gesture recognition and auxiliary identity authentication based on CNN and ultrasonic sensing (基于CNN和超声传感的手势识别及辅助身份认证). Transducer and Microsystem Technologies (传感器与微系统), Vol. 41, No. 05, pp. 110-113 *
Lou Yan. A Practical Course on Virtual Reality and Augmented Reality (《虚拟现实与增强现实实用教程》). Beijing: China Machine Press (机械工业出版社), 2021, pp. 65-68 *

Also Published As

Publication number Publication date
CN116679834A (en) 2023-09-01

Similar Documents

Publication Publication Date Title
US11262841B2 (en) Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing
CN109754471A (en) Image processing method and device, storage medium, electronic equipment in augmented reality
US6417836B1 (en) Computer input device having six degrees of freedom for controlling movement of a three-dimensional object
WO2015180497A1 (en) Motion collection and feedback method and system based on stereoscopic vision
CN105608746A (en) Method for virtual realizing of reality
CN109671118A (en) A kind of more people's exchange methods of virtual reality, apparatus and system
KR20080069579A (en) Method and system for visualising virtual three-dimensional objects
EP1469424A1 (en) Image processing system, image processing apparatus, and display apparatus
CN105824416A (en) Method for combining virtual reality technique with cloud service technique
Piekarski et al. Augmented reality working planes: A foundation for action and construction at a distance
CN110928404B (en) Tracking system and related tracking method thereof
CN105797378A (en) Game video realizing method based on virtual reality technology
CA2694123A1 (en) Instant calibration of multi-sensor 3d motion capture system
Shin et al. Application of precise indoor position tracking to immersive virtual reality with translational movement support
CN105824417A (en) Method for combining people and objects through virtual reality technology
CN206819290U (en) A kind of system of virtual reality multi-person interactive
CN116679834B (en) Large-space multi-person VR interactive experience system and method
Zhu et al. Wii remote–based low-cost motion capture for automated assembly simulation
JP5658618B2 (en) Operation input device, program
CN109643182A (en) Information processing method and device, cloud processing equipment and computer program product
Wright et al. Leap motion performance in an augmented reality workspace: Integrating devices with an interactive platform
CN111047710B (en) Virtual reality system, interactive device display method, and computer-readable storage medium
JP7398227B2 (en) Work support systems and programs
CN105797379A (en) Game video processing method based on virtual reality technology
CN107145220A (en) Man-machine interaction self-adapting regulation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant