CN113841416A - Interactive immersive cave network - Google Patents

Interactive immersive cave network

Info

Publication number
CN113841416A
CN113841416A
Authority
CN
China
Prior art keywords
cave
request
primary
user
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080003847.9A
Other languages
Chinese (zh)
Inventor
陈震宇
陆传杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Luoyong Technology Development Co ltd
Original Assignee
Luoyong Technology Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Luoyong Technology Development Co Ltd
Publication of CN113841416A
Legal status: Pending (current)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04N7/157Conference systems defining a virtual conference space and using avatars or agents

Abstract

An immersive interactive CAVE system for teleconferencing or multi-person interaction enables users at remote CAVE sites to participate without storing any 3D content data locally at their own sites. Each local site is configured with a hardware system comprising a CAVE setup and a router for connecting to a distributed Virtual Private Cloud (VPC) service. The system also includes two subsystems: i) a streaming and synchronization system, and ii) a collaborative interactive system.

Description

Interactive immersive cave network
Cross reference to related applications
This application claims the benefit of U.S. Provisional Application No. 62/855,660, filed May 31, 2019.
Technical Field
The present invention relates generally to immersive virtual reality environments, and more particularly to an interactive immersive CAVE network for teleconferencing and multi-user interaction.
Background
A Cave Automatic Virtual Environment (CAVE) is an immersive virtual reality environment in which projectors are directed at between three and six walls of a room-sized cube. A CAVE is typically a video theater located within a larger room. The walls of a CAVE are typically composed of rear-projection screens or flat panel displays. The floor may be a downward-projection screen, a bottom-projection screen, or a flat panel display. Because of the close viewing distance, projection systems are typically high resolution, requiring very small pixel sizes to maintain realism. The user wears 3D glasses inside the CAVE to view the 3D graphics generated by the CAVE. Users in a CAVE see objects that appear to float in the air and can walk around them to view them much as they would appear in reality.
Early CAVE frames had to be made of non-magnetic materials such as wood to minimize interference with the electromagnetic sensors; the adoption of infrared tracking has eliminated this limitation. The motion of a CAVE user is tracked by sensors typically mounted on the 3D glasses, and the video is continuously adjusted to maintain the viewer's perspective. A computer controls both this tracking and the audio aspects of the CAVE. Multiple speakers are often placed at multiple angles in the CAVE to provide 3D sound that complements the 3D video.
However, for teleconferencing, existing teleconferencing software supports only desktop sharing. This poses a technical obstacle to sharing immersive displays such as CAVEs between devices during a teleconference. Moreover, creating a CAVE-like environment in a teleconference session requires a large amount of 3D data. The storage and management of such large-scale 3D data is very expensive, especially given the multiple sessions and locations involved in such teleconferences, and the teleconference host and participants must synchronize this oversized 3D data for visualization.
Furthermore, interactions by multiple remote users on large-scale 3D content can currently be performed only through head-mounted VR devices. This limitation makes an interactive immersive CAVE in a teleconference session difficult to achieve, and may require head-mounted VR devices to be provided for all participants, further increasing the cost of such conferences.
Embodiments of the present invention seek to solve or address one or more of the identified technical problems.
Disclosure of Invention
Aspects of the present invention overcome the disadvantages of prior approaches by providing various modes of connectivity for linking and synchronizing interactive content across a remote CAVE network.
In another aspect, embodiments of the invention address the problem that arises when a user at a remote CAVE does not have the 3D content data locally at their own remote site. In that case, if the user wishes to visualize and interact with 3D content from the host site, the remote site would otherwise need to download all of the 3D content from the host site, an operation that is very time consuming due to the large amount of data.
In another aspect, embodiments of the present invention overcome problems that arise when multiple users interact with the same set of 3D content and their interactions cause synchronization conflicts. For example, one user may rotate a 3D object clockwise while another user simultaneously rotates the same 3D object counter-clockwise.
In one aspect, embodiments of the present invention provide a visual hardware and software system for experiencing an immersive virtual reality environment while enabling interaction with remote users in a teleconference session.
Drawings
It will be appreciated by those of ordinary skill in the art that the elements in the figures are illustrated for simplicity and clarity and are not necessarily shown in every instance, to avoid obscuring aspects of the invention. For example, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to provide a less obstructed view of the various embodiments of the present disclosure. It will further be appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence, while those skilled in the art will understand that such specificity with respect to sequence is not actually required. It will also be understood that the terms and expressions used herein are to be given their ordinary meanings with respect to their corresponding areas of inquiry and study, except where specific meanings have otherwise been set forth herein.
FIG. 1 is an exemplary diagram illustrating a CAVE system setup connected to an interactive CAVE system according to one embodiment of the invention.
FIG. 2 is a diagram illustrating a network configuration with multiple master CAVEs streaming visual displays to multiple slave CAVEs, according to one embodiment of the present invention.
FIG. 3 is a flow diagram illustrating a lock and wait method according to one embodiment of the invention.
FIG. 4 is a diagram illustrating a portable computing device according to one embodiment of the invention.
FIG. 5 is a diagram illustrating a remote computing device according to one embodiment of the invention.
Detailed Description
The present invention will now be described more fully with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific exemplary embodiments by which the invention may be practiced. It is to be understood that both the foregoing general description and the following detailed description present embodiments and are intended to provide an overview or framework for understanding the nature and character of the invention as it is claimed. This invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. Among other things, the present invention may be embodied as methods, systems, computer-readable media, apparatuses, or devices. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. The following detailed description is, therefore, not to be taken in a limiting sense.
The present invention provides, through the following aspects, that a remote CAVE system without up-to-date 3D data content can automatically update its 3D data content when connected to a Virtual Private Cloud (VPC), without users having to travel to the physical location of the 3D data content and the CAVE. This is advantageous for various applications such as architectural environments, immersive gaming environments, interactive 3D video conferencing, and the like.
All remote CAVE systems, with or without local 3D data content, can interact with and manipulate the same 3D data content in their local immersive environments. The interaction modes include:
voice conferencing;
touch interaction; and
multiplayer video game interaction.
In one aspect, embodiments of the invention may enable each local site to configure a hardware system including a CAVE setup and a router for connecting to an allocated Virtual Private Cloud (VPC) service. The software system of embodiments of the invention may comprise two subsystems: i) a streaming and synchronization system, and ii) a collaborative interactive system.
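As an illustrative sketch only (the names SiteConfig and CaveRole, the field set, and the endpoint address are assumptions for this example, not details from the patent), a per-site configuration combining the CAVE setup with its VPC connection might look like:

```python
from dataclasses import dataclass
from enum import Enum

class CaveRole(Enum):
    MASTER = "master"   # holds 3D content data; streams renders to slaves
    SLAVE = "slave"     # displays streamed frames; holds no local 3D data

@dataclass
class SiteConfig:
    """Hypothetical per-site configuration: CAVE setup plus VPC router."""
    site_name: str
    role: CaveRole
    num_walls: int = 4                      # CAVEs typically use 3 to 6 walls
    vpc_endpoint: str = "vpc.example:9000"  # placeholder VPC service address

# Example: a master host site and a slave branch site on the same VPC.
host_site = SiteConfig("host-site", CaveRole.MASTER)
branch_site = SiteConfig("branch-site", CaveRole.SLAVE)
```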
For example, the streaming and synchronization system may include features such as: a) remotely displaying, in a slave CAVE, the same 3D virtual reality environment streamed from a master CAVE, without transferring the 3D data to the slave CAVE; and b) automatically synchronizing the 3D data stored in all connected master CAVEs.
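As a concrete but hypothetical sketch of this split (class and method names such as MasterCave, SlaveCave, and apply_interaction are assumptions; a real deployment would send frames and sync messages over the network rather than in-process):

```python
class SlaveCave:
    """Displays streamed frames; holds no 3D content data locally."""
    def receive_frame(self, frame: bytes) -> None:
        print(f"displaying streamed frame ({len(frame)} bytes)")

class MasterCave:
    """Holds the 3D content data; streams renders and syncs peer masters."""
    def __init__(self) -> None:
        self.peers: list["MasterCave"] = []   # other masters in the VPC
        self.slaves: list[SlaveCave] = []     # slaves streaming from this master
        self.scene: dict[str, tuple] = {}     # 3D content: object id -> transform

    def render(self) -> bytes:
        # Stand-in for a real renderer: the point is that only pixels,
        # never the 3D data itself, leave a master toward its slaves.
        return repr(self.scene).encode()

    def stream_to_slaves(self) -> None:
        frame = self.render()
        for slave in self.slaves:
            slave.receive_frame(frame)

    def sync_scene(self, object_id: str, transform: tuple) -> None:
        # Called by peer masters to keep all masters' 3D data identical.
        self.scene[object_id] = transform
        self.stream_to_slaves()

    def apply_interaction(self, object_id: str, transform: tuple) -> None:
        self.scene[object_id] = transform
        for peer in self.peers:
            peer.sync_scene(object_id, transform)
        self.stream_to_slaves()
```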
In the collaborative interaction system of embodiments of the present invention, a real-time graphics engine uses a lock-and-wait method to resolve simultaneous interaction conflicts.
Referring to FIG. 1, an exemplary diagram illustrates a CAVE system setup connected to an interactive CAVE system, according to one embodiment of the invention. For example, the CAVE host as shown may include a projector along with a speaker and microphone system in the room. A person conducting a teleconference may stand on one or more of the screens, with interactive tools at hand to interact with virtual objects and the like.
With this arrangement, a particular network configuration may be required in order to obtain the benefits of aspects of the invention. For example, FIG. 2 is a diagram illustrating a network configuration having multiple master CAVEs streaming visual displays to multiple slave CAVEs, according to one embodiment of the invention. The lines in FIG. 2 denote the connections established between all of the CAVEs, where synchronization between the master CAVEs occurs through their connections to the cloud. All slave CAVEs stream from the nearest master CAVE in the network; for example, connections 1 and 2 indicate streaming from the nearest master CAVE to a slave CAVE. Note that no local 3D data exists in any slave CAVE.
In another embodiment, all of the master CAVEs (e.g., Master 1, Master 2, and Master 3) are connected to each other through the VPC, no matter where they are located in the network. The slave CAVEs (e.g., client CAVEs in a teleconferencing setting), however, connect to the nearest master CAVE in the local area network (e.g., via connection 1). In another embodiment, a slave CAVE may also be connected to the nearest master CAVE over a wider network (e.g., via connection 2).
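The description leaves "nearest" and "network distance" unspecified; as one plausible reading, the sketch below measures round-trip time to each master and picks the lowest (hop count or geographic region would work equally well; the host names and port are placeholders):

```python
import socket
import time

def measure_rtt(host: str, port: int, timeout: float = 2.0) -> float:
    """Estimate network distance as the TCP connect round-trip time."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return float("inf")   # unreachable masters can never be "nearest"

def nearest_master(master_hosts: list[str], port: int = 9000) -> str:
    """Pick the master CAVE a slave should stream from (lowest RTT)."""
    return min(master_hosts, key=lambda host: measure_rtt(host, port))

# Example (hypothetical addresses):
# print(nearest_master(["master1.example", "master2.example", "master3.example"]))
```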
Once the CAVEs are established using the network configuration shown in FIG. 2, aspects of the present invention incorporate a streaming and synchronization system. For example, a master CAVE is initialized by opening a streaming session, and any slave CAVE (i.e., one without 3D content data) can join the session at any time thereafter.
In another embodiment, a remote master CAVE (i.e., one with 3D content data) may join a streaming session for initial data synchronization to ensure data integrity. Note that once the session is initiated, no other master CAVE can join as a master CAVE.
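These join rules can be captured in a brief sketch, again with hypothetical names (StreamingSession, join_as_master, join_as_slave) and building on the MasterCave and SlaveCave classes sketched earlier; the patent does not prescribe this structure:

```python
class StreamingSession:
    """Hypothetical session object enforcing the join rules above."""
    def __init__(self, initiating_master: MasterCave) -> None:
        self.local_master = initiating_master   # the "local" master (see below)
        self.masters = [initiating_master]
        self.initiated = False

    def join_as_master(self, master: MasterCave) -> None:
        if self.initiated:
            raise RuntimeError("no master CAVE may join once the session is initiated")
        # Initial data synchronization: copy the local master's 3D data.
        master.scene = dict(self.local_master.scene)
        for existing in self.masters:
            existing.peers.append(master)
            master.peers.append(existing)
        self.masters.append(master)

    def join_as_slave(self, slave: SlaveCave, nearest: MasterCave) -> None:
        nearest.slaves.append(slave)            # slaves may join at any time

    def initiate(self) -> None:
        self.initiated = True
```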
Further, as the streaming session begins, the rendering from each master CAVE is streamed to the slave CAVEs nearest to it by network distance. Upon receiving a user's interaction with the 3D data at any master CAVE, the local master CAVE will visually and digitally synchronize the 3D data to all other master CAVEs connected inside the VPC.
In one embodiment, the master CAVE that initializes the streaming session may be considered "local"; those master CAVEs that join the session later are considered "remote".
Then, when the user interacts with the 3D data from any slave CAVE, all master CAVEs will update the visual effect by streaming to their corresponding connected slave CAVEs, and the initiating slave CAVE will send an interaction signal to its connected (nearest) master CAVE. The master CAVE receiving the interaction signal will locally trigger UI event processing and treat the event as if the user had interacted with the 3D data from a master CAVE directly.
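Continuing the same hypothetical classes, the slave-initiated path reduces to forwarding the interaction signal; the receiving master then handles it exactly as if the interaction had occurred locally:

```python
class InteractiveSlaveCave(SlaveCave):
    """A slave never mutates 3D data itself; it only forwards interaction signals."""
    def __init__(self, master: MasterCave) -> None:
        self.master = master   # the nearest master CAVE this slave streams from

    def on_user_interaction(self, object_id: str, transform: tuple) -> None:
        # Forward the interaction signal to the connected (nearest) master,
        # which syncs its peer masters and re-streams to all connected slaves.
        self.master.apply_interaction(object_id, transform)
```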
FIG. 3 is a flow diagram illustrating a lock-and-wait method according to one embodiment of the invention. In another embodiment, FIG. 3 illustrates the second subsystem, wherein the interactive system may include a real-time graphics engine that uses a lock-and-wait method to resolve simultaneous interaction conflicts, as shown in the flow diagram. In one embodiment, the flow diagram begins with a user event request, which may come from a user who wishes to interact with an object.
In one example, the wait time may be 5 seconds. In another embodiment, the wait time may be a function of the computing power of the master/slave CAVEs: a more powerful system may have a shorter wait time, while a less powerful system may have a longer wait time.
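A minimal sketch of the lock-and-wait flow, assuming a per-object lock, the 5-second wait from the example above, and hypothetical names (InteractiveObject, try_interact); the patent describes this flow only at the level of the FIG. 3 flowchart:

```python
import threading

WAIT_TIMEOUT = 5.0   # example wait time from the text; could be tuned to
                     # the computing power of the master/slave CAVEs

class InteractiveObject:
    """An interactive 3D item guarded by a lock-and-wait policy."""
    def __init__(self, name: str) -> None:
        self.name = name
        self._lock = threading.Lock()

    def try_interact(self, user: str, apply_fn) -> bool:
        # Wait for the object to leave its locked state, up to the timeout.
        if not self._lock.acquire(timeout=WAIT_TIMEOUT):
            print(f"{user}: object '{self.name}' still locked; event rejected")
            return False
        try:
            apply_fn()            # e.g. rotate the object clockwise
            # ...synchronize visual elements across all CAVE systems here...
        finally:
            self._lock.release()  # release the item from the locked state
        return True

# Example: two users contending for the same object; the second waits
# (up to WAIT_TIMEOUT) until the first interaction has been applied.
obj = InteractiveObject("engine-model")
obj.try_interact("user-a", lambda: print("user-a rotates clockwise"))
obj.try_interact("user-b", lambda: print("user-b rotates counter-clockwise"))
```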
FIG. 4 is a high-level illustration of a portable computing device 801 communicating with a remote computing device 841; applications may be stored and accessed in a variety of ways. Further, an application may be obtained in various ways, such as from an application store, a website, or a store Wi-Fi system. Various versions of the application may be created to support different computing devices, different languages, and different API platforms.
In one embodiment, the portable computing device 801 may be a mobile device 112 that operates using a portable power supply 855, such as a battery. The portable computing device 801 may also have a display 802, which may or may not be a touch-sensitive display. More specifically, the display 802 may have, for example, a capacitive sensor that may be used to provide input data to the portable computing device 801. In other embodiments, an input pad 804, such as arrow keys, a scroll wheel, or a keyboard, may be used to provide input to the portable computing device 801. In addition, the portable computing device 801 may have a microphone 806 that may accept and store spoken data, a camera 808 that may accept images, and a speaker 810 that may output sound.
The portable computing device 801 may be capable of communicating with a computing device 841 or with multiple computing devices 841 that make up a cloud of computing devices 811. The portable computing device 801 may be capable of communicating in a variety of ways. In some embodiments, the communication may be wired, for example via an Ethernet cable, a USB cable, or an RJ6 cable. In other embodiments, the communication may be wireless, such as through Wi-Fi (the 802.11 standard), Bluetooth, cellular communication, or near field communication devices. The communication may be directly to the computing device 841, or may be through a communication network 102 such as a cellular service, the Internet, a private network, Bluetooth, etc. FIG. 4 may be a simplified illustration of the physical elements that make up the portable computing device 801, while FIG. 5 may be a simplified illustration of the physical elements that make up the server-type computing device 841.
FIG. 4 may show a sample portable computing device 801 that is physically configured as part of the system. The portable computing device 801 may have a processor 850 that is physically configured according to computer-executable instructions. It may have a portable power supply 855, such as a rechargeable battery. It may also have a sound and video module 860 that helps display video and play sound, and which may be turned off when not in use to conserve power and battery life. The portable computing device 801 may also have volatile memory 865 and non-volatile memory 870. It may have GPS capability 880, which may be a separate circuit or may be part of the processor 850. There may also be an input/output bus 875 that routes data to and from the various user input devices, such as the microphone 806, the camera 808, and other inputs such as the input pad 804, the display 802, and the speaker 810. The input/output bus 875 may also control communication with a network through wireless or wired devices. Of course, this is only one embodiment of the portable computing device 801, and the number and types of portable computing devices 801 are limited only by the imagination.
As a result of the system, better information may be provided to the user at the point of sale. The information may be user specific and may be required to exceed a relevance threshold. As a result, the user can make better-informed decisions. The system not only accelerates processing, but also uses the computing system to obtain better results.
The physical elements that make up the remote computing device 841 may be further illustrated in FIG. 5. At a high level, the computing device 841 may include digital storage such as magnetic disks, optical disks, flash storage, non-volatile storage, and the like. Structured data may be stored in the digital storage, such as in a database. The server 841 may have a processor 1000 that is physically configured according to computer-executable instructions. It may also have a sound and video module 1005 that facilitates the display of video and the playing of sound, and which may be turned off when not in use to conserve power and battery life. The server 841 may also have volatile memory 1010 and non-volatile memory 1015.
The database 1025 may be stored in the memory 1010 or 1015, or may be separate. The database 1025 may also be part of a cloud of computing devices 841 and may be stored in a distributed manner across a plurality of computing devices 841. There may also be an input/output bus 1020 that routes data to and from the various user input devices, such as the microphone 806, the camera 808, and inputs such as the input pad 804, the display 802, and the speaker 810. The input/output bus 1020 may also control communication with a network through wired or wireless devices. In some embodiments, the application may be on the local computing device 801, and in other embodiments the application may be on the remote computing device 841. Of course, this is only one embodiment of the server 841, and the number and types of computing devices 841 are limited only by the imagination.
The user devices, computers, and servers described herein may be general purpose computers that may have, among other elements, a microprocessor (such as one from Intel Corporation or another processor vendor); volatile and non-volatile memory; one or more mass storage devices (i.e., hard disk drives); various user input devices, such as a mouse, a keyboard, or a microphone; and a video display system. The user devices, computers, and servers described herein may run on any of a number of commercially available operating systems; indeed, it is contemplated that any suitable operating system may be used with the present invention. The servers may be clusters of web servers, each of which may be based on a common operating system and supported by a load balancer that decides which web server in the cluster should process a request based on the current request load of the available servers.
The user devices, computers, and servers described herein may communicate via networks including the Internet, WANs, LANs, Wi-Fi, other computer networks (now known or later devised), and/or any combination of the foregoing. Those of ordinary skill in the art, having the benefit of this disclosure, the drawings, and the claims, will appreciate that such networks may connect the various components over any combination of wired and wireless conduits, including copper, fiber optic, microwave, and other forms of radio frequency, electrical, and/or optical communication techniques. It should also be understood that any network may be connected to any other network in a different manner. The interconnections between computers and servers described in this system are examples; any device described herein may communicate with any other device via one or more networks.
The provided embodiments may include other devices and networks than those shown. Further, functions described as being performed by one device may be distributed and performed by two or more devices. Multiple devices may also be combined into a single device that may perform the functions of the combined devices.
The various participants and elements described herein can operate one or more computer devices to facilitate the functionality described herein. Any of the elements in the above figures, including any servers, user equipment, or databases, may use any suitable number of subsystems to facilitate the functionality described herein.
Any of the software components or functions described in this application may be implemented as software code or computer-readable instructions executable by at least one processor, using any suitable computer language (e.g., Java, C++, or Perl) and, for example, conventional or object-oriented techniques.
The software code may be stored as a series of instructions or commands on a non-transitory computer-readable medium, such as a random access memory (RAM), a read-only memory (ROM), a magnetic medium such as a hard drive or a floppy disk, or an optical medium such as a CD-ROM. Any such computer-readable medium may reside on or within a single computing device, or may exist on or within different computing devices within a system or network.
It will be appreciated that the invention as described above may be implemented in the form of control logic using computer software in a modular or integrated manner. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will know and appreciate other ways and/or methods to implement the present invention using hardware, software, or a combination of hardware and software.
The above description is illustrative and not restrictive. Many variations of the invention will become apparent to those skilled in the art upon reading the present disclosure. The scope of the invention should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents.
One or more features from any embodiment may be combined with one or more features of any other embodiment without departing from the scope of the invention. A recitation of "a", "an", or "the" is intended to mean "one or more" unless specifically indicated to the contrary. Unless specifically stated to the contrary, the expression "and/or" is intended in its broadest sense.
One or more elements of the present system may be claimed as means for performing a specified function. Where such means-plus-function elements are used to describe certain elements of a claimed system, those of ordinary skill in the art, having the present specification, drawings, and claims, will appreciate that the corresponding structure is a general purpose computer, processor, or microprocessor (as the case may be) programmed to perform the recited functions using functionality found in any general purpose computer, without special programming, and/or by implementing one or more algorithms to achieve the recited functions. As will be appreciated by one of ordinary skill in the art, such algorithms may be expressed within the present disclosure as mathematical formulas, flow charts, statements, and/or in any other manner that provides one of ordinary skill in the art with sufficient structure to implement the recited processes and their equivalents.
While this disclosure may be embodied in many different forms, the figures and discussion are presented with the understanding that the present disclosure is an exemplification of the principles of one or more inventions and does not limit any of the inventions to the embodiments illustrated.
The present disclosure provides a solution to the long-felt need described above. In particular, the systems and methods described herein may be configured to improve CAVE applications in teleconferencing settings. Additional advantages and modifications of the above-described systems and methods will readily occur to those skilled in the art. The disclosure, in its broader aspects, is therefore not limited to the specific details, representative systems and methods, and illustrative examples shown and described above. Various modifications and variations may be made to the above description without departing from the scope or spirit of the disclosure, and it is intended that the disclosure cover all such modifications and variations provided they fall within the scope of the appended claims and their equivalents.

Claims (1)

1. A system for resolving conflicts in Cave Automatic Virtual Environment (CAVE) interactions among a plurality of users, comprising:
receiving a user event request from a local CAVE, the user event request including a request from a user to interact with an object available in an interactive multi-person CAVE session connected via a computer network;
determining whether the object is currently in a locked state;
in response to determining that the object is not in a locked state, determining whether the request was sent from a primary CAVE, the primary CAVE comprising a CAVE system that initiated the session;
triggering a Graphical User Interface (GUI) event handler locally at the local CAVE in response to determining that the request was sent from a primary CAVE;
triggering another GUI event handler at the nearest primary CAVE in the computer network in response to determining that the request was sent from a system other than a primary CAVE;
issuing a command to place all interactive items in a locked state;
synchronizing visual elements in all CAVE systems; and
releasing all interactive items from the locked state.
CN202080003847.9A 2019-05-31 2020-05-30 Interactive immersive cave network Pending CN113841416A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201962855660P 2019-05-31 2019-05-31
US62/855,660 2019-05-31
PCT/IB2020/055147 WO2020240512A1 (en) 2019-05-31 2020-05-30 Collaborative immersive cave network

Publications (1)

Publication Number Publication Date
CN113841416A (en) 2021-12-24

Family

Family ID: 73552724

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080003847.9A Pending CN113841416A (en) 2019-05-31 2020-05-30 Interactive immersive cave network

Country Status (2)

Country Link
CN (1) CN113841416A (en)
WO (1) WO2020240512A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008125593A2 (en) * 2007-04-14 2008-10-23 Musecom Ltd. Virtual reality-based teleconferencing
US8615383B2 (en) * 2008-01-18 2013-12-24 Lockheed Martin Corporation Immersive collaborative environment using motion capture, head mounted display, and cave
WO2017165705A1 (en) * 2016-03-23 2017-09-28 Bent Image Lab, Llc Augmented reality for the internet of things
US20180158243A1 (en) * 2016-12-02 2018-06-07 Google Inc. Collaborative manipulation of objects in virtual reality

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030227487A1 (en) * 2002-06-01 2003-12-11 Hugh Harlan M. Method and apparatus for creating and accessing associative data structures under a shared model of categories, rules, triggers and data relationship permissions
US20120330913A1 (en) * 2011-06-24 2012-12-27 Salesforce.Com, Inc. Systems and methods for supporting transactional message handling
EP3489891A1 (en) * 2016-09-09 2019-05-29 Samsung Electronics Co., Ltd. Method and device for processing three-dimensional image
WO2018131803A1 (en) * 2017-01-10 2018-07-19 Samsung Electronics Co., Ltd. Method and apparatus for transmitting stereoscopic video content
US20180314322A1 (en) * 2017-04-28 2018-11-01 Motive Force Technology Limited System and method for immersive cave application

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MANJREKAR, SIDDHESH et al.: "CAVE: An Emerging Immersive Technology - A Review", 2014 UKSIM-AMSS 16th International Conference on Computer Modelling and Simulation, pages 131-136 *

Also Published As

Publication number Publication date
WO2020240512A1 (en) 2020-12-03

Similar Documents

Publication Publication Date Title
US11218522B1 (en) Data processing system and method using hybrid system architecture for image processing tasks
US7840638B2 (en) Participant positioning in multimedia conferencing
JP6377082B2 (en) Providing a remote immersive experience using a mirror metaphor
JP6961334B2 (en) Servers, information processing methods and programs
EP3962077A1 (en) System and method for the delivery of applications within a virtual environment
CN112243583A (en) Multi-endpoint mixed reality conference
US20220070241A1 (en) System and method enabling interactions in virtual environments with virtual presence
EP3961496A1 (en) Graphical representation-based user authentication system and method
EP3962076B1 (en) System and method for virtually broadcasting from within a virtual environment
US11651108B1 (en) Time access control in virtual environment application
JP2023082119A (en) Virtual scene information interaction method, device, electronic device, storage medium and computer program
EP3962078A1 (en) Ad hoc virtual communication between approaching user graphical representations
EP3961396A1 (en) System and method to provision cloud computing-based virtual computing resources within a virtual environment
US20230206571A1 (en) System and method for syncing local and remote augmented reality experiences across devices
CN113841416A (en) Interactive immersive cave network
Peake et al. The virtual experiences portals—a reconfigurable platform for immersive visualization
US11776227B1 (en) Avatar background alteration
JP7409467B1 (en) Virtual space generation device, virtual space generation program, and virtual space generation method
US11741652B1 (en) Volumetric avatar rendering
US20230334751A1 (en) System and method for virtual events platform
US11876630B1 (en) Architecture to control zones
Cohen et al. Directional selectivity in panoramic and pantophonic interfaces: Flashdark, Narrowcasting for Stereoscopic Photospherical Cinemagraphy, Akabeko Ensemble
WO2012053001A2 (en) Virtual office environment
WO2024037001A1 (en) Interaction data processing method and apparatus, electronic device, computer-readable storage medium, and computer program product
US20240031182A1 (en) Access control in zones

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40062110
Country of ref document: HK

SE01 Entry into force of request for substantive examination