WO2023201534A1 - A system and method for facilitating a virtual event - Google Patents

A system and method for facilitating a virtual event

Info

Publication number
WO2023201534A1
Authority
WO
WIPO (PCT)
Prior art keywords
event
virtual
facilitating
members
interactive data
Prior art date
Application number
PCT/CN2022/087746
Other languages
French (fr)
Inventor
Kin Wang Chau
Original Assignee
Muxic Limited
Priority date
Filing date
Publication date
Application filed by Muxic Limited filed Critical Muxic Limited
Priority to PCT/CN2022/087746 priority Critical patent/WO2023201534A1/en
Publication of WO2023201534A1 publication Critical patent/WO2023201534A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/475 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data

Definitions

  • the present invention relates to a system and method for facilitating a virtual event, and particularly, although not exclusively, to a system and method for facilitating a virtual event whereby a host and the participants may interact with each other.
  • a method for facilitating a virtual event comprising the steps of:
  • - receiving, at a virtual event platform, an event stream from an event host for streaming to a participant group;
  • - transmitting, by a participant streaming gateway, the event stream to one or more members of the participant group; and
  • - receiving, at an interaction server, interactive data from the one or more members of the participant group and transmitting the interactive data to the event host.
  • the interactive data is transmitted to the one or more members of the participant group.
  • the interactive data is processed before it is transmitted to the event host.
  • the interactive data is transmitted to selected members of the participant group.
  • the interactive data includes biometric data of the one or more members of the participant group.
  • the interactive data includes motion data arranged to represent the motion of the one or more members of the participant group.
  • the interactive data includes visual and/or audio data captured by the one or more members of the participant group.
  • the visual and/or audio data is processed to animate a mouth of an avatar associated with each of the one or more members of the participant group.
  • the motion data is processed to animate the avatar associated with each of the one or more members of the participant group.
  • the motion data is further processed to determine an action or instruction of the one or more members of the participant group.
  • the interactive data is processed to determine one or more emotional attributes of the participant group.
  • the interactive data is processed with a learning network arranged to classify the interactive data to determine the one or more emotional attributes of the participant group.
  • a system for facilitating a virtual event comprising:
  • - a virtual event platform arranged to receive an event stream from an event host for streaming to a participant group;
  • - a participant streaming gateway arranged to transmit the event stream to one or more members of the participant group; and
  • an interaction server arranged to receive interactive data from the one or more members of the participant group and to transmit the interactive data to the event host.
  • the interactive data is transmitted to the one or more members of the participant group.
  • the interactive data is processed before it is transmitted to the event host.
  • the interactive data is transmitted to selected members of the participant group.
  • the interactive data includes biometric data of the one or more members of the participant group.
  • the interactive data includes motion data arranged to represent the motion of the one or more members of the participant group.
  • the interactive data includes visual and/or audio data captured by the one or more members of the participant group.
  • the visual and/or audio data is processed to animate a mouth of an avatar associated with each of the one or more members of the participant group.
  • the motion data is processed to animate the avatar associated with each of the one or more members of the participant group.
  • the motion data is further processed to determine an action or instruction of the one or more members of the participant group.
  • the interactive data is processed to determine one or more emotional attributes of the participant group.
  • the interactive data is processed with a learning network arranged to classify the interactive data to determine the one or more emotional attributes of the participant group.
  • the virtual event is facilitated within a virtual world and the interaction server is separately implemented and controlled from the virtual world.
  • Figure 1 is a schematic block diagram of a computer system arranged to be implemented to operate a system for facilitating a virtual event in accordance with one embodiment of the present invention
  • Figure 2 is a block diagram illustrating an embodiment of a system for facilitating a virtual event;
  • Figure 3 is a block diagram illustrating another embodiment of a system for facilitating a virtual event;
  • Figure 4 is a block diagram illustrating the interactive data received by the virtual host in accordance with the system for facilitating a virtual event of Figures 2 or 3;
  • Figure 5 is a block diagram illustrating the interactive data provided by the virtual event participants or audience members in accordance with the system for facilitating a virtual event of Figures 2 or 3;
  • Figure 6 is a block diagram illustrating an example interaction server of the system for facilitating a virtual event of Figures 2 or 3.
  • Referring to FIG. 1, an embodiment of a computer system 100 is illustrated.
  • This embodiment of the computer system 100 is arranged to provide a system for facilitating a virtual event comprising:
  • - a virtual event platform arranged to receive an event stream from an event host for streaming to a participant group;
  • - a participant streaming gateway arranged to transmit the event stream to one or more members of the participant group; and
  • an interaction server arranged to receive interactive data from the one or more members of the participant group and to transmit the interactive data to the event host.
  • the virtual event platform, participant streaming gateway and interaction server are implemented by one or more computers or computing systems having an appropriate user interface.
  • the computer may be implemented by any computing architecture, including portable computers, tablet computers, stand-alone Personal Computers (PCs), smart devices, Internet of Things (IoT) devices, edge computing devices, client/server architecture, "dumb" terminal/mainframe architecture, cloud-computing based architecture, or any other appropriate architecture.
  • the computing device may be appropriately programmed to implement the invention.
  • the system and method for facilitating a virtual event are arranged to facilitate a virtual event such as a music concert, opera, drama, play, Chinese opera, presentation, lecture, seminar, religious sermon, training, consultation, therapeutic treatments, conference, meeting, sports training, competition, counseling or operation between an event host and one or more participants.
  • the virtual event may be facilitated within a virtual environment or virtual world, sometimes referred to as a virtual world platform, virtual world, virtual reality, virtual environment, virtual universe or metaverse which may include, for example, a three-dimensional virtual world that is created and operated by computer systems as an open world or open map virtual environment.
  • virtual world or virtual environment may also include other forms of online or computing environments which may not necessarily be three-dimensional, or even graphics based, such as chatrooms, teleconference platforms, online data exchange services, online messaging services or telephony/communication services or platforms.
  • embodiments of the system and method may be arranged to include an interaction server, or provide an interaction service whereby the host of the virtual event may be able to interact with the participants.
  • the host which may be a performer, artist and/or their supporting teams may be performing, presenting, speaking, singing, acting or otherwise hosting the event, and the participants may in turn provide their interaction or feedback back to the host as part of their participation within the virtual event.
  • the participants may be able to use the interaction server or interaction service to interact with the host by submitting interactive data to the host.
  • This interaction may include text messages, gifts in the forms of digital tokens, digital gifts or payments, artworks or icons, or voice/sound data, video data, location data, movement data or biometric data.
  • such examples of interactive data may also be further processed by examples of the system for facilitating a virtual event such that it may be made more meaningful and useful for the event host to react to the interactive data, either in the next event, or preferably, in real time.
  • the performer or host may be able to deliver their performance within the virtual world or metaverse, followed by requesting that the participants or audience undertake certain interactive actions, such as asking questions, singing along, or moving their bodies and dancing to the music.
  • the host is able to determine the level or type of interactive actions undertaken by the audience, and thus adjust or adapt their performance accordingly.
  • Embodiments of the present invention may be advantageous as a virtual event may be made more interactive by the event host, who will be able to monitor or gauge the participants' responses and reactions. Furthermore, by facilitating and monitoring participation within the virtual event through the performance of various tasks, such as singing along, clapping, or following specific motion routines, the event host may be able to involve the participants more within the virtual event and thus improve their experience and immersion within the event.
  • Referring to FIG. 1, there is shown a schematic diagram of a computer system or computer server 100 which is arranged to be implemented as an example embodiment of a system for facilitating a virtual event.
  • the computer system comprises a server 100 which includes suitable components necessary to receive, store and execute appropriate computer instructions.
  • the components may include a processing unit 102, including Central Processing Units (CPUs), a Math Co-Processing Unit (Math Processor), Graphics Processing Units (GPUs) or Tensor Processing Units (TPUs) for tensor or multi-dimensional array calculations or manipulation operations, read-only memory (ROM) 104, random access memory (RAM) 106, and input/output devices such as disk drives 108 and input devices 110 such as an Ethernet port, a USB port, etc.
  • The components may further include a display 112, such as a liquid crystal display, a light-emitting display or any other suitable display, and communications links 114.
  • the server 100 may include instructions that may be included in ROM 104, RAM 106 or disk drives 108 and may be executed by the processing unit 102.
  • a plurality of communication links 114 may variously connect to one or more computing devices such as a server, personal computers, terminals, wireless or handheld computing devices, Internet of Things (IoT) devices, smart devices, edge computing devices.
  • At least one of a plurality of communications link may be connected to an external computing network through a telephone line or other type of communications link.
  • the server 100 may include storage devices such as a disk drive 108 which may encompass solid state drives, hard disk drives, optical drives, magnetic tape drives or remote or cloud-based storage devices.
  • the server 100 may use a single disk drive or multiple disk drives, or a remote storage service.
  • the server 100 may also have a suitable operating system 116 which resides on the disk drive or in the ROM of the server 100.
  • the computer or computing apparatus 100 may also provide the necessary computational capabilities to operate or to interface with a machine learning network, such as a neural network, to provide various functions and outputs.
  • the neural network may be implemented locally, or it may also be accessible or partially accessible via a server or cloud-based service.
  • the machine learning network may also be untrained, partially trained or fully trained, and/or may also be retrained, adapted or updated over time.
  • the computer or computing apparatus 100 may also provide the necessary computational and communication capabilities to operate, either in part or completely, as a host server or as a function provider server for virtual reality environments, which may also be referred to as virtual reality worlds, virtual worlds, virtual environments, gaming environments, gaming worlds, augmented reality worlds, the metaverse, or any similar terms which represent similar virtual worlds, augmented reality worlds or virtual universes.
  • the computer or computing apparatus 100 may be implemented to operate such virtual worlds or universes and allow for various activities by users who may enter (log in to) these virtual worlds, including gaming, meetings, concerts, operas or other events, virtual or online activities, explorations, social activities, and the import, export, trade or exchange of real or virtual properties, currencies or assets that may exist within the real or virtual worlds or universes, including virtual currencies or assets which may be usable on multiple platforms and across multiple real or virtual worlds or universes.
  • Such virtual currencies, property or assets may be supported by various ledger systems, such as open or distributed ledgers or blockchain technologies, which may operate with the virtual worlds or universes.
  • Referring to FIG. 2, there is illustrated a block diagram of an example embodiment of a system for facilitating a virtual event 200 comprising: a virtual event platform 204 arranged to receive an event stream from an event host 212 for streaming to a participant group; a participant or event streaming gateway 208 arranged to transmit the event stream to one or more members of the participant group 210; and an interaction server 206 arranged to receive interactive data from the one or more members of the participant group 210 and to transmit the interactive data to the event host 212.
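  • Purely as a non-limiting illustration, the three components recited above may be sketched as in-memory Python objects; the class and method names below are assumptions made for this sketch and are not defined by the specification.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class StreamingGateway:
    """Participant streaming gateway 208: fans the event stream out to members."""
    participants: Dict[str, List[bytes]] = field(default_factory=dict)

    def join(self, participant_id: str) -> None:
        # Register a participant with an empty receive queue.
        self.participants[participant_id] = []

    def broadcast(self, chunk: bytes) -> None:
        # Deliver one chunk of the event stream to every joined participant.
        for queue in self.participants.values():
            queue.append(chunk)

@dataclass
class VirtualEventPlatform:
    """Virtual event platform 204: receives the host's stream and forwards it."""
    gateway: StreamingGateway

    def receive_from_host(self, chunk: bytes) -> None:
        self.gateway.broadcast(chunk)

@dataclass
class InteractionServer:
    """Interaction server 206: collects interactive data for the event host."""
    host_inbox: List[dict] = field(default_factory=list)

    def submit(self, participant_id: str, data: dict) -> None:
        self.host_inbox.append({"from": participant_id, **data})
```

In this sketch the downstream path (host → platform → gateway → participants) and the upstream path (participants → interaction server → host) are deliberately separate objects, mirroring the separation recited in the claims.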
  • the system for facilitating a virtual event 200 includes a virtual world, virtual environment or metaverse service 202, which may be provided by one or more computer systems arranged to provide a virtual reality world service 202 or any other online data/communication exchange or teleconference services for users.
  • In a virtual world or environment 202, users may log into the virtual environment service 202 as an avatar, virtual character or other form of virtual representation of the user so as to access the virtual world, virtual environment or metaverse 202.
  • Such services may include, for example, and without limitations, metaverse services such as those of Sandbox, Decentraland, Fortnite or any other metaverse or virtual reality environments that allow users to log into and to explore, manipulate and perform interactions or transactions with other users or operators of the virtual world or virtual environment by controlling an avatar, virtual character or user reference.
  • One such interaction may include activities such as the operation or delivery of a virtual event by a virtual event host 212, with one or more virtual event participants 210 who may join the virtual event within the virtual world 202 by logging into the virtual world 202 and accessing the virtual event.
  • Such a virtual event may include, without limitations, music concerts, operas, Chinese operas, dramas, plays, presentations, lectures, seminars, religious sermons, consultations, conferences, therapeutic treatments, counseling, meetings or operations between an event host and one or more participants. These events may occur in real time (live) or they may be pre-recorded and broadcast at a predetermined time.
  • the virtual event may be performed or conducted by use of a virtual event platform 204 which may be integrated within the virtual world service 202 itself.
  • the virtual event platform 204 is arranged to receive a content input from the virtual event host 212 once the host 212 has connected or logged into the virtual world service 202.
  • the virtual event platform 204 is arranged to receive the content input from the host 212, and then provide this content input to the participant streaming gateway 208, which is arranged to stream the content to each of the one or more members 210 who are participating within the virtual event.
  • the virtual event host 212 may be a performer such as a singer or artist, or may be a presenter presenting a lecture, sermon or speech to the audience participants 210.
  • the event host 212 may firstly log into the virtual world 202 and access the virtual event platform 204, and then proceed to begin the virtual event by streaming content from a host interface to the virtual event platform 204.
  • the host interface may be an electronic or communication device having the function of connecting to the virtual event platform on behalf of the host, and preferably, would include various audio and visual equipment so as to capture images or audio of the host in delivering this virtual event.
  • Audience participants 210 may then also log into the virtual world 202 so as to view or otherwise participate in the virtual event with a participant or user device.
  • the participant or user device may include an interface providing visual and audio signals and would have computing and telecommunications capabilities to connect into the virtual world 202 and to receive the media stream from the virtual host 212.
  • the media stream from the host 212 may be streamed to the participants device or devices.
  • This device may include smartphones, computers, wearable devices, virtual reality goggles, smart glasses, smart devices, or headphones that may at least be able to receive part or all of the media stream from the host 212.
  • the user device may also be referred to as an entry point to the virtual world 202 as it allows the user to connect and enter the virtual world 202.
  • media inputs from the host 212 may be streamed to the virtual event platform 204, which will in turn be streamed to the audience participants or members of the participant groups 210 via the participant/event streaming gateway 208.
  • the members of the participants 210 may then receive the media stream from the host 212 and would be able to listen, watch or otherwise experience the content that is delivered by the event host 212.
  • For example, the audience 210 may receive a stream of the music or song; the audience 210 may also receive a stream of images, either recorded live or pre-produced.
  • The system also includes an interaction server 206, or interaction service, which is arranged to receive interactive data from the one or more audience members 210 and in turn process this data for distribution to other audience members 210 or to the event host 212.
  • the virtual event platform 204, participant streaming gateway 208 and the interaction server 206 may be implemented within the virtual world 202 itself.
  • the virtual world 202 may include programmable functions which allow an operator to specifically program the functions of the virtual event platform 204, the participant streaming gateway 208 and interaction server 206 within the virtual world environment 202 such that the computing systems which operate the virtual world 202 would provide the functions of the virtual event platform 204, participant streaming gateway 208 and the interaction server 206 itself.
  • the host 212 may firstly connect to the virtual world 202 with their avatar or character so as to stream their content into the virtual world 202. This content will in turn be streamed to the audience participants 210. The audience participants 210 may use their user devices to log in to the virtual world 202 and participate in this virtual event.
  • This event may be a virtual concert which allows the audience members 210 to listen and watch the host 212 who may be the performer of this concert and may take the form whereby the virtual event platform 204 has a prebuilt space within the virtual world 202 (such as a virtual concert hall) and thus allowing both the host 212 and the audience 210 to interact within the virtual world 202 at the virtual concert hall so as to participate in this virtual concert.
  • Other types of virtual spaces may also be arranged to be a suitable virtual connection point for different types of virtual events.
  • each audience member 210 may then interact or otherwise provide feedback in the form of interactive data to the host 212 and to other audience members 210 by using functions provided within their own user devices.
  • the type of interactive data 400 may include biometric data, sound or visual data, text or other digital selection data, and movement data.
  • Each of the specific groups of data may be provided by each audience participant 210 by use of their user device and any connected peripheral devices.
  • the interaction server 206 may perform some processing of this data including analysis of the data 400 so as to provide summaries of various statuses of the various audience members 210 so that the host 212 may interpret the data quickly and respond accordingly.
  • Such embodiments may be advantageous to the event host 212 and audience members 210, as the event host 212 may be able to quickly review the feedback and interaction of the audience members 210 and in turn adjust, adapt or change their performance or operation of the virtual event to improve or further enhance the experience of each of the audience members 210.
  • The host 212 may also specifically request that the audience participants 210 interact with the host 212 via the virtual event platform 204. This may be actioned by the host 212 issuing specific instructions to the audience 210 via the virtual event platform 204 and then proceeding to monitor the interactive data 400 that is received from the audience members 210. As an example, where the host 212 is a singer or artist, the host 212 may request that their audience members 210 sing along to specific phrases of a song or musical performance. In turn, the audience members 210 may activate a microphone or other type of audio data stream on their user device so as to sing along during the performance.
  • This voice data may then be transmitted from the user device to the interaction server 206 and in turn to the host 212, who may then review the interactive data 400 to assess the participation level as well as the quality of the participation by their audience 210.
  • the host 212 may also record or further process the audio data from the audience 210 so as to create sound effects or additional audio tracks to form part of their performance in real time or for further publications.
  • Other examples of interaction between the audience members 210 and the host 212 may include a request by the host 212 to answer specific questions, activate or move a particular body part such as the head or arms, execute a specific dance routine, or provide feedback in the form of likes or other types of digital selections, which may be submitted by the audience members 210 via an interface on their user device.
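  • As a non-authoritative sketch of how motion data from a user device might be mapped to one of the interactive actions described above, the following classifies accelerometer samples by the spread of their magnitudes; the thresholds and action labels are invented purely for illustration.

```python
import math

def classify_motion(samples):
    """Classify a participant's motion from (x, y, z) accelerometer readings in g.

    A still participant produces near-constant magnitudes (close to 1 g);
    vigorous dancing produces a large spread of magnitudes.
    """
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(magnitudes) / len(magnitudes)
    # Variance of the magnitudes approximates how vigorously the user moves.
    variance = sum((m - mean) ** 2 for m in magnitudes) / len(magnitudes)
    if variance < 0.01:
        return "still"
    if variance < 0.5:
        return "swaying"
    return "dancing"
```

A host interface could then count how many audience members are "dancing" during a requested routine, rather than inspecting raw sensor streams.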
  • an advantage of this embodiment of the system for facilitating an event may be found in that the system allows for a greater level of engagement with the audience as well as to allow the audience to enjoy a more immersive experience.
  • the host may also in turn experience a greater level of interaction from their audience and thus providing the host an opportunity to increase the quality and immersive experience of the virtual event.
  • the interactive data may also include biometric data, such as heart rates of the audience members, which may in turn be processed or analysed to recognize the emotions, engagement or enjoyment of the audience members.
  • the biometric data may also provide guidance as to whether the audience is excited, bored, happy or experiencing any other emotion, so that the host may adjust their events either in real time or in the future.
  • the interaction server 206 may also perform specific processing on the interactive data 400 before it is distributed to the host 212 and other audience members 210.
  • the host 212 may engage with a team to process the information provided by the participants 210.
  • the interactive data 400 may be processed with specific statistical analysis tools such as summations, averages, standard deviations and regressions so as to provide a picture of the general well-being of the participants.
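  • A minimal sketch of the statistical summaries mentioned above, applied to a batch of participant heart-rate readings, might look as follows; the function name and output fields are illustrative only.

```python
import statistics

def summarise_heart_rates(bpm_readings):
    """Reduce many raw heart-rate readings to the summary figures the host sees."""
    return {
        "count": len(bpm_readings),
        "total": sum(bpm_readings),                 # summation
        "mean": statistics.mean(bpm_readings),      # average
        "stdev": statistics.pstdev(bpm_readings),   # population std deviation
    }
```

Presenting one such summary line instead of thousands of raw values is what allows the host 212 to interpret the audience's state quickly during a live event.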
  • biometric data from all of the participant members may be analysed by the interaction server so as to derive an overall emotion of the audience.
  • Movement data or other types of digital selections, as well as voice data, may be measured and assessed by the interaction server 206 to gauge the emotion and experience of each of the participant users. This may be performed by a series of analytical processes, including the use of machine learning systems such as neural networks, which may classify the interactive data into specific emotions or representative well-being attributes of the audience group. This in turn allows the host 212 to alter their immediate performance, or to perform an alternative plan for their performance, so as to improve the virtual event and its experience for the participant group 210.
  • Such classifiers may include the use of neural networks, multi-class classification systems or embedding systems which are able to classify multi-dimensional vectors or parameters into a small set of classes, and thus allow various forms of interactive data, such as biometric data, movement data, etc., to be used to identify a specific state of the audience 210.
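  • As an illustrative stand-in for such a classifier, the sketch below uses a nearest-centroid rule to map a feature vector derived from interactive data to a coarse emotional attribute; the features, labels and centroid values are invented examples, and a deployed system might use a trained neural network instead.

```python
import math

# Hypothetical centroids: (mean heart rate in bpm, motion level, audio level).
CENTROIDS = {
    "excited": (110.0, 0.8, 0.9),
    "engaged": (85.0, 0.4, 0.5),
    "bored":   (65.0, 0.1, 0.1),
}

def classify_emotion(features):
    """Return the emotion label whose centroid is nearest to the feature vector."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: distance(features, CENTROIDS[label]))
```

The multi-dimensional input and small label set mirror the "many parameters to a small set of classes" behaviour described above, whatever classifier is actually used.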
  • the interaction server 306 is arranged to be separated, either controlled separately, or physically separated from the virtual world 202 or the computing systems which provide the metaverse, virtual world or virtual environments 202.
  • the virtual event platform 204 and the participant/event streaming gateway 208 may be implemented within the metaverse, virtual world or virtual environment 202.
  • the interaction server 306 may be implemented separately on an additional server that is connected with the host 212 and the participant groups 210.
  • the separated interaction server 306 may also be in communication with the virtual world platform 204 or the virtual world 202.
  • the virtual event may be initiated by having the host 212 starting to access the virtual world 202 and beginning to host the virtual event for its audience members or participants 210 within the virtual world 202.
  • the audience members 210 may join the virtual event by connecting with the virtual event platform 204, and in some instances, each of the audience members 210 may have their profile or avatar checked before being admitted to the virtual event. These checks may include the identity of the audience members 210, as well as whether the member 210 has any specific privileges or assets such as tickets, or is in possession of any specific tokens, including any specific Non-Fungible Tokens (NFTs).
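  • The admission checks described above might, purely as an illustration, be sketched as follows; the token identifier, member registry and function name are invented for this sketch and not part of the specification.

```python
# Hypothetical required asset (e.g. a ticket token or NFT) and known members.
REQUIRED_TOKEN = "concert-2022-ticket"
KNOWN_MEMBERS = {"alice", "bob"}

def admit(member_id, held_tokens):
    """Return True only if the member's identity is known AND they hold the
    required ticket/token, mirroring the two checks described above."""
    return member_id in KNOWN_MEMBERS and REQUIRED_TOKEN in held_tokens
```

In practice the token check would query a ledger or wallet service rather than an in-memory set, but the two-stage gate (identity, then asset) is the same.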
  • the event in the form of media data such as visual, audio, text may be streamed to the one or more members of the audience 210 or event participant group.
  • any interactive data produced by the audience members 210 is, either in part or entirely, streamed to the host 212 via an external interaction server 306.
  • This may be advantageous in specific examples where the virtual world 202 or virtual world platform 204 does not yet provide the necessary functionality to receive and process the interactive data; thus, to allow complete interactive data to be delivered to the host 212, a separate server providing the function of the interaction server 306 may be necessary or preferable.
  • a virtual host 212 may continue to use a virtual world platform 204 to stream the event to multiple audience members 210.
  • both the host 212 and each of the members of the audience 210 may connect to a separate interaction server 306 via their user device such as a tablet computer or a smartphone so as to obtain or exchange interactive data between the host 212 and each of the members 210 of the participant group.
  • Various functions provided by the interaction server 306 including processing of the interactive data, storage of the interactive data and the analysis of the interactive data may be performed by the interaction server 306 separately. This is particularly advantageous in examples where the virtual world 202 does not provide a complete set of tools for the implementation of the processing of various interactive data and thus a separate interaction server 306 may be implemented to process the interactive data.
  • Where the audience members 210 and the host 212 conduct the event through multiple virtual world platforms, a single interaction server 306 may also be adapted to serve multiple virtual events in multiple virtual worlds.
  • the virtual event host 212 may be a single individual such as a performer, singer or lecturer or it may be a group of performers that may include an artist, singer, musician, sound engineer, producer, director, crew, etc.
  • the event hosts 212 may stream their event via the virtual event platform inside the virtual world 202 whilst they may also have access to a host interface 402 which may connect with the interaction server to obtain the interactive data 400.
  • the interaction server may be similar to the server 206/306 as described with reference to Figures 2 and 3 and may be integrally formed within the virtual world 202, or it may also be a separate server from that of the virtual world 202.
  • the host interface may be arranged to receive and process interactive data 400 received from the one or more participants of the virtual event and in turn transmit the processed interactive data 400 to the event host and/or to the one or more participants 210.
  • the interaction server may receive interactive data 400 obtained from the participants of the virtual event 210.
  • Such interactive data 400 may include, without limitations:
  • Movement data 404 as measured by an Inertial Measurement Unit (IMU) or G-sensor implemented in a user device or peripheral device held or worn by each participant member;
  • Biometric data 406 of each of the participants including biometric measurements of each participant’s heart rate as measured by a heart rate sensor in a user device or peripheral device held or worn by each participant member;
  • Multimedia data 408 including sound and/or video streams as captured by a user device or peripheral device as held or worn by each of the participants, arranged to capture sounds or video images of each of the participants and their surroundings;
  • Digital selection data 410, which may include, for example, text messages, emotion icons, digital gifts, gift tokens, cryptocurrencies, virtual assets, or digital artworks, etc.
  • the interactive data 400 may also be received by the event host 212 either from individual participants 416, or the interactive data from the participants may be grouped 414 together so as to ease its processing by the interaction server and presentation by the host interface 402.
  • the event host interface 402 may be arranged to show the data of the members of the participant group and where necessary, the host 212 may select specific members of the participants and monitor data as received from a specific member 416 or groups of members 414.
  • the interaction server may use an analytical tool such as a learning engine implemented with neural networks or other types of machine learning systems to process and perform a classification of the various interaction information received so as to devise specific states or reports for the host 212. These states may report on the levels of crowd participation, the participants’ general emotions and an assessment of their level of feedback, and generally whether the feedback is positive or negative for the virtual host 212.
  • the virtual host 212 may also request the one or more members of the participants to perform specific actions such as singalongs, nodding or shaking of their heads, dance routines, or clapping, or the host 212 may ask specific questions or give instructions, such as whether the audience is enjoying the event or wishes for an encore performance.
  • the participants may reply accordingly via their user or peripheral devices to provide this interactive data to the event host 212.
  • the interactive data 400 received from the audience members may also be analysed by the interaction server. For example, where the virtual event host 212 asks whether the audience wishes for them to extend a particular song, the audience members may scream into their microphones that they prefer an encore or a repeat of the song.
  • the audio 408 from this interactive data 400 may in turn be analysed by a machine learning network or any voice-to-text system so as to determine whether the audience wishes for an extended performance or whether the audience wishes for the host to move to a different song or topic.
  • the results may then be sent to the event host and presented on the host interface for the host to see and react accordingly.
  • the participant members 210 may connect to the virtual event with a user device 502, which may include, without limitation, a computing device such as a computer, computer tablet, computer goggles or smartphone, and may further use additional peripheral devices 504 which connect to their user device 502.
  • Through these devices 502, 504, each of the one or more members of the participant group may provide their multimedia data 408, location data 412, biometric data 406, movement data 404 and text or digital selections 410.
  • peripheral devices 504 such as goggles, headphones, smart control multimedia/multifunction chairs or furniture, smart glasses, wearable devices, Internet of Things (IoT) devices, smart wristbands, smart vests, smart hats, smart clothing or interactive devices such as smart light/glow sticks may be implemented to collect interactive data 400 from the participants 210.
  • the participants 210 may use a peripheral device 504 which includes a pair of smart headphones that may include an IMU, microphone and a heart rate monitor to receive voice 408, biometric 406 and movement data 404 from the user 210.
  • the user 210 may be able to move their heads, execute dance routines, sing along or answer with their voice, as well as provide their heart rates, which may in turn be transmitted as interactive data 400 to the virtual event host.
  • This interactive data 400, once processed and analysed, may be an indicator of the emotions, participation and enjoyment of the participants.
  • the interaction server may process this data for redistribution to the participant group such that the one or more members 210 of the participant group may also identify whether their peers are experiencing and enjoying the event in the same way that they are.
  • the one or more participants 210 may also observe and review the interactive data 400 and may be able to explore the feelings, participation levels and emotions of the other participants as well as to connect, interact and compete with the other participants to obtain certain rewards or prizes as well as to increase their profile rankings or levels, which may allow them greater access to the host 212 or other members 210, or unlock specific access to interactive data 400 or other virtual events.
  • the interactive data 400 may also be utilised in a sports training, games or exercise program where the heartbeat rates may be used to measure the efforts made by the user, whilst the IMU may also measure the movement of the user undergoing specific movements or exercise routines as part of the virtual event, which may be a virtual exercise training, competition or games session.
  • the interaction server 206, 306 is arranged to provide the interactive data 400 from the event participants 210 to the event host 212. Additionally, the interaction server 206/306 may also be arranged to process the interactive data 400 and provide the processed interactive data 614 to the event host and/or the virtual world or virtual environment 202.
  • the interaction server 206/306 is also arranged to provide avatar control data to the virtual world or virtual environments.
  • This may be advantageous as the interactive data provided by the event participants may also be used to control and animate the avatar of each of the participants within the virtual world or virtual environment 202 so as to allow the avatar to be better animated within the virtual world, or to allow the avatar to move about the virtual world, or to also allow the avatar to interact, transact and issue instructions with other avatars, objects or functions within the virtual world 202.
  • the individual participants may provide interactive data 400 to the interaction server 206/306 with their user device and any connected peripheral device.
  • Such interactive data 400 may include, without limitation, movement data 404, biometric data 406, multimedia data 408, location data 412, digital selection data 410 as described above with reference to Figure 4.
  • the data 400 may be directly transmitted to the event host 212 and presented on a host interface 402.
  • the host interface 402 will allow the host 212 to select and view the interactive data 400 of all of the participants or specific participants.
  • the host 212 may also group the interactive data 400 by the interactive data type, by the participant identity, or by groups of participants with common profiles or attributes.
  • the interaction server 206/306 includes a learning network which may include a machine learning processor or a neural network processor 612 arranged to process and classify the interactive data 400 as received from the participants.
  • the learning network processor 612 analyses the interactive data 400 and performs the classification of the data to devise specific processed attributes 614 or characteristics of the participants including, without limitation, the emotions of the participants, the participation rates of the participants, the likely demographic or profile information of the participants, and the general response of the participants during the virtual event. Additional attributes 614 may be obtained from the analysis of the interactive data by the learning network processor 612.
  • the interaction server 206/306 may also process the interactive data 400 to provide movement and animation data to the virtual world or virtual environment 202 so as to manipulate and animate the participants within the virtual world 202.
  • movement data is obtained from the IMU as worn or held by each of the participants.
  • the IMU data may mirror specific movements.
  • This data may then be interpreted 606 to be specific animations of the avatar within the virtual world 202, including dance moves, movement of the head or limbs or general movement such as walking or turning of the avatar.
  • a learning network may also be used to interpret the IMU data to devise specific movements of the avatar so as to mirror a physical movement of the participant within the virtual world.
  • the IMU may be placed on a headphone or smart glasses and thus reflect the movement of the head of a participant.
  • a direct movement of the head by a participant may then be animated on their avatar within the virtual world.
  • the movement of the head may represent agreement or disagreement, or specific instructions, and thus specific rules may be used to calculate and determine whether the movement should be rendered as an animation on the avatar, used to actuate specific instructions 608, or used to move the avatar in specific directions 606 within the virtual world or environment 202.
  • the interaction server may also include a lip-syncing function 604 which may be performed, in one example, by processing the audio data as received from the participants. By analysing the sounds made by the participants, a table of mouth movements reflective of specific sounds may then be applied as animation to the avatar of the participant within the virtual world or environment. Furthermore, a machine learning system may also be used to process the sounds so as to generate text from the sounds with a speech-to-text tool. The text may also then be processed to determine feedback for the host or as instructions from the participant.
  • the interaction server 206/306 may also use the biometric data as collected from the participants to alter or change the appearance and “look and feel” of the virtual event.
  • the interaction server 206/306 may determine the mood or emotions of the participants and in turn, instruct the virtual event platform within the virtual world 202 to alter its appearance and experience for the participants.
  • the look and feel of the virtual event may follow a specific colour scheme, followed by adjustment of the tone and loudness of music or sounds so as to create a more compatible environment and, in turn, increase the quality of the virtual event experience.
  • the embodiments described with reference to the Figures can be implemented as an application programming interface (API) or as a series of libraries for use by a developer or can be included within another software application, such as a terminal or personal computer operating system or a portable computing device operating system.
  • API application programming interface
  • program modules include routines, programs, objects, components and data files assisting in the performance of particular functions; the skilled person will understand that the functionality of the software application may be distributed across a number of routines, objects or components to achieve the same functionality desired herein.
  • any appropriate computing system architecture may be utilised. This will include stand-alone computers, network computers and dedicated hardware devices. Where the terms “computing system” and “computing device” are used, these terms are intended to cover any appropriate arrangement of computer hardware capable of implementing the function described.
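The biometric-driven adjustment of the event's look and feel described in the points above can be sketched as follows. This is a minimal illustration only: the heart-rate thresholds, mood labels and colour schemes are assumed values for demonstration, not parameters taken from the specification.

```python
# Hypothetical sketch: classify the crowd's mood from biometric data
# (heart rates) and map it to a "look and feel" adjustment for the
# virtual event platform. Thresholds and schemes are illustrative.
from statistics import mean

def estimate_mood(heart_rates_bpm):
    """Classify the crowd's mood from participants' heart rates."""
    avg = mean(heart_rates_bpm)
    if avg >= 110:
        return "excited"
    if avg >= 85:
        return "engaged"
    return "calm"

# Each mood maps to a scheme the virtual event platform could apply.
SCHEMES = {
    "excited": {"colour": "#FF3B30", "music_gain_db": 3},
    "engaged": {"colour": "#FF9500", "music_gain_db": 0},
    "calm":    {"colour": "#5AC8FA", "music_gain_db": -3},
}

def scene_adjustment(heart_rates_bpm):
    """Return the colour/loudness adjustment for the current crowd mood."""
    return SCHEMES[estimate_mood(heart_rates_bpm)]
```

In a deployed system the mood estimate would more plausibly come from the learning network 612 operating over the full interactive data 400; the rule-based thresholds here only stand in for that classification step.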


Abstract

A system and method for facilitating a virtual event comprising the steps of receiving an event stream from an event host; transmitting the event stream to one or more members of a participant group; receiving interactive data from the one or more members of the participant group; and transmitting the interactive data to the event host.

Description

A SYSTEM AND METHOD FOR FACILITATING A VIRTUAL EVENT

TECHNICAL FIELD
The present invention relates to a system and method for facilitating a virtual event, and particularly, although not exclusively, to a system and method for facilitating a virtual event whereby a host and the participants may interact with each other.
BACKGROUND
With advancement and widespread adoption of new communications and computing technologies, people are becoming more and more connected with new communication mediums which are designed to form various channels in which people can communicate and interact with each other.
Despite this progress, the entertainment and participation experience of virtual events is nonetheless significantly different when compared to that of face-to-face events. Although the ease of access to a virtual event makes such events more accessible, the experience of such events is very limited due to the various limitations of these mediums. In turn, virtual events, whilst operational, are often a less enjoyable experience compared with face-to-face events.
Additionally, other forms of digital entertainment such as game services or video/audio on demand services that are accessible within a cyber environment often offer a more enjoyable form of entertainment experience than virtual events, in turn, reducing the attractiveness of virtual events as a suitable medium for entertainment or digital experience.
SUMMARY OF THE INVENTION
In accordance with a first aspect of the present invention, there is provided a method for facilitating a virtual event comprising the steps of:
- receiving an event stream from an event host;
- transmitting the event stream to one or more members of a participant group;
- receiving interactive data from the one or more members of the participant group; and
- transmitting the interactive data to the event host.
In an embodiment of the first aspect, the interactive data is transmitted to the one or more members of the participant group.
In an embodiment of the first aspect, the interactive data is processed before it is transmitted to  the event host.
In an embodiment of the first aspect, the interactive data is transmitted to selected members of the participant group.
In an embodiment of the first aspect, the interactive data includes biometric data of the one or more members of the participant group.
In an embodiment of the first aspect, the interactive data includes motion data arranged to represent the motion of the one or more members of the participant group.
In an embodiment of the first aspect, the interactive data includes visual and/or audio data captured by the one or more members of the participant group.
In an embodiment of the first aspect, the visual and/or audio data is processed to animate a mouth of an avatar associated with each of the one or more members of the participant group.
In an embodiment of the first aspect, the motion data is processed to animate the avatar associated with each of the one or more members of the participant group.
In an embodiment of the first aspect, the motion data is further processed to determine an action or instruction of the one or more members of the participant group.
In an embodiment of the first aspect, the interactive data is processed to determine one or more emotional attributes of the participant group.
In an embodiment of the first aspect, the interactive data is processed with a learning network arranged to classify the interactive data to determine the one or more emotional attributes of the participant group.
In accordance with a second aspect of the present invention, there is provided a system for facilitating a virtual event comprising:
- a virtual event platform arranged to receive an event stream from an event host for streaming to a participant group;
- a participant streaming gateway arranged to transmit the event stream to one or more members of the participant group; and,
- an interaction server arranged to receive interactive data from the one or more members of the  participant group and to transmit the interactive data to the event host.
In an embodiment of the second aspect, the interactive data is transmitted to the one or more members of the participant group.
In an embodiment of the second aspect, the interactive data is processed before it is transmitted to the event host.
In an embodiment of the second aspect, the interactive data is transmitted to selected members of the participant group.
In an embodiment of the second aspect, the interactive data includes biometric data of the one or more members of the participant group.
In an embodiment of the second aspect, the interactive data includes motion data arranged to represent the motion of the one or more members of the participant group.
In an embodiment of the second aspect, the interactive data includes visual and/or audio data captured by the one or more members of the participant group.
In an embodiment of the second aspect, the visual and/or audio data is processed to animate a mouth of an avatar associated with each of the one or more members of the participant group.
In an embodiment of the second aspect, the motion data is processed to animate the avatar associated with each of the one or more members of the participant group.
In an embodiment of the second aspect, the motion data is further processed to determine an action or instruction of the one or more members of the participant group.
In an embodiment of the second aspect, the interactive data is processed to determine one or more emotional attributes of the participant group.
In an embodiment of the second aspect, the interactive data is processed with a learning network arranged to classify the interactive data to determine the one or more emotional attributes of the participant group.
In an embodiment of the second aspect, the virtual event is facilitated within a virtual world  and the interaction server is separately implemented and controlled from the virtual world.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings in which:
Figure 1 is a schematic block diagram of a computer system arranged to be implemented to operate a system for facilitating a virtual event in accordance with one embodiment of the present invention;
Figure 2 is a block diagram illustrating an embodiment of a system for facilitating a virtual event;
Figure 3 is a block diagram illustrating another embodiment of a system for facilitating a virtual event;
Figure 4 is a block diagram illustrating the interactive data received by the virtual host in accordance with the system for facilitating a virtual event of Figures 2 or 3;
Figure 5 is a block diagram illustrating the interactive data provided by the virtual event participants or audience members in accordance with the system for facilitating a virtual event of Figures 2 or 3; and,
Figure 6 is a block diagram illustrating an example interaction server of the system for facilitating a virtual event of Figures 2 or 3.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Referring to Figure 1, an embodiment of a computer system 100 is illustrated. This embodiment of the computer system 100 is arranged to provide a system for facilitating a virtual event comprising:
- a virtual event platform arranged to receive an event stream from an event host for streaming to a participant group;
- a participant streaming gateway arranged to transmit the event stream to one or more members of the participant group; and,
- an interaction server arranged to receive interactive data from the one or more members of the participant group and to transmit the interactive data to the event host.
In this example embodiment, the virtual event platform, participant streaming gateway and interaction server are implemented by one or more computers or computing systems having an appropriate user interface. The computer may be implemented by any computing architecture, including portable computers, tablet computers, stand-alone Personal Computers (PCs), smart devices, Internet of Things (IoT) devices, edge computing devices, client/server architecture, “dumb” terminal/mainframe architecture, cloud-computing based architecture, or any other appropriate architecture. The computing device may be appropriately programmed to implement the invention.
In this embodiment, the system and method for facilitating a virtual event are arranged to facilitate a virtual event such as a music concert, opera, drama, play, Chinese opera, presentation, lecture, seminar, religious sermon, training, consultation, therapeutic treatments, conference, meeting, sports training, competition, counselling or operation between an event host and one or more participants. The virtual event may be facilitated within a virtual environment or virtual world, sometimes referred to as a virtual world platform, virtual world, virtual reality, virtual environment, virtual universe or metaverse which may include, for example, a three-dimensional virtual world that is created and operated by computer systems as an open world or open map virtual environment. Additionally, the term virtual world or virtual environment may also include other forms of online or computing environments which may not necessarily be three-dimensional, or even graphics based, such as chatrooms, teleconference platforms, online data exchange services, online messaging services or telephony/communication services or platforms.
Preferably, embodiments of the system and method may be arranged to include an interaction server, or provide an interaction service whereby the host of the virtual event may be able to interact with the participants. The host, which may be a performer, artist and/or their supporting teams may be performing, presenting, speaking, singing, acting or otherwise hosting the event, and the participants may in turn provide their interaction or feedback back to the host as part of their participation within the virtual event. As an example, where the virtual event is a virtual concert with the host being a singer or artist, the participants may be able to use the interaction server or interaction service to interact with the host by submitting interactive data to the host. This interaction may include text messages, gifts in the forms of digital tokens, digital gifts or payments, artworks or icons, or voice/sound data, video data, location data, movement data or biometric data.
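As an illustration of the kinds of interactive data just listed, the sketch below groups them into a single record that a participant's device might submit to the interaction server. The field names and types are assumptions made for this example; the specification does not prescribe any particular wire format.

```python
# Illustrative data model for one interactive-data submission.
# All field names are hypothetical, chosen only to mirror the data
# categories described in the text (text, gifts, biometric, movement,
# location); they are not part of the claimed system.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class InteractiveData:
    """One interactive-data submission from a participant (sketch)."""
    participant_id: str
    text: Optional[str] = None                        # text messages
    gift_tokens: int = 0                              # digital gifts / payments
    heart_rate_bpm: Optional[int] = None              # biometric data
    imu: Optional[Tuple[float, float, float]] = None  # movement data (ax, ay, az)
    location: Optional[Tuple[float, float]] = None    # location data (lat, lon)

def is_biometric(msg: InteractiveData) -> bool:
    """True if the submission carries biometric readings."""
    return msg.heart_rate_bpm is not None
```

A routing layer in the interaction server could use predicates like `is_biometric` to direct each submission to the appropriate processing pipeline.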
Moreover, such examples of interactive data may also be further processed by examples of the system for facilitating a virtual event such that it may be made more meaningful and useful for the event host to react to the interactive data, either in the next event, or preferably, in real time. Thus, in the example where the virtual event is a virtual concert, the performer or host may be able to deliver their performance within the virtual world or metaverse, and then request the participants or audience to undertake certain interactive actions such as asking questions, asking the audience to sing along, or asking the audience members to move their bodies and dance with the music. In turn, based on the interactive data that is captured from the participants and returned to the host as a form of feedback, the host is able to determine the level or type of interactive actions undertaken by the audience, and thus adjust or adapt their performance accordingly.
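The feedback loop just described, in which the host requests an action, the audience responds, and the host gauges the level of interaction, can be sketched as a simple aggregation. The response categories (`sing_along`, `clap`, `idle`) are hypothetical labels invented for this example.

```python
# Sketch: aggregate audience responses into a real-time summary for the
# host. The category names are assumptions for illustration only.
from collections import Counter

def summarise_feedback(responses):
    """Aggregate per-participant responses into a summary for the host.

    responses: iterable of category strings, e.g. 'sing_along', 'clap', 'idle'.
    Returns per-category counts and the fraction of non-idle participants.
    """
    counts = Counter(responses)
    total = sum(counts.values()) or 1  # avoid division by zero for an empty crowd
    participation = 1 - counts.get("idle", 0) / total
    return {"counts": dict(counts), "participation": round(participation, 2)}
```

A host interface could poll such a summary every few seconds and render it as a participation gauge, letting the performer adapt mid-song rather than after the event.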
Embodiments of the present invention may be advantageous as a virtual event may be made more interactive by the event host, who will be able to monitor or gauge the participants’ responses and reactions. Furthermore, by facilitating and monitoring the ability for participants to participate within the virtual event through the performance of various tasks, such as singing along, clapping, or following specific motion routines, the event host may be able to involve the participants more within the virtual event and thus improve the experience and immersion of the participants within the event.
As shown in Figure 1, there is shown a schematic diagram of a computer system or computer server 100 which is arranged to be implemented as an example embodiment of a system for facilitating a virtual event. In this embodiment, the computer system comprises a server 100 which includes suitable components necessary to receive, store and execute appropriate computer instructions. The components may include a processing unit 102, including Central Processing Units (CPUs), a Math Co-Processing Unit (Math Processor), Graphics Processing Units (GPUs) or Tensor Processing Units (TPUs) for tensor or multi-dimensional array calculations or manipulation operations, read-only memory (ROM) 104, random access memory (RAM) 106, input/output devices such as disk drives 108, input devices 110 such as an Ethernet port or a USB port, a display 112 such as a liquid crystal display, a light emitting display or any other suitable display, and communications links 114. The server 100 may include instructions that may be included in ROM 104, RAM 106 or disk drives 108 and may be executed by the processing unit 102. There may be provided a plurality of communication links 114 which may variously connect to one or more computing devices such as a server, personal computers, terminals, wireless or handheld computing devices, Internet of Things (IoT) devices, smart devices or edge computing devices. At least one of the plurality of communications links may be connected to an external computing network through a telephone line or other type of communications link.
The server 100 may include storage devices such as a disk drive 108 which may encompass solid state drives, hard disk drives, optical drives, magnetic tape drives or remote or cloud-based storage devices. The server 100 may use a single disk drive or multiple disk drives, or a remote storage service. The server 100 may also have a suitable operating system 116 which resides on the disk drive or in the ROM of the server 100.
The computer or computing apparatus 100 may also provide the necessary computational capabilities to operate or to interface with a machine learning network, such as a neural network, to provide various functions and outputs. The neural network may be implemented locally, or it may also be accessible or partially accessible via a server or cloud-based service. The machine learning network may also be untrained, partially trained or fully trained, and/or may also be retrained, adapted or updated over time.
The computer or computing apparatus 100 may also provide the necessary computational and communication capabilities to operate, either in part or completely, as a host server or as a function provider server for virtual reality environments, which may also be referred to as virtual reality worlds, virtual worlds, virtual environments, gaming environments, gaming worlds, augmented reality worlds, the metaverse or any similar terms which represent similar virtual worlds, augmented reality worlds or virtual universes. The computer or computing apparatus 100 may be implemented to operate such virtual worlds or universes and allow for various activities by users who may enter (log in to) these virtual worlds, including gaming, meetings, concerts, operas or other events, virtual or online activities, explorations, social activities, and the import, export, trade or exchange of real or virtual properties, currencies or assets that may exist within the real or virtual worlds or universes, or virtual currencies or assets which may be usable on multiple platforms and across multiple real or virtual worlds or universes. Such virtual currencies, property or assets may be supported by various ledger systems such as open or distributed ledgers or blockchain technologies which may operate with the virtual worlds or universes.
With reference to Figure 2, there is illustrated a block diagram of an example embodiment of a system for facilitating a virtual event 200 comprising: a virtual event platform 204 arranged to receive an event stream from an event host 212 for streaming to a participant group; a participant or event streaming gateway 208 arranged to transmit the event stream to one or more members of the participant group 210; and, an interaction server 206 arranged to receive interactive data from the one or more members of the participant group 210 and to transmit the interactive data to the event host 212.
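A highly simplified sketch of the data flow between the three components named above is given below: the streaming gateway fans the host's event stream out to the participants, while the interaction server carries interactive data back to the host. All class and method names here are illustrative assumptions, not part of the claimed system.

```python
# Sketch of the two-way flow in the Figure 2 architecture:
# event stream flows host -> gateway -> audience, and interactive
# data flows audience -> interaction server -> host. Names are
# hypothetical, for illustration only.
class StreamingGateway:
    """Fans the host's event stream out to every joined participant."""
    def __init__(self):
        self.participants = []

    def join(self, participant):
        self.participants.append(participant)

    def broadcast(self, frame):
        for p in self.participants:
            p.setdefault("received", []).append(frame)

class InteractionServer:
    """Carries interactive data from participants back to the host."""
    def __init__(self, host_inbox):
        self.host_inbox = host_inbox

    def submit(self, interactive_data):
        self.host_inbox.append(interactive_data)

# Wire the components together and exercise both directions of flow.
host_inbox = []
gateway = StreamingGateway()
server = InteractionServer(host_inbox)
alice = {}
gateway.join(alice)
gateway.broadcast("frame-1")                          # host -> audience
server.submit({"from": "alice", "text": "encore!"})   # audience -> host
```

Separating the gateway from the interaction server mirrors the point made with reference to Figure 3: the return channel can be operated independently of the virtual world platform when that platform lacks interaction functionality.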
As shown in this example embodiment, the system for facilitating a virtual event 200 includes a virtual world, virtual environment service or metaverse service 202 which may be provided by one or more computer systems arranged to provide a virtual reality world service 202 or any other online data/communication exchange/teleconference services for users. In these virtual worlds or environments 202, users may log into the virtual environment services 202 as an avatar, virtual character or other form of virtual representation of the user so as to access the virtual worlds, virtual environments or metaverse 202. Such services may include, for example, and without limitation, metaverse services such as those of Sandbox, Decentraland, Fortnite or any other metaverse or virtual reality environments that allow users to log in and to explore, manipulate and perform interactions or transactions with other users or operators of the virtual world or virtual environment by controlling an avatar, virtual character or user reference. One such interaction may include activities such as the operation or delivery of a virtual event by a virtual event host 212, with one or more virtual event participants 210 who may join the virtual event within the virtual world 202 by logging into the virtual world 202 and accessing the virtual event. Such a virtual event may include, without limitation, music concerts, operas, Chinese operas, dramas, plays, presentations, lectures, seminars, religious sermons, consultations, conferences, therapeutic treatments, counselling, meetings or operations between an event host and one or more participants. These events may occur in real time (live) or they may be pre-recorded and broadcast at a predetermined time.
Preferably, the virtual event may be performed or conducted by use of a virtual event platform 204 which may be integrated within the virtual world service 202 itself. The virtual event platform 204 is arranged to receive a content input from the virtual event host 212 once the host 212 has connected or logged into the virtual world service 202. In one example, the virtual event platform 204 is arranged to receive the content input from the host 212, and then provide this content input to the participant streaming gateway 208, which is arranged to stream the content to each of the one or more members 210 who are participating within the virtual event.
In one example, the virtual event host 212 may be a performer such as a singer or artist, or may be a presenter presenting a lecture, sermon or speech to the audience participants 210. To start the process of delivering and hosting the virtual event, the event host 212 may firstly log into the virtual world 202, access the virtual event platform 204 and then proceed to begin the virtual event by streaming content from a host interface to the virtual event platform 204. The host interface may be an electronic or communication device having the function of connecting to the virtual event platform on behalf of the host and, preferably, would include various audio and visual equipment so as to capture images or audio of the host in delivering the virtual event.
Audience participants 210 may then also log into the virtual world 202 so as to view or otherwise participate in the virtual event with a participant or user device. The participant or user device may include an interface providing visual and audio signals and would have computing and telecommunications capabilities to connect into the virtual world 202 and to receive the media stream from the virtual host 212. In turn, the media stream from the host 212 may be streamed to the participant's device or devices. This device may include smartphones, computers, wearable devices, virtual reality goggles, smart glasses, smart devices, or headphones that may at least be able to receive part or all of the media stream from the host 212. The user device may also be referred to as an entry point to the virtual world 202 as it allows the user to connect and enter the virtual world 202.
Once the virtual event is started, media inputs from the host 212 may be streamed to the virtual event platform 204, which will in turn be streamed to the audience participants or members of the participant group 210 via the participant/event streaming gateway 208. In turn, the members of the participant group 210 (the audience members) may then receive the media stream from the host 212 and would be able to listen to, watch or otherwise experience the content that is delivered by the event host 212. In instances where the virtual event is a concert, the audience 210 may receive a stream of the music or song, and may also receive a stream of images, either recorded live or pre-produced. Provided also within the system for facilitating a virtual event 200 is an interaction server 206 or interaction service which is arranged to receive interactive data from the one or more audience members 210 and in turn process this data for distribution to other audience members 210 or to the event host 212.
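The relay role of the interaction server described above can be sketched in a few lines. This is a minimal, illustrative sketch only: the class name, method names and in-memory queues are assumptions for exposition and are not part of the disclosure, which leaves the implementation open.

```python
# Hypothetical sketch of the interaction server 206: it receives
# interactive data from a participant and relays it to the event host
# and, optionally, to the other members of the participant group.

class InteractionServer:
    """Receives interactive data from participants and relays it."""

    def __init__(self):
        self.host_inbox = []          # data queued for the event host
        self.participant_feeds = {}   # participant id -> relayed data

    def register_participant(self, participant_id):
        self.participant_feeds[participant_id] = []

    def receive(self, sender_id, interactive_data):
        # Forward the data to the event host ...
        self.host_inbox.append((sender_id, interactive_data))
        # ... and redistribute it to the other participants.
        for pid, feed in self.participant_feeds.items():
            if pid != sender_id:
                feed.append((sender_id, interactive_data))

server = InteractionServer()
server.register_participant("alice")
server.register_participant("bob")
server.receive("alice", {"type": "text", "value": "Encore!"})
```

In a deployed system the queues would be replaced by network streams to the host interface and user devices, but the routing logic would follow the same shape.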
In this example, the virtual event platform 204, participant streaming gateway 208 and the interaction server 206 may be implemented within the virtual world 202 itself. In these examples, the virtual world 202 may include programmable functions which allow an operator to specifically program the functions of the virtual event platform 204, the participant streaming gateway 208 and the interaction server 206 within the virtual world environment 202, such that the computing systems which operate the virtual world 202 would themselves provide the functions of the virtual event platform 204, the participant streaming gateway 208 and the interaction server 206.
As shown in this example embodiment, where the host 212 begins to host a virtual event, the host 212 may firstly connect to the virtual world 202 with their avatar or character so as to stream their content into the virtual world 202. This content will in turn be streamed to the audience participants 210. The audience participants 210 may use their user devices to log into the virtual world 202 and participate in this virtual event. This event may be a virtual concert which allows the audience members 210 to listen to and watch the host 212, who may be the performer of this concert, and may take the form whereby the virtual event platform 204 has a prebuilt space within the virtual world 202 (such as a virtual concert hall), thus allowing both the host 212 and the audience 210 to interact within the virtual world 202 at the virtual concert hall so as to participate in this virtual concert. Other types of virtual spaces may also be arranged to be a suitable virtual connection point for different types of virtual events.
During the streaming of the performance by the host 212, each audience member 210 may then interact or otherwise provide feedback in the form of interactive data to the host 212 and to other audience members 210 by using functions provided within their own user devices. As will be described in further detail with reference to Figure 4, the types of interactive data 400 may include biometric data, sound or visual data, text or other digital selection data, and movement data. Each of these specific groups of data may be provided by each of the audience participants 210 by use of their user device and any connected peripheral devices. When the interactive data 400 is collected from each of the participants 210, it is then transmitted to the interaction server 206. The server 206 may then transmit this data to the event host 212, thus allowing the event host 212 to review this interactive data 400 from the audience 210. Preferably, the interaction server 206 may perform some processing of this data, including analysis of the data 400, so as to provide summaries of the various statuses of the various audience members 210 so that the host 212 may interpret the data quickly and respond accordingly. Such embodiments may be advantageous to the event host 212 and audience members 210 as the event host 212 may be able to quickly review the feedback and interaction of the audience members 210 and in turn adjust, adapt or change their performance or operation of the virtual event to improve or further enhance the experience of each of the audience members 210.
In this example, the host 212 may also specifically request that the audience participants 210 interact with the host 212 via the virtual event platform 204. This may be actioned by the host 212 issuing specific instructions to the audience 210 via the virtual event platform 204 and then proceeding to monitor the interactive data 400 that is received from the audience members 210. As an example, where the host 212 may be a singer or artist, the host 212 may request their audience members 210 to sing along to specific phrases of a song or musical performance. In turn, the audience members 210 may then activate a microphone or other type of audio data stream on their user device so as to sing along during the performance. This voice data may then be transmitted from the user device to the interaction server 206 and in turn to the host 212, who may then review the interactive data 400 to assess the participation level as well as the quality of the participation by their audience 210. The host 212 may also record or further process the audio data from the audience 210 so as to create sound effects or additional audio tracks to form part of their performance in real time or for further publications. Other examples of interaction between the audience members 210 and the host 212 may include a request by the host 212 to answer specific questions, activate or move a particular body part such as the head or arms, execute a specific dance routine, or provide feedback in the form of likes or other types of digital selections which may be submitted by the audience members 210 via an interface on their user device.
As illustrated in this example, an advantage of this embodiment of the system for facilitating an event may be found in that the system allows for a greater level of engagement with the audience as well as allowing the audience to enjoy a more immersive experience. The host may also in turn experience a greater level of interaction from their audience, thus providing the host an opportunity to increase the quality and immersiveness of the virtual event. Furthermore, the interactive data may also include biometric data such as heart rates of the audience members which may in turn be processed or analysed to recognize the emotions, engagement or enjoyment of the audience members. In turn, the biometric data may also provide guidance as to whether the audience is excited, bored, happy or experiencing any other emotion so that the host may adjust their events either in real time or in the future.
In some embodiments, the interaction server 206 may also perform specific processing on the interactive data 400 before it is distributed to the host 212 and other audience members 210. In specific examples where the virtual event may include a large number of audience members 210, the host 212 may engage a team to process the information provided by the participants 210. Thus, in order to increase the efficiency of processing such a large amount of information from the participants 210, the interactive data 400 may be processed with specific statistical analysis tools such as summations, averages, standard deviations and regressions so as to provide a picture of the general well-being of the participants. As an example, biometric data from all of the participant members may be analysed by the interaction server so as to devise a specific emotion of the audience. Additionally, movement data or other types of digital selections as well as voice data may be measured and assessed by the interaction server 206 to gauge the emotion and experience of each of the participant users. This may be performed by a series of analytical processes, including the use of machine learning systems such as neural networks that may classify the interactive data into specific emotions or representative well-being attributes of the audience group, which will in turn allow the host 212 to alter their immediate performance or to perform an alternative plan for their performance so as to improve the virtual event and its experience for the participant group 210. Examples of such classifiers may include the use of neural networks, multi-class classification systems or embedding systems which are able to classify multi-dimensional vectors or parameters into a small set of classes, thus allowing various interactive data, such as biometric data, movement data etc., to identify a specific state of the audience 210.
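The statistical summarisation step described above can be illustrated with a toy example: heart rates from the audience are reduced to a mean and standard deviation, and the mean is mapped to a coarse audience state. The thresholds and state labels here are illustrative assumptions; the disclosure contemplates richer classifiers such as neural networks.

```python
# Hedged sketch: summarise audience heart rates and classify the crowd
# into a coarse state. Thresholds are assumptions for illustration.
from statistics import mean, stdev

def classify_audience_state(heart_rates):
    """Summarise heart rates and label the crowd's overall state."""
    avg = mean(heart_rates)
    spread = stdev(heart_rates) if len(heart_rates) > 1 else 0.0
    if avg > 110:
        state = "excited"
    elif avg < 70:
        state = "calm"
    else:
        state = "engaged"
    return {"mean": avg, "stdev": spread, "state": state}

summary = classify_audience_state([98, 105, 88, 120, 101])
# summary["state"] is "engaged" for this sample (mean 102.4 bpm)
```

A production system would replace the threshold rules with a trained multi-class model, but the input (per-participant measurements) and output (a small set of audience states for the host) would be the same.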
With reference to Figure 3, there is illustrated another embodiment of the system for facilitating virtual events 300. In this example embodiment, the interaction server 306 is arranged to be separated, either controlled separately or physically separated, from the virtual world 202 or the computing systems which provide the metaverse, virtual world or virtual environments 202.
As shown in this example, the virtual event platform 204 and the participant/event streaming gateway 208 may be implemented within the metaverse, virtual world or virtual environment 202. However, the interaction server 306 may be implemented separately on an additional server that is connected with the host 212 and the participant groups 210. In some instances, due to design considerations or to improve operation of the virtual event, the separated interaction server 306 may also be in communication with the virtual world platform 204 or the virtual world 202.
In this example, the virtual event may be initiated by having the host 212 access the virtual world 202 and begin to host the virtual event for its audience members or participants 210 within the virtual world 202. The audience members 210 may join the virtual event by connecting with the virtual event platform 204 and, in some instances, each of the audience members 210 may have their profile or avatar checked before being admitted to the virtual event. These checks may include the identity of the audience members 210 as well as whether the member 210 has any specific privileges or assets such as tickets, or is in possession of any specific tokens, including any specific Non-Fungible Tokens (NFTs).
Once the audience members 210 join the virtual event, the event, in the form of media data such as visual, audio or text, may be streamed to the one or more members of the audience 210 or event participant group. However, in this example embodiment, any interactive data produced by the audience members 210 is, either in part or entirely, streamed to the host 212 via an external interaction server 306. This may be advantageous in specific examples where the virtual world 202 or virtual world platform 204 does not yet provide the necessary functionality to receive and process the interactive data; thus, to allow complete interactive data to be delivered to the host 212, a separate server providing the function of the interaction server 306 may be necessary or preferable.
In this example, a virtual host 212 may continue to use a virtual world platform 204 to stream the event to multiple audience members 210. However, both the host 212 and each of the members of the audience 210 may connect to a separate interaction server 306 via their user device, such as a tablet computer or a smartphone, so as to obtain or exchange interactive data between the host 212 and each of the members 210 of the participant group. Various functions provided by the interaction server 306, including processing of the interactive data, storage of the interactive data and analysis of the interactive data, may be performed by the interaction server 306 separately. This is particularly advantageous in examples where the virtual world 202 does not provide a complete set of tools for the implementation of the processing of various interactive data, and thus a separate interaction server 306 may be implemented to process the interactive data. Furthermore, where audience members 210 and the host 212 may host the event through multiple virtual world platforms, a single interaction server 306 may also be adapted to serve multiple virtual events in multiple virtual worlds.
With reference to Figure 4, there is illustrated a block diagram of the interactive data 400 which may be provided to the virtual host 212 by the event participants 210. As shown in this example, the virtual event host 212 may be a single individual such as a performer, singer or lecturer, or it may be a group of performers that may include an artist, singer, musician, sound engineer, producer, director, crew, etc. The event hosts 212 may stream their event via the virtual event platform inside the virtual world 202 whilst they may also have access to a host interface 402 which may connect with the interaction server to obtain the interactive data 400. The interaction server may be similar to the server 206/306 as described with reference to Figures 2 and 3 and may be intricately formed within the virtual world 202 or it may also be a separate server from that of the virtual world 202. As described herein, the host interface may be arranged to receive and process interactive data 400 received from the one or more participants of the virtual event and in turn transmit the processed interactive data 400 to the event host and/or to the one or more participants 210.
In this example, the interaction server may receive interactive data 400 obtained from the participants of the virtual event 210. Such interactive data 400 may include, without limitation:
- Movement data 404 of each of the participants, including data obtained from an Inertial Measurement Unit (IMU) or G-sensor that may be implemented in a user device or peripheral device held or worn by each participant member;
- Biometric data 406 of each of the participants, including biometric measurements of each participant’s heart rate as measured by a heart rate sensor in a user device or peripheral device held or worn by each participant member;
- Multimedia data 408, including sound and/or video streams as captured by a user device or peripheral device held or worn by each of the participants, arranged to capture sounds or video images of each of the participants and their surroundings;
- Location data 412 as recorded by any user device or peripheral device held or worn by each of the participants, arranged to relay the actual location of the participant to the event host; and,
- Any text or digital selections/choices 410 made by the one or more members of the participant group for transmission to the event host. This may include, for example, text messages, emotion icons, digital gifts, gift tokens, cryptocurrencies, virtual assets, or digital artworks, etc.
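The categories enumerated above could be carried in a single record per participant. The container below is purely an illustrative assumption about how the interactive data 400 might be structured in software; the field names follow the figure labels, but the disclosure does not prescribe any particular data format.

```python
# Hypothetical container for the interactive data 400; numerals in the
# comments refer to the labels in Figure 4. Structure is an assumption.
from dataclasses import dataclass, field

@dataclass
class InteractiveData:
    participant_id: str
    movement: dict = field(default_factory=dict)    # IMU/G-sensor readings (404)
    biometric: dict = field(default_factory=dict)   # e.g. heart rate (406)
    multimedia: bytes = b""                         # audio/video payload (408)
    selections: list = field(default_factory=list)  # text, icons, gifts (410)
    location: tuple = None                          # participant location (412)

sample = InteractiveData(
    participant_id="p-001",
    biometric={"heart_rate_bpm": 96},
    selections=["like"],
)
```

Grouping these records by participant or by data type then becomes a straightforward dictionary or sort operation on the interaction server.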
Preferably, the interactive data 400 may be received by the event host 212 both from individual participants 416, or the interactive data from the participants may be grouped 414 together so as to ease its processing by the interaction server and its presentation by the host interface 402. The event host interface 402 may be arranged to show the data of the members of the participant group and, where necessary, the host 212 may select specific members of the participants and monitor data as received from a specific member 416 or groups of members 414. Preferably, in order to manage a large virtual event with many member participants, the interaction server may use an analytical tool such as a learning engine implemented with neural networks or other types of machine learning systems to process and perform a classification of the various interaction information received so as to devise specific states or reports for the host 212. These states may report on the levels of crowd participation, the participants’ general emotions and an assessment of their level of feedback, and generally whether the feedback is positive or negative for the virtual host 212.
In this example, the virtual host 212 may also request the one or more members of the participant group to perform specific actions such as singalongs, nodding or shaking of their heads, dance routines or clapping, or the host 212 may ask specific questions or issue instructions, such as asking whether the audience is enjoying the event or wishes for an encore performance. During the streaming of the virtual event to the one or more members of the participant group, the participants may reply accordingly via their user or peripheral devices to provide this interactive data to the event host 212.
The interactive data 400 received from the audience members may also be analysed by the interaction server. For example, where the virtual event host 212 may ask whether the audience wishes for them to encore a particular song, the audience members may scream into their microphones that they prefer an encore or repeat of a song. The audio 408 from this interactive data 400 may in turn be analysed by a machine learning network or any voice-to-text system so as to determine whether the audience wishes for an encore performance or whether the audience wishes for the host to move to a different song or topic. Once this information has been processed by the interaction server, the results may then be sent to the event host and presented on the host interface for the host to see and react accordingly.
With reference to Figure 5, there is illustrated a block diagram of the various data and sources of data that may be used by each of the members of the participant group 210. In this example, the participant members 210 may connect to the virtual events with a user device 502, which may include, without limitation, a computing device such as a computer, computer tablet, computer goggles or smart phone, and may further use additional peripheral devices 504 which connect to their user device 502. Through these devices 502, 504, each of the one or more members of the participant group may provide their multimedia data 408, location data 412, biometric data 406, movement data 404 and text or digital selections 410. Preferably, peripheral devices 504 such as goggles, headphones, smart control multimedia/multifunction chairs or furniture, smart glasses, wearable devices, Internet of Things (IoT) devices, smart wristbands, smart vests, smart hats, smart clothing or interactive devices such as smart light/glow sticks may be implemented to collect interactive data 400 from the participants 210.
In a preferred example, the participants 210 may use a peripheral device 504 which includes a pair of smart headphones that may include an IMU, microphone and a heart rate monitor to receive voice 408, biometric 406 and movement data 404 from the user 210. In this example, the user 210 may be able to move their head, execute dance routines, sing along or answer with their voice, as well as provide their heart rate, which may in turn be transmitted as interactive data 400 to the virtual event host. This interactive data 400, once processed and analysed, may be an indicator of the emotions, participation and enjoyment of the participants. Furthermore, the interaction server may process this data for redistribution to the participant group such that the one or more members 210 of the participant group may also identify whether their peers are experiencing or enjoying the event in the same way that they are experiencing it. In some examples, the one or more participants 210 may also observe and review the interactive data 400 and may be able to explore the feelings, participation levels and emotions of the other participants, as well as connect, interact and compete with the other participants to obtain certain rewards or prizes, or to increase their profile rankings or levels, which may allow them greater access to the host 212 or other members 210, or unlock specific access to interactive data 400 or other virtual events. Similarly, the interactive data 400 may also be utilised in a sports training, games or exercise program where the heart rates may be used to measure the effort made by the user, whilst the IMU may also measure the movement of the user undergoing specific movements or exercise routines as part of the virtual event, which may be a virtual exercise training, competition or games session.
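The exercise-session use mentioned at the end of the passage above can be sketched as a simple scoring function: heart-rate elevation above resting gauges physiological effort, while average IMU acceleration magnitude gauges movement intensity. The combination formula and its weighting are illustrative assumptions only.

```python
# Hedged sketch of an effort score for a virtual exercise event,
# combining heart-rate reserve with IMU movement intensity.
# The formula is an assumption, not part of the disclosure.

def effort_score(heart_rate_bpm, resting_bpm, imu_magnitudes):
    """Combine heart-rate elevation and movement intensity into one score."""
    # Fractional elevation of heart rate above the resting baseline.
    hr_component = max(0.0, (heart_rate_bpm - resting_bpm) / resting_bpm)
    # Mean acceleration magnitude reported by the IMU samples.
    motion_component = sum(imu_magnitudes) / max(len(imu_magnitudes), 1)
    return round(hr_component + motion_component, 2)

score = effort_score(heart_rate_bpm=130, resting_bpm=65,
                     imu_magnitudes=[0.4, 0.6, 0.5])
# score is 1.5 for this sample (1.0 heart-rate term + 0.5 motion term)
```

In the described system such a score would be computed per participant by the interaction server and relayed to the host or used for rankings and rewards.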
With reference to Figure 6, there is provided a block diagram of an example embodiment of an interaction server 206/306. In this embodiment, the interaction server 206, 306 is arranged to provide the interactive data 400 from the event participants 210 to the event host 212. Additionally, the interaction server 206/306 may also be arranged to process the interactive data 400 and provide the processed interactive data 614 to the event host and/or the virtual world or virtual environment 202.
Preferably, as shown in Figure 6, the interaction server 206/306 is also arranged to provide avatar control data to the virtual world or virtual environments. This may be advantageous as the interactive data provided by the event participants may also be used to control and animate the avatar of each of the participants within the virtual world or virtual environment 202, so as to allow the avatar to be better animated within the virtual world, to move about the virtual world, or to interact, transact and issue instructions with other avatars, objects or functions within the virtual world 202.
In this embodiment, the individual participants may provide interactive data 400 to the interaction server 206/306 with their user device and any connected peripheral device. Such interactive data 400 may include, without limitation, movement data 404, biometric data 406, multimedia data 408, location data 412 and digital selection data 410 as described above with reference to Figure 4. Once the interactive data 400 is received by the interaction server, the data 400 may be directly transmitted to the event host 212 and presented on a host interface 402. The host interface 402 will allow the host 212 to select and view the interactive data 400 of all of the participants or specific participants. The host 212 may also group the interactive data 400 by the interactive data type, by participant identity, or by groups of participants with common profiles or attributes.
As shown in this embodiment, the interaction server 206/306 includes a learning network, which may include a machine learning processor or a neural network processor 612 arranged to process and classify the interactive data 400 as received from the participants. In one embodiment, the learning network processor 612 analyses the interactive data 400 and performs the classification of the data to devise specific processed attributes 614 or characteristics of the participants including, without limitation, the emotions of the participants, the participation rates of the participants, the likely demographic or profile information of the participants, and the general response of the participants during the virtual event. Additional attributes 614 may be obtained from the analysis of the interactive data by the learning network processor 612.
Preferably, the interaction server 206/306 may also process the interactive data 400 to provide movement and animation data to the virtual world or virtual environment 202 so as to manipulate and animate the participants within the virtual world 202. In this example, movement data is obtained from the IMU as worn or held by each of the participants. The IMU data may mirror specific movements. This data may then be interpreted 606 as specific animations of the avatar within the virtual world 202, including dance moves, movement of the head or limbs, or general movement such as walking or turning of the avatar. A learning network may also be used to interpret the IMU data to devise specific movements of the avatar so as to mirror a physical movement of the participant within the virtual world.
In some examples, the IMU may be placed on a headphone or smart glasses and thus reflect the movement of the head of a participant. In these examples, a direct movement of the head by a participant may then be animated on their avatar within the virtual world. Furthermore, the movement of the head may represent agreement or disagreement, or specific instructions, and thus specific rules may be used to calculate and determine whether the movement is an animation on the avatar, or for actuating specific instructions 608, or to move the avatar in specific directions 606 within the virtual world or environment 202.
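The rule-based disambiguation described above can be sketched as follows: a dominant pitch-axis oscillation of the head is read as a nod (agreement), a dominant yaw-axis oscillation as a shake (disagreement), and anything below threshold is passed through as plain avatar animation. The axis conventions and the threshold value are illustrative assumptions.

```python
# Hedged sketch of the head-movement rules 606/608: decide whether an
# IMU-detected head motion is an instruction or just avatar animation.
# Thresholds and axis conventions are assumptions for illustration.

def interpret_head_motion(pitch_range_deg, yaw_range_deg, threshold=15.0):
    """Classify a head motion window as an instruction or animation."""
    if pitch_range_deg >= threshold and pitch_range_deg > yaw_range_deg:
        return {"kind": "instruction", "meaning": "agree"}      # nod
    if yaw_range_deg >= threshold and yaw_range_deg > pitch_range_deg:
        return {"kind": "instruction", "meaning": "disagree"}   # shake
    return {"kind": "animation", "meaning": None}               # just animate

result = interpret_head_motion(pitch_range_deg=25.0, yaw_range_deg=3.0)
# result classifies the motion as an "agree" instruction
```

As the passage notes, a learning network could replace these hand-written rules while keeping the same three-way output: animate, instruct, or move the avatar.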
The interaction server may also include a lip-syncing function 604 which may be performed, in one example, by processing the audio data as received from the participants. By analysing the sounds made by the participants, a table of mouth movements reflective of specific sounds may then be applied as animation to the avatar of the participant within the virtual world or environment. Furthermore, a machine learning system may also be used to process the sounds so as to generate text from the sounds with a speech-to-text tool. The text may also then be processed to determine feedback for the host or as instructions from the participant.
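The table-driven lip-sync described above amounts to a lookup from detected sound classes to mouth shapes (visemes). The table entries and phoneme labels below are illustrative assumptions; real systems use larger phoneme-to-viseme maps, but the mechanism is the same.

```python
# Hedged sketch of the lip-sync table 604: detected phoneme classes are
# mapped to mouth-shape animation frames for the participant's avatar.
# Table contents and labels are assumptions for illustration.

VISEME_TABLE = {
    "AA": "mouth_open_wide",
    "EE": "mouth_stretched",
    "OO": "mouth_rounded",
    "MM": "mouth_closed",
    "SIL": "mouth_neutral",  # silence
}

def phonemes_to_visemes(phonemes):
    """Map a phoneme sequence to avatar mouth-shape animation frames."""
    return [VISEME_TABLE.get(p, "mouth_neutral") for p in phonemes]

frames = phonemes_to_visemes(["SIL", "AA", "MM", "OO"])
```

Upstream, an audio analyser or speech-to-text front end would produce the phoneme sequence; downstream, the virtual world 202 would play the frames on the avatar in time with the audio.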
In some embodiments, the interaction server 206/306 may also use the biometric data as collected from the participants to alter or change the appearance and “look and feel” of the virtual event. In these embodiments, the interaction server 206/306 may determine the mood or emotions of the participants and in turn instruct the virtual event platform within the virtual world 202 to alter its appearance and experience for the participants. As an example, where the heart rates of the audience indicate that the audience is excited and immersed in the virtual event, the look and feel of the virtual event may follow a specific colour scheme, followed by adjustment of the tone and loudness of the music or sounds so as to create a more compatible environment and in turn increase the quality of the virtual event experience.
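The mood-driven venue adjustment described above can be sketched as a mapping from an audience biometric summary to presentation settings. The thresholds, scheme names and loudness offsets are illustrative assumptions; the disclosure only specifies that the look and feel responds to the detected audience state.

```python
# Hedged sketch: average audience heart rate selects a colour scheme and
# a loudness offset for the virtual venue. All values are assumptions.

def venue_settings(mean_heart_rate_bpm):
    """Pick venue presentation settings from an audience biometric summary."""
    if mean_heart_rate_bpm >= 110:   # audience excited and immersed
        return {"colour_scheme": "vivid", "loudness_offset_db": 3}
    if mean_heart_rate_bpm <= 70:    # audience subdued
        return {"colour_scheme": "warm", "loudness_offset_db": -3}
    return {"colour_scheme": "neutral", "loudness_offset_db": 0}

settings = venue_settings(118)
# settings selects the "vivid" scheme with a +3 dB loudness offset
```

The interaction server would recompute such settings periodically and push them to the virtual event platform 204 as the audience state evolves.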
Although not required, the embodiments described with reference to the Figures can be implemented as an application programming interface (API) or as a series of libraries for use by a developer or can be included within another software application, such as a terminal or personal computer operating system or a portable computing device operating system. Generally, as program modules include routines, programs, objects, components and data files assisting in the performance of particular functions, the skilled person will understand that the functionality of the software application may be distributed across a number of routines, objects or components to achieve the same functionality desired herein.
It will also be appreciated that where the methods and systems of the present invention are either wholly implemented by computing system or partly implemented by computing systems then any appropriate computing system architecture may be utilised. This will include stand-alone computers, network computers and dedicated hardware devices. Where the terms “computing system” and “computing device” are used, these terms are intended to cover any appropriate arrangement of computer hardware capable of implementing the function described.
It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the invention as shown in the specific embodiments without departing from the spirit or scope of the invention as broadly described. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive.
Any reference to prior art contained herein is not to be taken as an admission that the information is common general knowledge, unless otherwise indicated.

Claims (25)

  1. A method for facilitating a virtual event comprising the steps of:
    - receiving an event stream from an event host;
    - transmitting the event stream to one or more members of a participant group; and,
    - receiving interactive data from the one or more members of the participant group; and transmitting the interactive data to the event host.
  2. A method for facilitating a virtual event in accordance with claim 1, wherein the interactive data is transmitted to the one or more members of the participant group.
  3. A method for facilitating a virtual event in accordance with claim 2, wherein the interactive data is processed before it is transmitted to the event host.
  4. A method for facilitating a virtual event in accordance with claim 3, wherein the interactive data is transmitted to selected members of the participant group.
  5. A method for facilitating a virtual event in accordance with claim 4, wherein the interactive data includes biometric data of the one or more members of the participant group.
  6. A method for facilitating a virtual event in accordance with claim 4 or 5, wherein the interactive data includes motion data arranged to represent the motion of the one or more members of the participant group.
  7. A method for facilitating a virtual event in accordance with claim 6, wherein the interactive data includes visual and/or audio data captured by the one or more members of the participant group.
  8. A method for facilitating a virtual event in accordance with claim 7, wherein the visual and/or audio data is processed to animate a mouth of an avatar associated with each of the one or more members of the participant group.
  9. A method for facilitating a virtual event in accordance with claim 6 or 7, wherein the motion data is processed to animate the avatar associated with each of the one or more members of the participant group.
  10. A method for facilitating a virtual event in accordance with any one of claims 6 to 9, wherein the motion data is further processed to determine an action or instruction of the one or more members of the participant group.
  11. A method for facilitating a virtual event in accordance with claim 10, wherein the interactive data is processed to determine one or more emotional attributes of the participant group.
  12. A method for facilitating a virtual event in accordance with claim 11, wherein the interactive data is processed with a learning network arranged to classify the interactive data to determine the one or more emotional attributes of the participant group.
  13. A system for facilitating a virtual event comprising:
    - a virtual event platform arranged to receive an event stream from an event host for streaming to a participant group;
    - a participant streaming gateway arranged to transmit the event stream to one or more members of the participant group; and,
    - an interaction server arranged to receive interactive data from the one or more members of the participant group and to transmit the interactive data to the event host.
  14. A system for facilitating a virtual event in accordance with claim 13, wherein the interactive data is transmitted to the one or more members of the participant group.
  15. A system for facilitating a virtual event in accordance with claim 14, wherein the interactive data is processed before it is transmitted to the event host.
  16. A system for facilitating a virtual event in accordance with claim 15, wherein the interactive data is transmitted to selected members of the participant group.
  17. A system for facilitating a virtual event in accordance with claim 16, wherein the interactive data includes biometric data of the one or more members of the participant group.
  18. A system for facilitating a virtual event in accordance with claim 16 or 17, wherein the interactive data includes motion data arranged to represent the motion of the one or more members of the participant group.
  19. A system for facilitating a virtual event in accordance with claim 18, wherein the interactive data includes visual and/or audio data captured by the one or more members of the participant group.
  20. A system for facilitating a virtual event in accordance with claim 19, wherein the visual and/or audio data is processed to animate a mouth of an avatar associated with each of the one or more members of the participant group.
  21. A system for facilitating a virtual event in accordance with claim 18, 19 or 20, wherein the motion data is processed to animate the avatar associated with each of the one or more members of the participant group.
  22. A system for facilitating a virtual event in accordance with any one of claims 18 to 21, wherein the motion data is further processed to determine an action or instruction of the one or more members of the participant group.
  23. A system for facilitating a virtual event in accordance with claim 22, wherein the interactive data is processed to determine one or more emotional attributes of the participant group.
  24. A system for facilitating a virtual event in accordance with claim 23, wherein the interactive data is processed with a learning network arranged to classify the interactive data to determine the one or more emotional attributes of the participant group.
  25. A system for facilitating a virtual event in accordance with claim 24, wherein the virtual event is facilitated within a virtual world and the interaction server is separately implemented and controlled from the virtual world.
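The relay described in claim 1 — an event stream flowing from the host to the participant group, with interactive data flowing back to the host — can be sketched as follows. This is a hypothetical illustration only, not the implementation disclosed in the specification; the class and method names (`VirtualEventFacilitator`, `receive_event_stream`, `receive_interactive_data`) are assumptions chosen for clarity.

```python
class VirtualEventFacilitator:
    """Minimal sketch of the method of claim 1: relay an event stream from
    the host to each participant, and relay interactive data back to the host."""

    def __init__(self):
        self.participants = {}  # participant id -> inbox of received stream frames
        self.host_inbox = []    # interactive data forwarded to the event host

    def join(self, participant_id):
        # Register a member of the participant group.
        self.participants[participant_id] = []

    def receive_event_stream(self, frame):
        # Steps 1-2: receive a frame from the event host and transmit it
        # to every member of the participant group.
        for inbox in self.participants.values():
            inbox.append(frame)

    def receive_interactive_data(self, participant_id, data):
        # Step 3: receive interactive data from a member and transmit it
        # to the event host.
        self.host_inbox.append((participant_id, data))


# Usage: two participants join, the host streams one frame, one participant reacts.
facilitator = VirtualEventFacilitator()
facilitator.join("alice")
facilitator.join("bob")
facilitator.receive_event_stream({"video": "frame-001"})
facilitator.receive_interactive_data("alice", {"applause": True})
```

The dependent claims layer further processing onto the same flow (e.g. animating avatars from motion data, or classifying emotional attributes with a learning network before the data reaches the host), which would slot into `receive_interactive_data` in this sketch.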
PCT/CN2022/087746 2022-04-19 2022-04-19 A system and method for facilitating a virtual event WO2023201534A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/087746 WO2023201534A1 (en) 2022-04-19 2022-04-19 A system and method for facilitating a virtual event

Publications (1)

Publication Number Publication Date
WO2023201534A1 (en) 2023-10-26

Family

ID=88418889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087746 WO2023201534A1 (en) 2022-04-19 2022-04-19 A system and method for facilitating a virtual event

Country Status (1)

Country Link
WO (1) WO2023201534A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106131602A (en) * 2016-07-20 2016-11-16 平安健康互联网股份有限公司 Interactive system based on main broadcaster's end and method thereof
CN110784736A (en) * 2019-11-26 2020-02-11 网易(杭州)网络有限公司 Virtual article presenting method and device for live broadcast room, electronic equipment and storage medium
US20200404219A1 (en) * 2019-06-18 2020-12-24 Tmrw Foundation Ip & Holding Sarl Immersive interactive remote participation in live entertainment
CN114079799A (en) * 2020-08-21 2022-02-22 上海昊骇信息科技有限公司 Music live broadcast system and method based on virtual reality

Similar Documents

Publication Publication Date Title
US11575531B2 (en) Dynamic virtual environment
US20190270018A1 (en) Spectator audio analysis in online gaming environments
CN102450032B (en) Avatar integrated shared media selection
JP2020072841A (en) Filtering and parental control method for limiting visual operation on head-mounted display
US20110072367A1 (en) Three dimensional digitally rendered environments
US20230335121A1 (en) Real-time video conference chat filtering using machine learning models
US11400381B2 (en) Virtual influencers for narration of spectated video games
CN116782986A (en) Identifying a graphics interchange format file for inclusion with content of a video game
US11651541B2 (en) Integrated input/output (I/O) for a three-dimensional (3D) environment
Nagele et al. Interactive audio augmented reality in participatory performance
CN111586430A (en) Online interaction method, client, server and storage medium
Branch et al. Tele-immersive improv: Effects of immersive visualisations on rehearsing and performing theatre online
WO2022251077A1 (en) Simulating crowd noise for live events through emotional analysis of distributed inputs
Porwol et al. VR-Participation: The feasibility of the Virtual Reality-driven multi-modal communication technology facilitating e-Participation
Ji et al. Demonstration of VRBubble: enhancing peripheral avatar awareness for people with visual impairments in social virtual reality
WO2023201534A1 (en) A system and method for facilitating a virtual event
US20220385700A1 (en) System and Method for an Interactive Digitally Rendered Avatar of a Subject Person
KR20230089002A (en) System and method for providing metaverse based virtual concert platform associated with offline studio
de Freitas Martins et al. A mobile game based on participatory sensing with real-time client-server architecture for large entertainment events
Wadley Voice in virtual worlds
CN113516974A (en) Method and apparatus for providing interactive service
US11582424B1 (en) System and method for an interactive digitally rendered avatar of a subject person
US20240013488A1 (en) Groups and Social In Artificial Reality
Huang et al. A voice-assisted intelligent software architecture based on deep game network
JP7062126B1 (en) Terminals, information processing methods, programs, and recording media

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22937772

Country of ref document: EP

Kind code of ref document: A1