WO2023218247A1 - Virtual collaboration and presentation system


Info

Publication number: WO2023218247A1
Application number: PCT/IB2023/000411
Authority: WO (WIPO (PCT))
Prior art keywords: presentation, virtual, user, data, input
Other languages: French (fr)
Inventors: Adrian Moise, Benjamin ROSENOFF
Original Assignee: Aequilibrium Software Inc.
Application filed by Aequilibrium Software Inc.
Publication of WO2023218247A1 (English)


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40 Support for services or applications
    • H04L 65/401 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L 65/4015 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
    • H04L 65/1066 Session management
    • H04L 65/1083 In-session procedures
    • H04L 65/1094 Inter-user-equipment sessions transfer or sharing

Definitions

  • the present disclosure relates to virtual collaboration and presentation systems. More specifically, the disclosure pertains to a system and method to allow multiple users to interactively engage with project data through a virtual presentation/environment, represented by virtual objects and user avatars.
  • the present disclosure provides a system for interactive presentation of project data as a virtual presentation, the system comprising: a data server to store the project data; a processing server connected to the data server, wherein the processing server: receives the project data from the data server; generates the virtual presentation of the received project data, wherein the virtual presentation comprises: a plurality of virtual objects representing the project data; and a plurality of user avatars representing, individually, each user of a plurality of users; and transmits the generated virtual presentation; a plurality of user devices, wherein each user device is controlled by at least one user and wherein each user device comprises: a data reception unit to receive the transmitted virtual presentation; a data display unit to present the received virtual presentation; a data input unit to receive a presentation alteration input from at least one user, wherein the received presentation alteration input comprises at least one of: a presentation modification input for modifying the virtual presentation; a data modification input for modifying the at least one virtual object; and an avatar modification input for modifying the user avatar; and a cloud platform connected to the data server, the processing server and the plurality of user devices, wherein the cloud platform: receives the transmitted virtual presentation from the processing server; receives the presentation alteration input from each user device; modifies the virtual presentation based on the received presentation alteration input to generate a modified virtual representation; and renders the modified virtual presentation on each of the plurality of user devices.
  • the present disclosure provides a method for interactive presentation of project data as a virtual presentation, the method comprising: receiving, the project data; generating, a virtual presentation of the project data, wherein the virtual presentation comprises a plurality of virtual objects representing the project data and a plurality of user avatars representing, individually, each user of a plurality of users; receiving, the generated virtual presentation at a plurality of user devices; displaying, the received virtual presentation at a data display unit of each user device; receiving, a presentation alteration input from at least one user on each user device, wherein the received presentation alteration input comprises at least one of: a presentation modification input for modifying the virtual presentation; a data modification input for modifying the at least one virtual object; and an avatar modification input for modifying the user avatar; modifying, the virtual presentation based on the received presentation alteration input to generate a modified virtual representation; and rendering, the modified virtual presentation on each of the plurality of user devices via the cloud platform.
  • each user device, individually, is associated with a sensor module, wherein the sensor module detects at least one gesture input to interact with the presented virtual presentation.
  • the sensor module comprises at least one sensor selected from a depth camera, a motion tracking device, an ultrasonic sensor and an optical flow sensor.
  • the data server or the cloud server enables bi-directional data synchronization of the project data so that changes made within the virtual presentation are updated in the stored project data.
  • the virtual presentation comprises at least one notification corresponding to a deadline, a critical issue associated with the project, and a project milestone.
  • each virtual object corresponds to at least one agile ceremony.
  • each user device is associated with a haptic device to provide haptic feedback to the user based on interactions with the at least one virtual object.
  • the data display unit is selected from a virtual reality device, an augmented reality device, an extended reality device and a holographic projection device.
  • each user is selected from: a host user who is associated with at least one host control; and at least one guest user, wherein each guest user is associated with at least one guest control.
  • the virtual presentation enables display of a combination of a geospatial visualization and a contextual overlay, allowing the at least one user to view agile management data in relation to a physical location.
  • the presentation alteration input enables creation and manipulation of a virtual sticky note within the virtual presentation.
  • each avatar represents a category of the user.
  • FIG. 1 illustrates a system architecture for interactive presentation of project data as a virtual presentation and components/elements thereof, in accordance with embodiments of present disclosure.
  • FIG. 2 illustrates a method for interactive presentation of project data as a virtual presentation, in accordance with embodiments of present disclosure.
  • FIG. 3A illustrates virtual presentation displayed on user device, in accordance with embodiments of present disclosure.
  • FIG. 3B illustrates engagement of user with virtual presentation to prepare modified virtual presentation, in accordance with embodiments of present disclosure.
  • FIG. 3C illustrates modified virtual presentation, in accordance with embodiments of present disclosure.
  • Exemplary system architecture 100 (interchangeably referred to as system 100) for interactive representation of project data as a virtual presentation is depicted.
  • the system 100 provides an immersive and interactive virtual environment (also referred to as a virtual simulated environment) for the users to collaborate on project data.
  • the system 100 encompasses several components to enable seamless collaboration among multiple users.
  • the system 100 may include a data server 102 and a processing server 104; the servers 102 and 104 are communicably coupled with each other.
  • the data server 102 may store the project data such as text, images, multimedia, videos, 3D models, and the like.
  • the data server 102 may transmit the stored project data to the processing server 104.
  • the processing server 104, upon receiving the stored project data, generates a virtual presentation.
  • the processing server 104 transmits the generated virtual presentation to a cloud platform 108.
  • the cloud platform 108 can be communicably coupled with data server 102, processing server 104, and user devices 106-1, 106-2, ... 106-n (collectively or individually referred to as user devices 106 or user device 106). Each user device 106-1, 106-2, ... 106-n can be operated/controlled, individually, by each user.
  • the cloud platform 108 can facilitate real-time synchronization and rendering of the virtual presentation on the user devices 106, in a virtual simulative environment.
  • the virtual presentation may include one or more virtual objects that represent the project data. These objects can be visual, auditory, or a combination of both, and can be interactive or static.
  • the virtual presentation also includes a multitude of user avatars, each representing an individual user from the group of users participating in the project.
  • the cloud platform 108 ensures that all users experience a consistent and real-time view of the virtual presentation, enabling seamless collaboration and interaction among user devices 106.
  • the system 100 utilizes platform 108 for efficient communication of project data, modifications, and user inputs, thereby providing a smooth and synchronized virtual environment to all the user devices 106.
  • each user device 106 can be controlled by a different user.
  • Each user device 106 may include, individually, a data reception unit to receive the transmitted virtual presentation and a data display unit to display the received virtual presentation to the user.
  • the displayed virtual presentation on each user device 106 enables the users to view and interact with the virtual objects and other user avatars.
  • the data input unit on each user device 106 facilitates receiving of presentation alteration input from users. Such input can be in the form of a presentation modification input, a data modification input, or an avatar modification input.
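For illustration only, the following TypeScript sketch models the three alteration-input categories above as a discriminated union so the cloud platform can route each input to the right handler with exhaustive type checking. Every type and field name here is an assumption added for the sketch, not part of the disclosure.

```typescript
// Illustrative model of the three presentation-alteration input categories.
interface PresentationModificationInput {
  kind: "presentation";
  layout?: string;                     // e.g. "grid" | "circle"
  lighting?: { intensity: number };    // environmental settings
  background?: string;
}

interface DataModificationInput {
  kind: "data";
  objectId: string;                    // which virtual object to change
  operation: "add" | "edit" | "delete";
  payload?: unknown;                   // new or edited content
}

interface AvatarModificationInput {
  kind: "avatar";
  userId: string;
  appearance?: Record<string, string>; // e.g. { badge: "UX Expert" }
  position?: { x: number; y: number; z: number };
}

// The data input unit would emit one of these per user action.
type PresentationAlterationInput =
  | PresentationModificationInput
  | DataModificationInput
  | AvatarModificationInput;

// The discriminant "kind" lets a handler narrow each case safely.
function describe(input: PresentationAlterationInput): string {
  switch (input.kind) {
    case "presentation": return "layout/environment change";
    case "data": return `object ${input.objectId}: ${input.operation}`;
    case "avatar": return `avatar update for ${input.userId}`;
  }
}
```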
  • the presentation modification input includes customization of the virtual presentation, on the user device 106.
  • the user can utilize user device 106 to rearrange virtual objects, alter the layout, or adjust environmental settings such as lighting, background, and sound.
  • the presentation modification input can be received via touchscreen, gesture, buttons, and the like.
  • the presentation modification input allows users to tailor the presentation according to their preferences or specific project requirements. A tailored presentation enhances the user experience, making the virtual environment more engaging, comfortable, and efficient. Additionally, such presentation creates a more conducive atmosphere for collaboration, discussion, and decision making.
  • the data modification input emphasizes the alteration of virtual objects that symbolize the project data.
  • the user can add, delete, or edit content of the virtual objects, thereby enabling dynamic collaboration among all user devices 106 in real time.
  • the users can quickly address new information, changes, or issues that arise during the project lifecycle, ensuring that everyone stays informed and engaged.
  • the avatar modification input enables users to modify their user avatars, granting them the ability to personalize their appearance and change their position within the virtual presentation. These customization options enhance the sense of presence and engagement within the virtual environment, making interactions feel more natural and immersive.
  • the cloud platform 108 receives the transmitted virtual presentation from the processing server 104, as well as the presentation alteration input from user device 106. Based on the received input, the cloud platform 108 modifies the virtual presentation, to generate a modified virtual representation. This ensures that all user devices 106 are synchronized and display the most updated version of the virtual presentation in real time.
  • the modified virtual presentation is then rendered on each of the user devices 106, allowing users to see and interact with the updated virtual environment.
  • the cloud platform 108’s real time rendering capability ensures that any changes made by user are instantly visible on other user devices 106.
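A minimal relay sketch, assuming a Node.js environment and the open-source `ws` WebSocket library (neither is named in the disclosure), of how the cloud platform might fan one user's alteration input out to every other connected device:

```typescript
// Minimal relay sketch: each alteration input received from one device is
// re-broadcast to all other connected devices, so changes made by one user
// become visible on every other user device in near real time.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket: WebSocket) => {
  socket.on("message", (raw) => {
    // A production system would validate the input and update the
    // canonical presentation state before re-broadcasting.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(raw.toString());
      }
    }
  });
});
```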
  • virtual presentation in a simulated environment is a combination of various types of data, including, but not limited to, computer-generated graphics, audio, and sometimes haptic feedback, to deliver an immersive and interactive experience for users.
  • an exemplary virtual presentation may involve contextual analysis for presenting the virtual objects, layout, avatars, design, and the like which form part of the virtual presentation. Additionally, animation, audio, and multimedia data can be incorporated to bring the virtual presentation to life.
  • virtual simulative environment refers to a realistic, immersive, and interactive 3D space that replicates real-world scenarios or presents agile project data in a visually engaging manner.
  • Such generation of virtual simulative environment utilizes 3D modelling, texturing, lighting, animation, and interactivity.
  • the 3D modelling involves the construction of digital 3D objects, scenes, and characters through specialized tools to create polygonal meshes, NURBS surfaces, or other geometric primitives that define the shape and structure of objects.
  • the created 3D models can be further improved through texturing and material assignment to add realism and visual appeal to the objects. Textures are 2D images that are mapped onto the 3D models' surfaces, while materials define properties such as color, transparency, and reflectivity.
  • Lighting plays a crucial role in creating a sense of depth, mood, and atmosphere within the virtual environment. It involves placing light sources strategically to illuminate the scene, generate shadows, and emphasize specific areas or objects. Lighting techniques can range from simple point lights to more complex global illumination algorithms that simulate real-world light behaviour. Further, the cloud platform 108 can blend animation into the 3D models to bring the virtual environment to life, creating realistic and dynamic movement within the scene. As a last step, the cloud platform 108 displays the virtual simulative environment with one or more visual cues, accessible menus, and user-friendly controls that allow users to perform actions and access information within the virtual environment seamlessly.
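As a concrete illustration of the modelling, material, lighting, and animation steps above, here is a sketch using the open-source three.js library; the library choice and all values are assumptions, not the disclosed implementation:

```typescript
import * as THREE from "three";

// Scene and camera for the virtual simulative environment.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);
camera.position.z = 5;

// 3D modelling: a polygonal mesh built from a geometric primitive.
const geometry = new THREE.BoxGeometry(1, 1, 1);

// Material assignment: colour, transparency, reflectivity.
const material = new THREE.MeshStandardMaterial({
  color: 0x2194ce,
  transparent: true,
  opacity: 0.9,
  roughness: 0.4,
});
const virtualObject = new THREE.Mesh(geometry, material);
scene.add(virtualObject);

// Lighting: a strategically placed point light plus a soft ambient fill.
const pointLight = new THREE.PointLight(0xffffff, 1.0);
pointLight.position.set(2, 3, 4);
scene.add(pointLight, new THREE.AmbientLight(0x404040));

// Animation: rotate the object each frame to bring the scene to life.
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
renderer.setAnimationLoop(() => {
  virtualObject.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```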
  • virtual simulated environment is a digital space created using computer graphics, sound, and haptic feedback to simulate a variety of real-world or imagined scenarios as a virtual presentation. It can be experienced using specialized hardware, such as a VR headset and motion-tracking gloves, or through simpler means like personal computers and smartphones. Virtual environments often utilize artificial intelligence and/or machine learning algorithms to create realistic and responsive virtual characters or objects, which users can interact with in real time.
  • the virtual presentation can be used to showcase a 3D model of a building or structure. Users can navigate through the virtual environment, inspecting various aspects of the design and making modifications as needed. This interactive approach allows architects, engineers, and stakeholders to work together, discuss design options, and make informed decisions.
  • the system 100 can be employed to visualize and collaborate on 3D models of products, such as vehicles, electronics, or furniture. Designers and engineers can interact with the virtual objects, making adjustments to the design and testing different configurations. The real time collaboration aspect of the system 100 enables faster decision-making and a more efficient design process.
  • the cloud platform 108 can provide a secure and centralized location for storing and managing project data. This facilitates easy data retrieval, sharing, and backup, as well as rendering of updated version and access management.
  • the cloud platform 108 can allow efficient scalability, as the system 100 can accommodate a varying number of users and project data without compromising performance.
  • the system 100 and user device 106 can be designed to support multiple input methods, such as keyboard and mouse, touchscreens, or even virtual reality (VR) and augmented reality (AR) devices. This flexibility ensures that users can interact with the virtual presentation in a manner that best suits their needs and preferences.
  • one or more artificial intelligence (AI) and machine learning (ML) mechanisms can be employed in the system 100. These mechanisms can be employed to analyse project data, identify patterns and trends, and provide insights and recommendations to users. This added layer of intelligence can enhance decision making and further streamline the collaboration process.
  • each user device 106 can be associated with a sensor module designed to detect at least one gesture input, for interacting with the displayed virtual presentation.
  • gesture-based input enables the system 100 to allow users to engage with the virtual environment using natural, intuitive movements, such as hand gestures, body position, or orientation. The users can provide inputs to navigate, manipulate, and collaborate within the virtual presentation effortlessly.
  • the sensor module can be exemplified by a depth camera, a motion tracking device, an ultrasonic sensor, an optical flow sensor, or a gesture sensor.
  • the depth camera can capture movement of hands and body in three-dimensional space.
  • Motion tracking device can track the user's movements in real-time, allowing for smooth and responsive interaction with the virtual presentation.
  • Ultrasonic sensors use sound waves to measure distances and detect motion to recognize user gestures.
  • Optical flow sensors capture the motion of objects in the field of view, tracking the user's movements and interactions. The collected data from these sensors can be processed and translated into interactions within the virtual presentation.
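A hedged sketch of how raw sensor samples might be translated into a gesture event; the sample shape, thresholds, and gesture names are illustrative assumptions rather than the disclosed processing pipeline:

```typescript
// A sensor sample: hand position over time, as a depth camera or optical
// flow sensor might report it. Units are assumed (metres, seconds).
interface Sample { x: number; y: number; z: number; t: number }

type Gesture = "swipe-left" | "swipe-right" | "none";

// Classify a window of samples: a fast, mostly horizontal movement is
// treated as a swipe. Thresholds are placeholders for calibration.
function classify(samples: Sample[]): Gesture {
  if (samples.length < 2) return "none";
  const first = samples[0];
  const last = samples[samples.length - 1];
  const dx = last.x - first.x;
  const dt = Math.max(last.t - first.t, 1e-3);
  const speed = Math.abs(dx) / dt;
  if (speed > 0.5 && Math.abs(dx) > 0.2) {
    return dx > 0 ? "swipe-right" : "swipe-left";
  }
  return "none";
}
```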
  • the data server 102 or the cloud platform 108 can implement bidirectional data synchronization of the project data to accommodate changes made within the virtual presentation.
  • This real time synchronization ensures that any updates or modifications made by users, such as altering designs or adding new information, are accurately reflected in the stored project data on the cloud platform 108.
  • This bi-directional data synchronization plays a vital role in maintaining consistency and accuracy throughout the collaborative process. As multiple users interact with the virtual presentation simultaneously, their individual changes are synchronized with the project data in real time. This assists all participants to be in sync with the latest/updated information.
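The disclosure does not specify a conflict-resolution strategy for this bi-directional synchronization; the sketch below assumes a simple last-write-wins rule per virtual object, which is one common choice when multiple users edit simultaneously:

```typescript
// Each virtual object carries the timestamp of its last modification.
interface VersionedObject {
  id: string;
  updatedAt: number; // epoch millis of the last modification
  data: unknown;
}

// Merge a change arriving from a user device into the stored project data:
// the newer edit wins, in both directions (device -> store, store -> device).
function merge(store: Map<string, VersionedObject>, incoming: VersionedObject): void {
  const current = store.get(incoming.id);
  if (!current || incoming.updatedAt > current.updatedAt) {
    store.set(incoming.id, incoming);
  }
}
```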
  • the virtual presentation includes at least one notification related to a deadline or a project milestone to aid the users with information pertaining to important events and deadlines that require attention. Accordingly, the users can prioritize tasks and allocate resources to meet deadlines.
  • each virtual object represents at least one agile ceremony, such as sprint planning, daily stand-ups, sprint reviews, or sprint retrospectives.
  • users can engage with these ceremonies to transform traditional, static agile ceremonies into dynamic, collaborative experiences within the virtual presentation.
  • the users can interact with these virtual objects to understand the project's progress and goals, which elevates agile project management by offering a more engaging and interactive way of conducting and participating in agile ceremonies.
  • each user device 106 can be operatively coupled to a haptic device that delivers haptic feedback to the user based on their interactions with virtual objects in the environment. This feature significantly enhances the realism and immersion of the virtual experience, as users can "feel" the objects they interact with through tactile sensations.
  • the system 100 goes beyond visual and auditory stimuli to create a more engaging and interactive experience for users. This multi-sensory approach deepens the user's connection to the virtual environment and facilitates more intuitive and natural interactions.
  • the data display unit can be any device which can display the virtual environment.
  • Non-limiting examples of the data display unit include a VR device, an AR device, an extended reality device, and a holographic projection device.
  • the system 100 can adapt, from immersive VR experiences to contextually rich AR overlays or holographic projections.
  • each user can be categorized as either a host user or a guest user.
  • Host users are associated with host controls, which grant them the ability to manage and control the virtual presentation. This includes functions such as modifying virtual objects, setting permissions, or inviting guest users.
  • the guest users may have access to guest controls, which allow them to interact with the virtual presentation within the boundaries set by the host user's permissions and restrictions.
  • the distinction between host user and guest user ensures an appropriate level of control and access is granted to each participant.
  • the system 100 thereby maintains security and prevents unauthorized changes within the virtual presentation.
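One way the host/guest split could be enforced is a role-to-action permission table, sketched below. The role and action names are assumptions; the only requirement from the disclosure is that guest controls operate within boundaries set by the host:

```typescript
type Role = "host" | "guest";
type Action = "modifyObject" | "setPermissions" | "inviteGuest" | "view";

// Host controls are a superset of guest controls.
const allowed: Record<Role, ReadonlySet<Action>> = {
  host: new Set<Action>(["modifyObject", "setPermissions", "inviteGuest", "view"]),
  guest: new Set<Action>(["modifyObject", "view"]), // within host-set boundaries
};

function can(role: Role, action: Action): boolean {
  return allowed[role].has(action);
}

// Example: a guest may modify objects but may not invite other guests.
console.log(can("guest", "modifyObject")); // true
console.log(can("guest", "inviteGuest"));  // false
```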
  • the virtual presentation combines geospatial visualization with contextual overlay, enabling users to view agile management data in relation to a physical location.
  • This spatial context plays a vital role when dealing with projects that span multiple locations.
  • the fusion of geospatial visualization and contextual overlay in the virtual presentation allows users to analyse project data at a more granular level.
  • sticky notes can be utilized in system 100.
  • the presentation alteration input enables users to create and manipulate virtual sticky notes within the virtual presentation.
  • the virtual sticky notes can be used for brainstorming, planning, and organization of project data in the virtual environment.
  • Through user device 106, the user can create, edit, and move these notes, allowing them to visually organize their thoughts, ideas, and tasks in a collaborative virtual presentation.
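A minimal sketch of the sticky-note model and the create, edit, and move operations described above; all names and shapes are illustrative:

```typescript
interface StickyNote {
  id: string;
  text: string;
  position: { x: number; y: number; z: number }; // placement in the 3D scene
  author: string;
}

const notes = new Map<string, StickyNote>();

// Create: register a new note in the shared presentation state.
function createNote(note: StickyNote): void {
  notes.set(note.id, note);
}

// Edit: change a note's text in place.
function editNote(id: string, text: string): void {
  const n = notes.get(id);
  if (n) n.text = text;
}

// Move: reposition a note within the virtual presentation.
function moveNote(id: string, position: StickyNote["position"]): void {
  const n = notes.get(id);
  if (n) n.position = position;
}
```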
  • a software development company is working on a complex web application with multiple teams distributed across the globe.
  • Alice, as a project manager, organizes a progress review meeting to discuss the project data using the cloud platform 108-based interactive virtual presentation, implemented via system 100.
  • the data server 102 can be arranged to store the project data, including project specifications, sprint plans, design mock-ups, and other project-related information.
  • the processing server 104 generates a virtual presentation by using the project data received from data server 102, with the following features:
  • the graph can be dynamically adjusted based on user input during the meeting;
  • users can access the virtual presentation through their user devices 106, such as VR headsets, AR glasses, and holographic projection devices.
  • Bob, a team lead, notices that the task "Implement User Authentication" in the 'In Progress' column requires more time and resources due to unforeseen technical challenges.
  • Bob can provide gesture input (for instance, rotating the hand five times in a clockwise direction) to transmit a data modification input. He can extend the task's deadline by five weeks and allocate additional developer hours to the task.
  • This action updates the virtual object representing the task in the virtual presentation, adjusts the resource allocation graph, and shifts the task's position on the interactive timeline.
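For illustration, Bob's gesture-triggered data modification input could be applied to the underlying task object as sketched below; the field names, dates, and hour values are assumptions added for the example:

```typescript
interface Task {
  title: string;
  status: "To Do" | "In Progress" | "Done";
  deadline: Date;
  developerHours: number;
}

const WEEK_MS = 7 * 24 * 60 * 60 * 1000;

// Apply the modification without mutating the original task object.
function extendTask(task: Task, weeks: number, extraHours: number): Task {
  return {
    ...task,
    deadline: new Date(task.deadline.getTime() + weeks * WEEK_MS),
    developerHours: task.developerHours + extraHours,
  };
}

// Bob extends the deadline by five weeks and adds developer hours.
const auth: Task = {
  title: "Implement User Authentication",
  status: "In Progress",
  deadline: new Date("2023-06-01"),
  developerHours: 80,
};
const updated = extendTask(auth, 5, 40);
```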
  • Carol, a user experience designer, modifies her avatar's appearance to include a "UX Expert" badge, indicating her area of expertise to other team members. She also moves her avatar closer to the "Improve Login Page Design" task, signalling her interest in discussing this topic during the meeting. Carol shares her screen within the virtual environment, showcasing her proposed design improvements, which are then discussed and approved by the team.
  • FIG. 2 illustrates a method for interactive presentation of project data as a virtual presentation, in accordance with embodiments of present disclosure.
  • the method for presenting project data as a virtual presentation enables users to engage with the project information in a dynamic, immersive, and collaborative way.
  • the method involves the generation of a virtual environment comprising virtual objects representing the project data and user avatars for each user.
  • the user can modify the virtual presentation, virtual objects, or their avatars, creating a seamless and engaging experience across multiple user devices 106.
  • the method for interactive presentation of project data as virtual presentation involves the following steps.
  • Step 202 Receiving Project Data
  • Step 202 involves receiving the project data from various sources.
  • This data may include schedules, budgets, resource allocations, progress updates, and other relevant information.
  • the data can be obtained from project management platforms, spreadsheets, databases, or other repositories.
  • Step 204 Generating a Virtual Presentation
  • the virtual presentation may include virtual objects representing the project data and user avatars representing each user individually.
  • the virtual objects may include 3D models, charts, graphs, and other visual representations of the project data.
  • the user avatars are designed to represent the users in the virtual environment, allowing them to interact with the virtual objects and collaborate with other users.
  • Step 206 Receiving the Generated Virtual Presentation at User Devices
  • user devices 106 can include desktop computers, laptops, tablets, smartphones, and VR headsets.
  • the virtual presentation is received, at user device 106, through a secure, cloud-based platform 108 that ensures data integrity and accessibility.
  • Step 208 Displaying the Virtual Presentation on User Devices
  • the data display unit of each user device 106 displays the virtual environment. Users can view and interact with the virtual objects and avatars, exploring the project data in a visually engaging and immersive manner.
  • the virtual presentation can be rendered in 2D or 3D, depending on the capabilities of the user device 106 and the preferences of the user.
  • Step 210 Receiving Presentation Alteration Input
  • presentation alteration inputs can include at least one of: (a) a presentation modification input, (b) a data modification input, and (c) an avatar modification input.
  • Step 212 Modifying the Virtual Presentation
  • the virtual presentation is modified to generate a modified virtual representation.
  • This process involves updating the virtual objects, avatars, and overall presentation layout according to the user's input.
  • the modified virtual representation reflects the most up-to-date project data and user interactions, ensuring that all users have access to accurate and current information.
  • Step 214 Rendering the Modified Virtual Presentation on User Devices
  • the modified virtual presentation is rendered on each of the plurality of user devices 106 by utilizing the cloud platform 108.
  • the cloud platform 108 can also enable users to save their progress and share the modified virtual presentation with others, facilitating seamless communication and teamwork.
  • each user device 106 is equipped with a voice recognition module, which enables users to interact with the virtual presentation using voice commands.
  • With voice recognition, users can easily navigate through the virtual presentation, perform various actions, and get information without the need for physical input, such as typing or clicking.
  • voice recognition technology can enhance the usability and accessibility of the virtual presentation.
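A browser-side sketch using the Web Speech API (an assumption; the disclosure does not name a speech engine) that maps recognised phrases to presentation actions. The phrase-to-action mapping is illustrative:

```typescript
// The recognition half of the Web Speech API is vendor-prefixed in some
// browsers and absent from default TypeScript DOM typings, so it is typed
// loosely here.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

const recognition = new SpeechRecognitionImpl();
recognition.continuous = true; // keep listening across multiple commands

recognition.onresult = (event: any) => {
  const phrase: string =
    event.results[event.results.length - 1][0].transcript.trim();
  // Map recognised phrases to presentation actions (names are illustrative).
  if (/next slide/i.test(phrase)) console.log("advance presentation");
  if (/show timeline/i.test(phrase)) console.log("focus interactive timeline");
};

recognition.start();
```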
  • the method can also involve tracking user interaction level with the virtual presentation, such as the frequency of modifications or the types of interactions performed. This collected data can be analysed to generate insights that can help to improve project efficiency or team collaboration. For example, if users frequently make certain types of changes, a notification can be issued for better project organization or communication.
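A sketch of that interaction-level tracking: modification counts per user feed a simple threshold rule that issues the kind of notification described above. The threshold value and message are illustrative assumptions:

```typescript
type InteractionKind = "presentation" | "data" | "avatar";

// Per-user counters for each type of interaction.
const counts = new Map<string, Map<InteractionKind, number>>();

function track(userId: string, kind: InteractionKind): void {
  const perUser = counts.get(userId) ?? new Map<InteractionKind, number>();
  perUser.set(kind, (perUser.get(kind) ?? 0) + 1);
  counts.set(userId, perUser);

  // Illustrative rule: very frequent data edits may indicate the project
  // plan needs reorganizing, so issue a notification.
  if (kind === "data" && (perUser.get(kind) ?? 0) > 50) {
    console.log(`notify: consider reorganizing tasks for ${userId}`);
  }
}
```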
  • each user device 106 can be equipped with a sensor module that can detect gesture inputs.
  • the sensors can recognize various hand gestures, such as swipes, pinches, and rotations, which users can use to interact with the virtual presentation.
  • By using natural gestures, users can navigate through the presentation and make modifications to the content with ease, providing a more intuitive and engaging experience.
  • the use of gesture inputs also eliminates the need for traditional input methods like a mouse or keyboard, making the presentation accessible to a wider range of users. Overall, the integration of gesture recognition technology can enhance the usability and user-friendliness of virtual presentations.
  • FIG. 3A illustrates virtual presentation displayed on user device, in accordance with embodiments of present disclosure. As illustrated, a user is present in front of the user device 106. The user can interact with the virtual presentation, which is displayed on user device 106. The user can provide presentation modification input, data modification input, and avatar modification input on user device 106.
  • FIG. 3B illustrates engagement of user with virtual presentation to prepare modified virtual presentation, in accordance with embodiments of present disclosure.
  • within a predetermined 3D location close to the user (an activation zone), hand movement can be recognised as gesture input.
  • Different hand motions can be tracked and recognised by a camera 302 when the hand is in the activation zone.
  • the hand motion can be calibrated against previously established user input to carry out actions.
  • the user input modifies the displayed virtual presentation into a modified virtual presentation. Simultaneously, movement of the other hand or other body parts can also be recognized.
  • single-handed gestures may be considered as interacting with data representations, such as viewing or changing the associated data.
  • combined gestures of the user's dominant hand drawing on their non-dominant hand's palm may be linked to user inputs related to avatar navigation within the virtual environment around data representations.
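A sketch of the activation-zone check of FIG. 3B: a hand sample is only forwarded to gesture classification while it lies inside a predetermined 3D region near the user. The zone bounds are illustrative values, not disclosed parameters:

```typescript
interface Point3D { x: number; y: number; z: number }
interface Zone { min: Point3D; max: Point3D }

// An assumed activation zone in front of the user (metres).
const activationZone: Zone = {
  min: { x: -0.3, y: 0.8, z: 0.2 },
  max: { x: 0.3, y: 1.6, z: 0.7 },
};

// Axis-aligned bounding-box containment test.
function inZone(hand: Point3D, zone: Zone): boolean {
  return (
    hand.x >= zone.min.x && hand.x <= zone.max.x &&
    hand.y >= zone.min.y && hand.y <= zone.max.y &&
    hand.z >= zone.min.z && hand.z <= zone.max.z
  );
}

// Camera 302 samples the hand position; gestures are only classified
// while the hand stays inside the activation zone.
function onHandSample(hand: Point3D): void {
  if (inZone(hand, activationZone)) {
    // feed the sample to a gesture classifier such as the one sketched earlier
  }
}
```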
  • FIG. 3C illustrates modified virtual presentation, in accordance with embodiments of present disclosure.
  • the displayed virtual environment of FIG. 3A is updated and the modified virtual presentation is displayed.
  • the modified virtual presentation, via cloud platform 108, is updated on all user devices 106 in real time.
  • non-transitory computer-readable storage medium containing executable instructions can be used to enable an interactive presentation of project data as a virtual presentation. When executed by a microprocessor, these instructions facilitate the generation, modification, and rendering of the virtual presentation on a plurality of user devices 106.
  • the system architecture comprises a server arrangement, a non-transitory storage device, and a microprocessor for executing the routines.
  • the server arrangement can be understood as a combination of data server 102 and processing server 104.
  • the data server 102 and processing server 104 can be communicably coupled through cloud platform 108 to store and manage the project data, virtual presentation, and user interactions.
  • the cloud platform 108 can process the project data, generate the virtual presentation, and synchronize the virtual presentation across multiple user devices 106.
  • the server arrangement utilizes a non-transitory storage device that is arranged to store a set of executable routines. These routines encompass various functions and processes required to facilitate the interactive presentation of project data as a virtual presentation.
  • the microprocessor can be coupled to the non-transitory storage device and is responsible for executing the set of routines.
  • the microprocessor receives the project data, processes it, and generates the virtual presentation based on the executable instructions.
  • the non-transitory computer-readable storage medium contains executable instructions for the following processes:
  • the term ‘user’ as used herein relates to any entity including a person (i.e., human being) or a virtual personal assistant (an autonomous program or a bot) using a device and/or system 100 and method described herein.
  • the term “user device” 106 relates to an electronic device associated with (or used by) user that is capable of enabling the user to perform specific tasks associated with the aforementioned system/method.
  • the user device 106 is intended to be broadly interpreted to include any electronic device that may be used for voice and/or data communication over a wireless communication network. Examples of user device 106 include, but are not limited to, cellular phones, personal digital assistants (PDAs), handheld devices, wireless modems, laptop computers, personal computers, etc.
  • user device 106 may alternatively be referred to as a mobile station, a mobile terminal, a subscriber station, a remote station, a user terminal, a terminal, a subscriber unit, an access terminal, etc.
  • the user device 106 includes a casing, a memory, a processor, a network interface card, a microphone, a speaker, a keypad, and a display.
  • the user device 106 is to be construed broadly, so as to encompass a variety of different types of mobile stations, subscriber stations or, more generally, communication devices, including examples such as a combination of a data card inserted in a laptop.
  • the data server 102 and processing server 104 can be operatively or communicably coupled with a network.
  • network relates to an arrangement of interconnected programmable and/or non-programmable components that are configured to facilitate data communication between one or more electronic devices and/or databases, whether available or known at the time of filing or as later developed.
  • the network may include, but is not limited to, one or more peer-to-peer networks, hybrid peer-to-peer networks, local area networks (LANs), radio access networks (RANs), metropolitan area networks (MANs), wide area networks (WANs), all or a portion of a public network such as the global computer network known as the Internet, a private network, a cellular network, and any other communication system or systems at one or more locations.
  • the network includes wired or wireless communication that can be carried out via any number of known protocols, including, but not limited to, Internet Protocol (IP), Wireless Access Protocol (WAP), Frame Relay, or Asynchronous Transfer Mode (ATM).
  • any other suitable protocols using voice, video, data, or combinations thereof can also be employed.
  • an avatar is a graphical representation of a user, often taking the form of a 2D or 3D character or icon.
  • Avatars are commonly used in digital communication platforms, allowing users to express their identities and interact with others in a more engaging manner. By providing a unique and relatable identity, avatars encourage users to engage in meaningful communication and form lasting connections with others in the virtual presentation. In virtual environments, avatars facilitate user interaction, allowing individuals to communicate and collaborate.
  • Avatars provide a sense of presence and personalization, as users can customize their digital selves with a range of features, including appearance, clothing, and accessories, to reflect their identity, interests, or aspirations.
  • any or a combination of artificial intelligence or machine learning mechanisms such as decision tree learning, Bayesian network, deep learning, random forest, support vector machines, reinforcement learning, prediction models, Statistical Algorithms, Classification, Logistic Regression, Support Vector Machines, Linear Discriminant Analysis, K-Nearest Neighbours, Decision Trees, Random Forests, Regression, Linear Regression, Support Vector Regression, Logistic Regression, Ridge Regression, Partial Least-Squares Regression, Non-Linear Regression, Clustering, Hierarchical Clustering - Agglomerative, Hierarchical Clustering - Divisive, K-Means Clustering, K-Nearest Neighbours Clustering, EM (Expectation Maximization) Clustering, Principal Components Analysis Clustering (PCA), Dimensionality Reduction, Non-Negative Matrix Factorization (NMF), Kernel PCA, Linear Discriminant Analysis (LDA), Generalized Discriminant Analysis (kernel trick again), and ensemble AI methods can be employed.
  • Example embodiments herein have been described above with reference to block diagrams and flowchart illustrations of methods and apparatuses. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including hardware, software, firmware, and a combination thereof. For example, in one embodiment, each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations can be implemented by computer program instructions.
  • These computer program instructions may be loaded onto a general- purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
  • processing means or ‘microprocessor’ or ‘processor’ or ‘processors’ includes, but is not limited to, a general purpose processor (such as, for example, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a microprocessor implementing other types of instruction sets, or a microprocessor implementing a combination of types of instruction sets) or a specialized processor (such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), or a network processor).

Abstract

A system for interactive presentation of project data as a virtual presentation, enabling real-time collaboration among multiple users. The system comprises a data server for storing project data, a processing server for generating the virtual presentation with virtual objects and user avatars, multiple user devices, and a cloud platform. A user can modify the virtual presentation, virtual objects, and avatars. The cloud platform synchronizes modifications in real time across all user devices.

Description

VIRTUAL COLLABORATION AND PRESENTATION SYSTEM
CROSS REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit under 35 U.S.C. §119(e) of co-pending U.S. Provisional Application No. 63/340,836 entitled “SYSTEMS AND METHODS FOR INTERACTIVE VISUALIZATION OF AGILE MANAGEMENT DATA” filed May 11, 2022, which is incorporated herein by reference.
FIELD
[0002] The present disclosure relates to virtual collaboration and presentation systems. More specifically, the disclosure pertains to a system and method to allow multiple users to interactively engage with project data through a virtual environment, represented by virtual objects and user avatars.
BACKGROUND
[0003] The background description includes information that may be useful in understanding the present disclosure/invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0004] In recent years, the shift towards remote work, virtual teams, and distributed organizations has created a strong demand for advanced collaboration tools. These tools are well crafted to accommodate the needs of users working from different geo-coordinates. This shift is driven by a plethora of factors, including digitization, amplified globalization, and the need for cost-effective and flexible work arrangements. As a result, organizations are more inclined towards remote work policies and seeking innovative ways to maintain effective communication and collaboration among team members.
[0005] Generally, video conferencing and screen sharing have been embraced to address the needs of remote workers. However, they often lack the interactivity and immersion required to fully engage users in a collaborative experience. Participants in video conferences may find it difficult to express their ideas and contribute effectively to the discussion due to limited interaction options, while screen sharing may not provide an optimal view of the shared content, leading to disengagement and miscommunication.
[0006] To address these shortcomings, virtual presentation systems are primarily utilized. The existing virtual presentation systems have attempted to provide three-dimensional environments where users can interact on various topics of interest, for instance, lab research, clinical data, project data, etc. Effective project data management and collaboration are essential for organizations to achieve their goals in today's competitive business landscape. To make informed decisions, project stakeholders need access to various types of project data, such as tasks, timelines, resources, and progress. The virtual presentation enables geographically separated users to efficiently convey project data and obtain desired output.
[0007] However, despite the huge potential of virtual presentations, existing solutions often lack interactivity, lead to suboptimal user experiences, and cannot accommodate real-time modifications made by multiple users. In addition to these challenges, existing solutions require tailored hardware and software, making them inaccessible for users with limited resources or technical expertise. These limitations adversely impact the user experience, hinder effective collaboration, and impose additional barriers to entry for organizations and individuals seeking to adopt virtual presentation to communicate and collaborate on project data.
[0008] Thus, there remains a need for further contributions in this area of technology. More specifically, a need exists for a user-friendly and interactive virtual presentation system that can overcome the limitations of current technologies, provide real-time updates, and enhance collaboration and communication among distributed team members. Such a system should provide an immersive and engaging environment that allows users to easily interact with project data and collaborate effectively, regardless of their geo-coordinates or technical expertise.
SUMMARY
[0009] The present disclosure relates to virtual collaboration and presentation systems. More specifically, the disclosure pertains to a system and method to allow multiple users to interactively engage with project data through a virtual presentation/environment, represented by virtual objects and user avatars.
[00010] The following presents a simplified summary of various aspects of this disclosure in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects and is intended to neither identify key or critical elements nor delineate the scope of such aspects. Its purpose is to present some concepts of this disclosure in a simplified form as a prelude to the more detailed description that is presented later.
[00011] The following paragraphs provide additional support for the claims of the subject application.
[00012] In an aspect the present disclosure provides a system for interactive presentation of project data as a virtual presentation, the system comprising: a data server to store the project data; a processing server connected to the data server, wherein the processing server: receives the project data from the data server; generates the virtual presentation of the received project data, wherein the virtual presentation comprises: a plurality of virtual objects representing the project data; and a plurality of user avatars representing, individually, each user of a plurality of users; and transmits the generated virtual presentation; a plurality of user devices, wherein each user device is controlled by at least one user and wherein each user device comprises: a data reception unit to receive the transmitted virtual presentation; a data display unit to present the received virtual presentation; a data input unit to receive a presentation alteration input from at least one user, wherein the received presentation alteration input comprises at least one of: a presentation modification input for modifying the virtual presentation; a data modification input for modifying the at least one virtual object; and an avatar modification input for modifying the user avatar; a cloud platform connected to the data server, the processing server and the plurality of user devices, wherein the cloud platform: receives the transmitted virtual presentation from the processing server; receives the presentation alteration input from each user device; modifies the virtual presentation based on the received presentation alteration input to generate a modified virtual representation; and renders the modified virtual presentation on each of the plurality of user devices.
[00013] In another aspect the present disclosure provides a method for interactive presentation of project data as a virtual presentation, the method comprising: receiving, the project data; generating, a virtual presentation of the project data, wherein the virtual presentation comprises a plurality of virtual objects representing the project data and a plurality of user avatars representing, individually, each user of a plurality of users; receiving, the generated virtual presentation at a plurality of user devices; displaying, the received virtual presentation at a data display unit of each user device; receiving, a presentation alteration input from at least one user on each user device, wherein the received presentation alteration input comprises at least one of: a presentation modification input for modifying the virtual presentation; a data modification input for modifying the at least one virtual object; and an avatar modification input for modifying the user avatar; modifying, the virtual presentation based on the received presentation alteration input to generate a modified virtual representation; and rendering, the modified virtual presentation on each of the plurality of user devices via the cloud platform.
[00014] In an embodiment, each user device, individually, is associated with a sensor module, wherein the sensor module detects at least one gesture input to interact with the presented virtual presentation.
[00015] In an embodiment, the sensor module comprises at least one sensor selected from a depth camera, a motion tracking device, an ultrasonic sensor and an optical flow sensor.
[00016] In an embodiment, the data server or the cloud server enables bi-directional data synchronization of the project data so that changes made within the virtual presentation are updated in the stored project data.
[00017] In an embodiment, the virtual presentation comprises at least one notification corresponding to a deadline, a critical issue associated with the project, and a project milestone.
[00018] In an embodiment, each virtual object corresponds to at least one agile ceremony.
[00019] In an embodiment, each user device is associated with a haptic device to provide haptic feedback to the user based on interactions with the at least one virtual object.
[00020] In an embodiment, the data display unit is selected from a virtual reality device, an augmented reality device, an extended reality device and a holographic projection device.
[00021] In an embodiment, each user is selected from: a host user who is associated with at least one host control; and at least one guest user, wherein each guest user is associated with at least one guest control.
[00022] In an embodiment, the virtual presentation enables display of a combination of a geospatial visualization and a contextual overlay, allowing the at least one user to view agile management data in relation to a physical location.
[00023] In an embodiment, the presentation alteration input enables creation and manipulation of a virtual sticky note within the virtual presentation.
[00024] In an embodiment, each avatar represents a category of the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[00025] The features and advantages of the present disclosure would be more clearly understood from the following description taken in conjunction with the accompanying drawings in which:
[00026] FIG. 1 illustrates a system architecture for interactive presentation of project data as a virtual presentation and components/elements thereof, in accordance with embodiments of present disclosure.
[00027] FIG. 2 illustrates a method for interactive presentation of project data as a virtual presentation, in accordance with embodiments of present disclosure.
[00028] FIG. 3A illustrates virtual presentation displayed on user device, in accordance with embodiments of present disclosure.
[00029] FIG. 3B illustrates engagement of user with virtual presentation to prepare modified virtual presentation, in accordance with embodiments of present disclosure.
[00030] FIG. 3C illustrates modified virtual presentation, in accordance with embodiments of present disclosure.
DETAILED DESCRIPTION
[00031] In the following detailed description of the invention, reference is made to the accompanying drawings that form a part hereof, and in which is shown, by way of illustration, specific embodiments in which the invention may be practiced. In the drawings, like numerals describe substantially similar components throughout the several views. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized, and structural, logical, and electrical changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims and equivalents thereof.
[00032] The use of the terms “a” and “an” and “the” and “at least one” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The use of the term “at least one” followed by a list of one or more items (for example, “at least one of A and B”) is to be construed to mean one item selected from the listed items (A or B) or any combination of two or more of the listed items (A and B), unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
[00033] The present disclosure relates to virtual collaboration and presentation systems. More specifically, the disclosure pertains to a system and method to allow multiple users to interactively engage with project data through a virtual presentation/environment, represented by virtual objects and user avatars.
[00034] FIG. 1 illustrates a system architecture for interactive presentation of project data as a virtual presentation and components/elements thereof, in accordance with embodiments of present disclosure. Exemplary system architecture 100 (interchangeably referred to as system 100) for interactive representation of project data as a virtual presentation is depicted. The system 100 provides an immersive and interactive virtual environment (also referred to as a virtual simulated environment) for the users to collaborate on project data. The system 100 encompasses several components to enable seamless collaboration among multiple users. The system 100 may include a data server 102 and a processing server 104; the servers 102 and 104 are communicably coupled with each other. The data server 102 may store the project data such as text, images, multimedia, videos, 3D models, and the like. The data server 102 may transmit the stored project data to the processing server 104. The processing server 104, upon receiving the stored project data, generates a virtual presentation. The processing server 104 transmits the generated virtual presentation to a cloud platform 108.
[00035] The cloud platform 108 can be communicably coupled with data server 102, processing server 104, and user devices 106-1, 106-2, ... 106-n (collectively or individually referred to as user devices 106 or user device 106). Each user device 106-1, 106-2, ... 106-n can be operated/controlled, individually, by each user. The cloud platform 108 can facilitate real-time synchronization and rendering of the virtual presentation on the user devices 106, in a virtual simulative environment. The virtual presentation may include one or more virtual objects that represent the project data. These objects can be visual, auditory, or a combination of both, and can be interactive or static. The virtual presentation also includes a multitude of user avatars, each representing an individual user from the group of users participating in the project. These avatars allow the users to interact with the virtual presentation and with each other.
[00036] The cloud platform 108 ensures that all users experience a consistent and real-time view of the virtual presentation, enabling seamless collaboration and interaction among user devices 106. The system 100 utilizes platform 108 for efficient communication of project data, modifications, and user inputs, thereby providing a smooth and synchronized virtual environment to all the user devices 106.
[00037] In an embodiment, each user device 106 can be controlled by a different user. Each user device 106 may include, individually, a data reception unit to receive the transmitted virtual presentation and a data display unit to display the received virtual presentation to the user. The displayed virtual presentation on each user device 106 enables the users to view and interact with the virtual objects and other user avatars. The data input unit on each user device 106 facilitates receiving a presentation alteration input from users. Such input can take the form of a presentation modification input, a data modification input, or an avatar modification input; a minimal sketch of how these inputs might be modelled in software follows.
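By way of non-limiting illustration only, the following TypeScript sketch shows one possible data model for the virtual presentation and the three alteration input kinds. All identifiers here (VirtualObject, PresentationAlterationInput, and so on) are assumptions introduced for exposition and form no part of the disclosure or claims.

```typescript
// Illustrative-only data model; every name is an assumption for exposition.
interface Vec3 { x: number; y: number; z: number }

interface VirtualObject {
  id: string;
  kind: "taskCard" | "chart" | "ganttChart" | "stickyNote" | "model3d";
  content: Record<string, unknown>; // the project data carried by the object
  position: Vec3;
}

interface UserAvatar {
  userId: string;
  displayName: string;
  appearance: Record<string, string>; // e.g. badges, clothing, colours
  position: Vec3;
}

// The three alteration input kinds received by the data input unit,
// expressed as a tagged union so the platform can dispatch on `kind`.
type PresentationAlterationInput =
  | { kind: "presentation"; layout?: string; lighting?: string; background?: string }
  | { kind: "data"; objectId: string; patch: Partial<VirtualObject> }
  | { kind: "avatar"; userId: string; patch: Partial<UserAvatar> };

interface VirtualPresentation {
  objects: VirtualObject[];
  avatars: UserAvatar[];
  layout: string;
  version: number; // bumped on every modification (see the sync sketch below)
}
```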
[00038] In an embodiment, the presentation modification input enables customization of the virtual presentation on the user device 106. The user can utilize the user device 106 to rearrange virtual objects, alter the layout, or adjust environmental settings such as lighting, background, and sound. The presentation modification input can be received via a touchscreen, gestures, buttons, and the like, and allows users to tailor the presentation according to their preferences or specific project requirements. A tailored presentation enhances the user experience, making the virtual environment more engaging, comfortable, and efficient. Additionally, it creates a more conducive atmosphere for collaboration, discussion, and decision-making.
[00039] In another embodiment, the data modification input concerns the alteration of the virtual objects that represent the project data. The user can add, delete, or edit the content of the virtual objects, thereby enabling dynamic collaboration among all user devices 106 in real time. The users can quickly address new information, changes, or issues that arise during the project lifecycle, ensuring that everyone stays informed and engaged.
[00040] In a further embodiment, the avatar modification input enables users to modify their user avatars, granting them the ability to personalize their appearance and change their position within the virtual presentation. These customization options enhance the sense of presence and engagement within the virtual environment, making interactions feel more natural and immersive.
[00041] In an embodiment, the cloud platform 108 receives the transmitted virtual presentation from the processing server 104, as well as the presentation alteration input from each user device 106. Based on the received input, the cloud platform 108 modifies the virtual presentation to generate a modified virtual representation. This ensures that all user devices 106 are synchronized and display the most updated version of the virtual presentation in real time.
[00042] In an embodiment, the modified virtual presentation is then rendered on each of the user devices 106, allowing users to see and interact with the updated virtual environment. The real-time rendering capability of the cloud platform 108 ensures that any changes made by one user are instantly visible on the other user devices 106; a sketch of this modify-and-broadcast loop follows.
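The following sketch continues the illustrative types above (the two snippets together form one file) and shows one way the modify-and-broadcast loop could be realized. The broadcast callback stands in for a real transport such as WebSockets; that choice, like every identifier here, is an assumption rather than something the disclosure specifies.

```typescript
// Apply one alteration input to the shared presentation, returning a new copy.
function applyAlteration(
  presentation: VirtualPresentation,
  input: PresentationAlterationInput
): VirtualPresentation {
  const next = structuredClone(presentation); // requires Node 17+ or a browser
  switch (input.kind) {
    case "presentation":
      // Lighting and background settings would be applied analogously.
      if (input.layout !== undefined) next.layout = input.layout;
      break;
    case "data": {
      const obj = next.objects.find((o) => o.id === input.objectId);
      if (obj) Object.assign(obj, input.patch); // shallow update of the object
      break;
    }
    case "avatar": {
      const av = next.avatars.find((a) => a.userId === input.userId);
      if (av) Object.assign(av, input.patch);
      break;
    }
  }
  next.version = presentation.version + 1; // every change bumps the version
  return next;
}

// On each received input, replace the shared state and push it to all devices.
function onInputReceived(
  state: { current: VirtualPresentation },
  input: PresentationAlterationInput,
  broadcast: (p: VirtualPresentation) => void
): void {
  state.current = applyAlteration(state.current, input);
  broadcast(state.current); // every user device renders the same new state
}
```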
[00043] Throughout the present disclosure, a “virtual presentation” in a simulated environment is a combination of various types of data, including but not limited to computer-generated graphics, audio, and sometimes haptic feedback, to deliver an immersive and interactive experience for users. An exemplary virtual presentation may involve contextual analysis for presenting the virtual objects, layout, avatars, design, and the like that form part of the virtual presentation. Additionally, animation, audio, and multimedia data can be incorporated to bring the virtual presentation to life.
[00044] Throughout the present disclosure, “virtual simulative environment” refers to realistic, immersive, and interactive 3D space that replicates real-world scenarios or presents agile project data in a visually engaging manner. Such generation of virtual simulative environment utilizes 3D modelling, texturing, lighting, animation, and interactivity. The 3D modelling involves the construction of digital 3D objects, scenes, and characters through specialized tool to create polygonal meshes, NURBS surfaces, or other geometric primitives that define the shape and structure of objects. The created 3D models can be further improved through texturing and material assignment add realism and visual appeal to the objects. Textures are 2D images that are mapped onto the 3D models surfaces, while materials define properties such as color, transparency, and reflectivity. High-quality textures and materials create a more convincing and immersive virtual environment. Lighting plays a crucial role in creating a sense of depth, mood, and atmosphere within the virtual environment. Additionally, it also involves placement of light sources strategically to illuminate the scene, generate shadows, and emphasize specific areas or objects. Lighting techniques can range from simple point lights to more complex global illumination algorithms that simulate real-world light behaviour. Further, cloud server 108 can enable amalgamation of animation to 3D model brings life to the virtual environment to create realistic and dynamic movement within the scene. As last step, the cloud server 108 enables display of virtual simulative environment with one or more visual cues, accessible menus, and user-friendly controls that allow users to perform actions and access information within the virtual environment seamlessly.
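As a non-limiting sketch of how such a scene might be described in code, the snippet below couples geometry with a material and places an explicit light. The type and field names are assumptions for exposition; a production system would more likely rely on an engine such as three.js or Unity than on hand-rolled types like these.

```typescript
// Illustrative-only scene description for a virtual simulative environment.
interface Material {
  color: string;        // e.g. "#cccccc"
  transparency: number; // 0 (opaque) .. 1 (fully transparent)
  reflectivity: number; // 0 .. 1
  textureUrl?: string;  // 2D image mapped onto the mesh surface
}

interface SceneNode {
  mesh: string;          // reference to a polygonal mesh asset
  material: Material;
  children: SceneNode[]; // scene-graph hierarchy
}

interface Light {
  type: "point" | "directional" | "ambient";
  position?: { x: number; y: number; z: number };
  intensity: number;
}

// A minimal scene: one textured object lit by a single point light.
const scene: { root: SceneNode; lights: Light[] } = {
  root: {
    mesh: "building.glb",
    material: { color: "#cccccc", transparency: 0, reflectivity: 0.2 },
    children: [],
  },
  lights: [{ type: "point", position: { x: 0, y: 5, z: 2 }, intensity: 1.0 }],
};
```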
[00045] Notably, a virtual simulated environment is a digital space created using computer graphics, sound, and haptic feedback to simulate a variety of real-world or imagined scenarios as a virtual presentation. It can be experienced using specialized hardware, such as a VR headset and motion-tracking gloves, or through simpler means such as personal computers and smartphones. Virtual environments often utilize artificial intelligence and/or machine learning algorithms to create realistic and responsive virtual characters or objects, with which users can interact in real time.
[00046] As an exemplary scenario, the virtual presentation can be used to showcase a 3D model of a building or structure. Users can navigate through the virtual environment, inspecting various aspects of the design and making modifications as needed. This interactive approach allows architects, engineers, and stakeholders to work together, discuss design options, and make informed decisions. Similarly, in the realm of product design, the system 100 can be employed to visualize and collaborate on 3D models of products, such as vehicles, electronics, or furniture. Designers and engineers can interact with the virtual objects, making adjustments to the design and testing different configurations. The real-time collaboration aspect of the system 100 enables faster decision-making and a more efficient design process.
[00047] Additionally, the cloud platform 108 can provide a secure and centralized location for storing and managing project data. This facilitates easy data retrieval, sharing, and backup, as well as rendering of updated versions and access management. The cloud platform 108 can allow efficient scalability, as the system 100 can accommodate a varying number of users and volumes of project data without compromising performance. The system 100 and the user devices 106 can be designed to support multiple input methods, such as keyboard and mouse, touchscreens, or even virtual reality (VR) and augmented reality (AR) devices. This flexibility ensures that users can interact with the virtual presentation in a manner that best suits their needs and preferences.
[00048] In an embodiment, one or more artificial intelligence (AI) and machine learning (ML) mechanisms can be employed in the system 100. These mechanisms can be employed to analyse project data, identify patterns and trends, and provide insights and recommendations to users. This added layer of intelligence can enhance decision-making and further streamline the collaboration process.
[00049] In an embodiment, each user device 106 can be associated with a sensor module designed to detect at least one gesture input for interacting with the displayed virtual presentation. Such gesture-based input enables the system 100 to allow users to engage with the virtual environment using natural, intuitive movements, such as hand gestures, body position, or orientation. The users can provide inputs to navigate, manipulate, and collaborate within the virtual presentation effortlessly.
[00050] The sensor module can be exemplified as a depth camera, a motion tracking device, an ultrasonic sensor, an optical flow sensor, or a gesture sensor. The depth camera can capture movement of the hands and body in three-dimensional space. The motion tracking device can track the user's movements in real time, allowing for smooth and responsive interaction with the virtual presentation. Ultrasonic sensors use sound waves to measure distances and detect motion in order to recognize user gestures. Optical flow sensors capture the motion of objects in the field of view, providing data on the user's movements and interactions. The collected data from these sensors can be processed and translated into interactions within the virtual presentation, as illustrated in the sketch below.
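The following hedged sketch shows one way raw sensor samples could be translated into gesture events. The swipe taxonomy and the displacement/speed thresholds are assumptions chosen for illustration; real sensor pipelines would be considerably more elaborate.

```typescript
// One hand-position sample from a depth camera or motion tracker.
interface HandSample {
  t: number; // timestamp in milliseconds
  position: { x: number; y: number; z: number };
}

type Gesture = "swipeLeft" | "swipeRight" | "none";

// Classify a short window of samples: a fast, mostly-horizontal
// displacement is treated as a swipe (thresholds are illustrative).
function classifyGesture(samples: HandSample[]): Gesture {
  if (samples.length < 2) return "none";
  const first = samples[0];
  const last = samples[samples.length - 1];
  const dx = last.position.x - first.position.x; // metres
  const dt = last.t - first.t;                   // milliseconds
  if (dt > 0 && Math.abs(dx) > 0.15 && Math.abs(dx) / (dt / 1000) > 0.5) {
    return dx > 0 ? "swipeRight" : "swipeLeft";
  }
  return "none";
}
```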
[00051] In an embodiment, the data server 102 or the cloud platform 108 can implement bi-directional data synchronization of the project data to accommodate changes made within the virtual presentation. This real-time synchronization ensures that any updates or modifications made by users, such as altering designs or adding new information, are accurately reflected in the stored project data on the cloud platform 108. This bi-directional data synchronization plays a vital role in maintaining consistency and accuracy throughout the collaborative process. As multiple users interact with the virtual presentation simultaneously, their individual changes are synchronized with the project data in real time, keeping all participants in sync with the latest information.
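As a minimal sketch of one possible reconciliation rule for this bi-directional synchronization, the snippet below applies last-write-wins merging between the stored copy and an in-presentation edit. Real systems may prefer CRDTs or operational transforms; this merge rule and all names here are assumptions for exposition.

```typescript
// A value stamped with the time of its last modification.
interface Versioned<T> {
  value: T;
  updatedAt: number; // epoch milliseconds
}

// Last-write-wins: whichever side was written most recently prevails;
// ties keep the stored copy.
function merge<T>(stored: Versioned<T>, edited: Versioned<T>): Versioned<T> {
  return edited.updatedAt > stored.updatedAt ? edited : stored;
}

// Example: a deadline edited inside the presentation overwrites the server copy.
const server = { value: "2023-06-01", updatedAt: 1_000 };
const client = { value: "2023-07-06", updatedAt: 2_000 };
console.log(merge(server, client).value); // "2023-07-06"
```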
[00052] In an embodiment, the virtual presentation includes at least one notification related to a deadline or a project milestone, providing the users with information pertaining to important events and deadlines that require attention. Accordingly, the users can prioritize tasks and allocate resources to meet deadlines.
[00053] In an embodiment, each virtual object represents at least one agile ceremony, such as sprint planning, daily stand-ups, sprint reviews, or sprint retrospectives. By incorporating agile project management elements within the virtual environment, the system transforms traditional, static agile ceremonies into dynamic, collaborative experiences within the virtual presentation. The users can interact with these virtual objects to understand the project's progress and goals, thereby elevating agile project management through a more engaging and interactive way of conducting and participating in agile ceremonies.
[00054] In an embodiment, each user device 106 can be operatively coupled to a haptic device that delivers haptic feedback to the user based on their interactions with virtual objects in the environment. This feature significantly enhances the realism and immersion of the virtual experience, as users can "feel" the objects they interact with through tactile sensations. By incorporating haptic feedback, the system 100 goes beyond visual and auditory stimuli to create a more engaging and interactive experience for users. This multi-sensory approach deepens the user's connection to the virtual environment and facilitates more intuitive and natural interactions.
[00055] In an embodiment, the data display unit can be any device that can display the virtual environment. Non-limiting examples of the data display unit include a VR device, an AR device, an extended reality device, and a holographic projection device. By supporting multiple display options, the system 100 can adapt from immersive VR experiences to contextually rich AR overlays or holographic projections.
[00056] In an embodiment, each user can be categorized as either a host user or a guest user. Host users are associated with host controls, which grant them the ability to manage and control the virtual presentation, including functions such as modifying virtual objects, setting permissions, or inviting guest users. The guest users may have access to guest controls, which allow them to interact with the virtual presentation within the boundaries set by the host user's permissions and restrictions. The distinction between host user and guest user ensures an appropriate level of control and access granted to each participant. By implementing such a hierarchical structure, the system 100 maintains security and prevents unauthorized changes within the virtual presentation; a minimal permission-gating sketch follows.
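By way of a non-limiting sketch, the snippet below gates actions by role. The action names and the rule that only hosts may invite guests or set permissions are assumptions consistent with this embodiment, not claim language.

```typescript
type Role = "host" | "guest";
type Action = "modifyObject" | "setPermissions" | "inviteGuest" | "moveAvatar";

// Guests act only within boundaries the host has set; hosts may do anything.
const guestAllowed: ReadonlySet<Action> = new Set(["modifyObject", "moveAvatar"]);

function isPermitted(role: Role, action: Action): boolean {
  if (role === "host") return true;  // host controls cover every action
  return guestAllowed.has(action);   // guest controls are a host-set subset
}

console.log(isPermitted("guest", "inviteGuest")); // false
console.log(isPermitted("host", "setPermissions")); // true
```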
[00057] In an embodiment, the virtual presentation combines geospatial visualization with a contextual overlay, enabling users to view agile management data in relation to a physical location. This spatial context plays a vital role when dealing with projects that span multiple locations. The fusion of geospatial visualization and contextual overlay in the virtual presentation allows users to analyse project data at a more granular level.
[00058] In an embodiment, to keep projects on track and provide reminders, sticky notes can be utilized in the system 100. The presentation alteration input enables users to create and manipulate virtual sticky notes within the virtual presentation. The virtual sticky notes support brainstorming, planning, and organization of project data in the virtual environment. Through the user device 106, the user can create, edit, and move these notes, allowing them to visually organize their thoughts, ideas, and tasks in a collaborative virtual presentation, as in the sketch below.
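A minimal sticky-note sketch follows; the field names and operations are illustrative assumptions only.

```typescript
interface StickyNote {
  id: string;
  text: string;
  color: string;
  position: { x: number; y: number; z: number };
}

// Create a note at the origin (crypto.randomUUID needs Node 19+ or a browser).
function createNote(text: string, color = "#ffeb3b"): StickyNote {
  return { id: crypto.randomUUID(), text, color, position: { x: 0, y: 0, z: 0 } };
}

// Move a note in the presentation plane, returning a new copy.
function moveNote(note: StickyNote, dx: number, dy: number): StickyNote {
  return {
    ...note,
    position: { ...note.position, x: note.position.x + dx, y: note.position.y + dy },
  };
}

const reminder = moveNote(createNote("Review sprint goals"), 0.5, 0.2);
console.log(reminder.position); // { x: 0.5, y: 0.2, z: 0 }
```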
[00059] For better understanding, an exemplary scenario is illustrated herein. A software development company is working on a complex web application with multiple teams distributed across the globe. Alice, a project manager, organizes a progress review meeting to discuss the project data using the cloud platform 108-based interactive virtual presentation, implemented via the system 100. The data server 102 can be arranged to store the project data, including project specifications, sprint plans, design mock-ups, and other project-related information. The processing server 104 generates a virtual presentation, using the project data received from the data server 102, with the following features (a minimal data-model sketch of the task board follows the list):
[00060] (a) a three-dimensional representation of the project's tasks, with virtual objects representing different task cards, organized by columns for 'To Do,' 'In Progress,' and 'Done.' Task cards can include additional details, such as deadlines, assignees, and priority levels;
[00061] (b) a 3D graph displaying the distribution of resources, such as personnel and budget, across different project components. The graph can be dynamically adjusted based on user input during the meeting;
[00062] (c) a 3D Gantt chart showing project milestones and deadlines, enabling users to visualize the project's progress and make adjustments as needed; and
[00063] (d) customizable avatars representing each user, allowing them to interact with the virtual presentation and communicate with others.
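The sketch below models the scenario's task board and Bob's deadline extension (described in the next paragraph). The column names follow the example; every type, field, and value is an illustrative assumption.

```typescript
type Column = "To Do" | "In Progress" | "Done";

interface TaskCard {
  title: string;
  column: Column;
  deadline: string; // ISO date, e.g. "2023-06-01"
  assignees: string[];
  priority: "low" | "medium" | "high";
  developerHours: number;
}

// Extend a task's deadline by whole weeks and allocate extra developer hours.
// UTC arithmetic keeps the result independent of the local timezone.
function extendTask(task: TaskCard, weeks: number, extraHours: number): TaskCard {
  const d = new Date(task.deadline);
  d.setUTCDate(d.getUTCDate() + 7 * weeks);
  return {
    ...task,
    deadline: d.toISOString().slice(0, 10),
    developerHours: task.developerHours + extraHours,
  };
}

const auth: TaskCard = {
  title: "Implement User Authentication",
  column: "In Progress",
  deadline: "2023-06-01", // hypothetical date for illustration
  assignees: ["Bob"],
  priority: "high",
  developerHours: 80,
};
console.log(extendTask(auth, 5, 40).deadline); // "2023-07-06"
```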
[00064] In an embodiment, during the meeting, users can access the virtual presentation through their user devices 106, such as VR headsets, AR glasses, and holographic projection devices. For example, Bob, a team lead, notices that the task "Implement User Authentication" in the 'In Progress' column requires more time and resources due to unforeseen technical challenges. Bob can provide a gesture input (for instance, rotating a hand five times in a clockwise direction) to transmit a data modification input. He can extend the task's deadline by five weeks and allocate additional developer hours to the task. This action updates the virtual object representing the task in the virtual presentation, adjusts the resource allocation graph, and shifts the task's position on the interactive timeline. Meanwhile, Carol, a user experience designer, modifies her avatar's appearance to include a "UX Expert" badge, indicating her area of expertise to other team members. She also moves her avatar closer to the "Improve Login Page Design" task, signalling her interest in discussing this topic during the meeting. Carol shares her screen within the virtual environment, showcasing her proposed design improvements, which are then discussed and approved by the team.
[00065] The cloud platform 108 receives these presentation alteration inputs and updates the virtual presentation in real time. As a result, all participants can see the changes made by Bob, Carol, and others. After the meeting, the system 100 synchronizes the changes made to the project data, ensuring that all updates are reflected.
[00066] FIG. 2 illustrates a method for interactive presentation of project data as a virtual presentation, in accordance with embodiments of the present disclosure. The method for presenting project data as a virtual presentation enables users to engage with the project information in a dynamic, immersive, and collaborative way. The method generates a virtual environment comprising virtual objects representing the project data and user avatars for each user. The users can modify the virtual presentation, the virtual objects, or their avatars, creating a seamless and engaging experience across multiple user devices 106. The method for interactive presentation of project data as a virtual presentation involves the following steps.
[00067] Step 202: Receiving Project Data
[00068] At step 202, project data is received from various sources. This data may include schedules, budgets, resource allocations, progress updates, and other relevant information. The data can be obtained from project management platforms, spreadsheets, databases, or other repositories.
[00069] Step 204: Generating a Virtual Presentation
[00070] At step 204, a virtual presentation of the project data is generated. The virtual presentation may include virtual objects representing the project data and user avatars representing each user individually. The virtual objects may include 3D models, charts, graphs, and other visual representations of the project data. The user avatars represent the users in the virtual environment, allowing them to interact with the virtual objects and collaborate with other users.
[00071] Step 206: Receiving the Generated Virtual Presentation at User Devices
Once the virtual presentation is generated, it is transmitted to the user devices 106. These user devices 106 can include desktop computers, laptops, tablets, smartphones, and VR headsets. The virtual presentation is received, at each user device 106, through the secure, cloud-based platform 108 that ensures data integrity and accessibility.
[00072] Step 208: Displaying the Virtual Presentation on User Devices
Upon receiving the virtual presentation, the data display unit of each user device 106 displays the virtual environment. Users can view and interact with the virtual objects and avatars, exploring the project data in a visually engaging and immersive manner. The virtual presentation can be rendered in 2D or 3D, depending on the capabilities of the user device 106 and the preferences of the user.
[00073] Step 210: Receiving Presentation Alteration Input
As users navigate the virtual environment, they can provide presentation alteration inputs to modify the virtual presentation. These inputs can include at least one of: (a) a presentation modification input, (b) a data modification input, and (c) an avatar modification input.
[00074] Step 212: Modifying the Virtual Presentation
Based on the received presentation alteration input, the virtual presentation is modified to generate a modified virtual representation. This process involves updating the virtual objects, avatars, and overall presentation layout according to the user's input. The modified virtual representation reflects the most up-to-date project data and user interactions, ensuring that all users have access to accurate and current information.
[00075] Step 214: Rendering the Modified Virtual Presentation on User Devices
[00076] The modified virtual presentation is rendered on each of the plurality of user devices 106 by utilizing the cloud platform 108. The cloud platform 108 can also enable users to save their progress and share the modified virtual presentation with others, facilitating seamless communication and teamwork.
[00077] In an embodiment, each user device 106 is equipped with a voice recognition module, which enables users to interact with the virtual presentation using voice commands. With voice recognition, users can easily navigate through the virtual presentation, perform various actions, and obtain information without the need for physical input, such as typing or clicking. Overall, the integration of voice recognition technology can enhance the usability and accessibility of the virtual presentation; a minimal command-dispatch sketch follows.
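The following sketch shows one way recognized phrases could be mapped to presentation actions. The phrase set is an assumption; a real module would sit on top of a speech-to-text engine, which is omitted here.

```typescript
type VoiceHandler = () => void;

// Illustrative command table: transcript phrase -> action.
const commands = new Map<string, VoiceHandler>([
  ["next slide", () => console.log("advancing presentation")],
  ["show gantt chart", () => console.log("focusing Gantt chart")],
  ["mute microphone", () => console.log("microphone muted")],
]);

// Dispatch a transcript produced by a speech-to-text engine.
function onTranscript(transcript: string): void {
  const handler = commands.get(transcript.trim().toLowerCase());
  if (handler) handler(); // recognized command
  else console.log(`no command matched: "${transcript}"`);
}

onTranscript("Show Gantt Chart"); // prints: focusing Gantt chart
```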
[00078] In an embodiment, the method can also involve tracking the user interaction level with the virtual presentation, such as the frequency of modifications or the types of interactions performed. This collected data can be analysed to generate insights that can help improve project efficiency or team collaboration. For example, if users frequently make certain types of changes, a notification can be issued suggesting better project organization or communication, as sketched below.
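As a hedged sketch, the tracker below counts alteration inputs by kind and flags any kind whose share of all inputs exceeds a threshold. The threshold and the insight rule are assumptions for illustration.

```typescript
type InputKind = "presentation" | "data" | "avatar";

class InteractionTracker {
  private counts = new Map<InputKind, number>();

  // Record one alteration input of the given kind.
  record(kind: InputKind): void {
    this.counts.set(kind, (this.counts.get(kind) ?? 0) + 1);
  }

  // Return the kinds whose frequency exceeds the given share of all inputs.
  insights(share = 0.6): InputKind[] {
    const total = [...this.counts.values()].reduce((a, b) => a + b, 0);
    if (total === 0) return [];
    return [...this.counts.entries()]
      .filter(([, n]) => n / total > share)
      .map(([kind]) => kind);
  }
}

const tracker = new InteractionTracker();
(["data", "data", "data", "avatar"] as InputKind[]).forEach((k) => tracker.record(k));
console.log(tracker.insights()); // ["data"] — could trigger an organization notification
```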
[00079] In an embodiment, each user device 106 can be equipped with a sensor module that can detect gesture inputs. The sensors can recognize various hand gestures, such as swipes, pinches, and rotations, which users can use to interact with the virtual presentation. By using natural gestures, users can navigate through the presentation and make modifications to the content with ease, providing a more intuitive and engaging experience. The use of gesture inputs also eliminates the need for traditional input methods like a mouse or keyboard, making the presentation accessible to a wider range of users. Overall, the integration of gesture recognition technology can enhance the usability and user-friendliness of virtual presentations.
[00080] FIG. 3A illustrates a virtual presentation displayed on a user device, in accordance with embodiments of the present disclosure. As illustrated, the user is present in front of the user device 106. The user can interact with the virtual presentation, which is displayed on the user device 106. The user can provide a presentation modification input, a data modification input, and an avatar modification input on the user device 106.
[00081] FIG. 3B illustrates engagement of a user with the virtual presentation to prepare a modified virtual presentation, in accordance with embodiments of the present disclosure. When a user's hand enters an activation zone, a predetermined 3D location close to the user, the hand movement can be recognised as a gesture input. Different hand motions can be tracked and recognised by a camera 302 while the hand is in the activation zone. The hand motion can be calibrated against previously established user inputs to carry out actions. The user input modifies the displayed virtual presentation into the modified virtual representation. Movements of another hand or of other body parts can also be recognized simultaneously. For instance, single-handed gestures may be treated as interacting with data representations, such as viewing or changing the associated data, whereas combined gestures of the user's dominant hand drawing on the non-dominant hand's palm may be linked to user inputs related to avatar navigation within the virtual environment around data representations. A sketch of the activation-zone test follows.
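The snippet below sketches the activation-zone test: gestures are only interpreted while the tracked hand lies inside a predefined 3D box near the user. The box extents are illustrative assumptions.

```typescript
interface Point3 { x: number; y: number; z: number }
interface Box3 { min: Point3; max: Point3 }

// Hypothetical activation zone in metres, roughly at chest height.
const activationZone: Box3 = {
  min: { x: -0.3, y: 0.8, z: 0.2 },
  max: { x: 0.3, y: 1.4, z: 0.6 },
};

// Axis-aligned containment test for a hand position.
function inZone(p: Point3, box: Box3): boolean {
  return (
    p.x >= box.min.x && p.x <= box.max.x &&
    p.y >= box.min.y && p.y <= box.max.y &&
    p.z >= box.min.z && p.z <= box.max.z
  );
}

// Only hand samples inside the zone are forwarded to gesture recognition.
function onHandSample(hand: Point3, dispatchGesture: (p: Point3) => void): void {
  if (inZone(hand, activationZone)) dispatchGesture(hand);
}

onHandSample({ x: 0, y: 1.0, z: 0.4 }, (p) => console.log("tracking gesture at", p));
```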
[00082] FIG. 3C illustrates the modified virtual presentation, in accordance with embodiments of the present disclosure. Upon receiving the input through the user device 106, the displayed virtual environment (as in FIG. 3A) is modified, and the modified virtual presentation is displayed. The modified virtual presentation is updated, via the cloud platform 108, to all user devices 106 in real time.
[00083] In an exemplary aspect, a non-transitory computer-readable storage medium containing executable instructions can be used to enable an interactive presentation of project data as a virtual presentation. When executed by a microprocessor, these instructions facilitate the generation, modification, and rendering of the virtual presentation on a plurality of user devices 106. The system architecture comprises a server arrangement, a non-transitory storage device, and a microprocessor for executing the routines.
[00084] The server arrangement can be understood as a combination of the data server 102 and the processing server 104, which can be communicably coupled through the cloud platform 108 to store and manage the project data, the virtual presentation, and user interactions. The cloud platform 108 can process the project data, generate the virtual presentation, and synchronize the virtual presentation across multiple user devices 106. The server arrangement utilizes a non-transitory storage device that is arranged to store a set of executable routines. These routines encompass the various functions and processes required to facilitate the interactive presentation of project data as a virtual presentation. The microprocessor can be coupled to the non-transitory storage device and is responsible for executing the set of routines; it receives the project data, processes it, and generates the virtual presentation based on the executable instructions. The non-transitory computer-readable storage medium contains executable instructions for the processes described above with reference to FIG. 2.
[00085] Throughout the present disclosure, the term ‘user’ as used herein relates to any entity including a person (i.e., human being) or a virtual personal assistant (an autonomous program or a bot) using a device and/or system 100 and method described herein.
[00086] Throughout the present disclosure, the term “user device” 106 relates to an electronic device associated with (or used by) a user that is capable of enabling the user to perform specific tasks associated with the aforementioned system/method. Furthermore, the user device 106 is intended to be broadly interpreted to include any electronic device that may be used for voice and/or data communication over a wireless communication network. Examples of the user device 106 include, but are not limited to, cellular phones, personal digital assistants (PDAs), handheld devices, wireless modems, laptop computers, personal computers, etc. Moreover, the user device 106 may alternatively be referred to as a mobile station, a mobile terminal, a subscriber station, a remote station, a user terminal, a terminal, a subscriber unit, an access terminal, etc. Additionally, the user device 106 includes a casing, a memory, a processor, a network interface card, a microphone, a speaker, a keypad, and a display. Moreover, the user device 106 is to be construed broadly, so as to encompass a variety of different types of mobile stations, subscriber stations or, more generally, communication devices, including examples such as a combination of a data card inserted in a laptop.
[00087] The data server 102 and processing server 104 can be operatively or communicably coupled with a network. Throughout the present disclosure, the term “network” relates to an arrangement of interconnected programmable and/or non-programmable components that are configured to facilitate data communication between one or more electronic devices and/or databases, whether available or known at the time of filing or as later developed. Furthermore, the network may include, but is not limited to, one or more peer-to-peer networks, a hybrid peer-to-peer network, local area networks (LANs), radio access networks (RANs), metropolitan area networks (MANs), wide area networks (WANs), all or a portion of a public network such as the global computer network known as the Internet, a private network, a cellular network, and any other communication system or systems at one or more locations. Additionally, the network includes wired or wireless communication that can be carried out via any number of known protocols, including, but not limited to, Internet Protocol (IP), Wireless Access Protocol (WAP), Frame Relay, or Asynchronous Transfer Mode (ATM). Moreover, any other suitable protocols using voice, video, data, or combinations thereof, can also be employed.
[00088] Throughout the present disclosure, the term “avatar” represents a graphical representation of a user, often taking the form of a 2D or 3D character or icon. Avatars are commonly used in digital communication platforms, allowing users to express their identities and interact with others in a more engaging manner. By providing a unique and relatable identity, avatars encourage users to engage in meaningful communication and form lasting connections with others in the virtual presentation. In virtual environments, avatars facilitate user interaction, allowing individuals to communicate and collaborate. Avatars provide a sense of presence and personalization, as users can customize their digital selves with a range of features, including appearance, clothing, and accessories, to reflect their identity, interests, or aspirations.
[00089] In an aspect, any or a combination of artificial intelligence or machine learning mechanisms such as decision tree learning, Bayesian networks, deep learning, random forests, support vector machines, reinforcement learning, prediction models, Statistical Algorithms, Classification, Logistic Regression, Support Vector Machines, Linear Discriminant Analysis, K-Nearest Neighbours, Decision Trees, Random Forests, Regression, Linear Regression, Support Vector Regression, Logistic Regression, Ridge Regression, Partial Least-Squares Regression, Non-Linear Regression, Clustering, Hierarchical Clustering - Agglomerative, Hierarchical Clustering - Divisive, K-Means Clustering, K-Nearest Neighbours Clustering, EM (Expectation Maximization) Clustering, Principal Components Analysis Clustering (PCA), Dimensionality Reduction, Non-Negative Matrix Factorization (NMF), Kernel PCA, Linear Discriminant Analysis (LDA), Generalized Discriminant Analysis (kernel trick again), Ensemble Algorithms, Deep Learning, Reinforcement Learning, AutoML, and the like can be employed.
[00090] Example embodiments herein have been described above with reference to block diagrams and flowchart illustrations of methods and apparatuses. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including hardware, software, firmware, and a combination thereof. For example, in one embodiment, each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations can be implemented by computer program instructions. These computer program instructions may be loaded onto a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
[00091] Throughout the present disclosure, the term ‘processing means’ or ‘microprocessor’ or ‘processor’ or ‘processors’ includes, but is not limited to, a general-purpose processor (such as, for example, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a microprocessor implementing other types of instruction sets, or a microprocessor implementing a combination of types of instruction sets) or a specialized processor (such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), or a network processor).
[00092] The term “non-transitory storage device” or “storage” or “memory,” as used herein relates to a random access memory, read only memory and variants thereof, in which a computer can store data or software for any duration.
[00093] Operations in accordance with a variety of aspects of the disclosure described above would not have to be performed in the precise order described. Rather, various steps can be handled in reverse order or simultaneously or not at all.
[00094] While several implementations have been described and illustrated herein, a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein may be utilized, and each of such variations and/or modifications is deemed to be within the scope of the implementations described herein. More generally, all parameters, dimensions, materials, and configurations described herein are meant to be exemplary, and the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize or be able to ascertain using no more than routine experimentation, many equivalents to the specific implementations described herein. It is, therefore, to be understood that the foregoing implementations are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, implementations may be practiced otherwise than as specifically described and claimed. Implementations of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.


CLAIMS
WHAT IS CLAIMED IS:
1. A system for interactive presentation of project data as a virtual presentation, the system comprising: a data server to store the project data; a processing server connected to the data server, wherein the processing server: receives the project data from the data server; generates the virtual presentation of the received project data, wherein the virtual presentation comprises: a plurality of virtual objects representing the project data; and a plurality of user avatars representing, individually, each user of a plurality of users; and transmits the generated virtual presentation; a plurality of user devices, wherein each user device is controlled by at least one user and wherein each user device comprises: a data reception unit to receive the transmitted virtual presentation; a data display unit to present the received virtual presentation; a data input unit to receive a presentation alteration input from at least one user, wherein the received presentation alteration input comprises at least one of: a presentation modification input for modifying the virtual presentation; a data modification input for modifying the at least one virtual object; and an avatar modification input for modifying the user avatar; a cloud platform connected to the data server, the processing server and the plurality of user devices, wherein the cloud platform: receives the transmitted virtual presentation from the processing server; receives the presentation alteration input from each user device; modifies the virtual presentation based on the received presentation alteration input to generate a modified virtual representation; and renders the modified virtual presentation on each of the plurality of user devices.
2. The system of claim 1, wherein each user device, individually, is associated with a sensor module, wherein the sensor module detects at least one gesture input to interact with the presented virtual presentation.
3. The system of claim 2, wherein the sensor module comprises at least one sensor selected from a depth camera, a motion tracking device, an ultrasonic sensor and an optical flow sensor.
4. The system of claim 1, wherein the data server or the cloud platform enables bi-directional data synchronization of the project data to accommodate changes made within the virtual presentation, wherein the changes are updated in the stored project data.
5. The system of claim 1, wherein the virtual presentation comprises at least one notification corresponding to a deadline, a critical issue associated with the project, or a project milestone.
6. The system of claim 1, wherein each virtual object corresponds to at least one agile ceremony.
7. The system of claim 1, wherein each user device is associated with a haptic device to provide haptic feedback to the user based on the user's interactions with the at least one virtual object.
8. The system of claim 1, wherein the data display unit is selected from a virtual reality device, an augmented reality device, an extended reality device and a holographic projection device.
9. The system of claim 1, wherein each user is selected from: a host user who is associated with at least one host control; and at least one guest user, wherein each guest user is associated with at least one guest control.
10. The system of claim 1, wherein the virtual presentation enables display of a combination of a geospatial visualization and a contextual overlay, allowing the at least one user to view agile management data in relation to a physical location.
11. The system of claim 1, wherein the presentation alteration input enables creation and manipulation of a virtual sticky note within the virtual presentation.
12. The system of claim 1, wherein each avatar represents a category of the user.
13. A method for interactive presentation of project data as a virtual presentation, the method comprising: receiving, the project data; generating, a virtual presentation of the project data, wherein the virtual presentation comprises a plurality of virtual objects representing the project data and a plurality of user avatars representing, individually, each user of a plurality of users; receiving, the generated virtual presentation at a plurality of user devices; displaying, the received virtual presentation at a data display unit of each user device; receiving, a presentation alteration input from at least one user on each user device, wherein the received presentation alteration input comprises at least one of: a presentation modification input for modifying the virtual presentation; a data modification input for modifying the at least one virtual object; and an avatar modification input for modifying the user avatar; modifying, the virtual presentation based on the received presentation alteration input to generate a modified virtual representation; and rendering, the modified virtual presentation on each of the plurality of user devices via a cloud platform.
14. The method of claim 13, wherein each user device is associated with a voice recognition module to allow at least one user to interact with the virtual presentation.
15. The method of claim 13, further comprising tracking a level of user interaction to generate an insight based on the collected data to improve project efficiency or team collaboration.
16. The method of claim 13, wherein each user device, individually, is associated with a sensor module, wherein the sensor module detects at least one gesture input to interact with the presented virtual presentation.
17. The method of claim 16, wherein the sensor module comprises at least one sensor selected from a depth camera, a motion tracking device, an ultrasonic sensor and an optical flow sensor.
18. A non-transitory computer-readable storage medium, comprising executable instructions that, when executed by a microprocessor, provide for interactive presentation of project data as a virtual presentation, comprising: arranging, a server arrangement comprising: a non-transitory storage device that is arranged to store a set of executable routines; and utilizing, the microprocessor which is coupled to the non-transitory storage device and operable to execute the set of routines for: receiving, the project data; generating, a virtual presentation of the project data, wherein the virtual presentation comprises a plurality of virtual objects representing the project data and a plurality of user avatars representing, individually, each user of a plurality of users; receiving, the generated virtual presentation at a plurality of user devices; displaying, the received virtual presentation at a data display unit of each user device; receiving, a presentation alteration input from at least one user on each user device, wherein the received presentation alteration input comprises at least one of: a presentation modification input for modifying the virtual presentation; a data modification input for modifying the at least one virtual object; and an avatar modification input for modifying the user avatar; modifying, the virtual presentation based on the received presentation alteration input to generate a modified virtual representation; and rendering, the modified virtual presentation on each of the plurality of user devices via a cloud platform.