WO2024083302A1 - Virtual portal between physical space and virtual space in extended reality environments


Info

Publication number: WO2024083302A1
Application number: PCT/EP2022/078769
Authority: WO, WIPO (PCT)
Other languages: French (fr)
Inventors: Tommy Arngren, Peter ÖKVIST, Andreas Kristensson, David Lindero
Original assignee / applicant: Telefonaktiebolaget Lm Ericsson (Publ)
Prior art keywords: virtual, participant, immersive, environment, physical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present disclosure relates to rendering extended reality (XR) environments and associated XR rendering devices, and more particularly to rendering objects in immersive XR environments for display on XR participant devices.
  • XR extended reality
  • Immersive extended reality (XR) environments have been developed which provide a myriad of different types of user experiences for gaming, on-line meetings, co-creation of products, etc.
  • One type of immersive XR environment (also referred to as an "XR environment") is a virtual reality (VR) environment where human users only see computer generated graphical renderings.
  • VR virtual reality
  • AR augmented reality
  • Example XR environment rendering devices include, without limitation, XR environment servers, XR headsets, gaming consoles, smartphones running an XR application, and tablet/laptop/desktop computers running an XR application.
  • Oculus Quest is an example XR device and Google Glass is an example AR device.
  • XR meeting applications are tools for experiencing online immersive meetings and are useful as a thinking and planning space for oneself as well as among a group of people.
  • Some XR meeting applications support AR devices, browsers, and VR devices.
  • a participant using a browser may join via desktop, tablet-PC or smartphone and share their views using a front-facing camera or a web cam.
  • Some XR meeting solutions have mobile application versions, e.g., Android and iOS, which allow a user to navigate in the virtual space on the screen or activate an AR mode to display the meeting in their own surroundings.
  • the XR meeting solutions introduce new features to online meetings that allow for new ways to share and create content etc.
  • Today’s commonly and commercially available XR devices typically include a head mounted device (HMD) and a pair of hand controllers, sometimes with more advanced solutions including “foot controllers”.
  • HMD head mounted device
  • Immersive XR environments, such as gaming environments and meeting environments, are often configured to display computer generated avatars which represent poses of human users in the immersive XR environments.
  • a user may select and customize an avatar, such as gender, clothing, hair style, etc. to represent that user for viewing by other users participating in the immersive XR environment.
  • Users can interact with virtual objects that are rendered in the immersive XR environment, such as by controlling their avatars to manipulate virtual objects.
  • Immersive XR environments presently support little if any interface between physical objects and virtual objects that are rendered.
  • Some embodiments disclosed herein are directed to an XR rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment.
  • the XR rendering device includes processing circuitry adapted to perform operations.
  • the operations include defining a virtual portal with an entrance boundary at a location in physical space of a first participant.
  • the operations also include tracking a location of a physical object relative to the location of the entrance boundary.
  • the operations also include rendering a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant responsive to determining the physical object has crossed the entrance boundary of the virtual portal.
  • Some other embodiments are directed to a corresponding method by an XR rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment.
  • the method includes defining a virtual portal with an entrance boundary at a location in physical space of a first participant.
  • the method also includes tracking a location of a physical object relative to the location of the entrance boundary.
  • the method also includes rendering a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant responsive to determining the physical object has crossed the entrance boundary of the virtual portal.
  • Some potential advantages of these embodiments are that they allow one participant (sending participant) in an immersive XR environment to cause a physical object to become rendered as a virtual object by the participant passing the physical object through the location of the virtual portal in physical space.
  • the participant may view the rendered virtual object and another participant (receiving participant) in the immersive XR environment may simultaneously view the virtual object.
  • the receiving participant is able to interact with the virtual object by, for example, using a hand of the corresponding avatar to manipulate the rendered pose of the virtual object.
  • Feedback can be provided to the sending participant to experience the interaction of the receiving participant with the virtual object.
  • Figure 1 illustrates an XR system that includes a plurality of participant devices that communicate through networks with an XR rendering device to operate in accordance with some embodiments of the present disclosure
  • Figure 2 is an example illustration of the XR rendering device tracking location of a physical object in the first participant’s physical space while the physical object is being passed through a virtual portal and gradually rendered in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure
  • Figure 3 illustrates an alternative example operation of the XR rendering device tracking location of a physical object in the first participant’s physical space while the physical object is being passed through a virtual portal and abruptly rendered in its entirety in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure
  • Figure 4 illustrates an example virtual portal defined relative to a reference coordinate system in a physical space and relative to a first participant and a second participant device, in accordance with some embodiments of the present disclosure
  • Figure 5 illustrates an example implementation of a pair of virtual portals 200 defined in separate physical spaces of Participant 1 and Participant 2, in accordance with some embodiments of the present disclosure
  • Figure 6 illustrates an example implementation of a virtual portal in a physical space shared by the first participant and the second participant, in accordance with some embodiments of the present disclosure
  • Figures 7 through 9 are flowcharts of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure.
  • Figure 10 is a block diagram of components of an XR rendering device that are configured to operate in accordance with some embodiments of the present disclosure.
  • Figure 1 illustrates an XR system that includes a plurality of participant devices 110a and 110b that communicate through networks 120 with an XR rendering device 100 to operate in accordance with some embodiments of the present disclosure.
  • the XR rendering device 100 is configured to generate a graphical representation of an immersive XR environment (also called an "XR environment" for brevity) which is viewable from various perspectives of virtual poses of human participants in the XR environment through display screens of the various participant devices 110a-b.
  • the illustrated devices include XR headsets 110a and 110b which can be worn by participants to view and navigate through the XR environment.
  • the participant devices are not limited to XR headsets, as they could be a personal computer, tablet, smart phone, or other electronic device which can be operated by a participant to view and navigate through the XR environment.
  • the participants may have associated avatars which are rendered in the XR environment to represent poses (e.g., location, body assembly orientation, etc.) of the participants relative to a coordinate system of the XR environment.
  • the XR rendering device 100 is illustrated in Figure 1 as being a centralized network computing server separate from one or more of the participant devices, in some other embodiments the XR rendering device 100 is implemented as a component of one or more of the participant devices.
  • one of the participant devices may be configured to perform operations of the XR rendering device in a centralized manner controlling rendering for or by other ones of the participant devices.
  • each of the participant devices may be configured to perform at least some of the operations of the XR rendering device in a distributed decentralized manner with coordinated communications being performed between the distributed XR rendering devices (e.g., between software instances of XR rendering devices).
  • Some embodiments of the present disclosure are directed to an XR rendering device enabling one or more participants in an immersive XR environment to pass a physical object through one or more virtual portals located in physical space, to trigger the XR rendering device 100 to render a virtual object representing the physical object in the immersive XR environment.
  • a first participant may reach a hand through the virtual portal to trigger the XR rendering device 100 to render a computer generated virtual representation of the hand in the immersive XR environment.
  • the first participant and another second participant may simultaneously view the virtual hand.
  • the second participant may similarly reach a hand through a virtual portal to trigger rendering of another virtual hand, and move the hand to virtually shake the virtual hand of the first participant.
  • the first and second participant may be provided corresponding feedback of the virtual handshake, e.g., as tactile feedback initiated by the XR rendering device 100 and provided through tactile gloves worn by the first and second participants.
  • the type of physical object that can be passed through a virtual portal for rendering and possible interaction with by participants in the immersive XR environment is not limited by any of the operational embodiments herein.
  • Figure 7 is a flowchart of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure.
  • an XR rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment.
  • the XR rendering device includes processing circuitry adapted to perform operations.
  • the operations include to define 700 a virtual portal with an entrance boundary at a location in physical space of a first participant.
  • the operations track 702 a location of a physical object relative to the location of the entrance boundary.
  • the operations render 704 a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant responsive to determining the physical object has crossed the entrance boundary of the virtual portal.
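  • As a non-limiting illustration of these three operations, the following Python sketch models the entrance boundary as a simple axis-aligned box and runs the define/track/render loop; the sensor and renderer interfaces (track_location, render_virtual_object, hide_virtual_object) are hypothetical placeholders rather than anything specified by the disclosure.

```python
from dataclasses import dataclass
import time

@dataclass
class VirtualPortal:
    """Entrance boundary modeled, for simplicity, as an axis-aligned box centered on
    the portal's focus point (FP) in the reference coordinate system of the first
    participant's physical space."""
    center: tuple          # (x, y, z) of the focus point
    half_extents: tuple    # half-size of the entrance boundary along x, y, z

    def contains(self, point) -> bool:
        return all(abs(p - c) <= h
                   for p, c, h in zip(point, self.center, self.half_extents))

def run_portal(portal: VirtualPortal, sensor, renderer, period_s: float = 0.02):
    """Define (700) the portal, track (702) the physical object's location, and
    render (704) a virtual object representation once the boundary is crossed."""
    while True:
        location = sensor.track_location()              # hypothetical sensor interface
        if location is not None and portal.contains(location):
            renderer.render_virtual_object(location)    # hypothetical renderer interface
        else:
            renderer.hide_virtual_object()
        time.sleep(period_s)                            # poll at roughly 50 Hz
```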
  • the rendering of the virtual object may be provided to a device of the participant and/or one or more other participants for viewing and possible interaction with.
  • XR headsets can be worn by participants in the immersive XR environment to provide an intuitive way for the participants to view the rendered virtual object and possibly also view a rendering of the virtual portal.
  • Figure 8 is a flowchart of operations that may be performed by the XR rendering device in accordance with some embodiments of the present disclosure. Referring to Figure 8, the operations further include communicating the virtual object to an XR headset worn by another one of the participants and/or the first participant for display.
  • FIG. 2 is an example illustration of the XR rendering device 100 tracking location of a physical object 210 in the first participant’s physical space while the physical object 210 is being passed through a virtual portal 200 and rendered in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure.
  • Participant 1 and Participant 2 are present participants in an immersive XR environment rendered by the XR rendering device 100.
  • Participant 1 and Participant 2 can simultaneously view virtual objects that are rendered by the XR rendering device 100 in their respective virtual spaces, such as through XR headsets worn by the participants.
  • the XR rendering device 100 is communicatively coupled to at least one sensor 230 that tracks location of a physical object 210, e.g., a smart phone, relative to the boundary of a virtual portal 200 having a location defined in physical space.
  • the sensor 230 may be one or more cameras mounted to an XR headset and facing forward in the wearer's eye gaze direction.
  • the sensor 230 may additionally or alternatively be other types of sensors able to track location of physical objects, such as LIDAR sensors, ultrasound sensors, etc.
  • two cameras may be used to track location of the physical object 210 in three-dimensional (3D) space or a single LIDAR sensor may be used.
  • LIDAR and one or more cameras may be used together to track and determine location, shape, texture, and color of the physical object 210.
  • the XR rendering device 100 defines 700 the virtual portal 200 with an entrance boundary at a location in physical space of the first participant.
  • the virtual portal 200 functions as a boundary through which the first participant can pass the physical object 210 to trigger the XR rendering device 100 to render a virtual representation 220 in the virtual spaces of the participants, e.g., Participant 1 and Participant 2, within the immersive XR environment.
  • the XR rendering device 100 defines a location, size, and shape of an entrance boundary of virtual portal 200.
  • the location of the entrance boundary of virtual portal 200 is defined in physical space, such as relative to a feature on a physical table in a room, a distance away from Participant 1, relative to an object worn by Participant 1 (smartwatch, smart glasses, etc.), etc.
  • a computer generated representation of the virtual portal 200 may be displayed in the immersive XR environment in the virtual space viewed by Participant 1.
  • the computer generated representation of the virtual portal 200 may additionally or alternatively be displayed in the virtual space viewed by Participant 2, such as when Participants 1 and 2 are sharing an immersive XR environment.
  • the term "virtual space" is used herein to refer to the displayable space within the computer rendered immersive XR environment.
  • the term "physical space" is used to refer to the real space in which the human participant resides.
  • the XR rendering device 100 defines a location of an entrance boundary of the virtual portal 200 relative to a reference system in physical space where the participant resides and which is tracked by one or more sensors 230.
  • the entrance boundary may be a single point, may extend along a two-dimensional (2D) plane (e.g., flat rectangle or flat circle), such as illustrated in Figure 2, or may be a three-dimensional (3D) shape (e.g., sphere or box).
  • the entrance boundary may be defined to be a fixed distance and/or a fixed angle (fixed pose offset) relative to the participant or relative to a defined physical object which is in the physical space.
  • the participant and/or the defined physical object may move within the physical space, and the entrance boundary location in the reference system of physical space may be dynamically updated to track changes in location of the participant and/or the defined physical object 210.
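  • A minimal sketch of how the entrance boundary could be kept at a fixed pose offset from a tracked anchor (the participant or a worn device) and re-evaluated as the anchor moves; the yaw-only rotation and the 0.5 m forward offset are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def update_boundary_pose(anchor_position: np.ndarray,
                         anchor_yaw: float,
                         offset_local: np.ndarray = np.array([0.0, 0.0, 0.5])) -> np.ndarray:
    """Return the entrance-boundary location as a fixed pose offset from a tracked
    anchor (the participant's headset or a worn device such as a smartwatch), so the
    boundary follows the anchor as it moves in the physical-space reference system.
    Intended to be called every frame with fresh tracking data."""
    cos_y, sin_y = np.cos(anchor_yaw), np.sin(anchor_yaw)
    rotation_y = np.array([[cos_y, 0.0, sin_y],     # rotation about the vertical axis only;
                           [0.0,   1.0, 0.0],       # a full implementation would use the
                           [-sin_y, 0.0, cos_y]])   # anchor's complete orientation
    return anchor_position + rotation_y @ offset_local

# Example: boundary placed 0.5 m in front of a headset at (1.0, 1.6, 0.0) facing +x.
print(update_boundary_pose(np.array([1.0, 1.6, 0.0]), np.pi / 2))
```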
  • the XR rendering device 100 uses sensor data received from the sensor 230 to track 702 a location of the physical object 210, such as a smart phone as illustrated, in the first participant’s physical space relative to the location of an entrance boundary of the virtual portal 200 in the first participant’s virtual space.
  • the XR rendering device then renders 704 a virtual object 220 representation of at least a portion of the physical object 210.
  • the XR rendering device may render the virtual object 220 for viewing by the second participant in the second participant’s virtual space, e.g., as illustrated in Figure 2.
  • the first participant may also see an overlay of the virtual object 220 displayed over the physical object 210.
  • the first participant may see the physical object 210 when passing the physical object through the virtual portal 200.
  • the XR rendering device 100 tracks the location of the physical object 210 relative to the entrance boundary of the virtual portal 200. When a threshold portion of the physical object is determined to have crossed the entrance boundary, the XR rendering device responds by performing operations to generate a virtual object 220 representation of at least a portion of the physical object 210 in the immersive XR environment.
  • the XR rendering device 100 may, for example, cause the virtual object 220 to be displayed through the XR headsets worn by Participant 1 and Participant 2.
  • the operation to render the virtual object representation of at least a portion of the physical object in the immersive XR environment is performed responsive to determining that at least a portion of the physical object exceeding a threshold has crossed the entrance boundary.
  • Participant 1 can see a smartphone 210 resting on a table by either viewing video displayed from a camera on a VR headset or by seeing the smartphone 210 through a see-through display of an AR headset. Participant 1 desires to display a virtual object representation 220 of the smartphone 210 to another participant (Participant 2) in the immersive XR environment. Therefore, Participant 1 picks up the smartphone 210 and moves it through the entrance boundary 200 of the virtual portal. As the XR rendering device 100 tracks, via the sensor 230, the smartphone 210 moving through the entrance boundary 200, it renders the virtual object representation of the smartphone 220 in the immersive XR environment.
  • the rendering can include displaying the virtual object 220 through the XR headset worn by Participant 2 with a pose (location and orientation) defined in a virtual space of the immersive XR environment relative to the perspective of Participant 2 looking toward a computer generated representation of the virtual portal 200.
  • the rendering can include displaying the virtual object 220 through the XR headset worn by Participant 1 with a pose (location and orientation) defined in the virtual space relative to the perspective of Participant 1 looking toward a computer generated representation of the virtual portal 200.
  • the XR rendering device 100 may render an amount of the virtual object 220 that is proportional to an amount of the physical object 210 (smartphone) that has moved through the entrance boundary, which can result in Participants 1 and 2 seeing the virtual object 220 gradually appear in the immersive XR environment.
  • the operation to render 704 the virtual object 220 representation of at least a portion of the physical object 210 in the immersive XR environment renders an amount of the virtual object 220 that is proportional to an amount of the physical object 210 that has moved across the entrance boundary of the virtual portal 200.
  • the XR rendering device 100 may render the virtual object 220 in its entirety when a threshold amount of the physical object 210 (e.g., an edge of the smartphone) has moved through the entrance boundary 200, which can result in Participants 1 and 2 seeing the virtual object instantly appear in the immersive XR environment.
  • Figure 3 illustrates an alternative example operation of the XR rendering device 100 tracking location of the physical object 210 in the first participant’s physical space while the physical object is being passed through the virtual portal 200 and abruptly rendered in its entirety as a virtual object 300 in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure.
  • the operation to render the virtual object representation of at least a portion of the physical object in the immersive XR environment renders the virtual object in its entirety responsive to a threshold amount of the physical object having been moved across the entrance boundary of the virtual portal.
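  • The two rendering policies described above (gradual appearance as in Figure 2, appearance in full once a threshold is exceeded as in Figure 3) could be expressed as a single mapping from the crossed fraction of the physical object to the rendered fraction of the virtual object, as in this illustrative sketch; the 5% threshold is an assumed example value, not one stated in the disclosure.

```python
def visible_fraction(crossed_fraction: float,
                     mode: str = "proportional",
                     threshold: float = 0.05) -> float:
    """Map the fraction of the physical object that has crossed the entrance boundary
    to the fraction of the virtual object that is rendered.

    "proportional": the virtual object appears gradually (Figure 2).
    "entire":       the virtual object appears in full once the crossed fraction
                    exceeds the threshold, e.g. an edge of the smartphone (Figure 3).
    """
    crossed_fraction = max(0.0, min(1.0, crossed_fraction))
    if mode == "proportional":
        return crossed_fraction
    if mode == "entire":
        return 1.0 if crossed_fraction >= threshold else 0.0
    raise ValueError(f"unknown mode: {mode}")

assert visible_fraction(0.4, "proportional") == 0.4   # 40% crossed -> 40% rendered
assert visible_fraction(0.06, "entire") == 1.0        # past threshold -> fully rendered
```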
  • Figure 4 illustrates an example virtual portal 200 defined relative to a reference coordinate system in a physical space, and relative to locations of Participant 1 and Participant 2, in accordance with some embodiments of the present disclosure. Locations of physical objects relative to an entrance boundary 200 of the virtual portal can be tracked using sensors 230 relative to the reference coordinate system.
  • the XR rendering device 100 defines a location of an entrance boundary 200 relative to a reference system in physical space where the participant resides.
  • the entrance boundary is illustrated as including a focus point (FP) at a three-dimensional location (0, 0, 0) centered in the virtual portal 200 defined relative to a reference coordinate system in physical space.
  • the size and shape of the entrance boundary 200 can be defined relative to the FP.
  • the entrance boundary 200 has a two-dimensional square shape, but could have any shape in 2D or 3D space.
  • the illustrated Participant 1 and Participant 2 are also located in the same physical space, and therefore can move physical objects which are trackable by the sensors 230 relative to the same reference coordinate system in physical space.
  • the XR headsets 400 worn by Participant 1 and Participant 2 can include an object tracking sensor(s) 230, such as one or more front-facing cameras, Lidar sensors, and/or ultrasound sensors.
  • the object tracking sensor(s) 230 are configured to output data which is processed, e.g., by the XR rendering device 100, to track location of physical objects relative to the reference coordinate system in physical space.
  • Cameras can output image frames which can be processed to identify a physical object, shape of the physical object, color of the physical object, texture of the physical object, and location of the physical object.
  • Lidar sensors can output point cloud data which can be similarly processed to identify an object, its shape, and its location.
  • the object tracking sensor(s) may be separate from the XR headsets or a combination of XR headset-based object tracking sensor(s) and separate object tracking sensor(s) may be used.
  • the operation to render 704 the virtual object representation of at least a portion of the physical object in the immersive XR environment renders an amount of the virtual object that is proportional to an amount of the physical object that has moved across the entrance boundary of the virtual portal.
  • the XR rendering device 100 may terminate or stop rendering of the virtual object based on the physical object being removed from the virtual portal.
  • the operations further include terminating rendering of the virtual object 220 representation of at least a portion of the physical object 210 in the immersive XR environment for viewing by the other one of the participants and/or the first participant, responsive to determining that the physical object 210 no longer crosses the entrance boundary of the virtual portal 200.
  • the XR rendering device may trigger audible feedback, tactile feedback, or other visual feedback to one or both of Participants 1 and 2.
  • Participants 1 and 2 may be provided a tone or other audible notification, e.g., through XR headsets, when the virtual object representation of the smartphone is rendered in the immersive XR environment.
  • Participants 1 and 2 may be provided tactile feedback, such as a vibration through tactile gloves or controllers handled by Participants 1 and 2 when the virtual object representation of the smartphone is rendered in the immersive XR environment.
  • Participants 1 and 2 may be provided visual feedback, such as text and/or a graphical object displayed by the XR headsets, indicating that the virtual object representation of the smartphone is rendered in the immersive XR environment, which can be helpful when the Participants are not looking in a direction of the virtual portal.
  • Figure 9 illustrates a flowchart of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure.
  • the operations further include generating 900 a notification through an electronic device operated by the first participant responsive to determining the physical object 210 has crossed the entrance boundary of the virtual portal 200.
  • the notification is configured to trigger at least one of outputting audible feedback, generating tactile feedback, or displaying visual feedback to the first participant.
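  • One possible way to dispatch the notification of operation 900 across the audible, tactile, and visual feedback channels is sketched below; the device methods (play_tone, vibrate, show_banner) are hypothetical and stand in for whatever participant-device API an implementation exposes.

```python
from enum import Enum, auto

class Feedback(Enum):
    AUDIBLE = auto()   # tone through the XR headset speakers
    TACTILE = auto()   # vibration through tactile gloves or hand controllers
    VISUAL = auto()    # text or graphical notification on the headset display

def notify_crossing(device, channels=(Feedback.AUDIBLE, Feedback.TACTILE, Feedback.VISUAL)):
    """Generate a notification (operation 900) when the physical object crosses the
    entrance boundary; 'device' is a hypothetical participant-device proxy object."""
    for channel in channels:
        if channel is Feedback.AUDIBLE:
            device.play_tone()
        elif channel is Feedback.TACTILE:
            device.vibrate()
        elif channel is Feedback.VISUAL:
            device.show_banner("Object rendered in the immersive XR environment")
```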
  • Figure 5 illustrates an example implementation of a pair of virtual portals 200 defined in separate physical spaces of Participant 1 and Participant 2, in accordance with some embodiments of the present disclosure.
  • FIG. 5 illustrates that Participants 1 and 2 share a virtual portal having an entrance boundary 200 defined relative to separate physical spaces of Participants 1 and 2.
  • Participants 1 and 2 may be physically distant from each other but share the entrance boundary 200 of the virtual portal.
  • the virtual portal can have a single entrance boundary that is defined relative to a coordinate system of the physical space of Participant 1 or Participant 1's VR headset 400, or may have another entrance boundary that is defined relative to a coordinate system of the physical space of Participant 2 or Participant 2's VR headset 400.
  • the XR rendering device can define the virtual portal with a first entrance boundary at a location in a first coordinate system of the physical space of the first participant and a second entrance boundary that is defined relative to a second coordinate system of a physical space of a second participant.
  • the second entrance boundary may be defined and used responsive to a rule becoming satisfied for interacting with the virtual portal, e.g., when the second participant is within a threshold distance of the virtual portal and/or has an avatar with an eyesight gaze that is within a threshold angle toward the virtual portal.
  • the operation to define the virtual portal with the entrance boundary at the location in physical space of the first participant further includes to define the virtual portal with a first entrance boundary at a location in a first coordinate system of the physical space of the first participant and a second entrance boundary that is defined relative to a second coordinate system of a physical space of a second participant.
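  • The rule for activating the second entrance boundary could, for example, combine the distance and gaze-angle thresholds mentioned above, as in this sketch; the 2.0 m and 30 degree values are illustrative assumptions.

```python
import math

def second_boundary_active(participant_pos, portal_pos, gaze_dir,
                           max_distance=2.0, max_angle_deg=30.0) -> bool:
    """Return True when the second participant's entrance boundary should be used,
    e.g. the participant is within a threshold distance of the virtual portal and
    the avatar's gaze is within a threshold angle toward the portal."""
    to_portal = [p - q for p, q in zip(portal_pos, participant_pos)]
    distance = math.sqrt(sum(c * c for c in to_portal))
    gaze_norm = math.sqrt(sum(c * c for c in gaze_dir))
    if distance == 0.0 or gaze_norm == 0.0 or distance > max_distance:
        return False
    cos_angle = sum(a * b for a, b in zip(to_portal, gaze_dir)) / (distance * gaze_norm)
    angle_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle_deg <= max_angle_deg

# Participant 1.0 m from the portal, gazing straight at it -> second boundary is used.
print(second_boundary_active((0, 0, 0), (0, 0, 1.0), (0, 0, 1)))  # True
```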
  • Figure 6 illustrates an example alternative implementation of a virtual portal in a physical space shared between the first participant and second participant, in accordance with some embodiments of the present disclosure.
  • the XR rendering device 100 can define locations of Participant 1 (negative z-axis) with Participant 2 (positive z-axis) in a common reference system, so that, for example, any object moved by Participant 1 is tracked with a negative z-value (within FP frame) and rendered at a corresponding virtual location for viewing by Participant 2 (positive z-axis).
  • the XR rendering device 100 monitors image data from each participant’s sensor(s), such as a camera.
  • the relationship z_physical object < z_threshold is determined to be satisfied, which triggers object detection and rendering of an appropriate digital representation of the physical object (the virtual object).
  • determining that an edge of the physical object has passed through the virtual portal can trigger object detection and rendering of a virtual object representation of the physical object on the receiver side.
  • the virtual object gradually appears on the receiving side, i.e., from 0% to 100% of the physical object is represented and rendered as a virtual object on the receiving side.
  • Movement and other attributes of the physical object within the virtual portal can be rendered as corresponding motion and other attributes of the virtual object that are displayed for viewing by Participant 1, Participant 2, and/or other participants.
  • the XR rendering device 100 is a central server.
  • the centralized XR rendering device 100 can receive sensor data (e.g., image data) from at least one sensor (e.g., a camera), and may operate to automatically determine a corresponding digital representation for rendering as the virtual object based on received image data.
  • the rendering may be based on processing the received image data through image processing techniques and/or machine learning techniques.
  • the XR rendering device 100 may access a database that includes 3D representations of different objects (wireframes, point clouds, 3D models, etc.).
  • the XR rendering device 100 may select a 3D representation from a set of representations and its associated preferences/rules/characteristics.
  • the cameras may continuously send image data to the XR rendering device 100, which adapts the 3D representation accordingly.
  • the image data can be processed to determine information about position, gesture, orientation, etc. which is translated into parameters.
  • the parameters are used to control rendering of 3D representation and adapt appearance of the virtual object displayed to Participant 1 and/or Participant 2.
  • Some embodiments are directed to determining a pose of the virtual object being rendered in the immersive XR environment.
  • the pose may correspond to the location and/or angular orientation.
  • the XR rendering device may be configured to determine a pose of a physical object that has crossed the entrance boundary of the virtual portal and render a pose of the virtual object rendered in the immersive XR environment based on the pose of the physical object.
  • an image stream from a camera on the XR headset of Participant 1 can be processed to determine the pose of the physical smartphone as it is held across the threshold of the entrance boundary.
  • the XR rendering device can render the virtual smartphone with a pose that corresponds to (e.g., same as or defined offset relative to) the determined pose of the physical smartphone.
  • the XR rendering device may dynamically change the pose of the virtual smartphone being rendered to track in real-time changes in pose of the physical smartphone.
  • the operations further include to determine pose of the physical object relative to a coordinate system of the physical space of the first participant.
  • the operations also further include to render pose of the virtual object in the immersive XR environment responsive to the determined pose of the physical object.
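  • A simple sketch of deriving the rendered pose of the virtual object from the determined pose of the physical object, either the same pose or a defined offset relative to it; the yaw-only orientation is a simplification, and a real implementation would likely use full quaternion poses.

```python
import numpy as np

def render_pose_from_physical(physical_position: np.ndarray, physical_yaw: float,
                              offset_position: np.ndarray = None, offset_yaw: float = 0.0):
    """Derive the rendered pose of the virtual object from the determined pose of the
    physical object: the same pose, or a defined offset relative to it.  Re-invoking
    this per frame lets the virtual pose track real-time changes of the physical pose."""
    if offset_position is None:
        offset_position = np.zeros(3)
    return physical_position + offset_position, physical_yaw + offset_yaw

# Example: render the virtual smartphone 1.0 m along x from the physical smartphone's pose.
print(render_pose_from_physical(np.array([0.2, 1.1, -0.3]), 0.4, np.array([1.0, 0.0, 0.0])))
```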
  • Some embodiments are directed to configuring the virtual portal.
  • a virtual portal may be defined to have any shape, where the shape may be defined depending on use case, context, participant settings and preferences, type of objects to be pushed through, communication channel capabilities, XR system capabilities (e.g., camera sensor capabilities).
  • the virtual portal may be configured to apply a filter mechanism that applies an effect to the virtual object being rendered.
  • the operations further include to determine a filter mechanism to apply an effect to the virtual object.
  • the rendering of the virtual object representation of at least the portion of the physical object in the immersive XR environment includes to render the virtual object of at least the portion of the physical object with the filter mechanism applied.
  • a filter may be used to “clean” a physical object of features such as, but not limited to, hot-words, symbols, or tattoos.
  • tattoo(s) on the first participant’s hand pushed through the virtual portal can operationally disappear on the receiver side and/or the sender side.
  • the XR system removes a tattoo by “erasing” the tattoo using operations such as occlusion by overlaying with XR display data so the receiver side receives a rendering of a digitalized hand which does not contain tattoo parts. Instead, the tattoo parts have been replaced by other skin texture from the hand.
  • a filter may be used to change/modify colors and/or textures of the virtual object compared to the physical object.
  • the filter may apply a hydro dipping effect of texture and/or color to the virtual object.
  • Hydro dipping is a method of applying printed graphics to three dimensional objects.
  • the entrance boundary of the virtual portal may be decorated with a decorative pattern or texture which is then transferred to the surface of the virtual object when the physical object is passed through the virtual portal. This transferring of the decorative pattern or texture may be applied based on many factors such as the pose of the physical object as it crosses the entrance boundary.
  • a filter may be used to apply effects such as animated movement of the virtual object being rendered.
  • Corresponding haptic feedback may be provided to one or more participants interacting with the virtual object.
  • a filter may be used to map the physical object with a selectable one of a plurality of different virtual objects.
  • a physical apple can be mapped by the XR rendering device to be rendered in the immersive XR environment as a virtual representation of an orange.
  • one type of smartphone can be mapped to be rendered as another type of smartphone or other electronic device, such as a tablet computer.
  • rendering the virtual object representation of at least the portion of the physical object with the filter mechanism applied includes at least one of: (1) applying a filter to erase a defined portion of the virtual object; (2) applying a filter to modify a color of at least part of the virtual object; (3) applying a filter to modify a texture of at least part of the virtual object; and (4) applying a filter to modify a rotation and/or translational movement of the virtual object.
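  • An illustrative sketch of filter mechanisms (1) through (3) operating on a texture of the virtual object, with erased regions in-filled from surrounding texture as in the tattoo example and a decorative pattern blended in as in the hydro-dipping example; the blend weights and array layout are assumptions, and a rotation/translation filter (4) would act on the object's pose rather than its texture.

```python
import numpy as np

def apply_filters(texture_rgb: np.ndarray, erase_mask: np.ndarray = None,
                  color_shift: np.ndarray = None, pattern_rgb: np.ndarray = None) -> np.ndarray:
    """Apply filter mechanisms to the virtual object's texture:
    erase_mask  - pixels to remove (e.g. a tattoo), in-filled with the mean of the
                  remaining texture so the area reads as the surrounding skin/surface
    color_shift - per-channel color modification
    pattern_rgb - decorative pattern blended over the surface (hydro-dipping example)"""
    out = texture_rgb.astype(np.float32).copy()
    if erase_mask is not None:
        out[erase_mask] = out[~erase_mask].mean(axis=0)
    if color_shift is not None:
        out = np.clip(out + color_shift, 0.0, 255.0)
    if pattern_rgb is not None:
        out = 0.5 * out + 0.5 * pattern_rgb.astype(np.float32)
    return out.astype(np.uint8)

texture = np.full((4, 4, 3), 180, dtype=np.uint8)   # plain 4x4 RGB texture
mask = np.zeros((4, 4), dtype=bool)
mask[1, 1] = True                                   # "tattoo" pixel to erase
filtered = apply_filters(texture, erase_mask=mask, color_shift=np.array([10, 0, -10]))
print(filtered[1, 1], filtered.shape)
```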
  • a filter setting may be defined on one or more of the XR headsets and/or on the XR rendering device.
  • if filter settings defined at different devices conflict, resolution of the conflict may be performed based on one or more rules, such as a prioritization ranking of the filter settings.
  • a virtual portal may have different size on the sending and receiving sides.
  • the receiving side virtual portal may be scaled by an amount defined by a specific context in which the virtual portal is used to receive physical objects.
  • a virtual portal may have different sound profiles associated with protrusion of objects in general, and certain objects in particular. There may be a pre-defined sound profile per type of objects and/or individual objects, and/or associated with respective objects’ attributes, such as: “a hand coming through” triggers sound profile A; “right hand coming through” triggers sound profile B; “right hand coming through, holding a red apple” triggers sound profile C; and “apple coming through, interchanged to, orange coming through” triggers sound profile D.
  • Respective sound profiles may be any “sound” or object-to-speech sequence associated with the detected physical/virtual object that is about to be, or is being, protruded, and with that object’s attributes.
  • the operations include to determine the physical object has crossed the entrance boundary of the virtual portal. Then responsive to determining the physical object has crossed the entrance boundary of the virtual portal, the operations further include to select a sound profile from among a set of a plurality of different sound profiles based on an identified characteristic of the physical object, and control an electronic device operated by the first participant to output sound defined by the selected sound profile.
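  • The sound-profile selection could be a rule table keyed on identified characteristics of the physical object, as in this sketch; profile names A through D mirror the examples above, while the characteristic strings and matching rules are illustrative assumptions.

```python
def select_sound_profile(characteristics) -> str:
    """Select a sound profile based on identified characteristics of the physical
    object crossing the entrance boundary.  Most specific rules are checked first."""
    rules = [
        (frozenset({"hand", "right", "holding red apple"}), "sound profile C"),
        (frozenset({"hand", "right"}), "sound profile B"),
        (frozenset({"hand"}), "sound profile A"),
        (frozenset({"apple", "interchanged to orange"}), "sound profile D"),
    ]
    for required, profile in rules:
        if required <= set(characteristics):
            return profile
    return "default sound profile"

print(select_sound_profile({"hand", "right"}))  # sound profile B
```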
  • a physical object with digital capabilities is passed through the virtual portal.
  • a non-digital object such as a hand or apple, may be pushed through the virtual portal.
  • the sending participant may similarly push through a physical electronic device, such as a smart watch, smartphone, or other electronic device.
  • the physical object includes an electronic device.
  • the operations further include to identify a characteristic of an interaction of an avatar representing a second participant in the immersive XR environment with the virtual object representation of the electronic device.
  • the operations also further include to communicate with the electronic device to control an operational function of the electronic device based on the identified characteristic of the interaction.
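  • A hedged sketch of mapping an identified interaction characteristic onto an operational function of the physical electronic device; the interaction characteristics and the device command interface (send_command) are hypothetical, since the disclosure does not prescribe a particular control protocol.

```python
def handle_avatar_interaction(interaction: dict, device) -> None:
    """Map an identified characteristic of the receiving avatar's interaction with the
    virtual representation of the electronic device onto an operational function of
    the physical device.  'device' is any object exposing a send_command method."""
    kind = interaction.get("characteristic")
    if kind == "press_power_button":
        device.send_command("power_toggle")
    elif kind == "rotate":
        device.send_command("orientation_hint", degrees=interaction.get("degrees", 0.0))
    elif kind == "tap_screen":
        device.send_command("tap", position=interaction.get("position"))
    # unrecognized interactions are ignored rather than forwarded to the device
```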
  • the XR rendering device tracks interaction of Participant 2's avatar with the virtual smartphone and may change the rendered pose of the virtual smartphone responsive to Participant 2's interaction.
  • the XR rendering device may rotate the virtual smartphone to track movement of Participant 2 avatar's hand rotating while virtually holding the virtual smartphone. In this manner, Participant 2 can rotate the virtual smartphone to view other surfaces thereof, and Participant 1 can visually observe Participant 2's interaction with the virtual smartphone.
  • the XR rendering device tracks interaction of Participant 2's avatar with the virtual smartphone and provides feedback to Participant 1 to responsively change pose of the physical smartphone which, in turn, will then change the rendered pose of the virtual smartphone.
  • the XR rendering device may track movement of Participant 2 avatar's hand, which is posed to virtually hold the virtual smartphone, and provide visual feedback, audible feedback, and/or tactile feedback to guide Participant 1 to change pose of the physical smartphone in a manner that corresponds to the determined movement of Participant 2 avatar's hand.
  • as Participant 1 changes the pose of, e.g., rotates, the physical smartphone, the XR rendering device can track the observed movements to render updated poses of the virtual smartphone.
  • the operations include to track interaction of an avatar representing a second participant in the immersive XR environment interacting with the virtual object.
  • the operations also further include to change rendered pose of the virtual object in the immersive XR environment responsive to the tracked interaction.
  • the operations include to track rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object in the immersive XR environment.
  • the operations also further include to change rendered pose of the virtual object in the immersive XR environment based on the tracked rotation and/or translational movement of the part of the avatar.
  • the operations include to track rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object in the immersive XR environment.
  • the operations also further include to provide visual feedback, audible feedback, and/or tactile feedback through an electronic device of the first participant to guide the first participant to rotate and/or translationally move the physical object in a manner that corresponds to the tracked rotation and/or translational movement of the part of the avatar.
  • the digitally represented virtual object may have certain characteristics based on: pre-determined parameters associated with physical objects stored on a central server, Participant 1 (sender side) or Participant 2 (receiving side) preferences, or a combination thereof. These characteristics may include: a type of 3D model (e.g., point cloud, wire frame, polygon, surface, etc.); color, texture, physical weight, density, size, center of mass, moment of inertia; animation instructions, such as whether the texture should change or whether the virtual object should move by itself or blink; or whether the virtual object may be controlled and manipulated by Participant 1 (sender side) and/or Participant 2 (receiving side), such as by touching, feeling, and manipulating the object.
  • the operation to render the virtual object representation of at least a portion of the physical object in the immersive XR environment includes to identify a characteristic of the physical object.
  • the operation also includes to select a virtual object characteristic from among a set of a plurality of different virtual object characteristics based on the identified characteristic of the physical object.
  • the operation also includes to render the virtual object representation of at least a portion of the physical object in the immersive XR environment using the selected virtual object characteristic.
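  • The characteristic selection could be implemented as a lookup from the identified characteristic of the physical object to a set of virtual object characteristics (3D model type, animation, manipulability), as in this sketch; the table contents are illustrative assumptions and would, per the description above, come from pre-determined parameters stored on a central server and participant preferences.

```python
VIRTUAL_OBJECT_CHARACTERISTICS = {
    # identified characteristic of the physical object -> rendering characteristics
    # (illustrative table; a deployment could read these from a server-side database)
    "smartphone": {"model": "polygon", "animated": False, "manipulable": True},
    "hand":       {"model": "point cloud", "animated": True, "manipulable": False},
    "apple":      {"model": "surface", "animated": False, "manipulable": True},
}

def select_characteristics(identified: str) -> dict:
    """Select virtual object characteristics from among a set of different
    characteristics based on the identified characteristic of the physical object."""
    return VIRTUAL_OBJECT_CHARACTERISTICS.get(
        identified, {"model": "wire frame", "animated": False, "manipulable": False})

print(select_characteristics("smartphone"))
```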
  • a self-view setting may be used to allow Participant 1 (sending side) to have visual feedback of the digitally represented object viewed by Participant 2 (receiving side).
  • the visual feedback may be generated based on an image or video feed from a camera oriented toward Participant 2's field of view.
  • another replicating virtual portal may be configured to have its intended receiving side (at Participant 2 side) facing the actual sending side (Participant 1) so that Participant 1 in a replicating virtual portal sees Participant 1's own protruding object just as Participant 2 would have.
  • FIG. 10 is a block diagram of components of an XR rendering device 100 that are configured to operate in accordance with some embodiments of the present disclosure.
  • the XR rendering device 100 can include processing circuitry, at least one network interface 1020 (network interface), and a display device 1030.
  • the processing circuitry may include at least one processor circuit 1000 (processor) and at least one memory 1010 (memory).
  • the processor 1000 is operationally connected to these various components.
  • the memory 1010 stores executable instructions 1012 that are executed by the processor 1000 to perform operations.
  • the processor 1000 may include one or more data processing circuits, such as a general purpose and/or special purpose processor (e.g., microprocessor and/or digital signal processor), which may be collocated or distributed across one or more data networks.
  • a general purpose and/or special purpose processor e.g., microprocessor and/or digital signal processor
  • the processor 1000 is configured to execute the instructions 1012 in the memory 1010, described below as a computer readable medium, to perform some or all of the operations and methods for one or more of the embodiments disclosed herein for an XR rendering device.
  • the XR rendering device may be separate from and communicatively connect to the participant devices or may be at least partially integrated within one or more of the participant devices.
  • the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof.
  • the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
  • the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
  • These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Some embodiments disclosed herein are directed to an extended reality, XR, rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment. The XR rendering device includes processing circuitry adapted to perform operations. The operations include defining a virtual portal with an entrance boundary at a location in physical space of a first participant. The operations also include tracking a location of a physical object relative to the location of the entrance boundary. The operations also include rendering a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant responsive to determining the physical object has crossed the entrance boundary of the virtual portal.

Description

VIRTUAL PORTAL BETWEEN PHYSICAL SPACE AND VIRTUAL SPACE IN EXTENDED REALITY ENVIRONMENTS
TECHNICAL FIELD
[0001] The present disclosure relates to rendering extended reality (XR) environments and associated XR rendering devices, and more particularly to rendering objects in immersive XR environments for display on XR participant devices.
BACKGROUND
[0002] Immersive extended reality (XR) environments have been developed which provide a myriad of different types of user experiences for gaming, on-line meetings, co-creation of products, etc. One type of immersive XR environment (also referred to as an "XR environment") is a virtual reality (VR) environment where human users only see computer generated graphical renderings. Another type of XR environment is an augmented reality (AR) environment where users see a combination of computer generated graphical renderings overlaid on a view of the physical real-world through, e.g., see-through display screens.
[0003] Example XR environment rendering devices include, without limitation, XR environment servers, XR headsets, gaming consoles, smartphones running an XR application, and tablet/laptop/desktop computers running an XR application. Oculus Quest is an example XR device and Google Glass is an example AR device.
[0004] XR meeting applications are tools for experiencing online immersive meetings and are useful as a thinking and planning space for oneself as well as among a group of people. Some XR meeting applications support AR devices, browsers, and VR devices. A participant using a browser may join via desktop, tablet-PC or smartphone and share their views using a front-facing camera or a web cam. Some XR meeting solutions have mobile application versions, e.g., Android and iOS, which allow a user to navigate in the virtual space on the screen or activate an AR mode to display the meeting in their own surroundings. The XR meeting solutions introduce new features to online meetings that allow for new ways to share and create content etc. Today’s commonly and commercially available XR devices typically include a head mounted device (HMD) and a pair of hand controllers, sometimes with more advanced solutions including “foot controllers”.
[0005] Immersive XR environments, such as gaming environments and meeting environments, are often configured to display computer generated avatars which represent poses of human users in the immersive XR environments. A user may select and customize an avatar, such as gender, clothing, hair style, etc. to represent that user for viewing by other users participating in the immersive XR environment. Users can interact with virtual objects that are rendered in the immersive XR environment, such as by controlling their avatars to manipulate virtual objects. Immersive XR environments presently support little if any interface between physical objects and virtual objects that are rendered.
SUMMARY
[0006] Some embodiments disclosed herein are directed to an XR rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment. The XR rendering device includes processing circuitry adapted to perform operations. The operations include defining a virtual portal with an entrance boundary at a location in physical space of a first participant. The operations also include tracking a location of a physical object relative to the location of the entrance boundary. The operations also include rendering a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant responsive to determining the physical object has crossed the entrance boundary of the virtual portal.
[0007] Some other embodiments are directed to a corresponding method by an XR rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment. The method includes defining a virtual portal with an entrance boundary at a location in physical space of a first participant. The method also includes tracking a location of a physical object relative to the location of the entrance boundary. The method also includes rendering a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant responsive to determining the physical object has crossed the entrance boundary of the virtual portal.
[0008] Some potential advantages of these embodiments are that they allow one participant (sending participant) in an immersive XR environment to cause a physical object to become rendered as a virtual object by the participant passing the physical object through the location of the virtual portal in physical space. The participant may view the rendered virtual object and another participant (receiving participant) in the immersive XR environment may simultaneously view the virtual object. In some further embodiments, the receiving participant is able to interact with the virtual object by, for example, using a hand of the corresponding avatar to manipulate the rendered pose of the virtual object. Feedback can be provided to the sending participant to experience the interaction of the receiving participant with the virtual object. Through these and other embodiments the usual boundary between physical and virtual space can become intuitively crossed by participants in the immersive XR environment.
[0009] Other XR rendering devices, methods, and computer program products according to embodiments will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional XR rendering devices, methods, and computer program products be included within this description and protected by the accompanying claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
[0011] Figure 1 illustrates an XR system that includes a plurality of participant devices that communicate through networks with an XR rendering device to operate in accordance with some embodiments of the present disclosure;
[0012] Figure 2 is an example illustration of the XR rendering device tracking location of a physical object in the first participant’s physical space while the physical object is being passed through a virtual portal and gradually rendered in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure;
[0013] Figure 3 illustrates an alternative example operation of the XR rendering device tracking location of a physical object in the first participant’s physical space while the physical object is being passed through a virtual portal and abruptly rendered in its entirety in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure;
[0014] Figure 4 illustrates an example virtual portal defined relative to a reference coordinate system in a physical space and relative to a first participant and a second participant device, in accordance with some embodiments of the present disclosure;
[0015] Figure 5 illustrates an example implementation of a pair of virtual portals 200 defined in separate physical spaces of Participant 1 and Participant 2, in accordance with some embodiments of the present disclosure;
[0016] Figure 6 illustrates an example implementation of a virtual portal in a physical space shared by the first participant and the second participant, in accordance with some embodiments of the present disclosure;
[0017] Figures 7 through 9 are flowcharts of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure; and
[0018] Figure 10 is a block diagram of components of an XR rendering device that are configured to operate in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION
[0019] Inventive concepts will now be described more fully hereinafter with reference to the accompanying drawings, in which examples of embodiments of inventive concepts are shown. Inventive concepts may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of various present inventive concepts to those skilled in the art. It should also be noted that these embodiments are not mutually exclusive. Components from one embodiment may be tacitly assumed to be present/used in another embodiment.
[0020] Figure 1 illustrates an XR system that includes a plurality of participant devices 110a and 110b that communicate through networks 120 with an XR rendering device 100 to operate in accordance with some embodiments of the present disclosure. The XR rendering device 100 is configured to generate a graphical representation of an immersive XR environment (also called an "XR environment" for brevity) which is viewable from various perspectives of virtual poses of human participants in the XR environment through display screens of the various participant devices 110a-b. For example, the illustrated devices include XR headsets 110a and 110b which can be worn by participants to view and navigate through the XR environment. The participant devices are not limited to XR headsets, as they could be a personal computer, tablet, smart phone, or other electronic device which can be operated by a participant to view and navigate through the XR environment. The participants may have associated avatars which are rendered in the XR environment to represent poses (e.g., location, body assembly orientation, etc.) of the participants relative to a coordinate system of the XR environment.
[0021] Although the XR rendering device 100 is illustrated in Figure 1 as being a centralized network computing server separate from one or more of the participant devices, in some other embodiments the XR rendering device 100 is implemented as a component of one or more of the participant devices. For example, one of the participant devices may be configured to perform operations of the XR rendering device in a centralized manner controlling rendering for or by other ones of the participant devices. Alternatively, each of the participant devices may be configured to perform at least some of the operations of the XR rendering device in a distributed decentralized manner with coordinated communications being performed between the distributed XR rendering devices (e.g., between software instances of XR rendering devices).
[0022] Some embodiments of the present disclosure are directed to an XR rendering device enabling one or more participants in an immersive XR environment to pass a physical object through one or more virtual portals located in physical space, to trigger the XR rendering device 100 to render a virtual object representing the physical object in the immersive XR environment. In an illustrative example, a first participant may reach a hand through the virtual portal to trigger the XR rendering device 100 to render a computer generated virtual representation of the hand in the immersive XR environment. The first participant and another second participant may simultaneously view the virtual hand. The second participant may similarly reach a hand through a virtual portal to trigger rendering of another virtual hand, and move the hand to virtually shake the virtual hand of the first participant. The first and second participant may be provided corresponding feedback of the virtual handshake, e.g., as tactile feedback initiated by the XR rendering device 100 and provided through tactile gloves worn by the first and second participants. As will be appreciated in view of the description provided below, the type of physical object that can be passed through a virtual portal for rendering and possible interaction with by participants in the immersive XR environment is not limited by any of the operational embodiments herein.
[0023] Figure 7 is a flowchart of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure.
[0024] Referring to Figure 7, an XR rendering device is provided for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment. The XR rendering device includes processing circuitry adapted to perform operations. The operations include to define 700 a virtual portal with an entrance boundary at a location in physical space of a first participant. The operations track 702 a location of a physical object relative to the location of the entrance boundary. The operations render 704 a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant responsive to determining the physical object has crossed the entrance boundary of the virtual portal.
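For illustration only, the following non-limiting Python sketch shows one way the define (700), track (702), and render (704) operations could be organized. The class and function names, the rectangular boundary model, and the crossing convention (negative z past the portal plane) are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of the Figure 7 flow: define a boundary, track points, render on crossing.
from dataclasses import dataclass

@dataclass
class EntranceBoundary:
    center: tuple          # (x, y, z) focus point in the first participant's physical coordinate system
    half_width: float      # half-extent along x
    half_height: float     # half-extent along y

    def crossed_by(self, point: tuple) -> bool:
        """True when a tracked point has passed the boundary plane (negative z)
        while staying within the boundary's rectangular extent."""
        x, y, z = (point[i] - self.center[i] for i in range(3))
        return z < 0 and abs(x) <= self.half_width and abs(y) <= self.half_height

def render_portal_frame(boundary: EntranceBoundary, tracked_points: list) -> bool:
    """One tracking/render iteration: report that the virtual object representation
    should be rendered as soon as any tracked point of the physical object crosses."""
    if any(boundary.crossed_by(p) for p in tracked_points):
        # A real XR rendering device would render the virtual object here.
        return True
    return False

# Example: a tracked point 5 cm behind the portal plane triggers rendering.
portal = EntranceBoundary(center=(0.0, 0.0, 0.0), half_width=0.25, half_height=0.25)
print(render_portal_frame(portal, [(0.1, 0.0, -0.05)]))  # True
```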
[0025] The rendering of the virtual object may be provided to a device of the participant and/or one or more other participants for viewing and possible interaction with. XR headsets can be worn by participants in the immersive XR environment to provide an intuitive way for the participants to view the rendered virtual object and possibly also view a rendering of the virtual portal. Figure 8 is a flowchart of operations that may be performed by the XR rendering device in accordance with some embodiments of the present disclosure. Referring to Figure 8, the operations further include communicating the virtual object to an XR headset worn by another one of the participants and/or the first participant for display.
[0026] Figure 2 is an example illustration of the XR rendering device 100 tracking location of a physical object 210 in the first participant’s physical space while the physical object 210 is being passed through a virtual portal 200 and rendered in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure. Referring to Figure 2, Participant 1 and Participant 2 are present participants in an immersive XR environment rendered by the XR rendering device 100. Participant 1 and Participant 2 can simultaneously view virtual objects that are rendered by the XR rendering device 100 in their respective virtual spaces, such as through XR headsets worn by the participants. The XR rendering device 100 is communicatively coupled to at least one sensor 230 that tracks location of a physical object 210, e.g., a smart phone, relative to the boundary of a virtual portal 200 having a location defined in physical space. The sensor 230 may be one or more cameras mounted to an XR headset and facing forward in the wearer's eye gaze direction. The sensor 230 may additionally or alternatively be other types of sensors able to track location of physical objects, such as LIDAR sensors, ultrasound sensors, etc. In some embodiments, two cameras may be used to track location of the physical object 210 in three-dimensional (3D) space or a single LIDAR sensor may be used. LIDAR and one or more cameras may be used together to track and determine location, shape, texture, and color of the physical object 210.
[0027] Referring to Figures 2 and 7, the XR rendering device 100 defines 700 the virtual portal 200 with an entrance boundary at a location in physical space of the first participant. The virtual portal 200 functions as a boundary through which the first participant can pass the physical object 210 to trigger the XR rendering device 100 to render a virtual representation 220 in the virtual spaces of the participants, e.g., Participant 1 and Participant 2, within the immersive XR environment.
[0028] In some embodiments, the XR rendering device 100 defines a location, size, and shape of an entrance boundary of virtual portal 200. The location of the entrance boundary of virtual portal 200 is defined in physical space, such as relative to a feature on a physical table in a room, a distance away from Participant 1, relative to an object worn by Participant 1 (smartwatch, smart glasses, etc.), etc. A computer generated representation of the virtual portal 200 may be displayed in the immersive XR environment in the virtual space viewed by Participant 1. The computer generated representation of the virtual portal 200 may additionally or alternatively be displayed in the virtual space viewed by Participant 2, such as when Participants 1 and 2 are sharing an immersive XR environment.
[0029] The term virtual space is used herein to refer to the displayable space within the computer rendered immersive XR environment. In contrast, physical space is used to refer to the real space in which the human participant resides.
[0030] In some embodiments, the XR rendering device 100 defines a location of an entrance boundary of the virtual portal 200 relative to a reference system in physical space where the participant resides and which is tracked by one or more sensors 230. The entrance boundary may be a single point, may extend along a two-dimensional (2D) plane (e.g., flat rectangle or flat circle), such as illustrated in Figure 2, or may be a three-dimensional (3D) shape (e.g., sphere or box). The entrance boundary may be defined to be a fixed distance and/or a fixed angle (fixed pose offset) relative to the participant or relative to a defined physical object which is in the physical space. The participant and/or the defined physical object may move within the physical space, and the entrance boundary location in the reference system of physical space may be dynamically updated to track changes in location of the participant and/or the defined physical object 210.
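For illustration only, the following sketch shows one way the entrance boundary could be kept at a fixed pose offset in front of a moving participant and re-derived each frame, as described above. The function name, the fixed offset distance, and the yaw-only orientation model are illustrative assumptions.

```python
# Hedged sketch: keep the entrance boundary's focus point a fixed distance ahead of the
# participant, re-computed as the participant moves or turns.
import math

def boundary_center_from_participant(participant_pos, participant_yaw_rad, offset_distance=0.6):
    """Place the boundary a fixed distance ahead of the participant along the
    participant's current facing direction (yaw about the vertical axis)."""
    px, py, pz = participant_pos
    return (px + offset_distance * math.sin(participant_yaw_rad),
            py,
            pz + offset_distance * math.cos(participant_yaw_rad))

# Re-derived every frame so the boundary tracks changes in the participant's location.
print(boundary_center_from_participant((1.0, 1.2, 0.0), math.radians(30)))
```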
[0031] The tracking of a physical object 210 relative to the virtual portal 200 is discussed in further detail below with reference to Figures 4 and 5.
[0032] Referring to Figures 2 and 7, the XR rendering device 100 uses sensor data received from the sensor 230 to track 702 a location of the physical object 210, such as a smart phone as illustrated, in the first participant’s physical space relative to the location of an entrance boundary of the virtual portal 200 in the first participant’s virtual space.
[0033] Referring to Figures 2, 7, and 8, the XR rendering device then renders 704 a virtual object 220 representation of at least a portion of the physical object 210. The XR rendering device may render the virtual object 220 for viewing by the second participant in the second participant’s virtual space, e.g., as illustrated in Figure 2. Optionally, as illustrated in Figure 2, the first participant may also see an overlay of the virtual object 220 displayed over the physical object 210. Alternatively, although not illustrated in Figure 2, the first participant may see the physical object 210 when passing the physical object through the virtual portal 200.
[0034] According to some embodiments illustrated in Figure 2, the XR rendering device 100 tracks the location of the physical object 210 relative to the entrance boundary of the virtual portal 200. When a threshold portion of the physical object is determined to have crossed the entrance boundary, the XR rendering device responds by performing operations to generate a virtual object 220 representation of at least a portion of the physical object 210 in the immersive XR environment. The XR rendering device 100 may, for example, cause the virtual object 220 to be displayed through the XR headsets worn by Participant 1 and Participant 2. For example, when an edge or other threshold portion of the physical object 210 is determined to have passed through the virtual portal 200, a partial rendering of the virtual object 220 corresponding to the portion of the physical object 210 which has passed through the virtual portal 200 is displayed for viewing by Participant 2 and/or Participant 1.
[0035] In other words, in some embodiments, the operation to render the virtual object representation of at least a portion of the physical object in the immersive XR environment, is performed responsive to determining that at least a portion of the physical object exceeding a threshold has crossed the entrance boundary.
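For illustration only, a minimal sketch of the threshold test described above follows. Modeling the object as a set of tracked points, the boundary plane as z = 0, and the 10% threshold are illustrative assumptions.

```python
# Illustrative sketch: trigger rendering only once more than a threshold fraction
# of the tracked object lies past the entrance boundary plane.
def fraction_crossed(object_points, boundary_z=0.0):
    """Fraction of the object's tracked points that have passed the entrance plane
    (modeled here as the plane z = boundary_z, crossed in the negative-z direction)."""
    crossed = sum(1 for (_, _, z) in object_points if z < boundary_z)
    return crossed / len(object_points) if object_points else 0.0

def should_render(object_points, threshold=0.10):
    return fraction_crossed(object_points) > threshold

points = [(0.0, 0.0, 0.02), (0.0, 0.0, -0.01), (0.0, 0.0, -0.03), (0.0, 0.0, 0.05)]
print(should_render(points))  # 50% of points crossed, above the 10% threshold -> True
```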
[0036] In an illustrative example, Participant 1 can see a smartphone 210 resting on a table by either viewing video displayed from a camera on a VR headset or by seeing the smartphone 210 through a see-through display of an AR headset. Participant 1 desires to display a virtual object representation 220 of the smartphone 210 to another participant (Participant 2) in the immersive XR environment. Therefore, Participant 1 picks up the smartphone 210 and moves it through the entrance boundary 200 of the virtual portal. As the XR rendering device 100 tracks, via the sensor 230, the smartphone 210 moving through the entrance boundary 200, it renders the virtual object representation of the smartphone 220 in the immersive XR environment. The rendering can include displaying the virtual object 220 through the XR headset worn by Participant 2 with a pose (location and orientation) defined in a virtual space of the immersive XR environment relative to the perspective of Participant 2 looking toward a computer generated representation of the virtual portal 200. Similarly, the rendering can include displaying the virtual object 220 through the XR headset worn by Participant 1 with a pose (location and orientation) defined in the virtual space relative to the perspective of Participant 1 looking toward a computer generated representation of the virtual portal 200.
[0037] As illustrated in Figure 2, in some embodiments, when rendering the virtual object 220, the XR rendering device 100 may render an amount of the virtual object 220 that is proportional to an amount of the physical object 210 (smartphone) that has moved through the entrance boundary, which can result in Participants 1 and 2 seeing the virtual object 220 gradually appear in the immersive XR environment.
[0038] In other words, in some embodiments, the operation to render 704 the virtual object 220 representation of at least a portion of the physical object 210 in the immersive XR environment, renders an amount of the virtual object 220 that is proportional to an amount of the physical object 210 that has moved across the entrance boundary of the virtual portal 200.
[0039] Alternatively, in some embodiments, when rendering the virtual object, the XR rendering device 100 may render the virtual object 220 in its entirety when a threshold amount of the physical object 210 (e.g., an edge of the smartphone) has moved through the entrance boundary 200, which can result in Participants 1 and 2 seeing the virtual object instantly appear in the immersive XR environment.
[0040] Figure 3 illustrates an alternative example operation of the XR rendering device 100 tracking location of the physical object 210 in the first participant’s physical space while the physical object is being passed through the virtual portal 200 and abruptly rendered in its entirety as a virtual object 300 in the virtual space of the immersive XR environment, in accordance with some embodiments of the present disclosure.
[0041] In other words, in some embodiments, the operation to render the virtual object representation of at least a portion of the physical object in the immersive XR environment, renders the virtual object in its entirety responsive to a threshold amount of the physical object having been moved across the entrance boundary of the virtual portal.
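For illustration only, the following sketch contrasts the two rendering policies described above: a gradual reveal proportional to how much of the object has crossed, and an all-at-once reveal once a threshold amount has crossed. The function names and the 5% threshold are illustrative assumptions.

```python
# Sketch of the gradual (proportional) policy versus the abrupt (whole-object) policy.
def visible_fraction_gradual(amount_crossed: float) -> float:
    """Render an amount of the virtual object proportional to the crossed amount (0..1)."""
    return max(0.0, min(1.0, amount_crossed))

def visible_fraction_abrupt(amount_crossed: float, threshold: float = 0.05) -> float:
    """Render the virtual object in its entirety once the threshold amount has crossed."""
    return 1.0 if amount_crossed >= threshold else 0.0

for a in (0.0, 0.03, 0.4, 1.0):
    print(a, visible_fraction_gradual(a), visible_fraction_abrupt(a))
```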
[0042] Figure 4 illustrates an example virtual portal 200 defined relative to a reference coordinate system in a physical space, and relative to locations of Participant 1 and Participant 2, in accordance with some embodiments of the present disclosure. Locations of physical objects relative to an entrance boundary 200 of the virtual portal can be tracked using sensors 230 relative to the reference coordinate system.
[0043] As explained above, the XR rendering device 100 defines a location of an entrance boundary 200 relative to a reference system in physical space where the participant resides. In the example of Figure 4, the entrance boundary is illustrated as including a focus point (FP) at a three-dimensional location (0, 0, 0) centered in the virtual portal 200 defined relative to a reference coordinate system in physical space. The size and shape of the entrance boundary 200 can be defined relative to the FP. In the illustrated example, the entrance boundary 200 has a two-dimensional square shape, but could have any shape in 2D or 3D space. The illustrated Participant 1 and Participant 2 are also located in the same physical space, and therefore can move physical objects which are trackable by the sensors 230 relative to the same reference coordinate system in physical space.
[0044] In Figure 4, the XR headsets 400 worn by Participant 1 and Participant 2 can include an object tracking sensor(s) 230, such as one or more front-facing cameras, LIDAR sensors, and/or ultrasound sensors. The object tracking sensor(s) 230 are configured to output data which is processed, e.g., by the XR rendering device 100, to track location of physical objects relative to the reference coordinate system in physical space. Cameras can output image frames which can be processed to identify a physical object, shape of the physical object, color of the physical object, texture of the physical object, and location of the physical object. LIDAR sensors can output point cloud data which can be similarly processed to identify an object, its shape, and its location. Alternatively, the object tracking sensor(s) may be separate from the XR headsets or a combination of XR headset-based object tracking sensor(s) and separate object tracking sensor(s) may be used.
[0045] In other words, in some embodiments, the operation to render 704 the virtual object representation of at least a portion of the physical object in the immersive XR environment, renders an amount of the virtual object that is proportional to an amount of the physical object that has moved across the entrance boundary of the virtual portal.
[0046] In some embodiments, the XR rendering device 100 may terminate or stop rendering of the virtual object based on the physical object being removed from the virtual portal. In these embodiments, the operations further include terminating rendering the virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by the other one of the participants and/or the first participant responsive to determining the physical object has no longer crossed the entrance boundary of the virtual portal.
[0047] In other words, in some embodiments, the operations further include to terminate rendering the virtual object 220 representation of at least a portion of the physical object 210 in the immersive XR environment for viewing by the other one of the participants and/or the first participant responsive to determining the physical object 210 has no longer crossed the entrance boundary of the virtual portal 200.
[0048] In some embodiments, the XR rendering device may trigger audible feedback, tactile feedback, or other visual feedback to one or both of Participants 1 and 2. In the smartphone example above, Participants 1 and 2 may be provided a tone or other audible notification, e.g., through XR headsets, when the virtual object representation of the smartphone is rendered in the immersive XR environment. In another embodiment, Participants 1 and 2 may be provided tactile feedback, such as a vibration through tactile gloves or controllers handled by Participants 1 and 2 when the virtual object representation of the smartphone is rendered in the immersive XR environment. In still another embodiment, Participants 1 and 2 may be provided visual feedback, such as text and/or graphical object displayed by the XR headsets, indicating that the virtual object representation of the smartphone is rendered in the immersive XR environment, which can be helpful when the Participants are not looking in a direction of the virtual portal.
[0049] Figure 9 illustrates a flowchart of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure.
[0050] Referring to Figures 2 and 9, in some of these embodiments, the operations further include generating 900 a notification through an electronic device operated by the first participant responsive to determining the physical object 210 has crossed the entrance boundary of the virtual portal 200. The notification is configured to trigger at least one of outputting audible feedback, generating tactile feedback, or displaying visual feedback to the first participant.
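For illustration only, the following sketch shows one way such a notification could be fanned out as audible, tactile, and/or visual feedback. The device interface (play_tone, vibrate, show_overlay) and the stub class are illustrative placeholders, not an actual device API.

```python
# Hedged sketch of notification dispatch when the object crosses the entrance boundary.
def notify_crossing(participant_device, modes=("audible", "tactile", "visual")):
    """Fan a boundary-crossing notification out over the requested feedback modes."""
    notification = {"event": "object_crossed_portal"}
    if "audible" in modes:
        participant_device.play_tone(notification)     # e.g., a tone through XR headset speakers
    if "tactile" in modes:
        participant_device.vibrate(notification)       # e.g., vibration through tactile gloves
    if "visual" in modes:
        participant_device.show_overlay(notification)  # e.g., text/graphic shown by the headset

class _StubDevice:
    """Stand-in for a participant device; a real implementation would drive actual hardware."""
    def play_tone(self, n): print("tone:", n)
    def vibrate(self, n): print("vibration:", n)
    def show_overlay(self, n): print("overlay:", n)

notify_crossing(_StubDevice())
```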
[0051] Figure 5 illustrates an example implementation of a pair of virtual portals 200 defined in separate physical spaces of Participant 1 and Participant 2, in accordance with some embodiments of the present disclosure.
[0052] Figure 5 illustrates that Participants 1 and 2 share a virtual portal having an entrance boundary 200 defined relative to separate physical spaces of Participants 1 and 2. For example, Participants 1 and 2 may be physically distant from each other but share the entrance boundary 200 of the virtual portal. The virtual portal can have a single entrance boundary that is defined relative to a coordinate system of the physical space of Participant 1 or Participant 1’s VR headset 400, or may have another entrance boundary that is defined relative to a coordinate system of the physical space of Participant 2 or Participant 2’s VR headset 400. More particularly, the XR rendering device can define the virtual portal with a first entrance boundary at a location in a first coordinate system of the physical space of the first participant and a second entrance boundary that is defined relative to a second coordinate system of a physical space of a second participant. The second entrance boundary may be defined and used responsive to a rule becoming satisfied for interacting with the virtual portal, e.g., when the second participant is within a threshold distance of the virtual portal and/or has an avatar with an eyesight gaze that is within a threshold angle toward the virtual portal.
[0053] In some of these embodiments, the operation to define the virtual portal with the entrance boundary at the location in physical space of the first participant further includes to define the virtual portal with a first entrance boundary at a location in a first coordinate system of the physical space of the first participant and a second entrance boundary that is defined relative to a second coordinate system of a physical space of a second participant.
[0054] Figure 6 illustrates an example alternative implementation of a virtual portal in a physical space shared between the first participant and second participant, in accordance with some embodiments of the present disclosure.
[0055] In the illustrative example shown in Figure 6, the XR rendering device 100 can define locations of Participant 1 (negative z-axis) and Participant 2 (positive z-axis) in a common reference system, so that, for example, any object moved by Participant 1 is tracked with a negative z-value (within the FP frame) and rendered at a corresponding virtual location for viewing by Participant 2 (positive z-axis).
[0056] Implementation of the various embodiments discussed herein involves analyzing objects interacting with the virtual portal. In some embodiments, the XR rendering device 100 monitors image data from each participant’s sensor(s), such as a camera. When a threshold amount (z threshold) of a physical object (hand, etc.) location (physical object z) has passed through the virtual portal, the relationship physical object z < z threshold is determined to be satisfied, which triggers object detection and rendering of an appropriate digital representation of the physical object (virtual object). In a further embodiment, determining that an edge of the physical object has passed through the virtual portal (e.g., physical object z < 0) can trigger object detection and rendering of a virtual object representation of the physical object on the receiver side, e.g., displayed to Participant 2. In some embodiments, as the physical object moves through the virtual portal, the virtual object gradually appears (more) on the receiving side, i.e., from 0% to 100% of the physical object is represented and rendered as a virtual object on the receiving side. Movement and other attributes of the physical object within the virtual portal can be rendered as corresponding motion and other attributes of the virtual object that are displayed for viewing by Participant 1, Participant 2, and/or other participants.
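For illustration only, a minimal sketch of this protrusion logic follows, under the stated assumption that the portal plane is z = 0 in the focus-point (FP) frame, with the sending side at positive z and the receiving side at negative z. The function name, the 2 cm detection threshold, and the min/max z-extent model are illustrative assumptions.

```python
# Hedged sketch of the z-based detection and gradual 0%-100% reveal on the receiving side.
def protrusion_state(min_z: float, max_z: float, z_threshold: float = 0.02):
    """min_z/max_z are the nearest and farthest z-extent of the tracked physical object
    in the FP frame. Returns (detected, revealed_fraction_on_receiving_side)."""
    detected = min_z < z_threshold              # object close enough to trigger detection
    if min_z >= 0.0:
        return detected, 0.0                    # nothing has passed the portal plane yet
    depth = max_z - min_z
    revealed = min(1.0, -min_z / depth) if depth > 0 else 1.0
    return detected, revealed                   # 0%..100% of the object shown to the receiver

print(protrusion_state(min_z=-0.05, max_z=0.15))  # about 25% of a 20 cm object has protruded
```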
[0057] In some embodiments, the XR rendering device 100 is a central server. The centralized XR rendering device 100 can receive sensor data (e.g., image data) from at least one sensor (e.g., a camera), and may operate to automatically determine a corresponding digital representation for rendering as the virtual object based on received image data. The rendering may be based on processing the received image data through image processing techniques and/or machine learning techniques. The XR rendering device 100 may access a database that includes 3D representations of different objects (wireframes, point clouds, 3D models, etc.). The XR rendering device 100 may select a 3D representation from a set of representations and its associated preferences/rules/characteristics. The cameras may continuously send image data to the XR rendering device 100, which adapts the 3D representation accordingly. The image data can be processed to determine information about position, gesture, orientation, etc. which is translated into parameters. The parameters are used to control rendering of the 3D representation and adapt appearance of the virtual object displayed to Participant 1 and/or Participant 2. As the physical object passes through the virtual portal (with increasing negative z-value), a corresponding part of the virtual object (a percentage of the whole object, up to 100%) is digitally represented at the receiving side.
[0058] Some embodiments are directed to determining a pose of the virtual object being rendered in the immersive XR environment. The pose may correspond to the location and/or angular orientation. The XR rendering device may be configured to determine a pose of a physical object that has crossed the entrance boundary of the virtual portal and render a pose of the virtual object rendered in the immersive XR environment based on the pose of the physical object. In the context of the smartphone example discussed above, an image stream from a camera on the XR headset of Participant 1 can be processed to determine the pose of the physical smartphone as it is held across the threshold of the entrance boundary. The XR rendering device can render the virtual smartphone with a pose that corresponds to (e.g., same as or defined offset relative to) the determined pose of the physical smartphone. The XR rendering device may dynamically change the pose of the virtual smartphone being rendered to track in real-time changes in pose of the physical smartphone.
[0059] In other words, in some embodiments, the operations further include to determine pose of the physical object relative to a coordinate system of the physical space of the first participant. The operations also further include to render pose of the virtual object in the immersive XR environment responsive to the determined pose of the physical object.
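For illustration only, the following sketch maps a tracked physical pose to a rendered virtual pose with an optional fixed offset, as described above. The Pose class, the position-plus-yaw simplification, and the function name are illustrative assumptions; a full implementation would more likely use quaternions or rotation matrices.

```python
# Hedged sketch: virtual object pose derived from the physical object's tracked pose.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple   # (x, y, z)
    yaw_deg: float    # rotation about the vertical axis

def virtual_pose_from_physical(physical: Pose, offset: Pose = Pose((0.0, 0.0, 0.0), 0.0)) -> Pose:
    """Same pose as the physical object, or a defined offset relative to it."""
    return Pose(tuple(p + o for p, o in zip(physical.position, offset.position)),
                physical.yaw_deg + offset.yaw_deg)

# Each tracking update re-derives the rendered pose, so rotation of the physical
# smartphone is reflected in real time by the virtual smartphone.
print(virtual_pose_from_physical(Pose((0.1, 1.0, -0.05), 42.0)))
```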
[0060] Some embodiments are directed to configuring the virtual portal.
[0061] A virtual portal may be defined to have any shape, where the shape may be defined depending on use case, context, participant settings and preferences, type of objects to be pushed through, communication channel capabilities, XR system capabilities (e.g., camera sensor capabilities).
[0062] In some embodiments, the virtual portal may be configured to apply a filter mechanism to affect the virtual object being rendered.
[0063] In some of these embodiments, the operations further include to determine a filter mechanism to apply an effect to the virtual object. The rendering of the virtual object representation of at least the portion of the physical object in the immersive XR environment includes to render the virtual object of at least the portion of the physical object with the filter mechanism applied.
[0064] Some examples of filter mechanisms which may be applied to virtual objects being rendered are now discussed:
[0065] First, a filter may be used to “clean” a physical object of features such as, but not limited to, hot-words, symbols, or tattoos. For example, tattoo(s) on the first participant’s hand pushed through the virtual portal can operationally disappear on the receiver side and/or the sender side. In this example, the XR system removes a tattoo by “erasing” the tattoo using operations such as occlusion by overlaying with XR display data so the receiver side receives a rendering of a digitalized hand which does not contain tattoo parts. Instead, the tattoo parts have been replaced by other skin texture from the hand.
[0066] Second, a filter may be used to change/modify colors and/or textures of the virtual object compared to the physical object. For example, the filter may apply a hydro dipping effect of texture and/or color to the virtual object. Hydro dipping is a method of applying printed graphics to three dimensional objects. In one example, the entrance boundary of the virtual portal may be decorated with a decorative pattern or texture which is then transferred to the surface of the virtual object when the physical object is passed through the virtual portal. This transferring of the decorative pattern or texture may be applied based on many factors such as the pose of the physical object as it crosses the entrance boundary.
[0067] Third, a filter may be used to apply effects such as animated movement of the virtual object being rendered. Corresponding haptic feedback may be provided to one or more participants interacting with the virtual object.
[0068] Fourth, a filter may be used to map the physical object with a selectable one of a plurality of different virtual objects. For example, a physical apple can be mapped by the XR rendering device to be rendered in the immersive XR environment as a virtual representation of an orange. In another example, one type of smartphone can be mapped to be rendered as another type of smartphone or other electronic device, such as a tablet computer.
[0069] In other words, in some embodiments, rendering the virtual object representation of at least the portion of the physical object with the filter mechanism applied includes at least one of: (1) applying a filter to erase a defined portion of the virtual object; (2) applying a filter to modify a color of at least part of the virtual object; (3) applying a filter to modify a texture of at least part of the virtual object; and (4) applying a filter to modify a rotation and/or translational movement of the virtual object.
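For illustration only, the following sketch shows a simple filter chain applied to a virtual object description before rendering. The dictionary-based object model and the filter names (erase, recolor, retexture) are illustrative assumptions covering examples from the list above, such as erasing a tattoo region or transferring a hydro-dipping style pattern.

```python
# Hedged sketch of composing filter mechanisms over a virtual object before rendering.
def erase_region_filter(region):
    def apply(obj):
        obj["erased_regions"] = obj.get("erased_regions", []) + [region]  # e.g., remove a tattoo
        return obj
    return apply

def recolor_filter(color):
    def apply(obj):
        obj["color"] = color
        return obj
    return apply

def retexture_filter(texture):
    def apply(obj):
        obj["texture"] = texture   # e.g., a hydro-dipping style pattern transfer
        return obj
    return apply

def render_with_filters(virtual_object, filters):
    for f in filters:
        virtual_object = f(virtual_object)
    return virtual_object

obj = {"model": "hand", "color": "skin", "texture": "default"}
print(render_with_filters(obj, [erase_region_filter("tattoo"), retexture_filter("pattern_A")]))
```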
[0070] In some of these embodiments, a filter setting may be defined on one or more of the XR headsets and/or on the XR rendering device. When one or more of the filter settings operationally conflict or are not compatible, resolution of the conflict may be performed based on one or more rules, such as a prioritization ranking of the filter settings.
[0071] In some embodiments, a virtual portal may have different sizes on the sending and receiving sides. For example, the receiving side virtual portal may be scaled by an amount defined by a specific context in which the virtual portal is used to receive physical objects.
[0072] In some embodiments, a virtual portal may have different sound profiles associated with protrusion of objects in general, and certain objects in particular. There may be a pre-defined sound profile per type of object and/or individual object, and/or associated with respective objects’ attributes, such as: “a hand coming through” triggers sound profile A; “right hand coming through” triggers sound profile B; “right hand coming through, holding a red apple” triggers sound profile C; and “apple coming through, interchanged to, orange coming through” triggers sound profile D. Respective sound profiles may be any of a “sound” or object-to-speech sequence associated with a detected to-be/being protruded physical/virtual object and the physical/virtual object’s attributes.
[0073] In other words, in some embodiments, the operations include to determine the physical object has crossed the entrance boundary of the virtual portal. Then responsive to determining the physical object has crossed the entrance boundary of the virtual portal, the operations further include to select a sound profile from among a set of a plurality of different sound profiles based on an identified characteristic of the physical object, and control an electronic device operated by the first participant to output sound defined by the selected sound profile.
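For illustration only, the following sketch shows a lookup from an identified object characteristic to one of several sound profiles, mirroring the profile A to D examples above. The table layout, key format, and default profile are illustrative assumptions.

```python
# Hedged sketch of sound-profile selection based on an identified characteristic.
SOUND_PROFILES = {
    ("hand", None): "profile_A",
    ("hand", "right"): "profile_B",
    ("hand", "right_holding_red_apple"): "profile_C",
    ("apple", "interchanged_to_orange"): "profile_D",
}

def select_sound_profile(object_type, attribute=None, default="profile_default"):
    """Prefer an exact (type, attribute) match, fall back to the type alone, then a default."""
    return SOUND_PROFILES.get((object_type, attribute),
                              SOUND_PROFILES.get((object_type, None), default))

print(select_sound_profile("hand", "right"))   # profile_B
print(select_sound_profile("banana"))          # profile_default
```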
[0074] In some embodiments, a physical object with digital capabilities is passed through the virtual portal. A non-digital object, such as a hand or apple, may be pushed through the virtual portal. The sending participant may similarly push through a physical electronic device, such as a smart watch, smartphone, or other electronic device.
[0075] In other words, in some embodiments, the physical object includes an electronic device. The operations further include to identify characteristic of an interaction of an avatar representing a second participant in the immersive XR environment with the virtual object representation of the electronic device. The operations also further include to communicate with the electronic device to control an operational function of the electronic device based on the identified characteristic of the interaction.
[0076] Some further embodiments are now described in the context of the example above where Participant 1 has passed a physical smartphone across the entrance boundary of the virtual portal, which has triggered the XR rendering device to render a virtual smartphone in the immersive XR environment to be viewable by Participants 1 and 2 through their respective XR headsets.
[0077] In one of these further embodiments, the XR rendering device tracks interaction of Participant 2's avatar with the virtual smartphone and may change the rendered pose of the virtual smartphone responsive to Participant 2's interaction. For example, the XR rendering device may rotate the virtual smartphone to track movement of Participant 2 avatar's hand rotating while virtually holding the virtual smartphone. In this manner, Participant 2 can rotate the virtual smartphone to view other surfaces thereof, and Participant 1 can visually observe Participant 2's interaction with the virtual smartphone.
[0078] In another of these further embodiments, the XR rendering device tracks interaction of Participant 2's avatar with the virtual smartphone and provides feedback to Participant 1 to responsively change pose of the physical smartphone which, in turn, will then change the rendered pose of the virtual smartphone. For example, the XR rendering device may track movement of Participant 2 avatar's hand which is posed to virtually hold the virtual smartphone, and provide visual feedback, audible feedback, and/or tactile feedback to guide Participant 1 to change pose of the physical smartphone in a manner that corresponds to the determined movement of Participant 2 avatar's hand. As Participant 1 changes the pose of (e.g., rotates) the physical smartphone, the XR rendering device can track the observed movements to render updated poses of the virtual smartphone.
[0079] In some further embodiments, the operations include to track interaction of an avatar representing a second participant in the immersive XR environment interacting with the virtual object. The operations also further include to change rendered pose of the virtual object in the immersive XR environment responsive to the tracked interaction.
[0080] In some further embodiments, the operations include to track rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object in the immersive XR environment. The operations also further include to change rendered pose of the virtual object in the immersive XR environment based on the tracked rotation and/or translational movement of the part of the avatar.
[0081] In some further embodiments, the operations include to track rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object in the immersive XR environment. The operations also further include to provide visual feedback, audible feedback, and/or tactile feedback through an electronic device of the first participant to guide the first participant to rotate and/or translationally move the physical object in a manner that corresponds to the tracked rotation and/or translational movement of the part of the avatar.
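For illustration only, the following sketch computes rotational guidance for the first participant by comparing the pose the avatar has given the virtual object with the physical object's tracked pose. The yaw-only model, tolerance value, and sign convention (positive delta means counter-clockwise) are illustrative assumptions.

```python
# Hedged sketch of guidance feedback derived from the avatar-versus-physical pose difference.
def rotation_guidance(physical_yaw_deg: float, avatar_yaw_deg: float, tolerance_deg: float = 2.0):
    """Return a guidance string telling the first participant how to rotate the physical object."""
    delta = (avatar_yaw_deg - physical_yaw_deg + 180.0) % 360.0 - 180.0   # shortest signed angle
    if abs(delta) <= tolerance_deg:
        return "hold"   # poses already match; no guidance needed
    direction = "counter-clockwise" if delta > 0 else "clockwise"
    return f"rotate {direction} by {abs(delta):.0f} degrees"

print(rotation_guidance(physical_yaw_deg=10.0, avatar_yaw_deg=55.0))  # rotate counter-clockwise by 45 degrees
```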
[0082] In some embodiments, the digitally represented virtual object may have certain characteristics based on: pre-determined parameters associated with physical objects stored on a central server, Participant 1 (sender side) or Participant 2 (receiving side) preferences, or a combination thereof. These characteristics may include: a type of 3D model (e.g., point cloud, wire frame, polygon, surface, etc.); color, texture, physical weight, density, size, center of mass, moment of inertia; animation instructions, such as whether the texture should change or if the virtual object should move by itself or maybe blink; or whether the virtual object may be controlled and manipulated by Participant 1 (sender side) and/or Participant 2 (receiving side), such as by touching, feeling, and manipulating the object.
[0083] In other words, in some embodiments, the operation to render the virtual object representation of at least a portion of the physical object in the immersive XR environment, includes to identify a characteristic of the physical object. The operation also includes to select a virtual object characteristic from among a set of a plurality of different virtual object characteristics based on the identified characteristic of the physical object. The operation also includes to render the virtual object representation of at least a portion of the physical object in the immersive XR environment using the selected virtual object characteristic.
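For illustration only, the following sketch shows one way an identified characteristic of the physical object could select a stored set of virtual object characteristics for rendering. The lookup table contents, keys, and fallback entry are illustrative assumptions.

```python
# Hedged sketch: map an identified physical-object characteristic to virtual object characteristics.
CHARACTERISTIC_SETS = {
    "smartphone": {"model_type": "polygon", "texture": "glass_and_metal", "interactive": True},
    "apple":      {"model_type": "point_cloud", "texture": "fruit_skin", "interactive": False},
}

def select_characteristics(identified: str) -> dict:
    """Select a characteristic set; fall back to a generic wireframe when unrecognized."""
    return CHARACTERISTIC_SETS.get(identified, {"model_type": "wireframe", "interactive": False})

def render_virtual_object(identified_characteristic: str) -> dict:
    chars = select_characteristics(identified_characteristic)
    return {"virtual_object": identified_characteristic, **chars}

print(render_virtual_object("smartphone"))
```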
[0084] In some embodiments, a self-view setting may be used to allow Participant 1 (sending side) to have visual feedback of the digitally represented object viewed by Participant 2 (receiving side). The visual feedback may be generated based on an image or video feed from a camera oriented toward Participant 2's field of view.
[0085] In some further embodiments, another replicating virtual portal may be configured to have its intended receiving side (at Participant 2 side) facing the actual sending side (Participant 1) so that Participant 1 in a replicating virtual portal sees Participant 1’s own protruding object just as Participant 2 would have.
[0086] Example XR Rendering Device Configuration
[0087] Figure 10 is a block diagram of components of an XR rendering device 100 that are configured to operate in accordance with some embodiments of the present disclosure. The XR rendering device 100 can include processing circuitry, at least one network interface 1020 (network interface), and a display device 1030. The processing circuitry may include at least one processor circuit 1000 (processor) and at least one memory 1010 (memory). The processor 1000 is operationally connected to these various components. The memory 1010 stores executable instructions 1012 that are executed by the processor 1000 to perform operations. The processor 1000 may include one or more data processing circuits, such as a general purpose and/or special purpose processor (e.g., microprocessor and/or digital signal processor), which may be collocated or distributed across one or more data networks. The processor 1000 is configured to execute the instructions 1012 in the memory 1010, described below as a computer readable medium, to perform some or all of the operations and methods for one or more of the embodiments disclosed herein for an XR rendering device. As explained above, the XR rendering device may be separate from and communicatively connected to the participant devices or may be at least partially integrated within one or more of the participant devices.
[0088] Further Definitions and Embodiments:
[0089] In the above description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0090] When an element is referred to as being "connected", "coupled", "responsive", or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected", "directly coupled", "directly responsive", or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, "coupled", "connected", "responsive", or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term "and/or" includes any and all combinations of one or more of the associated listed items.
[0091] It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
[0092] As used herein, the terms "comprise", "comprising", "comprises", "include", "including", "includes", "have", "has", "having", or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but does not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation "e.g.", which derives from the Latin phrase "exempli gratia," may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation "i.e.", which derives from the Latin phrase "id est," may be used to specify a particular item from a more general recitation.
[0093] Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
[0094] These computer program instructions may also be stored in a tangible computer- readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as "circuitry," "a module" or variants thereof.
[0095] It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
[0096] Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts are to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims

CLAIMS:
1. An extended reality, XR, rendering device (100) for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment, the XR rendering device (100) comprising: a processing circuitry adapted to: define a virtual portal (200) with an entrance boundary at a location in physical space of a first participant; track a location of a physical object (210) relative to the location of the entrance boundary; and responsive to determining the physical object (210) has crossed the entrance boundary of the virtual portal (200), render a virtual object (220) representation of at least a portion of the physical object (210) in the immersive XR environment for viewing by another one of the participants and/or the first participant.
2. The XR rendering device (100) of Claim 1, wherein the operations further comprise to: communicate the virtual object (220) to an XR headset worn by the other one of the participants and/or the first participant for display.
3. The XR rendering device (100) of any of Claims 1 to 2, wherein the operation to render the virtual object (220) representation of at least a portion of the physical object (210) in the immersive XR environment, is performed responsive to determining that at least a portion of the physical object (210) exceeding a threshold has crossed the entrance boundary.
4. The XR rendering device (100) of any of Claims 1 to 3, wherein the operation to render the virtual object (220) representation of at least a portion of the physical object (210) in the immersive XR environment, renders the virtual object (220) in its entirety responsive to a threshold amount of the physical object (210) having been moved across the entrance boundary of the virtual portal (200).
5. The XR rendering device (100) of any of Claims 1 to 3, wherein the operation to render the virtual object (220) representation of at least a portion of the physical object (210) in the immersive XR environment, renders an amount of the virtual object (220) that is proportional to an amount of the physical object (210) that has moved across the entrance boundary of the virtual portal (200).
6. The XR rendering device (100) of any of Claims 1 to 5, wherein the operations further comprise to: responsive to determining the physical object (210) has crossed the entrance boundary of the virtual portal (200), generate a notification through an electronic device operated by the first participant, wherein the notification is configured to trigger at least one of outputting audible feedback, generating tactile feedback, or displaying visual feedback to the first participant.
7. The XR rendering device (100) of any of Claims 1 to 6, wherein the operation to define the virtual portal (200) with the entrance boundary at the location in physical space of the first participant further comprises to: define the virtual portal (200) with a first entrance boundary at a location in a first coordinate system of the physical space of the first participant and a second entrance boundary that is defined relative a second coordinate system of a physical space of a second participant.
8. The XR rendering device (100) of any of Claims 1 to 7, wherein the operations further comprise to: determine pose of the physical object (210) relative to a coordinate system of the physical space of the first participant; render pose of the virtual object (220) in the immersive XR environment responsive to the determined pose of the physical object (210).
9. The XR rendering device (100) of any of Claims 1 to 8, wherein the operations further comprise to: determine a filter mechanism to apply an effect to the virtual object (220), and wherein the rendering the virtual object (220) representation of at least the portion of the physical object (210) in the immersive XR environment comprises to: render the virtual object (220) of at least the portion of the physical object (210) with the filter mechanism applied.
10. The XR rendering device (100) of Claim 9, wherein to render the virtual object (220) representation of at least the portion of the physical object (210) with the filter mechanism applied comprises at least one of: applying a filter to erase a defined portion of the virtual object (220); applying a filter to modify a color of at least part of the virtual object (220); applying a filter to modify a texture of at least part of the virtual object (220); and applying a filter to modify a rotation and/or translational movement of the virtual object (220).
11. The XR rendering device (100) of any of Claims 1 to 10, wherein the operations further comprise to: responsive to determining the physical object (210) has crossed the entrance boundary of the virtual portal (200), select a sound profile from among a set of a plurality of different sound profiles based on an identified characteristic of the physical object (210); and control an electronic device operated by the first participant to output sound defined by the selected sound profile.
12. The XR rendering device (100) of any of Claims 1 to 11, wherein the physical object (210) comprises an electronic device, and the operations further comprise to: identify characteristic of an interaction of an avatar representing a second participant in the immersive XR environment with the virtual object (220) representation of the electronic device; communicate with the electronic device to control an operational function of the electronic device based on the identified characteristic of the interaction.
13. The XR rendering device (100) of any of Claims 1 to 12, wherein the operations further comprise to: track interaction of an avatar representing a second participant in the immersive XR environment interacting with the virtual object (220); and change rendered pose of the virtual object (220) in the immersive XR environment responsive to the tracked interaction.
14. The XR rendering device (100) of Claim 13, wherein the operations further comprise to: track rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object (220) in the immersive XR environment; and change rendered pose of the virtual object (220) in the immersive XR environment based on the tracked rotation and/or translational movement of the part of the avatar.
15. The XR rendering device (100) of any of Claims 13 to 14, wherein the operations further comprise to: track rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object (220) in the immersive XR environment; and provide visual feedback, audible feedback, and/or tactile feedback through an electronic device of the first participant to guide the first participant to rotate and/or translationally move the physical object (210) in a manner that corresponds to the tracked rotation and/or translational movement of the part of the avatar.
16. The XR rendering device (100) of any of Claims 1 to 15, wherein the operation to render the virtual object (220) representation of at least a portion of the physical object (210) in the immersive XR environment, comprises to: identify a characteristic of the physical object (210); select virtual object characteristic from among a set of a plurality of different virtual object characteristics based on the identified characteristic of the physical object (210); and render the virtual object (220) representation of at least a portion of the physical object (210) in the immersive XR environment using the select virtual object characteristic.
17. The XR rendering device (100) of any of Claims 1 to 16, wherein the operations further comprise to: responsive to determining the physical object (210) no longer crosses the entrance boundary of the virtual portal (200), terminate rendering the virtual object (220) representation of at least a portion of the physical object (210) in the immersive XR environment for viewing by the other one of the participants and/or the first participant.
18. A method by an extended reality, XR, rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants in the immersive XR environment, the method comprising: defining (700) a virtual portal with an entrance boundary at a location in physical space of a first participant; tracking (702) a location of a physical object relative to the location of the entrance boundary; and responsive to determining the physical object has crossed the entrance boundary of the virtual portal, rendering (704) a virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by another one of the participants and/or the first participant.
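Purely as a non-limiting illustration, and not as part of the claims, the sketch below shows one way the defining (700), tracking (702), and rendering (704) operations of claim 18 could be organized in code. The axis-aligned EntranceBoundary representation, the tracking callback, and the renderer interface are assumptions introduced here for readability.

```python
# Illustrative sketch only; boundary shape, tracker callback, and renderer API are assumed.
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class EntranceBoundary:
    min_corner: Vec3
    max_corner: Vec3

    def contains(self, p: Vec3) -> bool:
        # Axis-aligned box test in the first participant's physical coordinate system.
        return all(lo <= c <= hi for lo, c, hi in zip(self.min_corner, p, self.max_corner))

class VirtualPortal:
    def __init__(self, entrance: EntranceBoundary, renderer):
        self.entrance = entrance   # defined (700) at a location in the first participant's physical space
        self.renderer = renderer   # abstract XR rendering back end (assumed interface)
        self.visible = False

    def on_tracking_update(self, object_id: str, position: Vec3) -> None:
        # Called for each tracked location (702) of the physical object.
        crossed = self.entrance.contains(position)
        if crossed and not self.visible:
            self.renderer.show_virtual_object(object_id)   # render (704) for the other participant(s)
            self.visible = True
        elif not crossed and self.visible:
            self.renderer.hide_virtual_object(object_id)   # cf. claims 17/24/35: stop rendering on withdrawal
            self.visible = False
```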
19. The method of Claim 18, further comprising:
communicating (800) the virtual object to an XR headset worn by the other one of the participants and/or the first participant for display.
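As a non-limiting sketch of the communicating (800) operation of claim 19, the snippet below serializes a minimal description of the virtual object and pushes it to a connected XR headset; the payload fields and the transport object are assumptions, not prescribed by the claim.

```python
import json

def communicate_virtual_object(virtual_object: dict, headset_connection) -> None:
    # Serialize a minimal description of the virtual object for display on the headset.
    payload = json.dumps({
        "object_id": virtual_object["id"],
        "mesh_uri":  virtual_object["mesh_uri"],   # reference to geometry, assumed to be pre-shared
        "pose":      virtual_object["pose"],       # position and orientation in scene coordinates
    })
    headset_connection.send(payload)               # assumed transport, e.g. an already-open data channel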
20. The method of any of Claims 18 to 19, wherein rendering (704) the virtual object representation of at least a portion of the physical object in the immersive XR environment, is performed responsive to determining that at least a portion of the physical object exceeding a threshold has crossed the entrance boundary.
21. The method of any of Claims 18 to 20, wherein rendering (704) the virtual object representation of at least a portion of the physical object in the immersive XR environment, renders the virtual object in its entirety responsive to a threshold amount of the physical object having been moved across the entrance boundary of the virtual portal.
22. The method of any of Claims 18 to 20, wherein rendering (704) the virtual object representation of at least a portion of the physical object in the immersive XR environment, renders an amount of the virtual object that is proportional to an amount of the physical object that has moved across the entrance boundary of the virtual portal.
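Claims 20 to 22 differ in how much of the virtual object is rendered as the physical object moves through the entrance boundary. The non-limiting sketch below contrasts a threshold policy (claims 20 and 21) with a proportional policy (claim 22), reusing the contains() test from the sketch after claim 18; the fraction-of-sample-points metric is an assumption.

```python
def fraction_inside(object_points, boundary) -> float:
    # Approximate how much of the physical object is across the entrance boundary
    # by sampling points on the object (the sampling strategy is an assumption).
    if not object_points:
        return 0.0
    inside = sum(1 for p in object_points if boundary.contains(p))
    return inside / len(object_points)

def threshold_policy(fraction: float, threshold: float = 0.5) -> float:
    # Claims 20-21: once the threshold is exceeded, render the virtual object in its entirety.
    return 1.0 if fraction >= threshold else 0.0

def proportional_policy(fraction: float) -> float:
    # Claim 22: render an amount of the virtual object proportional to the amount
    # of the physical object that has moved across the entrance boundary.
    return fraction
```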
23. The method of any of Claims 18 to 22, further comprising: responsive to determining the physical object has crossed the entrance boundary of the virtual portal, generating (900) a notification through an electronic device operated by the first participant, wherein the notification is configured to trigger at least one of outputting audible feedback, generating tactile feedback, or displaying visual feedback to the first participant.
24. The method of any of Claims 18 to 23, further comprising: responsive to determining the physical object no longer crosses the entrance boundary of the virtual portal, terminating rendering the virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by the other one of the participants and/or the first participant.
25. The method of any of Claims 18 to 24, wherein defining (700) the virtual portal with the entrance boundary at the location in physical space of the first participant further comprises: defining the virtual portal with a first entrance boundary at a location in a first coordinate system of the physical space of the first participant and a second entrance boundary that is defined relative to a second coordinate system of a physical space of a second participant who satisfies a rule for interacting with the virtual portal.
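For claim 25, one non-limiting way to anchor the same portal in two participants' physical spaces is to re-express the first entrance boundary in the second participant's coordinate system via a rigid transform, as sketched below; how the 4x4 transform is obtained (for example, from shared spatial anchors) is an assumption outside the claim.

```python
import numpy as np

def transform_point(T: np.ndarray, p) -> np.ndarray:
    # Apply a 4x4 homogeneous transform T to a 3D point p.
    return (T @ np.append(np.asarray(p, dtype=float), 1.0))[:3]

def second_entrance_corners(first_corners, T_first_to_second: np.ndarray):
    # Re-express the first participant's entrance boundary corners in the coordinate
    # system of a second participant who satisfies the rule for interacting with the portal.
    return [transform_point(T_first_to_second, c) for c in first_corners]
```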
26. The method of any of Claims 18 to 25, further comprising: determining a pose of the physical object relative to a coordinate system of the physical space of the first participant; and rendering a pose of the virtual object in the immersive XR environment responsive to the determined pose of the physical object.
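A non-limiting sketch of claim 26 follows: the pose determined in the first participant's physical coordinate system is mapped into the immersive scene before rendering. The matrix representation of pose and the fixed physical-to-scene transform are assumptions.

```python
import numpy as np

def pose_to_matrix(position, rotation_3x3) -> np.ndarray:
    # Pack a position vector and a 3x3 rotation matrix into a 4x4 pose matrix.
    T = np.eye(4)
    T[:3, :3] = rotation_3x3
    T[:3, 3] = position
    return T

def virtual_object_render_pose(T_phys_to_scene: np.ndarray, physical_pose) -> np.ndarray:
    # physical_pose: (position, rotation) determined relative to the first participant's coordinate system.
    position, rotation = physical_pose
    return T_phys_to_scene @ pose_to_matrix(position, rotation)   # pose used to render the virtual object
```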
27. The method of any of Claims 18 to 26, further comprising: determining a filter mechanism to apply an effect to the virtual object, wherein rendering the virtual object representation of at least the portion of the physical object in the immersive XR environment comprises: rendering the virtual object representation of at least the portion of the physical object with the filter mechanism applied.
28. The method of Claim 27, wherein rendering (704) the virtual object representation of at least the portion of the physical object with the filter mechanism applied comprises at least one of: applying a filter to erase a defined portion of the virtual object; applying a filter to modify a color of at least part of the virtual object; applying a filter to modify a texture of at least part of the virtual object; and applying a filter to modify a rotation and/or translational movement of the virtual object.
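A non-limiting sketch of the filter mechanism of claims 27 and 28 (and claims 9 and 10) is given below; the filter descriptors and the methods on the virtual object are assumptions introduced for illustration.

```python
def apply_filters(virtual_object, filters):
    # Apply zero or more filters to the virtual object before it is rendered.
    for f in filters:
        if f["type"] == "erase":
            virtual_object.hide_region(f["region"])            # erase a defined portion
        elif f["type"] == "color":
            virtual_object.set_color(f["color"])               # modify color of at least part
        elif f["type"] == "texture":
            virtual_object.set_texture(f["texture"])           # modify texture of at least part
        elif f["type"] == "motion":
            virtual_object.clamp_motion(f["max_rotation"],     # modify rotation and/or translational movement
                                        f["max_translation"])
    return virtual_object
```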
29. The method of any of Claims 18 to 28, further comprising: responsive to determining the physical object has crossed the entrance boundary of the virtual portal, selecting a sound profile from among a set of a plurality of different sound profiles based on an identified characteristic of the physical object; and controlling an electronic device operated by the first participant to output sound defined by the selected sound profile.
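A non-limiting sketch of the sound-profile selection of claims 11 and 29 follows; the characteristic labels, profile file names, and audio-device API are assumptions.

```python
SOUND_PROFILES = {
    "metallic": "portal_metal.ogg",
    "soft":     "portal_soft.ogg",
    "default":  "portal_default.ogg",
}

def on_object_crossed_boundary(identified_characteristic: str, audio_device) -> None:
    # Select a sound profile based on the identified characteristic of the physical object
    # and play it on an electronic device operated by the first participant.
    profile = SOUND_PROFILES.get(identified_characteristic, SOUND_PROFILES["default"])
    audio_device.play(profile)
```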
30. The method of any of Claims 18 to 29, wherein the physical object comprises an electronic device, and further comprising: identifying a characteristic of an interaction of an avatar representing a second participant in the immersive XR environment with the virtual object representation of the electronic device; and communicating with the electronic device to control an operational function of the electronic device based on the identified characteristic of the interaction.
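A non-limiting sketch of claims 12 and 30: a characteristic of the avatar's interaction with the virtual representation of an electronic device is mapped onto an operational function of the real device. The interaction labels and the device-client API are assumptions.

```python
def handle_avatar_interaction(interaction: dict, device_client) -> None:
    # interaction: identified characteristic of the second participant's avatar interaction
    # with the virtual object representing the electronic device.
    kind = interaction.get("type")
    if kind == "press":
        device_client.send_command("toggle_power")                            # e.g. switch the device on or off
    elif kind == "rotate":
        device_client.send_command("set_level", value=interaction["angle"])   # e.g. adjust a dial setting
    # Other interaction characteristics could map to other operational functions of the device.
```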
31. The method of any of Claims 18 to 30, further comprising: tracking interaction of an avatar representing a second participant in the immersive XR environment with the virtual object; and changing a rendered pose of the virtual object in the immersive XR environment responsive to the tracked interaction.
32. The method of Claim 31, further comprising: tracking rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object in the immersive XR environment; and changing a rendered pose of the virtual object in the immersive XR environment based on the tracked rotation and/or translational movement of the part of the avatar.
33. The method of any of Claims 31 to 32, further comprising: tracking rotation and/or translational movement of a part of the avatar of the second participant while the part is virtually touching the virtual object in the immersive XR environment; and providing visual feedback, audible feedback, and/or tactile feedback through an electronic device of the first participant to guide the first participant to rotate and/or translationally move the physical object in a manner that corresponds to the tracked rotation and/or translational movement of the part of the avatar.
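A non-limiting sketch of claims 15 and 33: the difference between the avatar-driven target pose and the tracked pose of the physical object is converted into guidance cues for the first participant. The scalar pose error, the tolerance, and the feedback-device API are assumptions.

```python
def guide_first_participant(target_pose: float, physical_pose: float, feedback_device,
                            tol: float = 0.05) -> None:
    # target_pose / physical_pose are simplified here to a single scalar (e.g. a rotation angle in radians).
    error = target_pose - physical_pose
    if abs(error) <= tol:
        feedback_device.haptic_pulse(strength=0.1)                     # gentle confirmation: poses match
    else:
        feedback_device.show_arrow(direction=1 if error > 0 else -1)   # visual cue: which way to move or rotate
        feedback_device.haptic_pulse(strength=min(1.0, abs(error)))    # tactile cue scaled by the error
```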
34. The method of any of Claims 18 to 33, wherein rendering (704) the virtual object representation of at least a portion of the physical object in the immersive XR environment comprises: identifying a characteristic of the physical object; selecting a virtual object characteristic from among a set of a plurality of different virtual object characteristics based on the identified characteristic of the physical object; and rendering the virtual object representation of at least a portion of the physical object in the immersive XR environment using the selected virtual object characteristic.
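A non-limiting sketch of claims 16 and 34: an identified characteristic of the physical object selects how its virtual counterpart is rendered. The characteristic labels and style fields below are illustrative assumptions.

```python
CHARACTERISTIC_TO_STYLE = {
    "document": {"material": "paper", "highlight": True},
    "tool":     {"material": "metal", "highlight": False},
}
DEFAULT_STYLE = {"material": "generic", "highlight": False}

def select_virtual_object_characteristic(identified_characteristic: str) -> dict:
    # Select a virtual object characteristic from a set of predefined characteristics,
    # falling back to a default style when the characteristic is not recognized.
    return CHARACTERISTIC_TO_STYLE.get(identified_characteristic, DEFAULT_STYLE)
```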
35. The method of any of Claims 18 to 34, further comprising: responsive to determining the physical object no longer crosses the entrance boundary of the virtual portal, terminating rendering the virtual object representation of at least a portion of the physical object in the immersive XR environment for viewing by the other one of the participants and/or the first participant.
PCT/EP2022/078769 2022-10-17 2022-10-17 Virtual portal between physical space and virtual space in extended reality environments WO2024083302A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/078769 WO2024083302A1 (en) 2022-10-17 2022-10-17 Virtual portal between physical space and virtual space in extended reality environments

Publications (1)

Publication Number Publication Date
WO2024083302A1 true WO2024083302A1 (en) 2024-04-25

Family

ID=84331837

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/078769 WO2024083302A1 (en) 2022-10-17 2022-10-17 Virtual portal between physical space and virtual space in extended reality environments

Country Status (1)

Country Link
WO (1) WO2024083302A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021145954A1 (en) * 2020-01-16 2021-07-22 Microsoft Technology Licensing, Llc Remote collaborations with volumetric space indications
EP3923149A1 (en) * 2019-02-04 2021-12-15 Sony Group Corporation Information processing device and information processing method

Similar Documents

Publication Publication Date Title
CN113661691B (en) Electronic device, storage medium, and method for providing an augmented reality environment
US9829989B2 (en) Three-dimensional user input
AU2021290132B2 (en) Presenting avatars in three-dimensional environments
US10192363B2 (en) Math operations in mixed or virtual reality
AU2016210884A1 (en) Method and system for providing virtual display of a physical environment
CA2941333A1 (en) Virtual conference room
US11893154B2 (en) Systems, methods, and graphical user interfaces for updating display of a device relative to a user's body
CN111226187A (en) System and method for interacting with a user through a mirror
US20220254125A1 (en) Device Views and Controls
EP4254943A1 (en) Head-tracking based media selection for video communications in virtual environments
Nijholt Capturing obstructed nonverbal cues in augmented reality interactions: a short survey
US20210327121A1 (en) Display based mixed-reality device
CN112987914B (en) Method and apparatus for content placement
JP2023095862A (en) Program and information processing method
WO2024083302A1 (en) Virtual portal between physical space and virtual space in extended reality environments
EP3185103A1 (en) A gazed virtual object identification determination module, a system for implementing gaze translucency, and a related method
EP4371296A1 (en) Adjusting pose of video object in 3d video stream from user device based on augmented reality context information from augmented reality display device
US20230298250A1 (en) Stereoscopic features in virtual reality
US20240103705A1 (en) Convergence During 3D Gesture-Based User Interface Element Movement
US20240320893A1 (en) Lightweight Calling with Avatar User Representation
WO2024061462A1 (en) Rendering user avatar and digital object in extended reality based on user interactions with physical object
WO2024199656A1 (en) Multiple participant and object interactions through a virtual portal in a virtual space of extended reality environment
WO2024228846A1 (en) Devices, methods, and graphical user interfaces for displaying a representation of a person
CN118057266A (en) Method and device for controlling user position in virtual scene
WO2024220073A1 (en) Hybrid sensor fusion for avatar generation

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22802125

Country of ref document: EP

Kind code of ref document: A1