WO2023208317A1 - Equitable rendering of avatar heights for extended reality environments - Google Patents

Equitable rendering of avatar heights for extended reality environments

Info

Publication number
WO2023208317A1
Authority
WO
WIPO (PCT)
Prior art keywords
height
avatar
participant
rendering
participants
Application number
PCT/EP2022/060950
Other languages
French (fr)
Inventor
Peter ÖKVIST
Gunilla BERNDTSSON
Tommy Arngren
Tommy Falk
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Application filed by Telefonaktiebolaget Lm Ericsson (Publ)
Priority to PCT/EP2022/060950
Publication of WO2023208317A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/70 Game security or game management aspects
    • A63F 13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/50 Features of games characterized by details of game servers
    • A63F 2300/55 Details of game data or player data management
    • A63F 2300/5546 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F 2300/80 Features of games specially adapted for executing a specific type of game
    • A63F 2300/8082 Virtual reality

Definitions

  • the operations determine the participant avatar height as an offset height value, defined by the rendering height preference, that is added to or multiplied with a height of a tallest avatar among the group of other participants in the immersive XR environment.
  • the operations determine the participant avatar height as an offset height value, defined by the rendering height preference, that is added to or multiplied with a height of a shortest avatar among the group of other participants in the immersive XR environment.
  • the rendering height preference defines an average height relationship to other avatar heights, and the operations determine the participant avatar height as an average of heights of avatars of the group of other participants in the immersive XR environment.
  • participant devices provide the XR rendering device 100 with information of the participant's physical attributes, such as participant’s standing body length and/or other relevant body metrics, such as length of arm(s), leg(s), torso, neck, head size, etc.
  • the rendering height preference is defined by the participant as at least one of: a total body length of the participant; an arm length of the participant; a leg length of the participant; a torso length of the participant; a neck length of the participant; and a head size of the participant.
  • the height of the participant's avatar is determined based on relative heights of other avatars being rendered in the immersive XR environment.
  • the operation to determine the participant's avatar height is based on at least one of: adding or multiplying a value of the rendering height preference to a total body length of a tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to an arm length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the tallest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the tallest avatar among the group of other participants in the immersive XR environment.
  • the operation to determine the participant's avatar height is based on at least one of: adding or multiplying (e.g., of the sensed participant's body length or defined body length) a value of the rendering height preference to an arm length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the shortest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the shortest avatar among the group of other participants in the immersive XR environment.
  • an average height value, or other mathematically computed height value, of avatars among the group of other participants in the immersive XR environment is determined and used to determine the participant’s avatar height.
  • the operation to determine the participant's avatar height is based on at least one of: adding or multiplying (e.g., of the sensed participant's body length or defined body length) a value of the rendering height preference to an arm length of the average height value among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the average height value among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the average height value among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the average height value among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size corresponding to the average height value among the group of other participants in the immersive XR environment.
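As a concrete illustration of this add-or-multiply logic, the following is a minimal sketch, not the claimed method; the function name, parameter names, and the `reference` selector are hypothetical, and heights are in meters:

```python
from statistics import mean

def determine_avatar_height(sensed_height_m, other_avatar_heights_m,
                            offset_m=0.0, scale=1.0, reference="tallest"):
    """Determine a participant avatar height from a rendering height
    preference expressed as a value added to, and/or multiplied with,
    a reference height among the other participants' avatars."""
    if not other_avatar_heights_m:
        return sensed_height_m  # no other avatars; keep the sensed height
    if reference == "tallest":
        ref = max(other_avatar_heights_m)
    elif reference == "shortest":
        ref = min(other_avatar_heights_m)
    else:  # "average", per the average height relationship embodiment
        ref = mean(other_avatar_heights_m)
    return ref * scale + offset_m

# Example: render 5 cm shorter than the tallest avatar in the meeting.
height = determine_avatar_height(1.62, [1.85, 1.70, 1.78], offset_m=-0.05)
print(height)  # about 1.80 m
```

The same shape of computation applies to the per-body-metric variants above, with the corresponding arm, leg, torso, neck, or head measurement substituted for the total height.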
  • Some further operations may include determining the participant’s avatar height based on participant-provided body size information and other avatar heights.
  • the operations can include identifying the tallest participant among participants in the immersive XR environment.
  • the rendering height is the length of the avatar between the avatar's head and feet or between two other defined locations on the avatar.
  • the operations can include rendering the other avatars at positions and heights according to the rendering heights determined for those other participants.
  • the XR rendering device 100 determines that the participant's height has changed by less than a threshold defined by the rendering height preference, and responsively prevents triggering rendering of the avatar with a lowered body position, and similarly prevents rendering of the avatar's head tilting up to look at the other avatars.
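A threshold check of this kind might look like the following sketch; the function name and the 0.15 m default are illustrative assumptions, not values from the disclosure:

```python
def posture_change_exceeds_threshold(previous_height_m, sensed_height_m,
                                     threshold_m=0.15):
    """Return True only when the sensed height change is large enough to
    justify re-rendering the avatar's posture; smaller changes (a brief
    crouch, a head bob) are suppressed so the avatar neither drops into a
    lower body position nor tilts its head up at the other avatars."""
    return abs(sensed_height_m - previous_height_m) > threshold_m
```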
  • a participant’s XR device detects a new position, and the height change is communicated to the XR rendering device 100.
  • the XR rendering device 100 operationally considers rules to determine whether height-equity avatar rendering should be invoked, such as: the participant may be prompted for manual input; the participant’s profile may be read for corresponding information entries; a height-offset rendering adjustment factor may be determined; or a height-offset rendering adjustment factor may be applied.
  • a participant’s XR device detects a new position (posture) and informs the XR rendering device 100.
  • the XR rendering device 100 considers rule(s) defined by the rendering height preference to determine if height-equity avatar rendering should be invoked, such as: a participant may be queried to define or indicate a desired height for the avatar; or a participant’s profile may be accessed to determine if the participant has defined how their avatar height is to be controlled when the participant's height changes.
  • a new position detected by a participant’s XR device may not be used to adjust how the avatar is rendered, in which case the new position is ignored by the XR rendering device 100 and/or is not provided by the participant's XR device to the managing server; or a new position detected by the participant's XR device may be provided to the XR rendering device 100 for use in rendering avatar height as a defined or scaled offset (e.g., a positive or negative offset) relative to the new height of the participant.
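One way to express this decision flow is sketched below; the profile keys, the `server` object, and its methods are hypothetical stand-ins rather than an API from the disclosure:

```python
def handle_reported_position(participant_id, new_height_m, profile, server):
    """Decide whether a newly reported position should adjust the rendered
    avatar height, per the participant's stored rule (or a manual prompt)."""
    rule = profile.get("height_equity_rule")           # hypothetical entry
    if rule is None:
        rule = server.prompt_participant(participant_id)  # manual input
    if rule == "ignore":
        # New position is not used; keep the avatar height as rendered.
        return server.current_avatar_height(participant_id)
    # Otherwise apply a defined or scaled offset relative to the new height.
    scale = profile.get("height_scale", 1.0)
    offset = profile.get("height_offset_m", 0.0)
    return new_height_m * scale + offset
```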
  • a participant may perceive her avatar in the digital environment as being too tall (or too short) with respect to a real-life experience or any previous knowledge associated with interaction with a specific other (second person) participant.
  • the rendering height preference identifies another participant and a height offset relative to an avatar height of the identified other participant.
  • the operations determine that an avatar for the identified other participant is present in the immersive XR environment. Based on the determination that the avatar for the identified other participant is present in the immersive XR environment, the operations determine the participant avatar height based on the height offset relative to the avatar height of the identified other participant.
  • the operation to determine the participant avatar height based on the height offset relative to the avatar height of the identified other participant includes scaling the avatar height of the identified other participant inversely with distance between location of the avatar of the participant and location of the avatar of the identified other participant.
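A sketch of one reading of this embodiment follows; the `1 + distance` form is an assumption to keep the scaling finite at zero distance, and all names are illustrative:

```python
def height_pegged_to_other(identified_avatar_height_m, height_offset_m,
                           own_xy, other_xy):
    """Participant avatar height as an offset relative to an identified
    other participant's avatar, with that reference height scaled inversely
    with the distance between the two avatars."""
    dx, dy = own_xy[0] - other_xy[0], own_xy[1] - other_xy[1]
    distance = (dx * dx + dy * dy) ** 0.5
    scaled_reference = identified_avatar_height_m / (1.0 + distance)
    return scaled_reference + height_offset_m
```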
  • during the suggested automatic “avatar height equity adjustment” operation, a small virtual height movement (a “comfort movement”) may be applied to mitigate rendered avatars being perceived as “string puppets dangling on a rope,” which is a risk if purely static head height positions are used.
  • the virtual z-movement may be selected based on z-variations associated with the physical participants’ movement patterns, or may be randomized, e.g., assuming a normally distributed z-variation around a preferred average comfort offset value.
  • the operations further include when the rendering height preference is defined as the fixed avatar height, to repetitively transition height of the avatar of the participant between a range of heights determined based on a defined percentage of the fixed avatar height.
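A comfort movement of this kind could be drawn from a normal distribution, as in the sketch below; the mean and standard deviation values are illustrative assumptions:

```python
import random

def comfort_z_offset_m(mean_offset_m=0.02, std_dev_m=0.005):
    """Small, normally distributed vertical 'comfort movement' around a
    preferred average offset, so a height-equalized avatar does not hang
    perfectly still like a string puppet on a rope."""
    return random.gauss(mean_offset_m, std_dev_m)
```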
  • the operations further include when the rendering height preference is defined as the fixed avatar height, to determine head tilt of the participant, and render the participant's view of the immersive XR environment from location of the avatar of the participant at the fixed avatar height and to track the determined head tilt of the participant.
  • Some embodiments are directed to ensuring that the eyes of the avatars a participant is talking to appear to look at the participant even when the participant has adjusted how the heights of different participants are seen. This may be implicitly solved given that, with “avatar height equity adjustment” active in the participant’s VR device, the participant appears “at the same height” in other participants’ head-mounted displays (HMDs), so the other participants’ avatars look the participant in the eyes without knowing that the participant is sitting down.
  • the XR device can determine that the participant is sitting down; then, if the participant’s HMD actually renders the other avatars’ faces and eyes, the participant's HMD may alter those avatars’ “eye gazing direction” to point at the participant’s eyes. If the participant’s HMD does not render locally but only receives a pre-rendered stream from the XR rendering device 100, the operations may instruct the XR rendering device 100 to alter the other participants' avatars’ gazes to be directed toward the eyes of the participant's avatar.
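Redirecting a rendered gaze amounts to recomputing a unit vector toward the adjusted eye position, roughly as follows; the names and the flat-list vector representation are illustrative:

```python
def gaze_toward(from_eyes_xyz, to_eyes_xyz):
    """Unit vector from another avatar's eyes toward the (height-adjusted)
    eyes of the participant's avatar, so rendered gazes meet the participant
    even when the participant is physically sitting down."""
    d = [t - f for f, t in zip(from_eyes_xyz, to_eyes_xyz)]
    norm = sum(c * c for c in d) ** 0.5
    return [c / norm for c in d] if norm > 0 else [0.0, 0.0, 0.0]
```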
  • Figures 8 and 9 are flowcharts of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure
  • the operations further include rendering 800 eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line-of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the standing posture when the height of the XR rendering device is greater than the rendering height preference.
  • the operations include rendering 802 eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line- of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture when the height of the XR rendering device is less than the rendering height preference.
  • the operations further render 900 eyes of the avatar of the participant rendered with the lower body in the standing posture to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant when the height of the XR rendering device is greater than the rendering height preference.
  • the operations can also render 902 eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture, to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant when the height of the XR rendering device is less than the rendering height preference.
  • the operations include adjusting the height position of the head, and may adjust the height of the avatar’s body, based on any one or more of the following information:
  • the operations further include determining the rendering height preference based on at least one of the following: sensor information indicating the participant’s physical head height relative to a reference plane; sensor information indicating when the participant has one of a defined set of a standing posture, a sitting posture, a kneeling posture, a squatting posture, and a laying posture; and indication of the participant’s physical height.
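For example, a posture indication could be derived from the headset height relative to the reference plane, as in this sketch; the ratio thresholds are illustrative assumptions, not values from the disclosure:

```python
def classify_posture(head_height_m, standing_height_m):
    """Rough posture guess from the participant's physical head height
    relative to the reference plane (floor), given a known standing height."""
    ratio = head_height_m / standing_height_m
    if ratio > 0.9:
        return "standing"
    if ratio > 0.55:
        return "sitting"  # kneeling/squatting would need extra sensor cues
    return "laying"
```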
  • instead of starting with the height of the tallest physical person as a reference and calculating a reduction offset to subtract from that height to determine the height of a participant's avatar (according to the defined rendering height preference), the operations may likewise start with the height of the shortest person in the considered group and calculate an addition offset to add to the shortest person's height.
  • the XR rendering device 100 operationally determines avatar height as an offset with respect to, e.g., average height of participants, and calculates a rendering offset value relative to the average height value. This may be useful in situations where a first physical participant may desire to have an avatar rendered with a height based on “an average among individuals in group” or to be “a defined offset shorter/taller than the average”.
  • Figures 4 and 5 illustrate an example of various operations which are performed based on a rendering height preference relative to location of a participant to determine height of the participant's avatar, in accordance with some embodiments of the present disclosure.
  • the rendering height preference defines a distance-scaled height relationship to other avatar heights.
  • the operations determine the participant avatar height as a scaled combination of heights of avatars of the group of other participants in the immersive XR environment 400.
  • the scaling varies inversely with respective distance between location of the avatar of the participant 402 and locations of the avatars of respective other participants 404-410.
  • FIG 4 An example of the embodiments is illustrated in Figure 4.
  • the avatar of the participant 402 may have a rendering height preference with a defined distance-scaled height relationship to other avatar heights for other avatars 404-410.
  • the scaling may vary inversely with respective distance between location of the avatar of the participant 402 and locations of the avatars of respective other participants 404-410.
  • the avatars that are further away, such as 408 and 410, therefore contribute less to the determined height of the participant's avatar 402.
  • the dashed line in Figure 5 may correspond to a threshold distance from the location of the participant 402 beyond which the heights of participants are not used to determine the height of the participant's avatar 402.
  • the heights of the participants associated with avatars 408 and 410, and/or the heights of the avatars 408 and 410 themselves, may not be used when determining the height of the participant's avatar 402.
  • the rendering height preference defines a maximum distance 500 between location of the avatar of the participant 402 and locations of other avatars 404-410 that are used to determine the participant avatar height.
  • the determination of the participant avatar height based on the rendering height preference comprises determining the participant avatar height based on heights of avatars of other participants whose locations are not further than the maximum distance from the location of the avatar of the participant 402.
  • the maximum distance 500 is defined between the location of the avatar of the participant 402 and locations of other avatars 404-410 that are used to determine the participant avatar height.
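The distance-scaled combination with a maximum distance cutoff could be implemented as an inverse-distance weighted average, sketched below under the assumption of a `1/(1 + d)` weight (the exact scaling function is not specified in the disclosure):

```python
def distance_scaled_height(own_xy, others, max_distance_m=5.0):
    """Combine nearby avatars' heights, weighted inversely by their distance
    from the participant's avatar; avatars beyond max_distance_m (the dashed
    threshold around avatar 402 in Figure 5) are ignored.
    `others` is a list of (x, y, height_m) tuples."""
    weighted, total_weight = 0.0, 0.0
    for x, y, height_m in others:
        d = ((own_xy[0] - x) ** 2 + (own_xy[1] - y) ** 2) ** 0.5
        if d <= max_distance_m:
            w = 1.0 / (1.0 + d)  # closer avatars influence the height more
            weighted += w * height_m
            total_weight += w
    return weighted / total_weight if total_weight else None
```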
  • FIG. 10 is a block diagram of components of an XR rendering device 100 that are configured to operate in accordance with some embodiments of the present disclosure.
  • the XR rendering device 100 can include at least one processor circuit 1000 (processor), at least one memory 1010 (memory), at least one network interface 1020 (network interface), and a display device 1030.
  • the processor 1000 is operationally connected to these various components.
  • the memory 1010 stores executable instructions 1012 that are executed by the processor 1000 to perform operations.
  • the processor 1000 may include one or more data processing circuits, such as a general purpose and/or special purpose processor (e.g., microprocessor and/or digital signal processor), which may be collocated or distributed across one or more data networks.
  • the processor 1000 is configured to execute the instructions 1012 in the memory 1010, described below as a computer readable medium, to perform some or all of the operations and methods for one or more of the embodiments disclosed herein for an XR rendering device.
  • the XR rendering device may be separate from and communicatively connected to the participant devices, or may be at least partially integrated within one or more of the participant devices.
  • the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof.
  • the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
  • the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
  • These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An XR rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the group of participants which are rendered in the immersive XR environment. The XR rendering device includes at least one processor and at least one memory storing instructions executable by the at least one processor to perform operations to determine a participant avatar height based on a rendering height preference. The operations also include rendering the avatar of a participant based on the determined participant avatar height.

Description

EQUITABLE RENDERING OF AVATAR HEIGHTS FOR EXTENDED REALITY ENVIRONMENTS
TECHNICAL FIELD
[0001] The present disclosure relates to rendering extended reality (XR) environments and associated XR rendering devices, and more particularly to rendering avatars in immersive XR environments displayed on XR participant devices.
BACKGROUND
[0002] Immersive extended reality (XR) environments have been developed which provide a myriad of different types of user experiences for gaming, on-line meetings, co-creation of products, etc. Immersive XR environments (also referred to as "XR environments") can include virtual reality (VR) environments where human users only see computer generated graphical renderings and can include augmented reality (AR) environments where users see a combination of computer generated graphical renderings overlaid on a view of the physical real-world through, e.g., see-through display screens. [0003] Example XR environment rendering devices include, without limitation, XR environment servers, XR headsets, gaming consoles, smartphones running an XR application, and tablet/laptop/desktop computers running an XR application. Oculus Quest is an example XR device and Google Glass is an example AR device.
[0004] XR meeting applications are tools for native digital meetings, useful as a thinking and planning space for oneself as well as for holding online meetings in a digital environment. Some XR meeting applications support AR devices, browsers, and VR devices. A participant using a browser may join via desktop, tablet PC or smartphone and share their view using a front-facing camera or a webcam. Also, some XR meeting solutions have mobile application versions, e.g., Android and iOS, which allow a user to navigate in the virtual space on the screen or activate an augmented reality mode to display the meeting in their own surroundings. The XR meeting solutions introduce new features to online meetings that allow for new ways to share and create content. Today’s commonly and commercially available XR devices typically include an HMD (head-mounted display) and a pair of hand controllers, sometimes, in more advanced solutions, also “foot controllers”.
[0005] Immersive XR environments, such as gaming environments and meeting environments, are often configured to display computer generated avatars which represent poses of human users in the immersive XR environments. A user may select and customize an avatar, such as gender, clothing, hair style, etc. to represent that user for viewing by other users participating in the immersive XR environment. Although some user customization of avatars is provided, users can be unexpectedly disappointed with how their avatar is viewed by other participants as the user's avatar moves through an environment and/or transitions between different poses, such as standing, sitting, squatting, and laying.
SUMMARY
[0006] Some embodiments disclosed herein are directed to an XR rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the group of participants which are rendered in the immersive XR environment. The XR rendering device includes at least one processor and at least one memory storing instructions executable by the at least one processor to perform operations. The operations determine a participant avatar height based on a rendering height preference. The operations also render the avatar of a participant based on the determined participant avatar height.
[0007] Some other related embodiments are directed to a corresponding method by an XR rendering device for rendering an immersive XR environment on a display for viewing by a participant among a group of participants who have associated avatars representing the participants which are rendered in the immersive XR environment. The method determines a participant avatar height based on a rendering height preference. The method also renders the avatar of the participant based on the determined participant avatar height.
[0008] Some potential advantages of these embodiments are that they allow a participant to define and control how height of the participant's avatar is viewed by other participants. Thus, for example, a participant may define that the XR rendering device is to render the participant's avatar in a manner that maintains a defined height equity (according to the rendering height preference) relative to other participants' avatars independent of the participant's pose, e.g., standing, sitting, or crouching. Additionally, or alternatively, the participant may define that height of the participant's avatar is to be rendered to maintain virtual eye contact with participants' avatars, e.g., such as by scaling the sensed height of the participant (e.g., based on headset and/or hand controller sensed heights) based on the averaged heights of other participants' avatars.
[0009] Other XR rendering devices, methods, and computer program products according to embodiments will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional XR rendering devices, methods, and computer program products be included within this description and protected by the accompanying claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
[0011 ] Figure 1 illustrates an XR system that includes a plurality of participant devices that communicate through networks with an XR rendering device to operate in accordance with some embodiments of the present disclosure;
[0012] Figure 2 illustrates an immersive XR environment with participants' avatars and a shared virtual presentation screen that are rendered with various poses within the XR environment, in accordance with some embodiments of the present disclosure;
[0013] Figure 3 is a further block diagram of an XR rendering system which illustrates data flows and operations between a plurality of participant devices and an XR rendering device in accordance with some embodiments of the present disclosure;
[0014] Figures 4 and 5 illustrate an example of various operations which are performed based on a rendering height preference relative to location of a participant to determine height of the participant's avatar, in accordance with some embodiments of the present disclosure;
[0015] Figures 6 through 9 are flowcharts of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure; and [0016] Figure 10 is a block diagram of components of an XR rendering device that are configured to operate in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION
[0017] Inventive concepts will now be described more fully hereinafter with reference to the accompanying drawings, in which examples of embodiments of inventive concepts are shown. Inventive concepts may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of various present inventive concepts to those skilled in the art. It should also be noted that these embodiments are not mutually exclusive. Components from one embodiment may be tacitly assumed to be present/used in another embodiment. [0018] Figure 1 illustrates an XR system that includes a plurality of participant devices 110a-d that communicate through networks 120 with an XR rendering device 100 to operate in accordance with some embodiments of the present disclosure. The XR rendering device 100 is configured to generate a graphical representation of an immersive XR environment (also called an "XR environment" for brevity) which is viewable from various perspectives of virtual poses of human participants in the XR environment through display screens of the various participant devices 110a-d. For example, the illustrated devices include VR headsets 110a-c which can be worn by participants to view and navigate through the XR environment, and a personal computer 110d which can be operated by a participant to view and navigate through the XR environment. The participants have associated avatars which are rendered in the XR environment to represent poses (e.g., location, body assembly orientation, etc.) of the participants relative to a coordinate system of the XR environment.
[0019] The XR rendering device 100 may include a rendering height determination module 102 that performs operations disclosed herein for determining a participant avatar height based on a rendering height preference. The XR rendering device 100 then renders the participant avatar with the determined height for viewing by other participants through their respective devices, e.g., 110b-110d.
[0020] Although the XR rendering device 100 is illustrated in Figure 1 as being a centralized network computing server separate from one or more of the participant devices, in some other embodiments the XR rendering device 100 is implemented as a component of one or more of the participant devices. For example, one of the participant devices may be configured to perform operations of the XR rendering device in a centralized manner controlling rendering for or by other ones of the participant devices. Alternatively, each of the participant devices may be configured to perform at least some of the operations of the XR rendering device in a distributed decentralized manner with coordinated communications being performed between the distributed XR rendering devices (e.g., between software instances of XR rendering devices).
[0021] Figure 2 illustrates an immersive XR environment with avatars 200a-f that are graphically rendered with poses (e.g., at locations and with orientations) representing the present fields of view (FOVs) of associated human participants in the XR environment. In the illustrated example, streaming video from a camera of the participant device 110d (personal computer) is displayed in a virtual screen 200d instead of rendering an avatar to represent the participant. A shared virtual presentation screen 210 is also graphically rendered at a location within the XR environment, and can display pictures and/or video that are being presented for viewing by the participants in the XR environment.
[0022] In a multi-participant XR environment scenario such as illustrated in Figure 2, an XR rendering device (e.g., an XR environment server or a participant device 110a) can become constrained by its processing bandwidth limitations when attempting to simultaneously render in real-time each of the participants' avatars, the virtual screen 200d, the shared virtual presentation screen 210, and the virtual objects including room surfaces and other parts of the XR environment.
[0023] Existing XR rendering environments can exhibit undesirable behavior in how avatars are rendered (e.g., when operating with a hands-and-headset (head mounted display) only sensor setup), such as how a participant’s avatar's legs and feet are attached to the torso, and/or how a transition of a physical person from standing to sitting is represented through the rendering of the person's avatar in the XR environment. For example, when a physical person transitions from standing to sitting on a chair in a real room, this physical movement can trigger a corresponding change in the height of the person's avatar responsive to the sensed person's height changing.
[0024] XR devices and rendering devices typically sense “height” in relation to the user’s defined physical floor, and do not operationally consider “why” the height changed.
Consequently, a physical person sitting down results in the person's avatar being rendered in the XR environment as having become shorter, which can result in the rendered legs becoming strangely folded and crammed beneath the torso.
[0025] Similarly, when a person is for some reason sitting down (e.g., due to body pain, injury, or use of a wheelchair due to disability), the person may desire to be perceived with a height that is equitable to other avatars rendered in the XR environment, e.g., a meeting, and not be perceived as seated when others are standing or otherwise perceived as “down there”. A person may choose to sit on the floor for some reason but still want to be represented as a person standing up in an XR meeting. Similarly, a person in a wheelchair may desire to have the avatar rendered in a sitting posture without intermittent transitions to standing due to erroneous interpretation of movements of the headset and/or hand controllers.
[0026] In another real-world example, persons who participate in XR gaming have described their desires to be able to detach from their real-world constraints and physical impairments.
[0027] Another limitation with existing XR rendering environments is that the general position of an avatar in relation to the XR rendered environment and in relation to other avatars can become very awkward. For example, a physical person who is laying down may result in the person's avatar being rendered with a strange body posture and/or with avatar body parts or the entire avatar body being rendered below a room floor, ground surface, etc. Existing XR rendering environments have not provided participants with an operation to set a fixed avatar height independent of sensed participant height, such as based on handset and/or headset position.
[0028] Figure 6 is a flowchart of operations that can be performed by an XR rendering device 100 in accordance with some embodiments of the present disclosure
[0029] According to some embodiments of the present disclosure and Figure 6, an XR rendering device renders an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the group of participants which are rendered in the immersive XR environment. The XR rendering device includes at least one processor and at least one memory storing instructions executable by the at least one processor to perform operations. The operations determine 600 a participant avatar height based on a rendering height preference. The operations render 602 the avatar of a participant based on the determined participant avatar height.
[0030] These embodiments enable a physical person who is participating in an XR environment while sitting down to rest, constrained to a wheelchair, etc., to define that the person's avatar is to be rendered standing, and may further define that the avatar is to be scaled or height-aligned based on the heights of other meeting participants' avatars.
[0031] Some potential advantages of these embodiments are that they allow a participant to define and control how height of the participant's avatar is viewed by other participants. Thus, for example, a participant may define that the XR rendering device is to render the participant's avatar in a manner that maintains a defined height equity (according to the rendering height preference) relative to other participants' avatars independent of the participant's pose, e.g., standing, sitting, or crouching. Additionally or alternatively, the participant may define that height of the participant's avatar is to be rendered to maintain virtual eye contact with other participants' avatars, e.g., such as by scaling the sensed height of the participant (e.g., based on headset and/or hand controller sensed heights) based on the averaged heights of other participants' avatars.
[0032] Figure 3 is a further block diagram of an XR rendering device 100 which illustrates data flows and operations between a plurality of participant devices and the XR rendering device 100 in accordance with some embodiments of the present disclosure.

[0033] Referring to Figure 3, each of the participants can define a participant avatar height based on a rendering height preference that is to be used by the other participants' devices to control the height of the avatars that are rendered. The rendering height preference may be stored as an attribute of a user's profile in the participant's device. The rendering height preference is used by the rendering circuit 300 of the XR rendering device 100 for rendering the respective avatars. For example, a first participant can define a rendering height preference which is provided 310a to the XR rendering device 100 and which requests that the rendering height preference be given to an avatar associated with the first participant for rendering. Similarly, a second participant can define a rendering height preference which is provided 310b to the XR rendering device 100 and which requests that the rendering height preference be given to an avatar associated with the second participant for rendering. Other participants can similarly define rendering height preferences which are provided to the XR rendering device 100 to control rendering related to the respective other participants. Alternatively or additionally, the XR rendering device 100 can use the rendering height preferences that have been defined to provide avatar heights for participants 314a, 314b, etc., which control the rendering operations performed by the respective participant devices. The rendering height preferences provided by the various participants to the XR rendering device 100 may be stored in the rendering height determination module 102 (Fig. 1) with an association to respective identities of the participants.
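A minimal sketch, assuming a simple in-memory mapping, of how the rendering height determination module 102 could associate stored preferences with participant identities follows; the class and method names are hypothetical.

class RenderingHeightDeterminationModule:
    # Hypothetical sketch of module 102: stores rendering height preferences
    # with an association to the respective identities of the participants.
    def __init__(self):
        self._preferences = {}

    def store_preference(self, participant_id, preference):
        # Corresponds to receiving a preference 310a/310b from a participant device.
        self._preferences[participant_id] = preference

    def preference_for(self, participant_id):
        # Returns the stored preference, or None if the participant has not defined one.
        return self._preferences.get(participant_id)

module = RenderingHeightDeterminationModule()
module.store_preference("first-participant", {"offset_m": 0.10})
module.store_preference("second-participant", {"fixed_height_m": 1.75})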
[0034] Figure 7 is a flowchart of further operations that can be performed by an XR rendering device 100 in accordance with some embodiments of the present disclosure.

[0035] Referring to Figure 7, the operations further include to determine 700 height of the XR rendering device relative to a reference plane. The operations render 702 a lower body of the avatar in a standing posture when the height of the XR rendering device is greater than the rendering height preference. The operations render 704 the lower body of the avatar in one of a sitting posture, a kneeling posture, a squatting posture, and a laying posture when the height of the XR rendering device is less than the rendering height preference.
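The posture selection of operations 702 and 704 can be illustrated with the following hypothetical Python sketch, in which the comparison of the device height against the rendering height preference selects the rendered lower-body posture.

def select_lower_body_posture(device_height_m, preference_height_m, seated_posture="sitting"):
    # Operations 702/704: compare the height of the XR rendering device relative
    # to the reference plane against the rendering height preference.
    # seated_posture is one of "sitting", "kneeling", "squatting", or "laying".
    if device_height_m > preference_height_m:
        return "standing"    # operation 702
    return seated_posture    # operation 704

select_lower_body_posture(1.65, 1.40)                              # "standing"
select_lower_body_posture(1.10, 1.40, seated_posture="kneeling")   # "kneeling"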
[0036] Another limitation with existing XR rendering environments is that no operations are provided to enable a person to modify avatar height (for example, if the participant has a height of 1.90 m or 1.55 m) to more accurately or preferentially represent the participant's real-world height.
[0037] In some embodiments, when the rendering height preference is defined as a fixed avatar height, the operations render the avatar of the participant with the fixed avatar height independent of height of the XR rendering device relative to a reference plane.

[0038] Various embodiments can provide a dynamic operation that allows a participant to maintain avatar height equity independent of the participant's position, which can enable virtual eye contact even if the participant's real-world body posture changes.
[0039] These embodiments provide the participants in the meeting room with avatar height rendering equity in relation to a “standing height in meeting”, based on an offset value relative to the tallest (or shortest) physical user participating in the digital meeting.
[0040] In some embodiments, the operations determine the participant avatar height as an offset height value, defined by the rendering height preference, added or multiplied to a height of a tallest avatar among the group of other participants in the immersive XR environment.
[0041] In some embodiments, the operations determine the participant avatar height as an offset height value, defined by the rendering height preference, added or multiplied to a height of a shortest avatar among the group of other participants in the immersive XR environment.
[0042] In some embodiments, the rendering height preference defines an average height relationship to other avatar heights, and the operations determine the participant avatar height as an average of heights of avatars of the group of other participants in the immersive XR environment.
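The tallest-, shortest-, and average-referenced determinations of paragraphs [0040] to [0042] can be summarized in one hypothetical Python sketch; the function name and argument conventions are illustrative assumptions.

def determine_avatar_height(other_heights_m, preference_value, reference="tallest", combine="add"):
    # Paragraphs [0040]-[0042]: the preference value is added or multiplied to
    # the height of the tallest avatar, the shortest avatar, or the average of
    # the other avatars' heights in the immersive XR environment.
    if reference == "tallest":
        base = max(other_heights_m)
    elif reference == "shortest":
        base = min(other_heights_m)
    else:  # "average"
        base = sum(other_heights_m) / len(other_heights_m)
    return base + preference_value if combine == "add" else base * preference_value

determine_avatar_height([1.60, 1.75, 1.90], -0.05, reference="tallest")                    # 1.85
determine_avatar_height([1.60, 1.75, 1.90], 1.0, reference="average", combine="multiply") # 1.75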
[0043] In a first preparing step, physical participants’ respective XR devices provide the XR rendering device 100 with information of the participant's physical attributes, such as participant’s standing body length and/or other relevant body metrics, such as length of arm(s), leg(s), torso, neck, head size, etc.
[0044] In some embodiments, the rendering height preference is defined by the participant as at least one of: a total body length of the participant; an arm length of the participant; a leg length of the participant; a torso length of the participant; a neck length of the participant; and a head size of the participant.
[0045] In some embodiments, the height of the participant's avatar is determined based on relative heights of other avatars being rendered in the immersive XR environment. The operation to determine the participant's avatar height is based on at least one of: adding or multiplying a value of the rendering height preference to a total body length of a tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to an arm length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the tallest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the tallest avatar among the group of other participants in the immersive XR environment.
[0046] In some other embodiments, the operation to determine the participant's avatar height is based on at least one of: adding or multiplying (e.g., of the sensed participant's body length or defined body length) a value of the rendering height preference to a total body length of a shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to an arm length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the shortest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the shortest avatar among the group of other participants in the immersive XR environment.
[0047] In some other embodiments, an average height value, or other mathematically computed height value, of avatars among the group of other participants in the immersive XR environment is determined and used to determine the participant’s avatar height. The operation to determine the participant's avatar height is based on at least one of: adding or multiplying (e.g., of the sensed participant's body length or defined body length) a value of the rendering height preference to an arm length of the average height value among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the average height value among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the average height value among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the average height value among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the average height value among the group of other participants in the immersive XR environment.

[0048] Some further operations may include determining the participant’s avatar height based on participant-provided body size information and other avatar heights. The operations can include identifying the tallest participant among the participants in the immersive XR environment. The operations then determine, for each other participant i, an offset relative to the tallest participant’s avatar rendering height in the immersive XR environment according to offset_i = length_tallest - length_participant_i. For example, the rendering height is the length of the avatar between the avatar's head and feet or between two other defined locations on the avatar. The operations determine a rendering z-position of the other avatars in relation to the tallest participant’s avatar’s height in the immersive XR environment according to render_height_participant_i = render_height_tallest_participant - offset_i. The operations can include rendering the other avatars at the height positions according to the determined render_height_participant_i.
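A minimal sketch of the offset computation of paragraph [0048] follows; it assumes body lengths are provided in meters and that the tallest participant's avatar rendering height is supplied by the caller.

def render_heights_relative_to_tallest(body_lengths_m, render_height_tallest_m):
    # offset_i = length_tallest - length_participant_i
    # render_height_participant_i = render_height_tallest_participant - offset_i
    length_tallest = max(body_lengths_m.values())
    return {participant: render_height_tallest_m - (length_tallest - length)
            for participant, length in body_lengths_m.items()}

render_heights_relative_to_tallest({"a": 1.90, "b": 1.55}, 1.90)
# {"a": 1.90, "b": 1.55}: real-world height differences between avatars are preserved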
[0049] Then, in a next operational step during a stand-up meeting in the immersive XR environment, a first physical participant “sits down” or otherwise takes on a lower body position, and the avatar height change is provided to the XR rendering device 100. Based on the determined participant avatar height, the XR rendering device 100 determines whether the participant’s height change is less, or more, than a threshold defined by the rendering height preference, and responsively prevents triggering rendering of the avatar with a lower body position and similarly prevents rendering of a tilted avatar head looking up at the other avatars.
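The threshold test of paragraph [0049] might be sketched as follows; the helper name is hypothetical.

def posture_rerender_needed(height_change_m, threshold_m):
    # Paragraph [0049]: a sensed height change within the preference-defined
    # threshold does not trigger re-rendering of a lower body position or a
    # tilted avatar head; only a larger change does.
    return abs(height_change_m) > threshold_m

posture_rerender_needed(-0.35, 0.50)   # False: the avatar is kept standing
posture_rerender_needed(-0.80, 0.50)   # True: posture re-rendering may be triggered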
[0050] In some embodiments, a participant’s XR device detects a new position and the altitude height change is communicated to the XR rendering device 100. The XR rendering device 100 operationally considers rules to determine whether height-equity avatar rendering should be invoked, such as: the participant may be prompted for manual input; the participant’s profile may be read for corresponding information entries; a height-offset rendering adjustment factor may be determined; or a height-offset rendering adjustment factor may be applied.
[0051] In other embodiments, a participant’s XR device detects a new position (posture) and informs the XR rendering device 100. The XR rendering device 100 considers rule(s) defined by the rendering height preference to determine whether height-equity avatar rendering should be invoked, such as: a participant may be queried to define or indicate a desired height for the avatar; or a participant’s profile may be accessed to determine whether the participant has defined how their avatar height is to be controlled when the participant's height changes. When the XR rendering device 100 determines that height-equity avatar rendering is to be invoked, then: the new position detected by the participant’s XR device may not be used to adjust how the avatar is rendered, e.g., the new position may be ignored by the XR rendering device 100 and/or not provided by the participant's XR device to a managing server; or the new position detected by the participant's XR device may be provided to the XR rendering device 100 for use in rendering the avatar height as a defined or scaled offset (e.g., + or - offset) relative to the new height of the participant.
[0052] In some embodiments of the suggested automatic “avatar height equity adjustment” operation, a participant may perceive her avatar in the digital environment as being too tall (or too short) with respect to a real-life experience or any previous knowledge associated with interaction with a specific other participant (second person).
[0053] In some embodiments, the rendering height preference identifies another participant and a height offset relative to an avatar height of the identified other participant. The operations determine that an avatar for the identified other participant is present in the immersive XR environment. Based on the determination that the avatar for the identified other participant is present in the immersive XR environment, the operations determine the participant avatar height based on the height offset relative to the avatar height of the identified other participant.
[0054] In some embodiments, the operation to determine the participant avatar height based on the height offset relative to the avatar height of the identified other participant, includes scaling the avatar height of the identified other participant inversely with distance between location of the avatar of the participant and location of the avatar of the identified other participant.
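A hypothetical sketch of the inverse-distance scaling of paragraph [0054] follows; the reference distance and the clamp are illustrative assumptions, since the disclosure does not specify a particular scaling function.

def height_relative_to_identified_participant(other_avatar_height_m, height_offset_m,
                                              distance_m, reference_distance_m=1.0):
    # Paragraphs [0053]-[0054]: the identified other participant's avatar height
    # is scaled inversely with the distance between the two avatars, and the
    # preference-defined height offset is then applied. The reference distance
    # and the clamp keep the scale factor bounded at close range.
    scale = reference_distance_m / max(distance_m, reference_distance_m)
    return other_avatar_height_m * scale + height_offset_m

height_relative_to_identified_participant(1.80, 0.05, distance_m=2.0)  # 0.95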
[0055] In some embodiments of the suggested automatic “avatar height equity adjustment” operation, a small virtual height movement (“comfort movement”) may be applied to mitigate rendered avatars being perceived as “string-puppets dangling on a rope”, which is one risk if purely static head height positions are used. The virtual z-movement may be selected based on z-variations associated with the physical participants’ movement patterns, or randomly, e.g., assuming a normally distributed z-variation around a preferred average comfort offset value.
[0056] In some embodiments, the operations further include when the rendering height preference is defined as the fixed avatar height, to repetitively transition height of the avatar of the participant between a range of heights determined based on a defined percentage of the fixed avatar height.
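The comfort movement of paragraphs [0055] and [0056] might, for example, combine a slow periodic transition within a percentage-defined range with a normally distributed z-variation, as in the following hypothetical sketch.

import math
import random

def comfort_height(fixed_height_m, time_s, percentage=0.01, period_s=4.0, jitter_sd_m=0.0):
    # Paragraphs [0055]-[0056]: repetitively transition the avatar height within
    # a range determined by a defined percentage of the fixed avatar height, and
    # optionally add a normally distributed z-variation around the comfort offset.
    amplitude_m = fixed_height_m * percentage
    height_m = fixed_height_m + amplitude_m * math.sin(2.0 * math.pi * time_s / period_s)
    if jitter_sd_m > 0.0:
        height_m += random.gauss(0.0, jitter_sd_m)
    return height_m

comfort_height(1.75, time_s=1.0)                      # slow oscillation within +/- 1%
comfort_height(1.75, time_s=1.0, jitter_sd_m=0.003)   # with small random z-variation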
[0057] In some embodiments, the operations further include when the rendering height preference is defined as the fixed avatar height, to determine head tilt of the participant, and render the participant's view of the immersive XR environment from location of the avatar of the participant at the fixed avatar height and to track the determined head tilt of the participant.
[0058] Some embodiments are directed to ensuring that the eyes of the avatars a participant is talking to seem to look at the participant even when the participant has adjusted how the participant sees the height of different participants. This may be implicitly solved given that, with “avatar height equity adjustment” being active in the participant’s VR device, the participant should appear “at the same height” in the other participants’ head-mounted displays (HMDs), so the other participants’ avatars should look the participant in the eyes without knowing that the participant is sitting down.
[0059] In the participant’s HMD, the XR device can determine that the participant is sitting down. Then, if the participant’s HMD actually renders the other avatars’ faces/eyes, the participant's HMD rendering of those avatars may alter those avatars’ “eye gazing direction” to point at the participant’s eyes. If the participant’s HMD does not render but only receives a pre-rendered stream from the XR rendering device 100, the operations may instruct the XR rendering device 100 to alter the other participants' avatars’ gazing to be directed toward the eyes of the participant's avatar.
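A minimal sketch of computing an altered eye gazing direction follows; it assumes eye positions are given as 3D coordinates in a shared frame and returns a unit vector, which is an illustrative convention rather than a disclosed one.

import math

def gaze_direction(from_eyes_xyz, to_eyes_xyz):
    # Paragraph [0059]: alter a rendered avatar's eye gazing direction to point
    # at the eyes of the participant's avatar; returns a unit direction vector.
    delta = [b - a for a, b in zip(from_eyes_xyz, to_eyes_xyz)]
    norm = math.sqrt(sum(c * c for c in delta)) or 1.0
    return [c / norm for c in delta]

gaze_direction((0.0, 1.70, 0.0), (2.0, 1.20, 0.0))  # gaze directed down toward a seated participant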
[0060] Figures 8 and 9 are flowcharts of operations that can be performed by an XR rendering device in accordance with some embodiments of the present disclosure.

[0061] Referring to Figure 8, the operations further include rendering 800 eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line-of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the standing posture when the height of the XR rendering device is greater than the rendering height preference. The operations include rendering 802 eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line-of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture when the height of the XR rendering device is less than the rendering height preference.
[0062] Referring to Figure 9, the operations further render 900 eyes of the avatar of the participant rendered with the lower body in the standing posture to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant when the height of the XR rendering device is greater than the rendering height preference. The operations can also render 902 eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture, to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant when the height of the XR rendering device is less than the rendering height preference.
[0063] In some embodiments, the operations include adjusting the height position of the head, and may adjust the height of the avatar’s body, based on any or more of the following information:
• sensor information about the participant’s physical head position relative to a reference plane, e.g., floor level;
• sensor information that identifies if the participant is standing up, sitting down, squatting or kneeling;
• information indicating the participant’s physical height;
• information indicating the height of other avatars in the vicinity of the participant’s avatar;
• preference setting, indicating if the participant wants to be rendered as always standing up or always sitting down; and
• preference setting, indicating preferred height relative to the other avatars in the vicinity.
[0064] In some embodiments, the operations further include determining the rendering height preference based on at least one of the following: sensor information indicating the participant’s physical head height relative to a reference plane; sensor information indicating when the participant has one of a defined set of a standing posture, a sitting posture, a kneeling posture, a squatting posture, and a laying posture; and indication of the participant’s physical height.
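The determination of paragraph [0064] might be sketched as follows; the standing-equivalent correction factors are purely illustrative assumptions and are not disclosed values.

def rendering_height_from_sensors(head_height_m, posture, indicated_height_m=None):
    # Paragraph [0064]: derive the rendering height preference from sensor
    # information (head height relative to the reference plane, detected posture)
    # and/or an indication of the participant's physical height.
    if indicated_height_m is not None:
        return indicated_height_m
    # Hypothetical factors extrapolating sensed head height to a standing-equivalent
    # body height for each detected posture.
    standing_equivalent_factor = {"standing": 1.06, "sitting": 1.45,
                                  "kneeling": 1.35, "squatting": 1.75,
                                  "laying": 5.0}
    return head_height_m * standing_equivalent_factor.get(posture, 1.06)

rendering_height_from_sensors(1.12, "sitting")                          # extrapolated from sensors
rendering_height_from_sensors(1.12, "sitting", indicated_height_m=1.75) # indicated height wins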
[0065] In some embodiments, instead of operations starting with height of the tallest physical person as a reference and calculating a reduction offset to subtract from the tallest person to determine height of a participant's avatar (according to the defined rendering height preference), the operations likewise start with height of the shortest person in the considered group and calculate an addition offset to add to the shortest person's height. In a related embodiment, the XR rendering device 100 operationally determines avatar height as an offset with respect to, e.g., average height of participants, and calculates a rendering offset value relative to the average height value. This may be useful in situations where a first physical participant may desire to have an avatar rendered with a height based on “an average among individuals in group” or to be “a defined offset shorter/taller than the average”.
[0066] Figures 4 and 5 illustrate an example of various operations which are performed based on a rendering height preference relative to location of a participant to determine height of the participant's avatar, in accordance with some embodiments of the present disclosure.

[0067] Referring to Figure 4, the rendering height preference defines a distance-scaled height relationship to other avatar heights. The operations determine the participant avatar height as a scaled combination of heights of avatars of the group of other participants in the immersive XR environment 400. The scaling varies inversely with respective distance between location of the avatar of the participant 402 and locations of the avatars of respective other participants 404-410.
[0068] An example of the embodiments is illustrated in Figure 4. In Figure 4, the avatar of the participant 402 may have a rendering height preference with a defined distance-scaled height relationship to other avatar heights for other avatars 404-410. As stated above, the scaling may vary inversely with respective distance between location of the avatar of the participant 402 and locations of the avatars of respective other participants 404-410.
Therefore, avatars that are further away, such as 408 and 410, have a lower weight in the rendering height determination than closer avatars, such as 404 and 406. The dashed line in Figure 5 may correspond to a threshold distance from the location of the participant's avatar 402 beyond which the heights of participants are not used to determine the height of the participant's avatar 402. For example, in Figure 5, because avatars 408 and 410 are beyond the threshold distance, the heights of the participants associated with avatars 408 and 410, and/or the heights of the avatars 408 and 410 themselves, may not be used when determining the height of the participant's avatar 402.
[0069] Referring to Figure 5, in some embodiments, the rendering height preference defines a maximum distance 500 between location of the avatar of the participant 402 and locations of other avatars 404-410 that are used to determine the participant avatar height. The determination of the participant avatar height based on the rendering height preference comprises to determine the participant avatar height based on heights of avatars of other participants that have locations that are not further than the maximum distance from the location of the avatar of the participant 402.
[0070] In another example of the embodiments illustrated in Figure 5, the maximum distance 500 is defined between the location of the avatar of the participant 402 and the locations of the other avatars 404-410 that are used to determine the participant avatar height. In another embodiment, at least one other maximum distance 510 may be defined so as to give avatars located between the maximum distance 500 and the other maximum distance 510 a lower weight in determining the participant avatar height.
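The distance-scaled combination of Figures 4 and 5 might be sketched as follows; the inverse-distance weighting and the outer-band weight are illustrative assumptions.

import math

def distance_scaled_height(own_position, others, max_distance_m, outer_distance_m=None, outer_weight=0.5):
    # Figures 4 and 5 sketch: combine the other avatars' heights with weights
    # that vary inversely with distance (avatar 402 vs. avatars 404-410).
    # Avatars within max_distance_m (500) receive full inverse-distance weight;
    # avatars between max_distance_m and outer_distance_m (510) receive a lower
    # weight; avatars beyond that are not used. `others` is a list of
    # (position, height) pairs.
    def distance(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    weighted_sum = weight_sum = 0.0
    limit_m = outer_distance_m if outer_distance_m is not None else max_distance_m
    for position, height_m in others:
        d = distance(own_position, position)
        if d > limit_m:
            continue                         # beyond the threshold: not used
        weight = 1.0 / max(d, 1e-6)          # inverse-distance weighting
        if d > max_distance_m:
            weight *= outer_weight           # lower weight in the outer band
        weighted_sum += weight * height_m
        weight_sum += weight
    return weighted_sum / weight_sum if weight_sum else None

distance_scaled_height((0.0, 0.0), [((1.0, 0.0), 1.80), ((3.0, 0.0), 1.60)],
                       max_distance_m=2.0, outer_distance_m=4.0)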
[0071 ] Example XR Rendering Device Configuration
[0072] Figure 10 is a block diagram of components of an XR rendering device 100 that are configured to operate in accordance with some embodiments of the present disclosure. The XR rendering device 100 can include at least one processor circuit 1000 (processor), at least one memory 1010 (memory), at least one network interface 1020 (network interface), and a display device 1030. The processor 1000 is operationally connected to these various components. The memory 1010 stores executable instructions 1012 that are executed by the processor 1000 to perform operations. The processor 1000 may include one or more data processing circuits, such as a general purpose and/or special purpose processor (e.g., microprocessor and/or digital signal processor), which may be collocated or distributed across one or more data networks. The processor 1000 is configured to execute the instructions 1012 in the memory 1010, described below as a computer readable medium, to perform some or all of the operations and methods for one or more of the embodiments disclosed herein for an XR rendering device. As explained above, the XR rendering device may be separate from and communicatively connected to the participant devices, or may be at least partially integrated within one or more of the participant devices.
[0073] Further Definitions and Embodiments:
[0074] In the above-description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0075] When an element is referred to as being "connected", "coupled", "responsive", or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected", "directly coupled", "directly responsive", or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, "coupled", "connected", "responsive", or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term "and/or" includes any and all combinations of one or more of the associated listed items.
[0076] It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
[0077] As used herein, the terms "comprise", "comprising", "comprises", "include", "including", "includes", "have", "has", "having", or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but does not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation "e.g.", which derives from the Latin phrase "exempli gratia," may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation "i.e.", which derives from the Latin phrase "id est," may be used to specify a particular item from a more general recitation.
[0078] Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
[0079] These computer program instructions may also be stored in a tangible computer- readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as "circuitry," "a module" or variants thereof.
[0080] It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
[0081] Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

CLAIMS:
1. An extended reality, XR, rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the group of participants which are rendered in the immersive XR environment, the XR rendering device comprising: at least one processor; and at least one memory storing instructions executable by the at least one processor to perform operations to: determine a participant avatar height based on a rendering height preference; and render the avatar of a participant based on the determined participant avatar height.
2. The XR rendering device of Claim 1, wherein: when the rendering height preference is defined as a fixed avatar height, render the avatar of the participant with the fixed avatar height independent of height of the XR rendering device relative to a reference plane.
3. The XR rendering device of Claim 2, wherein the operations further comprise when the rendering height preference is defined as the fixed avatar height, to: repetitively transition height of the avatar of the participant between a range of heights determined based on a defined percentage of the fixed avatar height.
4. The XR rendering device of Claim 2, wherein the operations further comprise when the rendering height preference is defined as the fixed avatar height, to: determine head tilt of the participant; and render the participant's view of the immersive XR environment from location of the avatar of the participant at the fixed avatar height and to track the determined head tilt of the participant.
5. The XR rendering device of any of Claims 1 to 4, further comprising: determine height of the XR rendering device relative to the reference plane; when the height of the XR rendering device is greater than the rendering height preference, render a lower body of the avatar in a standing posture; and when the height of the XR rendering device is less than the rendering height preference, render the lower body of the avatar in one of a sitting posture, a kneeling posture, a squatting posture, and a laying posture.

6. The XR rendering device of Claim 5, wherein the operations further comprise: when the height of the XR rendering device is greater than the rendering height preference, render eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line-of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the standing posture; and when the height of the XR rendering device is less than the rendering height preference, render eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line-of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture.

7. The XR rendering device of Claim 5, wherein the operations further comprise: when the height of the XR rendering device is greater than the rendering height preference, render eyes of the avatar of the participant rendered with the lower body in the standing posture to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant; and when the height of the XR rendering device is less than the rendering height preference, render eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture, to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant.
8. The XR rendering device of any of Claims 1 to 7, wherein the operations determine the participant avatar height as an offset height value, defined by the rendering height preference, added or multiplied to a height of a tallest avatar among the group of other participants in the immersive XR environment.
9. The XR rendering device of any of Claims 1 to 7, wherein the operations determine the participant avatar height as an offset height value, defined by the rendering height preference, added or multiplied to a height of a shortest avatar among the group of other participants in the immersive XR environment.
10. The XR rendering device of any of Claims 1 to 7, wherein the rendering height preference defines an average height relationship to other avatar heights, and the operations determine the participant avatar height as an average of heights of avatars of the group of other participants in the immersive XR environment.
11. The XR rendering device of any of Claims 1 to 10, wherein: the rendering height preference defines a distance-scaled height relationship to other avatar heights; and the operations determine the participant avatar height as a scaled combination of heights of avatars of the group of other participants in the immersive XR environment, wherein the scaling varies inversely with respective distance between location of the avatar of the participant and locations of the avatars of respective other participants.
12. The XR rendering device of any of Claims 6 to 11, wherein: the rendering height preference defines a maximum distance between location of the avatar of the participant and locations of other avatars that are used to determine the participant avatar height; and the determination of the participant avatar height based on the rendering height preference comprises to determine the participant avatar height based on heights of avatars of other participants that have locations that are not further than the maximum distance from the location of the avatar of the participant.
13. The XR rendering device of any of Claims 1 to 12, wherein the rendering height preference is defined by the participant as at least one of: a total body length of the participant; an arm length of the participant; a leg length of the participant; a torso length of the participant; a neck length of the participant; and a head size of the participant.
14. The XR rendering device of Claim 13, wherein the operation to determine the participant avatar height is based on at least one of: adding or multiplying a value of the rendering height preference to a total body length of a tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to an arm length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the tallest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the tallest avatar among the group of other participants in the immersive XR environment.
15. The XR rendering device of Claim 13, wherein the operation to determine the participant avatar height is based on at least one of: adding or multiplying a value of the rendering height preference to a total body length of a shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to an arm length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the shortest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the shortest avatar among the group of other participants in the immersive XR environment.
16. The XR rendering device of any of Claims 1 to 15, wherein: the rendering height preference identifies another participant and a height offset relative to an avatar height of the identified other participant; determine that an avatar for the identified other participant is present in the immersive XR environment; and based on the determination that the avatar for the identified other participant is present in the immersive XR environment, determine the participant avatar height based on the height offset relative to the avatar height of the identified other participant.
17. The XR rendering device of Claim 16, wherein the operation to determine the participant avatar height based on the height offset relative to the avatar height of the identified other participant, comprises: scaling the avatar height of the identified other participant inversely with distance between location of the avatar of the participant and location of the avatar of the identified other participant.
18. The XR rendering device of any of Claims 1 to 16, wherein the operations further comprise: determining the rendering height preference based on at least one of the following: sensor information indicating the participant’s physical head height relative to a reference plane; sensor information indicating when the participant has one of a defined set of a standing posture, a sitting posture, a kneeling posture, a squatting posture, and a laying posture; and indication of the participant’s physical height.
19. A method performed by an extended reality, XR, rendering device for rendering an immersive XR environment on a display device for viewing by a participant among a group of participants who have associated avatars representing the participants which are rendered in the immersive XR environment, the method comprising: determining (600) a participant avatar height based on a rendering height preference; and rendering (602) the avatar of the participant based on the determined participant avatar height.
20. The method of Claim 19, wherein: when the rendering height preference is defined as a fixed avatar height, rendering the avatar of the participant with the fixed avatar height independent of height of the XR rendering device relative to a reference plane.
21. The method of Claim 20, wherein the rendering height preference is defined as the fixed avatar height, comprising: repetitively transitioning height of the avatar of the participant between a range of heights determined based on a defined percentage of the fixed avatar height.
22. The method of Claim 19, wherein the rendering height preference is defined as the fixed avatar height, comprising: determining head tilt of the participant; and rendering the participant's view of the immersive XR environment from location of the avatar of the participant at the fixed avatar height and to track the determined head tilt of the participant.
23. The method of any of Claims 19 to 22, further comprising:
determining (700) height of the XR rendering device relative to the reference plane; when the height of the XR rendering device is greater than the rendering height preference, rendering (702) a lower body of the avatar in a standing posture; and when the height of the XR rendering device is less than the rendering height preference, rendering (704) the lower body of the avatar in one of a sitting posture, a kneeling posture, a squatting posture, and a laying posture.
24. The method of Claim 23, further comprising: when the height of the XR rendering device is greater than the rendering height preference, rendering (800) eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line-of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the standing posture; and when the height of the XR rendering device is less than the rendering height preference, rendering (802) eyes of avatars of the group of other participants who are looking at the avatar of the participant to maintain a line-of-sight directed toward eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture.
25. The method of Claim 23, further comprising: when the height of the XR rendering device is greater than the rendering height preference, rendering (900) eyes of the avatar of the participant rendered with the lower body in the standing posture to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant; and when the height of the XR rendering device is less than the rendering height preference, rendering (902) eyes of the avatar of the participant rendered with the lower body in the one of the sitting posture, the kneeling posture, the squatting posture, and the laying posture, to maintain a line-of-sight directed toward eyes of avatars of the group of other participants who are looking at the avatar of the participant.
26. The method of any of Claims 19 to 25, wherein determining the participant avatar height as an offset height value, defined by the rendering height preference, added or multiplied to a height of a tallest avatar among the group of other participants in the immersive XR environment.
27. The method of any of Claims 19 to 25, wherein determining the participant avatar height as an offset height value, defined by the rendering height preference, added or multiplied to a height of a shortest avatar among the group of other participants in the immersive XR environment.
28. The method of any of Claims 19 to 25, wherein the rendering height preference defines an average height relationship to other avatar heights, and the operations determine the participant avatar height as an average of heights of avatars of the group of other participants in the immersive XR environment.
29. The method of any of Claims 19 to 28, wherein: the rendering height preference defines a distance-scaled height relationship to other avatar heights; and determining the participant avatar height as a scaled combination of heights of avatars of the group of other participants in the immersive XR environment, wherein the scaling varies inversely with respective distance between location of the avatar of the participant and locations of the avatars of respective other participants.

30. The method of any of Claims 24 to 29, wherein: the rendering height preference defines a maximum distance between location of the avatar of the participant and locations of other avatars that are used to determine the participant avatar height; and the determination of the participant avatar height based on the rendering height preference comprises to determine the participant avatar height based on heights of avatars of other participants that have locations that are not further than the maximum distance from the location of the avatar of the participant.
31. The method of any of Claims 19 to 30, wherein the rendering height preference is defined by the participant as at least one of: a total body length of the participant; an arm length of the participant; a leg length of the participant; a torso length of the participant; a neck length of the participant; and a head size of the participant.
32. The method of Claim 31, wherein the operation to determine the participant avatar height is based on at least one of: adding or multiplying a value of the rendering height preference to a total body length of a tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to an arm length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the tallest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the tallest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the tallest avatar among the group of other participants in the immersive XR environment.
33. The method of Claim 31, wherein the operation to determine the participant avatar height is based on at least one of: adding or multiplying a value of the rendering height preference to a total body length of a shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to an arm length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a leg length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a torso length of the shortest avatar among the group of other participants in the immersive XR environment; adding or multiplying a value of the rendering height preference to a neck length of the shortest avatar among the group of other participants in the immersive XR environment; and adding or multiplying a value of the rendering height preference to a head size of the shortest avatar among the group of other participants in the immersive XR environment.
34. The method of any of Claims 19 to 33, wherein: the rendering height preference identifies another participant and a height offset relative to an avatar height of the identified other participant; determining that an avatar for the identified other participant is present in the immersive XR environment; and based on the determination that the avatar for the identified other participant is present in the immersive XR environment, determining the participant avatar height based on the height offset relative to the avatar height of the identified other participant.
35. The method of Claim 34, wherein the determining the participant avatar height based on the height offset relative to the avatar height of the identified other participant, comprises: scaling the avatar height of the identified other participant inversely with distance between location of the avatar of the participant and location of the avatar of the identified other participant.
36. The method of any of Claims 19 to 35, further comprising: determining the rendering height preference based on at least one of the following: sensor information indicating the participant’s physical head height relative to a reference plane; sensor information indicating when the participant has one of a defined set of a standing posture, a sitting posture, a kneeling posture, a squatting posture, and a laying posture; and indication of the participant’s physical height.
PCT/EP2022/060950 2022-04-26 2022-04-26 Equitable rendering of avatar heights for extended reality environments WO2023208317A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/060950 WO2023208317A1 (en) 2022-04-26 2022-04-26 Equitable rendering of avatar heights for extended reality environments


Publications (1)

Publication Number Publication Date
WO2023208317A1 true WO2023208317A1 (en) 2023-11-02

Family

ID=81846534

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/060950 WO2023208317A1 (en) 2022-04-26 2022-04-26 Equitable rendering of avatar heights for extended reality environments

Country Status (1)

Country Link
WO (1) WO2023208317A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190188895A1 (en) * 2017-12-14 2019-06-20 Magic Leap, Inc. Contextual-based rendering of virtual avatars
WO2022066423A1 (en) * 2020-09-24 2022-03-31 Sterling Labs Llc Recommended avatar placement in an environmental representation of a multi-user communication session

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"5G; Extended Reality (XR) in 5G (3GPP TR 26.928 version 16.1.0 Release 16)", vol. 3GPP SA, no. V16.1.0, 19 January 2021 (2021-01-19), pages 1 - 133, XP014390401, Retrieved from the Internet <URL:http://www.etsi.org/deliver/etsi_tr/126900_126999/126928/16.01.00_60/tr_126928v160100p.pdf> [retrieved on 20210119] *

Similar Documents

Publication Publication Date Title
US11557102B2 (en) Methods for manipulating objects in an environment
US10722800B2 (en) Co-presence handling in virtual reality
CN107533640B (en) Method, user equipment and storage medium for gaze correction
US7626569B2 (en) Movable audio/video communication interface system
CN114365197A (en) Placing virtual content in an environment with multiple physical participants
CN107534755B (en) Apparatus and method for gaze correction
US10438418B2 (en) Information processing method for displaying a virtual screen and system for executing the information processing method
KR20230003154A (en) Presentation of avatars in three-dimensional environments
JP2024502810A (en) Systems and methods for providing spatial awareness in virtual reality
WO2023208317A1 (en) Equitable rendering of avatar heights for extended reality environments
US20230343049A1 (en) Obstructed objects in a three-dimensional environment
US20230316674A1 (en) Devices, methods, and graphical user interfaces for modifying avatars in three-dimensional environments
WO2022242854A1 (en) Prioritizing rendering by extended reality rendering device responsive to rendering prioritization rules
JP7264941B2 (en) Program, information processing device and information processing method
JP2018097847A (en) Information processing method, and program for causing computer to execute the information processing method
WO2022242855A1 (en) Extended reality rendering device prioritizing which avatar and/or virtual object to render responsive to rendering priority preferences
WO2024061462A1 (en) Rendering user avatar and digital object in extended reality based on user interactions with physical object
US20240112303A1 (en) Context-Based Selection of Perspective Correction Operations
US20230152935A1 (en) Devices, methods, and graphical user interfaces for presenting virtual objects in virtual environments
WO2022223113A1 (en) Extended reality servers preforming actions directed to virtual objects based on overlapping field of views of participants
Kawabata et al. Depth of Field Blur Effect Considering Convergence Distance in Virtual Reality
TWI802124B (en) Cloud video conference system and method of remote video conference
US20240104859A1 (en) User interfaces for managing live communication sessions
GB2616850A (en) Pseudo 3D virtual meeting
WO2024083302A1 (en) Virtual portal between physical space and virtual space in extended reality environments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22725460

Country of ref document: EP

Kind code of ref document: A1