EP4302557A1 - Synchronous channel access control of a wireless system - Google Patents

Synchronous channel access control of a wireless system

Info

Publication number: EP4302557A1 (Application EP22705249.5A)
Authority: EP (European Patent Office)
Prior art keywords: twt, video, wireless communication, clock, frame
Legal status: Pending
Other languages: German (de), English (en)
Inventor
Qi Xue
Srinivas Katar
Chao ZOU
Naveen GANGADHARAN
Neelakantan Nurani Krishnan
Sandip Homchaudhuri
Xiaolong Huang
Anish Ashok
Current Assignee: Qualcomm Inc
Original Assignee: Qualcomm Inc
Priority claimed from US17/188,165 (US11696345B2)
Priority claimed from US17/188,275 (US11800572B2)
Application filed by Qualcomm Inc
Publication of EP4302557A1

Classifications

    • H04W74/08 Non-scheduled access, e.g. ALOHA
    • H04W74/0883 Non-scheduled access using a dedicated channel for un-synchronized access
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T7/20 Analysis of motion
    • G06T7/70 Determining position or orientation of objects or cameras
    • H04L67/131 Protocols for games, networked simulations or virtual reality
    • H04W52/0216 Power saving arrangements in terminal devices managed by the network using a pre-established activity schedule, e.g. traffic indication frame
    • H04W52/0225 Power saving arrangements in terminal devices using monitoring of external events, e.g. the presence of a signal
    • H04W56/001 Synchronisation between nodes
    • H04W74/0816 Non-scheduled access using carrier sensing [CSMA] with collision avoidance
    • H04W74/0875 Non-scheduled access using a dedicated channel with assigned priorities based access
    • H04W84/12 WLAN [Wireless Local Area Networks]
    • Y02D30/70 Reducing energy consumption in wireless communication networks

Definitions

  • This disclosure relates generally to wireless communications, and more specifically, to synchronous channel access control of a wireless system.
  • A wireless local area network (WLAN) may be formed by one or more access points (APs) that provide a shared wireless communication medium for use by a number of client devices also referred to as stations (STAs).
  • The basic building block of a WLAN conforming to the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards is a Basic Service Set (BSS), which is managed by an AP.
  • Each BSS is identified by a Basic Service Set Identifier (BSSID) that is advertised by the AP.
  • An AP periodically broadcasts beacon frames to enable any STAs within wireless range of the AP to establish or maintain a communication link with the WLAN.
  • Some wireless communication devices may be associated with low-latency traffic (such as gaming or extended reality (XR) traffic) having strict end-to-end latency, packet loss, and throughput requirements. It is desirable for WLANs to ensure that such low-latency traffic can be handled without violating their respective latency and packet loss requirements.
  • The method may be performed by a wireless communication device, and may include obtaining control of a wireless medium.
  • Control of the wireless medium is associated with a first priority of transmitting, over the wireless medium, a first physical layer protocol data unit (PPDU) of an application file from the wireless communication device to a second device, and the first priority is different than a second priority of transmitting data from the second device to the wireless communication device over the wireless medium.
  • The method also may include providing the first PPDU to the second device.
  • The method also may include providing one or more subsequent PPDUs of the application file to the second device. Providing the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • The wireless communication device includes a processing system and an interface.
  • The interface is configured to obtain control of a wireless medium. Control of the wireless medium is associated with a first priority of transmitting, over the wireless medium, a first PPDU of an application file from the wireless communication device to a second device, and the first priority is different than a second priority of transmitting data from the second device to the wireless communication device over the wireless medium.
  • The interface also is configured to provide the first PPDU to the second device and provide one or more subsequent PPDUs of the application file to the second device. Providing the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • The method may be performed by a wireless communication device, and may include obtaining a first PPDU of an application file from an AP over a wireless medium.
  • The AP obtains control of the wireless medium.
  • Control of the wireless medium is associated with a first priority of transmitting the first PPDU over the wireless medium, and the first priority is different than a second priority of transmitting data from the wireless communication device to the AP over the wireless medium.
  • The method also may include obtaining one or more subsequent PPDUs of the application file from the AP. Obtaining the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • The wireless communication device includes a processing system and an interface.
  • The interface is configured to obtain a first PPDU of an application file from an AP over a wireless medium.
  • The AP obtains control of the wireless medium.
  • Control of the wireless medium is associated with a first priority of transmitting the first PPDU over the wireless medium, and the first priority is different than a second priority of transmitting data from the wireless communication device to the AP over the wireless medium.
  • The interface also is configured to obtain one or more subsequent PPDUs of the application file from the AP. Obtaining the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • The method may be performed by a device, and may include obtaining, from a second device, uplink (UL) data over a wireless medium and providing, to the second device, downlink (DL) data including PPDUs over the wireless medium.
  • One or more PPDUs are provided to the second device during a current target wake time (TWT) window, and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided to the second device; or when the first PPDU is provided from an application layer to a media access control (MAC) layer of the device.
  • The device includes a processing system and an interface.
  • The interface is configured to obtain, from a second device, UL data over a wireless medium and provide, to the second device, DL data including PPDUs over the wireless medium.
  • One or more PPDUs are provided to the second device during a current TWT window, and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided to the second device; or when the first PPDU is provided from an application layer to a MAC of the device.
  • The method may be performed by a device, and may include providing UL data to a second device over a wireless medium and obtaining, from the second device, DL data including PPDUs over the wireless medium.
  • One or more PPDUs are obtained from the second device during a current TWT window, and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided by the second device; or when the first PPDU is provided from an application layer to a MAC of the second device.
  • The device includes a processing system and an interface.
  • The interface is configured to provide UL data to a second device over a wireless medium and obtain, from the second device, DL data including PPDUs over the wireless medium.
  • One or more PPDUs are obtained from the second device during a current TWT window, and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided by the second device; or when the first PPDU is provided from an application layer to a MAC of the second device.
  • The method may be performed by a device, and may include rendering a plurality of video frames to be provided to a second device, splitting each video frame of the plurality of video frames into a plurality of video slices, and, for each video slice of the plurality of video slices, generating a plurality of PPDUs to include the video slice.
  • Each PPDU includes one or more media access control (MAC) service data units (MSDUs) associated with the video slice, and the video slice is identified by a port number and a differentiated services code point (DSCP) value included in each MSDU of the plurality of PPDUs.
  • The method also includes, for each video slice of the plurality of video slices, queuing the MSDUs for transmission to the second device.
  • The device includes a processing system and an interface.
  • The processing system is configured to render a plurality of video frames to be provided to a second device, split each video frame of the plurality of video frames into a plurality of video slices, and, for each video slice of the plurality of video slices, generate a plurality of PPDUs to include the video slice.
  • Each PPDU includes one or more MSDUs associated with the video slice, and the video slice is identified by a port number and a DSCP value included in each MSDU of the plurality of PPDUs.
  • The processing system also is configured to, for each video slice of the plurality of video slices, queue the MSDUs for transmission to the second device.
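As a rough illustration of the slicing and tagging described above, the following Python sketch splits a rendered frame into slices and packages each slice into MSDUs identified by a (port, DSCP) pair. The class names, slice sizes, port numbers, and DSCP values are illustrative assumptions, not values taken from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class Msdu:
    """A MAC service data unit carrying part of a video slice."""
    payload: bytes
    port: int   # port number identifying the slice's stream (assumed)
    dscp: int   # differentiated services code point value (assumed)

def slice_frame(frame: bytes, num_slices: int) -> list[bytes]:
    """Split one rendered video frame into roughly equal slices."""
    size = -(-len(frame) // num_slices)  # ceiling division
    return [frame[i:i + size] for i in range(0, len(frame), size)]

def msdus_for_slice(slice_data: bytes, port: int, dscp: int,
                    max_msdu: int = 1500) -> list[Msdu]:
    """Package one video slice into MSDUs, each tagged so the receiver can
    map it back to its slice via the (port, DSCP) pair."""
    return [Msdu(slice_data[i:i + max_msdu], port, dscp)
            for i in range(0, len(slice_data), max_msdu)]

def build_queues(frame: bytes, num_slices: int, base_port: int = 5000,
                 base_dscp: int = 40) -> list[list[Msdu]]:
    """One transmit queue of MSDUs per video slice (ports/DSCPs illustrative)."""
    return [msdus_for_slice(s, base_port + i, base_dscp + i)
            for i, s in enumerate(slice_frame(frame, num_slices))]
```

A receiver could then reassemble each slice by grouping incoming MSDUs on their (port, DSCP) pair, which is the identification role the disclosure assigns to those fields.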
  • The method may be performed by a device, and may include obtaining, from a second device, one or more PPDUs associated with a video frame.
  • The second device renders a plurality of video frames to be provided to the device, and the second device splits each video frame of the plurality of video frames into a plurality of video slices.
  • For each video slice of the plurality of video slices, the second device generates a plurality of PPDUs to include the video slice.
  • Each PPDU includes one or more MSDUs associated with the video slice, and the video slice is identified by a port number and a DSCP value included in each MSDU of the plurality of PPDUs.
  • The second device queues the MSDUs for transmission to the device.
  • The device includes a processing system and an interface.
  • The interface is configured to obtain, from a second device, one or more PPDUs associated with a video frame.
  • The second device renders a plurality of video frames to be provided to the device, and the second device splits each video frame of the plurality of video frames into a plurality of video slices.
  • For each video slice of the plurality of video slices, the second device generates a plurality of PPDUs to include the video slice.
  • Each PPDU includes one or more MSDUs associated with the video slice, and the video slice is identified by a port number and a DSCP value included in each MSDU of the plurality of PPDUs.
  • The second device queues the MSDUs for transmission to the device.
  • The method may be performed by a device, and may include attempting to provide a plurality of PPDUs associated with one or more video frames of an XR experience to a second device and measuring one or more of a PPDU transmission latency associated with attempting to provide the plurality of PPDUs or a PPDU transmission drop associated with attempting to provide the plurality of PPDUs.
  • One or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • The device includes a processing system and an interface.
  • The interface is configured to attempt to provide a plurality of PPDUs associated with one or more video frames of the XR experience to a second device.
  • The processing system is configured to measure one or more of a PPDU transmission latency associated with attempting to provide the plurality of PPDUs or a PPDU transmission drop associated with attempting to provide the plurality of PPDUs.
  • One or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • The method may be performed by a device, and may include attempting to provide a plurality of pose data frames associated with one or more video frames of an XR experience to a second device and measuring one or more of a pose data frame transmission latency associated with attempting to provide the plurality of pose data frames or a pose data frame transmission drop associated with attempting to provide the plurality of pose data frames.
  • One or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • The device includes a processing system and an interface.
  • The interface is configured to attempt to provide a plurality of pose data frames associated with one or more video frames of an XR experience to a second device.
  • The processing system is configured to measure one or more of a pose data frame transmission latency associated with attempting to provide the plurality of pose data frames or a pose data frame transmission drop associated with attempting to provide the plurality of pose data frames.
  • One or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • The method may be performed by a wireless communication device, and may include communicating with a first device over a first wireless link and communicating with a second device over a second wireless link.
  • The wireless communication device communicates concurrently with the first device and the second device using one of multi-link operation (MLO) techniques or a TWT mode, and the wireless communication device is configured to give preference to communications on the second wireless link versus communications on the first wireless link.
  • The wireless communication device includes a processing system and an interface.
  • The interface is configured to communicate with a first device over a first wireless link and communicate with a second device over a second wireless link.
  • The wireless communication device communicates concurrently with the first device and the second device using one of MLO techniques or a TWT mode, and the wireless communication device is configured to give preference to communications on the second wireless link versus communications on the first wireless link.
  • Figure 1A shows a pictorial diagram of an example wireless communication network.
  • Figure 1B shows a pictorial diagram of a group of example devices to provide an extended reality (XR) experience.
  • Figure 2A shows an example protocol data unit (PDU) usable for communications between wireless communication devices.
  • Figure 2B shows an example field in the PDU of Figure 2A.
  • Figure 3A shows another example PDU usable for communications between wireless communication devices.
  • Figure 3B shows another example PDU usable for communications between wireless communication devices.
  • Figure 4 shows an example physical layer convergence protocol (PLCP) protocol data unit (PPDU) usable for communications between wireless communication devices.
  • Figure 5 shows a block diagram of an example wireless communication device.
  • Figure 6A shows a block diagram of an example access point (AP).
  • Figure 6B shows a block diagram of an example station (STA).
  • Figure 7 shows a sequence diagram illustrating example motion-to-render-to-photon (M2R2P) operations.
  • Figure 8 shows a flowchart illustrating an example process for asynchronous channel access control according to some implementations.
  • Figure 9 shows a flowchart illustrating an example process for asynchronous channel access control according to some implementations.
  • Figure 10 shows a sequence diagram illustrating example transmissions between devices for an XR experience.
  • Figure 11 shows a flowchart illustrating an example process for synchronous channel access control based on a target wake time (TWT) session according to some implementations.
  • Figure 12 shows a flowchart illustrating an example process for synchronous channel access control based on a TWT session according to some implementations.
  • Figure 13 shows a sequence diagram illustrating example timings of pose data frames and rendering of video frames associated with a motion-to-render (M2R) latency.
  • Figure 14 shows a sequence diagram illustrating example timings of pose data frames and rendering of video frames associated with an M2R latency.
  • Figure 15A shows a block diagram illustrating an example of synchronizing the clocks of the render device and the display device.
  • Figure 15B shows a block diagram illustrating an example of synchronizing the clocks of the render device and the display device.
  • Figure 15C shows a block diagram illustrating an example of synchronizing the clocks of the render device and the display device.
  • Figure 16 shows a flowchart illustrating an example process for managing data for transmission according to some implementations.
  • Figure 17 shows a flowchart illustrating an example process for managing data for transmission according to some implementations.
  • Figure 18 shows a block diagram illustrating an example of generating queues for one or more video frames.
  • Figure 19 shows a flowchart illustrating an example process for generating feedback according to some implementations.
  • Figure 20 shows a flowchart illustrating an example process for generating feedback according to some implementations.
  • Figure 21 shows a block diagram of an example control field.
  • Figure 22 shows a flowchart illustrating an example process for supporting concurrent wireless links with multiple devices.
  • Figure 23 shows a sequence diagram illustrating example timings of XR activity and wireless activity with an AP.
  • The following description is directed to certain implementations for the purposes of describing innovative aspects of this disclosure.
  • The described implementations can be implemented in any device, system, or network that is capable of transmitting and receiving radio frequency (RF) signals according to one or more of the IEEE 802.11 standards, the IEEE 802.15 standards, the Bluetooth® standards as defined by the Bluetooth Special Interest Group (SIG), or the Long Term Evolution (LTE), 3G, 4G or 5G (New Radio (NR)) standards promulgated by the 3rd Generation Partnership Project (3GPP), among others.
  • The described implementations can be implemented in any device, system or network that is capable of transmitting and receiving RF signals according to one or more of the following technologies or techniques: code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), single-user (SU) multiple-input multiple-output (MIMO), and multi-user (MU) MIMO.
  • The described implementations also can be implemented using other wireless communication protocols or RF signals suitable for use in one or more of a wireless personal area network (WPAN), a wireless local area network (WLAN), a wireless wide area network (WWAN), or an internet of things (IoT) network.
  • Various implementations relate generally to an architecture to provide an extended reality (XR) experience to a user.
  • Wireless data associated with an XR experience generally have strict latency and packet loss restrictions to prevent degradation of the user experience. For example, packets for video frames or other data (including audio, haptic commands, and so on) must be rendered, delivered to a displaying device (such as a head mounted display (HMD), a wearable display, or other such wireless or wired display devices), and displayed in near real time.
  • Typical wireless systems are not designed under such restrictions for the rendering, delivery, and display of data to ensure a minimum frame rate and to prevent synchronization issues, stutter, or lag for an XR experience.
  • Some implementations more specifically relate to asynchronous channel access control of a wireless system for devices providing an XR experience.
  • A device may adjust the priority of one or more physical layer convergence protocol (PLCP) protocol data units (PPDUs) and may perform other operations to ensure control of the wireless medium at certain times while still allowing for other devices to communicate on the wireless medium.
  • The device may adjust a backoff counter or adjust one or more enhanced distributed channel access (EDCA) parameters to ensure obtaining control of the wireless medium to transmit a first PPDU of an application file.
  • The device may again adjust a backoff counter or adjust one or more EDCA parameters to allow other devices to obtain control of the wireless medium in certain scenarios (such as for a display device to provide information back to the device or for another device to transmit using the shared wireless medium).
  • The render device ensuring control of the wireless medium may allow the render device to meet the latency requirements for the XR experience.
  • The render device allowing others to transmit on the wireless medium may allow the wireless medium to be shared in an environment with multiple devices in a basic service set (BSS) or with multiple overlapping basic service sets (OBSSs).
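The EDCA biasing described above can be sketched in Python. The parameter values below are illustrative assumptions (they are not the 802.11 default EDCA tables, and they are not taken from this disclosure); the sketch only shows the idea of contending aggressively for the first PPDU of an application file and then relaxing:

```python
from dataclasses import dataclass
import random

@dataclass
class EdcaParams:
    aifsn: int    # arbitration inter-frame space number (in slots)
    cw_min: int   # minimum contention window
    cw_max: int   # maximum contention window

# Illustrative parameter sets (assumed values for the sketch).
AGGRESSIVE = EdcaParams(aifsn=1, cw_min=3, cw_max=7)    # win the first PPDU
RELAXED    = EdcaParams(aifsn=3, cw_min=15, cw_max=63)  # let others transmit

def draw_backoff(params: EdcaParams, retries: int) -> int:
    """Draw a random backoff in [0, cw], doubling cw per retry up to cw_max."""
    cw = min((params.cw_min + 1) * (1 << retries) - 1, params.cw_max)
    return random.randint(0, cw)

def params_for_ppdu(is_first_ppdu: bool) -> EdcaParams:
    """First PPDU of an application file contends aggressively; subsequent
    PPDUs fall back to relaxed parameters so other devices can win access."""
    return AGGRESSIVE if is_first_ppdu else RELAXED
```

A smaller AIFSN and contention window makes the device statistically likely to seize the medium first, which is one way the first-PPDU priority could differ from the priority of subsequent or reverse-direction traffic.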
  • The render device also may be configured to allow the display device to provide sensor measurements and other information to the render device for rendering to meet latency and other requirements for the XR experience.
  • A render device may use a target wake time (TWT) session to communicate with the display device during one or more TWT service periods (also referred to as windows).
  • The TWT session may be configured and managed to ensure control of the wireless medium to meet latency requirements or other requirements for the XR experience.
  • Use of TWT windows allows other devices to use the wireless medium outside of the TWT windows.
  • The render device ensuring control of the wireless medium via use of TWT may allow the render device to meet the latency requirements for the XR experience.
  • The render device allowing others to transmit on the wireless medium may allow the wireless medium to be shared in an environment with multiple devices.
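The TWT bookkeeping described above might be sketched as follows. The disclosure states that the beginning of the current window can be anchored to when the first PPDU is provided (or handed from the application layer to the MAC); everything else here (field names, microsecond units, specific intervals) is an illustrative assumption:

```python
from dataclasses import dataclass

@dataclass
class TwtSchedule:
    wake_interval_us: int    # time between successive TWT service periods
    wake_duration_us: int    # length of each service period (window)
    first_ppdu_time_us: int  # anchors the start of the current window

    def window_start(self, now_us: int) -> int:
        """Start of the service period containing (or preceding) now_us."""
        elapsed = now_us - self.first_ppdu_time_us
        periods = elapsed // self.wake_interval_us
        return self.first_ppdu_time_us + periods * self.wake_interval_us

    def in_service_period(self, now_us: int) -> bool:
        """True if now_us falls inside a TWT window; outside the window
        the medium is left free for other devices to use."""
        return now_us - self.window_start(now_us) < self.wake_duration_us
```

Transmissions attempted when `in_service_period` is false would be deferred to the next window, which is how TWT lets the medium be shared while still reserving regular windows for the latency-sensitive XR traffic.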
  • A render device may manage application data to be transmitted, and the display device may manage received application data to ensure that latency requirements of the XR experience are met.
  • Management of the data includes generating, queuing, and identifying MSDUs for different video slices to be transmitted, flushing queues when appropriate, and indicating flushing to the display device.
  • Management at the display device may include flushing a reorder (REO) queue or otherwise managing the REO queue receiving the MSDUs from the render device.
  • Management of the queues may ensure that latency requirements of the XR experience are met. Management of the queues also may ensure the removal of stale data that may increase the communication latency between the render device and the display device.
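The stale-data flushing described above can be sketched as a per-slice queue that drops entries older than a latency budget and reports how many were flushed (so the flush can be indicated to the display device). The budget value and structure are illustrative assumptions, not taken from this disclosure:

```python
from collections import deque

class SliceQueue:
    """Per-slice transmit queue that drops entries older than a latency budget."""

    def __init__(self, budget_ms: float):
        self.budget_ms = budget_ms
        self.entries: deque = deque()  # (enqueue_time_ms, msdu) pairs, FIFO

    def enqueue(self, now_ms: float, msdu: bytes) -> None:
        self.entries.append((now_ms, msdu))

    def flush_stale(self, now_ms: float) -> int:
        """Remove MSDUs that can no longer meet the latency requirement.
        Returns how many were flushed so the peer can be notified."""
        flushed = 0
        while self.entries and now_ms - self.entries[0][0] > self.budget_ms:
            self.entries.popleft()
            flushed += 1
        return flushed
```

A display device could apply the same idea to its reorder (REO) queue: releasing or discarding entries that are too old rather than stalling the display on data that will arrive past its deadline.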
  • A render device may attempt to transmit PPDUs to the display device and measure a latency or a packet drop associated with transmitting the PPDUs.
  • A display device may attempt to transmit pose data frames or tracking frames and measure a latency or a packet drop associated with transmitting the frames.
  • The render device and the display device may perform other measurements associated with the XR experience, and the measurements may indicate when the XR experience is being impacted (such as measuring an increase in latency or packet drop).
  • One or more parameters of the XR experience may be adjusted, and the measurements or the adjustments may be indicated to the other device.
  • Generating and providing feedback may allow the render device and the display device to determine when to adjust the XR experience to meet latency requirements or other requirements.
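A minimal sketch of the measurement-and-feedback loop above: record per-PPDU latency and drops, then flag when an XR parameter adjustment (for example, a lower bit rate) may be warranted. The thresholds and class name are illustrative assumptions, not values from this disclosure:

```python
class XrLinkMonitor:
    """Tracks PPDU transmission latency and drops; flags when the XR
    experience may need adjusting (thresholds here are illustrative)."""

    def __init__(self, latency_budget_ms: float = 8.0, drop_limit: float = 0.05):
        self.latency_budget_ms = latency_budget_ms
        self.drop_limit = drop_limit
        self.latencies_ms: list[float] = []
        self.attempts = 0
        self.drops = 0

    def record(self, latency_ms) -> None:
        """Record one transmission attempt; latency_ms is None on a drop."""
        self.attempts += 1
        if latency_ms is None:
            self.drops += 1
        else:
            self.latencies_ms.append(latency_ms)

    def should_adjust(self) -> bool:
        """True when drop rate or average latency exceeds its limit."""
        if not self.attempts:
            return False
        drop_rate = self.drops / self.attempts
        avg = (sum(self.latencies_ms) / len(self.latencies_ms)
               if self.latencies_ms else float("inf"))
        return drop_rate > self.drop_limit or avg > self.latency_budget_ms
```

The same structure could track pose data frames on the display-device side, with the resulting measurements or adjustments signaled to the peer as the feedback the disclosure describes.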
  • A render or relay device may include concurrent links to an AP or STA and to a display device.
  • The render device or relay device is to support the concurrent links while giving preference to communications with the display device (which may be associated with an XR experience having latency and other requirements).
  • The render or relay device may use an enhanced set of multi-link operation (MLO) techniques or TWT mode techniques to support the concurrent links and meet the latency and other requirements of the XR experience.
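The link preference described above reduces, in the simplest case, to a strict-priority selection across the two concurrent links. The sketch below is an illustrative assumption of that idea only; it does not model the actual MLO or TWT mechanics:

```python
from dataclasses import dataclass, field
from collections import deque

@dataclass
class Link:
    name: str
    queue: deque = field(default_factory=deque)  # frames pending on this link

def next_transmission(ap_link: Link, display_link: Link):
    """Pick the next frame to send across the two concurrent links,
    always preferring the latency-sensitive display-device link."""
    for link in (display_link, ap_link):  # display link checked first
        if link.queue:
            return link.name, link.queue.popleft()
    return None  # nothing pending on either link
```

A real implementation would also bound how long the AP link can be starved; the point here is only that the display-device link is serviced first whenever it has pending XR traffic.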
  • Figure 1A shows a block diagram of an example wireless communication network 100.
  • The wireless communication network 100 can be an example of a wireless local area network (WLAN) such as a Wi-Fi network (and will hereinafter be referred to as WLAN 100).
  • The WLAN 100 can be a network implementing at least one of the IEEE 802.11 family of standards (such as that defined by the IEEE 802.11-2016 specification or amendments thereof including, but not limited to, 802.11ah, 802.11ad, 802.11ay, 802.11ax, 802.11az, 802.11ba, and 802.11be).
  • The WLAN 100 may include numerous wireless communication devices such as an access point (AP) 102 and multiple stations (STAs) 104. While only one AP 102 is shown, the WLAN 100 also can include multiple APs 102.
  • Each of the STAs 104 also may be referred to as a mobile station (MS), a mobile device, a mobile handset, a wireless handset, an access terminal (AT), a user equipment (UE), a subscriber station (SS), or a subscriber unit, among other possibilities.
  • the STAs 104 may represent various devices such as mobile phones, personal digital assistant (PDAs), other handheld devices, netbooks, notebook computers, tablet computers, laptops, display devices (for example, TVs, computer monitors, navigation systems, HMDs, among others), music or other audio or stereo devices, remote control devices (“remotes”), printers, kitchen or other household appliances, key fobs (for example, for passive keyless entry and start (PKES) systems), among other possibilities.
  • a single AP 102 and an associated set of STAs 104 may be referred to as a basic service set (BSS), which is managed by the respective AP 102.
  • Figure 1A additionally shows an example coverage area 106 of the AP 102, which may represent a basic service area (BSA) of the WLAN 100.
  • the BSS may be identified to users by a service set identifier (SSID), as well as to other devices by a basic service set identifier (BSSID), which may be a medium access control (MAC) address of the AP 102.
  • the AP 102 periodically broadcasts beacon frames (“beacons”) including the BSSID to enable any STAs 104 within wireless range of the AP 102 to “associate” or re-associate with the AP 102 to establish a respective communication link 108 (hereinafter also referred to as a “Wi-Fi link”), or to maintain a communication link 108, with the AP 102.
  • the beacons can include an identification of a primary channel used by the respective AP 102 as well as a timing synchronization function for establishing or maintaining timing synchronization with the AP 102.
  • the AP 102 may provide access to external networks to various STAs 104 in the WLAN via respective communication links 108.
  • each of the STAs 104 is configured to perform passive or active scanning operations (“scans”) on frequency channels in one or more frequency bands (for example, the 2.4 GHz, 5.0 GHz, 6.0 GHz, or 60 GHz bands).
  • a STA 104 listens for beacons, which are transmitted by respective APs 102 at a periodic time interval referred to as the target beacon transmission time (TBTT) (measured in time units (TUs) where one TU may be equal to 1024 microseconds (µs)).
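As a concrete illustration of the TU arithmetic above, the following sketch converts a beacon interval expressed in TUs to microseconds (the helper name and the 100-TU example interval are illustrative, not taken from the source):

```python
# 1 TU = 1024 microseconds, per the definition noted above.
TU_US = 1024

def tbtt_interval_us(beacon_interval_tu: int) -> int:
    """Convert a beacon interval expressed in TUs to microseconds."""
    return beacon_interval_tu * TU_US

# A commonly used beacon interval of 100 TUs:
print(tbtt_interval_us(100))  # 102400 us, i.e. ~102.4 ms
```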
  • Each STA 104 may be configured to identify or select an AP 102 with which to associate based on the scanning information obtained through the passive or active scans, and to perform authentication and association operations to establish a communication link 108 with the selected AP 102.
  • the AP 102 assigns an association identifier (AID) to the STA 104 at the culmination of the association operations, which the AP 102 uses to track the STA 104.
  • a STA 104 may have the opportunity to select one of many BSSs within range of the STA or to select among multiple APs 102 that together form an extended service set (ESS) including multiple connected BSSs.
  • An extended network station associated with the WLAN 100 may be connected to a wired or wireless distribution system that may allow multiple APs 102 to be connected in such an ESS.
  • a STA 104 can be covered by more than one AP 102 and can associate with different APs 102 at different times for different transmissions.
  • a STA 104 after association with an AP 102, a STA 104 also may be configured to periodically scan its surroundings to find a more suitable AP 102 with which to associate.
  • a STA 104 that is moving relative to its associated AP 102 may perform a “roaming” scan to find another AP 102 having more desirable network characteristics such as a greater received signal strength indicator (RSSI) or a reduced traffic load.
  • STAs 104 may form networks without APs 102 or other equipment other than the STAs 104 themselves.
  • a network is an ad hoc network (or wireless ad hoc network).
  • Ad hoc networks may alternatively be referred to as mesh networks or peer-to-peer (P2P) networks.
  • ad hoc networks may be implemented within a larger wireless network such as the WLAN 100.
  • although the STAs 104 may be capable of communicating with each other through the AP 102 using communication links 108, STAs 104 also can communicate directly with each other via direct wireless links 110.
  • two STAs 104 may communicate via a direct communication link 110 regardless of whether both STAs 104 are associated with and served by the same AP 102.
  • one or more of the STAs 104 may assume the role filled by the AP 102 in a BSS.
  • Such a STA 104 may be referred to as a group owner (GO) and may coordinate transmissions within the ad hoc network.
  • Examples of direct wireless links 110 include Wi-Fi Direct connections, connections established by using a Wi-Fi Tunneled Direct Link Setup (TDLS) link, and other P2P group connections.
  • the APs 102 and STAs 104 may function and communicate (via the respective communication links 108) according to the IEEE 802.11 family of standards (such as that defined by the IEEE 802.11-2016 specification or amendments thereof including, but not limited to, 802.11ah, 802.11ad, 802.11ay, 802.11ax, 802.11az, 802.11ba, and 802.11be).
  • the APs 102 and STAs 104 transmit and receive wireless communications (hereinafter also referred to as “Wi-Fi communications”) to and from one another in the form of physical layer convergence protocol (PLCP) protocol data units (PPDUs).
  • the APs 102 and STAs 104 in the WLAN 100 may transmit PPDUs over an unlicensed spectrum, which may be a portion of spectrum that includes frequency bands traditionally used by Wi-Fi technology, such as the 2.4 GHz band, the 5.0 GHz band, the 60 GHz band, the 3.6 GHz band, and the 900 MHz band.
  • Some implementations of the APs 102 and STAs 104 described herein also may communicate in other frequency bands, such as the 6.0 GHz band, which may support both licensed and unlicensed communications.
  • the APs 102 and STAs 104 also can be configured to communicate over other frequency bands such as shared licensed frequency bands, where multiple operators may have a license to operate in the same or overlapping frequency band or bands.
  • Each of the frequency bands may include multiple sub-bands or frequency channels.
  • PPDUs conforming to the IEEE 802.11n, 802.11ac, and 802.11ax standard amendments may be transmitted over the 2.4 and 5.0 GHz bands, each of which is divided into multiple 20 MHz channels.
  • these PPDUs are transmitted over a physical channel having a minimum bandwidth of 20 MHz, but larger channels can be formed through channel bonding.
  • PPDUs may be transmitted over physical channels having bandwidths of 40 MHz, 80 MHz, 160 MHz, or 320 MHz by bonding together multiple 20 MHz channels.
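The bonded-channel widths above are simple multiples of the 20 MHz component channel. A minimal sketch (the helper name is hypothetical):

```python
def bonded_bandwidth_mhz(num_20mhz_channels: int) -> int:
    """Total bandwidth formed by bonding contiguous 20 MHz channels."""
    return 20 * num_20mhz_channels

# Bonding 2, 4, 8, or 16 channels yields the 40/80/160/320 MHz widths above.
print([bonded_bandwidth_mhz(n) for n in (1, 2, 4, 8, 16)])  # [20, 40, 80, 160, 320]
```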
  • Each PPDU is a composite structure that includes a PHY preamble and a payload in the form of a PLCP service data unit (PSDU).
  • the information provided in the preamble may be used by a receiving device to decode the subsequent data in the PSDU.
  • the preamble fields may be duplicated and transmitted in each of the multiple component channels.
  • the PHY preamble may include both a legacy portion (or “legacy preamble”) and a non-legacy portion (or “non-legacy preamble”).
  • the legacy preamble may be used for packet detection, automatic gain control and channel estimation, among other uses.
  • the legacy preamble also may generally be used to maintain compatibility with legacy devices.
  • a PPDU also may be referred to as a physical layer protocol data unit or a PHY protocol data unit.
  • devices may communicate directly with one another.
  • a render device may communicate directly with a display device.
  • video frame data, audio data, and sensor information may be communicated between the devices providing the XR experience to allow rendering and displaying video (or providing audio or other data) synchronized with movements of the display device.
  • an XR experience refers to one or more of a virtual reality (VR, in which a user may be isolated from the real world), augmented reality (AR, in which a virtual world or virtual information may be overlaid on real world objects in a display), or mixed reality (MR, in which virtual objects and real world objects may be integrated into a display that may appear seamless to the user) experience that may be provided to a user.
  • the user may have an HMD (or other display device), audio device, or a haptic feedback device (such as haptic gloves or vest) that is synchronized to provide information to the user based on the user’s movements.
  • Figure 1B shows a pictorial diagram of a group 150 of example devices to provide an XR experience.
  • the device 152 is a display device to display video frames of the XR experience. While the device 152 is depicted as an HMD, the device 152 may be any suitable display device, such as a tablet or smartphone or eyeglasses.
  • the device 154 is a device to provide the video frames to the display device 152 over wireless communication link 156.
  • the device 154 may be an example implementation of a STA 104 in Figure 1A.
  • the device 154 may be configured as a software enabled AP (soft AP or SAP) for the display device 152.
  • the device 154 is a render device configured to render the video frames.
  • the device 154 is a bridge device to obtain video frames rendered by a cloud server behind an AP 158 or the AP 158 itself over a wireless communication link 160 and provide the video frames to the device 152. While the device 154 is depicted as a smartphone, the device 154 may be any suitable device for rendering or providing frames, such as a desktop computer, laptop computer, or tablet. While the device 154 is depicted as communicating with the AP 158, in some implementations, the device 154 communicates with another STA in a peer-to-peer (P2P) mode. In the provided examples, an AP may refer to a STA communicably coupled to the device 154 in the P2P mode.
  • the device 152 or the device 154 is within wireless communication range (also referred to as within range) of one or more other devices.
  • one or more of the devices 152 or 154 may be within range of device 162.
  • the device 162 may be in the BSS of the AP 158 or may be in an overlapping BSS (OBSS) associated with a different AP.
  • the device 162 is configured to communicate on the same wireless medium as the device 154 (such as in the same frequency band, on the same channel, or on an overlapping channel). In this manner, the devices 154 and 162 share the wireless medium to communicate with other devices.
  • an application file may be a smallest segment of data that is recognizable by a receiving device at an application layer.
  • a receiving device is unable to process any portion of the application file at an application layer unless all portions of the application file are obtained and joined to generate the entire application file.
  • An example application file includes a video slice of a video frame.
  • a video slice also may be referred to as a video frame slice or a slice.
  • each frame may be divided into a plurality of slices (such as a group of pixel columns, pixel rows, macroblocks, or other portions of a video frame), and each video slice may be packaged into one or more PPDUs.
  • an application file may refer to other portions (such as a video frame) or may include other data (such as audio, haptic information, a document, application scripts, and so on).
  • the device 152 may be other than a display device (such as an audio device or haptic feedback device).
  • the present disclosure is not limited to the management and transport of video data and devices specific to such.
  • Figure 2A shows an example protocol data unit (PDU) 200 usable for communications between wireless communication devices.
  • the PDU 200 can be configured as a PPDU.
  • the PDU 200 includes a PHY preamble 202 and a PHY payload 204.
  • the PHY preamble 202 may include a legacy portion that itself includes a legacy short training field (L-STF) 206, a legacy long training field (L-LTF) 208, and a legacy signaling field (L-SIG) 210.
  • the PHY preamble 202 also may include a non-legacy portion (non-legacy fields 212).
  • the L-STF 206 generally enables a receiving device to perform automatic gain control (AGC) and coarse timing and frequency estimation.
  • the L-LTF 208 generally enables a receiving device to perform fine timing and frequency estimation and also to estimate the wireless channel.
  • the L-SIG 210 generally enables a receiving device to determine a duration of the PDU and use the determined duration to avoid transmitting on top of the PDU.
  • the L-STF 206, the L-LTF 208, and the L-SIG 210 may be modulated according to a binary phase shift keying (BPSK) modulation scheme.
  • the payload 204 includes data 214.
  • the payload 204 may be modulated according to a BPSK modulation scheme, a quadrature BPSK (Q-BPSK) modulation scheme, a quadrature amplitude modulation (QAM) modulation scheme, or another appropriate modulation scheme.
  • the payload 204 may generally carry higher layer data, for example, in the form of medium access control (MAC) protocol data units (MPDUs) or aggregated MPDUs (A-MPDUs).
  • Figure 2B shows an example L-SIG field 220 in the PDU of Figure 2A.
  • the L-SIG 220 includes a data rate field 222, a reserved bit 224, a length field 226, a parity bit 228, and a tail field 230.
  • the data rate field 222 indicates a data rate (note that the data rate indicated in the data rate field 222 may not be the actual data rate of the data carried in the payload 204).
  • the length field 226 indicates a length of the packet in units of, for example, bytes.
  • the parity bit 228 is used to detect bit errors.
  • the tail field 230 includes tail bits that are used by the receiving device to terminate operation of a decoder (for example, a Viterbi decoder). The receiving device utilizes the data rate and the length indicated in the data rate field 222 and the length field 226 to determine a duration of the packet in units of, for example, microseconds (µs).
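The rate-and-length duration computation described above can be sketched as follows. This is a simplified illustration only: actual 802.11 receivers also account for preamble time, service and tail bits, and OFDM symbol padding, and the function name is hypothetical.

```python
import math

def lsig_duration_us(length_bytes: int, data_rate_mbps: float) -> int:
    """Simplified packet-duration estimate from the L-SIG rate and length.

    Keeps only the rate/length relationship described in the text:
    duration (us) = ceil(payload bits / rate in Mb/s).
    """
    return math.ceil(length_bytes * 8 / data_rate_mbps)

print(lsig_duration_us(3000, 6.0))  # 4000 us at a 6 Mb/s indicated rate
```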
  • Figure 3A shows another example PDU 300 usable for wireless communication between wireless communication devices.
  • the PDU 300 may be used for SU, OFDMA or MU-MIMO transmissions.
  • the PDU 300 may be formatted as a High Efficiency (HE) WLAN PPDU in accordance with the IEEE 802.11ax amendment to the IEEE 802.11 wireless communication protocol standard.
  • the PDU 300 includes a PHY preamble including a legacy portion 302 and a non-legacy portion 304.
  • the PDU 300 may further include a PHY payload 306 after the preamble, for example, in the form of a PSDU including a data field 324.
  • the legacy portion 302 of the preamble includes an L-STF 308, an L-LTF, and an L-SIG.
  • the non-legacy portion 304 includes a repetition of L-SIG (RL-SIG) 314, a first HE signal field (HE-SIG-A) 316, an HE short training field (HE-STF) 320, and one or more HE long training fields (or symbols) (HE-LTFs) 322.
  • the second portion 304 further includes a second HE signal field (HE-SIG-B) 318 encoded separately from HE-SIG-A 316.
  • RL-SIG 314 and HE- SIG-A 316 may be duplicated and transmitted in each of the component 20 MHz channels in instances involving the use of a bonded channel.
  • the content in HE-SIG-B 318 may be unique to each 20 MHz channel and target specific STAs 104.
  • RL-SIG 314 may indicate to HE-compatible STAs 104 that the PDU 300 is an HE PPDU.
  • An AP 102 may use HE-SIG-A 316 to identify and inform multiple STAs 104 that the AP has scheduled UL or DL resources for them.
  • HE-SIG-A 316 may include a resource allocation subfield that indicates resource allocations for the identified STAs 104.
  • HE-SIG-A 316 may be decoded by each HE-compatible STA 104 served by the AP 102.
  • HE-SIG-A 316 further includes information usable by each identified STA 104 to decode an associated HE-SIG-B 318.
  • HE-SIG-A 316 may indicate the frame format, including locations and lengths of HE-SIG-Bs 318, available channel bandwidths and modulation and coding schemes (MCSs), among other examples.
  • HE-SIG-A 316 also may include HE WLAN signaling information usable by STAs 104 other than the identified STAs 104.
  • HE-SIG-B 318 may carry STA-specific scheduling information such as, for example, STA-specific (or “user-specific”) MCS values and STA-specific RU allocation information. In the context of DL MU-OFDMA, such information enables the respective STAs 104 to identify and decode corresponding resource units (RUs) in the associated data field 324.
  • Each HE-SIG-B 318 includes a common field and at least one STA-specific field. The common field can indicate RU allocations to multiple STAs 104 including RU assignments in the frequency domain, indicate which RUs are allocated for MU-MIMO transmissions and which RUs correspond to MU-OFDMA transmissions, and the number of users in allocations, among other examples.
  • the common field may be encoded with common bits, CRC bits, and tail bits.
  • the user-specific fields are assigned to particular STAs 104 and may be used to schedule specific RUs and to indicate the scheduling to other WLAN devices.
  • Each user-specific field may include multiple user block fields.
  • Each user block field may include two user fields that contain information for two respective STAs to decode their respective RU payloads in data field 324.
  • Figure 3B shows another example PPDU 350 usable for wireless communication between wireless communication devices.
  • the PDU 350 may be used for SU, OFDMA or MU-MIMO transmissions.
  • the PDU 350 may be formatted as an Extremely High Throughput (EHT) WLAN PPDU in accordance with the IEEE 802.11be amendment to the IEEE 802.11 wireless communication protocol standard, or may be formatted as a PPDU conforming to any later (post-EHT) version of a new wireless communication protocol conforming to a future IEEE 802.11 wireless communication protocol standard or other wireless communication standard.
  • the PDU 350 includes a PHY preamble including a legacy portion 352 and a non-legacy portion 354.
  • the PDU 350 may further include a PHY payload 356 after the preamble, for example, in the form of a PSDU including a data field 374.
  • the legacy portion 352 of the preamble includes an L-STF 358, an L-LTF, and an L-SIG.
  • the non-legacy portion 354 of the preamble includes an RL-SIG 364 and multiple wireless communication protocol version-dependent signal fields after RL-SIG 364.
  • the non-legacy portion 354 may include a universal signal field 366 (referred to herein as “U-SIG 366”) and an EHT signal field 368 (referred to herein as “EHT-SIG 368”).
  • U-SIG 366 and EHT-SIG 368 may be structured as, and carry version-dependent information for, other wireless communication protocol versions beyond EHT.
  • the non-legacy portion 354 further includes an additional short training field 370 (referred to herein as “EHT-STF 370,” although it may be structured as, and carry version-dependent information for, other wireless communication protocol versions beyond EHT) and one or more additional long training fields 372 (referred to herein as “EHT-LTFs 372,” although they may be structured as, and carry version-dependent information for, other wireless communication protocol versions beyond EHT).
  • the information in U-SIG 366 and EHT-SIG 368 may be duplicated and transmitted in each of the component 20 MHz channels in instances involving the use of a bonded channel.
  • EHT-SIG 368 may additionally or alternatively carry information in one or more non-primary 20 MHz channels that is different than the information carried in the primary 20 MHz channel.
  • EHT-SIG 368 may include one or more jointly encoded symbols and may be encoded in a different block from the block in which U-SIG 366 is encoded. EHT-SIG 368 may be used by an AP to identify and inform multiple STAs 104 that the AP has scheduled UL or DL resources for them. EHT-SIG 368 may be decoded by each compatible STA 104 served by the AP 102. EHT-SIG 368 may generally be used by a receiving device to interpret bits in the data field 374. For example, EHT-SIG 368 may include RU allocation information, spatial stream configuration information, and per-user signaling information such as MCSs, among other examples.
  • EHT-SIG 368 may further include a cyclic redundancy check (CRC) (for example, four bits) and a tail (for example, 6 bits) that may be used for binary convolutional code (BCC).
  • EHT-SIG 368 may include one or more code blocks that each include a CRC and a tail. In some aspects, each of the code blocks may be encoded separately.
  • EHT-SIG 368 may carry STA-specific scheduling information such as, for example, user-specific MCS values and user-specific RU allocation information.
  • Each EHT-SIG 368 may include a common field and at least one user-specific field.
  • the common field can indicate RU distributions to multiple STAs 104, indicate the RU assignments in the frequency domain, indicate which RUs are allocated for MU-MIMO transmissions and which RUs correspond to MU-OFDMA transmissions, and the number of users in allocations, among other examples.
  • the common field may be encoded with common bits, CRC bits, and tail bits.
  • the user-specific fields are assigned to particular STAs 104 and may be used to schedule specific RUs and to indicate the scheduling to other WLAN devices.
  • Each user-specific field may include multiple user block fields. Each user block field may include, for example, two user fields that contain information for two respective STAs to decode their respective RU payloads.
  • RL-SIG 364 and U-SIG 366 may indicate to EHT- or later version-compliant STAs 104 that the PPDU 350 is an EHT PPDU or a PPDU conforming to any later (post-EHT) version of a new wireless communication protocol conforming to a future IEEE 802.11 wireless communication protocol standard.
  • U-SIG 366 may be used by a receiving device to interpret bits in one or more of EHT-SIG 368 or the data field 374.
  • Figure 4 shows an example PPDU 400 usable for communications between an AP 102 and a number of STAs 104.
  • each PPDU 400 includes a PHY preamble 402 and a PSDU 404.
  • Each PSDU 404 may carry one or more MAC protocol data units (MPDUs).
  • each PSDU 404 may carry an aggregated MPDU (A-MPDU) 408 that includes an aggregation of multiple A-MPDU subframes 406.
  • each A-MPDU subframe 406 may include a MAC delimiter 410 and a MAC header 412 prior to the accompanying MPDU 414, which includes the data portion (“payload” or “frame body”) of the A-MPDU subframe 406.
  • the MPDU 414 may carry one or more MAC service data unit (MSDU) subframes 416.
  • the MPDU 414 may carry an aggregated MSDU (A-MSDU) 418 including multiple MSDU subframes 416.
  • Each MSDU subframe 416 contains a corresponding MSDU 420 preceded by a subframe header 422.
  • the MAC header 412 may include a number of fields containing information that defines or indicates characteristics or attributes of data encapsulated within the frame body 414.
  • the MAC header 412 also includes a number of fields indicating addresses for the data encapsulated within the frame body 414.
  • the MAC header 412 may include a combination of a source address, a transmitter address, a receiver address, or a destination address.
  • the MAC header 412 may include a frame control field containing control information.
  • the frame control field specifies the frame type, for example, a data frame, a control frame, or a management frame.
  • the MAC header 412 may further include a duration field indicating a duration extending from the end of the PPDU until the end of an acknowledgment (ACK) of the last PPDU to be transmitted by the wireless communication device (for example, a block ACK (BA) in the case of an A-MPDU).
  • the use of the duration field serves to reserve the wireless medium for the indicated duration, thus establishing the network allocation vector (NAV).
  • Each A-MPDU subframe 406 also may include a frame check sequence (FCS) field 424 for error detection.
  • the FCS field 424 may include a cyclic redundancy check (CRC).
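As an illustration of the FCS concept above, the sketch below computes a CRC-32 with Python's `zlib.crc32`, which implements the same polynomial 802.11 uses for its FCS; the bit-ordering details of the on-air encoding are omitted, and the example frame bytes are hypothetical:

```python
import zlib

def fcs(mpdu_bytes: bytes) -> int:
    """CRC-32 over a frame body, as conceptually used by the FCS field.

    802.11 uses the same CRC-32 polynomial as Ethernet; zlib.crc32
    implements that polynomial. On-air bit/byte ordering is omitted
    in this sketch.
    """
    return zlib.crc32(mpdu_bytes) & 0xFFFFFFFF

frame = b"\x08\x02" + bytes(22)  # hypothetical MAC header bytes
print(hex(fcs(frame)))
```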
  • APs 102 and STAs 104 can support multi-user (MU) communications, such as multi-user multiple-input, multiple-output (MU-MIMO) and multi-user orthogonal frequency division multiple access (MU-OFDMA) transmissions.
  • the available frequency spectrum of the wireless channel may be divided into multiple resource units (RUs) each including a number of different frequency subcarriers (“tones”).
  • RUs may be allocated or assigned by an AP 102 to different STAs 104 at particular times.
  • the sizes and distributions of the RUs may be referred to as an RU allocation.
  • RUs may be allocated in 2 MHz intervals, and as such, the smallest RU may include 26 tones consisting of 24 data tones and 2 pilot tones. Consequently, in a 20 MHz channel, up to 9 RUs (such as 2 MHz, 26-tone RUs) may be allocated (because some tones are reserved for other purposes).
  • Adjacent RUs may be separated by a null subcarrier (such as a DC subcarrier), for example, to reduce interference between adjacent RUs, to reduce receiver DC offset, and to avoid transmit center frequency leakage.
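The tone accounting above (26-tone RUs of 24 data tones plus 2 pilot tones, up to nine such RUs in a 20 MHz channel) can be checked with a few lines; the constant names are illustrative:

```python
# Tone accounting for the smallest (26-tone) RU described above.
DATA_TONES = 24
PILOT_TONES = 2
RU_TONES = DATA_TONES + PILOT_TONES  # 26

# Up to nine 26-tone RUs fit in a 20 MHz channel; the remaining tones
# are reserved for other purposes (e.g., guard and null subcarriers).
MAX_26_TONE_RUS_20MHZ = 9
print(MAX_26_TONE_RUS_20MHZ * RU_TONES)  # 234 tones across the nine RUs
```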
  • an AP 102 can transmit a trigger frame to initiate and synchronize an UL MU-OFDMA or UL MU-MIMO transmission from multiple STAs 104 to the AP 102.
  • trigger frames may thus enable multiple STAs 104 to send UL traffic to the AP 102 concurrently in time.
  • a trigger frame may address one or more STAs 104 through respective association identifiers (AIDs), and may assign each AID (and thus each STA 104) one or more RUs that can be used to send UL traffic to the AP 102.
  • the AP also may designate one or more random access (RA) RUs for which unscheduled STAs 104 may contend.
  • Figure 5 shows a block diagram of an example wireless communication device 500.
  • the wireless communication device 500 can be an example of a device for use in a STA such as one of the STAs 104 described above with reference to Figure 1A or one or both of the devices 152 or 154 with reference to Figure 1B.
  • the wireless communication device 500 can be an example of a device for use in an AP such as the AP 102 described above with reference to Figure 1A.
  • the wireless communication device 500 is capable of transmitting (or outputting for transmission) and receiving wireless communications (for example, in the form of wireless packets).
  • the wireless communication device can be configured to transmit and receive packets in the form of PPDUs and MPDUs conforming to an IEEE 802.11 standard, such as that defined by the IEEE 802.11-2016 specification or amendments thereof including, but not limited to, 802.11ah, 802.11ad, 802.11ay, 802.11ax, 802.11az, 802.11ba, and 802.11be.
  • the wireless communication device 500 can be, or can include, a chip, system on chip (SoC), chipset, package, or device that includes one or more modems 502, for example, a Wi-Fi (IEEE 802.11 compliant) modem.
  • the one or more modems 502 (collectively “the modem 502”) additionally include a WWAN modem (for example, a 3GPP 4G LTE or 5G compliant modem).
  • the wireless communication device 500 also includes one or more radios 504 (collectively “the radio 504”).
  • the wireless communication device 500 further includes one or more processors, processing blocks or processing elements 506 (collectively “the processor 506”), and one or more memory blocks or elements 508 (collectively “the memory 508”).
  • the modem 502 can include an intelligent hardware block or device such as, for example, an application-specific integrated circuit (ASIC) among other possibilities.
  • the modem 502 is generally configured to implement a PHY layer.
  • the modem 502 is configured to modulate packets and to output the modulated packets to the radio 504 for transmission over the wireless medium.
  • the modem 502 is similarly configured to obtain modulated packets received by the radio 504 and to demodulate the packets to provide demodulated packets.
  • the modem 502 may further include digital signal processing (DSP) circuitry, automatic gain control (AGC), a coder, a decoder, a multiplexer, and a demultiplexer.
  • data obtained from the processor 506 is provided to a coder, which encodes the data to provide encoded bits.
  • the encoded bits are mapped to points in a modulation constellation (using a selected MCS) to provide modulated symbols.
  • the modulated symbols may be mapped to a number NSS of spatial streams or a number NSTS of space-time streams.
  • the modulated symbols in the respective spatial or space-time streams may be multiplexed, transformed via an inverse fast Fourier transform (IFFT) block, and subsequently provided to the DSP circuitry for Tx windowing and filtering.
  • the digital signals may be provided to a digital-to-analog converter (DAC).
  • the resultant analog signals may be provided to a frequency upconverter, and ultimately, the radio 504.
  • the modulated symbols in the respective spatial streams are precoded via a steering matrix prior to their provision to the IFFT block.
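One step of the transmit chain described above, mapping coded bits to constellation points, can be sketched for QPSK. Gray mapping and unit amplitude are assumed for brevity (real 802.11 mappings normalize symbol energy), and the helper name is hypothetical:

```python
# Gray-mapped QPSK constellation: adjacent points differ by one bit.
QPSK = {
    (0, 0): complex(+1, +1),
    (0, 1): complex(+1, -1),
    (1, 1): complex(-1, -1),
    (1, 0): complex(-1, +1),
}

def map_bits(bits):
    """Pair up coded bits and map each pair to a QPSK symbol."""
    return [QPSK[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(map_bits([0, 0, 1, 1]))  # [(1+1j), (-1-1j)]
```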
  • digital signals received from the radio 504 are provided to the DSP circuitry, which is configured to acquire a received signal, for example, by detecting the presence of the signal and estimating the initial timing and frequency offsets.
  • the DSP circuitry is further configured to digitally condition the digital signals, for example, using channel (narrowband) filtering, analog impairment conditioning (such as correcting for I/Q imbalance), and applying digital gain to ultimately obtain a narrowband signal.
  • the output of the DSP circuitry may be fed to the AGC, which is configured to use information extracted from the digital signals, for example, in one or more received training fields, to determine an appropriate gain.
  • the output of the DSP circuitry also is coupled with the demodulator, which is configured to extract modulated symbols from the signal and, for example, compute the log-likelihood ratios (LLRs) for each bit position of each subcarrier in each spatial stream.
  • the demodulator is coupled with the decoder, which may be configured to process the LLRs to provide decoded bits.
  • the decoded bits from all of the spatial streams are fed to the demultiplexer for demultiplexing.
  • the demultiplexed bits may be descrambled and provided to the MAC layer (the processor 506) for processing, evaluation, or interpretation.
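The LLR computation mentioned above has a simple closed form for BPSK over an AWGN channel; the sketch below (with a hypothetical helper name and bit-to-symbol mapping) illustrates the idea before the LLRs are handed to the decoder:

```python
def bpsk_llr(y: float, noise_var: float) -> float:
    """Log-likelihood ratio for one BPSK sample under Gaussian noise.

    Assumes bit 0 -> +1 and bit 1 -> -1 mapping; with noise variance
    noise_var, the LLR reduces to the closed form below. A positive
    LLR favors bit 0, a negative LLR favors bit 1.
    """
    return 2.0 * y / noise_var

# A strongly positive received sample yields a confident "bit 0" LLR:
print(bpsk_llr(0.9, 0.5))  # 3.6
```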
  • the radio 504 generally includes at least one radio frequency (RF) transmitter (or “transmitter chain”) and at least one RF receiver (or “receiver chain”), which may be combined into one or more transceivers.
  • the RF transmitters and receivers may include various DSP circuitry including at least one power amplifier (PA) and at least one low-noise amplifier (LNA), respectively.
  • the RF transmitters and receivers may in turn be coupled to one or more antennas.
  • the wireless communication device 500 can include, or be coupled with, multiple transmit antennas (each with a corresponding transmit chain) and multiple receive antennas (each with a corresponding receive chain).
  • the symbols output from the modem 502 are provided to the radio 504, which transmits the symbols via the coupled antennas. Similarly, symbols received via the antennas are obtained by the radio 504, which provides the symbols to the modem 502.
  • the processor 506 can include an intelligent hardware block or device such as, for example, a processing core, a processing block, a central processing unit (CPU), a microprocessor, a microcontroller, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a programmable logic device (PLD) such as a field programmable gate array (FPGA), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • the processor 506 processes information received through the radio 504 and the modem 502, and processes information to be output through the modem 502 and the radio 504 for transmission through the wireless medium.
  • the processor 506 may implement a control plane and MAC layer configured to perform various operations related to the generation and transmission of MPDUs, frames, or packets.
  • the MAC layer is configured to perform or facilitate the coding and decoding of frames, spatial multiplexing, space-time block coding (STBC), beamforming, and OFDMA resource allocation, among other operations or techniques.
  • the processor 506 may generally control the modem 502 to cause the modem to perform various operations described above.
  • the memory 508 can include tangible storage media such as random-access memory (RAM) or read-only memory (ROM), or combinations thereof.
  • the memory 508 also can store non-transitory processor- or computer-executable software (SW) code containing instructions that, when executed by the processor 506, cause the processor to perform various operations described herein for wireless communication, including the generation, transmission, reception, and interpretation of MPDUs, frames, or packets.
  • FIG. 6A shows a block diagram of an example AP 602.
  • the AP 602 can be an example implementation of the AP 102 described with reference to Figure 1A or the AP 158 described with reference to Figure 1B.
  • the AP 602 includes a wireless communication device (WCD) 610.
  • the wireless communication device 610 may be an example implementation of the wireless communication device 500 described with reference to Figure 5.
  • the AP 602 also includes multiple antennas 620 coupled with the wireless communication device 610 to transmit and receive wireless communications.
  • the AP 602 additionally includes an application processor 630 coupled with the wireless communication device 610, and a memory 640 coupled with the application processor 630.
  • the AP 602 further includes at least one external network interface 650 that enables the AP 602 to communicate with a core network or backhaul network to gain access to external networks including the Internet.
  • the external network interface 650 may include one or both of a wired (for example, Ethernet) network interface and a wireless network interface (such as a WWAN interface).
  • Each of the aforementioned components can communicate with the other components directly or indirectly, over at least one bus.
  • the AP 602 further includes a housing that encompasses the wireless communication device 610, the application processor 630, the memory 640, and at least portions of the antennas 620 and external network interface 650.
  • FIG. 6B shows a block diagram of an example STA 604.
  • the STA 604 can be an example implementation of the STA 104 described with reference to Figure 1A or one or more of the devices 152 or 154 described with reference to Figure 1B.
  • the STA 604 includes a wireless communication device 615.
  • the wireless communication device 615 may be an example implementation of the wireless communication device 500 described with reference to Figure 5.
  • the STA 604 also includes one or more antennas 624 coupled with the wireless communication device 615 to transmit and receive wireless communications.
  • the STA 604 additionally includes an application processor 635 coupled with the wireless communication device 615, and a memory 645 coupled with the application processor 635.
  • the STA 604 further includes a user interface (UI) 655 (such as a touchscreen or keypad) and a display 665, which may be integrated with the UI 655 to form a touchscreen display.
  • the STA 604 may further include one or more sensors 675 such as, for example, one or more inertial sensors, accelerometers, temperature sensors, pressure sensors, or altitude sensors.
  • the STA 604 further includes a housing that encompasses the wireless communication device 615, the application processor 635, the memory 645, and at least portions of the antennas 624, UI 655, and display 665.
  • the STA 604 may be, may include, or may be coupled to an HMD or other device used to provide an XR experience to a user (such as an audio device (including headphones), haptic gloves, a haptic vest, and so on). If the STA 604 includes an HMD, the display 665 may be integrated into glasses, a monocle, or other head-mounted unit. In this manner, the STA 604 may be a display device for an XR experience.
  • the sensors 675 may include an inertial measurement unit (IMU), which may include a gyroscope, an accelerometer, or other sensors to determine an orientation or motion of the STA 604.
  • the IMU may be included in an HMD to measure the orientation or motion of the HMD as a user moves his or her head.
  • the orientation information (and, optionally, the motion information) of the device may be referred to as pose information.
  • the IMU measures the pose information at a defined interval (such as every 100 microseconds (µs)).
  • the display device (such as a HMD) includes one or more cameras in its sensors 675.
  • the one or more cameras may be used by the display device to generate tracking frames or other tracking information to be used by the render device in generating video frames.
  • the tracking information may include markers or points in the user’s field of view (FOV) in the HMD used to determine locations of information or objects in the video frames.
  • markers indicating a location of a building door may be used to render a video frame to overlay a name of a business, open hours, and so on near the door in the user’s FOV.
  • the UL traffic from the display device to the render device may include pose data frames (also referred to as pose frames) and tracking frames. While rendering is described in the examples as being based on a pose frame, rendering also may be based on one or more tracking frames from the display device.
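The UL pose data frames described above carry orientation (and optionally motion) information from the display device to the render device. A minimal sketch of one plausible on-the-wire layout is below; the field order, the quaternion-plus-position representation, and all names are assumptions for illustration, since the document does not specify the pose packet format.

```python
import struct

# Hypothetical layout: timestamp (double), orientation quaternion
# (w, x, y, z as float32), position vector (x, y, z as float32).
POSE_FMT = "<d4f3f"

def pack_pose(timestamp, quaternion, position):
    """Serialize a pose sample into a fixed-size payload."""
    return struct.pack(POSE_FMT, timestamp, *quaternion, *position)

def unpack_pose(payload):
    """Recover (timestamp, quaternion, position) from a payload."""
    fields = struct.unpack(POSE_FMT, payload)
    return fields[0], fields[1:5], fields[5:8]

pkt = pack_pose(12.5, (1.0, 0.0, 0.0, 0.0), (0.0, 1.5, -2.0))
ts, quat, pos = unpack_pose(pkt)
```

A tracking frame would likely carry a variable-length list of marker points instead of a single fixed record, so it would need a length prefix rather than a fixed struct.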
  • the STA 604 may be configured as a SAP to the device 152.
  • the software to cause the STA 604 to act as a SAP may be stored in memory 645 (or another suitable memory of the STA 604) and executed by the application processor 635. In this manner, the STA 604 (acting as a SAP) may perform some functionalities of the AP 602.
  • the render device for an XR experience may be the device 154 (which may be a STA 604 acting as a SAP), the AP 158 (with the device 154 acting as a bridge device), or a combination of both.
  • the AP 602 may be referred to in the examples herein in describing a render device for an XR experience only for clarity and simplicity in explaining aspects of the present disclosure.
  • the display device for an XR experience is the device 152.
  • the STA 604 may be referred to in the examples herein in describing a display device for an XR experience.
  • the render device and the display device may communicate over a 6 GHz frequency band (such as to transmit PPDUs or pose packets between the devices).
  • the downlink (DL) transmission rate from the render device to the display device may be at least 50 megabits per second (Mbps) to 120 Mbps to support 45 to 60 frames per second (fps), and the uplink (UL) transmission rate from the display device to the render device may be a minimum of 0.5 Mbps to 20 Mbps.
  • the devices may be configured such that one way latency (such as on the DL or on the UL) is less than 10 ms.
  • the application processor 630 or 635 may be configured to execute one or more host layers of the open systems interconnection (OSI) model, including the application layer.
  • the application layer of the render device may refer to the application processor 630 or other components performing operations at the application layer.
  • the application layer for an XR experience at the render device may include rendering for an XR application executed by the render device.
  • the application layer of the display device may refer to the application processor 635 or other components performing operations at the application layer.
  • the application layer for an XR experience at the display device may include displaying a video for an XR application executed by the display device.
  • the WCD 610 or 615 may be configured to execute one or more layers of the OSI model, such as the network layer, the media access control layer (MAC), or the physical layer (PHY).
  • layers may be referred to as a wireless layer or a Wi-Fi layer.
  • the wireless layer of the render device may refer to the WCD 610 performing operations at the wireless layer.
  • the wireless layer of the display device may refer to the WCD 615 performing operations at the wireless layer.
  • the process from measuring pose information to rendering a video frame based on the pose information may be referred to as motion-to-render (M2R).
  • the process from rendering the video frame to displaying the video frame may be referred to as render-to-photon (R2P).
  • the overall process from measuring pose information to rendering a video frame using the pose information to displaying the video frame may be referred to as motion-to-render-to-photon (M2R2P).
  • the render device and the display device are configured such that M2R2P is less than 70 ms (such as between 50 and 70 ms).
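The M2R2P pipeline can be viewed as a latency budget: the M2R and R2P stages must together stay under the target. The sketch below checks an example budget; the individual stage numbers are assumptions chosen for illustration, and only the end-to-end target of less than 70 ms comes from the text.

```python
# Illustrative per-stage latencies in milliseconds (assumed values).
stages_ms = {
    "pose_measurement_and_uplink": 5.0,   # M2R: IMU sample + UL transfer
    "render_and_encode": 25.0,            # M2R: render-device work
    "downlink_transfer": 8.0,             # R2P: DL PPDUs over the air
    "decode_jitter_buffer_atw": 20.0,     # R2P: decode, buffer, time warp
}

def m2r2p_ms(stages):
    """Total motion-to-render-to-photon latency."""
    return sum(stages.values())

def meets_budget(stages, budget_ms=70.0):
    """True if the pipeline fits the M2R2P target from the text."""
    return m2r2p_ms(stages) < budget_ms
```

Framing the pipeline this way makes clear why wireless-medium access delays matter: any extra contention time on the UL or DL transfer stages eats directly into the 70 ms budget.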
  • FIG. 7 shows a sequence diagram 700 illustrating example M2R2P operations.
  • the example M2R2P operations are with reference to one video frame, and the M2R2P operations may be repeated for each video frame.
  • the timings of operations are not drawn to scale, and some operations may be slightly later than previous operations or may take a longer or shorter amount of time than as depicted.
  • the IMU of the display device generates a plurality of IMU measurements (702). Each line indicates an IMU measurement. As depicted, the IMU measurements may be periodic without reference to when video frames are to be rendered or displayed. For a video frame to be rendered and displayed, the display device generates an IMU measurement packet (704) from one of the IMU measurements. The IMU measurement that is selected may be the current IMU measurement when a request for the packet is obtained from the render device or when the packet is to be generated based on a defined periodicity for the packets. After generating the IMU measurement packet, the display device may transmit the packet to the render device during an UL transmission 706 of a transmit/receive window 708.
  • the render device may render and encode the video frame (710) based on the pose information (or other information) obtained in the IMU measurement packet.
  • the render device may generate one or more PPDUs to include the encoded video frame data and transmit the one or more PPDUs to the display device during a DL transmission 712 of a transmit/receive window 708.
  • the timing of the UL and DL transmissions 706 and 712 include over the air (OTA) time and processing time at the wireless layer.
  • the display device may decode the video frame (714).
  • the display device may use a jitter buffer (716) to capture decoded frame information and smooth video frame generation times to prevent jitter in the video.
  • the jitter buffer is used to ensure a fixed playback schedule of video frames by removing variations in decoding times.
  • the display device may perform asynchronous time warping (ATW) on one or more video frames (718) to increase the perceived frame rate of the displayed video or compensate for latency between when the IMU measurement occurs and when the video frame is displayed.
  • the display device displays the video frame (720), which is a video frame based on the selected IMU measurement at 702.
  • an IMU measurement may be referred to as pose information
  • an IMU measurement packet may be referred to as a pose packet or pose data packet.
  • Pose information may include an orientation of the device with reference to the azimuth, a movement of the device, a location of the device, or other characteristics of a particular position of the device used to generate appropriate video frames to be displayed by the device.
  • the pose packet is depicted as being generated once for each video frame.
  • the pose packet may be generated at a higher frequency than once for each video frame. For example, a 60 fps video corresponds to 16.67 ms per video frame, and the display device may generate a pose packet every 2 ms. In this manner, at least 8 pose packets may be generated during display of one video frame.
  • the display device may select the most recent pose packet when a pose packet is to be provided to the render device.
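The cadence arithmetic above (60 fps gives about 16.67 ms per frame; a pose sample every 2 ms gives at least 8 samples per frame) and the most-recent-packet selection can be sketched directly; the helper names are illustrative.

```python
FRAME_INTERVAL_MS = 1000.0 / 60   # 60 fps -> ~16.67 ms per video frame
POSE_INTERVAL_MS = 2.0            # one pose packet every 2 ms

# Pose packet generation times (ms) within one frame interval.
pose_timestamps = [
    i * POSE_INTERVAL_MS
    for i in range(int(FRAME_INTERVAL_MS // POSE_INTERVAL_MS) + 1)
]

def most_recent_pose(timestamps, now_ms):
    """Select the newest pose packet generated at or before now_ms,
    matching the display device's selection of the most recent packet."""
    candidates = [t for t in timestamps if t <= now_ms]
    return max(candidates) if candidates else None
```

Selecting the most recent sample (rather than the one aligned to the frame boundary) minimizes the age of the pose information the render device works from.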
  • the render device may generate a video frame based on the obtained pose packet.
  • a most recent pose packet may not be obtained by the render device because the wireless medium is occupied by a DL transmission from the render device to the display device.
  • the render device may use the last obtained pose packet to generate a video frame.
  • the display device may provide a plurality of pose packets during rendering and encoding a previous video frame.
  • sending of the pose packets is synchronized with rendering and encoding. In this manner, timing of sending the pose packet is ensured so that the render device obtains the pose packet to render a next video frame.
  • M2R2P latency requirements of an XR experience may cause the wireless system to provide preference to the render device and the display device to obtain control of the wireless medium.
  • Preference to the render device and the display device may be with reference to other devices in the same BSS or one or more devices from OBSSs within range of the render device or the display device.
  • Obtaining control of the wireless medium may refer to a device (such as the render device) obtaining control of one or more wireless channels for transmission or reception.
  • Access to one or more wireless channels may be asynchronous (in which devices contend for the wireless medium during contention windows) or synchronous (in which one or more APs coordinate access to the wireless medium between devices).
  • providing preference may include adjusting one or more contention parameters (such as a backoff timer value or one or more EDCA parameters) or coordinating transmission / reception windows between the render device and the display device to prevent interference by one or more other devices transmitting on the wireless medium.
  • Asynchronous control of the wireless medium may be referred to as asynchronous channel access control.
  • Synchronous control of the wireless medium may be referred to as synchronous channel access control.
  • the wireless system also may be configured to prevent collisions between DL and UL traffic between the render device and the display device, prevent interference between links between the render device and an AP and the render device and the display device, or allow power saving at the render device or the display device based on channel access control.
  • FIG. 8 shows a flowchart illustrating an example process 800 for asynchronous channel access control according to some implementations.
  • Control of the wireless medium and other operations in the example process may be performed by a device (such as the device 154 in Figure 1B) or a wireless communication device implemented in the device (such as the WCD 615 in Figure 6B).
  • the device performing the operations may be a render device of an XR experience.
  • the device obtains control of a wireless medium.
  • Control of the wireless medium is associated with a first priority of transmitting, over the wireless medium, a first PPDU of an application file from the device to a second device (804).
  • the first priority is different than a second priority of transmitting data from the second device to the device over the wireless medium (806).
  • an application file may be a smallest portion of data that is whole (without missing portions) that can be processed by a device.
  • the device provides the first PPDU to the second device.
  • the device provides one or more subsequent PPDUs of the application file to the second device.
  • the second device is a display device
  • the application file is associated with a video frame (such as including a video slice of a video frame).
  • a plurality of PPDUs may be associated with a video frame.
  • the action of providing the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium (812).
  • the first priority may be associated with a first set of EDCA parameters
  • the second priority may be associated with a second set of EDCA parameters
  • the third priority may be associated with a third set of EDCA parameters.
  • EDCA parameters may include one or more of an arbitration interframe spacing number (AIFSN), a minimum contention window (CWmin), or a maximum contention window (CWmax) used by the render device or the display device to determine when to transmit to the other device (such as based on CW mechanisms defined in the IEEE 802.11 standards).
  • the first priority may be associated with aggressive EDCA parameters to obtain control of the wireless medium over all other devices to contend for the wireless medium.
  • the CWmin and the CWmax for the first priority are a first length that is less than other CWs used by the other devices.
  • the AIFSN for the first priority is a lower number than other AIFSNs used by the other devices.
  • lower numbers may indicate a priority of the traffic based on a classification of the packets to be transmitted (such as high priority, best effort, and so on).
  • a lower priority or a higher priority may refer to a higher or lower AIFSN, respectively.
  • the AIFSN may be 1 for the first priority to indicate highest priority traffic to be transmitted, and the AIFSN may be 3 for the third priority to indicate traffic with a lower priority than the first priority.
  • the AIFSN may be 2 for the second priority to indicate traffic with a higher priority than the third priority and a lower priority than the first priority.
  • the render device may reserve the CW for both the render device and the display device. In this manner, the CWmin and CWmax may be set to 0 for the display device.
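The three EDCA parameter sets described above can be tabulated as follows. The AIFSN values of 1, 2, and 3 follow the text; the CWmin/CWmax values are assumed examples (the text only states that the first priority's CW is shorter than the other devices' and that the display device's CW may be 0 when the render device reserves the CW for both devices).

```python
# Illustrative EDCA parameter sets for the three priorities (assumed CWs).
EDCA_SETS = {
    "first":  {"aifsn": 1, "cwmin": 3,  "cwmax": 7},   # first PPDU of a frame
    "second": {"aifsn": 2, "cwmin": 0,  "cwmax": 0},   # display-device data
    "third":  {"aifsn": 3, "cwmin": 15, "cwmax": 63},  # subsequent PPDUs
}

def higher_priority(a, b):
    """A lower AIFSN indicates higher-priority traffic."""
    return a if EDCA_SETS[a]["aifsn"] < EDCA_SETS[b]["aifsn"] else b
```

With this ordering, the first PPDU of a video frame wins contention over the display device's pose data, which in turn wins over subsequent PPDUs of the same frame.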
  • the first priority also may be associated with a first backoff counter value for contention of the wireless medium
  • the second priority also may be associated with a second backoff counter value for contention of the wireless medium
  • the third priority also may be associated with a third backoff counter value for contention of the wireless medium.
  • a backoff counter is set to the backoff counter value, and the backoff counter counts down until 0 for a transmit opportunity (TXOP) on the wireless medium.
  • the device attempts to transmit on the wireless medium (such as performing carrier sense to determine if the wireless medium is unoccupied and transmitting on the wireless medium based on the carrier sense).
  • a shorter backoff counter value corresponds to a device attempting to transmit on the wireless medium sooner.
  • the first backoff counter value is less than the second backoff counter value, and the second backoff counter value is less than the third backoff counter value.
  • a first PPDU (such as for a video frame) from a render device may be attempted to be transmitted sooner than data (such as a pose packet) from the display device, and the data from the display device may be attempted to be transmitted sooner than the one or more subsequent PPDUs (such as for the video frame) from the render device.
  • the first backoff counter value may be 0, the second backoff counter value may indicate a time period greater than a short interframe spacing (SIFS), and the third backoff counter value may indicate a time period greater than indicated by the second backoff counter value.
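The backoff ordering above (first value of 0, second value greater than a SIFS, third value greater than the second) determines who attempts to transmit first. A small sketch, with assumed microsecond values (the SIFS of 16 µs and the increments are illustrative, not from the text):

```python
SIFS_US = 16  # assumed SIFS duration in microseconds

# Backoff values in microseconds of idle air time before a transmit
# attempt, following the ordering in the text: first < second < third.
first_bo_us = 0
second_bo_us = SIFS_US + 9      # greater than a SIFS
third_bo_us = second_bo_us + 9  # greater than the second value

def transmit_order(bo_values):
    """Devices with shorter backoff attempt to transmit sooner."""
    return [name for name, _ in sorted(bo_values.items(), key=lambda kv: kv[1])]

order = transmit_order({
    "render_first_ppdu": first_bo_us,
    "display_data": second_bo_us,
    "render_subsequent_ppdus": third_bo_us,
})
```

This reproduces the intended interleaving: the render device sends the frame's first PPDU immediately, the display device's pose data slots in next, and the remaining PPDUs of the frame follow.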
  • the render device provides the first PPDU associated with the first priority and the one or more subsequent PPDUs associated with the third priority.
  • the render device may configure itself to adjust the one or more EDCA parameters from the first set of parameters to the third set of parameters after transmitting the first PPDU.
  • the render device also may adjust the backoff counter value from the first value to the third value (such as by setting the backoff counter to the third value for the subsequent PPDUs) after transmitting the first PPDU.
  • the render device has a higher probability of transmitting the first PPDU relative to other devices (including the display device and other devices in OBSSs).
  • the display device may transmit data (such as pose packets) to the render device before the one or more subsequent PPDUs are transmitted to the display device.
  • the render device uses a request to send (RTS)
  • the render device may provide (such as broadcast) a first RTS frame to obtain control of the wireless medium.
  • the RTS frame may be transmitted based on the first backoff counter value.
  • the first RTS frame includes an indication of a network allocation vector (NAV) to maintain control of the wireless medium for a first time period.
  • the NAV indicates the time period that the wireless medium is reserved by the device.
  • other devices within range of the render device receive the RTS frame, process the RTS frame, and set their NAV to the indicated amount of time to prevent transmitting on the wireless medium for the amount of time indicated.
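The NAV behavior of a third-party device overhearing the RTS frame can be sketched as a small timer, as below. The class and method names are illustrative; the only behavior taken from the text is that the device records the indicated reservation and refrains from transmitting until it elapses.

```python
class NavTimer:
    """Minimal NAV bookkeeping for a device that overhears an RTS frame."""

    def __init__(self):
        self.reserved_until_us = 0

    def on_rts(self, now_us, nav_duration_us):
        # Update the NAV only if the new reservation extends the current one.
        self.reserved_until_us = max(
            self.reserved_until_us, now_us + nav_duration_us)

    def medium_free(self, now_us):
        """The device may contend only once the NAV has expired."""
        return now_us >= self.reserved_until_us

nav = NavTimer()
nav.on_rts(now_us=0, nav_duration_us=2500)  # e.g. a 2.5 ms reservation
```

Taking the maximum on update matters: a later, shorter reservation from another exchange must not truncate an existing NAV.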
  • the example RTS/CTS mechanism and transmission of data between the render device and display device is described in more detail below with reference to Figure 10.
  • Figure 8 depicts an example process from the render device perspective.
  • Figure 9 depicts the example process from the display device perspective.
  • Figure 9 shows a flowchart illustrating an example process 900 for asynchronous channel access control according to some implementations.
  • Operations in the example process may be performed by a device (such as the device 152 in Figure 1B) or a wireless communication device implemented in the device (such as the WCD 615 in Figure 6B).
  • the device performing the operations may be a display device of an XR experience.
  • the device obtains a first PPDU of an application file from a second device over a wireless medium.
  • the device may be a display device, and the second device may be a render device.
  • the second device may refer to an infrastructure AP, an SAP (such as the device 154 in Figure 1B acting as an SAP) or another device to provide the PPDUs of the application file to the device (the second device may be referred to as an AP or a render device in the below examples with reference to Figures 9 and 10).
  • the application file includes one or more video slices of a video frame.
  • the second device obtains control of the wireless medium to provide the first PPDU (904), and control of the wireless medium is associated with a first priority of transmitting the first PPDU over the wireless medium (906).
  • the first priority is different than a second priority of transmitting data from the device to the second device over the wireless medium (908).
  • the device obtains one or more subsequent PPDUs of the application file from the second device.
  • the action of obtaining the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium (912).
  • the priorities may be associated with one or more of a backoff counter value or sets of EDCA parameters.
  • FIG. 10 shows a sequence diagram 1000 illustrating example transmissions between devices for an XR experience.
  • the devices are depicted as a render device and a display device for explanation purposes, but the concepts described may be extended to other types of devices.
  • Start time t0 may indicate the beginning of a CW or other moment when the render device is to begin its backoff (BO) counter (such as when one or more PPDUs are ready for transmission at the render device).
  • the BO counter of the render device is set to the first backoff counter value (first BO value) associated with the first PPDU of an application file (such as of a video frame). If the display device is to begin its BO counter, the BO counter of the display device is set to the second backoff counter value (second BO value) associated with the data to be transmitted from the display device to the render device.
  • the render device may send a RTS frame.
  • the end of the RTS frame on the wireless medium is at time t2.
  • the display device sends a CTS frame to the render device a SIFS duration after time t2 (time t3).
  • Sending the CTS frame may be based on the display device obtaining the RTS frame (which may indicate that the wireless medium is free) and that the display device is ready to obtain data from the render device.
  • the end of the CTS frame on the wireless medium is at time t4.
  • the render device sends a first PPDU of an application file (such as of a video frame) to the display device a SIFS duration after time t4 (time t5).
  • the render device may again wait the BO time period and attempt to resend the RTS frame to the display device.
  • the render device is configured to retry sending the RTS frame five times before indicating that sending the RTS frame failed.
  • the RTS/CTS mechanism used to reserve the wireless medium is associated with a video frame. If sending the RTS fails (and the render device is unable to obtain control of the wireless medium), the render device may prevent providing the associated frame to the display device and attempt to obtain control of the wireless medium to provide the next frame to the display device.
  • the display device may repeat the previously displayed video frame or not display a video frame at the time the missed video frame is to be displayed.
  • one or more parameters of the DL communications, such as a modulation and coding scheme (MCS), a frame rate, forward error correction (FEC), and so on, may be adjusted.
  • the end of the first PPDU on the wireless medium is at time t6.
  • the display device sends a block acknowledgement (BA) to the render device a SIFS duration after time t6 (time t7).
  • the BA indicates that the display device obtained the first PPDU.
  • the BA also may indicate if any portions (such as one or more MPDUs) were corrupted or missing (such as by indicating which MPDUs (which include one or more MSDUs) were obtained in the first PPDU). If the render device does not obtain a BA, the render device may attempt to resend the first PPDU to the display device.
  • the end of the BA on the wireless medium is at time t8.
  • the render device sends a second PPDU to the display device a SIFS duration after time t8 (time t9), with the end of the second PPDU on the wireless medium at time t10.
  • the process may repeat for one or more PPDUs to be transmitted from the render device to the display device during the TXOP.
  • the render device may send an Nth PPDU at time t11 (for N greater than or equal to 1).
  • the end of the Nth PPDU on the wireless medium is at time t12.
  • the display device sends a BA to the render device a SIFS duration after time t12 (time t13), with the end of the BA on the wireless medium at time t14.
  • the RTS frame indicates a NAV value to reserve the wireless medium for a duration of time greater than the TXOP (such as the original NAV protection duration indicated).
  • a TXOP may be configurable by the render device for up to a 2.5 ms duration.
  • the RTS frame indicates a NAV value equal to or greater than 2.5 ms in case the render device is to extend the NAV protection.
  • Setting the NAV may refer to the render device indicating a value to set each receiving device’s NAV to prevent the receiving device from transmitting for the amount of time.
  • Extending the NAV may refer to the render device indicating a NAV padding value or new NAV value to increase or set each receiving device’s NAV to extend the amount of time the receiving device is to prevent transmitting.
  • the NAV duration and the TXOP may begin at the start time of the RTS frame.
  • the render device indicates the NAV duration as set to the TXOP duration (such as 2.5 ms).
  • the render device also may indicate in each PPDU (such as the first PPDU up through the Nth PPDU) a minimum NAV value to ensure the remaining NAV is a minimum duration or to extend the NAV (such as indicating a NAV padding of 1 ms following the PPDU transmission).
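The NAV maintenance described above (an initial NAV equal to the TXOP duration, then a minimum-remaining or padding value indicated in each PPDU) can be sketched as follows; the function name and the specific padding value are illustrative, though the 2.5 ms TXOP and 1 ms padding figures come from the text.

```python
TXOP_US = 2500  # a TXOP configurable up to 2.5 ms, per the text

def extend_nav(reserved_until_us, ppdu_end_us, min_remaining_us):
    """Ensure the NAV reaches at least min_remaining_us past the end of a
    PPDU (e.g. a 1 ms NAV padding indicated in the PPDU), without ever
    shortening the existing reservation."""
    return max(reserved_until_us, ppdu_end_us + min_remaining_us)

nav_until_us = 0 + TXOP_US  # original NAV set at the start of the RTS frame
# A PPDU ending at 2.0 ms indicates 1 ms of NAV padding after it.
nav_until_us = extend_nav(nav_until_us, ppdu_end_us=2000, min_remaining_us=1000)
```

Because the update is a `max`, a PPDU ending early in the TXOP whose padding falls inside the existing reservation leaves the NAV unchanged.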
  • the display device may run its BO counter (set to the second BO value) at the end of the TXOP (such as at time t14). During the BO time, the display device listens to the wireless medium for incoming traffic. If the wireless medium remains clear by the time the BO counter reaches 0 (such as at time t15), the display device may send data to the render device. The data from the display device may include pose information or any other suitable UL data for the XR experience. The end of the data on the wireless medium is at time t16. The render device sends a BA to the display device a SIFS duration after time t16 (time t17), with the end of the BA on the wireless medium at time t18.
  • the render device may have one or more subsequent PPDUs of the application file to send to the display device. For example, sending a video frame may not be completed during a first TXOP, and the remainder of the video frame is to be sent during one or more subsequent TXOPs. As depicted, the original NAV protection indicates that the wireless medium is still reserved by the render device after time t18.
  • the one or more subsequent PPDUs are associated with a third priority (which is associated with a third backoff counter value (third BO value) or a third set of EDCA parameters).
  • the BO counter of the render device may be set to the third BO value (and the render device may be configured for the third set of EDCA parameters), and the BO counter begins to count down to 0 at the end of the BA on the wireless medium (time t18).
  • the BO counter reaches 0 at time t19, and the render device provides a new RTS frame at time t19. As depicted, time t19 is still in the original NAV protection duration.
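The contention step described above (a BO counter seeded from a priority-specific value, counting down while the medium is idle, and permitting transmission at zero) can be sketched as follows. This is a simplified, hypothetical model: slot timing, AIFS, and carrier sensing are abstracted into a per-slot idle flag.

```python
# Simplified backoff countdown model (illustrative, not the full
# EDCA procedure): the counter decrements only during idle slots.
def run_backoff(bo_value, medium_idle_per_slot):
    """Return the slot index at which the BO counter reaches 0, or None."""
    counter = bo_value
    for slot, idle in enumerate(medium_idle_per_slot):
        if idle:
            counter -= 1
        if counter == 0:
            return slot  # device may transmit (e.g., send its RTS) now
    return None  # medium stayed busy too often; keep contending
```

A busy slot (for example, another device's transmission) freezes the countdown, so a device with a smaller BO value, such as one using the first set of EDCA parameters, tends to win contention earlier.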
  • the new RTS may indicate a new NAV value indicating an extended NAV protection (which may be similar in duration to the original NAV protection). In this manner, each receiving device’s NAV is reset to the indicated NAV value, effectively extending the reservation of the NAV by the render device.
  • the new RTS frame also may indicate a new TXOP.
  • the end of the RTS frame on the wireless medium is at time t20.
  • the display device sends a CTS frame to the render device a SIFS duration after time t20 (time t21), with the end of the CTS frame on the wireless medium at time t22.
  • the render device sends an (N+1)th PPDU to the display device a SIFS duration after time t22 (time t23).
  • the process may be repeated similar to the first TXOP for one or more additional TXOPs.
  • the RTS/CTS mechanism may be enabled by the render device for the first PPDU of each video frame and disabled for other PPDUs of the video frame. In this manner, the (N+1)th PPDU may be sent without the RTS/CTS mechanism (such as to indicate a new TXOP to the display device). The NAV duration may be extended using the NAV padding indicated in each PPDU. In some implementations, the RTS/CTS mechanism may be enabled by the render device for PPDUs other than the first PPDU (such as for the one or more subsequent PPDUs, as depicted in Figure 10, to indicate a new TXOP).
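A minimal sketch of the per-PPDU RTS/CTS policy just described, assuming the transmitter knows each PPDU's position within its video frame. The function name and parameter are illustrative.

```python
# Illustrative per-PPDU RTS/CTS decision: protect the first PPDU of
# each video frame, or (alternative policy) the subsequent PPDUs.
def use_rts_cts(ppdu_index_in_frame, rts_for_first_only=True):
    if rts_for_first_only:
        # Default policy: RTS/CTS only for the frame's first PPDU;
        # later PPDUs rely on NAV padding for protection.
        return ppdu_index_in_frame == 0
    # Alternative policy: RTS/CTS for PPDUs other than the first
    # (e.g., to indicate a new TXOP for subsequent PPDUs).
    return ppdu_index_in_frame > 0
```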
  • the render device may include a video queue to receive MSDUs
  • MSDUs include descriptors to indicate a specific video frame to which the MSDU is associated.
  • the render device provides the MSDUs (in one or more PPDUs) to the display device, and the one or more obtained BAs indicate the MSDUs received by the display device.
  • the render device removes the identified MSDUs from the video queue until all MSDUs are provided to the display device.
  • an MSDU associated with a next video frame may be received at the video queue before all MSDUs associated with a current video frame are successfully provided to the display device.
  • the render device may flush the video queue (skipping the remainder of the current video frame) to begin sending MSDUs of the next video frame.
  • the render device may provide the remaining MSDUs of the current video frame.
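The video-queue behavior above can be sketched as follows, assuming each MSDU carries a frame identifier descriptor. When `flush_on_new_frame` is set, an MSDU of a newer frame flushes the stale remainder of the current frame; otherwise the remaining MSDUs are kept. Class, method, and field names are assumptions.

```python
from collections import deque

# Hypothetical render-device video queue (illustrative sketch).
class VideoQueue:
    def __init__(self, flush_on_new_frame=True):
        self.q = deque()
        self.flush_on_new_frame = flush_on_new_frame

    def enqueue(self, frame_id, msdu):
        # An MSDU of a newer frame may flush the current frame's
        # remainder (skipping it) so the next frame starts fresh.
        if self.flush_on_new_frame and self.q and self.q[0][0] != frame_id:
            self.q.clear()
        self.q.append((frame_id, msdu))

    def ack(self, acked_msdus):
        # Remove the MSDUs a block ack reports as received.
        self.q = deque(item for item in self.q if item[1] not in acked_msdus)
```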
  • the render device may configure itself for a first priority of a first PPDU of a video frame.
  • the render device may adjust one or more of the EDCA parameters (and the BO value), whether the RTS/CTS mechanism is enabled, or the number of retries for RTS/CTS. For example, if the video queue is empty and receives a first MSDU associated with a second video frame, the render device may configure itself for the first set of EDCA parameters (which may be associated with a first PPDU of a video frame) and set the BO counter to the first BO value.
  • the render device may enable RTS/CTS (with the first MSDU to be included in the first PPDU of the video frame).
  • the render device may set the number of retries for RTS/CTS to a defined number (such as five retries).
  • the render device may adjust the EDCA parameters (and the BO value) to those associated with the third priority after transmission of the RTS is completed (such as based on obtaining the CTS) or after the defined number of retries.
  • the render device may disable RTS/CTS or adjust the number of retries for RTS/CTS after transmission of the RTS is completed (such as based on obtaining the CTS) or after the defined number of retries.
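A hypothetical sketch of the priority reconfiguration above. The numeric labels stand in for the first and third sets of EDCA parameters, and the event names are assumptions used only for illustration.

```python
# Assumed labels for the EDCA parameter sets described in the text.
FIRST_PPDU_PRIORITY = 1   # first PPDU of a video frame
SUBSEQUENT_PRIORITY = 3   # subsequent PPDUs of the frame

def next_priority(current, event, retries_used=0, max_retries=5):
    """Illustrative transmitter-side priority switching."""
    if event == "first_msdu_in_empty_queue":
        # New video frame arriving at an empty queue: configure the
        # first set of EDCA parameters (and first BO value).
        return FIRST_PPDU_PRIORITY
    if (event in ("cts_obtained", "rts_retries_exhausted")
            or retries_used >= max_retries):
        # After the RTS completes or retries are exhausted, drop to
        # the third set of EDCA parameters for subsequent PPDUs.
        return SUBSEQUENT_PRIORITY
    return current
```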
  • the first MSDU may include metadata identifying the first MSDU in the first PPDU of a video frame (such as in a subframe header 422 of an MSDU subframe 416 as depicted in Figure 4).
  • the display device obtaining and processing the first PPDU may determine that the PPDU includes MSDUs (including the first MSDU) associated with a new video frame based on the metadata.
  • the last MSDU may include metadata identifying the last MSDU in a last PPDU of a video frame (such as in a subframe header 422 of an MSDU subframe 416 as depicted in Figure 4).
  • the display device obtaining and processing the PPDU may determine that the PPDU includes the last MSDU associated with a video frame based on the metadata.
  • the render device may refrain from including an indication of NAV padding in the last PPDU, or from otherwise using the last PPDU to extend the NAV. In this manner, the render device may release control of the wireless medium at the end of the present NAV.
  • the render device may send a contention free (CF)-End beacon after providing the last MSDU to indicate to receiving devices to clear their NAVs. In this manner, the render device may release control of the wireless medium before the end of the present NAV.
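The end-of-frame medium-release behavior above can be sketched as a small policy function: the last PPDU of a video frame omits the NAV padding indication, and a CF-End may optionally follow to clear receivers' NAVs early. Field names and the 1 ms padding value are illustrative.

```python
# Illustrative end-of-frame policy for releasing the medium.
def build_ppdu_tail(contains_last_msdu, send_cf_end=False):
    return {
        # Omit NAV padding on the frame's last PPDU so the NAV is
        # not extended and the medium is released at the NAV's end.
        "nav_padding_us": None if contains_last_msdu else 1000,
        # Optionally follow the last MSDU with a CF-End so receivers
        # clear their NAVs before the present NAV would expire.
        "followed_by_cf_end": contains_last_msdu and send_cf_end,
    }
```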
  • the render device may shorten one or more TXOPs (such as the first TXOP) from a maximum duration.
  • the render device may shorten the TXOP to ensure that the display device may provide UL data to the render device at a sufficient frequency for the XR experience. Shortening the TXOP also may allow the display device to conserve power by placing one or more components of the WCD into a lower power mode for times outside of the TXOP.
  • a device such as the render device and a second device (such as the display device) may be in a first BSS. While not shown in Figure 10, a device from an OBSS may share the wireless medium with the render device and the display device. Based on the first, second, and third priorities, the OBSS device may be prevented from obtaining control of the wireless medium to transmit some types of data. For example, the OBSS device may classify its traffic as best effort, high priority, and so on. The data to be transmitted by the OBSS device is associated with a fourth priority (which may be associated with a fourth set of EDCA parameters or a fourth BO value for the OBSS device based on the classification of the traffic). The OBSS device may contend for the wireless medium to transmit the data associated with the fourth priority.
  • the render device always may obtain control to transmit the first PPDU based on the first priority.
  • the first set of EDCA parameters (associated with the first priority) causes the render device to have preference over the display device and the OBSS device in obtaining control of the wireless medium. If the data associated with the fourth priority is classified as best effort data, the render device (to transmit subsequent PPDUs associated with a third priority) and the display device (to transmit data associated with a second priority) may have preference over the OBSS device.
  • the EDCA parameters associated with the fourth priority may prevent the OBSS device from transmitting the best effort classified data.
  • the third set of EDCA parameters may be configured so that a device is able to transmit high importance classified data (such as based on a differentiated services code point (DSCP) value), which may be associated with a set of EDCA parameters to allow transmission before transmission of one or more subsequent PPDUs from the render device.
  • the device may be able (based on its EDCA parameters associated with the DSCP value) to transmit the data after time t18 and before the render device is to send the next RTS (which is associated with the third set of EDCA parameters).
  • the EDCA parameters of the device associated with the DSCP value may prevent the device from preempting the render device from sending the RTS associated with the third set of EDCA parameters.
  • the device may attempt to transmit such data (or high importance classified data) at the end of the NAV time.
  • synchronous channel access control is based on target wake time (TWT) sessions defined by one or more APs (such as by one or more 802.11ax enabled or later APs, including IEEE 802.11be and other IEEE standards subsequent to IEEE 802.11ax).
  • synchronous channel access control may be based on synchronized transmission/reception windows (which may not conform to TWT sessions defined in the IEEE 802.11ax standard) coordinated by one or more APs (such as a pre-802.11ax AP (which also may be referred to as a legacy AP)).
  • a TWT session may be between the display device (which may be referred to as a STA regarding the TWT session characteristics) and the render device (which may be referred to as an AP regarding the TWT session characteristics).
  • the TWT session includes one or more TWT service periods (SPs) during which the display device is to remain awake unless determined that no further traffic is to be communicated during the SP.
  • a TWT SP may be referred to herein as a TWT window or a window.
  • a TWT session (also referred to as a TWT) may include a plurality of characteristics.
  • the characteristics include:
  • a solicited TWT (STA initiates the TWT with the AP, which the AP may accept, reject, or provide an alternative to) or an unsolicited TWT (AP sends unsolicited TWT responses based on a known cycle when the STA is listening to the wireless medium);
  • an announced TWT (STA indicates when awake in a TWT SP (such as via a power save (PS)-Poll or an automatic power save delivery (APSD) trigger frame) before the AP can send DL traffic) or an unannounced TWT (AP can send DL traffic without waiting for an indication from the STA during the TWT SP);
  • a trigger-enabled TWT (AP sends a trigger frame to the STA before the STA can send UL traffic) or a non-trigger-enabled TWT (STA can send UL traffic without waiting for a trigger frame);
  • a TWT for synchronous channel access control between a render device and a display device may be an individual TWT, solicited TWT, announced TWT, non-trigger-enabled TWT, and implicit TWT.
  • the TWT is an unsolicited TWT.
  • the TWT may include such characteristics to protect XR activity of the render device and the display device.
  • an individual TWT ensures that the TWT window is set up exclusively for the render device and the display device to communicate DL traffic (and UL traffic) between each other; a solicited TWT and an announced TWT ensure that the display device is ready to receive DL traffic from the render device (to prevent delays associated with retries as a result of the display device not being ready to receive DL traffic); a non-trigger-enabled TWT allows the display device to send pose data frames at a greater frequency (without requiring the overhead of trigger frames from the render device); and an implicit TWT reduces overhead caused by the display device requesting (and the render device providing) the next TWT SP start time. While an example set of characteristics for a TWT is indicated, any suitable set of characteristics may be used.
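The example characteristic set above can be captured as a simple record. This is a descriptive sketch of the dichotomies listed (individual, solicited, announced, trigger-enabled, implicit), not an encoding of the TWT element wire format; all names are assumptions.

```python
from dataclasses import dataclass

# Descriptive record of TWT session characteristics (illustrative).
@dataclass(frozen=True)
class TwtCharacteristics:
    individual: bool       # window exclusive to the render/display pair
    solicited: bool        # STA-initiated setup
    announced: bool        # STA signals wake before AP sends DL traffic
    trigger_enabled: bool  # AP trigger frame required before UL traffic
    implicit: bool         # next SP start derived, not renegotiated

# The example combination described above for protecting XR activity.
XR_TWT = TwtCharacteristics(
    individual=True,
    solicited=True,
    announced=True,
    trigger_enabled=False,  # pose frames sent without trigger overhead
    implicit=True,
)
```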
  • XR activity may refer to any actions performed by one or more devices (such as one or more of a render device or a display device) to provide the XR experience to the user.
  • the XR activity may include measuring the position of the display device by the IMU of the display device, generating the pose frames or packets (also referred to as pose data frames) from the IMU measurements, generating tracking frames, providing pose frames and tracking frames to the render device, rendering and encoding one or more video frames based on the pose frames and tracking frames, packetizing the one or more video frames, providing the packets to the display device, decoding the video frames from the received packets, processing the video frames (such as de-jitter and ATW), and displaying the video frames.
  • XR activity additionally or alternatively may include operations to provide audio, haptic feedback, or other sensory information to the user during the XR experience.
  • synchronous channel access control using TWT is to prevent collisions and interference on the wireless medium by aligning the traffic between the render device and the display device to meet the latency and packet loss requirements for the XR experience.
  • UL traffic may be sent by the display device when available (including outside of the TWT windows). In some implementations, though, the UL traffic is transmitted during the TWT windows.
  • UL traffic may include pose frames and tracking frames from the display device.
  • DL traffic may include PPDUs carrying video frame data from the render device.
  • the PPDUs, pose frames, and tracking frames are transmitted during the TWT windows, and data from other devices (or other data from the render device to the AP or other devices) is transmitted outside of the TWT windows.
  • the concepts described herein may be applied to other types of UL traffic and DL traffic (such as an application file not being associated with video) and other types of devices.
  • the rendering and encoding of each video frame (710) by the render device may be at a known interval and periodicity associated with the frame rate of the display (720).
  • the TWT windows for the TWT session between the render device and the display device may be set up to have a known interval based on the known interval and periodicity for rendering.
  • the render device may be ready to transmit PPDUs of a next video frame at a known periodicity, and timing of the TWT windows may be based on the periodicity.
  • the TWT windows include the time that the wireless medium is allocated for UL and DL data to be transmitted between the render device and the display device.
  • the display device's transmission of UL traffic may be coordinated with when DL traffic is to be transmitted so that both may be transmitted during the TWT windows.
  • the render device may set up the TWT session with the display device based on the rendering of video frames at the render device (with the period and length of the TWT windows indicated to the display device).
  • the display device may coordinate transmitting UL traffic based on the indicated period and length of the TWT windows.
  • the length of the TWT windows allows both UL traffic and DL traffic (also referred to as UL data and DL data, respectively) to be transmitted during a same TWT window.
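Since rendering is periodic at the display frame rate, the TWT wake interval can be derived directly from it. A one-line sketch, with illustrative units (the function name is an assumption):

```python
# Derive the TWT wake interval (in microseconds) from the display
# frame rate, per the periodic-rendering discussion above.
def twt_interval_us(frames_per_second):
    return round(1_000_000 / frames_per_second)
```

At 60 fps the interval is about 16.7 ms; at 90 fps, about 11.1 ms. The window length within each interval would then be sized to fit both the DL video PPDUs and the UL pose/tracking frames.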
  • FIG 11 shows a flowchart illustrating an example process 1100 for synchronous channel access control based on a TWT session according to some implementations.
  • the operations in the example process may be performed by a device (such as the device 154 in Figure 1B) or a wireless communication device implemented in the device (such as the WCD 615 in Figure 6B).
  • the device performing the operations may be a render device of an XR experience.
  • the device obtains UL data from a second device over a wireless medium.
  • the UL data may include one or more pose frames or one or more tracking frames from the second device, which may be a display device (such as an HMD).
  • the device provides, to the second device, DL data including one or more PPDUs.
  • the PPDUs may include video frame data.
  • the device may provide one or more of the PPDUs to the second device (1106).
  • One or more pose frames also may be obtained during the current TWT window. In this manner, both UL data and DL data may be transmitted during the current TWT window.
  • the beginning time of the TWT windows may be adjusted to reduce latency and increase efficiency in using TWT windows. For example, if the TWT window begins an amount of time before either UL data or DL data is transmitted during the TWT window, the second device (such as a display device) is to remain awake and listening for the amount of time of the TWT window not used for transmissions. In addition, if the PPDUs are ready for transmission outside of the TWT windows, the render device must wait until a next TWT window to transmit the PPDUs. If UL data is to be transmitted outside of the TWT windows, the display device must remain in an active power state during times outside of the TWT windows to transmit the UL data.
  • the periodicity of rendering video frames is known (and thus the times at which PPDUs for the video frames become available for transmission are also periodic). If the time at which the PPDUs (such as the first PPDU) are available for transmission during each period is known, the beginning of the TWT window may be determined based on when the PPDUs are available each period. In this manner, a beginning of the current TWT window is associated with one of when a first PPDU of the one or more PPDUs is provided to the second device or when the first PPDU is provided from an application layer to a MAC of the device (1108).
  • the beginning of the current TWT window (and the other TWT windows) is synchronized with the time to transmit DL data and UL data to reduce any latency between the beginning of the TWT window and transmission and to reduce any time the render device is to wait between making one or more PPDUs ready for transmission and transmitting the one or more PPDUs.
  • the DL data and the UL data may be associated with video frames rendered for an XR experience.
  • the device depicted in Figure 11 (such as a render device) renders video frames to be displayed by the second device (such as display device).
  • the UL data from the second device includes pose data frames
  • the rendering of the video frames is associated with the obtained pose data frames.
  • a video frame (such as a first video frame) includes a plurality of video slices, rendering of the first video frame is associated with a pose data frame most recently obtained from the second device, and one or more PPDUs to be transmitted to the second device are associated with one or more of the plurality of video slices.
  • a render device splits a video frame into a plurality of slices, and the render device packages each slice into a plurality of PPDUs to be transmitted during the TWT windows.
  • Each TWT window may be used by the render device to transmit a video frame’s PPDUs to the display device (with the next TWT window used to transmit the next video frame’s PPDUs).
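The slice-and-package flow above can be sketched as follows, assuming a frame is split into a fixed number of slices and each slice is chunked into PPDU payloads. Sizes and names are illustrative, not a real MAC/PHY framing.

```python
# Illustrative slicing/packaging: split a frame into slices, then pack
# each slice into one or more PPDU payloads for one TWT window.
def package_frame(frame_bytes, slice_count, ppdu_payload):
    slice_len = -(-len(frame_bytes) // slice_count)  # ceiling division
    slices = [frame_bytes[i:i + slice_len]
              for i in range(0, len(frame_bytes), slice_len)]
    ppdus = []
    for s in slices:
        # Each slice may need several PPDUs; a slice may be packaged
        # while the next slice is still being processed.
        ppdus += [s[i:i + ppdu_payload] for i in range(0, len(s), ppdu_payload)]
    return ppdus
```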
  • the second device listens for DL data during the TWT windows. If the transmission of DL data and UL data is coordinated so that both are transmitted during the TWT windows, the second device may not need to transmit outside of the TWT windows.
  • the second device (such as a display device) may enter a low power mode (such as powering down or reducing power to one or more components of its wireless communication device) between TWT windows. In this manner, the second device may reduce power consumption.
  • many display devices (such as HMDs) are battery powered to allow a user to move about and not restrict a user’s freedom of movement.
  • the display device may place one or more components of its radio frequency (RF) front end into a low power mode between TWT windows to conserve power and extend the amount of time the display device may be in use between charges.
  • the render device may determine to continue to transmit to the display device outside of the TWT windows. For example, the render device may determine that one or more PPDUs of a video frame cannot be provided to the display device during the current TWT window. In this manner, the one or more PPDUs may be provided after the current TWT window. In some implementations, the render device may provide an indication to the display device not to enter into a low power mode (TWT power save mode) after the current TWT window.
  • the indication may be included in a power management (PM) field of a packet’s MAC header (such as a PM bit of the frame control field being set to 0 to indicate that the display device is not to enter into the low power mode) provided to the display device during the current TWT window (such as the last packet provided to the display device during the window).
  • the display device processes the packet’s header and determines that entering the low power mode is to be prevented after the current TWT window.
  • the render device may provide, to the display device, a PPDU after the current TWT window and before a next TWT window.
  • the render device may terminate a current TWT window early if no additional PPDUs are to be transmitted during the current TWT window.
  • the render device may indicate to the display device that the current TWT window is being ended early.
  • the end of service period (EOSP) bit of the Quality of Service (QoS) field of the MAC header of a packet may be set to 1 to indicate that the TWT window is ending.
  • the packet may include a null data frame and be used exclusively to indicate the end of the TWT window via the QoS field’s EOSP bit (which may be referred to as an EOSP packet).
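The two MAC-header indications above can be sketched at the bit level, assuming the IEEE 802.11 layouts in which the Power Management bit is bit 12 of the Frame Control field and the EOSP bit is bit 4 of the QoS Control field. The helper names are illustrative.

```python
PM_BIT = 1 << 12   # Frame Control: Power Management bit
EOSP_BIT = 1 << 4  # QoS Control: End Of Service Period bit

def set_pm(frame_control, stay_awake):
    # PM = 0 tells the display device not to enter the low power mode
    # after the current TWT window; PM = 1 permits it.
    return frame_control & ~PM_BIT if stay_awake else frame_control | PM_BIT

def set_eosp(qos_control, ending):
    # EOSP = 1 announces that the TWT window is ending early.
    return qos_control | EOSP_BIT if ending else qos_control & ~EOSP_BIT
```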
  • an EOSP packet can be sent after more than 10 ms of inactivity during the TWT window.
  • the render device is configured to send the EOSP packet after 10 ms of inactivity after the last PPDU.
  • the TWT window may end before 10 ms of inactivity has elapsed.
  • an MSDU may include metadata to indicate the MSDU being the last MSDU for the video frame.
  • the display device determines that no more PPDUs are to be received from the render device for the video frame, and the display device may enter a low power mode until the next TWT window (effectively ending the TWT window for the display device).
  • the render device may be configured to send the EOSP packet after less than 10 ms of inactivity (such as immediately after the last PPDU).
  • the render device may determine to send the EOSP packet based on the metadata of the last MSDU when preparing the PPDUs for transmission to the display device.
  • the low power mode (TWT power save mode) referred to herein may differ from one or more sleep modes defined in the IEEE 802.11 standard (such as pre-802.11ax standards, including 802.11ba, etc.).
  • the low power mode (TWT power save mode) may be associated with a faster sleep to wake (S2W) and wake to sleep (W2S) time than the standard defined sleep mode (referred to as deep sleep) designed for best effort classified traffic based on delivery traffic indication messages (DTIMs) in an AP’s beacons.
  • the smallest low power period (including S2W and W2S) for the low power mode may be approximately 10 ms
  • the smallest deep sleep period (including S2W and W2S) for deep sleep may be approximately 40 ms (which may be too long for XR activity for video at 60 fps (approximately 16 ms per video frame)).
  • the low power mode may differ from deep sleep by powering down fewer RF front end components, maintaining a carrier signal lock of the wireless medium, or other operations to reduce the amount of time to enter into and exit from the low power mode.
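The arithmetic behind the comparison above: at 60 fps the inter-frame gap is about 16.7 ms, which can contain the approximately 10 ms minimum TWT power-save cycle but not the approximately 40 ms minimum deep sleep cycle. A sketch using the approximate figures from the text:

```python
# Check whether a sleep mode's minimum cycle (including S2W and W2S)
# fits within the gap between TWT windows at a given frame rate.
def sleep_fits_between_windows(frame_interval_ms, min_sleep_cycle_ms):
    return min_sleep_cycle_ms <= frame_interval_ms

FRAME_INTERVAL_60FPS_MS = 1000 / 60  # approximately 16.7 ms per frame
```

With these numbers, the ~10 ms TWT power-save cycle fits in the 60 fps frame interval, while the ~40 ms deep sleep cycle does not.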
  • Figure 11 depicts an example process from the render device perspective.
  • Figure 12 depicts the example process from the display device perspective.
  • FIG 12 shows a flowchart illustrating an example process 1200 for synchronous channel access control based on a TWT session according to some implementations.
  • Operations in the example process may be performed by a device (such as the device 152 in Figure 1B) or a wireless communication device implemented in the device (such as the WCD 615 in Figure 6B).
  • the device performing the operations may be a display device of an XR experience.
  • the device provides UL data to a second device over a wireless medium.
  • the second device may be a render device, and the UL data may include one or more of a pose data frame or a tracking frame.
  • the device obtains, from the second device, DL data including one or more PPDUs.
  • the PPDUs are associated with one or more video frames rendered based on the UL data.
  • the device obtains one or more PPDUs from the second device during a current TWT window (1206), and a beginning of the current TWT window is associated with one of when a first PPDU of the one or more PPDUs is provided to the device or when the first PPDU is provided from an application layer to a MAC of the second device (1208).
  • the beginning of a TWT window is associated with when a first PPDU is to be transmitted.
  • the beginning of the TWT window may coincide with when the first PPDU is to be transmitted.
  • the TWT window may begin before when the first PPDU is to be transmitted.
  • the display device may provide one or more of a tracking frame or one or more pose data frames for each video frame to be rendered.
  • the frequency of providing the tracking frames or one or more pose data frames is the same as the frequency of the video frames.
  • the pose data frames may be provided at a first frequency, and the video frames may be rendered at the first frequency.
  • the display device may provide a pose data frame before when the first PPDU of a video frame is ready for transmission.
  • a TWT window begins when the tracking frame or a first of one or more pose data frames is to be transmitted from the display device to the render device. For example, when the TWT window is to begin may be based on an M2R latency for the XR experience.
  • FIG. 13 shows a sequence diagram 1300 illustrating example timings of pose data frames and rendering of video frames associated with an M2R latency.
  • the display device obtains pose information. For example, the display device determines to package the current IMU measurement into a pose data frame N-1 (for any integer N greater than 0) and provide the pose data frame N-1 to the render device.
  • UL latency 1304 is the latency between the time of the IMU measurement and the time the render device obtains the pose data frame N-1 (at 1306).
  • UL latency 1304 may vary based on the amount of UL data to transmit and the conditions of the wireless medium.
  • the UL latency 1304 may be greater as a result of retries in sending the pose frame N-1 or using a lower MCS rate to transmit the pose frame N-1.
  • After obtaining the pose frame N-1 (1306), the render device begins rendering video frame N-1 (1308).
  • Time 1310 indicates the rendering time to render video frame N-1. While not depicted, the rendering time may include time for rendering and encoding of the video frame.
  • the render device provides video frame N-1 to the display device in a plurality of PPDUs.
  • Time 1312 is the time during which the PPDUs are provided from the render device to the display device.
  • the PPDUs may be provided before or after encoding of the entire frame is completed.
  • a video slice may be packaged into one or more PPDUs concurrently with another video slice being processed before packaging into one or more PPDUs.
  • time 1312 may overlap time 1310.
  • the display device again obtains pose information. For example, the display device determines to package the current IMU measurement at that time into pose data frame N.
  • the pose data frame N is obtained by the render device at time 1320 (with the UL latency 1318 being the latency between time 1316 and time 1320).
  • UL latency 1318 is depicted as being greater than UL latency 1304 to depict that UL latencies may vary.
  • the render device begins rendering video frame N.
  • the video may have a specific frame rate, and the interval between video frame renderings may be fixed based on the frame rate, resolution, encoding, and other video parameters.
  • the amount of time between times 1308 and 1322 may be the same as the amount of time between time 1322 and a time to begin rendering video frame N+l.
  • Time 1324 indicates the rendering time to render video frame N.
  • the render device provides video frame N to the display device in a plurality of PPDUs.
  • Time 1326 is the time during which the PPDUs are provided from the render device to the display device. While not shown, times 1312 and 1326 may be variable based on channel conditions, channel size, MCS, use of forward error correction (FEC), number of PPDUs to be transmitted, or other characteristics.
  • the render device renders a current frame based on the most recent pose frame obtained from the display device. For video frame N, time 1322 is before time 1320 (when pose frame N is obtained). In this manner, the render device does not render video frame N based on pose frame N (since pose frame N is not yet obtained).
  • the render device renders video frame N based on pose frame N-1.
  • the M2R latency may be a latency between the IMU measurement and rendering a video frame associated with the IMU measurement.
  • the M2R latency 1314 is larger than it would be if pose frame N had been received in time.
  • the M2R latency may be determined based on a predefined or known UL latency.
  • a TWT window 1328 may begin at time 1302
  • the length of the TWT window 1328 may be fixed to include through time 1312. Since time 1312 may vary, the TWT window 1328 may be of a length to accommodate variations in the time 1312 (such as a length to allow transmission of the video frame based on a maximum resolution, minimum MCS, minimum channel size, and so on).
  • the render device also may indicate that one or more PPDUs are to be transmitted after the end of a TWT window to transmit PPDUs unable to be transmitted during the TWT window.
  • the end of the TWT window 1328 may be between the end of time 1312 and time 1316. While not shown, a new TWT window may begin at time 1316.
  • the sequence diagram 1300 (and other sequence diagrams depicted) are not to scale and may vary in operations.
  • the DL of video frame N-1 (1312) may be combined with the UL of pose frame N (1320) into one TWT window.
  • timing of providing pose frames and rendering video frames is not coordinated, which may cause a video frame to be rendered before receiving a new pose data frame and increase the M2R latency.
  • the render device may coordinate rendering of video frames or the display device may coordinate providing pose frames to the render device so that one or more pose frames are obtained before rendering a new video frame (such as the render device and the display device coordinating so that time 1322 occurs after time 1320).
  • Figure 14 depicts timings that are coordinated to reduce the M2R latency.
  • FIG 14 shows a sequence diagram 1400 illustrating example timings of pose data frames and rendering of video frames associated with an M2R latency.
  • the display device obtains pose information. For example, the display device determines to package the current IMU measurement into a pose data frame N-1 (for any integer N greater than 0) and provide the pose data frame N-1 to the render device.
  • UL latency 1404 is the amount of time from the time of the IMU measurement to the time the render device obtains the pose frame N-1 (at 1406).
  • UL latency 1404 may vary based on the amount of information to transmit and the conditions of the wireless medium.
  • the UL latency 1404 may be greater as a result of retries in sending the pose frame N-l or using a lower MCS rate to transmit the pose frame N-l .
  • the render device After obtaining the pose frame N-l (1406), the render device begins rendering video frame N-l (1408).
  • Time 1410 indicates the rendering time to render video frame N-1.
  • the render device provides video frame N-1 to the display device in a plurality of PPDUs.
  • Time 1412 is the time during which the PPDUs are provided from the render device to the display device.
  • the display device obtains pose information and provides the pose data frame N to the render device.
  • UL latency 1418 is the amount of time from the time of the IMU measurement to the time the render device obtains the pose data frame N (at 1420).
  • UL latency 1418 for pose frame N is greater than UL latency 1404 to depict that UL latencies may vary.
  • the render device begins rendering video frame N.
  • Time 1424 indicates the rendering time to render video frame N.
  • the render device provides video frame N to the display device in a plurality of PPDUs.
  • Time 1426 is the time during which the PPDUs are provided from the render device to the display device.
  • TWT window 1428 may be the same as TWT window 1328 in Figure 13.
  • the amount of time between times 1408 and 1422 may be the same as the amount of time between 1308 and 1322 in Figure 13.
  • the difference between Figures 13 and 14 may be the timing between providing pose frames by the display device and rendering video frames by the render device.
  • a timing of rendering the video frames is coordinated with a timing of the display device providing the pose frames to the render device. For example, time 1422 is determined so that time 1422 remains after time 1420.
  • pose frame N is obtained before rendering video frame N, and pose frame N may be used to render video frame N.
  • the timing may be based on an UL latency for the pose frames (such as a potential UL latency based on the lowest MCS, smallest wireless channel, a level of noise, use of FEC, and other parameters that may slow the throughput of transmitting the pose data frame to the render device).
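The coordination rule above, rendering only after the worst-case pose arrival, can be sketched as follows. The function name, the millisecond values, and the trigger overhead are illustrative assumptions, not the source's implementation.

```python
# Illustrative sketch: choosing a render start time so that a pose frame sent at
# the start of the TWT window arrives before rendering begins. Values are assumed.

def render_start_time(pose_tx_time_ms, worst_case_ul_latency_ms, trigger_overhead_ms=0.2):
    """Render begins only after the worst-case pose arrival plus the time needed
    to pass the pose up from the WCD to the application layer."""
    return pose_tx_time_ms + worst_case_ul_latency_ms + trigger_overhead_ms

t_render = render_start_time(pose_tx_time_ms=0.0, worst_case_ul_latency_ms=2.0)
# t_render follows the latest possible pose arrival (2.0 ms after transmission).
```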
  • the M2R latency 1414 is reduced.
  • the M2R latency 1414 is less than the M2R latency 1314 in Figure 13.
  • the beginning of the TWT windows may be based on the M2R latency.
  • the render device may determine a TWT window to begin a first offset before the time to render a video frame.
  • the first offset may be the M2R latency (such as M2R latency 1414 before time 1408 to be the beginning of the TWT window 1428).
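The offset relationship above, each TWT window beginning a first offset (the M2R latency) before the corresponding render time, can be expressed as a small sketch. The render times and latency value are illustrative assumptions.

```python
# Hedged sketch: deriving TWT window start times as a first offset (here, the
# M2R latency) before each scheduled render time. All values are illustrative.

def twt_window_starts(render_times_ms, m2r_latency_ms):
    """Each TWT window begins m2r_latency_ms before its render time."""
    return [t - m2r_latency_ms for t in render_times_ms]

starts = twt_window_starts([11.1, 22.2, 33.3], m2r_latency_ms=5.0)
```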
  • pose data frames may be obtained or provided and video frames may be rendered at the same frequency (a first frequency).
  • more than one pose data frame may be provided during a TWT window (such as providing a pose data frame in the middle or towards the end of the TWT window), but at least the pose data frames provided at the beginning of the TWT windows may be provided at the first frequency.
  • each of the pose data frames obtained at the beginning of the TWT windows may be associated with a rendered video frame.
  • Additional pose data frames may be provided during a TWT window in case there are issues obtaining the first pose data frame during the TWT window (such as based on temporary interference on the wireless medium).
  • the render device may use the intermediate pose frame (instead of pose frame N-1) to render video frame N.
  • Providing additional pose frames may reduce the latency between when the pose information was obtained and when the video frame is rendered in such instances.
  • obtaining the pose frames and rendering the video frames are at the same frequency, and timing between rendering video frames and obtaining pose frames is coordinated to reduce the M2R latency.
  • Obtaining pose information is performed at an application layer of the display device, and rendering video frames is performed at an application layer of the render device.
  • the application processor 630 (in conjunction with the memory 640) may execute an XR application on the render device (such as a VR or AR application on a smartphone).
  • the application processor 635 may execute an XR application of the display device (such as a VR or AR application on a HMD).
  • the render device renders the video frames or generates audio, haptic feedback, or other information for the XR experience.
  • the display device obtains IMU measurements (such as from the sensors 675), displays the video, plays the audio, or provides other information to the user for the XR experience.
  • Operations at the application layer are based on an application layer clock of the device.
  • the device may use a first piezoelectric material (such as a crystal) to generate a host clock provided to the application processor for performing application layer operations.
  • the application layer clock is used by the device for timing of rendering video frames (by the render device) or for timing of displaying the video frames (by the display device).
  • Such a clock may be referred to as the application layer clock.
  • the WCD 610 or 615 is used to perform operations at the lower layers of the OSI model (such as at the MAC). For example, managing and scheduling pose data frames, tracking frames, or PPDUs for transmission and managing receipt of packets may be at the MAC and managed by the WCD 610 or the WCD 615.
  • Operation of the WCD is based on a second clock (referred to as the WCD clock).
  • the WCD clock is used by the device for timing of wireless communications with a second device.
  • the WCD clock is based on a second piezoelectric material (such as a second crystal) of the device different than for the application layer clock. Since a device may use two different crystals (and thus different logic) to generate the application layer clock and the WCD clock, the application layer clock and the WCD clock may be at different frequencies, resolution, or phases.
  • the display device and the render device may synchronize the application layer clock and the WCD clock.
  • the application layer clock and the WCD clock may have the same frequency and phase.
  • the application layer clock may be synchronized to the WCD clock.
  • the WCD clock may be synchronized to the application layer clock.
  • the clocks of one device also may be synchronized to the clocks of the other device. In this manner, one of the four clocks between the render device and the display device is the reference clock, and the other three clocks are synchronized to the reference clock.
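The four-clock model above (one reference clock, three clocks slewed to it) can be sketched as below. The `Clock` class, the offset representation, and the numeric offsets are illustrative assumptions, not the devices' actual clock hardware or API.

```python
# Minimal sketch of the four-clock model: one clock is chosen as the reference
# and the other three are synchronized to it. Offsets are illustrative.

class Clock:
    def __init__(self, name, offset_us=0):
        self.name = name
        self.offset_us = offset_us  # offset relative to an ideal timeline
    def sync_to(self, reference):
        self.offset_us = reference.offset_us

clocks = {n: Clock(n, o) for n, o in [
    ("display_app", 0), ("display_wcd", 40), ("render_wcd", -25), ("render_app", 60)]}

reference = clocks["display_app"]  # e.g., Figure 15A: display application layer clock
for clock in clocks.values():
    if clock is not reference:
        clock.sync_to(reference)
# All four clocks now share the reference clock's timeline.
```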
  • the TWT window timings may be determined based on the reference clock.
  • the application layer clock of the display device may drive timing of the TWT windows if the display device’s application layer clock is the reference clock, or the application layer clock of the render device may drive timing of the TWT windows if the render device’s application layer clock is the reference clock. If the WCD clock is to be used to synchronize the other clocks, a schedule of the TWT windows may be set, and one or more of the render timing or the display timing of video frames may be adjusted based on the set schedule of TWTs.
  • Figures 15A-15C depict different clocks being the reference clock for synchronization.
  • the application layer clock of the display device is depicted as the reference clock in Figure 15A
  • the application layer clock of the render device is depicted as the reference clock in Figure 15B
  • the WCD clock of the render device or the display device is depicted as the reference clock in Figure 15C.
  • Figure 15A shows a block diagram 1500 illustrating an example of synchronizing the clocks of the render device 1502 and the display device 1508.
  • the render device 1502 includes the application layer clock 1504 and the WCD clock 1506, and the display device 1508 includes the application layer clock 1510 and the WCD clock 1512.
  • the application layer clock 1510 is the reference clock, and the other clocks are synchronized to the application layer clock 1510.
  • the WCD clock 1512 is synchronized to the application layer clock 1510.
  • a time of the application layer clock 1510 may be indicated in the pose information or tracking frames to be provided to the render device 1502.
  • the information is provided from the application layer to the MAC (such as to the WCD) to be scheduled and transmitted to the render device 1502.
  • the WCD may obtain the time from the obtained information from the application layer and synchronize the WCD clock based on the indicated time.
  • the WCD clock 1506 synchronizes to the WCD clock 1512.
  • the WCD of the display device 1508 includes a local timing synchronization function (TSF) timer for synchronizing communications over the wireless medium with the render device 1502.
  • the WCD of the render device 1502 also includes a local TSF timer.
  • the WCD of the display device 1508 periodically provides an indication of its TSF timer value to the WCD of the render device 1502 (such as via the pose data frames or via a beacon frame from the display device 1508 to the render device 1502).
  • Synchronizing the WCD clock 1512 to the application layer clock 1510 causes the TSF timer of the WCD of the display device 1508 to be adjusted.
  • the adjusted TSF timer value is indicated to the WCD of the render device 1502.
  • the render device 1502 may adjust its WCD clock based on the adjusted TSF timer values from the display device 1508.
  • the render device 1502 may synchronize the application layer clock 1504 to the WCD clock 1506 (1518).
  • the render device 1502 may be configured to make the TSF timer values available to the application layer (such as via a call from the MAC to the application layer configured for the render device 1502). If the TSF timer is not visible at the application layer, the timing information from the packets obtained from the display device (such as the indicated TSF timer values and the times at which the packets are transmitted) may be used in synchronizing the application layer clock 1504.
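When the TSF timer is not visible at the application layer, the application layer clock offset may be estimated from timing information in received packets, as described above. The simple averaging below is an assumed estimator for illustration, not a method mandated by the source.

```python
# Illustrative sketch: estimating an application layer clock offset from TSF
# timer values indicated in received packets paired with local receive times.
# Plain averaging is an assumption; a real implementation might filter outliers.

def estimate_offset_us(samples):
    """samples: list of (indicated_tsf_us, local_app_time_us) pairs.
    Returns the mean offset to apply to the application layer clock."""
    return sum(tsf - local for tsf, local in samples) / len(samples)

offset = estimate_offset_us([(1000, 990), (2000, 1992), (3000, 2989)])
# The application layer clock would then be advanced by roughly this offset.
```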
  • Figure 15B shows a block diagram 1520 illustrating an example of synchronizing the clocks of the render device 1522 and the display device 1528.
  • the render device 1522 includes the application layer clock 1524 and the WCD clock 1526
  • the display device 1528 includes the application layer clock 1530 and the WCD clock 1532.
  • the application layer clock 1524 is the reference clock
  • the other clocks are synchronized to the application layer clock 1524.
  • the WCD clock 1526 is synchronized to the application layer clock 1524.
  • a time of the application layer clock 1524 may be indicated in the video frames or other information to be provided to the display device 1528.
  • the information is provided from the application layer to the MAC (such as to the WCD) to be scheduled and transmitted to the display device 1528.
  • the WCD may obtain the time from the obtained information from the application layer and synchronize the WCD clock based on the indicated time.
  • the WCD clock 1532 synchronizes to the WCD clock 1526
  • the WCD of the render device 1522 periodically provides an indication of its TSF timer value or other timing information to the WCD of the display device 1528 (such as via one or more MAC control elements (CEs) of the PPDUs for the video frames), and the local TSF timer of the display device 1528 may be adjusted based on updated timing information after synchronizing the WCD clock 1526 to the application layer clock 1524.
  • the display device 1528 may synchronize the application layer clock 1530 to the WCD clock 1532 (1538).
  • the display device 1528 may be configured to make the TSF timer values available to the application layer (such as via a call from the MAC to the application layer configured for the display device 1528). If the TSF timer is not visible at the application layer, the timing information from the packets obtained from the render device (such as the indicated TSF timer values and the times at which the packets are transmitted) may be used in synchronizing the application layer clock 1530.
  • Figure 15C shows a block diagram 1540 illustrating an example of synchronizing the clocks of the render device 1542 and the display device 1548.
  • the render device 1542 includes the application layer clock 1544 and the WCD clock 1546
  • the display device 1548 includes the application layer clock 1550 and the WCD clock 1552.
  • the WCD clock 1546 or 1552 is the reference clock, and the other clocks are synchronized to the WCD clock.
  • the WCD clocks 1546 and 1552 are synchronized to each other.
  • the TSF timers between the devices may be synchronized, with timing information provided between the devices (such as described above with reference to Figures 15A and 15B).
  • the render device 1542 synchronizes the application layer clock 1544 to the WCD clock 1546 (such as described above with reference to Figure 15A).
  • the display device 1548 synchronizes the application layer clock 1550 to the WCD clock 1552 (such as described above with reference to Figure 15B).
  • Referring back to Figure 15A, if the application layer clock 1510 is the reference clock, the application layer clock 1510 initially may be set by the display device 1508 and may not need to be adjusted during operation.
  • the display device 1508 may determine the TWT session schedule between the render device 1502 and the display device 1508.
  • the schedule may include the interval of the TWT windows, the size of the TWT windows, and the start times of the TWT windows.
  • the display device 1508 may align UL traffic to the TWT windows (such as aligning the transmission of pose frames to the beginning of the TWT windows).
  • the timing of pose information being packaged and provided to the render device 1502 is based on the TWT schedule that is determined based on the application layer clock 1510.
  • the specific IMU measurements that are to be used to generate pose frames may be based on the TWT schedule that is determined based on the application layer clock 1510.
  • the render device 1502 may align DL traffic to the TWT windows (such as aligning the transmission of the PPDUs to a portion of the TWT windows). For example, referring back to Figure 14, the display device determines the times 1402 and 1416 to be at the beginning of the TWT windows (including TWT window 1428) based on the TWT schedule.
  • the render device may coordinate times 1408 and 1422 (to render video frames N-l and N) to be a first offset from times 1402 and 1416, respectively (with the first offset associated with the M2R latency).
  • M2R latency 1414 may be known.
  • the first offset may be the M2R latency 1414.
  • the render device may coordinate time 1408 to be the M2R latency 1414 after time 1402.
  • the render device may trigger times 1408 and 1422 based on obtaining the corresponding pose frames.
  • triggering rendering video frame N-1 may be based on obtaining pose frame N-1 (at time 1406).
  • the difference between times 1406 and 1408 is based on the amount of time needed to trigger rendering the video frame (such as processing the pose frame and providing the pose information by the WCD to the application layer before rendering the video frame at the application layer).
  • If the render device fails to obtain the pose frame (such as based on interference on the wireless medium), the render device does not provide an ACK to the display device.
  • the display device may retry to provide the pose frame one or more times (such as up to a maximum number of times or up to a timeout amount of time from the beginning of the TWT window).
  • the timeout may be based on when the render device is to begin rendering the video frame. For example, the timeout time may be the amount of time from the beginning of the TWT window to the time when the render device is to begin rendering a video frame (such as from time 1402 to time 1408).
  • the timeout time may be determined by the render device or may be defined in the TWT session. In this manner, the render device begins counting from the beginning of each TWT window, and the render device triggers rendering the video frame at the first of reaching the timeout time or obtaining the pose frame. If a timeout occurs (with the timeout time being reached before obtaining the pose frame), the render device may use the last obtained pose frame obtained before the TWT window (such as during the last TWT window) to generate the video frame.
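The trigger rule above, rendering at the first of pose arrival or timeout, with fallback to the last obtained pose, can be sketched as follows. The function and pose names are illustrative assumptions.

```python
# Hedged sketch of the render trigger: rendering begins at the first of (a) the
# pose frame arriving or (b) a timeout measured from the TWT window start; on
# timeout, the last pose from a prior window is reused. Names are illustrative.

def choose_pose(twt_start_ms, timeout_ms, pose_arrival_ms, new_pose, last_pose):
    deadline = twt_start_ms + timeout_ms
    if pose_arrival_ms is not None and pose_arrival_ms <= deadline:
        return new_pose   # pose obtained in time: render from the fresh pose
    return last_pose      # timeout reached first: fall back to the stale pose
```

For example, with a 6 ms timeout, a pose arriving 2.5 ms into the window is used; a pose that never arrives (or arrives after the deadline) causes the previous pose to be reused.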
  • the display device may indicate when the render device is to begin rendering a video frame.
  • the render device triggers rendering each video frame based on an explicit indication from the display device.
  • a vertical synchronization (Vsync) value at the application layer of the display device may be indicated to the WCD to indicate the time the render device is to render a video frame.
  • the time based on the Vsync may be indicated between the WCDs, and the render device may convert the indicated time to an application layer time to render the video frame.
  • the application layer clock 1524 initially may be set by the render device 1522 and not need to be adjusted during operation. Since the application layer clock 1524 is not adjusted during operation and rendering of the video frames is based on the application layer clock 1524 of the render device 1522, rendering of the video frames may remain constant during operation (with other clocks 1526, 1532, and 1530 being adjusted to remain synchronized to the application layer clock 1524).
  • the render device 1522 may determine the TWT session schedule between the render device 1522 and the display device 1528.
  • the schedule may include the interval of the TWT windows, the size of the TWT windows, and the start times of the TWT windows.
  • the render device 1522 may determine a start time of a TWT window to be an offset before the defined render time for a video frame. For example, referring back to Figure 14, times 1408 and 1422 may be set based on the application layer clock 1524 (Figure 15B).
  • the render device may determine times 1402 and 1416 (which are the beginning of the TWT windows) to be a first offset (associated with the M2R latency) before the set render times 1408 and 1422.
  • the first offset may be the M2R latency 1414, such as described above.
  • the M2R latency (and the first offset) may be determined to account for UL latency.
  • the display device 1528 may align UL traffic to the TWT windows. For example, referring back to Figure 14, the display device may determine the times 1402 and 1416 to be at the beginning of the TWT windows (including TWT window 1428) based on the TWT schedule indicated by the render device. The display device 1528 also may align the display times for the video frames based on the application layer clock 1530. In this manner, the display time of a current video frame is aligned to a time associated with a current TWT window.
  • the WCD clocks 1546 and 1552 initially may be set and synchronized based on TSF information provided between the render device 1542 and the display device 1548.
  • the render device 1542 may provide timing information to the display device 1548, and the local TSF timers may be synchronized based on the timing information.
  • the application layer clock 1544 and the application layer clock 1550 are adjusted based on the respective WCD clock.
  • the TWT session schedule may be based on network resource availability (in addition to the latency requirements for XR traffic).
  • the TWT window size, interval, and start times may be based on the wireless medium being shared by concurrent links from the render device to an AP and to the display device.
  • the TWT window size, interval, and start times may be based on the wireless medium being shared by multiple links between the render device and multiple display devices (such as in a mesh network).
  • the TWT session schedule may be determined either by the render device or the display device, and the render device and the display device may coordinate application layer operations based on the TWT session schedule.
  • the render device may set times 1408 and 1422 to be a first offset after the beginning of the respective TWT window (such as a first offset equal to the M2R latency 1414 after time 1402 for time 1408).
  • the render device may trigger rendering based on obtaining a pose frame or a timeout occurring (such as described above).
  • the display device may determine the time to display the video frames based on the TWT session schedule.
  • the synchronized clocks may drift from one another.
  • the render device or the display device may periodically synchronize the clocks as a result of the drift.
  • the WCD clocks may remain synchronized based on the TSFs, but the application layer clock at the display device or the render device may drift from the respective WCD clock.
  • the device measures the drift between the application layer clock and the WCD clock and determines whether the drift is greater than a defined threshold (such as more than 100 µs). If the drift becomes greater than the defined threshold, the device again may synchronize the application layer clock and the WCD clock. Synchronization may be performed as described above with reference to Figures 15A - 15C and based on which clock is the reference clock. If the device includes the reference clock, the other device also may synchronize its clocks based on the synchronization.
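The drift check above can be sketched as a threshold comparison. The function name and the snap-to-WCD correction are illustrative assumptions; the threshold value follows the example in the text.

```python
# Illustrative sketch of the drift check: re-synchronize the application layer
# clock to the WCD clock only when the measured drift exceeds a defined
# threshold (100 microseconds in the example above).

DRIFT_THRESHOLD_US = 100

def maybe_resync(app_time_us, wcd_time_us):
    """Returns the (possibly corrected) application layer time."""
    drift = app_time_us - wcd_time_us
    if abs(drift) > DRIFT_THRESHOLD_US:
        return wcd_time_us  # snap the application layer clock back to the WCD clock
    return app_time_us      # drift tolerated; no adjustment needed
```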
  • the display device may provide more than one pose frame during a TWT window.
  • a first pose frame may be provided at the beginning of a TWT window.
  • the display device also may provide one or more additional pose frames later in the TWT window.
  • the display device may provide an additional pose frame after a defined time of inactivity on the wireless medium during the TWT window.
  • the display device may provide an additional pose frame towards the end of the TWT window or after the render device indicates that no further PPDUs are to be provided (such as based on an indication in an MSDU indicating that the MSDU is the last MSDU for the video frame).
  • Each pose frame may be based on a new IMU measurement.
  • the IMU measurements occur independent from packaging the position information as a pose frame to provide to the render device.
  • the IMU may measure the position information at a frequency of 1 kHz (every 1 ms).
  • the display device may use the most recent IMU measurement to generate a pose frame when needed, and the display device may ignore the other IMU measurements for the pose frames.
  • the latency between the IMU measurement and generating the pose frame is up to 1 ms.
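The sampling relationship above, a free-running 1 kHz IMU whose most recent sample is packaged when a pose frame is needed, can be sketched as below. The function name is an illustrative assumption; the 1 ms period comes from the text.

```python
# Sketch of the 1 kHz IMU sampling relationship: the pose frame uses the most
# recent measurement, so the measurement-to-pose-frame latency is bounded by
# one sample period (1 ms). Names and times are illustrative.

IMU_PERIOD_MS = 1.0  # 1 kHz sampling

def latest_measurement_time(pose_frame_time_ms):
    """Timestamp of the most recent IMU sample at or before the pose frame time."""
    return (pose_frame_time_ms // IMU_PERIOD_MS) * IMU_PERIOD_MS

t = 7.3
staleness = t - latest_measurement_time(t)  # always less than one sample period
```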
  • the IMU measurements may be based on when the pose frames are to be generated. For example, the IMU measurements may be triggered based on a timing of providing pose frames to the render device.
  • timing of communications is managed by the render device and the display device to ensure latency requirements are met for the XR traffic (such as to ensure a smooth display of video frames and to ensure a M2R2P latency is met for the XR experience).
  • the render device may handle XR traffic differently than other traffic (such as best effort classified traffic to another device). In this manner, handling of data and communication of such data may be based on the application (such as whether XR related data or not XR related data).
  • For best effort classified traffic, stations typically provide the traffic at the MSDU level to a first in first out (FIFO) queue of the WCD for transmission.
  • the management of the MSDUs in the queue is without concept of data delivery deadlines or latency requirements, and each MSDU is managed independently from other MSDUs.
  • an application file may require more than one MSDU.
  • time sensitivity of delivering application data (such as delivering a video frame within a defined amount of time) may require the MSDUs carrying the application data to be delivered within a defined amount of time. If any MSDUs are lost or late in being obtained by the display device, all other MSDUs obtained for the application file may not be processed and may be useless to the display device.
  • FIG. 16 shows a flowchart illustrating an example process 1600 for managing data for transmission according to some implementations.
  • the example depicts the data as video frames of a video for an XR experience, but the data may be any suitable data of an application file (which may be for or not for an XR experience).
  • the suitable data may be audio data, haptic data, or other data for an XR experience.
  • the suitable data may be other data packaged into a plurality of MSDUs to be transmitted to a second device.
  • the device performing the operations in process 1600 may be a render device, and the second device may be a display device.
  • the device renders a plurality of video frames to be provided to a second device.
  • the device splits each video frame of the plurality of video frames into a plurality of video slices.
  • the device generates, for each video slice, a plurality of PPDUs to include the video slice.
  • Each PPDU includes one or more MSDUs associated with the video slice (1608).
  • the video slice associated with the one or more MSDUs may be identified by a port number and a DSCP value included in each MSDU (1610).
  • the port number may be the ID of the source port to be used in transmitting the MSDU, and the DSCP value may be a value to indicate specific video frames of the XR experience. In some implementations, the DSCP value is incremented for each successive video frame.
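The tagging scheme above, a source port plus a DSCP value incremented per successive video frame, can be sketched as follows. The DSCP base value, the number of reserved code points, and the function name are illustrative assumptions, not values from the source.

```python
# Illustrative sketch of MSDU tagging: each MSDU carries a source port and a
# DSCP value, with the DSCP incremented for each successive video frame and
# cycling through an assumed reserved range.

DSCP_BASE = 32   # assumed starting code point reserved for XR video
DSCP_RANGE = 8   # assumed number of reserved values to cycle through

def msdu_tag(frame_index, src_port):
    """(port, dscp) pair identifying which video frame an MSDU belongs to."""
    return (src_port, DSCP_BASE + (frame_index % DSCP_RANGE))

tag_q  = msdu_tag(frame_index=0, src_port=5004)
tag_q1 = msdu_tag(frame_index=1, src_port=5004)  # next frame: DSCP incremented
```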
  • the device queues, for each video slice, the MSDUs for transmission to the second device. In some implementations, a queue is generated for each video slice.
  • Process 1600 depicted in Figure 16 is from the render device perspective.
  • Process 1700 depicted in Figure 17 may be similar to process 1600 but from the display device perspective.
  • FIG. 17 shows a flowchart illustrating an example process 1700 for managing data for transmission according to some implementations.
  • the device performing the operations in process 1700 may be a display device, and the second device may be a render device.
  • the device obtains, from the second device, one or more PPDUs associated with a video frame.
  • the second device renders a plurality of video frames to be provided to the device (1704).
  • the second device splits each video frame of the plurality of video frames into a plurality of video slices (1706).
  • For each video slice, the second device generates a plurality of PPDUs to include the video slice (1708).
  • Each PPDU includes one or more MSDUs associated with the video slice (1710).
  • the video slice associated with the one or more MSDUs may be identified by a port number and a DSCP value included in each MSDU (1712).
  • the second device queues the MSDUs for transmission to the device (1714).
  • the queues may be MSDU queues generated in software for each video slice, and each MSDU queue may be identified by an IP address (such as the target IP address), port number, and DSCP value.
  • FIG. 18 shows a block diagram 1800 illustrating an example of generating queues for one or more video frames.
  • the video 1802 includes video frames Q and Q+1 (for integer Q equal to or greater than 1) rendered by the render device.
  • the render device may split each video frame into video slices.
  • video frame Q includes N video slices (for integer N greater than or equal to 1)
  • video frame Q+1 includes N video slices.
  • the render device packages each video slice into a plurality of IP packets (such as R IP packets for each video slice for integer R greater than or equal to 1), and the render device packages each IP packet into an MSDU.
  • the render device may identify the MSDU for IP packet 1 of video slice 1 of video frame Q as the first MSDU for video frame Q, and the render device may identify the MSDU for IP packet R of video slice N of video frame Q as the last MSDU for video frame Q.
  • the render device also may identify the MSDU for IP packet 1 of video slice 1 of video frame Q+1 as the first MSDU for video frame Q+1, and the render device may identify the MSDU for IP packet R of video slice N of video frame Q+1 as the last MSDU for video frame Q+1.
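The packaging above, each frame split into N slices, each slice into R IP packets, each packet into one MSDU, with the first and last MSDUs of the frame identified, can be sketched as follows. The dictionary layout and flag names are illustrative assumptions.

```python
# Illustrative sketch of frame packaging: N slices per frame, R IP packets per
# slice, one MSDU per packet; the frame's first and last MSDUs are flagged.

def package_frame(frame_id, n_slices, r_packets):
    msdus = []
    for s in range(1, n_slices + 1):
        for r in range(1, r_packets + 1):
            msdus.append({
                "frame": frame_id, "slice": s, "packet": r,
                "first": (s == 1 and r == 1),
                "last": (s == n_slices and r == r_packets),
            })
    return msdus

msdus = package_frame("Q", n_slices=3, r_packets=2)  # 6 MSDUs total
```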
  • the render device creates queue 1 for the MSDUs of video slice 1 of video frame Q, queue N for the MSDUs for video slice N of video frame Q, queue N+1 for the MSDUs for video slice 1 of video frame Q+1, queue 2*N for the MSDUs for video slice N of video frame Q+1, and so on. While each video slice is depicted as being packaged into the same number of IP packets, each video slice may be packaged into any suitable number of IP packets.
  • Each queue may be an MSDU queue generated in software by the render device (such as by the WCD) and stored in a memory of the render device.
  • the queues may be generated and stored in a memory 508 of a WCD 500 (Figure 5), which may be implemented in the render device.
  • An MSDU queue may be identified and tracked by the WCD of the render device based on an IP address, a port number, and a DSCP value assigned to the video frame by the render device. All MSDUs in a queue are associated with the same IP address, port, and DSCP value.
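The per-slice queueing above, one software queue per video slice, identified by target IP address, port number, and DSCP value, can be sketched as below. The dict-of-deques layout and method names are illustrative assumptions, not the WCD's actual data structures.

```python
# Minimal sketch of per-slice MSDU queues keyed by (target IP, port, DSCP).
from collections import deque

class SliceQueues:
    def __init__(self):
        self.queues = {}  # (ip, port, dscp) -> deque of MSDUs

    def enqueue(self, ip, port, dscp, msdu):
        self.queues.setdefault((ip, port, dscp), deque()).append(msdu)

    def flush(self, ip, port, dscp):
        """Drop remaining MSDUs when a slice's queue becomes stale."""
        self.queues.pop((ip, port, dscp), None)

q = SliceQueues()
q.enqueue("192.0.2.1", 5004, 32, "msdu_1")
q.enqueue("192.0.2.1", 5004, 32, "msdu_2")
```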
  • the DSCP value may be based on the video frames being for an XR experience. In this manner, the DSCP values may be application based.
  • the DSCP value also may be based on a type of video frame.
  • a reference frame of the video includes i-slices that may be associated with a different DSCP value than an intermediate frame of the video (a p-frame) including p-slices.
  • the render device may use DSCP values previously reserved, which may be defined at the render device and the display device as being associated with a priority of the specific video slices.
  • the render device assigns a traffic identifier (TID) to each video slice.
  • the TID may be included in the MPDU MAC header of the PPDUs for the plurality of MSDUs associated with the video slice.
  • the TID is associated with an access category (AC) of the video slice, and the AC may be associated with a priority of the video slice.
  • the render device may determine to transmit video slice MSDUs over other data based on the TID indicating that the priority of the video slice is greater than the priority of the other data to be transmitted.
  • the priority of the video slice is based on whether the video slice is an i-slice or a p-slice. For example, an i-slice may be associated with a higher priority than a p-slice for transmission by the render device (which may be indicated by a different TID).
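The priority relationship above, i-slices carrying a TID associated with a higher-priority access category than p-slices, can be sketched as follows. The specific TID values are illustrative assumptions; the i-slice-over-p-slice ordering comes from the text.

```python
# Hedged sketch of the slice priority mapping: the TID assigned to a slice's
# MSDUs reflects whether the slice is an i-slice (higher priority) or a p-slice.
# The TID values themselves are illustrative assumptions.

TID_I_SLICE = 5   # assumed TID mapping to the higher-priority video AC
TID_P_SLICE = 4   # assumed TID for intermediate (p-) slices

def tid_for_slice(slice_type):
    return TID_I_SLICE if slice_type == "i" else TID_P_SLICE

def transmit_first(slice_a, slice_b):
    """Pick the slice whose TID indicates the higher (or equal) priority."""
    return slice_a if tid_for_slice(slice_a[0]) >= tid_for_slice(slice_b[0]) else slice_b

winner = transmit_first(("p", "slice_2"), ("i", "slice_7"))
```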
  • the render device may schedule the MSDUs for transmission in a plurality of PPDUs to the display device. For example, the render device may attempt to transmit the MSDUs from the MSDU queue of a first video slice before transmitting the MSDUs from the MSDU queue of a next video slice.
  • the display device may fail to obtain one or more MSDUs from the render device (such as a result of interference on the wireless medium). For example, the display device may not provide a BA for a sent PPDU, or the display device may provide a BA indicating which MPDUs (including one or more MSDUs) were successfully obtained.
  • the video slices are associated with a latency requirement for displaying video frames at the display device. If one or more MSDUs are not successfully delivered to the display device in an amount of time to allow the display device to display the video frame (such as not receiving a BA indicating one or more MPDUs including the MSDUs within a set amount of time), the render device may move on to providing MSDUs for a different video slice without completing delivery of the previous MSDUs. For example, referring back to Figure 18 and assuming that video frames Q and Q+1 are p-frames, the render device generates queues 1 through N and attempts to deliver the MSDUs in those queues. As noted above, rendering of the frames is at a defined interval.
  • the render device goes on to generate queues N+1 through 2*N at a time after generating queues 1 through N.
  • the render device generating a new queue associated with the same video slice may indicate that the render device is not to attempt to deliver any more MSDUs from the previous queue.
  • video slice 1 of video frame Q and video frame Q+1 may refer to the same area of a video frame (such as the same lines or columns between the video frames). If queue 1 is not empty (with additional MSDUs from queue 1 to be delivered) by the time queue N+1 is generated, the render device may stop attempting to deliver the remaining MSDUs in queue 1 and begin providing the MSDUs in queue N+1.
  • the video slice may be identified by a port number, which may be used to determine that queue N+l is associated with queue 1.
  • the render device may flush queue 1 (with the remaining MSDUs considered stale) so that the MSDUs are no longer scheduled for transmission to the display device.
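The stale-queue behavior above can be sketched as follows. This is a minimal illustration, assuming a video slice's queues are matched by a port number as the text suggests; the dictionary structure and the queue contents are hypothetical.

```python
transmit_queues = {}  # port number -> list of pending MSDUs for that slice

def on_slice_rendered(port, msdus, is_p_slice=True):
    """Install the new MSDU queue for a slice; for p-slices, any
    undelivered MSDUs in the previous queue for the same slice are
    flushed (considered stale). Returns the number of MSDUs flushed."""
    flushed = 0
    if is_p_slice and port in transmit_queues:
        flushed = len(transmit_queues[port])   # stale MSDUs discarded
    transmit_queues[port] = list(msdus)        # replace with the new queue
    return flushed

# Queue 1 is generated with three MSDUs; two are delivered, one remains.
on_slice_rendered(5000, ["q1-msdu1", "q1-msdu2", "q1-msdu3"])
transmit_queues[5000] = transmit_queues[5000][2:]
# Queue N+1 for the same slice (same port) arrives before queue 1 empties:
stale = on_slice_rendered(5000, ["qN1-msdu1", "qN1-msdu2"])
# stale == 1: the undelivered MSDU from queue 1 was flushed.
```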
  • a first MSDU queue (generated for a first p-slice) may be flushed after rendering a second p-slice associated with the first p-slice (such as the same video slice of a successive p-frame) and before providing, to the display device, a PPDU including one or more MSDUs associated with the first p-slice (such as a PPDU including one or more MSDUs that failed to be delivered).
  • the render device may attempt to continue to deliver MSDUs associated with i-slices after a new corresponding i-slice or a p-slice is generated.
  • an i-slice may be used as a reference frame for the intermediate frames in a video generated by the display device from the obtained PPDUs. Therefore, older i-slices may still be of value to the display device since they may be used as reference for subsequent p-slices (such as to decode successive p-slices).
  • the render device may not flush an MSDU queue associated with a previous i-slice upon rendering a new i-slice or p-slice associated with the previous i-slice. For example, the render device may continue to attempt to deliver MSDUs for the i-slice up until a threshold number of successive i-slices, p-slices, or a combination thereof are rendered (such as until a second successive i-slice is rendered or a number of p-slices up until the next i-slice is rendered).
  • the render device may render a first i-slice of frame Q, generate a first MSDU queue associated with the first i-slice, render a second i-slice of frame Q+1 associated with the first i-slice (such as the same video slice of the video frames), generate a second MSDU queue associated with the second i-slice, and still provide a PPDU including one or more MSDUs associated with the first i-slice after generating the second MSDU queue.
  • a PPDU may be attempted to be transmitted a threshold number of times by the render device (such as five times).
  • the PPDU includes one or more MSDUs of an application file (such as a video slice). If the PPDU fails to be delivered after a threshold number of retries, the PPDU is not again attempted to be delivered to the display device. In this manner, the display device does not obtain one or more MSDUs of the video slice, and the display device may be unable to generate the video slice from the MSDUs that are obtained since some of the MSDUs are missing.
  • the render device (such as the WCD) may flush the MSDU queue associated with the video slice.
  • the render device may generate a replacement video slice after flushing the MSDU queue.
  • the replacement video slice may be a new video slice based on a most recently obtained pose frame or a copy of the original video slice.
  • the render device may generate a replacement MSDU queue associated with the replacement video slice (similar to as depicted in Figure 18 for other video slices).
  • the render device may attempt to transmit the MSDUs from the replacement MSDU queue in one or more PPDUs to the display device. Generating a replacement video slice may be based on whether enough time exists to provide the PPDUs including the replacement video slice to the display device to display the video frame.
  • the render device may determine not to generate a replacement video slice based on the remainder of the TWT window being less than a threshold amount of time. In this manner, the display device may display a video slice from a previous video frame or not display the video slice for the video frame (which may include a black spot at the location of the video slice).
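The replacement-slice decision above can be sketched as follows. The retry limit of five comes from the text's example; the minimum-remaining-time threshold and the millisecond timing values are illustrative assumptions.

```python
MAX_RETRIES = 5           # example retry limit from the text
MIN_REMAINING_MS = 2.0    # hypothetical minimum time to deliver a slice

def should_generate_replacement(retries_used, twt_window_end_ms, now_ms):
    """Generate a replacement video slice only after the original PPDU
    exhausts its retries AND enough of the TWT window remains for the
    replacement to reach the display device in time."""
    if retries_used < MAX_RETRIES:
        return False                      # still retrying the original PPDU
    remaining = twt_window_end_ms - now_ms
    return remaining >= MIN_REMAINING_MS

# PPDU dropped after 5 retries with 3 ms left in the TWT window: regenerate.
replace_ok = should_generate_replacement(5, 10.0, 7.0)
# Only 0.5 ms left: skip, and let the display device reuse the prior slice.
replace_skip = should_generate_replacement(5, 10.0, 9.5)
```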
  • the render device may flush an MSDU queue based on an explicit command from the application layer. For example, a user of the render device may indicate that a portion of the XR experience is to be reset or that the XR experience is to be terminated. In such cases, the scheduled MSDUs no longer need to be transmitted.
  • the render device may generate the explicit command to flush one or more MSDU queues at the application layer and provide the command to the WCD.
  • the WCD may flush the one or more MSDU queues based on the command.
  • the render device indicates to the display device that an MSDU queue is flushed.
  • the indication may be included in a MAC header of a packet or a separate control element from the render device to the display device. In this manner, the display device is made aware of one or more missing MSDUs.
  • the indication may indicate which video slice is associated with the MSDU queue (such as providing the IP address, port number, and DSCP value used to identify the MSDU queue in a MAC header). Based on the indication, the display device may terminate storing and processing MSDUs associated with the indicated video slice.
  • the WCD of the display device may include a reorder (REO) queue to obtain and store the MSDUs until all or a sufficient number of MSDUs are obtained to reconstruct the application file (such as a video slice).
  • the REO queue may obtain one or more MSDUs from the render device. If the display device obtains an indication that a transmit queue associated with the one or more obtained MSDUs (such as an MSDU queue that included the one or more MSDUs at the render device) is flushed, the obtained MSDUs may no longer be used in generating the video slice. In this manner, the display device may flush the REO queue after obtaining the indication.
  • the display device may flush the REO queue if all (or a sufficient number of) MSDUs associated with an application file (such as a video slice) are not obtained.
  • the REO queue may obtain a portion of MSDUs associated with a video slice, but the REO queue may not obtain a remainder of the MSDUs associated with the video slice before a REO timeout occurs.
  • the display device may flush the REO queue after not obtaining the remainder of the MSDUs before the REO timeout occurs.
  • an REO timeout amount of time may be measured from when the first MSDU of the application file is obtained to the latest time by which the last MSDU of the application file is to be obtained.
  • the REO timeout amount of time may be a time indicating when the video slices of an entire video frame are to be obtained.
  • the REO timeout amount of time may be based on the TWT window size.
  • the REO timeout amount of time is based on the AC of the video slice (such as based on the TID for the video slice).
  • An example REO timeout amount of time is 10 ms, but any suitable amount of time may be used.
  • the amount of time may be counted from the beginning of the TWT window or any other suitable starting point for determining if an REO timeout occurs. If the display device counts up to the REO timeout amount of time before obtaining all MSDUs, the display device may flush the REO queue.
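The REO timeout behavior above can be sketched as follows. The 10 ms timeout comes from the text's example; the class layout, the expected-count field, and counting from the first obtained MSDU (one of the starting points the text allows) are illustrative assumptions.

```python
REO_TIMEOUT_MS = 10.0  # example timeout from the text

class ReoQueue:
    """Buffers MSDUs for one application file (such as a video slice)
    until all are present or the REO timeout expires."""
    def __init__(self, expected_count):
        self.expected = expected_count
        self.msdus = []
        self.first_rx_ms = None

    def on_msdu(self, msdu, now_ms):
        if self.first_rx_ms is None:
            self.first_rx_ms = now_ms      # timeout counts from first MSDU
        self.msdus.append(msdu)

    def poll(self, now_ms):
        """Return 'complete', 'flushed', or 'waiting'."""
        if len(self.msdus) >= self.expected:
            return "complete"
        if self.first_rx_ms is not None and \
                now_ms - self.first_rx_ms >= REO_TIMEOUT_MS:
            self.msdus.clear()             # flush on timeout
            return "flushed"
        return "waiting"

q = ReoQueue(expected_count=3)
q.on_msdu("m1", now_ms=0.0)
q.on_msdu("m2", now_ms=4.0)
state = q.poll(now_ms=12.0)  # the last MSDU never arrived before the timeout
```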
  • the display device may flush the REO queue periodically. For example, the display device may flush the REO queue between TWT windows. In some implementations, the display device may flush the REO queue based on an explicit command generated at the application layer. For example, the XR experience may be reset or terminated by a user of the display device or otherwise by the XR application layer. In this manner, the display device may generate a command to flush the REO queue, as the obtained MSDUs are no longer needed.
  • each MSDU may include a number of FEC bits at the end of the payload.
  • the FEC bits from the obtained MSDUs may be sufficient to construct the missing MSDUs. Since a portion of the MSDU payloads are reserved as FEC bits, the amount of payload provided in each MSDU is reduced, and the number of MSDUs for a video slice may increase. However, the display device may not need to obtain all MSDUs for the video slice to generate the video slice.
  • FEC may be used or not used based on a link quality between the render device and the display device (such as a reference signal receive power (RSRP) measurement, a reference signal receive quality (RSRQ) measurement, or a signal to noise ratio (SNR) measurement measured by the display device).
  • FEC may be used by the display device and the render device if interference increases above a threshold (such as indicated by SNR dropping below a threshold) or channel conditions worsening below a threshold (such as indicated by RSRP or RSRQ dropping below a threshold).
  • FEC may not be used if the frame rate, resolution, or other parameters of the video frames would not permit use of FEC while still meeting latency requirements for the video frames.
  • FEC may not be used if the frame rate is greater than a threshold frame rate, the video frame resolution is greater than a threshold resolution, and so on.
  • FEC may or may not be used based on one or more parameters of the display device (such as channel size, MCS used for obtaining the PPDUs, and so on that may affect the throughput to the display device).
  • Use of FEC may be indicated by the display device to the render device in the one or more pose frames or by the render device to the display device in a MAC header of one or more PPDUs.
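The FEC on/off decision described above can be sketched as follows. This combines the two conditions from the text: enable FEC when link quality degrades, but not when video parameters leave no headroom for the FEC overhead. The SNR threshold and maximum frame rate are illustrative assumptions.

```python
SNR_THRESHOLD_DB = 15.0     # hypothetical: below this, the link is "poor"
MAX_FEC_FRAME_RATE = 90     # hypothetical: above this, FEC breaks latency

def use_fec(snr_db, frame_rate):
    """Enable FEC on a poor link, unless the frame rate is too high to
    afford reserving payload bits for FEC while meeting latency."""
    if frame_rate > MAX_FEC_FRAME_RATE:
        return False        # not enough headroom for the FEC payload bits
    return snr_db < SNR_THRESHOLD_DB

fec_noisy = use_fec(snr_db=10.0, frame_rate=60)   # poor link, modest rate
fec_fast = use_fec(snr_db=10.0, frame_rate=120)   # poor link, high rate
```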
  • the display device may indicate to the render device when enough MSDUs are obtained to reconstruct the video slice. For example, the display device provides a BA to the render device to indicate that a PPDU was obtained from the render device. If the display device determines that the PPDU included MSDUs sufficient to reconstruct the video slice, the display device may indicate in the BA to the render device that a sufficient number of MSDUs are obtained. For example, the aggregated control (A-Control) field of the MAC header of the BA may be configured to indicate that a sufficient number of MSDUs for the video slice are obtained.
  • the render device may process the A-Control field to determine that no further PPDUs including MSDUs for the video slice are to be provided to the display device (with the display device to generate the video slice from the already obtained MSDUs).
  • the render device may flush the associated MSDU queue after obtaining the indication.
  • Changes in the wireless medium and the XR experience may require changes at the render device or the display device in communicating data to the other device or performing one or more XR operations. For example, as the wireless medium becomes more congested, the display device moves away from the render device, or more interference exists on the wireless medium, the render device and the display device may adjust one or more parameters to reduce the amount of information to be communicated between the devices.
  • Example parameters may be associated with the XR experience (such as the video frame rate, frame resolution, color palette, and so on, which affect the size of the video) or with the wireless channel (such as the channel size, MCS, use of FEC, TWT window size, and so on, which affect the bit rate between devices and the success in delivering data between the devices for the XR experience).
  • the render device and the display device may be configured to provide feedback regarding data provided to the other device, and the device may be configured to adjust the XR experience based on the feedback.
  • the render device may generate feedback based on transmitting PPDUs to the display device, and the display device may generate feedback based on transmitting pose frames to the render device.
  • Figure 19 shows a flowchart illustrating an example process 1900 for generating feedback according to some implementations.
  • the device performing the example process 1900 may be a render device, and the second device may be a display device.
  • the device attempts to provide a plurality of PPDUs associated with one or more video frames of an XR experience to a second device.
  • the device measures one or more of a PPDU transmission latency or a PPDU transmission drop associated with attempting to provide the plurality of PPDUs. For example, the render device may observe when a BA is not obtained for one or more PPDUs transmitted to the display device. In some implementations, the render device may determine that a PPDU transmission drop occurs when a BA is not obtained for a transmitted PPDU. In some implementations, the render device may determine that a PPDU transmission drop occurs when the render device reaches a maximum number of retries for the PPDU and the render device is to no longer attempt to deliver the PPDU. The render device may count the total number of PPDU transmission drops over time.
  • the time may be over a TWT window, a set number of TWT windows, a set amount of time, or another suitable amount of time.
  • the render device may determine a PPDU transmission drop rate, which may be the number of PPDU transmission drops divided by the total number of PPDUs attempted to be transmitted to the display device.
  • the render device may determine a PPDU transmission latency based on the BAs obtained for delivered PPDUs to the display device. For example, the render device tracks a time when a PPDU is transmitted to the display device (such as based on the WCD clock of the render device). The render device also tracks a time when the BA associated with the PPDU is obtained from the display device (such as based on the WCD clock of the render device). The render device may determine the PPDU transmission latency for the PPDU to be the difference between the time the PPDU is transmitted and the time the BA is obtained. In some implementations, the render device may determine an average PPDU transmission latency, median PPDU transmission latency, or a distribution of PPDU transmission latency over a number of PPDU transmissions.
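The measurements in process 1900 can be sketched as follows: a PPDU counts as dropped when no BA arrives, the drop rate is drops divided by attempts, and per-PPDU latency is the BA time minus the transmit time on the WCD clock. The record layout (a transmit time paired with a BA time, or None when no BA was obtained) is an illustrative assumption.

```python
def measure_ppdu_feedback(tx_records):
    """tx_records: list of (tx_time_ms, ba_time_ms or None).
    Returns (drop_rate, average latency, median latency)."""
    drops = sum(1 for _, ba in tx_records if ba is None)
    drop_rate = drops / len(tx_records)
    latencies = sorted(ba - tx for tx, ba in tx_records if ba is not None)
    avg = sum(latencies) / len(latencies)
    median = latencies[len(latencies) // 2]
    return drop_rate, avg, median

# Four PPDUs over a TWT window; the second never received a BA.
records = [(0.0, 1.0), (5.0, None), (10.0, 13.0), (15.0, 17.0)]
drop_rate, avg_latency, median_latency = measure_ppdu_feedback(records)
```

The same shape of computation applies to the display-device-side pose data frame measurements of process 2000.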
  • the one or more measurements are associated with one or more parameters of the XR experience, and the one or more parameters of the XR experience may be adjusted after the one or more measurements (1906).
  • the PPDU transmission latency and the PPDU transmission drop may indicate if channel conditions are improving (such as based on less interference, the display device moving closer to the render device, and so on) or channel conditions are degrading (such as based on more interference, the display device moving away from the render device, and so on).
  • the measurements also may indicate if throughput to the display device is increasing or decreasing.
  • the render device may determine if one or more parameters of the XR experience are to be adjusted to ensure the video (or other data of the XR experience) still meets the latency and packet loss requirements. For example, the render device may reduce the channel size, increase the MCS, or use FEC if the PPDU transmission drop increases. As a result of reducing the channel size, increasing the MCS, or using FEC, the throughput to the display device decreases. If the decreased throughput is less than what is required for the current video, the render device may adjust one or more video parameters (such as decreasing the resolution, the frame rate, and so on). The measurements or the adjustments may be communicated to the display device, and the display device may implement the one or more adjustments.
  • Process 1900 depicted in Figure 19 is from the render device perspective of generating feedback.
  • Process 2000 depicted in Figure 20 may be from the display device perspective of generating feedback.
  • Figure 20 shows a flowchart illustrating an example process 2000 for generating feedback according to some implementations.
  • the device performing the example process 2000 may be a display device, and the second device may be a render device.
  • the device attempts to provide a plurality of pose data frames associated with one or more video frames of an XR experience to a second device.
  • the device also attempts to provide tracking frames to the second device.
  • the device measures one or more of a pose data frame transmission latency or a pose data frame transmission drop associated with attempting to provide the plurality of pose data frames. For example, the display device may observe when a BA is not obtained for one or more pose data frames transmitted to the render device. In some implementations, the display device may determine that a pose data frame transmission drop occurs when a BA is not obtained for a transmitted pose data frame. In some implementations, the display device may determine that a pose data frame transmission drop occurs when the display device reaches a maximum number of retries for the pose data frame. The display device may count the total number of pose data frame transmission drops over time.
  • the time may be over a TWT window, a set number of TWT windows, a set amount of time, or another suitable amount of time.
  • the display device may determine a pose data frame transmission drop rate, which may be the number of pose data frame transmission drops divided by the total number of pose data frames attempted to be transmitted to the render device.
  • the display device may determine a pose data frame transmission latency based on the BAs obtained for delivered pose data frames to the render device. For example, the display device tracks a time when a pose data frame is transmitted to the render device (such as based on the WCD clock of the display device). The display device also tracks a time when the BA associated with the pose data frame is obtained from the render device (such as based on the WCD clock of the display device). The display device may determine the pose data frame transmission latency for the pose data frame to be the difference between the time the pose data frame is transmitted and the time the BA is obtained. In some implementations, the display device may determine an average pose data frame transmission latency, median pose data frame transmission latency, or a distribution of pose data frame transmission latency over a number of pose data frame transmissions.
  • the one or more measurements are associated with one or more parameters of the XR experience, and the one or more parameters of the XR experience may be adjusted after the one or more measurements (2006).
  • the pose data frame transmission latency and the pose data frame transmission drop may indicate if channel conditions are improving (such as based on less interference, the display device moving closer to the render device, and so on) or channel conditions are degrading (such as based on more interference, the display device moving away from the render device, and so on).
  • the display device (and the render device) may determine if one or more parameters of the XR experience are to be adjusted to ensure the video (or other data of the XR experience) still meets the latency and packet loss requirements (such as described above).
  • the measurements or the adjustments may be communicated to the render device, and the render device may implement the one or more adjustments.
  • the render device may obtain one or more pose data frames from the display device (with each of the one or more video frames associated with a pose data frame).
  • the one or more measurements may include measurements based on the obtained pose data frames.
  • the render device may measure a pose data frame delivery latency associated with obtaining the one or more pose data frames. For example, with the start time of TWT windows known and a pose data frame to be delivered at the beginning of a TWT window, the render device may determine the difference between the start time of the TWT window and the time when the pose data frame is obtained (with the times based on the WCD clock of the render device).
  • the time of the IMU measurement is included in the pose data frame (such as a time indicated by the application layer clock of the display device). Since the WCD clock of the render device and the application layer clock of the display device may be synchronized, the render device may determine a difference between the time of the IMU measurement and the time of obtaining the pose data frame. The render device may determine an average pose data frame delivery latency, median pose data frame delivery latency, or a distribution of the pose data frame delivery latency over a number of pose data frame deliveries (such as over a defined number of TWT windows or defined amount of time).
  • the one or more measurements by the render device may include a frequency of missing or delayed pose data frames.
  • the render device determines whether one or more pose data frames are missing or delayed. For example, the display device may attempt to provide a pose data frame up to a timeout at the beginning of each TWT window. If the render device does not obtain the pose data frame before the timeout, the render device may determine that the pose data frame is missing or delayed. The render device may count the number of missing or delayed pose data frames over a determined number of TWT windows. If the number increases as successive TWT windows occur, the render device determines that the frequency of missing or delayed pose data frames increases.
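The missing-or-delayed pose frame counting above can be sketched as follows. A pose frame is expected within a timeout after each TWT window start; arrivals outside that deadline count as missing or delayed. The timeout value and window timings are illustrative assumptions.

```python
POSE_TIMEOUT_MS = 1.0  # hypothetical deadline after each TWT window start

def count_missing_pose_frames(window_starts_ms, pose_arrivals_ms):
    """Count TWT windows whose pose data frame did not arrive within
    POSE_TIMEOUT_MS of the window start."""
    missing = 0
    for start in window_starts_ms:
        deadline = start + POSE_TIMEOUT_MS
        on_time = any(start <= t <= deadline for t in pose_arrivals_ms)
        if not on_time:
            missing += 1
    return missing

# Three TWT windows; the second window's pose frame arrives late.
windows = [0.0, 16.6, 33.3]
arrivals = [0.4, 18.0, 33.9]
missing = count_missing_pose_frames(windows, arrivals)
```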
  • a REO queue of the display device may obtain one or more MSDUs from the render device. As noted above, each MSDU may be associated with an application file (such as a video slice). The display device may flush the REO queue one or more times to remove one or more MSDUs from the REO queue.
  • the display device may flush the REO queue based on obtaining an indication that a transmit queue at the render device was flushed (such as the MSDU queue of the render device associated with the MSDUs in the REO queue of the display device).
  • the display device may flush the REO queue periodically (such as between TWT windows).
  • the display device may flush the REO queue if a timeout occurs (such as all MSDUs or a sufficient number of MSDUs not being received in a defined amount of time to allow construction of the video slice for display).
  • the display device may flush the REO queue if MSDUs for a successive video slice are obtained before obtaining the remainder of MSDUs for the previous video slice.
  • the display device may flush the REO queue based on a command from the application layer of the display device.
  • the one or more measurements at the display device may include a REO flush time associated with flushing the REO queue.
  • the display device counts the number of times the REO queue is flushed over a defined amount of time (such as a TWT interval or a number of TWT intervals).
  • the REO flush time may be the average number of flushes per TWT interval, the total number of flushes counted by the display device, or another suitable indication of the number of times the REO queue is flushed by the display device.
  • the one or more measurements at the display device also may include a video frame delivery latency.
  • the display device (such as at the REO queue) obtains a first MSDU associated with a video frame.
  • the first MSDU may include metadata to indicate that the MSDU is the first MSDU for the video frame.
  • the display device also obtains a last MSDU associated with the video frame, which may include metadata to indicate that the MSDU is the last MSDU for the video frame.
  • the first MSDU and the last MSDU may be identified by a DSCP value included in the MSDU.
  • the display device may measure a video frame delivery latency associated with obtaining the first MSDU and the last MSDU for the video frame.
  • the display device may determine a time when the first MSDU is obtained and a time when the last MSDU is obtained from the WCD clock of the display device.
  • the display device may determine a difference in time between when the first MSDU is obtained and when the last MSDU is obtained as the video frame delivery latency for the video frame.
  • the display device may determine an average delivery latency, a median delivery latency, or a distribution of delivery latencies for a number of video frame delivery latencies.
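The video frame delivery latency measurement above can be sketched as follows: the display device timestamps the first and last MSDU of a frame on its WCD clock (identified here by hypothetical first/last flags standing in for the metadata or DSCP value the text describes) and takes the difference.

```python
def frame_delivery_latency(msdus):
    """msdus: list of (rx_time_ms, is_first, is_last) for one video frame.
    Returns the time between obtaining the first and last MSDU."""
    first = next(t for t, is_first, _ in msdus if is_first)
    last = next(t for t, _, is_last in msdus if is_last)
    return last - first

# Three MSDUs for one frame, timestamped on the WCD clock of the display.
msdus = [(2.0, True, False), (3.5, False, False), (6.5, False, True)]
latency = frame_delivery_latency(msdus)
```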
  • pose data frame or video frame delivery latencies, the frequency of dropped or missing pose data frames, or the flush times increasing may indicate that channel conditions are worsening or that throughput between the devices is somehow being restricted.
  • pose data frame or video frame delivery latencies, the frequency of dropped or missing pose data frames, or the flush times decreasing may indicate that channel conditions are improving or that throughput between the devices is increasing.
  • the one or more measurements at the display device also may include one or more of a jitter buffer underflow or overflow or a missing packet associated with a video frame to be displayed.
  • the display device may perform dejittering in processing the video frames before display. Dejittering uses a jitter buffer of video frame information to allow smoothing of the display of video frames. If data in the buffer drops below a lower threshold, the display device may determine a jitter buffer underflow, and if the data in the buffer rises above an upper threshold, the display device may determine a jitter buffer overflow.
  • Underflow (which results from a lack of data) and overflow (which results from too much data, such that some data may be lost from the buffer) may negatively impact dejittering by the display device.
  • the display device may count the total number of overflows or underflows, an average buffer usage, or another metric associated with the jitter buffer over a defined amount of time.
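The jitter-buffer monitoring above can be sketched as follows: occupancy below a lower threshold counts as an underflow, above an upper threshold as an overflow, with both counts and the average usage tracked over time. The watermark values and occupancy samples are illustrative assumptions.

```python
LOW_WATERMARK = 2    # frames: below this, an underflow is counted
HIGH_WATERMARK = 8   # frames: above this, an overflow is counted

def jitter_buffer_metrics(occupancy_samples):
    """Return (underflow count, overflow count, average buffer usage)
    over the sampled jitter-buffer depths."""
    underflows = sum(1 for n in occupancy_samples if n < LOW_WATERMARK)
    overflows = sum(1 for n in occupancy_samples if n > HIGH_WATERMARK)
    avg_usage = sum(occupancy_samples) / len(occupancy_samples)
    return underflows, overflows, avg_usage

samples = [5, 1, 9, 4, 0, 5]  # buffer depth sampled over six display ticks
underflows, overflows, avg_usage = jitter_buffer_metrics(samples)
```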
  • a missing packet associated with a video frame to be displayed may include a missing video slice.
  • the display device may display one or more video frames missing one or more video slices.
  • the one or more measurements may include a count of the number of video frames displayed with one or more missing video slices, the number of missing video slices, or another suitable metric of missing packets over a defined amount of time.
  • the one or more measurements may include a link quality measurement of a link quality between the render device and the display device.
  • the display device or the render device may measure an RSRP, RSRQ, SNR, or other metric indicating a link quality between the devices. Changes in a link quality metric may indicate worsening channel conditions or improving channel conditions.
  • the render device or the display device may adjust one or more parameters of the XR experience to ensure latency requirements are met for the XR experience.
  • the one or more parameters of the XR experience may include one or more wireless communication parameters between the render device and the display device.
  • Adjusting one or more wireless communication parameters may include one or more of: adjusting a duty cycle of TWT windows for a TWT power save mode (the duty cycle may be the length of the TWT window compared to the amount of time outside of the TWT window during a TWT window interval); enabling or disabling the TWT power save mode; changing a wireless operating channel over which the render device and the display device communicate; adjusting the wireless operating channel size (such as increasing or decreasing the channel size among 20 MHz, 40 MHz, 80 MHz, 80+80 MHz, and 160 MHz); adjusting an MCS; enabling or disabling FEC for providing the PPDUs from the render device to the display device; or adjusting the FEC (such as the code rate of the FEC, which may be associated with the number of bits in the payload to be used for FEC).
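The TWT duty-cycle adjustment above can be sketched as a simple ratio of the TWT window length to the full TWT window interval, so the duty cycle can be raised by lengthening the window or shortening the interval. The numeric values below are illustrative assumptions.

```python
def twt_duty_cycle(window_ms, interval_ms):
    """Fraction of each TWT interval spent inside the TWT window."""
    return window_ms / interval_ms

# 4 ms active window every 16 ms interval:
base = twt_duty_cycle(4.0, 16.0)
# Widen the window when more airtime is needed for the XR traffic:
boosted = twt_duty_cycle(8.0, 16.0)
```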
  • the one or more parameters of the XR experience may include one or more video parameters. Adjusting one or more video parameters may include adjusting one or more of a video frame rate, a video resolution (such as a frame resolution), a target encode data rate (also referred to as an encoding bit rate), or a video codec.
  • the render device may adjust a render rate of the video frames based on the video frame rate, the video resolution, the encoding bit rate, or the video codec.
  • the display device may adjust a display rate of the video frames based on the video frame rate.
  • selection of a video resolution, an encode data rate, and a video codec involves a tradeoff between video quality and file size.
  • selection of a video codec is based on balancing compression of the video with video quality.
  • the render device may adjust the video resolution, the target encode data rate, or the video codec (such as switching between available codecs for encoding the video) to balance video quality with file size and latency requirements.
  • adjusting the one or more parameters may include adjusting a functional split of computing tasks between the display device and the render device, or between a relay STA to the display device and another device communicably coupled to the relay STA. For example, based on the one or more measurements, the display device may determine that some rendering should be performed at the display device (such as when the link quality to the render device is less than a threshold or is otherwise indicated as bad). For example, one or more p-frames may be rendered at the display device. In this manner, fewer PPDUs are to be transmitted from the render device to the display device.
  • the relay STA may determine that rendering one or more video frames is to be performed at the relay STA instead of the other STA or AP based on the one or more measurements.
  • the display device or the render device may indicate the one or more measurements or the adjustments made to the one or more parameters of the XR experience to the other device.
  • a device may provide an indication of the one or more measurements to the other device in an A-Control field of a header of one or more packets provided to the other device.
  • the one or more packets may be included in an RTS frame or a CTS frame (if RTS/CTS is enabled, such as for asynchronous channel access), a data frame (such as the one or more PPDUs from the render device to the display device or the one or more pose data frames from the display device to the render device), or a BA frame (such as a BA to the render device and associated with obtaining a PPDU or a BA to the display device and associated with obtaining a pose data frame).
  • An example A-Control field may be a variant of the HT control field (HE A-Control field) as defined in the IEEE 802.11ax standard.
  • the A-Control field may be included in a frame control field of a MAC header of an RTS frame or a CTS frame.
  • the A-Control field may be included in the non-legacy fields 212 of the example PDU in Figure 2A.
  • the A-Control field may be included in the frame control field of the MAC header 412 in the MPDU subframe 406 in Figure 4.
  • the A-Control field may be included in the frame control field of the MAC header of the BA frame.
  • FIG. 21 shows a block diagram of an example control field 2100.
  • the control field 2100 may be a simplified version of the A-Control field included in one or more frames.
  • the actual A-Control field may include additional subfields not shown (such as an end of header (EOH) and so on).
  • the control field includes a control identifier (ID) subfield 2102 and a control information subfield 2104.
  • the control ID in the subfield 2102 indicates what type of control information is included in subfield 2104. For example, based on HE control fields, the subfield 2102 may include different numbers as control IDs for different control information.
  • a control ID of 0 may indicate that subfield 2104 includes an ACK
  • a control ID of 2 may indicate that subfield 2104 includes a BA, and so on.
  • the IEEE 802.11ax standard reserves some control IDs from use.
  • the A-Control field (such as subfield 2102) includes a reserved control ID to indicate the one or more measurements from the display device or the render device are included in the A-Control field (such as subfield 2104 being used to include the one or more measurements).
  • the reserved control ID may be defined at both the render device and the display device to allow encoding and decoding the A-Control field.
  • different control IDs may be used for different types of measurements.
  • the A-Control field (such as subfield 2102) includes a reserved control ID to indicate the one or more adjusted parameters are included in the A-Control field (such as subfield 2104 being used to include an indication of the one or more adjusted parameters).
  • the reserved control ID may be defined at both the render device and the display device to allow encoding and decoding the A-Control field.
  • different control IDs may be used for different types of adjustments.
  • the A-Control field may include an indication of the link quality (which may be measured by the render device or the display device, as noted above).
  • the link quality may be indicated as good or bad in a binary manner. For example, if a device measures an SNR, the device may compare the SNR to a threshold. If the SNR is less than the threshold, the link quality may be considered bad. If the SNR is greater than the threshold, the link quality may be considered good.
  • the A-Control field (such as the subfield 2104) may include a bit reserved to indicate the link quality (such as 0 for bad and 1 for good). The remaining bits of the subfield may be used to indicate the one or more measurements or the adjustments to the one or more parameters based on the one or more measurements.
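The bit packing described above can be sketched as follows. This is only an illustration of one possible layout: the 4-bit Control ID width matches the HE A-Control format, but the reserved ID value (14), the SNR threshold, and the Control Information layout (bit 0 as the good/bad flag, remaining bits as a quantized SNR) are assumptions, not values defined by the standard or this disclosure.

```python
# Sketch of encoding/decoding one A-Control subfield that carries a
# link-quality indication. The reserved Control ID (14), SNR threshold,
# and Control Information bit layout are illustrative assumptions.

CONTROL_ID_BITS = 4           # width of the Control ID subfield (2102)
LINK_QUALITY_CONTROL_ID = 14  # assumed reserved ID, known to both devices
SNR_THRESHOLD_DB = 20         # assumed good/bad threshold

def encode_link_quality(snr_db: int) -> int:
    """Pack the Control ID and Control Information (2104) into an integer."""
    good = 1 if snr_db > SNR_THRESHOLD_DB else 0
    info = (snr_db << 1) | good          # bit 0: quality flag, rest: SNR
    return (info << CONTROL_ID_BITS) | LINK_QUALITY_CONTROL_ID

def decode_link_quality(field: int) -> tuple:
    """Return (control_id, good_flag, snr_db) from a packed subfield."""
    control_id = field & ((1 << CONTROL_ID_BITS) - 1)
    info = field >> CONTROL_ID_BITS
    return control_id, info & 1, info >> 1
```

Because the reserved control ID is defined at both devices, the receiver can dispatch on the decoded ID before interpreting the Control Information bits.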
  • the display device and the render device may provide feedback to each other and adjust the XR experience as necessary.
  • Providing feedback may be performed for both asynchronous channel access and synchronous channel access. While the feedback and adjustments are described with reference to an XR experience, the operations for measuring and providing feedback described above may be applied to other types of data and other types of applications.
  • a device 154 may support concurrent links with a first device 152 (such as a display device) and a second device 158 (such as an AP of the BSS or another STA in a mesh network).
  • link 156 may be in the 5GHz or 6GHz frequency spectrum
  • link 160 may be in the 5GHz or 6GHz frequency spectrum.
  • the device 154 is configured to support concurrent links while still supporting the XR experience (such as prioritizing XR traffic and otherwise managing XR operations to meet latency and other requirements for the XR experience).
  • the concurrent links may be managed using TWT sessions (such as described above, with the AP or other STA communicating outside of the TWT window for the display device and the render device).
  • concurrent links may be supported using multi-link operation (MLO) techniques (such as defined in the IEEE 802.11 standards).
  • FIG. 22 shows a flowchart illustrating an example process 2200 for supporting concurrent wireless links with multiple devices.
  • the example process 2200 may be performed by a WCD (such as a WCD included in a render device or another suitable device supporting concurrent wireless links with a first device and a second device).
  • the second device is a display device.
  • the first device may be an AP (in a BSS) or another STA (in a mesh network).
  • the example process 2200 is described below as being performed by a render device or a WCD of a render device for clarity purposes, but the example process 2200 may be performed by any suitable device.
  • the render device (such as the WCD of the render device) communicates with a first device over a first wireless link.
  • the render device communicates with an AP or another STA.
  • the render device communicates with a second device over a second wireless link.
  • the render device may communicate with a display device over a first wireless link configured between the render device and the display device for an XR experience.
  • the WCD communicates concurrently with the first device and the second device using one of MLO techniques or a TWT mode (2206).
  • MLO techniques may be similar to some TWT mode techniques, but may be supported by legacy devices that do not support TWT.
  • the WCD is configured to give preference to communications on the second wireless link versus communications on the first wireless link (2208).
  • the second wireless link may be between the render device and a display device for an XR experience.
  • XR data to be transmitted between the render device and the display device (such as PPDUs, pose data frames, and tracking frames) are associated with a latency requirement or packet loss requirement associated with the XR experience.
  • communications between the render device and the display device may be of more importance than communications between the render device and the other device (such as an AP or render device attempting to transmit best effort classified traffic to the other device).
  • Example operations for the WCD to give preference to communications on the second wireless link versus communication on the first wireless link are described in more detail below.
  • Concurrently communicating with the first device (such as an AP) and the second device (such as a display device) may include:
  • the WCD is to transmit to the second device during reception of one or more packets from the first device (such as to transmit to the display device when receiving from the AP);
  • the WCD is to obtain one or more packets from the second device during reception of one or more packets from the first device (such as to receive from the display device when receiving from the AP);
  • the WCD is to transmit to the second device during transmission to the first device (such as to transmit to the display device when transmitting to the AP);
  • the WCD is to obtain one or more packets from the second device during transmission to the first device (such as to receive from the display device when transmitting to the AP).
  • Typical MLO techniques are defined to prevent transmission on a link when reception is on-going on another link. For example, allowing transmission on a wireless link may be based on a clear channel assessment (CCA) of the wireless medium (which includes both wireless links). If the CCA indicates that the wireless medium is occupied (such as when the WCD is receiving from the first device), the WCD prevents transmissions. In this manner, if a WCD is configured for typical MLO, the WCD prevents transmission to the second device over the second wireless link when receiving from the first device over the first wireless link, and the WCD prevents transmission to the first device over the first wireless link when receiving from the second device over the second wireless link.
  • Typical MLO techniques may prevent communications between the render device and the display device that would negatively impact the XR experience (such as not meeting certain latency requirements).
  • the WCD of the render device may be configured to enhance the MLO techniques (and other MAC operations) to provide preferential treatment to communications with the display device.
  • the WCD may allow transmission to the display device while receiving from another device (such as an AP or STA).
  • the WCD may be configured to ignore a CCA when the WCD is to transmit to the display device. Example adjustments to the MLO techniques for the different example concurrent communications are described below.
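The gating logic described above can be sketched as a simple decision function. The link names and the `prefer_display` flag are hypothetical names introduced only for this illustration:

```python
def may_transmit(link: str, cca_busy: bool, prefer_display: bool = True) -> bool:
    """Decide whether a transmission may start on the given link.

    Typical MLO defers to the CCA on every link; the enhancement sketched
    here lets transmissions to the display device proceed even when the CCA
    indicates the medium is busy (such as while receiving from the AP).
    """
    if prefer_display and link == "display":
        return True              # preferential treatment: ignore the CCA
    return not cca_busy          # typical MLO behavior: defer while busy
```

With `prefer_display` disabled, the function reduces to the typical MLO behavior of blocking all transmissions whenever the shared medium is sensed busy.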
  • One example concurrent communication includes the WCD receiving one or more packets from the first device and the WCD transmitting to the second device.
  • the render device may be in the process of obtaining one or more packets from an AP or STA. While obtaining the one or more packets, the render device may determine that the render device is to transmit to the display device (such as transmitting one or more PPDUs associated with one or more video frames).
  • the WCD may transmit to the second device before completion of reception of the one or more packets from the first device.
  • the examples described herein refer to the WCD or device managing the concurrent links as a render device, the second device as a display device, and a first device as an AP exclusively for simplicity and clarity in describing aspects of the present disclosure. To note, the examples are not limited to a render device, display device, AP, or any other specific device.
  • a render device typically would provide a BA to the AP after obtaining the one or more packets over the first wireless link. If transmission over the second wireless link to the display device is still ongoing when the render device is to provide the BA to the AP, a CCA would indicate that the wireless medium is busy, and the render device would prevent providing the BA to the AP.
  • the render device obtains a BA from the display device after transmitting to the display device (such as obtaining a BA from a display device after delivering a PPDU to the display device). Based on typical MLO, if the end of reception from the AP coincides with the end of transmission to the display device, the CCA at the render device would not cause the render device to prevent providing the BA to the AP.
  • the BA may be provided to the AP near the same time a BA from the display device is to be obtained. If the BA to the AP and the BA from the display device overlap on the wireless medium, the render device may not successfully obtain the BA from the display device. Not obtaining the BA causes the render device to retry transmitting to the display device (which increases the latency in delivering data to the display device).
  • the render device prevents providing a BA to the AP to acknowledge reception of the one or more packets.
  • the BA obtained from the display device is not interfered with by a BA to the AP.
  • the render device may determine if the WCD is in a transmit mode for the second wireless link. If the WCD is in a transmit mode for the second wireless link, the render device may prevent providing a BA over the first wireless link. With the render device to prevent transmitting a BA to the AP for the one or more packets, communication of the one or more packets over the first wireless link may be considered to have failed.
  • the AP later may retry to transmit the one or more packets to the render device (increasing the latency of delivering the one or more packets from the AP).
  • Increasing the latency on the first wireless link over increasing the latency on the second wireless link may be acceptable based on the latency requirements between the render device and the display device (such as for an XR experience).
  • Another example concurrent communication includes the render device receiving one or more packets from the AP and the render device receiving one or more packets from the display device.
  • the render device may be receiving from the AP over the first wireless link when pose data frames, tracking frames, or other packets are provided by the display device over the second wireless link.
  • the render device obtains one or more packets from the display device during reception of one or more packets from the AP.
  • the render device provides a BA to the display device after obtaining the one or more packets from the display device.
  • the render device may prevent providing a BA to the AP to acknowledge reception of the one or more packets from the AP.
  • transmitting on a first one or more antennas over the first wireless link may cause a local interference on a second one or more antennas receiving over the second wireless link.
  • Preventing transmission of the BA to the AP prevents generating the local interference in receiving from the display device.
  • Another example concurrent communication includes the render device transmitting to the AP and the render device transmitting to the display device.
  • the render device may be transmitting to the AP over the first wireless link when the render device determines to transmit one or more PPDUs to the display device over the second wireless link. If data arrives at the WCD for transmission (such as one or more MSDUs to be included in one or more PPDUs transmitted to the display device) at the same time as an ongoing transmission to the AP, typical MLO techniques would require the render device to determine a CCA (which would indicate that the wireless medium is busy). In this manner, the render device would delay transmitting to the display device (increasing the latency over the second wireless link).
  • One or more enhancements to the MLO techniques may be applied to attempt to avoid or reduce such delay in transmission to the display device.
  • the render device synchronizes transmissions to the display device with transmissions to the AP.
  • the start of transmissions to the AP and to the display device may be synchronized. Synchronizing the start of transmissions may be based on a transmission schedule or known transmission period from the render device to the display device.
  • rendering video frames may be at regular intervals, with the MSDUs available for transmission at the regular intervals (which may vary within a tolerance amount of time associated with any latencies, such as UL latencies or rendering latencies).
  • the render device may determine if the current time is within a threshold amount of time that MSDUs are to be transmitted to the display device.
  • the render device may delay transmission to the AP to synchronize the transmissions to the AP and to the display device.
  • the render device may reduce the BO associated with the second wireless link to synchronize transmissions over the first wireless link and over the second wireless link.
  • Transmissions to the display device may be shorter than transmissions to the AP. If the transmission to the display device ends first, a BA may be transmitted by the display device while the render device still is transmitting to the AP (which may affect the render device receiving the BA). In some implementations of synchronizing transmissions, the render device may pad the transmission to the display device to cause an end of transmission to the AP to be synchronized with an end of transmission to the display device.
  • Example padding may include zero padding or adding any suitable tail to the transmission to the display device to synchronize the ends of transmission.
  • the render device may determine the amount of padding to be applied based on the amount of data scheduled to be transmitted to the AP, the amount of data scheduled to be transmitted to the display device, and the measured throughputs over the first wireless link and over the second wireless link.
  • a reduced BO associated with the second wireless link may compensate for the padding so that the latency over the second wireless link is not impacted.
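A rough sketch of the padding computation follows, assuming airtime is simply data size divided by measured throughput; a real implementation would also account for preambles, MCS, and aggregation overhead:

```python
def padding_bytes(ap_bytes: int, ap_tput_bps: float,
                  disp_bytes: int, disp_tput_bps: float) -> int:
    """Zero-padding (in bytes) appended to the display-device transmission
    so that it ends at the same time as the transmission to the AP."""
    ap_airtime = ap_bytes * 8 / ap_tput_bps        # seconds to finish AP data
    disp_airtime = disp_bytes * 8 / disp_tput_bps  # seconds to finish XR data
    extra_time = max(0.0, ap_airtime - disp_airtime)
    return int(round(extra_time * disp_tput_bps / 8))
```

For example, 12000 bytes to the AP at 48 Mb/s take 2 ms of airtime while 6000 bytes to the display device at 96 Mb/s take 0.5 ms, so 18000 bytes of padding would align the two end times; if the display transmission is already the longer one, no padding is added.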
  • the render device may prevent transmission to the AP outside of one or more time division multiplexing (TDM) windows.
  • a TWT session may be conceptualized as a form of time division multiplexing of the wireless medium.
  • communications between a render device and a display device may be during the TWT windows, and communications between the render device and another device (such as an AP or a STA) may be outside of the TWT windows.
  • some legacy devices do not support TWT.
  • a wireless system including one or more legacy devices may be configured to share the wireless medium using TDM windows.
  • an AP of a BSS including the render device and the display device may configure one or more TDM windows to communicate with the render device.
  • Configuring the TDM windows may include configuring a window length, periodicity, and other parameters for the TDM window.
  • transmissions from the render device to the AP are during the one or more TDM windows, and transmissions from the render device to the display device (such as PPDUs transmitted to the display device) are outside of the one or more TDM windows.
  • a maximum delay to transmit to the display device is based on a remaining length of a current TDM window.
  • the render device may reduce a maximum PPDU duration or a burst duration.
  • the PPDU duration may be based on the PPDU size and throughput. If the maximum PPDU duration is reduced, the maximum amount of time to transmit a PPDU may be reduced.
  • the burst duration may be the amount of time the render device is to transmit PPDUs to the display device. Shortening the burst duration shortens the amount of time that the wireless medium is to be reserved exclusively for transmissions to the display device.
  • Shortening the amount of time that is to be reserved for transmitting a PPDU or a transmission duration to the display device allows the render device to more quickly schedule a new PPDU transmission or a new TXOP associated with the burst duration. In this manner, the potential delay for transmission to the display device may be reduced by reducing the maximum PPDU duration or the burst duration.
  • the render device may interrupt transmission to the AP so that the render device no longer is transmitting to the AP.
  • one or more MSDUs are ready to be transmitted by the render device to the display device while the render device is transmitting to the AP.
  • the packets to be transmitted to the AP are included in a transmit buffer of the WCD of the render device.
  • the render device may flush the packets from the transmit buffer associated with the transmission to the AP. With the transmit buffer empty, the transmission to the AP ends. In this manner, when the render device performs CCA for transmitting over the second wireless link, the CCA may indicate that the wireless medium is free for the render device to transmit to the display device over the second wireless link.
  • the render device may use CTS to self (CTS-to-Self) frames to extend reservation of a wireless medium for transmitting to the display device. For example, when the render device transmits to the AP, the wireless medium is reserved for the render device (such as described above regarding the CTS/RTS mechanism between the render device and the display device). When the reservation ends, other devices may contend for the wireless medium or be scheduled to transmit over the wireless medium, and the render device may wait for the other devices to finish occupying the wireless medium.
  • the render device determines an amount of time to transmit the MSDUs (such as based on the amount of data in one or more MSDU queues associated with the second wireless link).
  • the render device also determines the remaining amount of time to transmit to the AP (such as based on the amount of data remaining in the transmit buffer associated with the first wireless link).
  • the render device may determine if any time from the reservation will be remaining based on the remaining amount of time to transmit to the AP, and the render device may determine an additional amount of time to the reservation to allow transmitting the MSDUs to the display device during the same reservation.
  • the render device may broadcast a CTS-to-Self frame.
  • the CTS-to-Self frame may indicate to pad the time period that the wireless medium is reserved (such as a value to be added to the current NAV of each device obtaining the CTS-to-Self frame).
  • the padded time period is of sufficient length to include the transmission to the AP and the transmission to the display device.
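The reservation extension described above can be estimated as follows. This sketch assumes airtime is data size divided by throughput and ignores inter-frame spacing and acknowledgment overhead:

```python
def nav_extension_us(ap_bytes: int, ap_tput_bps: float,
                     disp_bytes: int, disp_tput_bps: float,
                     remaining_reservation_us: float) -> int:
    """Extra reservation time (in µs) a CTS-to-Self frame would advertise so
    the current medium reservation also covers the MSDUs queued for the
    display device."""
    ap_time_us = ap_bytes * 8 / ap_tput_bps * 1e6      # finish AP transmission
    disp_time_us = disp_bytes * 8 / disp_tput_bps * 1e6  # send queued MSDUs
    needed_us = ap_time_us + disp_time_us
    return max(0, int(round(needed_us - remaining_reservation_us)))
```

If the remaining reservation already covers both transmissions, the extension is zero and no CTS-to-Self padding is needed.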
  • Another example concurrent communication includes the render device transmitting to the AP and the render device obtaining one or more packets from the display device.
  • the render device may be transmitting to the AP over the first wireless link when the display device transmits a pose data frame or tracking frame over the second wireless link. If the render device is transmitting over the second wireless link when packets are being transmitted by the display device to the render device, the render device may be unable to obtain the packets from the display device. With the render device not obtaining the packets from the display device, the render device does not provide a BA to the display device, and the display device retransmits the packets to the render device.
  • the display device may continue attempting to transmit the packets until the packets are successfully delivered or a maximum number of retries occurs.
  • the transmission parameters may be adjusted to attempt to increase the likelihood that the packets are delivered. For example, the display device lowers the MCS, uses FEC, or otherwise lowers the transmission rate. The transmission rate may continue to be lowered, and the lowered transmission rate would cause issues in meeting latency requirements for the XR experience.
  • the MLO techniques may be enhanced to avoid or reduce the transmission delay or prevent adjusting the transmission parameters at the display device that may negatively impact a communication latency between the display device and the render device.
  • one or more TDM windows may be used to prevent packets being transmitted by the display device during the TDM windows.
  • the TDM windows reserve the wireless medium for communications between the render device and the AP (such as described above). For example, transmission to the AP is during the one or more TDM windows, and reception from the display device is outside of the one or more TDM windows. In this manner, the render device is prevented from obtaining the one or more packets from the display device during the one or more TDM windows.
  • the render device may reduce the maximum PPDU duration or the burst duration. As noted above, reducing the maximum PPDU duration or the burst duration reduces the amount of time the render device transmits to the AP. In this manner, the amount of time that the display device is to retry transmitting to the render device is reduced (which reduces the delay associated with the display device transmitting to the render device).
  • the render device may indicate to the display device that the wireless medium is to be busy for an amount of time associated with transmitting to the AP. For example, the render device may provide a frame to the display device indicating a NAV value to reserve the wireless medium for the duration of transmitting to the AP. In some implementations, the NAV value may be included in a PPDU transmitted from the render device to the display device. In this manner, the display device sets its NAV to the indicated NAV value to prevent transmitting to the render device during the indicated duration.
  • While the use of TDM windows and other MLO techniques is described with reference to legacy devices that do not support TWT, the described techniques may be implemented in any device (including devices supporting TWT). In addition or in the alternative, the techniques may be applied outside of MLO, and the techniques are not limited to a specific concurrent link support implementation.
  • a render device may use TWT to support concurrent links.
  • the render device (or the display device) may configure a first TWT session with the display device for communications with the display device over the second wireless link, and the render device may configure a second TWT session with the AP for communications with the AP over the first wireless link.
  • the first TWT session includes a first plurality of TWT windows, and the first plurality of TWT windows is associated with XR activity at the render device (such as obtaining one or more pose data frames or tracking frames from the display device, rendering one or more video frames, and providing the one or more video frames via one or more PPDUs to the display device).
  • the second TWT session includes a second plurality of TWT windows interspersed between the first plurality of TWT windows and not overlapping with the first plurality of TWT windows.
  • the TWT windows may alternate (such as a TWT window from the first plurality, a TWT window from the second plurality, a TWT window from first plurality, and so on).
  • the render device communicates with the display device during the first plurality of TWT windows, and the render device communicates with the AP during the second plurality of TWT windows.
  • the first TWT session is configured by the render device or the display device to meet any latency or packet loss requirements associated with an XR experience.
  • the render device may initiate configuration of the second TWT session with the AP, with the second TWT session to include TWT windows outside of the TWT windows of the first TWT session.
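One way to derive the interleaving is to carve each XR period into a primary window followed by a secondary window, as in this sketch (times in microseconds; the function and parameter names are illustrative, not from this disclosure):

```python
def interleaved_twt_schedules(t0_us: int, xr_window_us: int,
                              period_us: int, n_periods: int):
    """Return (primary, secondary) lists of (start, end) TWT windows.

    Each period starts with a primary (XR) window; the secondary (AP)
    window fills the remainder of the period, so the two sessions never
    overlap and simply alternate.
    """
    primary, secondary = [], []
    for k in range(n_periods):
        start = t0_us + k * period_us
        primary.append((start, start + xr_window_us))
        secondary.append((start + xr_window_us, start + period_us))
    return primary, secondary
```

Each secondary window ends exactly where the next primary window begins, which realizes the alternating, non-overlapping pattern described above.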
  • the first TWT session may be considered the primary TWT session, and the second TWT session may be considered a secondary TWT session.
  • the AP may be an IEEE 802.11ax enabled AP to support TWT (including the second TWT session with the render device).
  • the first TWT session may be adjusted (such as adjusting TWT window lengths, interval, and so on).
  • the render device or the display device may adjust the first TWT session without reference to the second TWT session.
  • the first TWT session may be adjusted in a manner that would cause a conflict with the second TWT session (such as overlapping TWT windows from the two TWT sessions).
  • the render device may initiate an adjustment of the second TWT session with the AP to avoid overlapping TWT windows. In this manner, the second TWT session is adjusted based on adjustments to the first TWT session.
  • the render device may act as a STA to the AP and may act as an SAP to the display device.
  • the render device may be conceptualized as having a STA-associated MAC and an SAP-associated MAC supported by the WCD of the render device.
  • the SAP-associated MAC is used in configuring and adjusting the first TWT session with the display device
  • the STA-associated MAC is used in configuring and adjusting the second TWT session with the AP.
  • the render device may negotiate with the AP to configure or adjust the second TWT session.
  • the AP may have its own TWT requirements (such as based on supporting other devices in the BSS).
  • the render device may configure the second TWT session with the AP to ensure the AP’s requirements are met.
  • the render device also may configure the first TWT session before configuring the second TWT session to support the AP’s requirements while also meeting any latency and packet loss requirements associated with the XR experience.
  • the TWT windows of the second TWT session are scheduled to take into account a S2W duration of the WCD for the STA-associated MAC.
  • the TWT windows of the second TWT session are scheduled so that the S2W duration does not overlap with XR activity (such as receiving from and transmitting to the display device as described above).
  • Figure 23 shows a sequence diagram 2300 illustrating example timings of XR activity and wireless activity with an AP (referred to as wireless activity).
  • a first XR activity 2302 (such as from receiving a pose data frame to providing a current video frame over the second wireless link) has a start time 2304
  • a second XR activity 2306 (such as from receiving a next pose data frame to providing a next video frame over the second wireless link) has a start time 2308.
  • the XR activity 2302 corresponds to a first TWT window of a first TWT session
  • the XR activity 2306 corresponds to a second TWT window of the first TWT session.
  • the XR activity is periodic (such as based on the display rate or render rate of the video), and a period duration 2310 of the periodic activity is a difference between start times 2308 and 2304.
  • Wireless activity 2312 (associated with communications between the render device and the AP over the first wireless link) is to be scheduled outside of the XR activities 2302 and 2306 (such as between TWT windows corresponding to the XR activities 2302 and 2306).
  • the wireless activity 2312 has a start time 2314 and an end time 2316 to be scheduled.
  • the wireless activity 2312 may correspond to a first TWT window of a second TWT session.
  • the start time 2314 may be more than a buffer 2318 after the XR activity 2302.
  • one or more PPDUs may be transmitted to the display device after a TWT window, or the time at which PPDUs are delivered to the display device may vary depending on channel conditions or other latencies.
  • the duration of the XR activity 2302 may vary.
  • the buffer 2318 may be for a defined amount of time to account for variations in the duration of the XR activities (including XR activity 2302). As depicted, start time 2314 is after buffer 2318.
  • Start time 2314 also may be scheduled to account for a S2W duration at the render device to ensure reception and transmission with the display device (or the AP) is not affected by one or more WCD components coming out of a low power mode.
  • a generic XR activity start time is x
  • the period duration is PR
  • the XR activity duration (such as the length of the XR activity 2302) is AR
  • the buffer duration is z
  • a generic wireless activity start time is y
  • the period duration of the wireless activity is Pw (such as a TWT wake interval, the interval between TWT windows during which time the WCD is to remain in an active mode)
  • the duration of the wireless activity (the difference between end time 2316 and start time 2314) is Aw
  • the wireless activity is scheduled not to overlap with the XR activity for any cycle of the wireless activity (nw, assuming the wireless activity is cyclic) and any cycle of the XR activity (nR), which is mathematically depicted in equations (1) - (3) below:

x + nR·PR + AR + z ≤ y + nw·Pw (1)

y + nw·Pw + Aw ≤ x + (nR + 1)·PR (2)

Pw = k·PR for some positive integer k (3)
  • start time 2304 may be x+nRPR for any cycle nR of XR activity
  • start time 2308 may be x+(nR+1)PR
  • the end of XR activity 2302 may be x+nRPR+AR
  • the end of XR activity 2306 may be x+(nR+1)PR+AR.
  • Start time 2314 may be y+nwPw
  • end time 2316 may be y+nwPw+Aw.
  • Pw may be configured to be one or more multiples of PR. If TWT is used for communications between the render device and the display device, PR is a constant value across all cycles of XR activity (assuming no adjustments are made to the XR activity, such as based on feedback).
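The non-overlap constraint can be checked numerically. This sketch verifies, for one XR cycle and one wireless cycle, that the wireless window starts at least the buffer z after the XR activity ends and finishes before the next XR activity begins; the variable names follow the definitions above:

```python
def wireless_window_fits(x: float, PR: float, AR: float, z: float,
                         y: float, PW: float, AW: float,
                         nR: int, nW: int) -> bool:
    """True if wireless cycle nW does not overlap XR cycle nR."""
    w_start = y + nW * PW                 # start time 2314
    w_end = w_start + AW                  # end time 2316
    xr_end = x + nR * PR + AR             # end of the current XR activity
    next_xr_start = x + (nR + 1) * PR     # start of the next XR activity
    return w_start >= xr_end + z and w_end <= next_xr_start
```

A full scheduler would evaluate this over all relevant (nR, nW) pairs; when Pw is a multiple of PR, the offsets repeat and the check reduces to a single period.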
  • the render device may use TDM windows that are not defined in a TWT session.
  • one or more MLO techniques may include the use of TDM windows.
  • PR may vary between different cycles of XR activity (such as resulting from delays in transmission or reception resulting from concurrent communications).
  • the render device may schedule wireless activity for such TDM windows similar to as above with reference to equations (1) - (3), but regarding a varying PR.
  • the render device may attempt to determine a common multiple of possible PR that may occur based on possible delays or variations in the XR activity. In this manner, the wireless activity 2312 is ensured not to overlap with any of the XR activities (including XR activities 2302 and 2306).
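The non-overlap condition above can be illustrated with a minimal sketch (variable names are illustrative, not from the specification): a wireless window of cycle nw must start at least the buffer z after the preceding XR activity ends, and must finish before the next XR activity begins.

```python
def wireless_window_clear(x, P_r, A_r, z, y, P_w, A_w, n_w):
    """Return True if wireless cycle n_w avoids every XR activity.

    XR cycle n_r occupies [x + n_r*P_r, x + n_r*P_r + A_r]; the wireless
    window must start at least the buffer z after an XR activity ends
    and must finish before the next XR activity begins.
    """
    start = y + n_w * P_w          # start time of the wireless window
    end = start + A_w              # end time of the wireless window
    n_r = int((start - x) // P_r)  # XR cycle preceding this window
    after_previous = start >= x + n_r * P_r + A_r + z
    before_next = end <= x + (n_r + 1) * P_r
    return after_previous and before_next
```

For example, with a 20 ms XR period, a 5 ms XR activity, and a 2 ms buffer, a wireless window starting 7 ms into the period stays clear of the XR activity for every cycle when Pw equals PR.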
  • the application layer clock and the WCD clock of a device may have different fidelities.
  • the application layer clock may be able to indicate a value with more fidelity (also referred to as granularity) than the WCD clock.
  • the difference in fidelity also may be caused by PR being indicated in a different manner than Pw.
  • PR may be indicated as a time per number of cycles of XR activity (such as 50 ms per 3 cycles).
  • Pw (which is to equal PR) may be determined per cycle of wireless activity. For example, if PR is 50 ms per 3 cycles (which would be 50/3 ms per cycle) and Pw is to equal PR, the render device may approximate Pw based on a fidelity of the WCD clock (such as 1 μs being the minimum unit of time).
  • Pw would be restricted to being indicated as 16.666 ms or 16.667 ms, which is an approximation of 50/3 ms.
  • the period duration of wireless activity 2312 differs from the period duration 2310 of XR activity (such as by up to 1 μs per cycle in the example).
  • the wireless activity shifts 3.88 ms every 194 seconds with reference to the XR activity.
  • the buffer is configured to account for an amount of drift in the wireless activity over time resulting from the approximation error in measuring Pw. If the buffer 2318 is 3.88 ms, the wireless activity may begin to overlap the previous XR activity after 194 seconds.
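The drift figures above follow from accumulating the per-cycle approximation error. A short sketch (hypothetical function name, assuming simple rounding of the period to the WCD clock granularity) reproduces the 50/3 ms period, 1 μs granularity, 3.88 ms buffer example:

```python
def seconds_until_buffer_consumed(p_r_ms, granularity_ms, buffer_ms):
    """Time (in seconds) until the per-cycle approximation error in Pw
    accumulates past the scheduling buffer."""
    # Approximate the ideal period to the WCD clock's minimum unit.
    p_w_ms = round(p_r_ms / granularity_ms) * granularity_ms
    error_per_cycle_ms = abs(p_w_ms - p_r_ms)
    if error_per_cycle_ms == 0.0:
        return float("inf")  # no approximation error, no drift
    cycles = buffer_ms / error_per_cycle_ms
    return cycles * p_r_ms / 1000.0

# P_R = 50/3 ms per cycle, 1 us granularity, 3.88 ms buffer:
# the wireless activity drifts into the XR activity after ~194 s.
```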
  • drift between XR activity and wireless activity may occur based on a difference in resonance frequencies associated with an application layer clock and a WCD clock of a device.
  • different crystals may be used for the application layer clock and the WCD clock.
  • the crystals may have different resonance frequencies upon which the clocks are based.
  • the crystals may be affected differently by environmental conditions (such as temperature or moisture) that vary the resonance frequencies.
  • the device may assume that the resonance frequencies are the same, but the wireless activity may drift over time with reference to the XR activity based on a difference in resonance frequencies.
  • the wireless activity may drift 3.88 ms every 194 seconds. If the buffer 2318 is 3.88 ms, the wireless activity may begin to overlap the previous XR activity after 194 seconds.
  • One or both of the TWT sessions may be adjusted based on the drift.
  • the second TWT session periodically may be adjusted to prevent TWT windows from the second TWT session overlapping with TWT windows from the first TWT session.
  • the render device may measure a drift between the first plurality of TWT windows from the first TWT session and the second plurality of TWT windows from the second TWT session.
  • determining the first TWT session may be based on the application layer clock of the render device.
  • the second TWT session may be based on a WCD clock of the render device.
  • the render device may adjust a timing of one or more of the first plurality of TWT windows or the second plurality of TWT windows.
  • the render device may initiate adjusting the second TWT session to prevent overlap of the second plurality of TWT windows with the first plurality of TWT windows.
  • the adjustment of the second TWT session may be at a defined interval of time based on a measured drift. For example, if the drift is based on an approximation of Pw, the render device may determine a defined interval to adjust the second TWT session based on the approximation error and the size of the buffer. In the above example of Pw being approximated to be 16.666 ms, PR being 50/3 ms, and the buffer being 3.88 ms, the render device may determine the defined interval to be less than 194 seconds.
  • the render device may determine the defined interval to be less than 194 seconds.
  • the defined interval may be based on both types of drift (such as being based on the feature causing the most drift between the two).
  • the render device may attempt to determine if XR activity and wireless activity begin to overlap. For example, the render device may provide a schedule, or the WCD otherwise may estimate when XR activity is to occur (such as based on the interval at which the WCD is transmitting or receiving over the second wireless link), and the WCD periodically may determine if the wireless activity is to overlap with estimated XR activity.
  • the TWT windows of the second TWT session with the AP may be moved forward or delayed based on the determination to prevent overlap with TWT windows of the first TWT session.
  • aspects of the present disclosure include a wireless system for giving preference to communications between two devices over other devices while still sharing the wireless medium with the other devices. Preference may be given for communications between a render device and a display device of an XR experience to meet latency and packet loss requirements. In some implementations, preference may be given via asynchronous channel control operations. In some implementations, preference may be given via synchronous channel control operations. Sharing the wireless medium may be through the use of TWT sessions or MLO techniques (such as for concurrent link support). Aspects of the present disclosure also include generating and providing feedback associated with an XR experience between the render device and the display device.
  • Communications between the devices may be adjusted based on the feedback to ensure meeting latency and packet loss requirements for the XR experience.
  • an overall system (such as a BSS or mesh network) including one or more render devices and one or more display devices) may be configured to support an XR experience.
  • a method performed by a wireless communication device for an extended reality (XR) experience including: obtaining control of a wireless medium, where: control of the wireless medium is associated with a first priority of transmitting, over the wireless medium, a first physical layer protocol data unit (PPDU) of an application file from the wireless communication device to a second device; and the first priority is different than a second priority of transmitting data from the second device to the wireless communication device over the wireless medium; providing the first PPDU to the second device; and providing one or more subsequent PPDUs of the application file to the second device, where providing the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • XR extended reality
  • the first priority is associated with a first set of enhanced distributed channel access (EDCA) parameters
  • the second priority is associated with a second set of EDCA parameters
  • the third priority is associated with a third set of EDCA parameters.
  • EDCA enhanced distributed channel access
  • EDCA parameters include one or more of: an arbitration interframe spacing number (AIFSN); a minimum contention window (CWmin); or a maximum contention window (CWmax).
  • AIFSN arbitration interframe spacing number
  • CWmin minimum contention window
  • CWmax maximum contention window
  • the first priority is associated with a first backoff counter value, where the first backoff counter value is used by a backoff counter of the wireless communication device to contend for the wireless medium for the first PPDU;
  • the second priority is associated with a second backoff counter value greater than the first backoff counter value;
  • the third priority is associated with a third backoff counter value greater than the second backoff counter value, where the third backoff counter value is used by the backoff counter of the wireless communication device to contend for the wireless medium for the one or more subsequent PPDUs.
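The three priority tiers above can be sketched as three EDCA parameter sets, where smaller AIFSN and contention-window values contend more aggressively. The parameter values and names below are illustrative assumptions, not taken from the claims:

```python
import random

# Hypothetical EDCA parameter sets (AIFSN / CWmin / CWmax): the first
# PPDU of an application file contends most aggressively, subsequent
# PPDUs least aggressively.
EDCA_SETS = {
    "first_ppdu":       {"aifsn": 1, "cw_min": 3,  "cw_max": 7},
    "uplink_data":      {"aifsn": 2, "cw_min": 7,  "cw_max": 15},
    "subsequent_ppdus": {"aifsn": 3, "cw_min": 15, "cw_max": 63},
}

def contention_cost(traffic, rng):
    """AIFS wait plus a random backoff drawn from [0, CWmin]."""
    p = EDCA_SETS[traffic]
    return p["aifsn"] + rng.randint(0, p["cw_min"])

def channel_winner(contenders, rng):
    """The contender with the lowest total wait wins the medium."""
    return min(contenders, key=lambda t: contention_cost(t, rng))
```

Over many contention rounds the first-PPDU class wins the medium far more often than the subsequent-PPDU class, which is the sense in which the first priority differs from the third.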
  • the wireless communication device is included in a software enabled access point (SAP); the second device includes a head mounted display (HMD); and the application file is associated with a video frame to be displayed by the HMD.
  • SAP software enabled access point
  • HMD head mounted display
  • a wireless communication device configured for an extended reality (XR) experience, including: a processing system; and an interface configured to: obtain control of a wireless medium, where: control of the wireless medium is associated with a first priority of transmitting, over the wireless medium, a first physical layer protocol data unit (PPDU) of an application file from the wireless communication device to a second device; and the first priority is different than a second priority of transmitting data from the second device to the wireless communication device over the wireless medium; provide the first PPDU to the second device; and provide one or more subsequent PPDUs of the application file to the second device, where providing the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • XR extended reality
  • the first priority is associated with a first set of enhanced distributed channel access (EDCA) parameters
  • the second priority is associated with a second set of EDCA parameters
  • the third priority is associated with a third set of EDCA parameters.
  • EDCA enhanced distributed channel access
  • EDCA parameters include one or more of: an arbitration interframe spacing number (AIFSN); a minimum contention window (CWmin); or a maximum contention window (CWmax).
  • AIFSN arbitration interframe spacing number
  • CWmin minimum contention window
  • CWmax maximum contention window
  • the first priority is associated with a first backoff counter value, where the first backoff counter value is used by a backoff counter of the wireless communication device to contend for the wireless medium for the first PPDU;
  • the second priority is associated with a second backoff counter value greater than the first backoff counter value;
  • the third priority is associated with a third backoff counter value greater than the second backoff counter value, where the third backoff counter value is used by the backoff counter of the wireless communication device to contend for the wireless medium for the one or more subsequent PPDUs.
  • BSS basic service set
  • OBSS other BSS
  • the wireless communication device or the second device obtains control of the wireless medium based on the first priority or the second priority.
  • RTS request to send
  • NAV network allocation vector
  • MAC media access control layer
  • SAP software enabled access point
  • HMD head mounted display
  • a method performed by a wireless communication device including: obtaining a first physical layer protocol data unit (PPDU) of an application file from a second device over a wireless medium, where: the second device obtains control of the wireless medium; control of the wireless medium is associated with a first priority of transmitting the first PPDU over the wireless medium; and the first priority is different than a second priority of transmitting data from the wireless communication device to the second device over the wireless medium; and obtaining one or more subsequent PPDUs of the application file from the AP, where obtaining the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • PPDU physical layer protocol data unit
  • EDCA parameters include one or more of: an arbitration interframe spacing number (AIFSN); a minimum contention window (CWmin); or a maximum contention window (CWmax).
  • AIFSN arbitration interframe spacing number
  • CWmin minimum contention window
  • CWmax maximum contention window
  • RTS frame includes an indication of a network allocation vector (NAV) to maintain control of the wireless medium for a first time period, where the first time period is greater than a first transmit opportunity (TXOP) during which the first PPDU is obtained from the AP; providing a first clear to send (CTS) frame to the second device after obtaining the first RTS frame; and obtaining, from the AP, a second RTS frame after the first TXOP and before an end of the first time period, where: the second RTS frame includes an indication of the NAV to extend the first time period to cover a plurality of TXOPs; and obtaining the first PPDU is after providing the first CTS frame and during the first TXOP.
  • NAV network allocation vector
  • TXOP transmit opportunity
  • CTS clear to send
  • the second device includes a software enabled AP (SAP); the wireless communication device is included in a head mounted display (HMD); and the application file is associated with a video frame to be displayed by the HMD.
  • SAP software enabled AP
  • HMD head mounted display
  • a wireless communication device configured for an extended reality (XR) experience, including: a processing system; and an interface configured to: obtain a first physical layer protocol data unit (PPDU) of an application file from a second device over a wireless medium, where: the second device obtains control of the wireless medium; control of the wireless medium is associated with a first priority of transmitting the first PPDU over the wireless medium; and the first priority is different than a second priority of transmitting data from the wireless communication device to the second device over the wireless medium; and obtain one or more subsequent PPDUs of the application file from the AP, where obtaining the one or more subsequent PPDUs is associated with a third priority of transmitting the one or more subsequent PPDUs over the wireless medium.
  • XR extended reality
  • EDCA parameters include one or more of: an arbitration interframe spacing number (AIFSN); a minimum contention window (CWmin); or a maximum contention window (CWmax).
  • AIFSN arbitration interframe spacing number
  • CWmin minimum contention window
  • CWmax maximum contention window
  • the wireless communication device of one or more of clauses 45 - 46 where: the wireless communication device and the second device are included in a first basic service set (BSS); the wireless communication device and the second device are within range of a device in an other BSS (OBSS); the OBSS device contends for the wireless medium to transmit high importance classified data associated with a fourth priority; and the wireless communication device or the second device obtains control of the wireless medium based on the first priority or the second priority.
  • BSS basic service set
  • OBSS other BSS
  • the wireless communication device or the second device obtains control of the wireless medium based on the first priority or the second priority.
  • the wireless communication device of one or more of clauses 45 - 46 or 50 where: the second device contends for the wireless medium to transmit one or more subsequent PPDUs associated with the third priority; the wireless communication device prevents contending for the wireless medium; and the OBSS device obtains control of the wireless medium based on the fourth priority.
  • RTS request to send
  • NAV network allocation vector
  • MAC media access control layer
  • CF contention free
  • the second device obtains control of the wireless medium to provide one or more PPDUs of a second application file to the wireless communication device; a first MSDU of the second application file includes metadata identifying the first MSDU of the second application file; the first MSDU is associated with the first set of EDCA parameters; and obtaining control of the wireless medium by the second device is associated with the first set of EDCA parameters.
  • the wireless communication device of one or more of clauses 45 - 57 , where: the second device includes a software enabled AP (SAP); the wireless communication device is included in a head mounted display (HMD); and the application file is associated with a video frame to be displayed by the HMD.
  • SAP software enabled AP
  • HMD head mounted display
  • a method performed by a device for an extended reality (XR) experience including: obtaining, from a second device, uplink (UL) data over a wireless medium; and providing, to the second device, downlink (DL) data including physical layer protocol data units (PPDUs) over the wireless medium, where: one or more PPDUs are provided to the second device during a current target wake time (TWT) window; and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided to the second device; or when the first PPDU is provided from an application layer to a media access control layer (MAC) of the device.
  • XR extended reality
  • synchronizing the WCD clock to the application layer clock includes: aligning the beginning of the current TWT window to a first time before a render time to begin rendering the first video frame, where the first time precedes the render time by a first offset and where the first offset is associated with the M2R latency.
  • synchronizing the application layer clock and the WCD clock of the device includes synchronizing the application layer clock to the WCD clock, where synchronizing the application layer clock to the WCD clock is associated with a timing synchronization function (TSF) at the MAC of the device.
  • TSF timing synchronization function
  • synchronizing the application layer clock to the WCD clock includes: aligning a render time to begin rendering the first video frame to a first time after the beginning of the current TWT window, where the first time succeeds the beginning of the current TWT window by a first offset and where the first offset is associated with the M2R latency.
  • the device includes a software enabled AP (SAP); and the second device includes a head mounted display (HMD) to display the video frames.
  • SAP software enabled AP
  • HMD head mounted display
  • a device configured for an extended reality (XR) experience including: a processing system; and an interface configured to: obtain, from a second device, uplink (UL) data over a wireless medium; and provide, to the second device, downlink (DL) data including physical layer protocol data units (PPDUs) over the wireless medium, where: one or more PPDUs are provided to the second device during a current target wake time (TWT) window; and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided to the second device; or when the first PPDU is provided from an application layer to a media access control layer (MAC) of the device.
  • XR extended reality
  • the processing system is configured to render video frames to be displayed by the second device, where: the device includes a render device; the second device includes a display device; the UL data includes pose data frames; and the rendering of the video frames is associated with the obtained pose data frames, where: a first video frame of the video frames includes a plurality of slices; rendering of the first video frame is associated with a first pose data frame most recently obtained from the second device; and one or more PPDUs are associated with one or more of the plurality of slices.
  • synchronizing the WCD clock to the application layer clock includes: aligning the beginning of the current TWT window to a first time before a render time to begin rendering the first video frame, where the first time precedes the render time by a first offset and where the first offset is associated with the M2R latency.
  • synchronizing the application layer clock and the WCD clock of the device includes synchronizing the application layer clock to the WCD clock, where synchronizing the application layer clock to the WCD clock is associated with a timing synchronization function (TSF) at the MAC of the device.
  • TSF timing synchronization function
  • synchronizing the application layer clock to the WCD clock includes: aligning a render time to begin rendering the first video frame to a first time after the beginning of the current TWT window, where the first time succeeds the beginning of the current TWT window by a first offset and where the first offset is associated with the M2R latency.
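Both synchronization directions described above amount to offsetting one timeline by the M2R latency, a sketch under illustrative names (not from the claims):

```python
def twt_start_for_render(render_time_us, m2r_offset_us):
    """WCD clock synchronized to the application layer clock: the TWT
    window opens an M2R-latency offset before rendering begins."""
    return render_time_us - m2r_offset_us

def render_time_for_twt(twt_start_us, m2r_offset_us):
    """Application layer clock synchronized to the WCD clock: rendering
    begins an M2R-latency offset after the TWT window opens."""
    return twt_start_us + m2r_offset_us
```

The two functions are inverses: whichever clock is the reference, the other timeline is shifted by the same first offset.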
  • a device for an extended reality (XR) experience including: providing uplink (UL) data to a second device over a wireless medium; and obtaining, from the second device, downlink (DL) data including physical layer protocol data units (PPDUs) over the wireless medium, where: one or more PPDUs are obtained from the second device during a current target wake time (TWT) window; and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided by the second device; or when the first PPDU is provided from an application layer to a media access control layer (MAC) of the second device.
  • XR extended reality
  • the second device is to render video frames to be displayed by the device, where: the second device includes a render device; the device includes a display device; the UL data includes pose data frames; and the rendering of the video frames is associated with the provided pose data frames, where: a first video frame of the video frames includes a plurality of slices; rendering of the first video frame is associated with a first pose data frame most recently obtained by the second device; and one or more PPDUs are associated with one or more of the plurality of slices.
  • synchronizing the WCD clock to the application layer clock includes: aligning the beginning of the current TWT window to a first time the first pose data is provided to the second device, where: the first time precedes a render time that the second device is to begin rendering the first video frame; the first time precedes the render time by a first offset; and the first offset is associated with the M2R latency.
  • the second device includes a software enabled AP (SAP); and the device includes a head mounted display (HMD) to display the video frames.
  • SAP software enabled AP
  • HMD head mounted display
  • a device configured for an extended reality (XR) experience including: a processing system; and an interface configured to: provide uplink (UL) data to a second device over a wireless medium; and obtain, from the second device, downlink (DL) data including physical layer protocol data units (PPDUs) over the wireless medium, where: one or more PPDUs are obtained from the second device during a current target wake time (TWT) window; and a beginning of the current TWT window is associated with one of: when a first PPDU of the one or more PPDUs is provided by the second device; or when the first PPDU is provided from an application layer to a media access control layer (MAC) of the second device.
  • XR extended reality
  • the second device is to render video frames to be displayed by the device, where: the second device includes a render device; the device includes a display device; the UL data includes pose data frames; and the rendering of the video frames is associated with the provided pose data frames, where: a first video frame of the video frames includes a plurality of slices; rendering of the first video frame is associated with a first pose data frame most recently obtained by the second device; and one or more PPDUs are associated with one or more of the plurality of slices.
  • the interface is further configured to: obtain, from the second device, a packet including a MAC header including a power management (PM) field indicating that the second device is not to enter into a low power mode after the current TWT window; and obtain, from the second device, a PPDU associated with a video slice of the first video frame after the current TWT window and before a next TWT window.
  • PM power management
  • synchronizing the WCD clock to the application layer clock includes: aligning the beginning of the current TWT window to a first time the first pose data is provided to the second device, where: the first time precedes a render time that the second device is to begin rendering the first video frame; the first time precedes the render time by a first offset; and the first offset is associated with the M2R latency.
  • the interface is further configured to indicate, to the second device, a time of the beginning of the current TWT window, where: the indication of the time is associated with a timing synchronization function (TSF) at the MAC of the device after the WCD clock is synchronized to the application layer clock; a WCD clock of the second device is synchronized to the WCD clock of the device, where synchronization of the WCD clocks are associated with the TSF of the device and a TSF at a MAC of the second device; and an application layer clock of the second device is synchronized to the WCD clock of the second device, where synchronization of the application layer clock of the second device to the WCD clock of the second device is associated with the TSF of the second device.
  • TSF timing synchronization function
  • synchronizing the application layer clock and the WCD clock of the device includes synchronizing the application layer clock to the WCD clock, where synchronizing the application layer clock to the WCD clock is associated with a timing synchronization function (TSF) at the MAC of the device.
  • TSF timing synchronization function
  • synchronizing the application layer clock to the WCD clock includes: aligning a display time to a time associated with the current TWT window.
  • the second device includes a software enabled AP (SAP); and the device includes a head mounted display (HMD) to display the video frames.
  • SAP software enabled AP
  • HMD head mounted display
  • a method performed by a device including: rendering a plurality of video frames to be provided to a second device; splitting each video frame of the plurality of video frames into a plurality of video slices; and for each video slice of the plurality of video slices: generating a plurality of PPDUs to include the video slice, where: each PPDU includes one or more media access control layer (MAC) service data units (MSDUs) associated with the video slice; and the video slice is identified by a port number and a differentiated services field codepoint (DSCP) value included in each MSDU of the plurality of PPDUs; and queuing the MSDUs for transmission to the second device.
  • MAC media access control layer
  • MSDUs MAC service data units
  • DSCP differentiated services field codepoint
  • queuing the MSDUs includes generating a MSDU queue in software for each video slice, where each MSDU queue is identified by an internet protocol (IP) address, port number, and DSCP value.
  • IP internet protocol
  • each video slice is associated with a traffic identifier (TID) that is associated with an access category (AC) of the video slice.
  • TID traffic identifier
  • AC access category
  • the device includes a software enabled access point (SAP); and the second device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a device configured for an extended reality (XR) experience including: an interface; and a processing system configured to: render a plurality of video frames to be provided to a second device; split each video frame of the plurality of video frames into a plurality of video slices; and for each video slice of the plurality of video slices: generate a plurality of PPDUs to include the video slice, where: each PPDU includes one or more media access control layer (MAC) service data units (MSDUs) associated with the video slice; and the video slice is identified by a port number and a differentiated services field codepoint (DSCP) value included in each MSDU of the PPDU; and queue the MSDUs for transmission to the second device.
  • MAC media access control layer
  • MSDUs MAC service data units
  • DSCP differentiated services field codepoint
  • queuing the MSDUs includes generating a MSDU queue in software for each video slice, where each MSDU queue is identified by an internet protocol (IP) address, port number, and DSCP value.
  • IP internet protocol
  • each video slice is associated with a traffic identifier (TID) that is associated with an access category (AC) of the video slice.
  • TID traffic identifier
  • AC access category
  • the processing system is further configured to: render a first p-slice; generate a first MSDU queue associated with the first p-slice; render a second p-slice after rendering the first p-slice; and flush the first MSDU queue after rendering the second p-slice and before providing, to the second device, a PPDU including one or more MSDUs associated with the first p-slice.
  • the processing system is further configured to: render a first i-slice; generate a first MSDU queue associated with the first i-slice; render a second i-slice or p-slice after rendering the first i-slice; and generate a second MSDU queue associated with the second i-slice; and the interface is configured to provide, to the second device, a PPDU including one or more MSDUs associated with the first i-slice after generating the second MSDU queue.
  • the device includes a software enabled access point (SAP); and the second device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a method performed by a device including: obtaining, from a second device, one or more physical layer (PHY) protocol data units (PPDUs) associated with a video frame, where: the second device renders a plurality of video frames to be provided to the device; the second device splits each video frame of the plurality of video frames into a plurality of video slices; and for each video slice of the plurality of video slices: the second device generates a plurality of PPDUs to include the video slice, where: each PPDU includes one or more media access control layer (MAC) service data units (MSDUs) associated with the video slice; and the video slice is identified by a port number and a differentiated services field codepoint (DSCP) value included in each MSDU of the plurality of PPDUs; and the second device queues the MSDUs for transmission to the device.
  • PHY physical layer
  • PPDUs physical layer protocol data units
  • queuing the MSDUs includes generating a MSDU queue in software for each video slice, where each MSDU queue is identified by an internet protocol (IP) address, port number, and DSCP value.
  • IP internet protocol
  • each video slice is associated with a traffic identifier (TID) that is associated with an access category (AC) of the video slice.
  • TID traffic identifier
  • AC access category
  • the second device includes a software enabled access point (SAP); and the device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a device configured for an extended reality (XR) experience including: a processing system; and an interface configured to: obtain, from a second device, one or more physical layer (PHY) protocol data units (PPDUs) associated with a video frame, where: the second device renders a plurality of video frames to be provided to the device; the second device splits each video frame of the plurality of video frames into a plurality of video slices; and for each video slice of the plurality of video slices: the second device generates a plurality of PPDUs to include the video slice, where: each PPDU includes one or more media access control layer (MAC) service data units (MSDUs) associated with the video slice; and the video slice is identified by a port number and a differentiated services field codepoint (DSCP) value included in each MSDU of the plurality of PPDUs; and the second device queues the MSDUs for transmission to the device.
  • PHY physical layer
  • PPDUs physical layer protocol data units
  • queuing the MSDUs includes generating a MSDU queue in software for each video slice, where each MSDU queue is identified by an internet protocol (IP) address, port number, and DSCP value.
  • IP internet protocol
  • each video slice is associated with a traffic identifier (TID) that is associated with an access category (AC) of the video slice.
  • TID traffic identifier
  • AC access category
  • the interface is further configured to obtain, at a reorder (REO) queue of the device, a portion of MSDUs associated with a video slice; and the processing system is configured to flush the REO queue after not obtaining a remainder of the MSDUs associated with the video slice before a REO timeout occurs.
  • REO reorder
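The REO-timeout behavior above can be modeled as follows. This is an illustrative sketch only; the class name, the per-slice sequence numbering, and the 8 ms timeout are assumptions, not values from the disclosure.

```python
import time

class ReorderQueue:
    """Illustrative reorder (REO) queue: deliver a complete slice in order,
    or flush a partial slice once the REO timeout expires."""

    def __init__(self, expected_count, timeout_s=0.008, clock=time.monotonic):
        self.expected_count = expected_count  # MSDUs expected for the slice
        self.timeout_s = timeout_s            # assumed REO timeout
        self.clock = clock                    # injectable for testing
        self.started_at = None
        self.msdus = {}

    def receive(self, seq, msdu):
        if self.started_at is None:
            self.started_at = self.clock()
        self.msdus[seq] = msdu

    def poll(self):
        """Return ('deliver', msdus), ('flush', None), or ('wait', None)."""
        if len(self.msdus) == self.expected_count:
            ordered = [self.msdus[s] for s in sorted(self.msdus)]
            self.msdus.clear()
            return ("deliver", ordered)
        if self.started_at is not None and self.clock() - self.started_at > self.timeout_s:
            # Remainder never arrived before the REO timeout: flush the slice.
            self.msdus.clear()
            return ("flush", None)
        return ("wait", None)
```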
  • the interface is further configured to: obtain, at a reorder (REO) queue of the device, one or more MSDUs from the second device; and obtain an indication that a transmit queue associated with the one or more MSDUs is flushed by the second device; and the processing system is configured to flush the REO queue after obtaining the indication.
  • REO reorder
  • the second device includes a software enabled access point (SAP); and the device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a method performed by a device including: attempting to provide a plurality of PPDUs associated with one or more video frames of an extended reality (XR) experience to a second device; and measuring one or more of: a PPDU transmission latency associated with attempting to provide the plurality of PPDUs; or a PPDU transmission drop associated with attempting to provide the plurality of PPDUs, where one or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • XR extended reality
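For illustration only, the two measurements above (per-PPDU transmission latency and transmission drops) can be accumulated in a small tracker like the following; all names are hypothetical.

```python
class PpduTxStats:
    """Illustrative accumulator for PPDU transmission latency and drops."""

    def __init__(self):
        self.latencies = []
        self.drops = 0
        self.attempts = 0

    def record_success(self, enqueue_time, ack_time):
        # Latency of one PPDU: time from enqueue to acknowledgement.
        self.attempts += 1
        self.latencies.append(ack_time - enqueue_time)

    def record_drop(self):
        self.attempts += 1
        self.drops += 1

    def summary(self):
        avg = sum(self.latencies) / len(self.latencies) if self.latencies else None
        drop_rate = self.drops / self.attempts if self.attempts else 0.0
        return {"avg_latency": avg, "drop_rate": drop_rate}
```

A summary of this form is the kind of input from which XR parameters could later be adjusted.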
  • adjusting the one or more parameters of the XR experience includes adjusting one or more wireless communication parameters between the device and the second device, where adjusting the one or more wireless communication parameters includes one or more of: adjusting a duty cycle of target wake time (TWT) windows of a TWT power save mode associated with the device and the second device; enabling or disabling the TWT power save mode; changing a wireless operating channel over which the device and the second device communicate; adjusting the wireless operating channel size; adjusting a modulation and coding scheme (MCS); enabling or disabling forward error correction (FEC) for providing at least a portion of the one or more PPDUs to the second device; or adjusting the FEC for providing at least the portion of the one or more PPDUs to the second device.
  • TWT target wake time
  • MCS modulation and coding scheme
  • FEC forward error correction
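A minimal control loop illustrating how such measurements might drive the wireless-parameter adjustments listed above; the thresholds, parameter names, and chosen actions are assumptions for illustration, not values from the disclosure.

```python
def adjust_wireless_parameters(avg_latency_s, drop_rate, params):
    """Return an adjusted copy of the wireless parameters (illustrative policy)."""
    params = dict(params)  # leave the caller's parameters untouched
    if drop_rate > 0.05:
        # High loss: enable FEC and fall back to a more robust MCS.
        params["fec_enabled"] = True
        params["mcs"] = max(0, params["mcs"] - 1)
    if avg_latency_s is not None and avg_latency_s > 0.010:
        # High latency: give the XR link a larger share of TWT airtime.
        params["twt_duty_cycle"] = min(1.0, params["twt_duty_cycle"] + 0.1)
    return params
```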
  • adjusting the one or more parameters of the XR experience includes adjusting one or more video parameters, where the video parameters include one or more of: a video frame rate; a video resolution; a target encode data rate; or a video codec.
  • the device includes a software enabled access point (SAP); and the second device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a device configured for an extended reality (XR) experience including: an interface configured to attempt to provide a plurality of PPDUs associated with one or more video frames of the XR experience to a second device; and a processing system configured to measure one or more of: a PPDU transmission latency associated with attempting to provide the plurality of PPDUs; or a PPDU transmission drop associated with attempting to provide the plurality of PPDUs, where one or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • XR extended reality
  • the interface is further configured to obtain one or more pose data frames from the second device, where each of the one or more video frames is associated with a pose data frame of the one or more pose data frames; and the processing system is further configured to measure a pose data frame delivery latency associated with obtaining the one or more pose data frames.
  • the one or more measurements include a link quality measurement of a link quality between the device and the second device; and the A-Control field includes an indication of the link quality.
  • adjusting the one or more parameters of the XR experience includes adjusting one or more wireless communication parameters between the device and the second device, where adjusting the one or more wireless communication parameters includes one or more of: adjusting a duty cycle of target wake time (TWT) windows of a TWT power save mode associated with the device and the second device; enabling or disabling the TWT power save mode; changing a wireless operating channel over which the device and the second device communicate; adjusting the wireless operating channel size; adjusting a modulation and coding scheme (MCS); enabling or disabling forward error correction (FEC) for providing at least a portion of the one or more PPDUs to the second device; or adjusting the FEC for providing at least the portion of the one or more PPDUs to the second device.
  • TWT target wake time
  • MCS modulation and coding scheme
  • FEC forward error correction
  • adjusting the one or more parameters of the XR experience includes adjusting one or more video parameters, where the video parameters include one or more of: a video frame rate; a video resolution; a target encode data rate; or a video codec.
  • the device includes a software enabled access point (SAP); and the second device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a method performed by a device including: attempting to provide a plurality of pose data frames associated with one or more video frames of an extended reality (XR) experience to a second device; and measuring one or more of: a pose data frame transmission latency associated with attempting to provide the plurality of pose data frames; or a pose data frame transmission drop associated with attempting to provide the plurality of pose data frames, where one or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • XR extended reality
  • adjusting the one or more parameters of the XR experience includes adjusting one or more wireless communication parameters between the device and the second device, where adjusting the one or more wireless communication parameters includes one or more of: adjusting a duty cycle of target wake time (TWT) windows of a TWT power save mode associated with the device and the second device; enabling or disabling the TWT power save mode; changing a wireless operating channel over which the device and the second device communicate; adjusting the wireless operating channel size; adjusting a modulation and coding scheme (MCS); enabling or disabling forward error correction (FEC) for providing at least a portion of the one or more PPDUs to the second device; or adjusting the FEC for providing at least the portion of the one or more PPDUs to the second device.
  • TWT target wake time
  • MCS modulation and coding scheme
  • FEC forward error correction
  • the second device includes a software enabled access point (SAP); and the device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a device configured for an extended reality (XR) experience including: an interface configured to attempt to provide a plurality of pose data frames associated with one or more video frames of the XR experience to a second device; and a processing system configured to measure one or more of: a pose data frame transmission latency associated with attempting to provide the plurality of pose data frames; or a pose data frame transmission drop associated with attempting to provide the plurality of pose data frames, where one or more parameters of the XR experience are adjusted and associated with one or more of the measurements.
  • XR extended reality
  • the interface is further configured to obtain, at a reorder (REO) queue of the device, one or more media access control layer (MAC) service data units (MSDUs) from the second device, where each MSDU is associated with a video slice of the one or more video frames; and the processing system is further configured to: flush the REO queue one or more times to remove one or more MSDUs from the REO queue; and measure a REO flush time associated with flushing the REO queue.
  • REO reorder
  • MSDUs media access control layer service data units
  • the interface is further configured to, for one or more video frames: obtain a first MSDU associated with a video frame; and obtain a last MSDU associated with the video frame; and the processing system is further configured to measure a video frame delivery latency associated with obtaining the first MSDU and the last MSDU for one or more video frames.
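The frame-delivery latency measurement above (the time between obtaining the first MSDU of a video frame and its last) can be sketched as follows; the class and field names are illustrative.

```python
class FrameDeliveryTimer:
    """Illustrative tracker: per-frame latency from first to last MSDU."""

    def __init__(self):
        self.first_seen = {}
        self.latency = {}

    def on_msdu(self, frame_id, timestamp):
        if frame_id not in self.first_seen:
            self.first_seen[frame_id] = timestamp
        # Every MSDU updates the candidate "last MSDU" time for its frame,
        # so latency[frame_id] always reflects first-to-latest arrival.
        self.latency[frame_id] = timestamp - self.first_seen[frame_id]
```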
  • the one or more measurements include a link quality measurement of a link quality between the device and the second device; and the A-Control field includes an indication of the link quality.
  • adjusting the one or more parameters of the XR experience includes adjusting one or more wireless communication parameters between the device and the second device, where adjusting the one or more wireless communication parameters includes one or more of: adjusting a duty cycle of target wake time (TWT) windows of a TWT power save mode associated with the device and the second device; enabling or disabling the TWT power save mode; changing a wireless operating channel over which the device and the second device communicate; adjusting the wireless operating channel size; adjusting a modulation and coding scheme (MCS); enabling or disabling forward error correction (FEC) for providing at least a portion of the one or more PPDUs to the second device; or adjusting the FEC for providing at least the portion of the one or more PPDUs to the second device.
  • TWT target wake time
  • MCS modulation and coding scheme
  • FEC forward error correction
  • the second device includes a software enabled access point (SAP); and the device includes a head mounted display (HMD).
  • SAP software enabled access point
  • HMD head mounted display
  • a method performed by a wireless communication device including: communicating with a first device over a first wireless link; and communicating with a second device over a second wireless link, where: the wireless communication device communicates concurrently with the first device and the second device using one of multi-link operation (MLO) techniques or a target wake time (TWT) mode; and the wireless communication device is configured to give preference to communications on the second wireless link versus communications on the first wireless link.
  • MLO multi-link operation
  • TWT target wake time
  • concurrently communicating with the first device and the second device includes: for when the wireless communication device is to obtain one or more packets from the second device during reception of one or more packets from the first device: obtaining the one or more packets from the second device during reception of the one or more packets from the first device; providing a block acknowledgement (BA) to the second device after obtaining the one or more packets from the second device; and preventing the wireless communication device from providing a BA to the first device to acknowledge the reception of the one or more packets from the first device.
  • BA block acknowledgement
  • synchronizing the transmission to the second device with the transmission to the first device includes: synchronizing a start of the transmission to the first device with a start of the transmission to the second device; and padding the transmission to the second device to cause an end of the transmission to the first device to be synchronized with an end of the transmission to the second device, where a backoff associated with the second wireless link is reduced when the transmission to the first device and the transmission to the second device are synchronized.
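The end-alignment step above reduces to computing how much padding the shorter of the two transmissions needs so that both ends coincide. An illustrative sketch with hypothetical names; durations are in microseconds.

```python
def pad_for_synchronized_end(duration_link1_us, duration_link2_us):
    """Return (pad1_us, pad2_us) so that both transmissions, started
    together, also end together."""
    end = max(duration_link1_us, duration_link2_us)
    return end - duration_link1_us, end - duration_link2_us
```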
  • the method of one or more of clauses 217, 224, or 225 further including: measuring a drift between the first plurality of TWT windows and the second plurality of TWT windows, where: the first TWT session is associated with an application layer clock associated with the wireless communication device; and the second TWT session is associated with a wireless communication device (WCD) clock at a media access control layer (MAC); and adjusting a timing of one or more of the first plurality of TWT windows or the second plurality of TWT windows to reduce the drift.
  • WCD wireless communication device
  • MAC media access control layer
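The drift measurement and correction above can be sketched as follows: corresponding TWT window start times from the application-layer clock and the WCD clock are compared against their nominal offset, and one schedule is shifted to cancel the average deviation. Window starts are in microseconds and all names are illustrative assumptions.

```python
def measure_drift_us(app_clock_window_starts, wcd_clock_window_starts, nominal_offset_us):
    """Average deviation of the inter-schedule offset from its nominal value."""
    deviations = [
        (w - a) - nominal_offset_us
        for a, w in zip(app_clock_window_starts, wcd_clock_window_starts)
    ]
    return sum(deviations) / len(deviations)

def adjust_schedule(window_starts, shift_us):
    # Shift every upcoming TWT window by the same amount (e.g. by the
    # negated measured drift) to bring the two schedules back in step.
    return [start + shift_us for start in window_starts]
```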
  • a wireless communication device including: a processing system; and an interface configured to: communicate with a first device over a first wireless link; and communicate with a second device over a second wireless link, where: the wireless communication device communicates concurrently with the first device and the second device using one of multi-link operation (MLO) techniques or a target wake time (TWT) mode; and the wireless communication device is configured to give preference to communications on the second wireless link versus communications on the first wireless link.
  • MLO multi-link operation
  • TWT target wake time
  • the wireless communication device of clause 231, where concurrently communicating with the first device and the second device includes: for when the wireless communication device is to transmit to the second device during reception of one or more packets from the first device: transmitting to the second device before the completion of reception of the one or more packets from the first device; obtaining a block acknowledgement (BA) from the second device after transmitting to the second device; and preventing the wireless communication device from providing a BA to the first device to acknowledge the reception of the one or more packets.
  • BA block acknowledgement
  • the wireless communication device of clause 231, where concurrently communicating with the first device and the second device includes: for when the wireless communication device is to obtain one or more packets from the second device during reception of one or more packets from the first device: obtaining the one or more packets from the second device during reception of the one or more packets from the first device; providing a block acknowledgement (BA) to the second device after obtaining the one or more packets from the second device; and preventing the wireless communication device from providing a BA to the first device to acknowledge the reception of the one or more packets from the first device.
  • BA block acknowledgement
  • the wireless communication device of clause 231, where concurrently communicating with the first device and the second device includes: for when the wireless communication device is to transmit to the second device during transmission to the first device, performing one or more of: synchronizing the transmission to the second device with the transmission to the first device; preventing transmission to the first device outside of one or more time division multiplexing (TDM) windows, where: transmission to the first device is during the one or more TDM windows; and transmission to the second device is outside of the one or more TDM windows; reducing one or more of a maximum physical layer (PHY) protocol data unit (PPDU) duration or a burst duration; flushing packets from a transmit buffer associated with the transmission to the first device; or broadcasting a clear to send (CTS) to self (CTS-to-Self) frame to pad a time period for reserving a wireless medium, where: the wireless medium includes the first wireless link and the second wireless link; and the time period is of sufficient length for the transmission to the first device and the transmission to the second device.
  • TDM time division multiplexing

  • synchronizing the transmission to the second device with the transmission to the first device includes: synchronizing a start of the transmission to the first device with a start of the transmission to the second device; and padding the transmission to the second device to cause an end of the transmission to the first device to be synchronized with an end of the transmission to the second device, where a backoff associated with the second wireless link is reduced when the transmission to the first device and the transmission to the second device are synchronized.
  • the wireless communication device of one or more of clauses 231 or 234, where, when preventing transmission to the first device during a time division multiplexing window associated with transmission to the second device, the first device is prevented from supporting a target wake time (TWT) mode.
  • TWT target wake time
  • the wireless communication device of clause 231, where concurrently communicating with the first device and the second device includes: for when the wireless communication device is to obtain one or more packets from the second device during transmission to the first device, performing one or more of: preventing obtaining the one or more packets from the second device during one or more time division multiplexing (TDM) windows, where: transmission to the first device is during the one or more TDM windows; and reception from the second device is outside of the one or more TDM windows; reducing one or more of a maximum physical layer (PHY) protocol data unit (PPDU) duration or a burst duration; or before transmitting to the first device, providing a frame to the second device indicating a network allocation vector (NAV) value to reserve a wireless medium for the duration of transmitting to the first device, where the wireless medium includes the first wireless link and the second wireless link.
  • TDM time division multiplexing
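The NAV-based reservation above amounts to advertising, before transmitting to the first device, a duration long enough to cover the whole exchange so that the second device defers. An illustrative sketch only: the breakdown into PPDU, gap, and acknowledgement durations and the 16 µs SIFS value are assumptions, not details from the disclosure.

```python
def nav_value_us(ppdu_duration_us, ack_duration_us, sifs_us=16):
    """Illustrative NAV duration: reserve the medium for the PPDU,
    an interframe gap, and the acknowledgement."""
    return ppdu_duration_us + sifs_us + ack_duration_us
```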
  • the wireless communication device of one or more of clauses 231 or 238, where concurrently communicating with the first device and the second device includes: configuring a first target wake time (TWT) session for communications on the second wireless link, where: the first TWT session is associated with a first plurality of TWT windows on the second wireless link; the first plurality of TWT windows is associated with XR activity at the wireless communication device; and the XR activity includes obtaining one or more pose data frames from the second device, rendering one or more video frames, and providing the one or more video frames via one or more physical layer (PHY) protocol data units (PPDUs) to the second device; configuring a second TWT session for communications on the first wireless link, where: the second TWT session is associated with a second plurality of TWT windows interspersed between the first plurality of TWT windows; the second plurality of TWT windows is associated with wireless activity between the first device and the wireless communication device; and the first plurality of TWT windows does not overlap with the second plurality of TWT windows.
  • the wireless communication device of one or more of clauses 231, 238, or 239 where the processing system is configured to: measure a drift between the first plurality of TWT windows and the second plurality of TWT windows, where: the first TWT session is associated with an application layer clock associated with the wireless communication device; and the second TWT session is associated with a wireless communication device (WCD) clock at a media access control layer (MAC); and adjust a timing of one or more of the first plurality of TWT windows or the second plurality of TWT windows to reduce the drift.
  • WCD wireless communication device
  • MAC media access control layer
  • drift is associated with a difference between a fidelity of the application layer clock and a fidelity of the WCD clock.
  • drift is associated with a difference between a resonance frequency of the application layer clock and a resonance frequency of the WCD clock.
  • AP access point
  • STA station
  • HMD head mounted display
  • a phrase referring to “at least one of” or “one or more of” a list of items refers to any combination of those items, including single members.
  • “at least one of: a, b, or c” is intended to cover the possibilities of: a only, b only, c only, a combination of a and b, a combination of a and c, a combination of b and c, and a combination of a and b and c.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

This disclosure provides systems, methods, and apparatus for synchronous channel access control of a wireless system. In some aspects, a device may use a TWT session to communicate with a second device during one or more TWT service periods. Uplink and downlink communications may be coordinated to both fall within a TWT service period so that a device can enter a low-power mode outside the TWT service period. The TWT session, including the service periods, may be configured and managed by the device or the second device to ensure that communications associated with an XR experience between the devices (such as pose data frames or tracking frames provided as uplink data and video data frames provided as downlink data) meet latency or other requirements of the XR experience. Using TWT service periods allows other devices to use the wireless medium outside the TWT service periods.
EP22705249.5A 2021-03-01 2022-01-31 Synchronous channel access control of a wireless system Pending EP4302557A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US17/188,165 US11696345B2 (en) 2021-03-01 2021-03-01 Asynchronous channel access control of a wireless system
US17/188,275 US11800572B2 (en) 2021-03-01 2021-03-01 Synchronous channel access control of a wireless system
PCT/US2022/014565 WO2022186930A1 (fr) 2021-03-01 2022-01-31 Synchronous channel access control of a wireless system

Publications (1)

Publication Number Publication Date
EP4302557A1 true EP4302557A1 (fr) 2024-01-10

Family

ID=80786544

Family Applications (2)

Application Number Title Priority Date Filing Date
EP22705245.3A Pending EP4302556A1 (fr) Asynchronous channel access control of a wireless system
EP22705249.5A Pending EP4302557A1 (fr) Synchronous channel access control of a wireless system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP22705245.3A Pending EP4302556A1 (fr) Asynchronous channel access control of a wireless system

Country Status (5)

Country Link
EP (2) EP4302556A1 (fr)
JP (2) JP2024509378A (fr)
KR (2) KR20230152683A (fr)
TW (2) TW202236896A (fr)
WO (2) WO2022186929A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11792616B2 (en) * 2019-09-27 2023-10-17 Intel Corporation Distributed network allocation vector management for enabling vehicle-to-everything radio access technology coexistence
WO2024076816A1 (fr) * 2022-10-04 2024-04-11 Qualcomm Incorporated Configuration switch triggers for audio device communications

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020033395A1 (fr) * 2018-08-06 2020-02-13 Babaei Alireza Cell and bandwidth part operations in unlicensed bands
US20210243749A1 (en) * 2018-08-07 2021-08-05 Idac Holdings, Inc. Nr v2x - methods for data transmission in wireless systems

Also Published As

Publication number Publication date
WO2022186929A1 (fr) 2022-09-09
WO2022186930A1 (fr) 2022-09-09
KR20230152683A (ko) 2023-11-03
TW202236896A (zh) 2022-09-16
KR20230152681A (ko) 2023-11-03
JP2024508429A (ja) 2024-02-27
JP2024509378A (ja) 2024-03-01
TW202236889A (zh) 2022-09-16
EP4302556A1 (fr) 2024-01-10

Similar Documents

Publication Publication Date Title
US11696345B2 (en) Asynchronous channel access control of a wireless system
US11800572B2 (en) Synchronous channel access control of a wireless system
TWI770060B (zh) Flexible radio service 5G NR data transmission
US10574402B2 (en) Station (STA), access point (AP) and method for aggregation of data packets for uplink transmission
US20220078844A1 (en) Scheduling wireless stations within a target wake time service period
US10057747B2 (en) 5G MB connectivity acknowledgement aggregation
KR101764955B1 (ko) Method and apparatus for performing channel aggregation and medium access control retransmission
KR101653346B1 (ko) Method and apparatus for power saving in a wireless local area network
TWI384793B (zh) High speed media access control with legacy system interoperability
CN115989698A (zh) Low latency enhancements for wireless networks
KR20190040227A (ko) Wireless communication method using enhanced distributed channel access, and wireless communication terminal using same
TWI549555B (zh) Access terminal, method, apparatus, and computer program product for low-latency 802.11 medium access
US20230379999A1 (en) Wireless communication method using multi-link, and wireless communication terminal using same
WO2022186929A1 (fr) Asynchronous channel access control of a wireless system
TW202318829A (zh) Low latency schemes for peer-to-peer (P2P) communications
KR20230150960A (ko) Wireless network configuration for low latency applications
US20220416964A1 (en) A-mpdu preemption for time-critical ultra-low latency (ull) communications
US20230337217A1 (en) Network time sectoring
CN118104378A (zh) Low latency schemes for peer-to-peer (P2P) communications
MXPA06004230A (en) High speed media access control and direct link protocol

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230621

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR