WO2023096639A1 - Wearable apparatuses with dual physical layer interfaces - Google Patents

Wearable apparatuses with dual physical layer interfaces Download PDF

Info

Publication number
WO2023096639A1
Authority
WO
WIPO (PCT)
Prior art keywords
physical layer
data
layer interface
host device
data stream
Prior art date
Application number
PCT/US2021/060673
Other languages
French (fr)
Inventor
Chih-Hsin Lee
Cheng-Chih Chen
Haohsuan TING
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2021/060673 priority Critical patent/WO2023096639A1/en
Publication of WO2023096639A1 publication Critical patent/WO2023096639A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1698Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g 3D video
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality

Definitions

  • VR devices are used for different applications.
  • a user wears and operates a VR headset (e.g., a head-mounted display (HMD) device) to view content, play a video game, conduct a virtual meeting, or perform other online-related activities.
  • the HMD device may connect to a host device capable of executing (e.g., running) programs to generate a reality in the case of a VR, or to enhance the reality in the case of an AR.
  • the HMD device may use an input (e.g., audio data, video data, haptic data, and the like) from a computer program (e.g., a VR program and/or an AR program running in the host device) to allow a wearer to experience the VR and/or the AR. Further, the VR devices may transmit user input data and/or sensor data to the host device to control input interfaces for objects in virtual environments and augmented reality environments.
  • FIG. 1B is a block diagram of the example wearable apparatus of FIG. 1A, depicting additional features
  • FIG. 2A is a block diagram of the example wearable apparatus of FIG. 1A, depicting a duplexer to combine a first physical layer interface and a second physical layer interface of the wireless device;
  • FIG. 5B is a block diagram of a computing environment, depicting data communication between the host device and the HMD device of FIG. 5A via the first channel and the second channel in different frequency bands.
  • VR Virtual reality
  • AR augmented reality
  • MR mixed reality
  • a VR scenario may involve presentation of digital or virtual image information without transparency to other actual real-world visual input.
  • An AR scenario may involve presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user.
  • a MR scenario may be related to merging real and virtual worlds to produce new environments where physical and virtual objects co-exist and interact in real-time.
  • the HMD device may be communicatively connected to a host device (e.g., a personal computer, a cloud-based computing device, or the like), which may be capable of executing programs to generate a reality in the case of the VR, to enhance the reality in the case of the AR, or the like.
  • the HMD device may receive a first data stream from the host device, which may be displayed on a display panel of the HMD device.
  • the HMD device may transmit a second data stream to the host device.
  • the second data stream is used to control the first data stream that is transmitted from the host device.
  • data transmission may be classified into two categories.
  • One is the first data stream, such as the video, audio, haptic data, and the like, from the host device to the HMD device.
  • The other is the second data stream, such as gesture data, position data, control data, and the like, from the HMD device to the host device.
  • the HMD device may utilize a single wireless channel corresponding to a duplex data communication (e.g., time division duplexing (TDD)) for receiving the first data stream from the host device and transmitting the second data stream to the host device.
  • In HMD devices that utilize a single wireless channel for data communication, data transmission to the host device and data reception from the host device cannot occur at the same time, because the wireless connection’s hardware time-divides transmission and reception.
  • the host device may have to wait for the HMD device to finish the second data stream transmission before the host device transmits the first data stream to the HMD device, and vice versa. Therefore, the HMD device that utilizes a single wireless channel for data communication may cause a latency in the data transmission/reception and may bring a non-real-time experience and motion sickness sensation to the user.
  • the latency for the HMD device to receive data from the host device can increase, particularly when significantly large amounts of data have to be transmitted from the host device to the HMD device for enhanced video quality with a high refresh rate.
  • the increased latency may cause a lag in the HMD device’s video/audio and in the body interaction system (e.g., body-mounted sensors) controlled by the HMD device.
  • the user experience may be affected by the delayed visual/hearing/haptic responses.
  • the HMD device may receive delayed data from the host device, which may affect the user experience.
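To make the latency argument above concrete, a back-of-the-envelope sketch in Python may help; all figures (link rate, frame sizes) are illustrative assumptions, not values from the application:

```python
# Back-of-the-envelope latency model contrasting a shared TDD channel
# with two dedicated channels. All numbers are hypothetical.
LINK_RATE_BPS = 1.2e9          # assumed wireless link rate, bits/s

def tdd_round_trip(downlink_bits, uplink_bits, rate=LINK_RATE_BPS):
    """Single TDD channel: one direction must finish before the other
    starts, so the two transfer times add."""
    return downlink_bits / rate + uplink_bits / rate

def dual_channel_round_trip(downlink_bits, uplink_bits, rate=LINK_RATE_BPS):
    """Dual physical layer interfaces: the streams run concurrently, so
    the latency is set by the slower direction only."""
    return max(downlink_bits / rate, uplink_bits / rate)

video_frame = 8e6   # ~1 MB compressed video frame (assumption)
sensor_pkt = 8e4    # ~10 kB of sensor/controller data (assumption)

tdd = tdd_round_trip(video_frame, sensor_pkt)
dual = dual_channel_round_trip(video_frame, sensor_pkt)
assert dual < tdd   # the full-duplex path never waits for the other direction
```

The gap widens as the downlink grows (higher resolution, higher refresh rate), which is the scenario described above.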
  • Examples described herein provide a wearable apparatus including an input device and an HMD device.
  • the input device may receive sensor data (e.g., position data, environment data, or the like), user input data (e.g., gesture input, control input, or the like), or both.
  • the HMD device may include an antenna and a wireless device connected to the antenna.
  • the wireless device may include a first physical layer interface to communicate via a first communication frequency in accordance with a wireless communication protocol (e.g., a wireless communication protocol that uses a time division duplexing (TDD) communication) and a second physical layer interface to communicate via a second communication frequency in accordance with the wireless communication protocol.
  • the wearable apparatus may include a processor.
  • the processor may generate a first data stream based on the sensor data, user input data, or both and transmit the first data stream to a host device via the first physical layer interface. Further, the processor may receive a second data stream from the host device via the second physical layer interface. The second data stream may be generated at the host device according to a data source and the first data stream.
  • examples described herein provide a wireless device with dual physical layer interfaces, where one physical layer interface may be utilized to transmit data from the HMD device to the host device via a first channel, and the other physical layer interface may be used for data transmission from the host device to the HMD device via a second channel.
  • examples described herein provide full duplex data transmission between the HMD device and the host device to reduce the latency (or approach "zero-wait”) and allow the host device to transmit the video data (e.g., with maximum video quality and minimum video compression) without sharing data bandwidth in the same channel with data received from the HMD device.
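The two independent channels above can be sketched as two queues serviced by separate threads, so neither direction waits for the other; the names and message counts below are illustrative, not from the application:

```python
# Minimal concurrency sketch of the dual-interface idea: the downlink
# (host -> HMD) and uplink (HMD -> host) run on independent "channels"
# (queues with their own threads) instead of taking turns on one link.
import queue
import threading

downlink = queue.Queue()   # stands in for the first channel (host -> HMD)
uplink = queue.Queue()     # stands in for the second channel (HMD -> host)

def host(frames=5):
    # The host streams video frames without waiting for the uplink.
    for i in range(frames):
        downlink.put(f"video-frame-{i}")
    downlink.put(None)                       # end-of-stream marker

def hmd(packets=5):
    # The HMD transmits sensor packets without waiting for the downlink.
    for i in range(packets):
        uplink.put(f"sensor-packet-{i}")
    uplink.put(None)

t1 = threading.Thread(target=host)
t2 = threading.Thread(target=hmd)
t1.start(); t2.start(); t1.join(); t2.join()

received = list(iter(downlink.get, None))    # drain until the marker
sent = list(iter(uplink.get, None))
assert len(received) == 5 and len(sent) == 5
```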
  • FIG. 1A is a block diagram of an example wearable apparatus 100, including a wireless device 104 having a dual physical layer interface to transmit/receive data stream to/from a host device 112.
  • Wearable apparatus 100 may present two-dimensional (2D) or three-dimensional (3D) virtual images (i.e., virtual content) to a user.
  • the images may be still images, frames of a video, or a video, or any combination thereof.
  • wearable apparatus 100 can present the virtual content in a Virtual Reality (VR) environment, an Augmented Reality (AR) environment, or a Mixed Reality (MR) environment for user interaction.
  • wearable apparatus 100 includes a head-mounted display (HMD) device 102 to present the virtual content.
  • HMD device 102 is a VR headset.
  • HMD device 102 may connect to host device 112 (e.g., a computing device) capable of executing (e.g., running) programs to generate a reality in the case of a VR, or to enhance the reality in the case of an AR.
  • Host device 112 may execute the program, receive images of a physical environment from HMD device 102, analyze the images, and send data to HMD device 102.
  • HMD device 102 may use an input from a computer program (e.g., a VR program and/or an AR program running in host device 112) to allow a wearer to experience the VR and/or the AR.
  • HMD device 102 may be connected to host device 112 executing the VR and/or AR program via a wireless connection.
  • HMD device 102 may refer to a device worn on a head which may display visual data to the user.
  • a user operates the VR headset to play a role in the video game.
  • the VR headsets may display a high-resolution graphical user interface, play 3D audio, and provide a voice service for communication with other users.
  • the VR headset can be used for other applications such as to view content, conduct a virtual meeting, or perform other online-related activities.
  • wearable apparatus 100 includes wireless device 104.
  • wireless device 104 includes a first physical layer interface 106 to communicate via a first communication frequency in accordance with a wireless communication protocol
  • the wireless communication protocol may be associated with any wireless connection that uses a time division duplexing (TDD) communication.
  • TDD communication may refer to duplex communication links where uplink is separated from downlink by an allocation of different time slots in a same frequency band.
  • the wireless communication protocol is a wireless fidelity (Wi-Fi) protocol, a Bluetooth protocol, 3G/4G/5G wireless standards, or the like.
  • the wireless communication protocol can be other short-range or long-range communication protocols such as a ZigBee® protocol, a Z-Wave® protocol, an IEEE 802.15.4 protocol, a Long-Term Evolution Direct (LTE-D) protocol, or the like.
  • wireless device 104 includes a second physical layer interface 108 to communicate via a second communication frequency in accordance with the wireless communication protocol.
  • the second communication frequency may be different from the first communication frequency.
  • wearable apparatus 100 includes a processor 110.
  • processor may refer to, for example, a central processing unit (CPU), a semiconductor-based microprocessor, a digital signal processor (DSP) such as a digital image processing unit, or other hardware devices or processing elements suitable to retrieve and execute instructions stored in a storage medium, or suitable combinations thereof.
  • a processor may, for example, include single or multiple cores on a chip, multiple cores across multiple chips, multiple cores across multiple devices, or suitable combinations thereof.
  • a processor may be functional to fetch, decode, and execute instructions as described herein.
  • processor 110 may receive a first data stream from host device 112 via first physical layer interface 106 (e.g., using the first communication frequency) to render the first data stream on HMD device 102.
  • the first data stream includes audio data, video data, text data, haptic data, or any combination thereof.
  • processor 110 may transmit a second data stream to host device 112 via second physical layer interface 108 (e.g., using the second communication frequency).
  • the second data stream is used to control the first data stream that is transmitted from host device 112.
  • FIG. 1B is a block diagram of example wearable apparatus 100 of FIG. 1A, depicting additional features.
  • HMD device 102 includes a display panel 152 and an audio device 154.
  • Display panel 152 may output image data, text data, video data, or any combination thereof of the first data stream.
  • audio device 154 may output audio data of the first data stream, capture voice input data, or a combination thereof.
  • processor 110 may provide the first data stream from host device 112 to display panel 152 and audio device 154 via first physical layer interface 106 of wireless device 104. Further, processor 110 may provide a voice input from audio device 154 to host device 112 via second physical layer interface 108 of wireless device 104.
  • host device 112 includes a wireless device 162.
  • wireless device 162 includes a dual physical layer interface, i.e., a third physical layer interface 164 to communicate with first physical layer interface 106 of wearable apparatus 100 and a fourth physical layer interface 166 to communicate with second physical layer interface 108 of wearable apparatus 100. During operation, host device 112 may transmit the first data stream to wearable apparatus 100 via third physical layer interface 164 of wireless device 162. Further, host device 112 (e.g., a processor of host device 112) may receive the second data stream via fourth physical layer interface 166 of wireless device 162. Furthermore, host device 112 may process the second data stream to modify the first data stream based on the second data stream and the data source.
  • wearable apparatus 100 includes a position sensing device 156, a user input device 158, an environment sensing device 160, or any combination thereof.
  • position sensing device 156 obtains position data associated with HMD device 102.
  • the position data includes movement information, orientation information, tilt angle information, location information, or any combination thereof associated with HMD device 102.
  • user input device 158 may obtain user input data.
  • the user input data includes a button input, a voice input, a gesture input, or any combination thereof.
  • environment sensing device 160 obtains environment data of a vicinity of HMD device 102.
  • the environment data includes image data, video data, depth information, or any combination thereof associated with a physical environment in the vicinity of HMD device 102.
  • processor 110 may generate the second data stream based on the position data, the user input data, the environment data, or any combination thereof.
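The second data stream described above is built from whichever of the three input sources are available; a hypothetical sketch (the field names are assumptions for illustration, not from the application):

```python
# Hypothetical assembly of the second (HMD -> host) data stream from the
# position, user input, and environment sources named above.
def build_second_data_stream(position=None, user_input=None, environment=None):
    stream = {}
    if position is not None:
        stream["position"] = position        # movement/orientation/tilt/location
    if user_input is not None:
        stream["user_input"] = user_input    # button/voice/gesture input
    if environment is not None:
        stream["environment"] = environment  # image/video/depth of the vicinity
    return stream

pkt = build_second_data_stream(
    position={"tilt_deg": 12.5, "orientation": "NE"},
    user_input={"button": "trigger"},
)
assert set(pkt) == {"position", "user_input"}  # only supplied sources included
```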
  • FIG. 2A is a block diagram of example wearable apparatus 100 of FIG. 1A, depicting a duplexer 206 to combine first physical layer interface 106 and second physical layer interface 108 of wireless device 104.
  • wireless device 104 includes a first media access controller (MAC) 202 and a second MAC 204.
  • first MAC 202 is communicatively connected to first physical layer interface 106 to receive the first data stream from host device 112 via first physical layer interface 106.
  • second MAC 204 is communicatively connected to second physical layer interface 108 to transmit the second data stream to host device 112 via second physical layer interface 108.
  • First MAC 202 may have a first MAC address to wirelessly communicate with host device 112 via first physical layer interface 106. Further, second MAC 204 may have a second MAC address to wirelessly communicate with host device 112 via second physical layer interface 108. Further, physical layer interfaces 106 or 108 may send and receive data packets of an application wirelessly. For example, first physical layer interface 106 is connected to or provided with an antenna 208 to facilitate communication with host device 112 via a first channel. Further, second physical layer interface 108 may be connected to or provided with antenna 208 to facilitate communication with host device 112 via a second channel.
  • first MAC 202 operates between an upper network layer (e.g., a logical link control layer) and first physical layer interface 106.
  • second MAC 204 may operate between the upper network layer and second physical layer interface 108.
  • a MAC (e.g., MAC 202 or 204) may process network data received from the upper network layer and send it to host device 112 via a corresponding physical layer interface (e.g., physical layer interface 106 or 108).
  • the MAC may also process network data received from the corresponding physical layer interface and send it to the upper network layer for further processing.
  • physical layer interface 106 or 108 may transform data received from corresponding MAC 202 or 204 into signals for transmission and transmit the signals to host device 112. Furthermore, physical layer interface 106 or 108 may also receive signals from host device 112, convert the received signals into data, and provide the data to corresponding MAC 202 or 204. In some examples, physical layer interfaces 106 and 108 may be compliant with specifications, such as the 802.11 standards. Processor 110 may coordinate between first MAC 202 and second MAC 204 to perform the simultaneous operations (i.e., receiving the first data stream and transmitting the second data stream at the same time, as described with respect to FIG. 1A).
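The MAC/PHY layering just described can be sketched with two small classes; the class names, addresses, and byte-level "signal" representation below are illustrative assumptions, not from the application:

```python
# Layering sketch: each MAC sits between the upper network layer and its
# own physical layer interface, and each PHY converts data to "signals"
# (bytes here) and back over its own channel.
class Phy:
    def __init__(self, channel):
        self.channel = channel               # stand-in for one wireless channel

    def transmit(self, data):
        self.channel.append(data.encode())   # data -> signal

    def receive(self):
        return self.channel.pop(0).decode()  # signal -> data

class Mac:
    def __init__(self, address, phy):
        self.address = address               # each MAC has its own address
        self.phy = phy

    def send(self, payload):
        self.phy.transmit(f"{self.address}|{payload}")

    def recv(self):
        _addr, payload = self.phy.receive().split("|", 1)
        return payload

first_channel, second_channel = [], []         # two independent channels
first_mac = Mac("mac1", Phy(first_channel))    # downlink: host -> HMD
second_mac = Mac("mac2", Phy(second_channel))  # uplink: HMD -> host

# Downlink: a host-side MAC transmits on the first channel.
Mac("host", Phy(first_channel)).send("video-frame")
assert first_mac.recv() == "video-frame"

# Uplink: the second MAC transmits on its own channel, untouched above.
second_mac.send("sensor-data")
```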
  • wireless device 104 includes duplexer 206 to enable first physical layer interface 106 and second physical layer interface 108 to share a common antenna 208.
  • duplexer 206 is a three-port filtering device which allows transmitters and receivers operating at different communication frequencies to share the same antenna 208.
  • An advantage of using duplexer 206 is that duplexer 206 may transmit and receive data via a single antenna 208.
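The three-port behavior of the duplexer amounts to routing by frequency: the antenna port connects to whichever of the two frequency-selective ports passes the signal. A sketch with hypothetical passband edges (the cutoff values are assumptions, not from the application):

```python
# Frequency routing sketch of a three-port duplexer: one antenna port,
# two passband-filtered ports. Passband edges are example values only.
def duplexer_route(freq_mhz, port_a_band=(2400, 2500), port_b_band=(5150, 5850)):
    """Return which port of the three-port device passes the signal."""
    if port_a_band[0] <= freq_mhz <= port_a_band[1]:
        return "port_a"      # e.g., toward one physical layer interface
    if port_b_band[0] <= freq_mhz <= port_b_band[1]:
        return "port_b"      # e.g., toward the other physical layer interface
    return "rejected"        # outside both passbands

assert duplexer_route(2437) == "port_a"
assert duplexer_route(5500) == "port_b"
```

Because the two passbands do not intersect, both interfaces can use antenna 208 at the same time, which is the stated advantage.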
  • host device 112 may include an antenna 212 to transmit (e.g., the first data stream) and receive (e.g., the second data stream) data to/from wireless device 104, respectively, via respective wireless channels 210.
  • FIG. 2B is a block diagram of example wearable apparatus 100 of FIG. 1A, depicting a unified MAC 252.
  • wireless device 104 includes unified MAC 252 connected to first physical layer interface 106 and second physical layer interface 108.
  • unified MAC 252 receives multimedia content from host device 112 via first physical layer interface 106 and transmits sensor data to host device 112 via second physical layer interface 108.
  • unified MAC 252 is provided for controlling multiple physical layer interfaces (e.g., first physical layer interface 106 and second physical layer interface 108), such as for controlling concurrent data communication over first physical layer interface 106 and second physical layer interface 108.
  • the term “concurrent” may refer to two events occurring at a same time or a portion of the two events occurring at the same time (i.e., the two events occurring at an overlapping period of time).
  • multiple physical layer interfaces e.g., first physical layer interface 106 and second physical layer interface 108) may be configured to communicate with host device 112 over wireless channel 210.
  • input device 302 can include a trackpad, a touchscreen, a joystick, a multiple degree-of-freedom (DOF) controller, a game controller, a keyboard, a mouse, a directional pad (D-pad), a wand, a haptic device, a totem (e.g., functioning as a virtual input device), a wearable sensor, an image sensor (e.g., a camera), or the like.
  • a multi-DOF controller can sense user input in some or all possible translations (e.g., left/right, forward/backward, or up/down) or rotations (e.g., yaw, pitch, or roll) of the controller.
  • HMD device 304 includes an antenna 306, wireless device 308 connected to antenna 306, and a processor 314 connected to wireless device 308.
  • wireless device 308 includes first physical layer interface 310 and second physical layer interface 312.
  • First physical layer interface 310 may communicate via a first communication frequency in accordance with a wireless communication protocol.
  • Second physical layer interface 312 may communicate via a second communication frequency in accordance with the wireless communication protocol.
  • processor 314 may receive the second data stream from host device 316 via second physical layer interface 312.
  • the second data stream is generated at host device 316 according to a data source and the first data stream.
  • wireless device 308 includes a first media access controller (MAC) and a second MAC.
  • the first MAC may be communicatively connected to first physical layer interface 310 to transmit the first data stream to host device 316 via the first communication frequency.
  • the second MAC may be communicatively connected to second physical layer interface 312 to receive the second data stream from host device 316 via the second communication frequency.
  • FIG. 4 is a block diagram of an example wearable apparatus 400 including a non-transitory computer-readable storage medium 404, storing instructions to communicate with a host device via a dual physical layer interface.
  • wearable apparatus 400 is a head-mounted display (HMD) device.
  • Wearable apparatus 400 includes a processor 402 and computer-readable storage medium 404 communicatively coupled through a system bus.
  • Processor 402 may be any type of CPU, microprocessor, or processing logic that interprets and executes computer-readable instructions stored in computer-readable storage medium 404.
  • Computer-readable storage medium 404 may be a random-access memory (RAM) or another type of dynamic storage device that may store information and computer-readable instructions that may be executed by processor 402.
  • computer-readable storage medium 404 may be synchronous DRAM (SDRAM), double data rate (DDR), Rambus® DRAM (RDRAM), Rambus® RAM, and the like, or storage memory media such as a floppy disk, a hard disk, a CD-ROM, a DVD, a pen drive, and the like.
  • computer-readable storage medium 404 may be a non-transitory computer-readable medium, where the term “non-transitory” does not encompass transitory propagating signals.
  • computer-readable storage medium 404 may be remote but accessible to wearable apparatus 400.
  • Computer-readable storage medium 404 stores instructions 406, 408, 410, and 412.
  • wearable apparatus 400 may include an HMD device and an input device to input data or command to the HMD device.
  • processor 402 is implemented as part of the HMD device.
  • Instructions 406 may be executed by processor 402 to establish a first connection with a host device via a first physical layer interface of a wireless device according to a wireless communication protocol.
  • the wireless communication protocol is associated with a wireless connection that uses a TDD communication.
  • Instructions 408 may be executed by processor 402 to establish a second connection with the host device via a second physical layer interface of the wireless device according to the wireless communication protocol.
  • the first connection and the second connection may be established via a first channel and a second channel, respectively, having non-overlapping frequencies.
  • the first channel and the second channel are associated with a common frequency band.
  • the first channel is associated with a first frequency band and the second channel is associated with a second frequency band that is different from the first frequency band.
  • Instructions 410 may be executed by processor 402 to utilize the first physical layer interface to transmit user input data and sensor data to the host device via the first connection.
  • instructions to utilize the first physical layer interface to transmit the user input data, the sensor data, or both to the host device include instructions to utilize a first media access controller (MAC) connected to the first physical layer interface to transmit the user input data, the sensor data, or both.
  • the first physical layer interface may be an interface between the first MAC and the first channel.
  • Instructions 412 may be executed by processor 402 to utilize the second physical layer interface to receive multimedia content and interactive data from the host device via the second connection.
  • the multimedia content and the interactive data are generated based on the user input data and the sensor data.
  • instructions to utilize the second physical layer interface to receive the multimedia content and the interactive data from the host device include instructions to utilize a second MAC connected to the second physical layer interface to receive the multimedia content and the interactive data.
  • the second physical layer interface may be an interface between the second MAC and the second channel.
  • FIG. 5A is a block diagram of a computing environment 500A, depicting data communication between a host device 502 and an HMD device 504 via a first channel and a second channel in a common frequency band.
  • host device 502 includes a wireless device 506A having a dual physical layer interface (e.g., a first physical layer interface 508A and a second physical layer interface 510A) and an antenna 512A connected to the dual physical layer interface.
  • HMD device 504 includes a wireless device 506B having a dual physical layer interface (e.g., a third physical layer interface 508B and a fourth physical layer interface 510B) and an antenna 512B connected to the dual physical layer interface.
  • a first connection (e.g., a first channel) is established between host device 502 and HMD device 504 via first physical layer interface 508A and third physical layer interface 508B according to a wireless communication protocol.
  • a second connection (e.g., a second channel) may be established between host device 502 and HMD device 504 via second physical layer interface 510A and fourth physical layer interface 510B according to the wireless communication protocol.
  • the wireless communication protocol is associated with a wireless connection that uses a TDD communication.
  • the wireless communication protocol is a Bluetooth protocol, an Ultra-wideband (UWB) communication protocol, a Near-Field Communication (NFC) protocol, a Zigbee communication protocol, an Infrared communication protocol, or the like.
  • the first channel may be used to transmit multimedia content and interactive data from host device 502 to HMD device 504 using antennas 512A and 512B.
  • the second channel may be used to transmit user input data and sensor data from HMD device 504 to host device 502 using antennas 512A and 512B.
  • the first channel and the second channel are associated with a common frequency band.
  • the frequency band can be 2.4GHz band, 5GHz band, or 6GHz band.
  • the first connection may use channel 3 and communication frequency of 40MHz bandwidth in 2.4GHz band and the second connection may use channel 11 and communication frequency of 20MHz bandwidth in 2.4GHz band.
  • two non-frequency-overlapped channels are selected in the 2.4GHz, 5GHz, or 6GHz band.
  • FIG. 5B is a block diagram of a computing environment 500B, depicting data communication between host device 502 and HMD device 504 of FIG. 5A via the first channel and the second channel in different frequency bands.
  • similarly named elements of FIG. 5B may be similar in structure and/or function to elements described with respect to FIG. 5A.
  • the first channel and the second channel are associated with different frequency bands.
  • the first channel is associated with a first frequency band (e.g., 2.4GHz band) and the second channel is associated with a second frequency band (e.g., 5GHz band) that is different from the first frequency band.
  • the first frequency is 2.4GHz band and the second frequency is 6GHz band.
  • the first frequency is 5GHz band and the second frequency is 6GHz band.
  • the first connection may use channel 50 and communication frequency of 160MHz bandwidth in 5GHz band and the second connection may use channel 3 and communication frequency of 40MHz bandwidth in 2.4GHz band.
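The non-overlapping channel selections listed above can be sketched in Python. The 2.4GHz channel center frequencies below are the standard Wi-Fi values; the helper functions themselves are illustrative assumptions and not part of the described examples:

```python
# Illustrative sketch: checking that two channels share no frequency range,
# so a first and second connection in the same band do not interfere.

def channel_range(center_mhz, bandwidth_mhz):
    """Return the (low, high) frequency edges of a channel in MHz."""
    half = bandwidth_mhz / 2
    return (center_mhz - half, center_mhz + half)

def non_overlapping(ch_a, ch_b):
    """True if two (center, bandwidth) channels share no frequency range."""
    lo_a, hi_a = channel_range(*ch_a)
    lo_b, hi_b = channel_range(*ch_b)
    return hi_a <= lo_b or hi_b <= lo_a

# 2.4GHz band: channel 3 centered at 2422 MHz with 40 MHz bandwidth,
# channel 11 centered at 2462 MHz with 20 MHz bandwidth.
first_connection = (2422, 40)   # channel 3, 40 MHz
second_connection = (2462, 20)  # channel 11, 20 MHz
print(non_overlapping(first_connection, second_connection))  # True
```

The same check applies when the two channels sit in different bands (e.g., 2.4GHz and 5GHz), where non-overlap holds trivially.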

Abstract

In an example, a wearable apparatus includes a head-mounted display (HMD) device, a wireless device, and a processor. The wireless device may include a first physical layer interface to communicate via a first communication frequency in accordance with a wireless communication protocol and a second physical layer interface to communicate via a second communication frequency in accordance with the wireless communication protocol. The second communication frequency may be different from the first communication frequency. During operation, the processor may receive a first data stream from a host device via the first physical layer interface to render the first data stream on the HMD device and transmit a second data stream to the host device via the second physical layer interface. The second data stream may be used to control the first data stream that is transmitted from the host device.

Description

WEARABLE APPARATUSES WITH DUAL PHYSICAL LAYER INTERFACES
BACKGROUND
[0001] Virtual reality (VR) devices are used for different applications. For example, a user wears and operates a VR headset (e.g., a head-mounted display (HMD) device) to view content, play a video game, conduct a virtual meeting, or perform other online-related activities. The HMD device may connect to a host device capable of executing (e.g., running) programs to generate a reality in the case of a VR, or to enhance the reality in the case of an augmented reality (AR). The HMD device may use an input (e.g., audio data, video data, haptic data, and the like) from a computer program (e.g., a VR program and/or an AR program running in the host device) to allow a wearer to experience the VR and/or the AR. Further, the VR devices may transmit user input data and/or sensor data to the host device to control input interfaces for objects in virtual environments and augmented reality environments.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Examples are described in the following detailed description and in reference to the drawings, in which:
[0003] FIG. 1A is a block diagram of an example wearable apparatus, including a wireless device having a dual physical layer interface to transmit/receive data stream to/from a host device;
[0004] FIG. 1B is a block diagram of the example wearable apparatus of FIG. 1A, depicting additional features;
[0005] FIG. 2A is a block diagram of the example wearable apparatus of FIG. 1A, depicting a duplexer to combine a first physical layer interface and a second physical layer interface of the wireless device;
[0006] FIG. 2B is a block diagram of the example wearable apparatus of FIG. 1A, depicting a unified media access control (MAC) connected to the first physical layer interface and the second physical layer interface;
[0007] FIG. 3 is a block diagram of an example wearable apparatus, including a wireless device having a dual physical layer interface to communicate with a host device;
[0008] FIG. 4 is a block diagram of an example wearable apparatus including a non-transitory computer-readable storage medium, storing instructions to communicate with a host device via a dual physical layer interface;
[0009] FIG. 5A is a block diagram of a computing environment, depicting data communication between a host device and a head-mounted display (HMD) device via a first channel and a second channel in a common frequency band; and
[0010] FIG. 5B is a block diagram of a computing environment, depicting data communication between the host device and the HMD device of FIG. 5A via the first channel and the second channel in different frequency bands.
DETAILED DESCRIPTION
[0011] Modern computing and display technologies have facilitated the development of systems for "virtual reality" (VR), "augmented reality" (AR), or "mixed reality" (MR) experiences, where digitally reproduced images or portions thereof are presented to a user in a manner where they seem to be, or may be perceived as, real. A VR scenario may involve presentation of digital or virtual image information without transparency to other actual real-world visual input. An AR scenario may involve presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user. Further, an MR scenario may be related to merging real and virtual worlds to produce new environments where physical and virtual objects co-exist and interact in real-time.
[0012] In such scenarios, users can explore the virtual environment using a head-mounted display (HMD) device, often in conjunction with an input device. The HMD devices can take a variety of forms such as glasses, goggles, helmets, and the like. The HMD devices may display a virtual environment in front of user’s eyes. The input device may exchange commands and data with the HMD device through either a wired or wireless connection. An example input device may be a gaming controller, an electronic pen, a joystick, or any other device that allows the user to respond to information displayed on the HMD device through a touch, gesture, proximity, hovering input, or the like. For example, the user may wear and operate the HMD device to view content, play a video game, conduct a virtual meeting, or perform other online-related activities, using the input device. Further, the users may explore the virtual world by moving through the physical environment, where such movements correspond to and control movements in the virtual world.
[0013] Further, the HMD device may be communicatively connected to a host device (e.g., a personal computer, a cloud-based computing device, or the like), which may be capable of executing programs to generate a reality in the case of the VR, to enhance the reality in the case of the AR, or the like. During operation, the HMD device may receive a first data stream from the host device, which may be displayed on a display panel of the HMD device. Further, the HMD device may transmit a second data stream to the host device. In this example, the second data stream is used to control the first data stream that is transmitted from the host device.
[0014] Thus, data transmission may be divided into two categories: the first data stream (e.g., video, audio, haptic data, and the like) from the host device to the HMD device, and the second data stream (e.g., gesture data, position data, control data, and the like) from the HMD device to the host device. In an example VR, AR, or MR scenario, the HMD device may utilize a single wireless channel corresponding to a duplex data communication (e.g., time division duplexing (TDD)) for receiving the first data stream from the host device and transmitting the second data stream to the host device.
[0015] In such HMD devices that utilize a single wireless channel for data communication, data transmission to the host device and data reception from the host device cannot be done at the same time because the wireless connection’s hardware follows time division in data transmission and reception. In this example, the host device may have to wait for the HMD device to finish the second data stream transmission before the host device transmits the first data stream to the HMD device, and vice versa. Therefore, the HMD device that utilizes a single wireless channel for data communication may cause a latency in the data transmission/reception and may bring a non-real-time experience and motion sickness sensation to the user. Also, the time division of the single wireless channel design in the data transmission and reception can result in a reduced capacity for realizing real-time high video quality/resolution with high refresh rates when the HMD device has to transmit significantly large tracking/sensing/other data back to the host device.
[0016] For example, a motion-to-photon (MTP) latency is measured by the time from the user moving his head to the display changing in the HMD device. In this example, a transmission latency from the HMD device to the host device and then back from the host device to the HMD device may cause the MTP latency in a wireless connection (e.g., a Wi-Fi connection). The latency between the head movement and the display changing in the HMD device can lead to a detached VR experience and contribute to the motion sickness sensation.
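The latency contrast at the heart of this paragraph can be illustrated with hypothetical timing numbers (the 4 ms uplink and 8 ms downlink figures are assumptions for the sketch, not values from the examples):

```python
# Hypothetical numbers illustrating why time-division sharing of a single
# channel inflates motion-to-photon (MTP) latency: on one TDD channel the
# downlink frame must wait for the uplink transmission to finish, while
# two separate channels let both directions proceed concurrently.

uplink_ms = 4.0    # HMD -> host: tracking/sensing data (assumed)
downlink_ms = 8.0  # host -> HMD: rendered video frame (assumed)

# Single shared channel: transmissions are serialized into time slots.
single_channel_round_trip = uplink_ms + downlink_ms

# Dual physical layer interfaces: uplink and downlink overlap in time,
# so the round trip is bounded by the slower direction.
dual_channel_round_trip = max(uplink_ms, downlink_ms)

print(single_channel_round_trip)  # 12.0
print(dual_channel_round_trip)    # 8.0
```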
[0017] Consider an example where the HMD device has to transmit data that can consume a significant amount of time for transmission. In this example, the latency time can be increased for the HMD device to receive the data from the host device, particularly when significantly large data has to be transmitted from the host device to the HMD device for the enhanced video quality with a high refresh rate. Further, the increased latency time may cause a lag in the HMD device's video/audio and body interaction system (e.g., body-mounted sensors) controlled by the HMD device. In this example, the user experience may be affected by the delayed visual/hearing/haptic responses.
[0018] On the other hand, if the HMD device cannot transmit the user's movement or sensing data back to the host device in time for processing due to the wireless channel being occupied by the host device's transmission at a current instance, then the HMD device may receive delayed data from the host device, which may affect the user experience.
[0019] In other examples, when the HMD device has to transmit a significant amount of data back to the host device, then a data bandwidth may be shared by the HMD device for transmission of the data, in real-time, to the host device. Such sharing of the data bandwidth for data transmission from the HMD device to the host device may reduce the data bandwidth for the data transmission from the host device to the HMD device, thereby affecting the video/audio output of the HMD device.
[0020] Examples described herein provide a wearable apparatus including an input device and an HMD device. The input device may receive sensor data (e.g., position data, environment data, or the like), user input data (e.g., gesture input, control input, or the like), or both. The HMD device may include an antenna and a wireless device connected to the antenna. The wireless device may include a first physical layer interface to communicate via a first communication frequency in accordance with a wireless communication protocol (e.g., a wireless communication protocol that uses a time division duplexing (TDD) communication) and a second physical layer interface to communicate via a second communication frequency in accordance with the wireless communication protocol. Further, the wearable apparatus may include a processor. During operation, the processor may generate a first data stream based on the sensor data, user input data, or both and transmit the first data stream to a host device via the first physical layer interface. Further, the processor may receive a second data stream from the host device via the second physical layer interface. The second data stream may be generated at the host device according to a data source and the first data stream.
[0021] Thus, examples described herein provide a wireless device with dual physical layer interfaces, where one physical layer interface may be utilized to transmit the data from the HMD device to the host device via a first channel, and the other physical layer interface may be used for the data transmission from the host device to the HMD device via a second channel. Hence, examples described herein provide a full duplex data transmission between the HMD device and the host device to reduce the latency time (or approach "zero-wait") and enable the host device to transmit the video data (e.g., with maximum video quality and minimum video compression) without sharing data bandwidth in the same channel for receiving data from the HMD device.
[0022] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present techniques. However, the example apparatuses, devices, and systems may be practiced without these specific details. Reference in the specification to "an example" or similar language means that a particular feature, structure, or characteristic described may be included in at least that one example but may not be in other examples.
[0023] Turning now to the figures, FIG. 1A is a block diagram of an example wearable apparatus 100, including a wireless device 104 having a dual physical layer interface to transmit/receive data stream to/from a host device 112. Wearable apparatus 100 may present two-dimensional (2D) or three-dimensional (3D) virtual images (i.e., virtual content) to a user. The images may be still images, frames of a video, or a video, or any combination thereof. Further, wearable apparatus 100 can present the virtual content in a Virtual Reality (VR) environment, an Augmented Reality (AR) environment, or a Mixed Reality (MR) environment for user interaction.
[0024] As shown in FIG. 1A, wearable apparatus 100 includes a head-mounted display (HMD) device 102 to present the virtual content. For example, HMD device 102 is a VR headset. HMD device 102 may connect to host device 112 (e.g., a computing device) capable of executing (e.g., running) programs to generate a reality in the case of a VR, or to enhance the reality in the case of an AR. Host device 112 may execute the program, receive images of a physical environment from HMD device 102, analyze the images, and send data to HMD device 102. Further, HMD device 102 may use an input from a computer program (e.g., a VR program and/or an AR program running in host device 112) to allow a wearer to experience the VR and/or the AR. HMD device 102 may be connected to host device 112 executing the VR and/or AR program via a wireless connection.
[0025] As used herein, HMD device 102 may refer to a device worn on a head which may display visual data to the user. In an example, a user operates the VR headset to play a role in a video game. The VR headset may display a high-resolution graphical user interface, play 3D audio, and provide a voice service for communication with other users. Similarly, the VR headset can be used for other applications such as to view content, conduct a virtual meeting, or perform other online-related activities.
[0026] Further, wearable apparatus 100 includes wireless device 104. In an example, wireless device 104 includes a first physical layer interface 106 to communicate via a first communication frequency in accordance with a wireless communication protocol. The wireless communication protocol may be associated with any wireless connection that uses a time division duplexing (TDD) communication. The TDD communication may refer to duplex communication links where uplink is separated from downlink by an allocation of different time slots in a same frequency band. For example, the wireless communication protocol is a wireless fidelity (Wi-Fi) protocol, a Bluetooth protocol, 3G/4G/5G wireless standards, or the like. In other examples, the wireless communication protocol can be other short-range or long-range communication protocols such as a ZigBee® protocol, a Z-Wave® protocol, an IEEE 802.15.4 protocol, a Long-Term Evolution Direct (LTE-D) protocol, or the like. Further, wireless device 104 includes a second physical layer interface 108 to communicate via a second communication frequency in accordance with the wireless communication protocol. The second communication frequency may be different from the first communication frequency.
[0027] Furthermore, wearable apparatus 100 includes a processor 110. As used herein, the term “processor" may refer to, for example, a central processing unit (CPU), a semiconductor-based microprocessor, a digital signal processor (DSP) such as a digital image processing unit, or other hardware devices or processing elements suitable to retrieve and execute instructions stored in a storage medium, or suitable combinations thereof. A processor may, for example, include single or multiple cores on a chip, multiple cores across multiple chips, multiple cores across multiple devices, or suitable combinations thereof. A processor may be functional to fetch, decode, and execute instructions as described herein.
[0028] During operation, processor 110 may receive a first data stream from host device 112 via first physical layer interface 106 (e.g., using the first communication frequency) to render the first data stream on HMD device 102. For example, the first data stream includes audio data, video data, text data, haptic data, or any combination thereof. Further, processor 110 may transmit a second data stream to host device 112 via second physical layer interface 108 (e.g., using the second communication frequency). In an example, the second data stream is used to control the first data stream that is transmitted from host device 112.
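The simultaneous receive/transmit behavior of processor 110 can be sketched as follows; the thread-per-interface structure and all names are illustrative assumptions rather than the described implementation:

```python
# Sketch: receive the first data stream on one physical layer interface
# while transmitting the second data stream on the other, with one thread
# per interface so neither direction blocks the other.

import queue
import threading

def run_duplex(first_phy_rx, second_phy_tx, render, frames, samples):
    """Drain frames from the first interface into the renderer while
    pushing sensor samples out through the second interface."""
    def receiver():
        for frame in frames:       # stand-in for reads from the first PHY
            first_phy_rx.put(frame)
            render(frame)

    def transmitter():
        for sample in samples:     # stand-in for sensor readings
            second_phy_tx.put(sample)

    rx = threading.Thread(target=receiver)
    tx = threading.Thread(target=transmitter)
    rx.start(); tx.start()
    rx.join(); tx.join()

rendered = []
rx_q, tx_q = queue.Queue(), queue.Queue()
run_duplex(rx_q, tx_q, rendered.append,
           frames=["video0", "video1"], samples=["pose0", "pose1"])
print(rendered)        # ['video0', 'video1']
print(tx_q.qsize())    # 2
```

In a real wireless device the two directions would be driven by hardware, but the sketch shows the key point: neither queue waits on the other.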
[0029] FIG. 1B is a block diagram of example wearable apparatus 100 of FIG. 1A, depicting additional features. For example, similarly named elements of FIG. 1B may be similar in structure and/or function to elements described with respect to FIG. 1A. As shown in FIG. 1B, HMD device 102 includes a display panel 152 and an audio device 154. Display panel 152 may output image data, text data, video data, or any combination thereof of the first data stream. Further, audio device 154 may output audio data of the first data stream, capture voice input data, or a combination thereof.
[0030] During operation, processor 110 may provide the first data stream from host device 112 to display panel 152 and audio device 154 via first physical layer interface 106 of wireless device 104. Further, processor 110 may provide a voice input from audio device 154 to host device 112 via second physical layer interface 108 of wireless device 104.
[0031] Further as shown in FIG. 1B, host device 112 includes a wireless device 162. In an example, wireless device 162 includes a dual physical layer interface, i.e., a third physical layer interface 164 to communicate with first physical layer interface 106 of wearable apparatus 100 and a fourth physical layer interface 166 to communicate with second physical layer interface 108 of wearable apparatus 100. During operation, host device 112 may transmit the first data stream to wearable apparatus 100 via third physical layer interface 164 of wireless device 162. Further, host device 112 (e.g., a processor of host device 112) may receive the second data stream via fourth physical layer interface 166 of wireless device 162. Furthermore, host device 112 may process the second data stream to modify the first data stream based on the second data stream and the data source.
[0032] Furthermore, as shown in FIG. 1B, wearable apparatus 100 includes a position sensing device 156, a user input device 158, an environment sensing device 160, or any combination thereof. In an example, position sensing device 156 obtains position data associated with HMD device 102. For example, the position data includes movement information, orientation information, tilt angle information, location information, or any combination thereof associated with HMD device 102. Further, user input device 158 may obtain user input data. For example, the user input data includes a button input, a voice input, a gesture input, or any combination thereof. Furthermore, environment sensing device 160 obtains environment data of a vicinity of HMD device 102. For example, the environment data includes image data, video data, depth information, or any combination thereof associated with a physical environment in the vicinity of HMD device 102. During operation, processor 110 may generate the second data stream based on the position data, the user input data, the environment data, or any combination thereof.
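The generation of the second data stream from the position data, user input data, and environment data described in paragraph [0032] can be sketched as follows; the field names and JSON packing are assumptions made for illustration, not the described format:

```python
# Sketch: combine whichever inputs are available (position, user input,
# environment) into a single uplink payload for the second data stream.

import json

def build_second_data_stream(position=None, user_input=None, environment=None):
    """Pack the available inputs into one serialized uplink payload."""
    payload = {}
    if position is not None:
        payload["position"] = position        # movement/orientation/tilt
    if user_input is not None:
        payload["user_input"] = user_input    # button/voice/gesture
    if environment is not None:
        payload["environment"] = environment  # image/depth of the vicinity
    return json.dumps(payload).encode("utf-8")

stream = build_second_data_stream(
    position={"yaw": 12.5, "pitch": -3.0},
    user_input={"button": "A"},
)
print(json.loads(stream))
```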
[0033] FIG. 2A is a block diagram of example wearable apparatus 100 of FIG. 1A, depicting a duplexer 206 to combine first physical layer interface 106 and second physical layer interface 108 of wireless device 104. For example, similarly named elements of FIG. 2A may be similar in structure and/or function to elements described with respect to FIG. 1A. As shown in FIG. 2A, wireless device 104 includes a first media access controller (MAC) 202 and a second MAC 204. In an example, first MAC 202 is communicatively connected to first physical layer interface 106 to receive the first data stream from host device 112 via first physical layer interface 106. Further, second MAC 204 is communicatively connected to second physical layer interface 108 to transmit the second data stream to host device 112 via second physical layer interface 108.
[0034] First MAC 202 may have a first MAC address to wirelessly communicate with host device 112 via first physical layer interface 106. Further, second MAC 204 may have a second MAC address to wirelessly communicate with host device 112 via second physical layer interface 108. Further, physical layer interfaces 106 or 108 may send and receive data packets of an application wirelessly. For example, first physical layer interface 106 is connected to or provided with an antenna 208 to facilitate communication with host device 112 via a first channel. Further, second physical layer interface 108 may be connected to or provided with antenna 208 to facilitate communication with host device 112 via a second channel.
[0035] In an example, first MAC 202 operates between an upper network layer (e.g., a logical link control layer) and first physical layer interface 106. Similarly, second MAC 204 may operate between the upper network layer and second physical layer interface 108. During operation, a MAC (e.g., MAC 202 or 204) may process network data received from the upper network layer and send it to host device 112 via a corresponding physical layer interface (e.g., physical layer interface 106 or 108). Similarly, the MAC may also process network data received from the corresponding physical layer interface and send it to the upper network layer for further processing.
[0036] Further, physical layer interface 106 or 108 may transform data received from corresponding MAC 202 or 204 into signals for transmission and transmit the signals to host device 112. Furthermore, physical layer interface 106 or 108 may also receive signals from host device 112, convert the received signals into data, and provide the data to corresponding MAC 202 or 204. In some examples, physical layer interfaces 106 and 108 may be compliant with specifications, such as 802.11 standards. Processor 110 may coordinate between first MAC 202 and second MAC 204 to perform the simultaneous operations (i.e., receiving the first data stream and transmitting the second data stream at a same time as described with respect to FIG. 1A).
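The dual-MAC arrangement of FIG. 2A, in which each MAC has its own address and drives its own physical layer interface, can be sketched as follows (the class, address, and channel names are hypothetical):

```python
# Sketch: two MAC controllers, each with a distinct MAC address, bound to
# separate physical layer channels so downlink and uplink traffic stay on
# independent MAC/PHY pairs.

class Mac:
    def __init__(self, address, phy_channel):
        self.address = address
        self.phy_channel = phy_channel
        self.log = []

    def send(self, payload):
        # The bound PHY would turn MAC frames into signals on its channel.
        self.log.append(("tx", self.phy_channel, payload))

    def receive(self, payload):
        self.log.append(("rx", self.phy_channel, payload))

first_mac = Mac(address="02:00:00:00:00:01", phy_channel="channel-1")
second_mac = Mac(address="02:00:00:00:00:02", phy_channel="channel-2")

first_mac.receive("video frame")   # first data stream, host -> HMD
second_mac.send("sensor sample")   # second data stream, HMD -> host
print(first_mac.log[0][1], second_mac.log[0][1])  # channel-1 channel-2
```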
[0037] Further as shown in FIG. 2A, wireless device 104 includes duplexer 206 to enable first physical layer interface 106 and second physical layer interface 108 to share a common antenna 208. For example, duplexer 206 is a three-port filtering device which allows transmitters and receivers operating at different communication frequencies to share same antenna 208. An advantage of using duplexer 206 is that duplexer 206 may transmit and receive data via a single antenna 208. Furthermore, host device 112 may include an antenna 212 to transmit (e.g., the first data stream) and receive (e.g., the second data stream) data to/from wireless device 104, respectively, via respective wireless channels 210.
[0038] FIG. 2B is a block diagram of example wearable apparatus 100 of FIG. 1A, depicting a unified MAC 252 connected to first physical layer interface 106 and second physical layer interface 108. As shown in FIG. 2B, wireless device 104 includes unified MAC 252 connected to first physical layer interface 106 and second physical layer interface 108. In an example, unified MAC 252 receives multimedia content from host device 112 via first physical layer interface 106 and transmits sensor data to host device 112 via second physical layer interface 108.
[0039] In an example, unified MAC 252 is provided for controlling multiple physical layer interfaces (e.g., first physical layer interface 106 and second physical layer interface 108), such as for controlling concurrent data communication over first physical layer interface 106 and second physical layer interface 108. The term “concurrent” may refer to two events occurring at a same time or a portion of the two events occurring at the same time (i.e., the two events occurring at an overlapping period of time). Further, multiple physical layer interfaces (e.g., first physical layer interface 106 and second physical layer interface 108) may be configured to communicate with host device 112 over wireless channel 210. For example, unified MAC 252 supports transmitting/receiving packets (e.g., data, management, extension, acknowledgements, and the like) by any physical layer interface or by multiple physical layer interfaces in an order. Further, the data may be transmitted in chunks or in packets.
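The unified-MAC alternative of FIG. 2B can be sketched as follows; the dispatch rule routing downlink multimedia to the first interface and uplink sensor data to the second is an assumption for illustration:

```python
# Sketch: one MAC controls both physical layer interfaces, routing each
# packet to the appropriate interface so both directions can be in flight
# concurrently.

class UnifiedMac:
    def __init__(self):
        self.phy = {"first": [], "second": []}

    def handle(self, packet):
        # Multimedia from the host goes through the first interface;
        # sensor data leaves through the second.
        if packet["direction"] == "downlink":
            self.phy["first"].append(packet["data"])
        else:
            self.phy["second"].append(packet["data"])

mac = UnifiedMac()
mac.handle({"direction": "downlink", "data": "multimedia chunk"})
mac.handle({"direction": "uplink", "data": "sensor chunk"})
print(mac.phy["first"], mac.phy["second"])
```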
[0040] FIG. 3 is a block diagram of an example wearable apparatus 300, including a wireless device 308 having a dual physical layer interface to communicate with a host device 316. As shown in FIG. 3, wearable apparatus 300 includes an input device 302 to receive sensor data, user input data, or both. Further, wearable apparatus 300 may include an HMD device 304 communicatively connected to input device 302. For example, the sensor data includes position data of a user wearing HMD device 304, environment data surrounding the user, or a combination thereof. The user input data may include a button input, a voice input, a gesture input, or any combination thereof. For example, HMD device 304 is a VR HMD device, an AR HMD device, or an MR HMD device.
[0041] For example, input device 302 can include a trackpad, a touchscreen, a joystick, a multiple degree-of-freedom (DOF) controller, a game controller, a keyboard, a mouse, a directional pad (D-pad), a wand, a haptic device, a totem (e.g., functioning as a virtual input device), a wearable sensor, an image sensor (e.g., a camera), or the like. A multi-DOF controller can sense user input in some or all possible translations (e.g., left/right, forward/backward, or up/down) or rotations (e.g., yaw, pitch, or roll) of the controller. A multi-DOF controller which supports the translation movements may be referred to as a 3DOF controller, while a multi-DOF controller which supports the translations and rotations may be referred to as a 6DOF controller. In some cases, the user may use a finger (e.g., a thumb) to press or swipe on a touch-sensitive input device to provide input to HMD device 304. Input device 302 may be held by the user's hand during the use of wearable apparatus 300. Input device 302 can be in a wired communication or a wireless communication with HMD device 304.
[0042] Further, HMD device 304 includes an antenna 306, wireless device 308 connected to antenna 306, and a processor 314 connected to wireless device 308. Furthermore, wireless device 308 includes first physical layer interface 310 and second physical layer interface 312. First physical layer interface 310 may communicate via a first communication frequency in accordance with a wireless communication protocol. Second physical layer interface 312 may communicate via a second communication frequency in accordance with the wireless communication protocol.
[0043] During operation, processor 314 may generate a first data stream based on the sensor data, user input data, or both. Further, processor 314 may transmit the first data stream to host device 316 via first physical layer interface 310. In an example, host device 316 includes another antenna 318 to receive the first data stream. Further, host device 316 may include a processor to process the first data stream to generate a second data stream and transmit the second data stream via antenna 318.
[0044] Further during operation, processor 314 may receive the second data stream from host device 316 via second physical layer interface 312. In an example, the second data stream is generated at host device 316 according to a data source and the first data stream.
[0045] In an example, wireless device 308 includes a first media access controller (MAC) and a second MAC. The first MAC may be communicatively connected to first physical layer interface 310 to transmit the first data stream to host device 316 via the first communication frequency. Further, the second MAC may be communicatively connected to second physical layer interface 312 to receive the second data stream from host device 316 via the second communication frequency.
[0046] In another example, wireless device 308 includes a unified MAC connected to first physical layer interface 310 and second physical layer interface 312. The unified MAC may transmit the first data stream to host device 316 via first physical layer interface 310 and receive the second data stream from host device 316 via second physical layer interface 312. In yet another example, wireless device 308 includes a duplexer to combine first physical layer interface 310 and second physical layer interface 312 operating at different communication frequencies to utilize antenna 306.
[0047] FIG. 4 is a block diagram of an example wearable apparatus 400 including a non-transitory computer-readable storage medium 404, storing instructions to communicate with a host device via a dual physical layer interface. In an example, wearable apparatus 400 is a head-mounted display (HMD) device. Wearable apparatus 400 includes a processor 402 and computer-readable storage medium 404 communicatively coupled through a system bus. Processor 402 may be any type of CPU, microprocessor, or processing logic that interprets and executes computer-readable instructions stored in computer-readable storage medium 404.
[0048] Computer-readable storage medium 404 may be a random-access memory (RAM) or another type of dynamic storage device that may store information and computer-readable instructions that may be executed by processor 402. For example, computer-readable storage medium 404 may be synchronous DRAM (SDRAM), double data rate (DDR), Rambus® DRAM (RDRAM), Rambus® RAM, and the like, or storage memory media such as a floppy disk, a hard disk, a CD-ROM, a DVD, a pen drive, and the like. In an example, computer-readable storage medium 404 may be a non-transitory computer-readable medium, where the term "non-transitory" does not encompass transitory propagating signals. In an example, computer-readable storage medium 404 may be remote but accessible to wearable apparatus 400.
[0049] Computer-readable storage medium 404 stores instructions 406, 408, 410, and 412. In an example, wearable apparatus 400 may include an HMD device and an input device to input data or commands to the HMD device. In an example, processor 402 is implemented as part of the HMD device.
[0050] Instructions 406 may be executed by processor 402 to establish a first connection with a host device via a first physical layer interface of a wireless device according to a wireless communication protocol. For example, the wireless communication protocol is associated with a wireless connection that uses a TDD communication.
[0051] Instructions 408 may be executed by processor 402 to establish a second connection with the host device via a second physical layer interface of the wireless device according to the wireless communication protocol. The first connection and the second connection may be established via a first channel and a second channel, respectively, having non-overlapping frequencies. In an example, the first channel and the second channel are associated with a common frequency band. In another example, the first channel is associated with a first frequency band and the second channel is associated with a second frequency band that is different from the first frequency band.
[0052] Instructions 410 may be executed by processor 402 to utilize the first physical layer interface to transmit user input data and sensor data to the host device via the first connection. In an example, instructions to utilize the first physical layer interface to transmit the user input data, the sensor data, or both to the host device include instructions to utilize a first media access controller (MAC) connected to the first physical layer interface to transmit the user input data, the sensor data, or both. The first physical layer interface may be an interface between the first MAC and the first channel.
[0053] Instructions 412 may be executed by processor 402 to utilize the second physical layer interface to receive multimedia content and interactive data from the host device via the second connection. For example, the multimedia content and the interactive data are generated based on the user input data and the sensor data. In an example, instructions to utilize the second physical layer interface to receive the multimedia content and the interactive data from the host device include instructions to utilize a second MAC connected to the second physical layer interface to receive the multimedia content and the interactive data. The second physical layer interface may be an interface between the second MAC and the second channel.
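The flow of instructions 406–412 can be summarized as a short round trip: open two connections on non-overlapping channels, push input and sensor data uplink, and receive content the host generates from that data. The Python sketch below is illustrative only; the function names and the channel/bandwidth values (borrowed from the example in paragraph [0056]) are hypothetical stand-ins, not an actual protocol implementation:

```python
# Illustrative sketch (hypothetical names) of the instruction flow in
# paragraphs [0050]-[0053]: two connections, uplink for input/sensor
# data, downlink for content generated from that data.

def establish_connection(channel, bandwidth_mhz):
    # Stand-in for protocol negotiation over one physical layer interface.
    return {"channel": channel, "bandwidth_mhz": bandwidth_mhz, "open": True}

def host_render(user_input, sensor_data):
    # Stand-in for the host generating multimedia/interactive data
    # from the wearable's uplink stream.
    return {"frame_for": (user_input, sensor_data)}

def session_round_trip(user_input, sensor_data):
    # Instructions 406/408: establish both connections on
    # non-overlapping channels (example values from paragraph [0056]).
    uplink = establish_connection(channel=3, bandwidth_mhz=40)
    downlink = establish_connection(channel=11, bandwidth_mhz=20)
    assert uplink["open"] and downlink["open"]
    # Instructions 410: transmit input and sensor data via the first connection.
    sent = (user_input, sensor_data)
    # Instructions 412: receive content generated from that data
    # via the second connection.
    return host_render(*sent)
```

The key property the sketch preserves is that the downlink content is a function of what was transmitted uplink, matching the statement that the multimedia content and interactive data are generated based on the user input data and the sensor data.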
[0054] FIG. 5A is a block diagram of a computing environment 500A, depicting data communication between a host device 502 and an HMD device 504 via a first channel and a second channel in a common frequency band. As shown in FIG. 5A, host device 502 includes a wireless device 506A having a dual physical layer interface (e.g., a first physical layer interface 508A and a second physical layer interface 510A) and an antenna 512A connected to the dual physical layer interface. Further, HMD device 504 includes a wireless device 506B having a dual physical layer interface (e.g., a third physical layer interface 508B and a fourth physical layer interface 510B) and an antenna 512B connected to the dual physical layer interface.
[0055] In an example, a first connection (e.g., a first channel) is established between host device 502 and HMD device 504 via first physical layer interface 508A and third physical layer interface 508B according to a wireless communication protocol. Further, a second connection (e.g., a second channel) may be established between host device 502 and HMD device 504 via second physical layer interface 510A and fourth physical layer interface 510B according to the wireless communication protocol. In an example, the wireless communication protocol is associated with a wireless connection that uses a TDD communication. In other examples, the wireless communication protocol is a Bluetooth protocol, an Ultra-wideband (UWB) communication protocol, a Near-Field Communication (NFC) protocol, a Zigbee communication protocol, an Infrared communication protocol, or the like.
[0056] During operation, the first channel may be used to transmit multimedia content and interactive data from host device 502 to HMD device 504 using antennas 512A and 512B. Further, the second channel may be used to transmit user input data and sensor data from HMD device 504 to host device 502 using antennas 512A and 512B. In an example, the first channel and the second channel are associated with a common frequency band. For example, the frequency band can be the 2.4GHz band, the 5GHz band, or the 6GHz band. For example, the first connection may use channel 3 and a communication frequency of 40MHz bandwidth in the 2.4GHz band, and the second connection may use channel 11 and a communication frequency of 20MHz bandwidth in the 2.4GHz band. In this example, two non-frequency-overlapping channels are selected in the 2.4GHz, 5GHz, or 6GHz band.
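The example channel pair above can be checked numerically. The sketch below is not from the patent: it uses the standard IEEE 802.11 2.4 GHz channel plan (channel n centered at 2407 + 5n MHz) and simplifies each channel to a symmetric span of its stated bandwidth around the center frequency (real 40 MHz operation uses a primary/secondary channel layout, which this model ignores):

```python
# Check (simplified model, standard 2.4 GHz Wi-Fi channel math, not from
# the patent) that channel 3 at 40 MHz and channel 11 at 20 MHz occupy
# non-overlapping frequency ranges.

def channel_center_mhz(channel):
    # 2.4 GHz band: channels 1-13 are centered at 2407 + 5 * n MHz.
    return 2407 + 5 * channel

def channel_range_mhz(channel, bandwidth_mhz):
    # Simplification: treat the channel as a symmetric span around its center.
    center = channel_center_mhz(channel)
    return (center - bandwidth_mhz / 2, center + bandwidth_mhz / 2)

def overlaps(range_a, range_b):
    # Two half-open intervals overlap iff each starts before the other ends.
    return range_a[0] < range_b[1] and range_b[0] < range_a[1]

first = channel_range_mhz(3, 40)    # (2402.0, 2442.0)
second = channel_range_mhz(11, 20)  # (2452.0, 2472.0)
print(overlaps(first, second))      # prints False: the channels do not overlap
```

Under this model the first connection spans 2402–2442 MHz and the second spans 2452–2472 MHz, so the two channels in the common 2.4GHz band are indeed non-overlapping, as paragraph [0056] requires.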
[0057] FIG. 5B is a block diagram of a computing environment 500B, depicting data communication between host device 502 and HMD device 504 of FIG. 5A via the first channel and the second channel in different frequency bands. For example, similarly named elements of FIG. 5B may be similar in structure and/or function to elements described with respect to FIG. 5A. In the example shown in FIG. 5B, the first channel and the second channel are associated with different frequency bands. For example, the first channel is associated with a first frequency band (e.g., 2.4GHz band) and the second channel is associated with a second frequency band (e.g., 5GHz band) that is different from the first frequency band. In another example, the first frequency band is the 2.4GHz band and the second frequency band is the 6GHz band. In yet another example, the first frequency band is the 5GHz band and the second frequency band is the 6GHz band. For example, the first connection may use channel 50 and a communication frequency of 160MHz bandwidth in the 5GHz band, and the second connection may use channel 3 and a communication frequency of 40MHz bandwidth in the 2.4GHz band.

[0058] The above-described examples are for the purpose of illustration. Although the above examples have been described in conjunction with example implementations thereof, numerous modifications may be possible without materially departing from the teachings of the subject matter described herein. Other substitutions, modifications, and changes may be made without departing from the spirit of the subject matter. Also, the features disclosed in this specification (including any accompanying claims, abstract, and drawings), and/or any method or process so disclosed, may be combined in any combination, except combinations where some of such features are mutually exclusive.
[0059] The terms “include,” “have,” and variations thereof, as used herein, have the same meaning as the term “comprise” or appropriate variation thereof. Furthermore, the term “based on,” as used herein, means “based at least in part on.” Thus, a feature that is described as based on some stimulus can be based on the stimulus or a combination of stimuli including the stimulus. In addition, the terms “first” and “second” are used to identify individual elements and are not meant to designate an order or number of those elements.
[0060] The present description has been shown and described with reference to the foregoing examples. It is understood, however, that other forms, details, and examples can be made without departing from the spirit and scope of the present subject matter that is defined in the following claims.

Claims

WHAT IS CLAIMED IS:
1. A wearable apparatus comprising:
a head-mounted display (HMD) device;
a wireless device comprising:
a first physical layer interface to communicate via a first communication frequency in accordance with a wireless communication protocol; and
a second physical layer interface to communicate via a second communication frequency in accordance with the wireless communication protocol, wherein the second communication frequency is different from the first communication frequency; and
a processor to:
receive a first data stream from a host device via the first physical layer interface to render the first data stream on the HMD device; and
transmit a second data stream to the host device via the second physical layer interface, wherein the second data stream is used to control the first data stream that is transmitted from the host device.
2. The wearable apparatus of claim 1, wherein the first data stream comprises audio data, video data, text data, haptic data, or any combination thereof.
3. The wearable apparatus of claim 1, further comprising: a position sensing device to obtain position data associated with the HMD device; a user input device to obtain user input data; an environment sensing device to obtain environment data of a vicinity of the HMD device; or any combination thereof, wherein the processor is to generate the second data stream based on the position data, the user input data, the environment data, or any combination thereof.
4. The wearable apparatus of claim 3, wherein the position data comprises movement information, orientation information, tilt angle information, location information, or any combination thereof associated with the HMD device, wherein the user input data comprises a button input, a voice input, a gesture input, or any combination thereof, and wherein the environment data comprises image data, video data, depth information, or any combination thereof associated with a physical environment in the vicinity of the HMD device.
5. The wearable apparatus of claim 1, wherein the wireless device comprises: a first media access controller (MAC) communicatively connected to the first physical layer interface to receive the first data stream from the host device via the first physical layer interface; and a second MAC communicatively connected to the second physical layer interface to transmit the second data stream to the host device via the second physical layer interface.
6. The wearable apparatus of claim 1, wherein the wireless device comprises a unified media access controller (MAC) connected to the first physical layer interface and the second physical layer interface, the unified MAC is to receive multimedia content from the host device via the first physical layer interface and transmit sensor data to the host device via the second physical layer interface.
7. The wearable apparatus of claim 1, wherein the wireless device comprises: a duplexer to enable the first physical layer interface and the second physical layer interface to share a common antenna.
8. The wearable apparatus of claim 1, wherein the wireless communication protocol is associated with a wireless connection that uses a time division duplexing (TDD) communication.
9. The wearable apparatus of claim 1, wherein the HMD device comprises: a display panel to output image data, text data, video data, or any combination thereof of the first data stream; and an audio device to output audio data of the first data stream, wherein the processor is to: provide the first data stream from the host device to the display panel and the audio device via the first physical layer interface of the wireless device; and provide a voice input from the audio device to the host device via the second physical layer interface of the wireless device.
10. A wearable apparatus comprising:
an input device to receive sensor data, user input data, or both; and
a head-mounted display (HMD) device comprising:
an antenna;
a wireless device connected to the antenna, the wireless device comprising:
a first physical layer interface to communicate via a first communication frequency in accordance with a wireless communication protocol; and
a second physical layer interface to communicate via a second communication frequency in accordance with the wireless communication protocol; and
a processor to:
generate a first data stream based on the sensor data, the user input data, or both;
transmit the first data stream to a host device via the first physical layer interface; and
receive a second data stream from the host device via the second physical layer interface, wherein the second data stream is generated at the host device according to a data source and the first data stream.
11. The wearable apparatus of claim 10, wherein the sensor data comprises position data of a user wearing the HMD device, environment data surrounding the user, or a combination thereof, and wherein the user input data comprises a button input, a voice input, a gesture input, or any combination thereof.
12. The wearable apparatus of claim 10, wherein the wireless device comprises: a first media access controller (MAC) communicatively connected to the first physical layer interface to transmit the first data stream to the host device via the first communication frequency; and a second MAC communicatively connected to the second physical layer interface to receive the second data stream from the host device via the second communication frequency.
13. The wearable apparatus of claim 10, wherein the wireless device comprises a unified media access controller (MAC) connected to the first physical layer interface and the second physical layer interface, the unified MAC is to transmit the first data stream to the host device via the first physical layer interface and receive the second data stream from the host device via the second physical layer interface.
14. The wearable apparatus of claim 10, wherein the wireless device comprises: a duplexer to combine the first physical layer interface and the second physical layer interface operating at different communication frequencies to utilize the antenna.
15. A non-transitory computer-readable storage medium storing instructions executable by a processor of a wearable apparatus to:
establish a first connection with a host device via a first physical layer interface of a wireless device according to a wireless communication protocol;
establish a second connection with the host device via a second physical layer interface of the wireless device according to the wireless communication protocol, wherein the first connection and the second connection are established via a first channel and a second channel, respectively, having non-overlapping frequencies;
utilize the first physical layer interface to transmit user input data and sensor data to the host device via the first connection; and
utilize the second physical layer interface to receive multimedia content and interactive data from the host device via the second connection, wherein the multimedia content and the interactive data are generated based on the user input data and the sensor data.
16. The non-transitory computer-readable storage medium of claim 15, wherein the first channel and the second channel are associated with a common frequency band.
17. The non-transitory computer-readable storage medium of claim 15, wherein the first channel is associated with a first frequency band and the second channel is associated with a second frequency band that is different from the first frequency band.
18. The non-transitory computer-readable storage medium of claim 15, wherein instructions to utilize the first physical layer interface to transmit the user input data, the sensor data, or both to the host device comprise instructions to: utilize a first media access controller (MAC) connected to the first physical layer interface to transmit the user input data, the sensor data, or both, wherein the first physical layer interface is an interface between the first MAC and the first channel.
19. The non-transitory computer-readable storage medium of claim 15, wherein instructions to utilize the second physical layer interface to receive the multimedia content and the interactive data from the host device comprise instructions to: utilize a second media access controller (MAC) connected to the second physical layer interface to receive the multimedia content and the interactive data, wherein the second physical layer interface is an interface between the second MAC and the second channel.
20. The non-transitory computer-readable storage medium of claim 15, wherein the wearable apparatus is a head-mounted display (HMD) device, and wherein the wireless communication protocol is associated with a wireless connection that uses a time division duplexing (TDD) communication.
PCT/US2021/060673 — Wearable apparatuses with dual physical layer interfaces — filed 2021-11-24, published as WO2023096639A1 on 2023-06-01.
