WO2020079320A1 - Delivering and handling media content in a wireless communication network - Google Patents

Info

Publication number
WO2020079320A1
Authority
WO
WIPO (PCT)
Prior art keywords
media content
beams
tile
directional
mobile device
Application number
PCT/FI2018/050752
Other languages
French (fr)
Inventor
Athul Prasad
Original Assignee
Nokia Technologies Oy
Application filed by Nokia Technologies Oy
Priority to EP18937200.6A (EP3868030A4)
Priority to CN201880100230.1A (CN113228526A)
Priority to US17/284,165 (US20210336684A1)
Priority to PCT/FI2018/050752 (WO2020079320A1)
Publication of WO2020079320A1

Classifications

    • H04B7/0632 Channel quality parameters, e.g. channel quality indicator [CQI]
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • H04B17/318 Received signal strength
    • H04B7/0695 Hybrid systems, i.e. switching and simultaneous transmission, using beam selection at the transmitting station
    • H04B7/088 Hybrid systems, i.e. switching and combining, using beam selection at the receiving station
    • H04N13/117 Transformation of image signals corresponding to virtual viewpoints, the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H04N13/366 Image reproducers using viewer tracking
    • H04N21/2343 Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234345 Reformatting operation performed only on part of the stream, e.g. a region of the image or a time segment
    • H04N21/23439 Reformatting operations for generating different versions
    • H04N21/2393 Interfacing the upstream path of the transmission network, involving handling client requests
    • H04N21/25841 Management of client data involving the geographical location of the client
    • H04N21/4728 End-user interface for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
    • H04N21/6405 Multicasting
    • H04N21/6587 Control parameters, e.g. trick play commands, viewpoint selection
    • H04N21/816 Monomedia components involving special video data, e.g. 3D video
    • H04N21/8456 Structuring of content by decomposing the content in the time domain, e.g. in time segments
    • H04W4/02 Services making use of location information
    • H04W4/06 Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
    • H04W4/185 Information format or content conversion by embedding added-value information into content, e.g. geo-tagging
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • FIELD [0001] Various example embodiments relate in general to wireless communication networks, and delivering and handling of media content in such networks.
  • Certain applications may require high data rates for delivering media content, such as, for example, video content.
  • higher frequency bands have more bandwidth available for wireless transmissions, which enables higher data rates.
  • current standardization efforts in the field of radio communications comprise the exploitation of higher frequency bands for wireless transmissions.
  • The 3rd Generation Partnership Project, 3GPP, develops 5G technology and considers the use of millimetre-wave frequency bands for it. There is therefore a need to provide improved methods, apparatuses and computer programs for transmitting and handling media content, especially on high frequency bands.
  • an apparatus comprising means for receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device.
  • the means may be further configured to perform transmission as a broadcast transmission or a multicast transmission.
  • the means may be further configured to perform selecting at least one strongest beam from the set of beams and selecting the tile of the directional media content based on the selected at least one strongest beam.
  • the means may be further configured to perform selecting a set of strongest beams from the set of beams and selecting the tile of the directional media content based on the selected set of strongest beams.
  • the means may be further configured to perform receiving information about the at least one beam, wherein said information comprises an identity of the at least one beam and selecting the tile of the directional media content based on the identity of the at least one beam.
  • the means may be further configured to perform receiving mapping instructions from the network node, wherein the mapping instructions comprise mappings between beam combinations and tiles of the directional media content, selecting a part of the mapping instructions based on a combination of received beams and selecting the tile of the directional media content based on the selected part of the mapping instructions.
  • the means may be further configured to perform determining an angle of arrival of the at least one beam and selecting the tile of the directional media content based on the angle of arrival of the at least one beam.
  • the means may be further configured to perform determining the angle of arrival of the at least one beam based on a direction tag associated with the at least one beam and a location tag of the network node.
  • the means may be further configured to perform receiving information about a location of the network node, determining a direction of the at least one beam based on the location of the network node and selecting the tile of the directional media content based on the direction of the at least one beam.
  • the means may be further configured to perform rendering the tile of the directional media content for the user.
  • the means may be further configured to perform signalling information about the at least one beam and/or the estimated viewing direction of the user from a physical layer of the mobile device to an application layer and selecting, at the application layer, the tile of the directional media content based on the signalled information.
  • the set of beams may comprise beams in horizontal and vertical directions.
  • the means may be further configured to perform receiving information related to multiple streams or programs via the at least one beam, selecting one of said multiple streams or programs and displaying the selected stream or program on the display of the mobile device.
  • an apparatus comprising means for transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content and transmitting information for displaying a tile of the directional media content on a mobile device.
  • the means may be further configured to perform transmission as a broadcast transmission or a multicast transmission.
  • said information may comprise information related to the set of beams.
  • said information may comprise mapping instructions, and the mapping instructions may comprise mappings between beam combinations and tiles of the directional media content.
  • said information may comprise an identity of a beam.
  • said information may comprise a location of the network node.
  • said information may indicate that the set of beams comprises beams in horizontal and vertical directions.
  • the means may be further configured to perform transmitting information related to multiple streams or programs via each beam of the set of beams.
  • an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to perform, receive from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimate a viewing direction based at least partially on directional properties of at least one beam of the set of beams, select a tile of the directional media content based on the estimated viewing direction and display the tile of the directional media content on a mobile device.
  • an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to perform, transmit a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmit information for displaying a tile of the directional media content on a mobile device.
  • a method comprising receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device.
  • a method comprising transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmitting information for displaying a tile of the directional media content on a mobile device.
  • a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least perform, receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device.
  • a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least perform, transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmitting information for displaying a tile of the directional media content on a mobile device.
  • a computer program configured to perform, receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device.
  • a computer program configured to perform, transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmitting information for displaying a tile of the directional media content on a mobile device.
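To make the claimed receive-side flow concrete, the following is a minimal sketch in Python. The Beam record, RSRP values, direction tags and tile names are hypothetical illustrations chosen for the example, not part of the claims.

```python
# Hypothetical sketch of the receive-side flow: receive a set of beams,
# estimate the viewing direction from the strongest beam, select a tile
# of the directional media content, and display it.
from dataclasses import dataclass

@dataclass
class Beam:
    beam_id: int
    rsrp_dbm: float       # received signal strength of this beam (assumed)
    azimuth_deg: float    # direction tag signalled with the beam (assumed)
    elevation_deg: float

def select_tile(beams, tile_map):
    """Pick the strongest beam and map its direction tag to a tile."""
    strongest = max(beams, key=lambda b: b.rsrp_dbm)
    # The estimated viewing direction follows the strongest beam's direction.
    return tile_map[(strongest.azimuth_deg, strongest.elevation_deg)]

# Example: two received beams and a direction-to-tile mapping from the network.
beams = [Beam(0, -85.0, 0.0, 0.0), Beam(1, -72.0, 30.0, 0.0)]
tile_map = {(0.0, 0.0): "tile_front", (30.0, 0.0): "tile_right"}
print(select_tile(beams, tile_map))  # -> tile_right, which would be displayed
```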
  • FIGURE 1 illustrates an exemplary network scenario in accordance with at least some embodiments
  • FIGURE 2 illustrates an exemplary network scenario for transmitting directional content in accordance with at least some embodiments
  • FIGURE 3 illustrates an exemplary scenario in accordance with at least some embodiments
  • FIGURE 4 illustrates an example related to estimation of a viewing direction based on received beams in accordance with at least some embodiments
  • FIGURE 5 illustrates an exemplary end-to-end implementation in accordance with at least some embodiments
  • FIGURE 6 illustrates a mapping in accordance with at least some embodiments
  • FIGURE 7 illustrates an example apparatus capable of supporting at least some embodiments
  • FIGURE 8 illustrates a flow graph of a first method in accordance with at least some embodiments
  • FIGURE 9 illustrates a flow graph of a second method in accordance with at least some embodiments.
  • a network node may transmit a transmission using a set of beams, wherein the transmission comprises directional media content.
  • beams may be used for highly directional radio transmissions, which may be used to transmit data from a transmitter, e.g., a base station, to a receiver, e.g., a mobile device.
  • the mobile device may select a strongest beam, or beams, and estimate a viewing direction of a user of the mobile device based on a direction of the selected beam(s).
  • the viewing direction of the user may correspond to a direction to which the mobile device is directed.
  • the mobile device may also select a tile of the media content based on a direction of the selected beam(s) and display the selected tile on a display of the mobile device.
  • Media content may be, for example, directional media content.
  • some embodiments relate to delivering and handling of high-quality directional media content for applications, such as, Virtual Reality, VR, and/or Augmented Reality, AR.
  • applications typically deliver 3D video content and require high data rates. For example, data rates of at least 1 Gbps may be required, depending on the quality of the media content.
  • data rates of at least 1 Gbps may be required, depending on the quality of the media content.
  • Such high data rate services may be referred to as enhanced Mobile BroadBand, eMBB, services.
  • such applications also require low latencies and small jitter.
  • Multicast transmissions may be used if the same directional media content is addressed to a selected group of users, whereas broadcast transmissions may be used if the content needs to be sent to all the users.
  • said media content may refer to directional media content, such as, for example, 3D, VR, AR, or 360-degree videos, or any form of media content with directional properties. That is to say, the media shown to the user may vary based on the viewing direction of the user.
  • High data rates may be achieved using high frequency bands, because typically there is more bandwidth available on higher frequency bands.
  • frequency bands between 30 and 300 GHz may be used. Such frequency bands may be referred to as millimetre-wave bands. Future radio communication systems may use even higher frequency bands.
  • Beamforming is often, if not always, used on millimetre-wave bands to improve the performance of wireless communication systems.
  • Embodiments may exploit beamforming for delivering high-quality media content, but are not restricted to any particular radio access technology, or frequency band, and may be exploited in any wireless communication system, wherein beamforming is used.
  • a 3D grid of beams may be used for delivering directional media content.
  • a 3D grid of beams may comprise directional beams which are spatially separated in a 3D space.
  • 3D properties may comprise a beamwidth along with a vertical and a horizontal direction of a beam; vertical here means a direction perpendicular to the plane of the horizon.
  • the grid, or set, of beams would thus be three-dimensional.
  • a 3D grid of beams may be used to deliver any form of data, including 3D media content.
  • FIGURE 1 illustrates an exemplary network scenario in accordance with at least some embodiments.
  • a beam-based wireless communication system which comprises server 110, base station 120, beams 130 and mobile devices 140.
  • Server 110 may be a low-latency edge cloud/content server.
  • FIGURE 1 shows beams 130 horizontally for clarity. Nevertheless, a set of beams 130 may also comprise beams in a vertical direction, thereby forming a 3D grid, or set, of beams 130.
  • Mobile devices 140 may comprise, for example, a User Equipment, UE, a smartphone, a cellular phone, a Machine-to-Machine, M2M, node, a machine-type communications node, an Internet of Things, IoT, node, a car telemetry unit, a laptop computer, a tablet computer, a wireless head-mounted device, or, indeed, another kind of suitable wireless or mobile device.
  • mobile devices 140 communicate wirelessly with a cell of Base Station, BS, 120 via beams 130.
  • BS 120 may be considered as a serving BS for mobile devices 140.
  • The air interface between mobile device 140 and BS 120 may be configured in accordance with a Radio Access Technology, RAT, which both mobile device 140 and base station 120 are configured to support.
  • Examples of cellular RATs include Long Term Evolution, LTE, New Radio, NR, which is also known as fifth generation, 5G, radio access technology and MulteFire.
  • examples of non-cellular RATs include Wireless Local Area Network, WLAN, and Worldwide Interoperability for Microwave Access, WiMAX.
  • in the context of LTE, BS 120 may be referred to as an eNB while in the context of NR, BS 120 may be referred to as a gNB.
  • BS 120 may be referred to as an access point.
  • BS 120 may be referred to as a network node.
  • Mobile devices 140 may be similarly referred to as UEs. In any case, embodiments are not restricted to any particular wireless technology, and may be exploited in any system which uses beamforming for wireless transmissions.
  • BS 120 may be connected, directly or via at least one intermediate node, with core network 110.
  • Core network 110 may be, in turn, coupled with another network (not shown in FIGURE 1), via which connectivity to further networks may be obtained, for example via a worldwide interconnection network.
  • BS 120 may be connected with at least one other BS as well via an inter-base station interface (not shown in FIGURE 1), even though in some embodiments the inter-base station interface may be absent.
  • BS 120 may be connected, directly or via at least one intermediate node, with core network 110 or with another core network.
  • FIGURE 1 shows mobile devices 140 configured to be in a wireless connection on one or more communication channels in a cell with a BS 120 (such as (e/g)NodeB providing the cell).
  • the physical link from mobile device 140 to a BS 120 is called uplink or reverse link and the physical link from BS 120 to mobile device 140 is called downlink or forward link.
  • BS 120 or its functionalities may be implemented by using any node, host, server or access point, etc., entity suitable for such a usage.
  • the current architecture in LTE networks is fully distributed in the radio and fully centralized in the core network.
  • the low-latency applications and services in 5G require bringing the content close to the radio, which leads to local breakout and Multi-access Edge Computing, MEC.
  • 5G enables analytics and knowledge generation to occur at the source of the data. This approach requires leveraging resources that may not be continuously connected to a network such as laptops, smartphones, tablets and sensors.
  • MEC provides a distributed computing environment for application and service hosting. It also has the ability to store and process content, host application instances, etc., in close proximity to cellular subscribers for faster response time.
  • 5G (or NR) networks are being designed to support multiple hierarchies, where MEC servers can be placed between the core and BS 120. It should be appreciated that MEC can be applied in LTE networks as well.
  • Beam-based wireless communication systems enable efficient transmission of media content to multiple mobile devices 140 using multicast and/or broadcast transmissions.
  • the scenario of FIGURE 1 may be considered as an indoor viewing arena, wherein BS 120 may transmit media content and mobile devices 140 may receive the media content via beams 130.
  • Tailored sub-content may be transmitted to different mobile devices 140 as well, e.g., for censoring a part of the content for adult and under-age viewers, for delivering tailored advertisements and subtitles in different languages, etc.
  • Tailored sub-content may be transmitted using unicast along with multicast and/or broadcast transmission.
  • wireless communications may be exploited instead of wired communications due to the inherent robustness and flexibility that wireless solutions provide.
  • deployment of VR devices using wired connections in movie theatres or stadiums would make the system susceptible to wear and tear, and to possible loss of connectivity due to wire damage.
  • mobile devices may be built in a robust manner, with automated tests conducted to ensure that the mobile devices are performing well.
  • Development of a new ecosystem through industry verticals may be enabled as well, using generic deployments.
  • Concerning mobility, in indoor deployments for example, limited mobility, such as standing up and slight adjustments for comfort, is typically allowed and may be tolerated without noticeable loss of quality.
  • service interruptions or loss of quality may occur if mobile device 140 moves significantly from the assigned location.
  • directional content may refer to media content, such as video or audio, which has location or directional relevance.
  • virtual and augmented reality content may be considered as directional content, where the viewed content needs to be adjusted based on the location and viewing direction of a user.
  • a viewing direction of the user may refer to a direction the user is looking at or to a direction the mobile device is directed to.
  • the viewing direction of the user may refer to the direction the user is looking at in 3D space, i.e., a tile of the 3D viewing space.
  • the mobile device would be directed to the tile of the 3D viewing space as well.
  • the media content may be provided as a plurality of tiles, wherein each tile may correspond to a sub-stream of a stream, and the stream comprises the overall media content.
  • the appropriate directional media content may be shown to the user depending on the viewing direction, i.e., the direction the mobile device is directed to.
  • users may simply turn their heads or viewing direction to change the direction the mobile device is directed towards.
  • the direction may be estimated and the mobile device may fetch and show a tile to the user based on the estimated new viewing direction.
  • The total number of tiles may depend on the encoding technique used.
  • it may be assumed that the entire directional media content may be quantized into an appropriate number of tiles.
  • the quantization may also be based, e.g., on accuracy of direction estimation of the network.
  • Accuracy of direction estimation may depend on the number of base stations and available beams.
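As a rough illustration of such quantization, the sketch below derives a tile count from an assumed angular accuracy of the network's direction estimation; the 30-degree resolution figures are invented for the example.

```python
import math

def tile_count(az_resolution_deg: float, el_resolution_deg: float) -> int:
    """Quantize the full viewing sphere into tiles no finer than the
    direction-estimation accuracy of the network (assumed values)."""
    az_tiles = math.ceil(360.0 / az_resolution_deg)  # full azimuth circle
    el_tiles = math.ceil(180.0 / el_resolution_deg)  # pole-to-pole elevation
    return az_tiles * el_tiles

# E.g., 30-degree accuracy in azimuth and elevation -> 12 * 6 = 72 tiles.
print(tile_count(30.0, 30.0))
```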
  • a mobile device may have additional antenna configurations to enable a beam-based system design, such as loops around the mobile device, both in uplink and downlink. The mobile device may therefore be reused as a VR headset without any additional equipment.
  • the antenna configuration of the mobile device may be exploited to enable estimation of the viewing direction of the user.
  • multiple antennas may be positioned in the mobile device depending on operational frequency bands, for accurate detection of beam and transmission parameters.
  • a mobile device may comprise multiple antennas for reception of a beamformed transmission. Said multiple antennas may be at different locations within the mobile device or associated with the mobile device. The mobile device may receive, for example, a downlink transmission using said multiple antennas. As the antennas may be at different locations, different antennas may receive a beam at different arrival times. Thus, for example an arrival time of the beam may be determined separately for each antenna. An angle-of-arrival may then be determined based on the arrival times. Hence, the mobile device receiving the directional media content would be able to accurately determine the directional properties of the received beam.
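Assuming the simple two-antenna, plane-wave case of the arrival-time scheme just described, the angle of arrival follows from the per-antenna arrival-time difference as sketched below; the antenna spacing and delay values are illustrative assumptions.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def angle_of_arrival(delta_t_s: float, spacing_m: float) -> float:
    """Plane-wave angle of arrival (degrees from broadside) for two antennas
    separated by spacing_m, given the measured arrival-time difference."""
    s = C * delta_t_s / spacing_m
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.degrees(math.asin(s))

# Example: 5 mm spacing (a millimetre-wave scale), 8.3 ps arrival-time
# difference -> roughly 30 degrees.
print(angle_of_arrival(8.3e-12, 0.005))
```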
  • Said multiple antennas associated with the mobile device may form an antenna array, which may be used for directional signal reception.
  • a conventional beamformer known as delay-and-sum beamformer may be formed.
  • the conventional beamformer may give equal magnitudes for the weights of the antennas, and it may be steered to a certain direction by assigning a proper phase for each antenna.
  • a phase shifter may be used in association with each antenna for directional signal reception.
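The following sketch illustrates such a delay-and-sum beamformer for a uniform linear array: equal-magnitude weights whose phases steer the array towards a chosen direction. The array geometry and steering convention are assumptions for the example, not taken from the application.

```python
import numpy as np

def steering_vector(n_antennas: int, spacing_wl: float, theta_deg: float):
    """Phase progression across a uniform linear array for a plane wave
    arriving from theta_deg; spacing_wl is element spacing in wavelengths."""
    n = np.arange(n_antennas)
    return np.exp(-2j * np.pi * spacing_wl * n * np.sin(np.radians(theta_deg)))

# Delay-and-sum weights: equal magnitudes (1/N), phases matched to the
# steering direction so signals from that direction add coherently.
N = 8
w = steering_vector(N, 0.5, 30.0) / N
x = steering_vector(N, 0.5, 30.0)            # unit-amplitude wave from 30 deg
print(abs(np.vdot(w, x)))                    # w^H x ~= 1.0: full coherent gain
print(abs(np.vdot(w, steering_vector(N, 0.5, -20.0))))  # reduced off-direction
```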
  • a method for transmitting high-definition, high-data rate, directional media content using wireless communications is provided.
  • One of the aims is to enable viewing of directional media content while minimizing complexity of mobile device implementation and avoiding additional, dedicated components.
  • Some embodiments therefore provide a cost-efficient method for delivering media content.
  • the efficient delivery of directional media content is enabled.
  • Transmissions of a network node may have highly directional characteristics, which may be exploited by mobile devices, such as mobile devices 140, for estimating the viewing direction of the transmitted directional media content, i.e., the direction the mobile device is directed to.
  • Source network node may transmit transmissions wirelessly to mobile devices, wherein the transmissions may comprise media content, e.g., VR content.
  • the mobile device may be an end-user device, for example, a low-cost UE without directional measurement or estimation capabilities, capable of displaying the content to the user with high quality.
  • transmissions may be performed by exploiting beamforming.
  • the network node may transmit media content over all available beams together with a location tag.
  • the location tag may be used for determining a location of the network node.
  • the location tag may comprise a relative location of the network node, compared to other network nodes.
  • a direction of at least one beam may be estimated using the relative location of the network node and a location of the mobile device.
  • the mobile device may use the location of the network node and information related to beams, such as received powers, identities of the beams or beam reference information, to determine the direction of the at least one beam.
  • the mobile device may then provide directional content, i.e., a tile, based on the determined direction of the at least one beam. That is to say, the mobile device may then provide directional content based on a determined direction to which the mobile device is directed.
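A minimal sketch of this idea, assuming a flat 2D layout and known coordinates for both the network node (from its location tag) and the mobile device; the coordinates are invented for the example.

```python
import math

def beam_bearing_deg(node_xy, device_xy):
    """Bearing (degrees) from the network node's tagged location to the
    mobile device; an arriving beam points along this line, so its direction
    can be estimated from the two locations (2D assumption)."""
    dx = device_xy[0] - node_xy[0]
    dy = device_xy[1] - node_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Node at the origin (per its location tag), device 3 m east and 3 m north:
print(beam_bearing_deg((0.0, 0.0), (3.0, 3.0)))  # 45.0 degrees
```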
  • the network node may transmit, i.e., broadcast or multicast, content using all available beams together with direction tags for each beam.
  • One direction tag may comprise a direction of one beam and be used for providing directional content based on the direction of the beam, wherein the direction of the beam may correspond to a direction to which the mobile device is directed.
  • Direction tags may be used together with location tags.
  • identities of the beams may be transmitted.
  • received identity may be used for determining the direction of a beam and for providing directional content based on the determined direction of the beam.
  • the target node may use an angle of arrival of at least one beam to generate directional information, which may be further used to provide the directional content.
  • the angle of arrival may be used together with location tags, direction tags and/or beam identities as well.
  • the mobile device may also select the directional content based on tile-based encoding, for example, by correlating the sub-streams within the transmitted 360-degree content with the viewing direction information.
  • Sub-streams may indicate independent flows of traffic for each tile of the directional media content.
  • sub-streams may be isolated by the network, for example, using a dynamic adaptive streaming over Hypertext Transfer Protocol, HTTP, DASH-aware network element, DANE.
  • each sub-stream may represent a tile which in turn represents a corresponding spatial display position.
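As a hypothetical illustration of tile-based sub-streams, the mapping below associates tiles of the 360-degree content with per-tile manifests; the identifiers and URLs are invented and do not come from any real MPD or DANE deployment.

```python
# Hypothetical mapping between tiles of the 360-degree content and DASH
# sub-streams; names and URLs are illustrative placeholders only.
SUBSTREAMS = {
    "tile_0_0": "https://edge.example/vr/stream/tile_0_0.mpd",
    "tile_1_0": "https://edge.example/vr/stream/tile_1_0.mpd",
    "tile_2_0": "https://edge.example/vr/stream/tile_2_0.mpd",
}

def substream_for_direction(azimuth_deg: float, tiles_per_row: int = 3) -> str:
    """Correlate a viewing direction with the sub-stream whose tile covers it."""
    col = int(azimuth_deg % 360.0 // (360.0 / tiles_per_row))
    return SUBSTREAMS[f"tile_{col}_0"]

print(substream_for_direction(200.0))  # -> tile_1_0's sub-stream URL
```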
  • FIGURE 2 illustrates an exemplary network scenario for transmitting directional media content in accordance with at least some embodiments.
  • server 210 may correspond to server 110
  • base station 220 may correspond to base station 120
  • beams 230 may correspond to beams 130
  • mobile devices 240 may correspond to mobile devices 140.
  • In addition, Transmission-Reception Points, TRPs, 225 are shown in FIGURE 2.
  • TRPs 225 may be connected to BS 220 via wired connections while TRPs 225 may use wireless communications for communicating with mobile devices 240 via beams 230.
  • multiple BSs 220 may be used instead of, or in addition to, TRPs 225.
  • TRPs 225 may transmit same media content over multiple beams 230.
  • the transmissions over each beam may also comprise a location tag of TRP 225 in question.
  • TRPs 225 may be referred to as source transmitters and the location tag of each TRP 225 may indicate to mobile devices 240 the relative position of each TRP 225.
  • the location tag may be associated with an identity of TRP 225 for providing appropriate directional media content. In order to provide the best viewing experience and coverage within the area, multiple TRPs 225 may be deployed within the coverage area.
  • transmissions over each beam may comprise beam information, which is specific for a beam.
  • the beam information may comprise, for example, a direction tag, i.e., direction of the beam.
  • each beam may be mapped to a certain tile of the content.
  • Direction tag may be seen as a reference to tiled content information, i.e., a tile of a 3D viewing space, within the 3D viewing space of the directional media content.
  • the transmission of the beam may comprise information identifying substantially one tile, or sub-stream, of the content to be shown to a user.
  • the beam information may comprise an identity of the beam.
  • Direction of the beam may be pre-determined and thus the identity of the beam may be used as a reference to a tile of a 3D viewing space.
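A sketch of selecting a tile from a signalled beam identity, assuming pre-determined beam directions; the mapping table and RSRP values are illustrative assumptions standing in for mapping instructions received from the network node.

```python
# Assumed example of mapping instructions: with pre-determined beam
# directions, the signalled beam identity alone can index a tile.
BEAM_TO_TILE = {0: "tile_front", 1: "tile_right", 2: "tile_back", 3: "tile_left"}

def tile_from_beam_ids(received_beam_ids, rsrp_by_id):
    """Select the tile referenced by the strongest received beam identity."""
    strongest_id = max(received_beam_ids, key=lambda i: rsrp_by_id[i])
    return BEAM_TO_TILE[strongest_id]

# Beams 0 and 1 received; beam 1 is stronger, so its tile is selected.
print(tile_from_beam_ids([0, 1], {0: -90.0, 1: -75.0}))  # -> tile_right
```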
  • an application instance may be located at edge cloud/content server 210 and location or direction tagging may be performed by server 210. Concerning the application instance, an application server may be located at a remote server but in case of low-latency applications, such as VR, the application server associated with the application instance may be hosted/located at the edge cloud. An application client may then be in mobile devices 240, which may be used for viewing the directional media content.
  • mobile devices 240 may receive directional media content from a source network node and show the appropriate directional content to a user depending on the actual viewing direction.
  • the actual viewing direction may correspond to a direction to which mobile device 240 is directed.
  • Mobile devices 240 may also determine a location of the source network node during the process. Due to highly directional properties of the transmissions on millimeter-wave bands, using beams 230, a viewing direction of a user associated with mobile device 240 may impact a candidate set of strongest beams received by mobile device 240.
  • the viewing direction of the user may indicate the direction the user is looking at in the 3D space, because mobile device 240 may be directed to that direction.
  • appropriate directional media content may be shown to the user based on estimated viewing direction of the user.
  • the candidate set of strongest beams may be similar to a neighbor cell list in legacy systems, e.g., LTE, wherein mobile devices 240 may maintain a list of strongest cells and, possibly, relatively weaker neighbor cells.
  • mobile devices 240 may also maintain information about the strongest set of beams that they receive at any point in time.
  • mobile devices 240 may provide directional media content to a user, i.e., a tile, based on the strongest set of beams, because the strongest set of beams may indicate a direction to which mobile device 240 is directed. If, for example, a combined direction tag of all the strongest beams were used, it would enable more accurate estimation of the viewing direction.
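One plausible way to combine the direction tags of several strong beams is a power-weighted circular mean, sketched below; this particular weighting scheme is an assumption for illustration, not something specified by the application.

```python
import numpy as np

def combined_viewing_direction(azimuths_deg, rsrp_dbm):
    """Estimate the viewing direction as the power-weighted circular mean of
    the direction tags of the strongest received beams (assumed scheme)."""
    w = 10.0 ** (np.asarray(rsrp_dbm) / 10.0)   # dBm -> linear power weights
    ang = np.radians(np.asarray(azimuths_deg))
    vec = np.sum(w * np.exp(1j * ang))          # weighted unit direction vectors
    return np.degrees(np.angle(vec)) % 360.0

# Two strongest beams at 30 and 60 degrees, the first received 10 dB stronger:
print(combined_viewing_direction([30.0, 60.0], [-70.0, -80.0]))  # nearer 30
```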
  • mobile device 240 may receive the data from multiple beams.
  • the transmitters e.g., BSs or TRPs, may coordinate and synchronize data transmissions using same physical resource blocks, thereby improving the received signal quality at the mobile device.
  • the mobile device may be able to identify the beams using metadata, as will be described later on.
  • FIGURE 3 illustrates an exemplary scenario in accordance with at least some embodiments.
  • server 310 may correspond to server 110
  • BS 320 may correspond to BS 120
  • mobile device 340 may correspond to mobile device 140.
  • TRP 325 may correspond to TRP 225.
  • 3D viewing space 350 may comprise a tile 350a or part of tile 350a that is shown to a user.
  • tile 350a may be presented on a display of a 3D VR headset.
  • tile 350a may be referred to as a part of 3D viewing space 350, which is related to directional media content and shown to the user.
  • 3D viewing space 350 may be referred to as media content.
  • a user may view a part of tile 350a as well, and for that level of accuracy a mobile device may either estimate a change in the received beam angle-of-arrival accurately or use some other sensors within the mobile device (e.g., a gyroscope and an accelerometer).
  • Server 310 or BS 320 may transmit location information related to TRP 325 and/or direction or identity information related to beams along with media content.
  • UE 340 may receive a transmission using multiple beams from TRP 325.
  • the transmission may comprise location information related to TRP 325, possibly along with direction and/or identity information related to beams and the media content.
  • Mobile device 340 may, upon receiving said information, determine a direction of a beam. Consequently, mobile device 340 may determine tile 350a based on the direction of the beam. Alternatively, tile 350a may be determined based on an angle of arrival of said beam.
  • Tile 350a may be related to a direction the user is looking at, i.e., tile 350a of 3D viewing space 350, and it may correspond to the direction of said one beam. Tile 350a may hence correspond to a direction to which mobile device 340 is directed.
  • Tile 350a may be regenerated from the data sent over the beams. Consequently, mobile device 340 may provide tile 350a of 3D viewing space 350 to a user via a display.
  • mobile device 340, which may be capable of receiving beamformed transmissions and attached to a 3D display, may thus provide and show directional media content to the user without any additional equipment within the mobile device or externally.
  • wireless head mounted devices used to view directional media content currently require external sensors and initial manual calibration to estimate the viewing direction of the user relative to the directional content being viewed.
  • directional properties of transmissions, e.g., 5G transmissions, may be combined to estimate the viewing direction of the user without any additional equipment or sensors.
  • Calibration may be done in the application layer of the mobile device with active intervention by a user.
  • the calibration may be performed by an application client installed in the mobile device for viewing the directional content.
  • the application client may request the user to indicate manually when a particular direction is seen within the directional media content, which may then be used along with the beams as a reference point to correlate the directional media content and the received beams.
  • the calibration may also be done by a multi-access edge cloud or a network, based on an indication from a user that an incorrect viewing direction is shown by the mobile device, or when the application of the mobile device detects an incorrect viewing direction and activates an update.
  • the indication of the incorrect viewing direction may act as a trigger for updating mapping between the location and direction tags. Transmission of information related to these interactions may take place on the user plane.
  • Mobile device 340 may recreate the tile 350a or part of the 3D viewing space 350 using the received information, such as, relative location of TRP 325 which may be received together with the transmitted media content. Relative location of TRP 325 may be within the 3D space.
  • mobile device 340 may also use an angle of arrival of the received beams to recreate tile 350a of 3D viewing space 350.
  • Mobile device 340 may have multiple antennas and hence the angle of arrival may be estimated accurately by using said multiple antennas. For example, if a user changes the viewing direction by moving the mobile device, e.g., by rotating his/her head in case of VR headsets, the angle of arrival of the received strongest set of beams would change, which may be used to provide appropriate directional media content to the user or VR headsets. Hence, location of provided tile 350a of 3D viewing space 350 would change on 3D viewing space 350.
  • the main difference in the scenario of FIGURE 2, compared to FIGURE 1 or FIGURE 3, is that in the case of the multiple TRPs of FIGURE 2 the set of strongest beams (possibly identified by beam IDs) may change as well.
  • the mobile device may use a TRP or a cell identity together with a beam identity to estimate the viewing direction, e.g., in combination with a direction tag.
  • an increased amount of information would make it possible for the mobile device to estimate a change in viewing direction based on minor changes in received power levels of various beams and corresponding angle-of-arrival.
  • Increased amount of information may also imply added information related to the different sets of received beams from the TRPs, depending on the change in viewing direction of the user.
  • Mobile device 340 may then select the appropriate directional content from the received data and provide the appropriate directional content, i.e., tile 350a of 3D viewing space 350, to the user.
  • all 3D media content 350 may be cached in a mobile device and then the content for tile 350a of 3D viewing space 350 may be selected based on received beams and/or estimated directionality, which may be calculated based on the strongest received beams and location of TRPs 325.
  • the lower protocol levels may comprise, e.g., the physical layer, and information about the strongest beams and/or estimated directionality may be transmitted from the physical layer to the application layer.
  • Location tagging may be done either statically or dynamically, depending on the scenario. For example, in an indoor viewing arena or a movie theatre location and direction tagging may be done statically because the directional media content may be stored for long. For example, the directional media content may be stored in an edge-cloud server close to the base station or TRP, or locally within a private network, e.g., 5G private network depending on the deployment scenario. Alternatively, location and direction tagging may be done dynamically. For example, in scenarios where live or non-live media content is transmitted over wider areas with the content cached at mobile devices and shown locally.
  • location tagging may be done statically, i.e., there would always be the same tags for the same location, irrespective of the media content that is transmitted. For example, in a movie theatre the locations of seats and the locations of the BSs or TRPs may be the same irrespective of the movie that is played. On the other hand, tagging may be done dynamically, e.g., in a museum or in outdoor locations, such as stadiums, concert grounds, etc. Beams, transmitted by BSs or TRPs, may come from different directions for different mobile devices 340 and hence, the sets of strongest beams received from different TRPs 325 may be different as well.
  • a set of strongest beams may be referred to as a candidate set of beams. That is to say, the set of strongest beams may depend at least partially on the viewing direction and location of each individual user.
  • a mobile device may hence require some form of calibration where a correlated direction may be fed back to the network periodically, or at any other time.
  • the correlated direction may refer to a direction identified by the mobile device, i.e., the mobile device may identify that a certain beam, or a set of beams, corresponds to a certain tile of the directional media content and feed back the correlated direction to the network.
  • Calibration may also be done based on receiving a confirmation from a mobile device related to the estimated viewing direction. That is to say, the mobile device may confirm that a certain correlated direction is correct, i.e., a certain beam, or a set of beams, corresponds to a certain tile of the directional media content.
  • the confirmation may be received while setting up the directional media content.
  • the network may transmit content for calibration while setting up the directional media content as well.
  • the calibration content may be correlated with the direction tags based on user feedback to derive a correlated direction.
  • the correlated direction may be estimated depending at least partially on the candidate beams and application-level intelligence, which estimates the viewing direction based at least partially on the aggregated location tag information.
  • Application layer intelligence may be used to fetch a tile of the directional media content and display the tile on the mobile device, based on the estimated viewing direction.
  • There may be no specific impacts to the connection setup procedure, since the data reception at the mobile device, or the end-to-end application flow between the application client in the mobile device and the application server, may occur after a successful connection setup procedure and setup of bearers. A specific user feedback may be required to derive a correlated direction, similarly as in the case of the calibration.
  • content may be divided into sub-streams, which may be transmitted separately.
  • a tile or sub-stream and tile 350a of 3D viewing space 350 may be assumed to be the same.
  • server 310 or BS 320 may split 3D viewing space 350 into a quantized set of tiles.
  • server 310 or BS 320 may use tile-based Dynamic Adaptive Streaming over Hypertext Transfer Protocol, DASH, sub-streams to deliver the content to mobile device 340.
  • Wireless communications may be used to transmit 3D viewing space 350, comprising all the tiles, over the air interface. Then, mobile device 340 may select and provide the appropriate content from the received sub-streams based on the estimated viewing direction of the user.
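  • A minimal receiver-side sketch of this selection, assuming hypothetical helper functions and tiles indexed by the (azimuth, elevation) of their centres, in radians:

        import math

        def angular_distance(a, b):
            # great-circle angle between two (azimuth, elevation) directions
            az1, el1 = a
            az2, el2 = b
            c = (math.sin(el1) * math.sin(el2)
                 + math.cos(el1) * math.cos(el2) * math.cos(az1 - az2))
            return math.acos(max(-1.0, min(1.0, c)))  # clamp rounding noise

        def select_substream(substreams, tile_centers, viewing_dir):
            # substreams: tile_id -> received DASH sub-stream
            # tile_centers: tile_id -> (azimuth, elevation) of the tile centre
            best = min(tile_centers,
                       key=lambda t: angular_distance(tile_centers[t], viewing_dir))
            return best, substreams[best]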
  • FIGURE 4 illustrates an example related to estimation of a viewing direction based on received beams in accordance with at least some embodiments.
  • the viewing direction may correspond to the direction to which the mobile device is directed.
  • FIGURE 4 comprises first mobile device 440a and second mobile device 440b, which may be located within coverage areas of first TRP 425a and second TRP 425b.
  • FIGURE 4 also shows two beams 430a1 and 430a2 associated with TRP 425a, and two beams 430b1 and 430b2 associated with TRP 425b. Naturally, there may be more than two beams per TRP.
  • mobile devices 440a and 440b may correspond to mobile devices 140, and beams 430a1, 430a2, 430b1 and 430b2 may correspond to beams 130.
  • Viewing directions 450a and 450b, which illustrate the directions the users are looking at, are shown in FIGURE 4.
  • Viewing directions 450a and 450b may correspond to tile 350a of 3D viewing space 350 of FIGURE 3.
  • Viewing directions 450a and 450b may also correspond to the directions to which mobile devices 440a and 440b are directed, respectively.
  • tiles 450a and 450b may be determined based on the received beams, for example by comparing the identities, IDs, of the strongest received beams, their TRP or cell IDs, and their angles-of-arrival with the corresponding mapping to the tiled content.
  • the viewing directions may be determined based on locations of TRPs 425a and 425b.
  • Mobile devices 440a and 440b may first estimate their locations and then select the appropriate sub-streams, i.e., tiles 450a and 450b. In general, use of a larger set of candidate beams for estimating the directionality improves the accuracy of the estimation of the direction in which mobile device 440a is directed.
  • the downlink transmission may comprise data, such as media content.
  • mobile device 440b may receive media content using strongest beam 430a1, or a set of beams, of TRP 425a, if the user of mobile device 440b is looking in the direction of TRP 425a.
  • Beams may be 3-dimensional, i.e., multiple beams may be sent in different directions along the horizontal axis and multiple beams may be sent in different directions along the vertical axis, wherein each beam has a length and possibly a beam width.
  • FIGURE 5 illustrates an exemplary end-to-end implementation in accordance with at least some embodiments.
  • a live 360-degree VR video content may be distributed using the exemplary end-to-end implementation of FIGURE 5.
  • Cameras 500 may be VR cameras, which may record live content from different viewing directions.
  • Each of cameras 500 may generate tile 550a of 360-degree media content individually.
  • Tiles 550a of the 360-degree directional media content may be sent to computer 505, which may stitch the content received from multiple cameras 500 to generate overall 360-degree content 550.
  • overall 360-degree content 550 may correspond to 3D viewing space 350.
  • computer 505 may encode overall 360-degree content 550, for example, using tile-based DASH encoding, wherein the 360-degree content 550 may be quantized into a set of tiles. Each tile 550a may represent a different viewing direction. In general, each tile 550a is associated with a sub-stream of transmission of overall 360-degree content 550. Overall 360-degree content 550 may be transferred to various mobile devices 540.
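  • For illustration only, one simple way to quantize a 360-degree viewing space into tiles is a uniform azimuth-elevation grid; the tile counts below are arbitrary example values, not values prescribed by this description:

        def tile_index(azimuth_deg, elevation_deg, n_az=8, n_el=4):
            # quantize a viewing direction into one of n_az * n_el tiles,
            # each covering (360 / n_az) x (180 / n_el) degrees
            col = int((azimuth_deg % 360.0) // (360.0 / n_az))
            row = int((elevation_deg + 90.0) // (180.0 / n_el))
            row = min(row, n_el - 1)  # elevation of +90 deg lands in the top row
            return row * n_az + col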
  • Computer 505 may transmit overall 360-degree content 550 to MEC or cloud server 510.
  • MEC or cloud server 510 may process overall 360-degree content 550 by adding location tags of TRPs 525 and/or direction tags of beams into the overall directional media content, which may then be transmitted, i.e., broadcast and/or multicast.
  • Direction tags may indicate the viewing direction associated with each tile 550a within overall 360-degree content 550, and each direction tag may be associated with one beam, to denote the direction of the beam in question.
  • direction tags may be combined with, e.g., beam information.
  • MEC or cloud server 510 may transmit overall 360-degree content 550 together with the location tags to BS 520.
  • BS 520 may transmit the received information to mobile device 540 directly or via TRPs 525.
  • Mobile device 540 may, based on the received transmission, determine information related to the strongest received beam, e.g., a direction of the strongest beam. Consequently, mobile device 540 may provide the appropriate content to the user based on the direction of the strongest beam, which may correspond to the real-time viewing direction of the user. Said content may be displayed to the user on the mobile device via a user interface.
  • live video content may be shown from the appropriate VR camera 500, which may be changed if the viewing direction of the user is changed, i.e., tile 550a may be changed. Changing the viewing direction of the user may change the strongest received beam as well.
  • MEC or cloud server 510 may transmit locations of TRPs 525 and directional properties, e.g., direction tags, of beams to BS 520.
  • BS 520 may then transmit overall 360-degree content 550 over all the available beams to mobile device 540, directly or via TRPs 525, along with the locations of TRPs 525 and the directional properties of the beams.
  • Mobile device 540 may thus receive the transmitted overall 360-degree content and estimate the real-time viewing direction of the user based on the directional properties of the strongest beams (candidate set) and the location of TRPs 525.
  • estimation of the viewing direction may be dependent on mobile device 540.
  • Mobile device 540 may determine the real-time viewing direction of the user and show the appropriate tile 550a to the user.
  • Said appropriate tile 550a may be associated with the appropriate VR camera 500, reflecting the real-time view the user would have if they were at the location where the content is being generated, e.g., at a football stadium.
  • directional content is generated, encoded and transported to a user, for example, in a VR viewing arena or movie theater.
  • the relationship between the received beams with the viewing direction may be estimated as shown in association with FIGURE 4.
  • An application in mobile device 540 may receive, from the physical layer of mobile device 540, information related to the strongest beams received by mobile device 540 and, based on this information, estimate the viewing direction of the user. Consequently, mobile device 540 may show the appropriate tile 550a to the user.
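  • A rough sketch of such a physical-layer-to-application flow; the modem callback interface and the report format below are assumptions for illustration, not an actual modem API:

        import queue

        beam_reports = queue.Queue()

        def on_phy_measurement(candidates):
            # assumed to be called by the modem/PHY driver with the current
            # candidate set: a list of (trp_id, beam_id, rsrp_dbm) tuples
            beam_reports.put(candidates)

        def application_loop(estimate_direction, show_tile):
            while True:
                candidates = beam_reports.get()     # blocks until a new report
                direction = estimate_direction(candidates)
                show_tile(direction)                # fetch and display the tile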
  • at least one strongest beam may be identified at a lower layer, e.g., a physical layer, of a mobile device using a Beam Reference Signal, BRS.
  • Identifying the at least one strongest beam may comprise comparing the received powers of all the beams.
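  • A minimal sketch of this comparison, assuming the per-beam received powers are available as a dictionary and that the k strongest beams form the candidate set:

        def candidate_set(beam_powers, k=3):
            # beam_powers: (trp_id, beam_id) -> received power in dBm
            ranked = sorted(beam_powers.items(), key=lambda kv: kv[1], reverse=True)
            return [beam for beam, _power in ranked[:k]]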
  • BRS may occupy 8 subcarriers (the 5th–12th subcarrier) in every Resource Block, RB, except the 18 RBs at the center of the frequency band. The 18 RBs at the center may be reserved for Primary Synchronization Signals, PSS, Secondary Synchronization Signals, SSS, and Extended Synchronization Signals, ESS.
  • BRS may be transmitted at every symbol (i.e., symbols 0–13) in subframes 0 and 25.
  • the data may be based on pseudo-random data, e.g., a Gold sequence.
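  • For illustration, the subcarrier layout described above could be enumerated as follows; the snippet assumes 0-based indexing, 12 subcarriers per RB, a symmetric placement of the reserved center RBs, and an arbitrary example bandwidth of 100 RBs:

        def brs_subcarriers(n_rb=100, reserved_center_rbs=18):
            # subcarrier indices carrying BRS: the 5th-12th subcarrier
            # (0-based 4..11) of every RB outside the reserved center RBs
            first_reserved = (n_rb - reserved_center_rbs) // 2
            indices = []
            for rb in range(n_rb):
                if first_reserved <= rb < first_reserved + reserved_center_rbs:
                    continue  # center RBs reserved for PSS/SSS/ESS
                base = rb * 12
                indices.extend(range(base + 4, base + 12))
            return indices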
  • the detected at least one strongest beam may be signaled from the physical layer to an application layer of mobile device 540.
  • a VR or directional media content application that is running on mobile device 540 and showing the directional content, or selecting the content to be shown, to the user may frequently fetch this information in real-time from the physical layer, to keep track of the viewing direction of the user of mobile device 540.
  • An overview of possible metadata that may be signaled within a beam is given below; such metadata may need to be extracted for detecting the candidate set of beams in the physical layer of the mobile device.
  • metadata may refer to a set of data that describes and gives information about other data.
  • PSS/SSS, ESS, BRS, etc. may be described as possible metadata, which makes it possible for the mobile device to detect the beam information.
  • the candidate set of beams or their identities may be signaled to the Internet Protocol, IP, layer where the application may be located.
  • the mobile device may use the metadata to identify beams. Based on the calculation of the strongest candidate beams, SCB, the mobile device may determine the strongest set of received beams, i.e., the candidate set of beams, which may then be used to estimate the viewing direction.
  • a mapping of beams to tiles of 3D viewing space may be used. Such a mapping may be seen as a mapping of beams to all the possible viewing directions of the user, so that a combination of beams received from different transmitters corresponds to one possible viewing direction.
  • the viewing directions may be quantized. As an example, if there are 100 possible combinations of beams, the number of possible viewing directions may be 100 as well.
  • a first beam combination may correspond to a first possible viewing direction and a second beam combination may correspond to a second possible viewing direction, etc. That is to say, the first beam combination may correspond to a first tile and the second beam combination may correspond to a second tile.
  • Such a mapping may be signaled by the application server to the application client in the mobile device as mapping instructions.
  • the mapping instructions may be referred to as a mapping table in some embodiments.
  • the mapping instructions may include the relationship between the beam combinations signaled by the physical layer of the mobile device and the tiles.
  • the mobile device may receive mapping instructions from the network node, wherein the mapping instructions comprise mappings between beam combinations and tiles of the directional media content, select a part of the mapping instructions based on a combination of received beams and select the tile of the directional media content based on the selected part of the mapping instructions.
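  • A minimal sketch of such mapping instructions and the corresponding lookup, using the example beam combinations of FIGURE 6 as hypothetical keys:

        # one row per beam combination: the combination maps to one tile,
        # i.e., one quantized viewing direction (cf. FIGURE 6)
        mapping_instructions = {
            frozenset({("a", "x"), ("b", "y")}): "tile_650a1",  # {ax, by}
            frozenset({("a", "k"), ("b", "l")}): "tile_650aN",  # {ak, bl}
        }

        def select_tile(received_candidates):
            # received_candidates: set of (trp_id, beam_id) pairs for the
            # strongest beams signalled by the physical layer
            return mapping_instructions.get(frozenset(received_candidates))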
  • FIGURE 6 illustrates a mapping table in accordance with at least some embodiments.
  • the mapping table, i.e., the mapping instructions, may be transmitted by a network, e.g., a BS, to a mobile device.
  • the mapping table may be used for estimating a viewing direction of a user, i.e., the direction to which the mobile device is directed.
  • the mapping table may be handled at the physical layer of the mobile device.
  • the mobile device may receive the mapping table and use the mapping table to estimate the viewing direction of the user at the physical layer.
  • the estimated viewing direction of the user may be provided to the application layer of the mobile device, from the physical layer of the mobile device, to be displayed on a display of the mobile device.
  • 3D viewing space 650₁ corresponds to a viewing space for a first location of a first network node, e.g., a cell, TRP or a BS
  • 3D viewing space 650₂ corresponds to a viewing space for a second location of a second network node.
  • the mobile device may receive a set of candidate beams, e.g., a set of strongest beams {ax, by}, which may comprise a first beam, {ax} or {ak}, from the first network node and a second beam, {by} or {bl}, from the second network node.
  • the first beam and the second beam together may indicate a row to be selected from a mapping table, and each row may correspond to a certain viewing direction.
  • a part of the mapping instructions may refer to a row in the mapping table.
  • Tile 650a₁ corresponds to a first, quantized viewing direction {X1, Y1, Z1}, which may be mapped to candidate beams {ax, by}.
  • tile 650aN corresponds to an Nth viewing direction {Xn, Yn, Zn}, which may be mapped to candidate beams {ak, bl}.
  • the mobile device may hence estimate that the viewing direction of the user, i.e., the direction the mobile device is directed to in 3D space, is, for example, {X1, Y1, Z1} if beams {ax, by} are the strongest beams received from the first network node and the second network node, respectively.
  • the estimated viewing direction would enable the mobile device to show the appropriate media content to the user, e.g., tile 650a₁, within the viewing space of the directional media content.
  • the mobile device may estimate that the viewing direction of the user corresponds to tile 650aN, i.e., {Xn, Yn, Zn}.
  • the mobile device may thus receive, from a network node, a mapping table.
  • each row may comprise a mapping between one beam combination and one tile of the directional media content.
  • the mobile device may select a row in the mapping table based on the estimated viewing direction of the user and also select the tile of the directional media content based on the selected row.
  • the mapping instructions may also be provided based on identities of the received strongest beams and their corresponding angles-of-arrival estimated by the mobile device. With reference to FIGURE 6, this would imply a mapping between beams {ax, by}, with respective angles-of-arrival {Ak, Al}, and the viewing direction {X1, Y1, Z1}, i.e., tile 650a₁. Instead of the viewing direction {X1, Y1, Z1}, the mapping may be made directly to a tile within the directional content, e.g., tile 650a₁ may be mapped directly to the received radio environment information.
  • the received radio environment information may be in terms of identities of beams, angles-of-arrival, etc., and it may be mapped to a tile within the directional content.
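  • A sketch of a lookup keyed on beam identities together with quantized angles-of-arrival; the quantization step is an assumed example value, coarse enough to absorb small estimation errors:

        def quantize(angle_deg, step_deg=15.0):
            return round(angle_deg / step_deg) * step_deg

        def select_tile_by_radio_info(mapping, candidates):
            # candidates: [(beam_id, aoa_deg), ...]; the mapping keys pair each
            # beam identity with its quantized angle-of-arrival
            key = frozenset((beam, quantize(aoa)) for beam, aoa in candidates)
            return mapping.get(key)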
  • the mobile device may thus select the appropriate tile and display the appropriate sub-stream on a display of the mobile device. It is noted that in some embodiments there is no information exchange between the mobile device and the network node: the network node does not have any information about the real-time viewing direction of the user, i.e., the direction to which the mobile device is directed, and does not adapt the beam directionality based on the viewing direction of the user. Based on the available received information, the mobile device may estimate the viewing direction and display the appropriate directional content. A similar mapping could be applied for viewing space 650₂, when the user is viewing directional content from a different location and source node, with a location-specific context.
  • a 3D viewing space may be formed of a finite set of quantized viewing directions.
  • one quantized viewing direction, e.g., 650a₁
  • the application within the mobile device may show directional content to the user, depending on the real-time viewing direction of the user which may be identified based on the received beams.
  • the beams may be highly directional within the 3D space and BRS may be utilized to identify and segregate the received beams.
  • the mapping instructions may be transmitted locally.
  • locations of the TRPs and directions of the beams may be signaled to enable the mobile device to estimate the mapping as well.
  • Mobility may depend on the type of directional content viewed by the user and the application within the mobile device. For example, if the user is watching a movie, the real-time viewing direction within the viewing space may be relevant for a certain location of the mobile device. Such content may be referred to as static or location-independent directional media content, for which a simple mapping between the detected beams and the quantized viewing directions, i.e., tiles of the directional media content, would be sufficient. The candidate set of beams may change depending on the movement of the mobile device, which may need to be covered using a larger number of TRPs in order to provide coverage within the entire region where the content is transmitted.
  • Embodiments may be used for dynamic or location-dependent directional content.
  • the viewed content may depend not only on the real-time viewing direction but also on the location of the mobile device, e.g., within a museum, exhibition centers or other scenarios. Different content may be shown to users to give them the full virtual-reality experience of being in the virtual world as the users move around within the physical space.
  • different TRPs may transmit different directional content at different times.
  • One transmission of the directional content may be relevant for a current location of the mobile device.
  • the content could keep changing as the user moves around, while still maintaining similar mapping, i.e., relationship between the beams and the viewing directions.
  • the same beam identities may be reused while transmitting different content from different TRPs as the user is moving around, with each set of TRPs covering a finite region.
  • the mapping instructions may contain additional cell identity information indicating which location-specific content should be selected to show a specific tile to the user.
  • Locations may be associated with cell identities, which may be determined by the physical layer of the mobile device and signaled to the application layer.
  • a 3D viewing space for a first location may be associated with an identity of a first cell and a second location may be associated with an identity of a second cell of the transmitting first and second BSs, respectively.
  • the mobile device may obtain time and frequency synchronization with the cell, and the cell identities, from the PSS and SSS.
  • the mapping instructions may therefore be cell-specific.
  • Cell-specific mapping instructions enable location specificity, since a cell would have limited coverage area.
  • cell-specific mapping instructions may be advantageous especially in dynamic scenarios, wherein users may be moving around.
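  • A minimal sketch of such a cell-scoped lookup, assuming the mapping is keyed by the pair (cell identity, beam combination):

        def select_location_specific_tile(mapping, cell_id, beam_combination):
            # scoping the lookup by cell identity lets the same beam identities
            # be reused by different TRP sets covering different regions
            return mapping.get((cell_id, frozenset(beam_combination)))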
  • an application for showing the directional content to a user may be installed by the user on a mobile device.
  • the application may be pre-installed on the mobile device.
  • the application of the mobile device may be provided by the network infrastructure owner provisioning the content.
  • the selection of the content may be based on the scenario as well. For example, if the user is watching the directional content at home, principles similar to video-on-demand may apply. On the other hand, if the user is watching the content at a movie theater, the application may be provided by the theater entity for showing content similar to linear content, wherein the content may be shown to users when the movie starts playing on the mobile device. In this scenario, the content may be shown once the user enables the application, and the application may fetch the content from the stream transmitted over-the-air. In outdoor scenarios such as stadiums, the users may also select from a wide range of available directional content based on a live event ongoing at the stadium, with the directional view provided according to some embodiments.
  • the application may be installed by the user or pre-installed by an owner of an infrastructure, depending on the ownership of the mobile device. Also, content selection may be done by the user by fetching the content, similar to video-on-demand. Content selection may be limited to pre-decided ones, similar to linear content. In outdoor scenarios with mobility, content may be selected using a combination of linear and video-on-demand.
  • Some embodiments may provide simple and efficient implementation of directional / immersive content, by using the unique characteristics of beam-based transmissions. Significant cost reductions in the mobile device may be achieved due to the lack of need for special equipment. Significantly improved technology adoption is possible as well, since essentially any mobile device, e.g., 5G UE, may support reception and efficient display of directional / immersive content.
  • FIGURE 7 illustrates an example apparatus capable of supporting at least some embodiments. Illustrated is device 700, which may comprise, for example, mobile device 140, e.g., a UE, or a network node, such as BS 120 of FIGURE 1.
  • Device 700 may comprise processor 710, which may comprise, for example, a single- or multi-core processor, wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core.
  • Processor 710 may comprise, in general, a control device.
  • Processor 710 may comprise more than one processor.
  • Processor 710 may be a control device.
  • a processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings or a Steamroller processing core produced by Advanced Micro Devices Corporation.
  • Processor 710 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor.
  • Processor 710 may comprise at least one application-specific integrated circuit, ASIC.
  • Processor 710 may comprise at least one field-programmable gate array, FPGA.
  • Processor 710 may be means for performing method steps in device 700.
  • Processor 710 may be configured, at least in part by computer instructions, to perform actions.
  • a processor may comprise circuitry, or be constituted as circuitry or circuitries, the circuitry or circuitries being configured to perform phases of methods in accordance with embodiments described herein.
  • the term “circuitry” may refer to one or more or all of the following: (a) hardware-only circuit implementations, such as implementations in only analog and/or digital circuitry, (b) combinations of hardware circuits and software, such as, as applicable: (i) a combination of analog and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions, and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
  • circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware.
  • circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in server, a cellular network device, or other computing or network device.
  • Device 700 may comprise memory 720.
  • Memory 720 may comprise random-access memory and/or permanent memory.
  • Memory 720 may comprise at least one RAM chip.
  • Memory 720 may comprise solid-state, magnetic, optical and/or holographic memory, for example.
  • Memory 720 may be at least in part accessible to processor 710.
  • Memory 720 may be at least in part comprised in processor 710.
  • Memory 720 may be means for storing information.
  • Memory 720 may comprise computer instructions that processor 710 is configured to execute. When computer instructions configured to cause processor 710 to perform certain actions are stored in memory 720, and device 700 overall is configured to run under the direction of processor 710 using computer instructions from memory 720, processor 710 and/or its at least one processing core may be considered to be configured to perform said certain actions.
  • Memory 720 may be at least in part comprised in processor 710.
  • Memory 720 may be at least in part external to device 700 but accessible to device 700.
  • Device 700 may comprise a transmitter 730.
  • Device 700 may comprise a receiver 740.
  • Transmitter 730 and receiver 740 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard.
  • Transmitter 730 may comprise more than one transmitter.
  • Receiver 740 may comprise more than one receiver.
  • Transmitter 730 and/or receiver 740 may be configured to operate in accordance with Global System for Mobile communication, GSM, Wideband Code Division Multiple Access, WCDMA, 5G, Long Term Evolution, LTE, IS-95, Wireless Local Area Network, WLAN, Ethernet and/or Worldwide Interoperability for Microwave Access, WiMAX, standards, for example.
  • Device 700 may comprise a Near-Field Communication, NFC, transceiver 750.
  • NFC transceiver 750 may support at least one NFC technology, such as Bluetooth, or similar technologies.
  • Device 700 may comprise User Interface, UI, 760.
  • UI 760 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing device 700 to vibrate, a speaker and a microphone.
  • a user may be able to operate device 700 via UI 760, for example to accept incoming telephone calls, to originate telephone calls or video calls, to browse the Internet, to manage digital files stored in memory 720 or on a cloud accessible via transmitter 730 and receiver 740, or via NFC transceiver 750, and/or to play games.
  • Device 700 may comprise or be arranged to accept a user identity module 770.
  • User identity module 770 may comprise, for example, a Subscriber Identity Module, SIM, card installable in device 700.
  • a user identity module 770 may comprise information identifying a subscription of a user of device 700.
  • a user identity module 770 may comprise cryptographic information usable to verify the identity of a user of device 700 and/or to facilitate encryption of communicated information and billing of the user of device 700 for communication effected via device 700.
  • Processor 710 may be furnished with a transmitter arranged to output information from processor 710, via electrical leads internal to device 700, to other devices comprised in device 700.
  • a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 720 for storage therein.
  • the transmitter may comprise a parallel bus transmitter.
  • Likewise, processor 710 may comprise a receiver arranged to receive information in processor 710, via electrical leads internal to device 700, from other devices comprised in device 700.
  • Such a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 740 for processing in processor 710.
  • the receiver may comprise a parallel bus receiver.
  • Device 700 may comprise further devices not illustrated in FIGURE 7.
  • device 700 may comprise at least one digital camera.
  • Some devices 700 may comprise a back-facing camera and a front-facing camera, wherein the back-facing camera may be intended for digital photography and the front-facing camera for video telephony.
  • Device 700 may comprise a fingerprint or face sensor arranged to authenticate, at least in part, a user of device 700.
  • device 700 lacks at least one device described above.
  • some devices 700 may lack an NFC transceiver 750 and/or user identity module 770.
  • Processor 710, memory 720, transmitter 730, receiver 740, NFC transceiver 750, UI 760 and/or user identity module 770 may be interconnected by electrical leads internal to device 700 in a multitude of different ways.
  • each of the aforementioned devices may be separately connected to a master bus internal to device 700, to allow for the devices to exchange information.
  • this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the embodiments.
  • FIGURE 8 is a flow graph of a first method in accordance with at least some embodiments.
  • the phases of the illustrated first method may be performed by mobile device 140, such as a UE, or by a control device configured to control the functioning thereof, possibly when installed therein.
  • the first method may comprise, at step 810, receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content.
  • the first method may also comprise, at step 820, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams.
  • the first method may comprise, at step 830, selecting a tile of the directional media content based on the estimated viewing direction.
  • the first method may comprise displaying the tile of the directional media content on a mobile device.
  • FIGURE 9 is a flow graph of a second method in accordance with at least some embodiments.
  • the phases of the illustrated second method may be performed by BS 120 or a network node in general, or by a control device configured to control the functioning thereof, possibly when installed therein.
  • the second method may comprise, at step 910, transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content.
  • the second method may also comprise, at step 920, transmitting information for displaying a tile of the directional media content, e.g., on a display of a mobile device.
  • an apparatus, such as, for example, a terminal or a network node, may comprise means for carrying out the embodiments described above and any combination thereof.
  • a computer program may be configured to cause a method in accordance with the embodiments described above, and any combination thereof, to be performed.
  • a computer program product embodied on a non-transitory computer readable medium, may be configured to control a processor to perform a process comprising the embodiments described above and any combination thereof.
  • an apparatus, such as, for example, a terminal or a network node, may comprise at least one processor, and at least one memory including computer program code, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform the embodiments described above and any combination thereof.
  • At least some embodiments find industrial application in wireless communication networks, wherein video or directional media content is transmitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Quality & Reliability (AREA)
  • Computer Graphics (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

According to an example aspect of the present invention, there is provided an apparatus comprising means for receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device.

Description

DELIVERING AND HANDLING MEDIA CONTENT IN A WIRELESS
COMMUNICATION NETWORK
FIELD
[0001] Various example embodiments relate in general to wireless communication networks, and to delivering and handling of media content in such networks.
BACKGROUND
[0002] Certain applications, such as, Virtual Reality, VR, may require high data rates for delivering media content, such as, for example, video content. In general, higher frequency bands have more bandwidth available for wireless transmissions, which enables higher data rates. Consequently, current standardization efforts in the field of radio communications comprise the exploitation of higher frequency bands for wireless transmissions. For example, 3rd Generation Partnership Project, 3GPP, develops 5G technology and considers the use of millimetre-wave frequency bands for it. There is therefore a need to provide improved methods, apparatuses and computer programs for transmitting and handling of media content, especially on high frequency bands.
SUMMARY
[0003] According to some aspects, there is provided the subject-matter of the independent claims. Some embodiments are defined in the dependent claims.
[0004] According to a first aspect, there is provided an apparatus comprising means for receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device. [0005] According to the first aspect, the means may be further configured to perform transmission as a broadcast transmission or a multicast transmission.
[0006] According to the first aspect, the means may be further configured to perform selecting at least one strongest beam from the set of beams and selecting the tile of the directional media content based on the selected at least one strongest beam.
[0007] According to the first aspect, the means may be further configured to perform selecting a set of strongest beams from the set of beams and selecting the tile of the directional media content based on the selected set of strongest beams.
[0008] According to the first aspect, the means may be further configured to perform receiving information about the at least one beam, wherein said information comprises an identity of the at least one beam and selecting the tile of the directional media content based on the identity of the at least one beam.
[0009] According to the first aspect, the means may be further configured to perform receiving mapping instructions from the network node, wherein the mapping instructions comprise mappings between beam combinations and tiles of the directional media content, selecting a part of the mapping instructions based on a combination of received beams and selecting the tile of the directional media content based on the selected part of the mapping instructions.
[0010] According to the first aspect, the means may be further configured to perform determining an angle of arrival of the at least one beam and selecting the tile of the directional media content based on the angle of arrival of the at least one beam.
[0011] According to the first aspect, the means may be further configured to perform determining the angle of arrival of the at least one beam based on a direction tag associated with the at least one beam and a location tag of the network node.
[0012] According to the first aspect, the means may be further configured to perform receiving information about a location of the network node, determining a direction of the at least one beam based on the location of the network node and selecting the tile of the directional media content based on the direction of the at least one beam.
[0013] According to the first aspect, the means may be further configured to perform rendering the tile of the directional media content for the user. [0014] According to the first aspect, the means may be further configured to perform signalling information about the at least one beam and/or the estimated viewing direction of the user from a physical layer of the mobile device to an application layer and selecting, at the application layer, the tile of the directional media content based on the signalled information.
[0015] According to the first aspect, the set of beams may comprise beams in horizontal and vertical directions.
[0016] According to the first aspect, the means may be further configured to perform receiving information related to multiple streams or programs via the at least one beam, selecting one of said multiple streams or programs and displaying the selected stream or program on the display of the mobile device.
[0017] According to a second aspect, there is provided an apparatus comprising means for transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content and transmitting information for displaying a tile of the directional media content on a mobile device.
[0018] According to the second aspect, the means may be further configured to perform transmission as a broadcast transmission or a multicast transmission.
[0019] According to the second aspect, said information may comprise information related to the set of beams. [0020] According to the second aspect, said information may comprise mapping instructions, and the mapping instructions may comprise mappings between beam combinations and tiles of the directional media content.
[0021] According to the second aspect, said information may comprise an identity of a beam. [0022] According to the second aspect, said information may comprise a location of the network node.
[0023] According to the second aspect, the set of beams may comprise beams in horizontal and vertical directions. [0024] According to the second aspect, the means may be further configured to perform transmitting information related to multiple streams or programs via each beam of the set of beams.
[0025] According to a third aspect, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to perform, receive from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimate a viewing direction based at least partially on directional properties of at least one beam of the set of beams, select a tile of the directional media content based on the estimated viewing direction and display the tile of the directional media content on a mobile device.
[0026] According to a fourth aspect, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to perform, transmit a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmit information for displaying a tile of the directional media content on a mobile device.
[0027] According to a fifth aspect, there is provided a method comprising receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device.
[0028] According to a sixth aspect, there is provided a method comprising transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmitting information for displaying a tile of the directional media content on a mobile device.
[0029] According to a seventh aspect, there is provided a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least perform, receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device. [0030] According to an eighth aspect, there is provided a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least perform, transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmitting information for displaying a tile of the directional media content on a mobile device.
[0031] According to a ninth aspect, there is provided a computer program configured to perform, receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams, selecting a tile of the directional media content based on the estimated viewing direction and displaying the tile of the directional media content on a mobile device.
[0032] According to a tenth aspect, there is provided a computer program configured to perform, transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content, and transmitting information for displaying a tile of the directional media content on a mobile device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0033] FIGURE 1 illustrates an exemplary network scenario in accordance with at least some embodiments; [0034] FIGURE 2 illustrates an exemplary network scenario for transmitting directional content in accordance with at least some embodiments;
[0035] FIGURE 3 illustrates an exemplary scenario in accordance with at least some embodiments; [0036] FIGURE 4 illustrates an example related to estimation of a viewing direction based on received beams in accordance with at least some embodiments;
[0037] FIGURE 5 illustrates an exemplary end-to-end implementation in accordance with at least some embodiments; [0038] FIGURE 6 illustrates a mapping in accordance with at least some embodiments;
[0039] FIGURE 7 illustrates an example apparatus capable of supporting at least some embodiments;
[0040] FIGURE 8 illustrates a flow graph of a first method in accordance with at least some embodiments;
[0041] FIGURE 9 illustrates a flow graph of a second method in accordance with at least some embodiments.
EMBODIMENTS
[0042] Transmission and handling of media content, such as, for example, video content, may be improved by the procedures described herein. In more detail, a network node may transmit a transmission using a set of beams, wherein the transmission comprises directional media content. In general, beams may be used for highly directional radio transmissions, which may be used to transmit data from a transmitter, e.g., a base station, to a receiver, e.g., a mobile device. Upon reception of the transmission, the mobile device may select a strongest beam, or beams, and estimate a viewing direction of a user of the mobile device based on a direction of the selected beam(s). The viewing direction of the user may correspond to a direction to which the mobile device is directed to. The mobile device may also select a tile of the media content based on a direction of the selected beam(s) and display the selected tile on a display of the mobile device. Media content may be, for example, directional media content.
[0043] More specifically, some embodiments relate to delivering and handling of high-quality directional media content for applications, such as, Virtual Reality, VR, and/or Augmented Reality, AR. Such applications typically deliver 3D video content and require high data rates. For example, data rates of at least 1 Gbps may be required, depending on the quality of the media content. In the context of 5G the requirement may be fulfilled by exploiting enhanced Mobile BroadBand, eMBB type of communication. Moreover, such applications also require low latencies and small jitter. In general, it is desirable to utilize wireless transmission, e.g., multicast or broadcast transmissions, for the delivery of such media content, because delivering the content using unicast transmissions may be impractical.
[0044] Multicast transmissions may be used if the same directional media content is addressed to a selected group of users, whereas broadcast transmissions may be used if the content needs to be sent to all the users. In some embodiments, said media content may refer to directional media content, such as, for example, 3D, VR, AR, 360-degree videos or any form of media content with directional properties. That is to say, media shown to the user may depend or vary based on the viewing direction of the user.
[0045] High data rates may be achieved using high frequency bands, because typically there is more bandwidth available on higher frequency bands. For example, frequency bands between 30 and 300 GHz may be used. Such frequency bands may be referred to as millimetre-wave bands. Future radio communication systems may use even higher frequency bands. Beamforming is often, if not always, used on millimetre-wave bands to improve the performance of wireless communication systems. Embodiments may exploit beamforming for delivering high-quality media content, but are not restricted to any particular radio access technology, or frequency band, and may be exploited in any wireless communication system, wherein beamforming is used.
[0046] In some embodiments, a 3D grid of beams may be used for delivering directional media content. In general, a 3D grid of beams may comprise directional beams which are spatially separated in a 3D space. 3D properties may comprise a beamwidth along with a vertical and horizontal direction of a beam. By vertical it is meant a direction perpendicular to the plane of the horizon. The grid, or set, of beams would thus be three-dimensional. Moreover, a 3D grid of beams may be used to deliver any form of data, including 3D media content.
[0047] FIGURE 1 illustrates an exemplary network scenario in accordance with at least some embodiments. According to the example scenario of FIGURE 1, there may be a beam-based wireless communication system, which comprises server 110, base station 120, beams 130 and mobile devices 140. Server 110 may be a low-latency edge cloud/content server. FIGURE 1 shows beams 130 horizontally for clarity. Nevertheless, a set of beams 130 may also comprise beams in a vertical direction, thereby forming a 3D grid, or set, of beams 130.
[0048] Mobile devices 140 may comprise, for example, a User Equipment, UE, a smartphone, a cellular phone, a Machine-to-Machine, M2M, node, a machine-type communications node, an Internet of Things, IoT, node, a car telemetry unit, a laptop computer, a tablet computer, a wireless head-mounted device, or, indeed, another kind of suitable wireless device or mobile device. In the example system of FIGURE 1, mobile devices 140 communicate wirelessly with a cell of Base Station, BS, 120 via beams 130. BS 120 may be considered as a serving BS for mobile devices 140. The air interface between mobile device 140 and BS 120 may be configured in accordance with a Radio Access Technology, RAT, which both mobile device 140 and base station 120 are configured to support.
[0049] Examples of cellular RATs include Long Term Evolution, LTE, New Radio, NR, which is also known as fifth generation, 5G, radio access technology, and MulteFire. On the other hand, examples of non-cellular RATs include Wireless Local Area Network, WLAN, and Worldwide Interoperability for Microwave Access, WiMAX. For example, in the context of LTE, BS 120 may be referred to as eNB, while in the context of NR, BS 120 may be referred to as gNB. Also, for example in the context of WLAN, BS 120 may be referred to as an access point. In general, BS 120 may be referred to as a network node. Mobile devices 140 may be similarly referred to as UEs. In any case, embodiments are not restricted to any particular wireless technology, and may be exploited in any system which uses beamforming for wireless transmissions.
[0050] BS 120 may be connected, directly or via at least one intermediate node, with core network 110. Core network 110 may be, in turn, coupled with another network (not shown in FIGURE 1), via which connectivity to further networks may be obtained, for example via a worldwide interconnection network. BS 120 may be connected with at least one other BS as well via an inter-base station interface (not shown in FIGURE 1), even though in some embodiments the inter-base station interface may be absent. BS 120 may be connected, directly or via at least one intermediate node, with core network 110 or with another core network. [0051] FIGURE 1 shows mobile devices 140 configured to be in a wireless connection on one or more communication channels in a cell with a BS 120 (such as (e/g)NodeB providing the cell). The physical link from mobile device 140 to a BS 120 is called uplink or reverse link and the physical link from BS 120 to mobile device 140 is called downlink or forward link. It should be appreciated that BS 120 or its functionalities may be implemented by using any node, host, server or access point, etc., entity suitable for such a usage. The current architecture in LTE networks is fully distributed in the radio and fully centralized in the core network. However, the low latency applications and services in 5G require bringing the content close to the radio, which leads to local breakout and Multi-access Edge Computing, MEC. 5G enables analytics and knowledge generation to occur at the source of the data. This approach requires leveraging resources that may not be continuously connected to a network, such as laptops, smartphones, tablets and sensors. MEC provides a distributed computing environment for application and service hosting. It also has the ability to store and process content, host application instances, etc., in close proximity to cellular subscribers for faster response time. 5G (or NR) networks are being designed to support multiple hierarchies, where MEC servers can be placed between the core and BS 120. It should be appreciated that MEC can be applied in LTE networks as well.
[0052] Beam-based wireless communication systems enable efficient transmission of media content to multiple mobile devices 140 using multicast and/or broadcast transmissions. As an example, the scenario of FIGURE 1 may be considered as an indoor viewing arena, wherein BS 120 may transmit media content and mobile devices 140 may receive the media content via beams 130. Tailored sub-content may be transmitted to different mobile devices 140 as well, e.g., for censoring a part of the content for adult and under-age viewers, for delivering tailored advertisements and subtitles in different languages, etc. Tailored sub-content may be transmitted using unicast along with multicast and/or broadcast transmission.
[0053] In some embodiments, wireless communications may be exploited instead of wired communications due to the inherent robustness and flexibility that wireless solutions provide. For example, deployment of VR devices using wired connections in movie theatres or stadiums would make the system susceptible to wear and tear, and possible loss of connectivity due to wire damages. Furthermore, mobile devices may be built in a robust manner, with automated tests conducted to ensure that the mobile devices are performing well. Development of a new ecosystem through industry verticals may be enabled as well, using generic deployments. Concerning mobility, for example in indoor deployments limited mobility is typically allowed and may be tolerated without noticeable loss of quality, such as, standing up and slight adjustments for comfort, etc. However, service interruptions or loss of quality may occur if mobile device 140 moves significantly from the assigned location.
[0054] One challenge related to ubiquitous availability and technology adoption of media content, such as VR, is the availability of low-cost mobile devices which could be exploited to deliver such content to the users. Generally speaking, consumers tend to prefer cheaper or low-cost equipment for consuming immersive and directional audio-visual, i.e., media, content. However, currently available VR headsets are expensive compared to regular mobile devices, because additional components are needed for enabling seamless visibility of directional content. In general, directional content may refer to media content, such as video or audio, which has location or directional relevance. For example, virtual and augmented reality content may be considered as directional content, where the viewed content needs to be adjusted based on the location and viewing direction of a user.
[0055] Moreover, a viewing direction of the user may refer to a direction the user is looking at or to a direction the mobile device is directed to. For instance, in case of 3D media content the viewing direction of the user may refer to the direction the user is looking at in 3D space, i.e., a tile of the 3D viewing space. Thus, the mobile device would be directed to the tile of the 3D viewing space as well. The media content may be provided as a plurality of tiles, wherein each tile may correspond to a sub-stream of a stream, and the stream comprises the overall media content. The appropriate directional media content may be shown to the user depending on the viewing direction, i.e., the direction the mobile device is directed to. Thus, to change to another tile, users may simply turn their heads or viewing direction to change the direction the mobile device is directed towards. The direction may be estimated and the mobile device may fetch and show a tile to the user based on the estimated new viewing direction.
[0056] The total number of tiles may depend on the encoding technique used. In some embodiments, it may be assumed that the entire directional media content may be quantized into an appropriate number of tiles. The quantization may also be based, e.g., on the accuracy of direction estimation of the network. Accuracy of direction estimation may depend on the number of base stations and available beams. In some embodiments, a mobile device may have additional antenna configurations to enable a beam-based system design, such as loops around the mobile device, both in uplink and downlink. The mobile device may therefore be reused as a VR headset without any additional equipment. Thus, the antenna configuration of the mobile device may be exploited to enable estimation of the viewing direction of the user.
[0057] In some embodiments, multiple antennas may be positioned in the mobile device depending on operational frequency bands, for accurate detection of beam and transmission parameters. As an example, a mobile device may comprise multiple antennas for reception of a beamformed transmission. Said multiple antennas may be at different locations within the mobile device or associated with the mobile device. The mobile device may receive, for example, a downlink transmission using said multiple antennas. As the antennas may be at different locations, different antennas may receive a beam at different arrival times. Thus, for example an arrival time of the beam may be determined separately for each antenna. An angle-of-arrival may then be determined based on the arrival times. Hence, the mobile device receiving the directional media content would be able to accurately determine the directional properties of the received beam.
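The following sketch illustrates the arrival-time approach for the simplest case of two antennas; the far-field plane-wave assumption, the function name and the numeric values are illustrative rather than part of the described embodiments:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def angle_of_arrival(t1_s, t2_s, spacing_m):
    """Estimate the angle of arrival (radians from broadside) of a beam
    from its arrival times at two antennas. For a far-field plane wave,
    the path difference is c * (t2 - t1) = spacing * sin(theta)."""
    path_diff = SPEED_OF_LIGHT * (t2_s - t1_s)
    # Clamp to [-1, 1] to guard against measurement noise.
    s = max(-1.0, min(1.0, path_diff / spacing_m))
    return math.asin(s)

# Antennas 5 cm apart; the wavefront reaches the second antenna
# about 0.1 ns later than the first.
theta = angle_of_arrival(0.0, 1.0e-10, 0.05)
print(math.degrees(theta))  # roughly 37 degrees off broadside
```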
[0058] Said multiple antennas associated with the mobile device may form an antenna array, which may be used for directional signal reception. For example, a conventional beamformer, known as a delay-and-sum beamformer, may be formed. The conventional beamformer may give equal magnitudes for the weights of the antennas and it may be steered to a certain direction by assigning proper phases for each antenna. A phase shifter may be used in association with each antenna for directional signal reception.
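A minimal sketch of such a delay-and-sum beamformer for a uniform linear array is given below; the array geometry, the carrier frequency and the steering-vector convention are assumptions made for the example:

```python
import numpy as np

def delay_and_sum_weights(n_antennas, spacing_m, wavelength_m, steer_rad):
    """Conventional (delay-and-sum) beamformer weights for a uniform
    linear array: equal magnitudes, with a per-antenna phase shift so
    that a signal from the steering direction adds coherently."""
    n = np.arange(n_antennas)
    phase = -2j * np.pi * n * spacing_m * np.sin(steer_rad) / wavelength_m
    return np.exp(phase) / n_antennas

# Half-wavelength array at 28 GHz, steered 20 degrees off broadside.
lam = 3e8 / 28e9
w = delay_and_sum_weights(8, lam / 2, lam, np.deg2rad(20.0))

# A snapshot arriving exactly from the steering direction is passed
# with unit gain: output = w^H x.
x = np.exp(-2j * np.pi * np.arange(8) * (lam / 2) * np.sin(np.deg2rad(20.0)) / lam)
print(abs(np.vdot(w, x)))  # -> 1.0
```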
[0059] In some embodiments, a method for transmitting high-definition, high-data rate, directional media content using wireless communications is provided. One of the aims is to enable viewing of directional media content while minimizing complexity of mobile device implementation and avoiding additional, dedicated components. Some embodiments therefore provide a cost-efficient method for delivering media content. Moreover, the efficient delivery of directional media content is enabled.
[0060] Transmissions of a network node, such as BS 120, a Transmit-Receive Point, TRP, a distributed unit or a remote radio head, may have highly directional characteristics, which may be exploited by mobile devices, such as mobile devices 140, for estimating the viewing direction of the transmitted directional media content, i.e., the direction the mobile device is directed to. A source network node may wirelessly transmit to mobile devices transmissions which may comprise media content, e.g., VR content. The mobile device may be an end-user device, for example, a low-cost UE without directional measurement or estimation capabilities, capable of displaying the content to the user with high quality.
[0061] In some embodiments, transmissions may be performed by exploiting beamforming. The network node may transmit media content over all available beams together with a location tag. The location tag may be used for determining a location of the network node. For example, the location tag may comprise a relative location of the network node, compared to other network nodes. Thus, a direction of at least one beam may be estimated using the relative location of the network node and a location of the mobile device. Alternatively, or in addition, the mobile device may use the location of the network node and information related to beams, such as received powers, identities of the beams or beam reference information, to determine the direction of the at least one beam. The mobile device may then provide directional content, i.e., a tile, based on the determined direction of the at least one beam. That is to say, the mobile device may then provide directional content based on a determined direction to which the mobile device is directed.
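For instance, a direction towards the transmitting node can be derived from the relative location carried in the location tag and the location of the mobile device; the following sketch assumes a shared local coordinate frame in metres and a hypothetical helper name:

```python
import math

def direction_to_node(node_xyz, device_xyz):
    """Direction (azimuth, elevation) in radians from the mobile device
    towards the network node, computed from the node's location tag and
    the device location in a shared local frame."""
    dx = node_xyz[0] - device_xyz[0]
    dy = node_xyz[1] - device_xyz[1]
    dz = node_xyz[2] - device_xyz[2]
    azimuth = math.atan2(dy, dx)
    elevation = math.atan2(dz, math.hypot(dx, dy))
    return azimuth, elevation

# Node 10 m east, 5 m north and 3 m above the device.
az, el = direction_to_node((10.0, 5.0, 3.0), (0.0, 0.0, 0.0))
```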
[0062] Also, in some embodiments the network node may transmit, i.e., broadcast or multicast, content using all available beams together with direction tags for each beam. One direction tag may comprise a direction of one beam and be used for providing directional content based on the direction of the beam, wherein the direction of the beam may correspond to a direction to which the mobile device is directed. Direction tags may be used together with location tags.
[0063] Alternatively, or in addition, in some embodiments identities of the beams may be transmitted. Thus, if the direction of each beam has been mapped to the identity of each beam beforehand, the received identity may be used for determining the direction of a beam and for providing directional content based on the determined direction of the beam.
[0064] Moreover, in some embodiments the target node may use an angle of arrival of at least one beam to generate directional information, which may be further used to provide the directional content. The angle of arrival may be used together with location tags, direction tags and/or beam identities as well. The mobile device may also select the directional content based on tile-based encoding, for example, by correlating the sub-streams within the transmitted 360-degree content with the viewing direction information. Sub-streams may indicate independent flows of traffic for each tile of the directional media content. Moreover, sub-streams may be isolated by the network, for example, using a dynamic adaptive streaming over Hypertext Transfer Protocol, HTTP, DASH-aware network element, DANE. Thus, each sub-stream may represent a tile which in turn represents a corresponding spatial display position.
[0065] FIGURE 2 illustrates an exemplary network scenario for transmitting directional media content in accordance with at least some embodiments. With reference to FIGURE 1, server 210 may correspond to server 110, base station 220 may correspond to base station 120, beams 230 may correspond to beams 130 and mobile devices 240 may correspond to mobile devices 140. In addition, TRPs 225 are shown in FIGURE 2. TRPs 225 may be connected to BS 220 via wired connections while TRPs 225 may use wireless communications for communicating with mobile devices 240 via beams 230. In some embodiments, multiple BSs 220 may be used instead of, or in addition to, TRPs 225.
[0066] TRPs 225 may transmit the same media content over multiple beams 230. The transmissions over each beam may also comprise a location tag of the TRP 225 in question. TRPs 225 may be referred to as source transmitters and the location tag of each TRP 225 may indicate to mobile devices 240 the relative position of each TRP 225. The location tag may be associated with an identity of TRP 225 for providing appropriate directional media content. In order to provide the best viewing experience and coverage within the area, multiple TRPs 225 may be deployed within the coverage area.
[0067] Alternatively, or in addition, transmissions over each beam may comprise beam information, which is specific to a beam. The beam information may comprise, for example, a direction tag, i.e., the direction of the beam. In some embodiments, each beam may be mapped to a certain tile of the content. A direction tag may be seen as a reference to tiled content information, i.e., a tile of a 3D viewing space, within the 3D viewing space of the directional media content. Hence, the transmission of the beam may comprise information identifying substantially one tile, or sub-stream, of the content to be shown to a user.
[0068] In some embodiments, the beam information may comprise an identity of the beam. Direction of the beam may be pre-determined and thus the identity of the beam may be used as a reference to a tile of a 3D viewing space.

[0069] In some embodiments, an application instance may be located at edge cloud/content server 210 and location or direction tagging may be performed by server 210. Concerning the application instance, an application server may be located at a remote server but in case of low-latency applications, such as VR, the application server associated with the application instance may be hosted/located at the edge cloud. An application client may then be in mobile devices 240, which may be used for viewing the directional media content.
[0070] Due to the directional characteristics of beams 230, mobile devices 240 may receive directional media content from a source network node and show the appropriate directional content to a user depending on the actual viewing direction. The actual viewing direction may correspond to a direction to which mobile device 240 is directed. Mobile devices 240 may also determine a location of the source network node during the process. Due to the highly directional properties of the transmissions on millimeter-wave bands, using beams 230, a viewing direction of a user associated with mobile device 240 may impact a candidate set of strongest beams received by mobile device 240. The viewing direction of the user may indicate the direction the user is looking at in the 3D space, because mobile device 240 may be directed to that direction. Hence, appropriate directional media content may be shown to the user based on the estimated viewing direction of the user.
[0071] The candidate set of strongest beams may be similar to a neighbor cell list in legacy systems, e.g., LTE, wherein mobile devices 240 may maintain a list of strongest cells and, possibly, relatively weaker neighbor cells. In a beam-based system mobile devices 240 may also maintain information about a strongest set of beams received at any point in time. Thus, mobile devices 240 may provide directional media content to a user, i.e., a tile, based on the strongest set of beams, because the strongest set of beams may indicate a direction to which mobile device 240 is directed. If, for example, a combined direction tag of all the strongest beams were used, it would enable more accurate estimation of the viewing direction. At least in case of multicast or broadcast with Single-Frequency Network, SFN, type of transmissions, mobile device 240 may receive the data from multiple beams. The transmitters, e.g., BSs or TRPs, may coordinate and synchronize data transmissions using the same physical resource blocks, thereby improving the received signal quality at the mobile device. The mobile device may be able to identify the beams using metadata, as will be described later on.

[0072] FIGURE 3 illustrates an exemplary scenario in accordance with at least some embodiments. With reference to FIGURE 1, server 310 may correspond to server 110, BS 320 may correspond to BS 120, and mobile device 340 may correspond to mobile device 140. Also, with reference to FIGURE 2, TRP 325 may correspond to TRP 225. In the example of FIGURE 3, 3D viewing space 350 may comprise a tile 350a or part of tile 350a that is shown to a user. As an example, tile 350a may be presented on a display of a 3D VR headset. In some embodiments, tile 350a may be referred to as a part of 3D viewing space 350, which is related to directional media content and shown to the user. In general, 3D viewing space 350 may be referred to as media content. A user may view a part of tile 350a as well, and for that level of accuracy a mobile device may either estimate a change in received beam angle-of-arrival accurately or use some other sensors within the mobile device (e.g., a gyroscope and an accelerometer).
[0073] Server 310 or BS 320 may transmit location information related to TRP 325 and/or direction or identity information related to beams along with media content. Hence, UE 340 may receive a transmission using multiple beams from TRP 325. The transmission may comprise location information related to TRP 325, possibly along with direction and/or identity information related to beams and the media content. Mobile device 340 may, upon receiving said information, determine a direction of a beam. Consequently, mobile device 340 may determine tile 350a based on the direction of the beam. Alternatively, tile 350a may be determined based on an angle of arrival of said beam. Tile 350a may be related to a direction the user is looking at, i.e., tile 350a of 3D viewing space 350, and it may correspond to the direction of said one beam. Tile 350a may hence correspond to a direction to which mobile device 340 is directed.
[0074] Tile 350a may be regenerated from the data sent over the beams. Consequently, mobile device 340 may provide tile 350a of 3D viewing space 350 to a user via a display. Hence, some embodiments enable mobile device 340, which may be capable of receiving beam-formed transmissions and attached to a 3D display, to provide and show directional media content to the user without any additional equipment within the mobile device or externally. For example, wireless head mounted devices used to view directional media content currently require external sensors and initial manual calibration to estimate the viewing direction of the user relative to the directional content being viewed. In some embodiments, directional properties of transmissions, e.g., 5G transmissions, may be combined to estimate the viewing direction of the user without any additional equipment or sensors.
[0075] Calibration may be done in the application layer of the mobile device with active intervention by a user. For example, the calibration may be performed by an application client installed in the mobile device for viewing the directional content. The application client may request the user to indicate manually when a particular direction is seen within the directional media content, which may then be used along with the beams as a reference point to correlate the directional media content and the received beams. Alternatively, the calibration may also be done by a multi-access edge cloud or a network, based on an indication from a user that an incorrect viewing direction is shown by the mobile device, or when the application of the mobile device detects an incorrect viewing direction and activates an update. The indication of the incorrect viewing direction may act as a trigger for updating the mapping between the location and direction tags. Transmission of information related to these interactions may take place on the user plane.
[0076] Some embodiments may be applicable in a scenario wherein there is only one BS 320 or TRP 325. Mobile device 340 may recreate tile 350a or part of 3D viewing space 350 using the received information, such as the relative location of TRP 325, which may be received together with the transmitted media content. The relative location of TRP 325 may be within the 3D space.
[0077] In addition, or alternatively, mobile device 340 may also use an angle of arrival of the received beams to recreate tile 350a of 3D viewing space 350. Mobile device 340 may have multiple antennas and hence the angle of arrival may be estimated accurately by using said multiple antennas. For example, if a user changes the viewing direction by moving the mobile device, e.g., by rotating his/her head in case of VR headsets, the angle of arrival of the received strongest set of beams would change, which may be used to provide appropriate directional media content to the user or VR headsets. Hence, location of provided tile 350a of 3D viewing space 350 would change on 3D viewing space 350.
[0078] The main difference between the scenario of FIGURE 2 and those of FIGURE 1 or FIGURE 3 is that with the multiple TRPs of FIGURE 2 the set of strongest beams (possibly identified by beam IDs) may change as well. With multiple TRPs, the mobile device may use a TRP or a cell identity together with a beam identity to estimate the viewing direction, e.g., in combination with a direction tag. With a single TRP, however, there would be different sets of beams (e.g., with direction tags) from the same TRP to compute. Thus, the more information related to the direction tags the mobile device has, the higher would be the accuracy of the viewing direction.
[0079] For example, an increased amount of information would make it possible for the mobile device to estimate a change in viewing direction based on minor changes in received power levels of various beams and the corresponding angle-of-arrival. An increased amount of information may also imply added information related to the different sets of received beams from the TRPs, depending on the change in viewing direction of the user. Hence, an increased amount of information would enable mobile device 340 to make the directional estimation faster and more accurate. Mobile device 340 may then select the appropriate directional content from the received data and provide the appropriate directional content, i.e., tile 350a of 3D viewing space 350, to the user.
[0080] In some embodiments, all 3D media content 350 may be cached in a mobile device and then the content for tile 350a of 3D viewing space 350 may be selected based on received beams and/or estimated directionality, which may be calculated based on the strongest received beams and the location of TRPs 325. Thus, there would be no need to transmit media content continuously. Therefore, complexity may be minimized at lower protocol levels of a mobile device while enabling selection of the appropriate content, i.e., tile 350a of 3D viewing space 350, at the application layer. The lower protocol levels may comprise, e.g., the physical layer, and information about the strongest beams and/or estimated directionality may be transmitted from the physical layer to the application layer.
[0081] Location tagging may be done either statically or dynamically, depending on the scenario. For example, in an indoor viewing arena or a movie theatre, location and direction tagging may be done statically because the directional media content may be stored for a long time. For example, the directional media content may be stored in an edge-cloud server close to the base station or TRP, or locally within a private network, e.g., a 5G private network, depending on the deployment scenario. Alternatively, location and direction tagging may be done dynamically, for example, in scenarios where live or non-live media content is transmitted over wider areas with the content cached at mobile devices and shown locally. So if a location of a mobile device does not change relative to the base station transmitters, location tagging may be done statically, i.e., there would always be the same tags for the same location, irrespective of the media content that is transmitted. For example, in a movie theatre the locations of seats and the locations of the BSs or TRPs may be the same irrespective of the movie that is played. On the other hand, tagging may be done dynamically, e.g., in a museum or outdoor locations, such as stadiums, concert grounds, etc. Beams, transmitted by BSs or TRPs, may come from different directions for different mobile devices 340 and hence, the sets of strongest beams received from different TRPs 325 may be different as well. A set of strongest beams may be referred to as a candidate set of beams. That is to say, the set of strongest beams may depend at least partially on the viewing direction and location of each individual user. A mobile device may hence require some form of calibration, wherein a correlated direction may be fed back to the network periodically, or at any other time. The correlated direction may refer to a direction identified by the mobile device, i.e., the mobile device may identify that a certain beam, or a set of beams, corresponds to a certain tile of the directional media content and feed back the correlated direction to the network.
[0082] Calibration may also be done based on receiving a confirmation from a mobile device related to the estimated viewing direction. That is to say, the mobile device may confirm that a certain correlated direction is correct, i.e., a certain beam, or a set of beams, corresponds to a certain tile of the directional media content.
[0083] The confirmation may be received while setting up the directional media content. The network may transmit content for calibration while setting up the directional media content as well. The calibration content may be correlated with the direction tags based on user feedback to derive a correlated direction. The correlated direction may be estimated depending at least partially on the candidate beams and application level intelligence, for estimating the viewing direction based at least partially on the aggregated location tag information. Application layer intelligence may be used to fetch a tile of the directional media content and display the tile on the mobile device, based on the estimated viewing direction.
[0084] There may be no specific impacts to the connection setup procedure, since the data reception at the mobile device, or end-to-end application flow in the mobile device between the application client in the mobile device and the application server, may occur after a successful connection setup procedure and setup of bearers. A specific user feedback may be required to derive a correlated direction, similarly as in case of the calibration.
[0085] Furthermore, content may be divided into sub-streams, which may be transmitted separately. Hence, in some embodiments a tile or sub-stream and tile 350a of 3D viewing space 350 may be assumed to be the same. As an example, server 310 or BS 320 may split 3D viewing space 350 into a quantized set of tiles. In some embodiments, server 310 or BS 320 may use tile-based Dynamic Adaptive Streaming over Hypertext Transfer Protocol, DASH, sub-streams to deliver the content to mobile device 340. Wireless communications may be used to transmit 3D viewing space 350, comprising all the tiles, over the air interface. Then, mobile device 340 may select and provide appropriate content from the received sub-streams based on the estimated viewing direction of a user.
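The selection step might then reduce to a lookup from received sub-streams, as in the following sketch; the container structure and the `tile_index_fn` quantization helper are hypothetical, standing in for whatever tiling the server actually applied:

```python
def select_substream(substreams, viewing_direction, tile_index_fn):
    """Pick the sub-stream (tile) matching the estimated viewing
    direction. `substreams` maps a tile index to received sub-stream
    data; `tile_index_fn` is the same quantization the server used
    when splitting the 3D viewing space into tiles."""
    idx = tile_index_fn(*viewing_direction)
    return substreams.get(idx)  # None if that tile was not received
```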
[0086] FIGURE 4 illustrates an example related to estimation of a viewing direction based on received beams in accordance with at least some embodiments. The viewing direction may correspond to a direction to which the mobile device is directed. FIGURE 4 comprises first mobile device 440a and second mobile device 440b, which may be located within coverage areas of first TRP 425a and second TRP 425b. FIGURE 4 also shows two beams 430a1 and 430a2 associated with TRP 425a, and two beams 430b1 and 430b2 associated with TRP 425b. Naturally, there may be more than two beams per TRP.
[0087] With reference to FIGURE 1, mobile devices 440a and 440b may correspond to mobile devices 140 and beams 430a1, 430a2, 430b1 and 430b2 may correspond to beams 130. Also, 3D viewing spaces 450a and 450b, which demonstrate the direction the user is looking at, are shown in FIGURE 4. Viewing directions 450a and 450b may correspond to tile 350a of 3D viewing space 350 of FIGURE 3. Viewing directions 450a and 450b may also correspond to directions to which mobile devices 440a and 440b are directed, respectively.
[0088] In the example of FIGURE 4, tiles 450a and 450b may be determined based on the received beams, for example, by comparing the identities, IDs, of the received strongest beams, their TRP or cell IDs, and their angle-of-arrival with the corresponding mapping to the tiled content. In addition, the viewing directions may be determined based on the locations of TRPs 425a and 425b. Mobile devices 440a and 440b may first estimate their locations and then select the appropriate sub-streams, i.e., tiles 450a and 450b. In general, use of a large set of candidate beams for estimating the directionality improves the accuracy of the estimation of the direction in which mobile device 440a is directed.
[0089] The downlink transmission, comprising data such as media content, may be received by mobile device 440a using strongest beam 430b1, or set of beams, of TRP 425b, if the user of mobile device 440a is looking in the direction of TRP 425b, i.e., mobile device 440a is directed to TRP 425b. Similarly, mobile device 440b may receive media content using strongest beam 430a1, or set of beams, of TRP 425a, if the user of mobile device 440b is looking in the direction of TRP 425a.
[0090] If there are multiple TRPs, those may be transmitting the same data. Beams may be 3-dimensional, i.e., multiple beams may be sent in a direction within the horizontal axis while multiple beams may be sent in a direction within the vertical axis, wherein each beam has a length, and possibly a beam width.
[0091] FIGURE 5 illustrates an exemplary end-to-end implementation in accordance with at least some embodiments. For example, a live 360-degree VR video content may be distributed using the exemplary end-to-end implementation of FIGURE 5. Cameras 500 may be VR cameras, which may record live content from different viewing directions. Each of cameras 500 may generate tile 550a of 360-degree media content individually. Tile 550a of the 360-degree directional media content may be sent to a computer 505, which may stitch the content received from multiple cameras 500 to generate overall 360-degree content 550. With reference to FIGURE 3, overall 360-degree content 550 may correspond to 3D viewing space 350.
[0092] Moreover, computer 505 may encode overall 360-degree content 550, for example, using tile-based DASH encoding, wherein the 360-degree content 550 may be quantized into a set of tiles. Each tile 550a may represent a different viewing direction. In general, each tile 550a is associated with a sub-stream of transmission of overall 360-degree content 550. Overall 360-degree content 550 may be transferred to various mobile devices 540.
[0093] Computer 505 may transmit overall 360-degree content 550 to MEC or cloud server 510. MEC or cloud server 510 may process overall 360-degree content 550 by adding location tags of TRPs 525 and/or direction tags of beams into the overall directional media content, which may be transmitted, i.e., broadcasted and/or multicasted. Direction tags may indicate the viewing direction associated with each tile 550a within overall 360-degree content 550 and each direction tag may be associated with one beam, to denote the direction of the beam in question. Moreover, direction tags may be combined with, e.g., beam information. MEC or cloud server 510 may transmit overall 360-degree content 550 together with the location tags to BS 520.

[0094] BS 520 may transmit the received information directly to mobile device 540 or via TRPs 525. Mobile device 540 may, based on the received transmission, determine information related to a received strongest beam, e.g., a direction of the strongest beam. Consequently, mobile device 540 may provide appropriate content to the mobile device using the direction of the strongest beam, which may correspond to the real-time viewing direction of the user. Said content may be displayed on the mobile device to the user via a user interface. As an example, live video content may be shown from the appropriate VR camera 500, which may be changed if the viewing direction of the user is changed, i.e., tile 550a may be changed. Changing the viewing direction of the user may change the strongest received beam as well.
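As a rough illustration of this server-side tagging step, the sketch below attaches location and direction tags to each sub-stream before broadcast; the field names and data shapes are assumptions for the example, not a defined transport format:

```python
def tag_substreams(substreams, trp_locations, beam_directions):
    """Attach location tags (TRP positions) and direction tags
    (per-beam directions) to each sub-stream before broadcast.
    `substreams` maps tile IDs to encoded payloads."""
    tagged = []
    for tile_id, payload in substreams.items():
        tagged.append({
            "tile_id": tile_id,
            "payload": payload,
            "location_tags": trp_locations,      # e.g. {trp_id: (x, y, z)}
            "direction_tags": beam_directions,   # e.g. {beam_id: (az, el)}
        })
    return tagged
```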
[0095] Alternatively, or in addition, MEC or cloud server 510 may transmit the locations of TRPs 525 and the directional properties, e.g., direction tags, of beams to BS 520. BS 520 may then transmit overall 360-degree content 550 over all the available beams to mobile device 540 directly, or via TRPs 525, along with the locations of TRPs 525 and the directional properties of the beams. Mobile device 540 may thus receive the transmitted overall 360-degree content and estimate the real-time viewing direction of the user based on the directional properties of the strongest beams (candidate set) and the locations of TRPs 525.
[0096] Thus, estimation of the viewing direction may be dependent on mobile device 540. Mobile device 540 may determine the real-time viewing direction of the user and show the appropriate tile 550a to the user. Said appropriate tile 550a may be associated with appropriate VR camera 500, which may reflect the real-time viewing direction of the user if they were at the location where the content is being generated, e.g., at a football stadium.
[0097] The same principles may be applied to any scenario where directional content is generated, encoded and transported to a user, for example, in a VR viewing arena or movie theater. The relationship between the received beams and the viewing direction may be estimated as shown in association with FIGURE 4. An application in mobile device 540 may receive information related to the strongest beams received by mobile device 540, from the physical layer of mobile device 540, and based on this information estimate the viewing direction of the user. Consequently, mobile device 540 may show the appropriate tile 550a to the user.

[0098] In general, in some embodiments at least one strongest beam may be identified at a lower layer, e.g., a physical layer, of a mobile device using a Beam Reference Signal, BRS. Identifying the at least one strongest beam may comprise comparing the received powers of all the beams. BRS may occupy 8 subcarriers (the 5th to 12th subcarrier) in every Resource Block, RB, except the 18 RBs at the center of the frequency band. The 18 RBs at the center may be reserved for Primary Synchronization Signals, PSS, Secondary Synchronization Signals, SSS, and Extended Synchronization Signals, ESS. BRS may be transmitted at every symbol (i.e., symbols 0 to 13) in subframes 0 and 25. The data may be based on pseudo-random data, e.g., a Gold sequence.
[0099] The detected at least one strongest beam may be signaled from the physical layer to an application layer of mobile device 540. Alternatively, a VR or directional media content application that is running on mobile device 540 and showing the directional content, or selecting the content to be shown, to the user may frequently fetch this information in real-time from the physical layer, to keep track of the viewing direction of the user of mobile device 540. An overview of possible metadata may be signaled within a beam. Metadata may need to be extracted for detecting the candidate set of beams in the physical layer of the mobile device. In some embodiments, metadata may refer to a set of data that describes and gives information about other data. For example, PSS/SSS, ESS, BRS, etc., may be described as possible metadata, which makes it possible for the mobile device to detect the beam information. The candidate set of beams or their identities may be signaled to the Internet Protocol, IP, layer where the application may be located. The candidate set of beams, SCB, may be calculated as: SCB = {Beam IDs of f(NRB)}, where f(NRB) is a function that returns the N strongest received beam signals, in terms of reference signal received power, quality, etc. Thus, the mobile device may use the metadata to identify beams. Based on the calculation of SCB the mobile device may determine the strongest set of received beams, i.e., the candidate set of beams, which may then be used to estimate the viewing direction.
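A minimal sketch of computing SCB from per-beam power measurements follows; ranking by RSRP and the shape of the measurement container are assumptions for illustration (quality metrics could be ranked the same way):

```python
def candidate_set(beam_measurements, n_strongest=4):
    """Compute the candidate set of beams, SCB: the identities of the
    N strongest received beams, ranked here by reference signal
    received power (RSRP) in dBm."""
    ranked = sorted(beam_measurements.items(), key=lambda kv: kv[1], reverse=True)
    return [beam_id for beam_id, _ in ranked[:n_strongest]]

# Powers extracted from BRS metadata at the physical layer.
scb = candidate_set({"a1": -78.0, "a2": -91.5, "b1": -80.2, "b2": -99.0}, 2)
# scb == ["a1", "b1"]; this set is signalled up to the IP layer,
# where the application resides.
```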
[00100] In some embodiments, a mapping of beams to tiles of 3D viewing space may be used. Such a mapping may be seen as a mapping of beams to all the possible viewing directions of the user, so that a combination of beams received from different transmitters corresponds to one possible viewing direction. The viewing directions may be quantized. As an example, if there are 100 possible combinations of beams, the number of possible viewing directions may be 100 as well. A first beam combination may correspond to a first possible viewing direction and a second beam combination may correspond to a second possible viewing direction, etc. That is to say, the first beam combination may correspond to a first tile and the second beam combination may correspond to a second tile. Such a mapping may be signaled by the application server to the application client in the mobile device as mapping instructions. The mapping instructions may be referred to as a mapping table in some embodiments. The mapping instructions may include the relationship between the beam combinations signaled by the physical layer of the mobile device and the tiles.
[00101] Hence, the mobile device may receive mapping instructions from the network node, wherein the mapping instructions comprise mappings between beam combinations and tiles of the directional media content, select a part of the mapping instructions based on a combination of received beams and select the tile of the directional media content based on the selected part of the mapping instructions. FIGURE 6 illustrates a mapping table in accordance with at least some embodiments. The mapping table, i.e., the mapping instructions, may be transmitted by a network, e.g., a BS, to a mobile device. The mapping table may be used for estimating a viewing direction of a user, i.e., a direction to which the mobile device is directed to. The mapping table may be handled at the physical layer of the mobile device. That is to say, the mobile device may receive the mapping table and use the mapping table to estimate the viewing direction of the user at the physical layer. The estimated viewing direction of the user may be provided to the application layer of the mobile device, from the physical layer of the mobile device, to be displayed on a display of the mobile device.
[00102] In FIGURE 6, 3D viewing space 6501 corresponds to a viewing space for a first location of a first network node, e.g., a cell, TRP or a BS, and 3D viewing space 6502 corresponds to a viewing direction for a second location of a second network node. Hence, the mobile device may receive a set of candidate beams, e.g., a set of strongest beams {ax, by}, which may comprise a first beam {ax} or {ak} from the first network node and a second beam {by} or {bl} from the second network node. The first beam and the second beam together may indicate a row to be selected from a mapping table, and each row may correspond to a certain viewing direction. Hence, a part of the mapping instructions may refer to a row in the mapping table.
[00103] Tile 650a1 corresponds to a first, quantized viewing direction {X1, Y1, Z1}, which may be mapped to candidate beams {ax, by}. Similarly, tile 650aN corresponds to an Nth viewing direction {Xn, Yn, Zn}, which may be mapped to candidate beams {ak, bl}.

[00104] Based on the mapping table, the mobile device may hence estimate that a viewing direction of a user, i.e., a direction the mobile device is directed to in a 3D space, may be for example {X1, Y1, Z1} if beams {ax, by} are the strongest beams received from the first network node and the second network node, respectively. The estimated viewing direction would enable the mobile device to show the appropriate media content to the user, e.g., tile 650a1, within the viewing space of the directional media content. Similarly, if beams {ak, bl} are the strongest beams received from the first network node and the second network node, respectively, the mobile device may estimate that the viewing direction of the user corresponds to tile 650aN, i.e., {Xn, Yn, Zn}.
[00105] The mobile device may thus receive, from a network node, a mapping table. In the mapping table each row may comprise a mapping between one beam combination and one tile of the directional media content. The mobile device may select a row in the mapping table based on the estimated viewing direction of the user and also select the tile of the directional media content based on the selected row.
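In the spirit of FIGURE 6, a mapping table of this kind can be modelled as rows keyed by a beam combination; the beam IDs, tile names and coordinates below are illustrative placeholders:

```python
# Each row maps one combination of strongest beams (one per network
# node) to a quantized viewing direction, i.e., a tile.
MAPPING_TABLE = {
    frozenset({"ax", "by"}): ("tile_650a1", (0.0, 0.0, 1.0)),  # {X1, Y1, Z1}
    frozenset({"ak", "bl"}): ("tile_650aN", (0.7, 0.0, 0.7)),  # {Xn, Yn, Zn}
}

def lookup_tile(strongest_beams):
    """Select the row matching the received beam combination and
    return the corresponding tile, or None if the combination is
    not mapped (e.g., calibration would be needed)."""
    return MAPPING_TABLE.get(frozenset(strongest_beams))

tile = lookup_tile(["ax", "by"])  # -> ("tile_650a1", (0.0, 0.0, 1.0))
```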
[00106] The mapping instructions may also be provided based on identities of the received strongest beams and their corresponding angles-of-arrival estimated by the mobile device. With reference to FIGURE 6, this would imply a mapping between beams {ax, by} with respective angles-of-arrival {Ak, Al}, which would be mapped to the viewing direction {X1, Y1, Z1}, i.e., tile 650a1. Instead of the viewing direction {X1, Y1, Z1} the mapping may be made directly to a tile within the directional content, e.g., tile 650a1 may be mapped directly to the received radio environment information. The received radio environment information may be in terms of identities of beams, angles-of-arrival, etc., and it may be mapped to a tile within the directional content.
[00107] The mobile device may thus select an appropriate tile and display the appropriate sub-stream on a display of the mobile device. It is noted that in some embodiments there is no information exchange between the mobile device and the network node: the network node does not have any information about the real-time viewing direction of the user, i.e., the direction to which the mobile device is directed, and does not adapt the beam directionality based on the viewing direction of the user. Based on the available received information, the mobile device may estimate the viewing direction and display the appropriate directional content. A similar mapping could be applied for viewing space 6502, when the user is viewing directional content from a different location and source node, with a location-specific context.
[00108] Thus, a 3D viewing space may be formed of a finite set of quantized viewing directions. With reference to FIGURE 3, one quantized viewing direction, e.g., 650a1, may correspond to one tile 350a of 3D viewing space 350. Based on the mapping instructions, the application within the mobile device may show directional content to the user, depending on the real-time viewing direction of the user which may be identified based on the received beams. The beams may be highly directional within the 3D space and BRS may be utilized to identify and segregate the received beams.
[00109] In some embodiments, the mapping instructions, defining the relationship between the beams and tiles of 3D viewing space, may be transmitted locally. Alternatively, locations of the TRPs and directions of the beams may be signaled to enable the mobile device to estimate the mapping as well.
[00110] Mobility may depend on the type of directional content viewed by the user and the application within the mobile device. For example, if the user is watching a movie, the real-time viewing direction within the viewing space may be relevant for a certain location of the mobile device. Such content may be referred to as static or location-independent directional media content, for which a simple mapping between the detected beams and the quantized viewing directions, i.e., tiles of the directional media content, would be sufficient. A candidate set of beams may, however, change depending on the movement of the mobile device, which needs to be covered using a larger number of TRPs in order to provide coverage within the entire region where the content is transmitted.
[00111] Embodiments may be used for dynamic or location-dependent directional content. The viewed content may depend not only on the real-time viewing direction but also on the location of the mobile device, e.g., within a museum, exhibition centers or other scenarios. Different content may be shown to users to give them the full virtual-reality experience of being in the virtual world as the users move around within the physical space.
[00112] In some embodiments, different TRPs may transmit different directional content at different times. One transmission of the directional content may be relevant for a current location of the mobile device. Hence the content could keep changing as the user moves around, while still maintaining similar mapping, i.e., relationship between the beams and the viewing directions. Thus, the same beam identities may be reused while transmitting different content from different TRPs as the user is moving around, with each set of TRPs covering a finite region.
[00113] Alternatively, the mapping instructions may contain additional cell identity information indicating which location-specific content should be selected to show a specific tile to the user. Locations may be associated with cell identities, which may be determined by the physical layer of the mobile device and signaled to the application layer. A 3D viewing space for a first location may be associated with an identity of a first cell and a second location may be associated with an identity of a second cell of the transmitting first and second BSs, respectively. In the cell search procedure, the mobile device may obtain time and frequency synchronization with the cell and the cell identities from PSSs and SSSs. The mapping instructions may therefore be cell-specific. Cell-specific mapping instructions enable location specificity, since a cell would have a limited coverage area. Moreover, cell-specific mapping instructions may be advantageous especially in dynamic scenarios, wherein users may be moving around.
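A location-dependent variant of the earlier mapping-table sketch could simply extend the key with the cell identity, so that the same beam identities may be reused for different location-specific content; again, all identifiers are illustrative:

```python
def lookup_tile_by_cell(mapping, cell_id, strongest_beams):
    """Cell-specific mapping lookup: the key combines the cell identity
    (obtained from PSS/SSS at the physical layer) with the received
    beam combination."""
    return mapping.get((cell_id, frozenset(strongest_beams)))

mapping = {
    ("cell_1", frozenset({"ax", "by"})): "room_1_tile_3",
    ("cell_2", frozenset({"ax", "by"})): "room_2_tile_3",  # same beams, new content
}
print(lookup_tile_by_cell(mapping, "cell_2", ["ax", "by"]))  # -> room_2_tile_3
```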
[00114] According to some embodiments, an application for showing the directional content to a user, e.g., for virtual or augmented reality, may be installed by the user on a mobile device. Alternatively, the application may be pre-installed on the mobile device. In some embodiments, the application of the mobile device may be provided by the network infrastructure owner provisioning the content.
[00115] The selection of the content may be based on the scenario as well. For example, if the user is watching the directional content at home, principles similar to video-on-demand may apply. On the other hand, if the user is watching the content at a movie theater, the application may be provided by the theater entity and show content similar to linear content, wherein the content may be shown to users when the movie starts playing on the mobile device. In this scenario, the content may be shown once the user enables the application, and the application may fetch the content from the stream transmitted over-the-air. In outdoor scenarios such as stadiums, the users may also select from a wide range of available directional content based on a live event ongoing at a stadium, with the directional view provided according to some embodiments.
[00116] The application may be installed by the user or pre-installed by an owner of an infrastructure, depending on the ownership of the mobile device. Also, content selection may be done by the user by fetching the content, similar to video-on-demand. Content selection may be limited to pre-decided ones, similar to linear content. In outdoor scenarios with mobility, content may be selected using a combination of linear and video-on-demand.
[00117] Some embodiments may provide simple and efficient implementation of directional/immersive content, by using the unique characteristics of beam-based transmissions. Significant cost reductions in the mobile device may be achieved due to the lack of need for special equipment. Significantly improved technology adoption is possible as well, since essentially any mobile device, e.g., a 5G UE, may support reception and efficient display of directional/immersive content.
[00118] FIGURE 7 illustrates an example apparatus capable of supporting at least some embodiments. Illustrated is device 700, which may comprise, for example, mobile device 140, e.g., a UE, or BS 120, such as a network node of FIGURE 1. Comprised in device 700 is processor 710, which may comprise, for example, a single- or multi-core processor wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core. Processor 710 may comprise, in general, a control device. Processor 710 may comprise more than one processor. A processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings or a Steamroller processing core produced by Advanced Micro Devices Corporation. Processor 710 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor. Processor 710 may comprise at least one application-specific integrated circuit, ASIC. Processor 710 may comprise at least one field-programmable gate array, FPGA. Processor 710 may be means for performing method steps in device 700. Processor 710 may be configured, at least in part by computer instructions, to perform actions.
[00119] A processor may comprise circuitry, or be constituted as circuitry or circuitries, the circuitry or circuitries being configured to perform phases of methods in accordance with embodiments described herein. As used in this application, the term "circuitry" may refer to one or more or all of the following: (a) hardware-only circuit implementations, such as implementations in only analog and/or digital circuitry, and (b) combinations of hardware circuits and software, such as, as applicable: (i) a combination of analog and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions, and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
[00120] This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or a portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
[00121] Device 700 may comprise memory 720. Memory 720 may comprise random-access memory and/or permanent memory. Memory 720 may comprise at least one RAM chip. Memory 720 may comprise solid-state, magnetic, optical and/or holographic memory, for example. Memory 720 may be at least in part accessible to processor 710. Memory 720 may be at least in part comprised in processor 710. Memory 720 may be means for storing information. Memory 720 may comprise computer instructions that processor 710 is configured to execute. When computer instructions configured to cause processor 710 to perform certain actions are stored in memory 720, and device 700 overall is configured to run under the direction of processor 710 using computer instructions from memory 720, processor 710 and/or its at least one processing core may be considered to be configured to perform said certain actions. Memory 720 may be at least in part external to device 700 but accessible to device 700.
[00122] Device 700 may comprise a transmitter 730. Device 700 may comprise a receiver 740. Transmitter 730 and receiver 740 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard. Transmitter 730 may comprise more than one transmitter. Receiver 740 may comprise more than one receiver. Transmitter 730 and/or receiver 740 may be configured to operate in accordance with Global System for Mobile communication, GSM, Wideband Code Division Multiple Access, WCDMA, 5G, Long Term Evolution, LTE, IS-95, Wireless Local Area Network, WLAN, Ethernet and/or Worldwide Interoperability for Microwave Access, WiMAX, standards, for example.
[00123] Device 700 may comprise a Near-Field Communication, NFC, transceiver 750. NFC transceiver 750 may support at least one NFC technology, such as Bluetooth, or similar technologies.
[00124] Device 700 may comprise User Interface, UI, 760. UI 760 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing device 700 to vibrate, a speaker and a microphone. A user may be able to operate device 700 via UI 760, for example to accept incoming telephone calls, to originate telephone calls or video calls, to browse the Internet, to manage digital files stored in memory 720 or on a cloud accessible via transmitter 730 and receiver 740, or via NFC transceiver 750, and/or to play games.
[00125] Device 700 may comprise or be arranged to accept a user identity module 770. User identity module 770 may comprise, for example, a Subscriber Identity Module, SIM, card installable in device 700. A user identity module 770 may comprise information identifying a subscription of a user of device 700. A user identity module 770 may comprise cryptographic information usable to verify the identity of a user of device 700 and/or to facilitate encryption of communicated information and billing of the user of device 700 for communication effected via device 700.
[00126] Processor 710 may be furnished with a transmitter arranged to output information from processor 710, via electrical leads internal to device 700, to other devices comprised in device 700. Such a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 720 for storage therein. Alternatively to a serial bus, the transmitter may comprise a parallel bus transmitter. Likewise processor 710 may comprise a receiver arranged to receive information in processor 710, via electrical leads internal to device 700, from other devices comprised in device 700. Such a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 740 for processing in processor 710. Alternatively to a serial bus, the receiver may comprise a parallel bus receiver.

[00127] Device 700 may comprise further devices not illustrated in FIGURE 7. For example, where device 700 comprises a smartphone, it may comprise at least one digital camera. Some devices 700 may comprise a back-facing camera and a front-facing camera, wherein the back-facing camera may be intended for digital photography and the front-facing camera for video telephony. Device 700 may comprise a fingerprint or face sensor arranged to authenticate, at least in part, a user of device 700. In some embodiments, device 700 lacks at least one device described above. For example, some devices 700 may lack NFC transceiver 750 and/or user identity module 770.
[00128] Processor 710, memory 720, transmitter 730, receiver 740, NFC transceiver 750, UI 760 and/or user identity module 770 may be interconnected by electrical leads internal to device 700 in a multitude of different ways. For example, each of the aforementioned devices may be separately connected to a master bus internal to device 700, to allow for the devices to exchange information. However, as the skilled person will appreciate, this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the embodiments.
[00129] FIGURE 8 is a flow graph of a first method in accordance with at least some embodiments. The phases of the illustrated first method may be performed by mobile device 140, such as a UE, or by a control device configured to control the functioning thereof, possibly when installed therein.
[00130] The first method may comprise, at step 810, receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content. The first method may also comprise, at step 820, estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams. In addition, the first method may comprise, at step 830, selecting a tile of the directional media content based on the estimated viewing direction. Finally, the first method may comprise displaying the tile of the directional media content on a mobile device.
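For illustration, the flow of FIGURE 8 could be expressed as the following sketch, where all four callables are hypothetical stand-ins for the receiver, direction estimator, tile selector and display of an actual mobile device:

```python
def first_method(receive_fn, estimate_fn, select_fn, display_fn):
    """One pass through the phases of FIGURE 8."""
    beams, content = receive_fn()          # step 810: beamformed reception
    direction = estimate_fn(beams)         # step 820: estimate viewing direction
    tile = select_fn(content, direction)   # step 830: select the matching tile
    display_fn(tile)                       # display the tile on the device
```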
[00131] FIGURE 9 is a flow graph of a second method in accordance with at least some embodiments. The phases of the illustrated second method may be performed by BS 120 or a network node in general, or by a control device configured to control the functioning thereof, possibly when installed therein.

[00132] The second method may comprise, at step 910, transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content. The second method may also comprise, at step 920, transmitting information for displaying a tile of the directional media content, e.g., on a display of a mobile device.
[00133] It is to be understood that the embodiments disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments and is not intended to be limiting.
[00134] Reference throughout this specification to one embodiment or an embodiment means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed.
[00135] As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations.
[00136] In an exemplary embodiment, an apparatus, such as, for example, a terminal or a network node, may comprise means for carrying out the embodiments described above and any combination thereof.
[00137] In an exemplary embodiment, a computer program may be configured to cause a method in accordance with the embodiments described above and any combination thereof. In an exemplary embodiment, a computer program product, embodied on a non-transitory computer readable medium, may be configured to control a processor to perform a process comprising the embodiments described above and any combination thereof.
[00138] In an exemplary embodiment, an apparatus, such as, for example, a terminal or a network node, may comprise at least one processor, and at least one memory including computer program code, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform the embodiments described above and any combination thereof.
[00139] Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the preceding description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
[00140] While the foregoing examples are illustrative of the principles of the embodiments in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.
[00141] The verbs "to comprise" and "to include" are used in this document as open limitations that neither exclude nor require the existence of also un-recited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of "a" or "an", that is, a singular form, throughout this document does not exclude a plurality.
INDUSTRIAL APPLICABILITY
[00142] At least some embodiments find industrial application in wireless communication networks, wherein video or directional media content is transmitted.
ACRONYMS LIST
3GPP 3rd Generation Partnership Project
AR Augmented Reality
BRS Beam Reference Signal
DANE DASH-aware network element
DASH Dynamic Adaptive Streaming over Hypertext Transfer Protocol
eMBB enhanced Mobile BroadBand
ESS Extended Synchronization Signal
GSM Global System for Mobile communication
HTTP Hypertext Transfer Protocol
IoT Internet of Things
IP Internet Protocol
LTE Long-Term Evolution
M2M Machine-to-Machine
MEC Multi-access Edge Computing
NFC Near-Field Communication
NR New Radio
PSS Primary Synchronization Signal
RAT Radio Access Technology
RB Resource Block
SFN Single-Frequency Network
SIM Subscriber Identity Module
SSS Secondary Synchronization Signal
TRP Transmit-Receive Point
UE User Equipment
UI User Interface
VR Virtual Reality
WCDMA Wideband Code Division Multiple Access
WiMAX Worldwide Interoperability for Microwave Access
WLAN Wireless Local Area Network
REFERENCE SIGNS LIST
[Reference signs table published as images (imgf000036_0001, imgf000037_0001) in the original; not reproducible as text.]

CLAIMS:
1. An apparatus comprising means for:
- receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content;
- estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams;
- selecting a tile of the directional media content based on the estimated viewing direction; and
- displaying the tile of the directional media content on a mobile device.
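Purely as an illustrative sketch of the flow recited in claim 1 (receive beams, estimate a viewing direction from their directional properties, select a tile, display it), the following Python fragment shows one way the steps could fit together. The Beam type, the RSRP-based estimate and the 30-degree tile grid are assumptions made for this sketch, not features disclosed in the application; the same Beam type and select_tile helper are reused in the sketches after claims 4, 11 and 14.

```python
# Hypothetical sketch of the claim-1 flow; all names are invented here.
from dataclasses import dataclass

@dataclass
class Beam:
    beam_id: int
    azimuth_deg: float    # beam direction (a "directional property")
    elevation_deg: float
    rsrp_dbm: float       # measured received power on this beam

def estimate_viewing_direction(beams):
    """Take the direction of the strongest received beam as a proxy for
    the direction the device (and hence the viewer) is facing."""
    best = max(beams, key=lambda b: b.rsrp_dbm)
    return best.azimuth_deg, best.elevation_deg

def select_tile(direction, tile_size_deg=30.0):
    """Quantize an (azimuth, elevation) estimate onto a tile index of the
    directional (e.g. 360-degree) media content."""
    az, el = direction
    return int(az % 360.0 // tile_size_deg), int((el + 90.0) // tile_size_deg)

beams = [Beam(0, 10.0, 0.0, -85.0), Beam(1, 40.0, 5.0, -92.0)]
tile = select_tile(estimate_viewing_direction(beams))
print(tile)  # -> (0, 3): the tile that would then be decoded and displayed
```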
2. The apparatus of claim 1, wherein the means are further configured to receive the transmission as a broadcast transmission or a multicast transmission.
3. The apparatus of any preceding claim wherein the means are further configured to perform:
- selecting at least one strongest beam from the set of beams; and
- selecting the tile of the directional media content based on the selected at least one strongest beam.
4. The apparatus of any preceding claim wherein the means are further configured to perform:
- selecting a set of strongest beams from the set of beams; and
- selecting the tile of the directional media content based on the selected set of strongest beams.
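As a hedged sketch of claims 3 and 4, selection of the strongest beam or of a set of strongest beams could look as follows, reusing the hypothetical Beam type from the sketch after claim 1; the power-weighted circular mean is one plausible way, among many, to fuse several beam directions when the device sits between two beam lobes:

```python
import math

def strongest_beams(beams, n=2):
    """Claim 3 (n=1) or claim 4 (n>1): keep the n strongest beams."""
    return sorted(beams, key=lambda b: b.rsrp_dbm, reverse=True)[:n]

def combined_azimuth_deg(beams):
    """Power-weighted circular mean of the selected beams' azimuths."""
    x = y = 0.0
    for b in beams:
        w = 10 ** (b.rsrp_dbm / 10)  # dBm -> linear mW weight
        x += w * math.cos(math.radians(b.azimuth_deg))
        y += w * math.sin(math.radians(b.azimuth_deg))
    return math.degrees(math.atan2(y, x)) % 360.0
```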
5. The apparatus of any preceding claim wherein the means are further configured to perform:
- receiving information about the at least one beam, wherein said information comprises an identity of the at least one beam;
- selecting the tile of the directional media content based on the identity of the at least one beam.
6. The apparatus of any preceding claim wherein the means are further configured to perform:
- receiving mapping instructions from the network node, wherein the mapping instructions comprise mappings between beam combinations and tiles of the directional media content;
- selecting a part of the mapping instructions based on a combination of received beams;
- selecting the tile of the directional media content based on the selected part of the mapping instructions.
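A minimal sketch of the mapping instructions of claim 6, assuming, purely for illustration, that the network node encodes them as a table from combinations of beam identities to tile indices; the concrete encoding is not specified in the application:

```python
# Hypothetical mapping instructions as signalled by the network node:
# each combination of beam identities maps to one tile of the content.
mapping_instructions = {
    frozenset({0}): (0, 1),
    frozenset({0, 1}): (1, 1),  # hearing beams 0 and 1 together -> tile (1, 1)
    frozenset({1}): (2, 1),
}

def tile_from_beams(received_beam_ids, mapping):
    """Select the part of the mapping matching the received combination."""
    return mapping.get(frozenset(received_beam_ids))

print(tile_from_beams([1, 0], mapping_instructions))  # -> (1, 1)
```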
7. The apparatus of any preceding claim wherein the means are further configured to perform:
- determining an angle of arrival of the at least one beam;
- selecting the tile of the directional media content based on the angle of arrival of the at least one beam.
8. The apparatus of any preceding claim wherein the means are further configured to perform:
- determining the angle of arrival of the at least one beam based on a direction tag associated with the at least one beam and a location tag of the network node.
9. The apparatus of any preceding claim wherein the means are further configured to perform:
- receiving information about a location of the network node;
- determining a direction of the at least one beam based on the location of the network node;
- selecting the tile of the directional media content based on the direction of the at least one beam.
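One possible reading of claims 7 to 9, sketched under the assumption that the direction tag is a departure azimuth at the network node and the location tag is a latitude/longitude pair; neither encoding is specified in the application:

```python
import math

def bearing_deg(from_lat, from_lon, to_lat, to_lon):
    """Claim 9: bearing from the network node's signalled location to the
    device (flat-earth approximation, adequate over cell-sized distances)."""
    d_lat = to_lat - from_lat
    d_lon = (to_lon - from_lon) * math.cos(math.radians((from_lat + to_lat) / 2))
    return math.degrees(math.atan2(d_lon, d_lat)) % 360.0

def angle_of_arrival_deg(direction_tag_deg):
    """Claim 8: if a beam's direction tag gives its departure azimuth at the
    node, the arrival azimuth at the device is the reciprocal bearing."""
    return (direction_tag_deg + 180.0) % 360.0
```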
10. The apparatus of any preceding claim wherein the means are further configured to perform:
- rendering the tile of the directional media content for the user.
11. The apparatus of any preceding claim wherein the means are further configured to perform:
- signalling information about the at least one beam and/or the estimated viewing direction of the user from a physical layer of the mobile device to an application layer; and
- selecting, at the application layer, the tile of the directional media content based on the signalled information.
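Claim 11's cross-layer signalling could be sketched, for illustration only, with a simple in-process queue standing in for whatever interface a real device would use between its physical layer and application layer; select_tile is the hypothetical helper from the sketch after claim 1:

```python
import queue

phy_to_app = queue.Queue()  # stand-in for the device-internal interface

# Physical layer side: push a beam measurement report upward.
phy_to_app.put({"beam_id": 0, "azimuth_deg": 10.0, "elevation_deg": 0.0,
                "rsrp_dbm": -85.0})

# Application layer side: consume the report and choose the tile to render.
report = phy_to_app.get()
tile = select_tile((report["azimuth_deg"], report["elevation_deg"]))
```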
12. The apparatus of any preceding claim, wherein the set of beams comprises beams in horizontal and vertical directions.
13. The apparatus of any preceding claim wherein the means are further configured to perform:
- receiving information related to multiple streams or programs via the at least one beam;
- selecting one of said multiple streams or programs; and
- displaying the selected stream or program on the display of the mobile device.
14. An apparatus comprising means for performing:
- transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content; and
- transmitting information for displaying a tile of the directional media content on a mobile device.
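On the transmit side, a hedged sketch of claim 14's idea: broadcast each tile on the beam whose direction corresponds to that tile, so that a receiver can infer the tile from the beam it hears best. The 30-degree tile grid and the dictionary schedule are assumptions of this sketch:

```python
def assign_tiles_to_beams(beam_azimuths_deg, tile_size_deg=30.0):
    """Pair each transmit beam with the tile whose angular range contains
    that beam's azimuth."""
    return {beam_id: int(az % 360.0 // tile_size_deg)
            for beam_id, az in beam_azimuths_deg.items()}

schedule = assign_tiles_to_beams({0: 10.0, 1: 40.0, 2: 75.0})
print(schedule)  # -> {0: 0, 1: 1, 2: 2}
```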
15. The apparatus of claim 14 wherein the means are further configured to perform transmission as a broadcast transmission or a multicast transmission.
16. The apparatus of claim 14 or 15, wherein said information comprises information related to the set of beams.
17. The apparatus of any one of claims 14 to 16, wherein said information comprises mapping instructions, and the mapping instructions comprise mappings between beam combinations and tiles of the directional media content.
18. The apparatus of any one of claims 14 to 17, wherein said information comprises an identity of a beam.
19. The apparatus of any one of claims 14 to 18, wherein said information comprises a location of the network node.
20. The apparatus of any one of claims 14 to 19, wherein the set of beams comprises beams in horizontal and vertical directions.
21. The apparatus of any one of claims 14 to 20, wherein the means are further configured to perform:
- transmitting information related to multiple streams or programs via each beam of the set of beams.
22. An apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to perform:
- receive from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content;
- estimate a viewing direction based at least partially on directional properties of at least one beam of the set of beams;
- select a tile of the directional media content based on the estimated viewing direction; and
- display the tile of the directional media content on a mobile device.
23. An apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to perform:
- transmit a transmission using a set of beams, wherein the transmission comprises at least directional media content; and
- transmit information for displaying a tile of the directional media content on a mobile device.
24. A method comprising:
- receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content;
- estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams;
- selecting a tile of the directional media content based on the estimated viewing direction; and
- displaying the tile of the directional media content on a mobile device.
25. A method comprising:
- transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content; and
- transmitting information for displaying a tile of the directional media content on a mobile device.
26. A non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least perform:
- receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content;
- estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams;
- selecting a tile of the directional media content based on the estimated viewing direction; and
- displaying the tile of the directional media content on a mobile device.
27. A non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least perform:
- transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content; and
- transmitting information for displaying a tile of the directional media content on a mobile device.
28. A computer program configured to perform:
- receiving from a network node a transmission using a set of beams, wherein the transmission comprises at least directional media content;
- estimating a viewing direction based at least partially on directional properties of at least one beam of the set of beams;
- selecting a tile of the directional media content based on the estimated viewing direction; and
- displaying the tile of the directional media content on a mobile device.
29. A computer program configured to perform:
- transmitting a transmission using a set of beams, wherein the transmission comprises at least directional media content; and
- transmitting information for displaying a tile of the directional media content on a mobile device.

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP18937200.6A EP3868030A4 (en) 2018-10-16 2018-10-16 Delivering and handling media content in a wireless communication network
CN201880100230.1A CN113228526A (en) 2018-10-16 2018-10-16 Delivering and processing media content in a wireless communication network
US17/284,165 US20210336684A1 (en) 2018-10-16 2018-10-16 Delivering and handling media content in a wireless communication network
PCT/FI2018/050752 WO2020079320A1 (en) 2018-10-16 2018-10-16 Delivering and handling media content in a wireless communication network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2018/050752 WO2020079320A1 (en) 2018-10-16 2018-10-16 Delivering and handling media content in a wireless communication network

Publications (1)

Publication Number Publication Date
WO2020079320A1 (en) 2020-04-23

Family

ID=70284543

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2018/050752 WO2020079320A1 (en) 2018-10-16 2018-10-16 Delivering and handling media content in a wireless communication network

Country Status (4)

Country Link
US (1) US20210336684A1 (en)
EP (1) EP3868030A4 (en)
CN (1) CN113228526A (en)
WO (1) WO2020079320A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9253592B1 (en) * 2014-06-19 2016-02-02 Amazon Technologies, Inc. Inter-device bearing estimation based on beamforming and motion data
KR102344045B1 * 2015-04-21 2021-12-28 Samsung Electronics Co., Ltd. Electronic apparatus for displaying screen and method for controlling thereof
JP2018107603A (en) * 2016-12-26 2018-07-05 オリンパス株式会社 Sensor information acquisition device, sensor information acquisition method, sensor information acquisition program and medical instrument
US11170409B2 (en) * 2017-05-19 2021-11-09 Abl Ip Holding, Llc Wireless beacon based systems utilizing printable circuits
KR20210066797A * 2018-08-29 2021-06-07 PCMS Holdings, Inc. Optical method and system for light field display based on mosaic periodic layer

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
US20150349863A1 * 2014-05-28 2015-12-03 Qualcomm Incorporated Method and apparatus for leveraging spatial/location/user interaction sensors to aid in transmit and receive-side beamforming in a directional wireless network
US20160366454A1 (en) 2015-06-15 2016-12-15 Intel Corporation Adaptive data streaming based on virtual screen size
US20170195044A1 (en) 2015-12-30 2017-07-06 Surefire Llc Receivers for optical narrowcasting
US20170303263A1 (en) * 2016-04-19 2017-10-19 Qualcomm Incorporated Beam reference signal based narrowband channel measurement and cqi reporting
WO2018082904A1 (en) * 2016-11-04 2018-05-11 Sony Corporation Communications device and method
US20180239419A1 (en) 2017-02-21 2018-08-23 WiseJet, Inc. Wireless transceiver system using beam tracking

Non-Patent Citations (1)

Title
See also references of EP3868030A4

Cited By (1)

Publication number Priority date Publication date Assignee Title
WO2021235829A1 * 2020-05-18 2021-11-25 Samsung Electronics Co., Ltd. Image content transmitting method and device using edge computing service

Also Published As

Publication number Publication date
EP3868030A1 (en) 2021-08-25
CN113228526A (en) 2021-08-06
EP3868030A4 (en) 2022-07-20
US20210336684A1 (en) 2021-10-28

Similar Documents

Publication Publication Date Title
JP7333802B2 (en) Metrics and messages to improve your 360-degree adaptive streaming experience
US20230305099A1 Sidelink angular-based and SL RRM-based positioning
CN110915217B (en) 360-degree video decoding method, device and readable medium
US10194431B2 (en) Wireless device location services
CN105829909B Wireless indoor positioning air interface protocol
US10148995B2 (en) Method and apparatus for transmitting and receiving data in communication system
US20230316684A1 (en) Terminal device, application server, receiving method, and transmitting method
WO2019006336A1 Weighted to spherically uniform PSNR for 360-degree video quality evaluation using cubemap-based projections
KR20230049086A (en) Indicating a Subset of Positioning Reference Signals for User Equipment Power Savings
WO2017071136A1 (en) Method and apparatus for assisted positioning
US11308703B2 (en) Augmented reality channel sounding system
US20210336684A1 (en) Delivering and handling media content in a wireless communication network
CN115053584A (en) Relative transmission spot beam configuration information
CN112788532A (en) Electronic device, user equipment, wireless communication method, and storage medium
CN113273184A (en) Method of mirroring 3D objects to a light field display
US20230397063A1 (en) Spatially aware cells
WO2023081197A1 (en) Methods and apparatus for supporting collaborative extended reality (xr)
WO2023076894A1 (en) Sidelink positioning
CN113906705A (en) Information transmission method, device, communication equipment and storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18937200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018937200

Country of ref document: EP

Effective date: 20210517