WO2014041353A2 - Media content distribution - Google Patents


Info

Publication number
WO2014041353A2
Authority
WO
WIPO (PCT)
Prior art keywords
media, user, data, event, media data
Application number
PCT/GB2013/052384
Other languages
French (fr)
Other versions
WO2014041353A3 (en)
Inventor
Tupac Martir
Original Assignee
Tupac Martir
Application filed by Tupac Martir filed Critical Tupac Martir
Priority to EP13771566.0A priority Critical patent/EP2896210A2/en
Publication of WO2014041353A2 publication Critical patent/WO2014041353A2/en
Publication of WO2014041353A3 publication Critical patent/WO2014041353A3/en


Classifications

    • H04N 7/173: Analogue secrecy systems; analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04L 12/18: Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L 12/1845: Broadcast or multicast in a specific location, e.g. geocast
    • H04L 51/222: Monitoring or handling of messages using geographical location information, e.g. messages transmitted or received in proximity of a certain spot or area
    • H04L 51/52: User-to-user messaging in packet-switching networks for supporting social networking services
    • H04N 21/21805: Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
    • H04N 21/2187: Live feed
    • H04N 21/41407: Specialised client platforms embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N 21/41415: Specialised client platforms involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
    • H04N 7/15: Conference systems

Definitions

  • The present invention relates to media content distribution.
  • Embodiments of the present invention relate to a system and method for improving a supporter's experience of an event at an event venue, and in particular to a system and method that links a supporter's device to the devices of other users, enabling the supporter to receive audio/video data captured by the other users.
  • Supporters have a limited viewpoint for viewing the event, usually being their own direct view of the event supplemented with such additional views on large screens at the event venue as are selected by organisers of the event. After an event has finished, a supporter's experience of the event is often even more limited - to their own captured video footage (if they were present themselves) or commercial footage if available. It would be desirable to provide supporters with access to additional viewpoints, and to provide supporters with facilities to increase their involvement in the event before, during and after the event actually takes place.
  • A media content distribution system comprising:
  • a media handling device for receiving and distributing media data; and a plurality of user devices, each user device having a camera function and being operable in a media transmit mode to transmit media data of an event occurring at an event venue to the media handling device, each user device being operable in a media receive mode to receive media data from the media handling device;
  • a first user device from amongst the plurality of user devices is operable to select a media source from a second user device among the user devices which are currently operating in the media transmit mode and the media handling device is operable to stream the media data of the event being received from the second user device to the first user device.
  • The media content is preferably a video stream, optionally with audio, but may instead be still images. It will be appreciated that media content transmitted from one device may at any given time be streamed to one device, no devices or many devices.
  • The media handling device may receive media content only from devices actually present at (for example within, or in some cases in the immediate vicinity outside) the event venue. This could be achieved using location information indicating the location of the image capture devices, or by limiting reception to devices registered to the event as part of the ticket buying process, for example.
  • In some cases, the selection of the media source by the user may be achieved by selecting a specific device which is currently transmitting, while in other cases the selection of the media source by the user may be achieved by selecting a geographical area or zone of the event venue, with the specific source then being allocated by the media handling device - for example by allocating the highest quality device in that area or zone.
  • A user may be able to access footage from a media source in a number of ways, for example by selecting from thumbnail versions of each media source in a drop-down list, ordered for example by media source location.
  • The first user device is operable to select the media source using an event venue representation indicating the location within the event venue of the user devices operating in the media transmit mode.
  • The event venue representation may for example be a plan or isometric view of the event venue. This enables the user to visually identify media sources in an area from which the user would like to view the event, and to select appropriate media sources for viewing based on location.
  • The user may select either a specific transmitting device, or an area or zone from which a specific transmitting device is to be allocated by the media handling device.
  • The location within the event venue of each user device operating in the media transmit mode may be determined either from a GPS position (or other positioning facility) determined by the user device, or from a seat or area identifier entered by the user or otherwise allocated to the user.
  • The event venue representation may comprise a first representation indicating the location of plural selectable areas within the event venue (a top level representation, or overview). Selection of an area within the event venue representation may cause the display of a second representation (a close up of the selected area) indicating the location within the selected area of user devices operating in the media transmit mode.
  • The top level representation permits the user to navigate to a location of interest within the event venue, and the second representation permits the user to actually select a media source from which to receive footage. This is useful because it may be difficult to distinguish between and select individual media content sources on a top level representation.
  • The event venue representation may have a deeper hierarchy than this and comprise a multi-tiered hierarchy of representations, as sketched after this list.
  • This may include a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area of user devices operating in the media transmit mode or a further intermediate level representation.
  • This arrangement is particularly suitable for very large event venues, such as a racing circuit, where two levels of resolution may be insufficient to properly navigate the geography of the event venue and select individual media sources.
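  • By way of illustration, a minimal sketch of such a multi-tiered hierarchy of representations follows; the class and field names are invented for this sketch, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MediaSource:
    source_id: str
    location: str          # e.g. a seat identifier such as "D1-C07"
    quality_rating: int    # e.g. 1 (low) to 5 (high)

@dataclass
class VenueArea:
    name: str                                                  # e.g. "D1" or "West Stand"
    sub_areas: List["VenueArea"] = field(default_factory=list)  # intermediate levels
    sources: List[MediaSource] = field(default_factory=list)    # leaf level: live devices

    def has_active_sources(self) -> bool:
        """An area is selectable/highlighted if it (or any sub-area) holds a live source."""
        return bool(self.sources) or any(a.has_active_sources() for a in self.sub_areas)

# Top level representation: the venue as a whole, with selectable areas beneath it.
venue = VenueArea("Stadium", sub_areas=[
    VenueArea("D1", sources=[MediaSource("user42", "D1-C07", quality_rating=4)]),
    VenueArea("D2"),   # no transmitting devices: not highlighted
])
print([a.name for a in venue.sub_areas if a.has_active_sources()])   # ['D1']
```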
  • A user database of device users may be provided, each user having a user profile associated with one or more user devices, the user profile comprising one or more of a current location of a user device associated with the user, a specification for the one or more user devices and quality information indicating the image capture capability of the one or more user devices.
  • The user database may comprise device information regarding at least some of the user devices, the device information indicating the media capture capabilities of the user device, wherein an indicator of media quality is displayed on the event venue representation in relation to at least some of the media sources, the indicator of media quality being based on the device information stored in the user database.
  • The event venue representation may be automatically presented on the first (viewing) user device to enable the selection of an alternative media source.
  • An alternative media source may instead be automatically selected by the media handling device (or the viewing user's device) based on its position relative to that of the originally selected media source.
  • A media content distribution method comprising the steps of:
  • a media handling device for receiving and distributing media data, the media handling device being arranged
  • a user device having a camera function, the user device being operable
  • the media receive mode to receive the media data of the event being received from the selected media source and streamed via the media handling device to the user device.
  • A media content distribution system comprising:
  • a media handler for receiving, storing and distributing media data; and a plurality of camera devices, each camera device having a camera function and being operable to upload media data of an event occurring at an event venue to the media handler, the media handler being operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data; and
  • When the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
  • This arrangement permits footage capturing users to upload footage they captured at an event, and users other than the uploading user to add supplemental content, for example text describing their own memories of the event or the scene captured by the footage, to be associated with the uploaded content, and to be viewable along with the uploaded content by subsequent viewers.
  • The playback device may be operable to display a user interface comprising an event venue representation indicating the locations within the event venue at which uploaded media data was captured;
  • The playback device may be operable to receive a user selection of uploaded media data via the user interface, and to display the selected media data along with any additional data stored in association with the selected media data.
  • The playback device may be operable to select stored media data with which description data is to be associated using the event venue representation. In this way, a user may be able to find a media source close to the area from which they watched the event, which should provide the closest resemblance to the experience they themselves had at the event.
  • The playback device may be a camera device which itself captured footage at the event. In other words, users present at the event can access footage of other users present at the event and swap memories and supplemental content. However, in some embodiments supplemental content may be added by users who were not at the event, but nonetheless have an interest in the event and the footage.
  • The camera devices may be operable to store media data locally during the event at the time of image capture, and to upload the media data to the media handler after the event. This prevents the media handler and related storage device from being overburdened with media content during the event and encourages users to be more selective about the material they upload. However, in the alternative users can selectively stream with or without storing.
  • The event venue representation may comprise a first representation indicating the location of plural selectable areas within the event venue, selection of an area within the event venue representation on the playback device causing the display of a second representation indicating the location within the selected area at which uploaded media data was captured.
  • The event venue representation may comprise a hierarchy of representations: a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, and one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area at which uploaded media data was captured or a further intermediate level representation.
  • The location within the event venue of each media source may be determined either from a GPS position (or other positioning service) determined by the user device which generated the media source, or from a seat or area identifier entered by the user of the device which generated the media source, or otherwise allocated to that user.
  • An indicator of the media quality of at least some of the uploaded media data may be displayed on the event venue representation.
  • A user database is provided which stores device information regarding at least some of the camera devices used to capture the media data stored at the media handler, the device information indicating the media capture capabilities of the user device, wherein the indicator of media quality displayed on the event venue representation is based on the device information stored in the user database.
  • Alternatively, quality-related information may be stored at the image capture device and transmitted to the media handling device along with the media content.
  • The event venue representation may indicate the availability of different uploaded media data for different time periods or occurrences during the event, the event venue representation being navigable at the playback device with respect to time or occurrences to present the user with access to the different uploaded media data.
  • The playback devices may be operable to select a particular time period or key occurrence during the event, whereupon the indication of media data shown on the event venue representation is updated to reflect uploaded media data corresponding to the selected time or key occurrence.
  • There may also be an event play mode in which the indications of media data are continuously updated with respect to a progression in time through the event.
  • A media content distribution method comprising the steps of:
  • submitting from the playback device to the media handler additional data for association with an item of the stored media data, the playback device being a different device than the device used to capture and/or upload the media data; and storing the additional data in association with the media data;
  • a media handler for receiving, storing and distributing media data, the media handler being operable to receive, from a plurality of camera devices each having a camera function, uploaded media data of an event occurring at an event venue;
  • a playback device for accessing media data stored by a media handler; wherein the playback device is operable
  • the media handler being operable to store the additional data in association with the media data
  • When the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
  • A media content distribution system comprising:
  • a media handler for receiving and distributing media data
  • each camera device having a camera function and being operable to capture and transmit media data of an event occurring at an event venue to the media handler, the media handler being operable to stream the captured media data to device users or store the captured media data for later access; and a playback device operable to access streamed or stored media data of an event via the media handler and display the accessed media data to a user;
  • the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • The technique of providing restricted access to content may be applied equally to real-time streamed content or to subsequent access to stored content.
  • The media handler may be operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device at the time of capture of the uploaded media data.
  • The number of media data sources made available to the playback device may be dependent on how close the playback device is to the event venue.
  • The media handler may be operable to provide access to more media data sources for playback devices relatively closer to the event venue than for playback devices relatively further from the event venue.
  • The subset of media data made available to the playback device may be dependent on a current orientation of the playback device with respect to the event venue.
  • The playback device may be operable to display a user interface comprising an event venue representation indicating the locations within the event venue at which media data was captured;
  • A playback device further than a predetermined distance from the event venue may be provided with access only to one or more exterior views of the event venue.
  • An exterior view of a particular event venue may be displayed at the display device when the display device is aimed towards the geographical location of that event venue.
  • The exterior views presented may correspond to the side of the event venue closest to the playback device, resulting in a 'telescope' effect.
  • A playback device within or relatively near the event venue may be provided with access to media data captured at locations within the event venue closest to the current location of the playback device; a sketch of this distance-dependent selection follows.
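  • As an illustration of this location-dependent restriction, a minimal sketch follows; the function names, distance thresholds and the tiering policy are invented assumptions, not taken from the patent:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def allowed_sources(playback_pos, venue_pos, interior_sources, exterior_views,
                    near_km=0.5, far_km=5.0):
    """Return the subset of media sources a playback device may access,
    selected in dependence on its distance from the event venue."""
    d = haversine_km(*playback_pos, *venue_pos)
    if d <= near_km:
        return interior_sources                 # at or immediately outside the venue
    if d <= far_km:
        return interior_sources[: max(1, len(interior_sources) // 2)]  # reduced set
    return exterior_views                       # beyond the threshold: exterior views only
```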
  • A media content distribution method comprising:
  • a playback device accessing streamed or stored media data of an event via the media handler and displaying the accessed media data to a user; wherein the playback device is restricted to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • a media handler for receiving and distributing media data, operable
  • the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • A playback device for accessing media data from a plurality of camera devices via a media handler, the camera devices capturing media data of an event occurring at an event venue, the playback device being operable to restrict access to only a subset of the media data being generated in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • Further aspects of the present invention are also envisaged, and include a computer program, a media handler, a content distribution device and a user device.
  • Figure 1 schematically illustrates a media content distribution system;
  • Figure 2 schematically illustrates an event venue representation of a sports stadium;
  • Figure 3 schematically illustrates a seating area display within the event venue representation of Figure 2;
  • Figure 4 schematically illustrates a selectable timeline for navigating footage available at different times within the event venue representation;
  • Figure 5 schematically illustrates a set of selectable buttons for accessing footage at different times or in relation to different occurrences at the event;
  • Figure 6 schematically illustrates an event venue representation of a theatre hall;
  • Figure 7 schematically illustrates a simplified event venue representation of the theatre hall;
  • Figure 8 schematically illustrates a seating area within the simplified event venue representation of Figure 7;
  • Figure 9 schematically illustrates an event venue representation of a racing circuit;
  • Figure 10 schematically illustrates a trackside area within the event venue representation of Figure 9;
  • Figure 11 schematically illustrates a process by which viewers at an event are able to access live footage of the event captured by other viewers;
  • Figure 12 schematically illustrates a process by which viewers at an event are able to upload footage of the event along with related information, and other users are able to access the uploaded footage and add their own related information;
  • Figure 13 schematically illustrates an augmented reality system in which a user is able to see a representation of an event venue by directing a portable device in the direction of the geographical location of the event venue;
  • Figure 14 schematically illustrates a view of the event venue as provided for by the system of Figure 13;
  • Figure 15 schematically illustrates a process by which a user of the augmented reality system of Figure 13 is able to access live or stored footage of an event taking place at the event venue;
  • Figures 16A and 16B schematically illustrate how a different level of access to footage is available when the user is at different distances from the event venue; and
  • Figures 17A and 17B schematically illustrate how a different subset of cameras/views is available when the user is at different positions around the event venue.
  • "Connect a fan" is, at its core, about connecting supporters/fans in the entertainment industry by sharing the view of the show, match, concert, etc. as they experience it from their own point of view, and sharing it with supporters/fans that are not at the location - the fan-zone spectators.
  • The app will be compatible with a number of electronic devices and platforms, for example an iPad, an iPod, an iPhone, an Android phone, an Android tablet, a Windows phone, a BlackBerry device and computers.
  • The app will run in real time, so that someone in the venue, using the camera in their device, becomes a window for other fan-zone spectators around the world to connect and see, listen to and experience the event on their own device.
  • The fan that is not in the venue gets to experience, from a specific seat or area, the way that the crowd experiences the show.
  • The fan-zone spectator can choose from any of the cameras available in the venue; each fan inside the venue will agree to terms and conditions to become a relay of this image.
  • Because of streaming speed, the venue/client will have to make fast broadband available, so that the fans inside the venue are able to stream the images and audio to the fan-zone spectators. It will be the fan-zone spectators' responsibility to have fast broadband in order to receive the feed. This will be part of the terms and conditions.
  • A "live switching" module may be available, in which the fan-zone spectator can draw on more than one camera at a time. There will be a protection so that the spectator is not able to record, but only switch between cameras.
  • Screens may be installed so that the faces of the fans that are watching through the fans in the venue may be seen, thus making them part of the global audience, so that organisers can know how fans are reacting, adding to the experience of the show.
  • The first part of this means that, by understanding the GPS position and location in the world of the fan, it is possible to establish where the fan is relative to the stadium or venue.
  • The GPS will tell the fan where the stadium/venue is in relation to him/her, and instead of seeing the environment around him/her, the stadium/venue will appear on the screen, as if the fan were transported to the outside of the stadium/venue.
  • The fan will see the facade of the stadium/venue and some historic moments of the team/artist.
  • By clicking on the stadium/venue, the fan will have a clean view from that side, as if it were a window into the stadium/venue.
  • The fan will have the opportunity to select a camera and see what is currently happening inside the stadium/venue. This can connect to any of the cameras that have been placed inside the stadium, as long as it has the same angle of view. For instance, if coming from the southwest (SW), only the SW cameras will show.
  • In one example, fans using the system are recording a football match.
  • The goals can then be seen from all the different angles of the cameras, so a fan-zone spectator can see a goal from the SW or east (E) and see and hear how the fans experienced the moment; the sketch below illustrates the side-based camera filtering.
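  • A minimal sketch of this side-of-venue filtering; all names, coordinates and the eight-sector scheme are illustrative assumptions, not taken from the patent:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees 0..360."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def venue_side_facing(fan_pos, venue_pos):
    """Which side of the venue faces the fan, i.e. the side the fan approaches from."""
    b = bearing_deg(*venue_pos, *fan_pos)   # bearing from the venue out towards the fan
    sides = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return sides[int(((b + 22.5) % 360) // 45)]

# A fan approaching from the southwest is offered only the SW cameras:
cameras = {"SW": ["cam7", "cam8"], "E": ["cam2"]}   # invented camera registry
side = venue_side_facing(fan_pos=(40.4520, -3.6900), venue_pos=(40.4531, -3.6883))
print(side, cameras.get(side, []))   # SW ['cam7', 'cam8']
```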
  • The system may have its own social media account, which will allow fans around the world to be closer to other fans and their clubs, bands, etc. It will be compatible with the already existing social media platforms (Twitter (RTM), Facebook (RTM), Tumblr (RTM), etc.), with the difference that only members of the website will be allowed to trade points, images, videos and related uploads.
  • The facade is designed to be a reflection of the passion and history of the club and its fans. The way it communicates and evolves enables the stadium, the heart of Real Madrid (RM), to become a living fan that transcends time and distance.
  • The facade is 43 million pixels, making it by far the largest screen in the world. The idea of the lighting is to produce an ever-evolving sculpture: the east and west facades will run content that shows the history and passion of the club, while the north and south will work more as atmospheric pieces.
  • This merging of all of the screens merges all of the elements of Real Madrid (its fans, history and future), creating one screen, one community and one Real Madrid, at which the stadium is the heart.
  • The east and west screens will show content of former players, goals and historic moments. They will also be able to show sponsorships and brands that want to be associated with RM. All the content will be displayed in white so as to create a cohesive enthusiasm for the team colours (Los Blancos).
  • The facade will be interactive with fans that are both close to and far away from the stadium, giving people the feeling they are close to what's happening in the stadium. This allows the fans and the stadium to have a mutually influential bond: regardless of how close they are to one another, fans can still connect and send messages, and the stadium can send exclusive content and generate excitement and atmosphere.
  • The opacity of the facade will vary depending on whether it's the day after a match or three days away, etc.; this will be part of the impact the stadium will have.
  • Louvers containing 24mm LED screens will have three positions: closed, mid and open. This will provide different angles to the light. In the open position the entire screen becomes available; in the closed position only the middle two sections will be available; in the mid position they will act as a wash. Behind each of these louvers there will be two lamps that will help to light the internal space. These louvers will have independent motion/movement, affected by the sun, temperature and various other sensors or factors. The old stadium will be lit with LED fixtures that will help accentuate the space while adding depth to the building.
  • The facade could potentially be utilised to run advertisements, games and movies, making it the building that people visit when a special moment happens in the country, city or club.
  • App: The facade in combination with the app will allow for an ever-evolving building that connects to the fans and becomes the heart of the team and the city, making it a living human being.
  • The app is the tool that allows fans in all different locations to become part of the action. Instead of viewing distance as potentially negative, this app embraces the distance, interacting with RM fans all over the world in a way that evolves to show different parts of the history and passion that the fans have for the club, depending on location and proximity to the stadium.
  • The app gives fans from all over the world the opportunity to interact with the stadium, the content of the facade and other fans, and to connect to the match in an inspiring new way: through the eyes of the fans at the stadium.
  • The app delivers an augmented reality (AR) 'fan finder' that allows fans to see the global audience simply by holding up their device. They can also hold up their device to view an exclusive AR player intro with videos, statistics, goals scored, etc. The app can be used as a guide to find their seat, the bar, a restaurant or a friend.
  • Fans can create a "fan pixel", much like the devices in the London 2012 Olympics. This is where fan devices are all held in the air to create a giant screen on which messages and effects can be broadcast.
  • Referring to Figure 1, a media content distribution system 1 is schematically illustrated.
  • The top portion of Figure 1 shows elements of the system 1 which are disposed inside an event venue 5, while the bottom portion of Figure 1 shows elements of the system 1 which are disposed outside the event venue 5.
  • The event venue in the present case is a football stadium, but it will be appreciated that it could instead be a cricket ground, a race track, a theatre or any other event venue, sporting or otherwise.
  • Visitors are using personal electronic devices 62a, 62b, which may be mobile telephones, iPods, iPads or camera devices, to capture footage of the event, in this case the football match.
  • The footage captured by the devices 62a, 62b is transmitted via a wireless hub 20 to a media content distribution device 10.
  • The footage is provided by the transmitting personal electronic device in association with the user ID of the user, a device ID indicating the type or identity of the transmitting device, and in some embodiments the location of the user (e.g. seat identifier or GPS location). Some or all of the footage may optionally be provided to an external content store 70, for access by users at a later time (for example subsequently to the event).
  • The media content distribution device 10 is operable to stream the received footage to another device on request.
  • The requesting device may be a device 62c, also present within the event venue, one of the devices 64a, 64b immediately outside the event venue, or one of the devices 66a, 66b some distance away from the event venue.
  • The devices 62c, 64a and 64b are all within range of the wireless hub 20, and therefore the footage may be streamed from the media content distribution device 10 to these devices via the wireless hub 20.
  • The devices 66a and 66b are outside the range of the wireless hub, and are therefore required to access the footage via a telecommunications network 50 and the Internet 30.
  • Each of the devices 62a, 62b, 62c, 64a, 64b, 66a, 66b has installed thereon the app described above, and is subscribed to the service.
  • The app may cause the personal electronic device to register itself with the media handling device when the personal electronic device enters the event venue, or the vicinity thereof. This registration may take place automatically (based for example on GPS location, or by detection of the presence of the wireless hub 20), or when the app is launched. Alternatively, there may be a registration function selectable by the user, which causes the device to register itself and the user as present at the event venue.
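  • A minimal sketch of this registration trigger follows; the geofence coordinates, SSID string and function names are hypothetical assumptions, not taken from the patent:

```python
import math

VENUE_GEOFENCE = ((51.5560, -0.1060), 0.4)   # (centre lat/lon, radius in km) - invented

def within_geofence(gps_fix, centre, radius_km):
    # Equirectangular approximation: adequate at venue scale (a few km or less).
    dlat = math.radians(gps_fix[0] - centre[0])
    dlon = math.radians(gps_fix[1] - centre[1]) * math.cos(math.radians(centre[0]))
    return 6371.0 * math.hypot(dlat, dlon) <= radius_km

def should_register(gps_fix, visible_hubs, venue_hub_ssid="venue-hub-20"):
    """Register automatically on a GPS fix at the venue, or on detecting the hub."""
    centre, radius_km = VENUE_GEOFENCE
    if gps_fix is not None and within_geofence(gps_fix, centre, radius_km):
        return True
    return venue_hub_ssid in visible_hubs

if should_register(gps_fix=(51.5552, -0.1053), visible_hubs=[]):
    print("registering device and user as present at the event venue")
```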
  • The requesting device may actively select a particular media source to watch. While this selection could be made from a list, for example, preferably the selection is made using an event venue representation which permits the user of the requesting device to see where within the event venue currently transmitting media sources are present. This may be in the form of a plan or isometric view of the event venue, in which selectable media sources are visibly presented (for example by flashing icons or highlighted portions). The user of the requesting device is able to select one of these media sources using the event venue representation, for example by clicking or tapping on an icon/highlighted portion. As a result, if the user of the requesting device would like to watch a football game from a location near the goal for example, a media source from this area can be selected.
  • A media source may be chosen either by the user specifically selecting a particular transmitting media source, or by the user selecting an area or zone within the event venue, from which a specific media source is allocated by the media handling device. In this way, the user is able to choose from where they would like to watch the event, while the media handling device is able to take care of choosing a specific device - for example taking into account image capture quality, or favoured image capturing users.
  • The event venue representation is populated with media sources based on the currently received footage from transmitting devices, the location (seat identifier or GPS location) of the users/transmitting devices, and optionally a quality indicator providing some indication of the quality of the footage in terms of resolution/frame rate etc.
  • The quality of the footage may be determined from a subscriber database, as will be explained below.
  • Also provided is a subscriber database 40 which stores details of users of the service, as well as details of any devices which those users have registered to the service.
  • The database may use the following fields:
  • The User ID is a unique identifier for the user.
  • The User name is also stored.
  • Access to the service may be password protected, and accordingly the database stores a password to be checked against a password entered by the user (or automatically entered by an app on the user's device) when the user accesses the service using their device.
  • The current GPS location may optionally be stored in the database, but might instead be provided directly to the media content distribution device 10 along with the footage.
  • The current seat location may optionally be stored in the database, but again might instead be stored on the user's device and provided directly to the media content distribution device along with the footage.
  • Each device which a user registers to the service is also identified (by a Device ID) in the database in association with the user.
  • Each device may have an associated device specification entry indicating relevant video image capture characteristics of the device. For example, the frame rate, resolution or optical characteristics of the device could be stored, or alternatively a "quality rating" could be tagged against the device specification entry. Finally, accrued credit, for example the star rating, for the user is stored. This credit may be spent or used to access certain features of the service.
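  • A minimal sketch of such a subscriber record follows; the patent names these fields but not a concrete schema, so the class and field names here are invented:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class Device:
    device_id: str
    specification: Dict[str, object]      # e.g. {"resolution": "1080p", "frame_rate": 30}
    quality_rating: Optional[int] = None  # or a rating tagged against the specification

@dataclass
class Subscriber:
    user_id: str                          # unique identifier for the user
    user_name: str
    password_hash: str                    # access to the service is password protected
    gps_location: Optional[Tuple[float, float]] = None  # optional; may travel with footage
    seat_location: Optional[str] = None   # optional; e.g. "D1-C07"
    devices: Dict[str, Device] = field(default_factory=dict)  # devices registered by user
    credit: int = 0                       # accrued credit, e.g. the star rating

db: Dict[str, Subscriber] = {}
db["u1"] = Subscriber("u1", "Alice", password_hash="<hash>",
                      devices={"dev1": Device("dev1", {"resolution": "1080p"}, 4)})
```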
  • A user ID and device ID may also be provided, permitting the media content distribution device 10 to obtain the quality rating for the device from the subscriber database, or to determine the probable quality of the footage by referring to the subscriber database using the user ID and device ID and accessing the associated device specification.
  • The location of the user providing the footage may be based on a GPS position - either provided directly to the media content distribution device 10, or provided first to the database 40 and then accessed from there. Alternatively, the location of the user providing the footage may be based on a seat position associated with the user.
  • The seat position may be determined by pre-allocating the seat to the user at the time of purchase, or the time of entry into the event venue, and recording the seat allocation in association with the user on the database 40.
  • Alternatively, the user may enter his seat number into the app on his portable electronic device (on purchase, or while at the event venue for example), which will in turn provide this information to the database via the Internet.
  • The seat position may instead be stored on the user's device and provided directly to the media content distribution device along with the footage. In any of these cases, the position information is used to set the position of the media source on the event venue representation.
  • In Figure 2, an example event venue representation for a football stadium is schematically illustrated.
  • The football pitch is shown in the centre of the figure, while the various seating areas are shown around the outside of the pitch, designated by alphanumeric codes.
  • The position of the viewing user could be represented on the event venue representation, for example by an icon (for example the user's avatar) or a flashing dot, permitting the user to identify their own position at the event, and the position of other media sources relative to themselves.
  • The view of Figure 2 is the top level event venue representation showing the whole event venue.
  • Each seating area shown in the top level representation of Figure 2 corresponds to an area of seating, such as that shown schematically in Figure 3.
  • Any seating area shown in the top level representation within which a currently transmitting media content source (that is, a user/device currently transmitting footage to the media content distribution device 10) is present may be highlighted to indicate the presence of a transmitting source to a viewer of the event venue representation. Any seating area within which a currently transmitting media content source is present can be selected, resulting in the display switching to show a close up (zoomed) view of that seating area.
  • A close up of the seating area D1 of Figure 2 is shown in Figure 3.
  • Seat rows A to G form part of the seating area D1, each row having either 15 or 16 seats.
  • The event venue representation is a real-time representation of where footage is available at any given time. Accordingly, seating areas may become highlighted and switch off throughout the event, as users turn the capture and transmit functions of their devices on and off.
  • Streamed footage from a given device may suddenly terminate when the user of that device decides to stop capturing and streaming the footage.
  • In this case, the media content distribution device may intelligently and smoothly switch to streaming footage from a different image capture device near to the image capture device which has ceased to transmit footage. In this way the viewing user should continue to experience a similar view of the event.
  • Alternatively, termination of a footage stream may cause the event venue representation to be displayed, with the viewing user being prompted to select another source.
  • The media content distribution device may also switch streaming from a first device to a second device even if the first device does not discontinue its transmission.
  • This may be appropriate where the second device is in a similar location to the first device but has higher quality image capture characteristics.
  • It may also be appropriate where the second device is a "favourite" source of the viewing user, e.g. a friend.
  • The viewing user's preferences might dictate that transmissions from certain users always take precedence, or take precedence over other nearby users and devices. A sketch of such a switching policy follows.
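  • A minimal sketch of one possible fail-over/switching policy - favourites first, then the nearest live source, with quality as a tie-breaker; the data layout and names are invented for illustration:

```python
import math

def next_best_source(failed, candidates, favourites=frozenset()):
    """Pick a replacement stream near the terminated source. Favourite sources take
    precedence; otherwise the nearest live source wins, with quality as a tie-breaker."""
    def key(src):
        dist = math.hypot(src["x"] - failed["x"], src["y"] - failed["y"])
        return (src["id"] not in favourites, dist, -src["quality"])
    live = [s for s in candidates if s["live"] and s["id"] != failed["id"]]
    if not live:
        return None    # fall back to showing the event venue representation
    return min(live, key=key)

failed = {"id": "a", "x": 10, "y": 4}
candidates = [{"id": "b", "x": 11, "y": 4, "quality": 3, "live": True},
              {"id": "c", "x": 30, "y": 9, "quality": 5, "live": True}]
print(next_best_source(failed, candidates)["id"])   # 'b' (nearest to the failed source)
```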
  • The quality of the devices providing the footage may be indicated, for example, by having different coloured highlighting for seats associated with footage of different quality. It will be appreciated that other visual indicators could instead be provided.
  • “Favourite” sources may be distinguished in the event venue representation from other sources, either by a different colour, or an annotation or any other visual indication. This enables a viewing user to preferentially select footage from their friends, from participating celebrities at the event, or from other users whom they have found to capture event footage in a manner which they like.
  • Media sources could also be tagged (in the database for example) as being associated with a particular team. An indication of the team could then be visually identified on the event venue representation, permitting the user to select media sources associated with a particular team. Either as an alternative or an adjunct to providing visual representations of e.g. favourite sources, the quality of sources or a team association of sources, filtering could be provided, permitting the user to view only favourite sources, high quality sources or sources associated with a particular team, for example.
  • The event venue representation may indicate a direction in which the transmitting image capture devices are facing, giving the user an idea of what the associated footage will contain.
  • This indication could be provided by a directional arrow originating at the location of the media source and pointing in the direction which the image capturing device is pointing. This feature would require the image capture device to transmit an indication of its current facing to the media handling device along with the footage.
  • Hardware facilitating the self-determination of the orientation of personal devices such as smartphones is readily available, and can be used to provide this function.
  • The direction of facing of image capture devices may also be used in combination with techniques such as goal-line tracking or ball location tracking to permit the auto-selection or filtering of camera devices directed at a specific area of interest at the event.
  • The media handling device may compare the direction of facing of each camera device at a particular time with the location of (for example) the ball at that same time to determine whether the ball is within the field of view of the camera device, or preferably in the centre of the field of view of the camera device.
  • A similar principle could be applied to other objects of interest in other contexts, for example the location of racing cars on a racing circuit; a sketch of such a field-of-view test follows.
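  • A minimal sketch of such a field-of-view test, in a local metric frame with y pointing north; the names and the 60-degree field of view are illustrative assumptions:

```python
import math

def sees_target(cam_pos, cam_heading_deg, target_pos, fov_deg=60.0):
    """True if the target (e.g. the ball) lies within the camera's field of view."""
    dx, dy = target_pos[0] - cam_pos[0], target_pos[1] - cam_pos[1]
    target_bearing = math.degrees(math.atan2(dx, dy)) % 360   # 0 deg = north (+y)
    off_axis = abs((target_bearing - cam_heading_deg + 180) % 360 - 180)
    return off_axis <= fov_deg / 2

def rank_by_centring(cameras, ball_pos):
    """Order the cameras that see the ball by how close to centre-of-frame it sits."""
    def off_axis(cam):
        dx, dy = ball_pos[0] - cam["pos"][0], ball_pos[1] - cam["pos"][1]
        b = math.degrees(math.atan2(dx, dy)) % 360
        return abs((b - cam["heading"] + 180) % 360 - 180)
    seeing = [c for c in cameras if sees_target(c["pos"], c["heading"], ball_pos)]
    return sorted(seeing, key=off_axis)

cams = [{"id": 1, "pos": (0, 0), "heading": 90.0},    # facing east
        {"id": 2, "pos": (50, 0), "heading": 270.0}]  # facing west
print([c["id"] for c in rank_by_centring(cams, (30, 2))])   # [1, 2]
```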
  • The event venue representation may also indicate the location of fixed cameras installed within the event venue, or mobile cameras utilised by professional cameramen at the event. These media sources may also be selected by users for playback on their personal device via the event venue representation.
  • When footage is captured, it may simply be streamed (only stored temporarily) from the image capture device to the viewing device via the media content distribution device, or it may be stored at the content store 70 in addition to being streamed to requesting users. If it is stored in the content store 70 then it can also be accessed at a later time. In one example, most footage is streamed, but some footage is marked by the user capturing that footage as being for permanent storage. This footage, when transmitted to the media content distribution device, is both streamed to any requesting users and also stored in the content store 70.
  • At least some of the footage being transmitted to the media content distribution device is stored at the image capture device (but not at or by the media distribution device) at the time of capture, and is then uploaded to the content store 70 at a later time, for example after the event has finished.
  • Each user may only be permitted to store a limited amount of footage (in terms of one or both of data capacity and duration), in order that the storage facility of the content store is not overwhelmed.
  • A user may be encouraged or prompted to store footage, for example if there are too few media sources in their area - with this being rewarded by extra points.
  • Footage stored into the content store 70 is stored in association with the location (e.g. GPS location or seat location, or derived location) from which the footage was captured, the identity of the user who captured the footage, the quality of the footage (e.g. characteristics of the image capture device), and the time of capture of the footage.
  • This information may be provided from the image capture device to the media content distribution device or from the subscriber database, or a combination of the two.
  • For example, the time of capture, location and footage may be provided by the image capture device along with the user ID and device ID, and the user ID and device ID may be used to obtain the quality information from the subscriber database.
  • Alternatively, the quality information may be stored at the image capture device and provided to the media content distribution device along with the footage and other data.
  • The GPS location may be converted by the media content distribution device into a location within the event venue.
  • This derived location might be a seat position, or alternatively a seating area (e.g. D1) or a zone within a seating area. This will be sufficient for the event venue representation to highlight an appropriate position for the footage; a sketch of one such mapping follows.
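  • As an illustration, a minimal sketch of mapping a position to a seating area; the area names and coordinates are invented, and the GPS fix is assumed to have already been projected into a pitch-centred metric frame:

```python
# Seating areas as rectangles in a pitch-centred metric frame: (x_min, x_max, y_min, y_max).
AREAS = {
    "D1": (-60, -40, 20, 45),
    "D2": (-40, -20, 20, 45),
}

def area_for_position(x, y):
    """Map a locally projected GPS fix to the seating area containing it, if any."""
    for name, (x0, x1, y0, y1) in AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None   # outside all known seating areas

print(area_for_position(-50, 30))   # 'D1'
```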
  • The uploading user may also store additional content, for example text describing their personal memories of the event, or photographs or sound clips captured during the event.
  • In Figure 4, a timeline 100 for a football match is shown. The timeline 100 is broken down into periods, these being pre-match, first half, half time, second half, extra time and post-match. It will be appreciated that this is only one example breakdown of a football match.
  • A slider button 110 is provided which can be dragged along the timeline to access content associated with any given moment in time. When the slider button 110 is dragged to a particular position on the timeline 100, the event venue representation is updated to highlight those media sources for which stored content captured at the time indicated by the slider button 110 is available.
  • In Figure 5, a series of buttons is provided for selecting different periods during the event. When a button is selected, the event venue representation is updated to highlight those media sources for which stored content captured during the period indicated by the selected button is available.
  • Multiple buttons could be selected, resulting in media sources which captured footage during some or all of the selected periods being highlighted. It will further be appreciated that the interactive timeline of Figure 4 and the period selection buttons of Figure 5 could be provided in combination, permitting a user to access content either from a particular moment in time, or during a particular period.
  • Further buttons are also shown, these being a "Goals!" button, a "Saves!" button and a "Fouls!" button. These buttons can be used to access footage relating to each of these themes.
  • The footage associated with each theme may be identified in the database with tags - manually entered by the uploading user for example - or by associating with a theme footage captured at a time during the event at which an occurrence corresponding to that theme was known to have taken place. This could be determined automatically by the system based on event related information entered by the system operator.
  • The system operator may associate with an event a dataset indicating times during the event of various key occurrences, such as goals, saves or fouls for example, resulting in a list of time instants or periods at which each given event type occurs.
  • An example is shown in the table below. Based on this information, the selection of "goals" by the viewing user will cause the event venue representation to be populated with identifiers of the availability of media sources which were being captured at the times 17m32s, 27m02s and 91m11s; a sketch of this lookup follows.
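  • A minimal sketch of that lookup; the clip layout is invented, while the goal times are those given above:

```python
# Operator-entered dataset: occurrence type -> times during the event, in seconds.
OCCURRENCES = {"goals": [17 * 60 + 32, 27 * 60 + 2, 91 * 60 + 11]}  # 17m32s, 27m02s, 91m11s

def clips_for_occurrence(clips, kind):
    """Return stored clips whose recording interval covers a selected occurrence."""
    times = OCCURRENCES.get(kind, [])
    return [c for c in clips if any(c["start"] <= t <= c["end"] for t in times)]

clips = [{"id": "u7", "start": 17 * 60, "end": 18 * 60},    # covers the 17m32s goal
         {"id": "u9", "start": 40 * 60, "end": 45 * 60}]    # covers no goal
print([c["id"] for c in clips_for_occurrence(clips, "goals")])   # ['u7']
```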
  • The footage may commence from the start of the clip, or may start from the point in time selected by the user or derived from the user's input.
  • The viewing user is also able to attach text, sound clips or images to the footage, which are then accessible to other users subsequently accessing that item of footage.
  • The attached text, sound or image data is then stored at the content store 70 in association with the footage itself.
  • In this way, a viewing user is able to associate their own memories of the event with footage captured by someone else.
  • Other viewers accessing the same footage at a subsequent time will have access not only to the footage and related content uploaded by the user who captured the footage, but also to any additional materials uploaded in association with the footage by previous viewers.
  • Figure 6 shows a seating plan for a theatre hall, and this may constitute an event venue representation provided to a user wishing to access footage of the event (either in real-time or in relation to a past event, in each case as described above). It can be seen from Figure 6 that certain seats are shaded - these seats represent the location of available media sources within the theatre hall. In the context of real-time (during event) viewing, these shaded seats represent the location of currently transmitting image capture devices, while in the context of viewing a historical (ended) event, the shaded seats represent the location of media sources which captured data at a particular selected time, period or occurrence within the event. It will be appreciated that the detail present in Figure 6 may be inappropriate for a small hand held device. In this regard, a simplified top level event venue representation such as that shown in Figure 7 may be used instead.
  • In Figure 7, a simplified view of the theatre hall is provided, broken down into multiple selectable seating areas 210, 220, 230, 240, 250, 260 and 270.
  • A user is able to select one of these seating areas which is of interest, resulting in a close up view of that seating area being shown.
  • Selecting the seating area 260 in Figure 7 may result in the close up of Figure 8 being presented to the user.
  • In Figure 8, three blocks of seating are shown, with some seats within each block being shaded to represent that footage is available in relation to each of those seats. The user is able to select these shaded seats to gain access to the footage associated with that location.
  • The selection of a shaded seat would trigger the streaming of footage from an image capture device in the case of real-time operation (connect-a-fan) or the playback of stored footage in the case of viewing after the event.
  • Figure 9 shows a plan view of a racing track, and this may constitute an event venue representation provided to a user wishing to access footage of the event (either in real-time or in relation to a past event, in each case as described above).
  • The event venue representation is marked with 20 zones, each of which is selectable to gain access to media content associated with that zone. In some cases, selection of a zone may trigger the playback of a random or best available (based on quality information for example) item of footage associated geographically with that zone.
  • Figure 10 is an example of a close up view within the event venue representation of Figure 9.
  • An indication of the location of the track is provided, along with an indication of the direction in which cars are travelling (in this case provided by the car representations and related arrows).
  • Various seating and standing areas are identified in Figure 10 by numbering.
  • These areas may be selectable to obtain access to more detailed seating plans similar to that shown in Figure 8, or the location of available media sources may be indicated by icons or highlighting directly on the representation of Figure 10. Again, such media sources are selectable in order to initiate playback of footage relating to them.
  • a users capturing footage of the race may be positioned all round the track at many or all of the 20 identified locations.
  • a particular viewer will be located only at a single location at any given time. However, that viewer may wish to track the action at pole position throughout a lap of the race.
  • the user can achieve this by selecting appropriate media sources around the track as the car in pole position progresses.
  • the system is operable to sequentially select media sources at different positions around the track to follow the progress of a particular car, for example the car in pole position or a car selected by the viewing user.
  • the system in this case is able to monitor the location of the selected car (for example based on GPS trackers affixed to each car), and repeatedly switch the media source presented to the viewing user such that the selected car is always likely to be in view.
  • the media sources selected could be those closest to the current position of the selected car, and/or those oriented towards the current position of the selected car (based on internal sensors of the image capture devices); one way of implementing this selection is sketched below.
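The following minimal sketch illustrates one way such switching might be implemented, assuming hypothetical camera records that each carry a GPS position and a compass heading; the function names, the 45-degree tolerance and the use of raw coordinate distance as a ranking proxy are all illustrative assumptions.

    # Minimal sketch: pick the media source most likely to show a tracked car.
    # Record layout and thresholds are assumptions, not part of the disclosure.
    import math

    def bearing(src, dst):
        # Approximate compass bearing in degrees from src to dst (lat, lon).
        dlon = math.radians(dst[1] - src[1])
        lat1, lat2 = math.radians(src[0]), math.radians(dst[0])
        y = math.sin(dlon) * math.cos(lat2)
        x = (math.cos(lat1) * math.sin(lat2)
             - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
        return math.degrees(math.atan2(y, x)) % 360

    def pick_source(cameras, car_pos, max_offset_deg=45.0):
        # Prefer cameras oriented towards the car; among those, take the
        # closest. Euclidean distance in coordinate space is only a rough
        # proxy, but it suffices to rank cameras around a single circuit.
        def dist(cam):
            return math.dist(cam["pos"], car_pos)

        aimed = [c for c in cameras
                 if abs((bearing(c["pos"], car_pos) - c["heading"] + 180)
                        % 360 - 180) <= max_offset_deg]
        return min(aimed or cameras, key=dist)

    cameras = [
        {"id": "zone-07", "pos": (52.0710, -1.0160), "heading": 90.0},
        {"id": "zone-12", "pos": (52.0695, -1.0105), "heading": 270.0},
    ]
    print(pick_source(cameras, car_pos=(52.0712, -1.0150))["id"])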
  • the event venue representation identifies where key events are happening, for example the current position of the car in pole position, the location of a crash or other incident, or a pit stop.
  • on-car cameras may also be provided, and may again be accessible to a user using the event-venue representation.
  • the event venue representations can be navigated after the event both with respect to location and with respect to time/occurrence in like manner to the football stadium embodiment described above.
  • an image capture device transmits footage to the media content distribution device 10 via the hub 20 (not shown).
  • the footage received also includes metadata such as the user ID of the user of the image capture device and the device ID of the device itself, and optionally location information (for example GPS coordinates) and information regarding the current orientation of the image capture device, if this is not to be retrieved from the database 40.
  • the media content distribution device 10 obtains information about the user of the image capture device and the device itself from the database 40.
  • the obtained information may include quality information about the image capture device 62a (based on the device ID) and in some cases the location of the user where this has been stored in advance - as might be the case for allocated seating associated with tickets booked via the app or seat numbers entered into the app and uploaded to the database 40. Based on this information (and based on similar information obtained in relation to other footage received from other image capture devices) a data structure representing the location and optionally the quality of available footage streams is formed; a sketch of such a structure is given below. Optionally, at a step S3, at least some of the received footage is stored in the content store 70 along with the identity of the user who provided the footage, the time of capture, the location from which the footage was taken, and quality information about the device which generated the footage.
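Purely for illustration, the data structure formed at this step might resemble the sketch below, keyed by device with location and a quality rating looked up from the subscriber database; every field name here is an assumption.

    # Minimal sketch of a registry of available footage streams.
    streams = {}  # device_id -> record describing one available stream

    def register_stream(device_id, user_id, location, database):
        # Quality information comes from stored device details (database 40);
        # location may be a seat identifier or GPS coordinates.
        device_info = database.get(device_id, {})
        streams[device_id] = {
            "user_id": user_id,
            "location": location,
            "quality": device_info.get("quality", "unknown"),
        }

    subscriber_db = {"dev-991": {"quality": "1080p/60"}}
    register_stream("dev-991", "user-42", "block-N/row-4/seat-12", subscriber_db)
    print(streams["dev-991"]["quality"])  # -> 1080p/60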
  • a viewer device 62c requests access to an event venue representation displaying the location (and optionally quality) of footage streams which it is able to currently access.
  • an event venue representation is generated based on the data structure and passed back to the viewer device 62c at a step S5.
  • the user of the viewer device 62c selects desired footage using the event venue representation, consequently sending a request for this footage to the media content distribution device.
  • the requested footage is provided (streamed) from the media content distribution device 10 to the viewer device 62c.
  • a schematic flow diagram of the collective memory method is shown.
  • captured content is uploaded from an image capture device 62a to the content store 70.
  • the captured content is uploaded in association with the time of capture, the user ID of the user and the device ID of the image capture device.
  • the location of the user may also be uploaded. This step may take place at the time of capture during an event, or subsequently.
  • the content store 70 obtains information about the user of the image capture device and the device itself from the database 40.
  • the obtained information may include quality information about the image capture device 62a (based on the device ID) and in some cases the location of the user where this has been stored in advance - as might be the case for allocated seating associated with tickets booked via the app or seat numbers entered into the app and uploaded to the database 40.
  • the captured content and the other uploaded and acquired information referred to above is stored in association in the content store.
  • the upload step U1 may include the association of additional text, sound or image content with the captured content and the upload of this additional content to the content store for storage.
  • a user requests an event venue representation from the content store 70. This step may be triggered by the user opening the app, tapping an icon or an external view representing the event venue and/or event, or similar.
  • the content store 70 generates an event venue representation based on the uploaded content corresponding to a particular event at a particular venue.
  • the event venue representation is populated with visual indications of content which is available at particular times during the event and at particular locations within the venue. The locations and times for each item of content are derivable from the time of capture information and location information stored at the content store 70 in association with the footage.
  • the event venue representation may also indicate whether a particular item of content has additional information ("memories") associated with it, for example in the form of text-based descriptions of recollections of the event, or audio and/or images associated with the uploading user's or another user's memory of the event. This indication may be given visually, for example by colour coding, icon shaping or any other technique.
  • the viewer uses the event venue representation to identify and select a desired item of content, whereupon this item of content is requested from the content store 70.
  • the content store 70 provides the requested item of content to the viewer, along with any additional information ("memories") associated with that item of content.
  • the viewer is then able to watch the content, and examine the additional information.
  • the user is able to upload their own additional information ("memories"), in the form of text, audio or image, to be associated with the item of content. The viewer's additional information will then be accessible to subsequent viewers accessing the item of content; a sketch of this accumulating association is given below.
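A minimal sketch of this accumulating association between an item of content and viewer-supplied "memories" follows; the in-memory dictionary merely stands in for the content store 70, and all names are illustrative.

    # Minimal sketch: memories accumulate against a content item so that each
    # subsequent viewer receives everything added by earlier viewers.
    memories = {}  # content_id -> list of memories from successive viewers

    def add_memory(content_id, user_id, kind, payload):
        # kind is "text", "audio" or "image"; payload is the memory itself.
        memories.setdefault(content_id, []).append(
            {"user": user_id, "kind": kind, "payload": payload})

    def fetch(content_id):
        # Content is always returned together with its associated memories.
        return {"content": content_id,
                "memories": memories.get(content_id, [])}

    add_memory("clip-0173", "user-42", "text", "The winning goal from row 4!")
    add_memory("clip-0173", "user-77", "image", "celebration.jpg")
    print(fetch("clip-0173"))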
  • an augmented reality system for accessing media content is schematically illustrated.
  • a user 310 possessing an electronic device 315 is able to view an augmented reality image of various event venues 320, 325, 330, 335 by aiming his electronic device 315 at the geographical location of one of the venues, wherever in the world those event venues are.
  • the event venues 320 and 335 are sports stadiums
  • the event venue 325 is a theatre hall
  • the event venue 330 is a racing circuit.
  • Each of the event venues in the example of Figure 13 is a different distance from the user 310.
  • the event venue 320 is 4700km from the user
  • the event venue 325 is 179km from the user
  • the event venue 330 is 23km from the user
  • the event venue 335 is 1km from the user.
  • Figure 14 schematically illustrates the electronic device 315 of Figure 13 showing an augmented reality image of the event venue 320, at which the electronic device 315 is aimed. For event venues beyond a certain predetermined range of the viewing user, this may be the only content available to the user. For closer event venues, access to additional content may be provided, for example by clicking or tapping on the external view; a sketch of this range-based gating is given below.
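As a rough illustration of this range-based gating, the sketch below computes the great-circle distance between the user and a venue and offers internal content only within a threshold; the 50 km figure is an example value chosen here, not one taken from the disclosure.

    # Minimal sketch of distance-gated content availability (Figures 13/14).
    import math

    def haversine_km(a, b):
        # Great-circle distance in kilometres between (lat, lon) points.
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2)
             * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(h))

    def available_content(user_pos, venue_pos, threshold_km=50.0):
        # Beyond the threshold only the exterior AR view is offered.
        if haversine_km(user_pos, venue_pos) > threshold_km:
            return ["exterior-view"]
        return ["exterior-view", "internal-cameras"]

    # A nearby venue (cf. venue 335 at 1 km) offers internal cameras; venue
    # 320 at 4700 km would be limited to the exterior view.
    print(available_content((51.500, -0.120), (51.505, -0.125)))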
  • a schematic flow diagram of the augmented reality method is shown.
  • footage is captured and transmitted from image capture devices to a media distribution device.
  • the image capture devices may be portable electronic devices of users at the event venue, or fixed or portable camera devices associated with the event venue itself.
  • the footage received also includes metadata such as the user ID of the user of the image capture device and the device ID of the device itself, and optionally location information (for example GPS coordinates).
  • a database may store a predetermined location for each of these devices.
  • data associated with the captured footage is obtained from the database.
  • the obtained information may include quality information about the image capture device (based on the device ID) and in some cases the location of the transmitting user where this has been stored in advance. Based on this information (and based on similar information obtained in relation to other footage received from other image capture devices) a data structure representing the location and optionally quality of available footage streams is formed.
  • at least some of the received footage is stored in the content store 70 along with the identity of the user who provided the footage, the time of capture, the location from which the footage was taken, and quality information about the device which generated the footage.
  • a viewer device 62c requests access to an event venue representation displaying the location (and optionally quality) of footage streams which it is able to currently access.
  • an event venue representation is generated based on the data structure and passed back to the viewer device at a step W5. It will be appreciated that the data structure, and thus the event venue representation, includes only the media sources which the viewer device is permitted to access based on its location relative to the event venue.
  • at a step W6, the user of the viewer device selects desired footage using the event venue representation, consequently sending a request for this footage to the media content distribution device. If the desired footage is stored (non-real-time) footage, then at a step W7 a request for the footage is sent to the content store, and the footage is returned to the media distribution device at a step W8. The retrieved footage is then provided to the viewer device at a step W9. If on the other hand the footage is real-time footage then the steps W7 and W8 are not required and instead footage from the selected source is streamed from the capture device to the viewer device at the step W9.
  • In Figures 16A and 16B, an access restriction based on distance from the event venue, referred to above, is schematically illustrated.
  • in Figure 16A, a viewer device 620 is shown at a relatively large distance from an event venue 610. At this distance, the viewer device 620 is only able to access a single internal camera 630 at the event venue 610. It will be appreciated that at an even greater distance the viewer device 620 may not be able to access any internal cameras at all, limiting the display to an exterior view of the event venue as per Figure 14.
  • in Figure 16B, the viewer device 620 is much closer to the event venue 610. In this case, the viewer device 620 is not only able to access the internal camera 630, but also additional cameras 640a, 640b, 640c, 640d. In other words, as the viewer device 620 approaches the event venue 610, more internal cameras are available to view.
  • the cameras 630, 640 may be user devices used by fans within the event venue, or fixed or mobile camera devices installed within the event venue and operated by the event organisers.
  • the camera devices available to the user are selected from cameras closest to the viewer device, providing the experience of a "window" into the event venue from the user's real world location.
  • In Figure 17A, an access restriction based on position around the event venue, referred to above, is schematically illustrated.
  • the viewer device 620 is located just outside the event venue 610.
  • the viewer device 620 has access to the internal cameras 650a, 650b, 650c closest to the user's location outside the event venue.
  • the experience of a "window" into the event venue from the user's real world location is provided.
  • in Figure 17B, with the viewer device 620 at a different position around the event venue, a different set of internal cameras 660a, 660b, 660c is available to the user.
  • for viewing after the event has ended, the references to internal cameras in relation to Figures 16 and 17 can be replaced with stored media sources and the positions within the event venue from which they were captured.
  • a user can move around an event venue after the event and "look in" using their viewer device as if the event were currently occurring.
  • the user could turn up at an event venue after an event has finished (potentially weeks, months or years later) and trigger playback of footage associated with the event.
  • the selection of cameras may be based in part on the direction in which the viewer device is facing - determined for example by existing functionality of the device itself (electronic compass technology).
  • An internal camera may be available for access if it is facing in the same or a similar direction as the viewer device. It will be appreciated that the cameras available for access may be determined based on a combination of two or more of the distance, position and orientation of the viewer device with respect to the position and/or orientation of the camera devices; a sketch combining these criteria is given below.
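One way of combining these criteria - a distance-dependent camera count after Figures 16A and 16B, plus an orientation match after Figure 17 - is sketched below; the thresholds, the tolerance and the record layout are all assumptions made for illustration.

    # Minimal sketch: cameras offered depend on viewer distance and heading.
    def facing_alike(cam_heading, viewer_heading, tolerance_deg=60.0):
        # True when the camera faces the same or a similar direction.
        delta = abs((cam_heading - viewer_heading + 180) % 360 - 180)
        return delta <= tolerance_deg

    def accessible_cameras(cameras, distance_km, viewer_heading):
        if distance_km > 10:      # far away: exterior view only
            limit = 0
        elif distance_km > 1:     # approaching: a single internal camera
            limit = 1
        else:                     # at the venue: all matching cameras
            limit = len(cameras)
        matching = [c for c in cameras
                    if facing_alike(c["heading"], viewer_heading)]
        return matching[:limit]

    cameras = [{"id": 630, "heading": 10}, {"id": 640, "heading": 200}]
    print(accessible_cameras(cameras, distance_km=0.2, viewer_heading=15))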
  • the audio data of a higher quality may be audio data professionally captured at the event venue and provided to the media handling device, or audio data from another image capture device having a better audio capture capability.
  • the user may be able to selectively switch between the actual audio associated with a media source, or higher quality substituted audio.
  • the audio substitution could be achieved at the media handling device by stripping the audio from audio/video data received from an image capture device, and inserting a higher quality audio stream captured in parallel. It will be appreciated that appropriate synchronisation between the audio/video stream and the substitute audio stream would need to be provided.
  • the video data is stored in association with date/time information indicating the time at which the video data was captured.
  • a substitute audio stream could be stored separately from the stored video data, with its own date/time of capture information.
  • the date/time of capture information of each of the video data and the substitute audio data can be used to correlate the two streams together, for combination either for playback only, or for storage as a higher audio quality version of the video file; a sketch of this timestamp-based correlation is given below.
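The correlation itself reduces to comparing the two capture timestamps, as the minimal sketch below shows; a real implementation would hand the resulting offset to a media framework for the actual muxing, so treat this purely as the alignment logic.

    # Minimal sketch: align a substitute audio stream with stored video data
    # using their respective date/time of capture information.
    from datetime import datetime

    def audio_offset_seconds(video_start: datetime,
                             audio_start: datetime) -> float:
        # Positive result: skip this much of the substitute audio before
        # combining it with the (audio-stripped) video stream.
        return (video_start - audio_start).total_seconds()

    video_start = datetime(2013, 9, 12, 19, 45, 10)
    audio_start = datetime(2013, 9, 12, 19, 45, 7)
    print(f"trim substitute audio by "
          f"{audio_offset_seconds(video_start, audio_start):.0f} s")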

Abstract

A media content distribution system comprises a media handling device for receiving and distributing media data and a plurality of user devices. Each user device has a camera function and is operable in a media transmit mode to transmit media data of an event occurring at an event venue to the media handling device. Each user device is operable in a media receive mode to receive media data from the media handling device. When operating in the media receive mode, a first user device from amongst the plurality of user devices is operable to select a media source from a second user device among the user devices which are currently operating in the media transmit mode and the media handling device is operable to stream the media data of the event being received from the second user device to the first user device. In this way, device users at an event are able to share media content which they have captured, for example video footage, with other users at, outside or away from the event. This enables users to experience the event from different viewpoints around the event venue.

Description

MEDIA CONTENT DISTRIBUTION
Field of the Invention
The present invention relates to media content distribution. Embodiments of the present invention relate to a system and method for improving a supporter's experience of an event at an event venue, and in particular to a system and method that links a supporter's device to the devices of other users, enabling the supporter to receive audio/video data captured by the other users.
Background of the Invention
At sporting events, supporters have a limited viewpoint for viewing the event, usually being their own direct view of the event supplemented with such additional views on large screens at the event venue as are selected by organisers of the event. After an event has finished, a supporter's experience of the event is often even more limited - to their own captured video footage (if they were present themselves) or commercial footage if available. It would be desirable to provide supporters with access to additional viewpoints, and to provide supporters with facilities to increase their involvement in the event, both before, during and after the event actually takes place.
Summary of the Invention
According to an aspect of the invention, there is provided a media content distribution system, comprising:
a media handling device for receiving and distributing media data; and a plurality of user devices, each user device having a camera function and being operable in a media transmit mode to transmit media data of an event occurring at an event venue to the media handling device, each user device being operable in a media receive mode to receive media data from the media handling device; wherein
when operating in the media receive mode, a first user device from amongst the plurality of user devices is operable to select a media source from a second user device among the user devices which are currently operating in the media transmit mode and the media handling device is operable to stream the media data of the event being received from the second user device to the first user device.
In this way, device users at an event are able to share media content which they have captured, for example video footage, with other users at, outside or away from the event. This enables users to experience the event from different viewpoints around the event venue. The media content is preferably a video stream, optionally with audio, but may instead be still images. It will be appreciated that media content transmitted from one device may at any given time be streamed to one device, no devices or many devices. The media handling device may only receive media content from devices actually present at (for example within or in some cases in the immediate vicinity outside) the event venue. This could be achieved using location information indicating the location of the image capture devices, or by limiting to devices registered to the event as part of the ticket buying process for example. In some cases the selection of the media source by the user may be achieved by selecting a specific device which is currently transmitting, while in other cases the selection of the media source by the user may be achieved by selecting a geographical area or zone of the event venue, with the specific source then being allocated by the media handling device - for example by allocating the highest quality device in that area or zone.
A user may be able to access footage from a media source in a number of ways, for example by selecting from thumbnail versions of each media source from a drop down list, for example ordered by media source location. However, preferably the first user device is operable to select the media source using an event venue representation indicating the location within the event venue of the user devices operating in the media transmit mode. The event venue representation may for example be a plan or isometric view of the event venue. This enables the user to visually identify media sources in an area from which the user would like to view the event, and to select appropriate media sources for view based on location. Again, the user may select either a specific transmitting device, or an area or zone from which a specific transmitting device is to be allocated by the media handling device. The location within the event venue of each user device operating in the media transmit mode may be determined from one of a GPS position or other positioning facility determined by the user device and a seat or area identifier entered by the user or otherwise allocated to the user.
The event venue representation may comprise a first representation indicating the location of plural selectable areas within the event venue (top level representation, or overview). Selection of an area within the event venue representation may cause the display of a second representation (close-up of selected area) indicating the location within the selected area of user devices operating in the media transmit mode. In this example the top level representation permits the user to navigate to a location of interest within the event venue and the second representation permits the user to actually select a media source from which to receive footage. This is useful because it may be difficult to distinguish between and select individual media content sources on a top level representation.
The event venue representation may have a deeper hierarchy than this and comprise a multi-tiered hierarchy of representations. This may include a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area of user devices operating in the media transmit mode or a further intermediate level representation. This arrangement is particularly suitable for very large event venues, such as a racing circuit, where two levels of resolution may be insufficient to properly navigate the geography of the event venue and select individual media sources.
A user database of device users may be provided, each user having a user profile associated with one or more user devices, the user profile comprising one or more of a current location of a user device associated with the user, a specification for the one or more user devices and quality information indicating the image capture capability of the one or more user devices. The user database may comprise device information regarding at least some of the user devices, the device information indicating the media capture capabilities of the user device, wherein an indicator of media quality is displayed on the event venue representation in relation to at least some of the media sources, the indicator of media quality being based on the device information stored in the user database.
In one embodiment, if the user device transmitting the selected media source exits the media transmit mode, the event venue representation may be automatically presented on the first (viewing) user device to enable the selection of an alternative media source. In another embodiment, if the user device transmitting the selected media source exits the media transmit mode, an alternative media source is automatically selected by the media handling device (or the viewing user's device) based on its position relative to that of the originally selected media source; a sketch of this automatic reselection is given below.
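The second behaviour might be realised along the lines of the following minimal sketch, which reselects the still-active source positionally closest to the one that stopped transmitting; the venue-local coordinates and record layout are assumed purely for illustration.

    # Minimal sketch: automatic fallback when a media source stops transmitting.
    import math

    def fallback_source(lost, active):
        # Choose the active source closest to the lost source's position;
        # with none left, the event venue representation would be re-presented.
        if not active:
            return None
        return min(active, key=lambda s: math.dist(s["pos"], lost["pos"]))

    lost = {"id": "dev-991", "pos": (120.0, 44.0)}  # venue-local coordinates
    active = [{"id": "dev-412", "pos": (118.0, 46.0)},
              {"id": "dev-777", "pos": (20.0, 90.0)}]
    print(fallback_source(lost, active)["id"])  # -> dev-412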
According to another aspect of the present invention, there is provided a media content distribution method, comprising the steps of:
transmitting, from first user devices having a camera function and being in a media transmit mode, media data of an event occurring at an event venue to a media handling device;
selecting, at a second user device, a media source from among the media sources transmitting from the first user devices which are currently operating in the media transmit mode;
streaming the media data of the event being received from the selected one of the first user devices to the second user device.
According to another aspect of the present invention, there is provided a media handling device for receiving and distributing media data, the media handling device being arranged
to receive, from a plurality of user devices each having a camera function, media data transmitted by the user devices when operating in a media transmit mode, the received media data being of an event occurring at an event venue; and
to receive, from a user device operating in a media receive mode, a selection of a media source from among the user devices which are currently operating in the media transmit mode; and
to stream the media data of the event being received from the selected media source to the user device in the media receive mode which made the selection.
According to another aspect of the present invention, there is provided a user device having a camera function, the user device being operable
in a media transmit mode to transmit media data of an event occurring at an event venue to a media handling device; and
in a media receive mode to select a media source from among a plurality of media sources corresponding to other user devices which are currently operating in the media transmit mode and transmitting media data to the media handling device;
in the media receive mode to receive the media data of the event being received from the selected media source and streamed via the media handling device to the user device.
According to another aspect of the present invention, there is provided a media content distribution system, comprising:
a media handler for receiving, storing and distributing media data; and a plurality of camera devices, each camera device having a camera function and being operable to upload media data of an event occurring at an event venue to the media handler, the media handler being operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data; and
a playback device operable
to access and display media data stored by the media handler; to submit additional data for association with media data stored by the media handler, the media handler being operable to store the additional data in association with the media data; wherein
when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
This arrangement permits footage capturing users to upload footage they captured at an event, and users other than the uploading user to add supplemental content, for example text describing their own memories of the event or the scene captured by the footage, to be associated with the uploaded content, and to be viewable along with the uploaded content by subsequent viewers.
The playback device may be operable to display a user interface comprising an event venue representation indicating the locations within the event venue at which uploaded media data was captured; and
the playback device may be operable to receive a user selection of uploaded media data via the user interface, and to display the selected media data along with any additional data stored in association with the selected media data.
The playback device may be operable to select stored media data with which description data is to be associated using the event venue representation. In this way, a user may be able to find a media source close to the area from which they watched the event, which should provide the closest resemblance to the experience they themselves had at the event. The playback device may be a camera device which itself captured footage at the event. In other words, users present at the event can access footage of other users present at the event and swap memories and supplemental content. However, in some embodiments supplemental content may be added by users who were not at the event, but nonetheless have an interest in the event and the footage.
The camera devices may be operable to store media data locally during the event at the time of image capture, and to upload the media data to the media handler after the event. This prevents the media handler and related storage device from being overburdened with media content during the match and encourages users to be more selective about the material they upload. However, in the alternative users can selectively stream with or without storing.
The event venue representation may comprise a first representation indicating the location of plural selectable areas within the event venue, selection of an area within the event venue representation on the playback device causing the display of a second representation indicating the location within the selected area at which uploaded media data was captured.
Alternatively, the event venue representation may comprise a hierarchy of representations, a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area at which uploaded media data was captured or a further intermediate level representation.
The location within the event venue of each media source may be determined from one of a GPS position or other positioning service determined by the user device which generated the media source, and a seat or area identifier entered by the user of the device which generated the media source, or otherwise allocated to the user of the device.
An indicator of the media quality of at least some of the uploaded media data may be displayed on the event venue representation. In one example, a user database is provided which stores device information regarding at least some of the camera devices used to capture the media data stored at the media handler, the device information indicating the media capture capabilities of the user device, wherein the indicator of media quality displayed on the event venue representation is based on the device information stored in the user database. Alternatively, quality related information may be stored at the image capture device and transmitted to the media handling device along with the media content. The event venue representation may indicate the availability of different uploaded media data for different time periods or occurrences during the event, the event venue representation being navigable at the playback device with respect to time or occurrences to present the user with access to the different uploaded media data.
The playback devices may be operable to select a particular time period or key occurrence during the event, and the indication of media data shown on the event venue representation is updated to reflect uploaded media data corresponding to the selected time or key occurrence.
In one embodiment, an event play mode is provided in which the indications of media data are continuously updated with respect to a progression in time through the event.
According to another aspect of the invention, there is provided a media content distribution method, comprising the steps of:
uploading, from a plurality of camera devices to a media handler, media data of an event occurring at an event venue;
storing the uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data;
accessing and displaying media data at a playback device;
submitting from the playback device to the media handler additional data for association with an item of the stored media data, the playback device being a different device than the device used to capture and/or upload the media data; storing the additional data in association with the media data;
wherein when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
According to another aspect of the present invention, there is provided a media handler for receiving, storing and distributing media data, the media handler being operable to receive, from a plurality of camera devices each having a camera function, uploaded media data of an event occurring at an event venue;
to store the uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data;
to provide stored media data to a playback device;
to receive, from the playback device, additional data for association with media data stored by the media handler;
to store the additional data in association with the media data;
wherein when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
According to another aspect of the present invention, there is provided a playback device for accessing media data stored by a media handler; wherein the playback device is operable
to access and display media data stored by the media handler;
to submit additional data for association with media data stored by the media handler, the media handler being operable to store the additional data in association with the media data; wherein
when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
According to another aspect of the present invention, there is provided a media content distribution system, comprising:
a media handler for receiving and distributing media data; and
a plurality of camera devices, each camera device having a camera function and being operable to capture and transmit media data of an event occurring at an event venue to the media handler, the media handler being operable to stream the captured media data to device users or store the captured media data for later access; and a playback device operable to access streamed or stored media data of an event via the media handler and display the accessed media data to a user; wherein
the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
In this way, the user is presented only with access to media sources most relevant to him, improving the experience, simplifying access to relevant media sources (a simpler, less cluttered user interface could for example be provided due to the presence of fewer media sources) and reducing the amount of information required to be transmitted from the media handler to the playback device. These benefits apply equally whether the user is attempting to access live feeds during an event, or stored footage after the event.
It will be appreciated that the technique of providing restricted access to content may be applied equally to real-time streamed content, or the subsequent access to stored content. In the latter case, the media handler may be operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device at the time of capture of the uploaded media data.
The number of media data sources made available to the playback device may be dependent on how close the playback device is to the event venue. For example, the media handler may be operable to provide access to more media data sources for playback devices relatively closer to the event venue than for playback devices relatively further from the event venue.
The subset of media data made available to the playback device may be dependent on a current orientation of the playback device with respect to the event venue.
The playback device may be operable: to display a user interface comprising an event venue representation indicating the locations within the event venue at which media data was captured;
to receive a user selection of uploaded media data via the user interface; to present to the user the selected media data and any associated description data.
A playback device further than a predetermined distance from the event venue may be provided with access only to one or more exterior views of the event venue. For example, an exterior view of particular event venue may be displayed at the display device when the display device is aimed towards the geographical location of that event venue. The exterior views presented may correspond to the side of the event venue closest to the playback device, resulting in a telescope effect.
As an extension of this, by aiming their own device in different directions, external views of different event venues around the globe can be provided, with users being able to access media content captured at those event venues by, for example, tapping or clicking on the external view of the event venue of interest.
A playback device within or relatively nearby the event venue may be provided with access to media data captured at locations within the event venue closest to the current location of the playback device.
According to another aspect of the invention, there is provided a media content distribution method, comprising:
capturing media data of an event occurring at an event venue;
transmitting the captured media data to a media handler;
streaming the captured media data to device users or storing the captured media data for later access;
at a playback device, accessing streamed or stored media data of an event via the media handler and displaying the accessed media data to a user; wherein the playback device is restricted to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
According to another aspect of the present invention, there is provided a media handler for receiving and distributing media data, operable
to receive, from a plurality of camera devices each having a camera function, media data of an event occurring at an event venue;
to stream the captured media data to device users or store the captured media data for later access;
to provide the captured media data to a playback device for display to a user;
wherein the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
According to another aspect of the present invention, there is provided a playback device for accessing media data from a plurality of camera devices via a media handler, the camera devices capturing media data of an event occurring at an event venue, the playback device being operable to restrict access to only a subset of the media data being generated of the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
Other aspects of the present invention are also envisaged, and include a computer program, a media handler, a content distribution device and a user device.
It will be understood that each of the connect-a-fan (real-time content sharing between supporters at an event), collective memory (facility for users to attach their own text or media to content uploaded in relation to an event) and augmented reality (real-time or non-real time access of media content, restricted based on viewer location) concepts may be provided separately, or as different aspects of the same overall system. Certain optional features (for example the event venue representation) may be common to each of these.
Brief Description of the Drawings
Embodiments of the present invention will now be described with reference to the following drawings, in which:
Figure 1 schematically illustrates a media content distribution system;
Figure 2 schematically illustrates an event venue representation of a sports stadium;
Figure 3 schematically illustrates a seating area display within the event venue representation of Figure 2;
Figure 4 schematically illustrates a selectable timeline for navigating footage available at different times within the event venue representation;
Figure 5 schematically illustrates a set of selectable buttons for accessing footage at different times or in relation to different occurrences at the event;
Figure 6 schematically illustrates an event venue representation of a theatre hall;
Figure 7 schematically illustrates a simplified event venue representation of the theatre hall;
Figure 8 schematically illustrates a seating area within the simplified event venue representation of Figure 7;
Figure 9 schematically illustrates an event venue representation of a racing circuit;
Figure 10 schematically illustrates a trackside area within the event venue representation of Figure 9;
Figure 11 schematically illustrates a process by which viewers at an event are able to access live footage of the event captured by other viewers;
Figure 12 schematically illustrates a process by which viewers at an event are able to upload footage of the event along with related information, and other users are able to access the uploaded footage and add their own related information;
Figure 13 schematically illustrates an augmented reality system in which a user is able to see a representation of an event venue by directing a portable device in the direction of the geographical location of the event venue;
Figure 14 schematically illustrates a view of the event venue as provided for by the system of Figure 13;
Figure 15 schematically illustrates a process by which a user of the augmented reality system of Figure 13 is able to access live or stored footage of an event taking place at the event venue;
Figures 16A and 16B schematically illustrate how a different level of access to footage is available when the user is at different distances from the event venue; and
Figures 17A and 17B schematically illustrate how a different subset of cameras/views are available when the user is at different positions around the event venue.
Description of the Example Embodiments
CONNECT A FAN
One concept is called "Connect a fan", which at its core is about connecting supporters/fans in the entertainment industry by sharing the view of the show, match, concert, etc. as they experience it from their point of view, with supporters/fans that are not at the location - fan-zone spectators.
This experience will be shared through a mobile application (app), as well as via a website that will unite the concept, giving the possibility for fans across the world to experience the different scenarios that the system provides.
The app will be compatible with a number of electronic devices and platforms for example an iPad, an iPod, an iPhone, an Android phone, an Android Tablet, a Windows phone, a Blackberry device and computers.
HOW WILL IT WORK
The app will run in real time, so that someone in the venue, using the camera in their device, becomes a window for other fan-zone spectators around the world to connect and see, listen to and experience the event on their own device.
By creating this, the fan that is not in the venue gets to experience, from a specific seat or area, the way that the crowd experiences the show. The fan-zone spectator can choose from any of the cameras available in the venue; each fan inside the venue will agree to the terms and conditions to become a relay of this image.
Because of streaming speed requirements, the venue/client will have to make available a fast broadband connection, so that the fans inside the venue are able to stream the images and audio to the fan-zone spectators. It will be the fan-zone spectators' responsibility to have a fast broadband connection in order to receive the feed. This will be part of the terms and conditions.
As part of the enhanced experience, depending on the device, a "live switching" module may be available, in which the fan-zone spectator can decide on more than one camera at a time. There will be protection in place so that this module is not able to record, but only to switch between cameras.
For concerts and other types of shows in which the live sound is needed at a better quality, there will be an option to listen to a better feed of the audio or, in the case of a sporting event, of a live commentary.
Just as fan-zone spectators are able to see the show, game, etc., screens may be installed so that the faces of the fans that are watching through the fans in the venue may be seen, thus making them part of the global audience and letting organisers know how fans are reacting, adding to the experience of the show.
AUGMENTED REALITY
The next aspect of the experience is Augmented Reality and how it works for the different users of the app.
The first part of it means that by understanding the GPS and location in the world of the fan, it is possible to establish where the fan is relative to the stadium or venue. When the fan looks into the screen, there will be a telescopic view of the stadium/venue. The GPS will tell the fan where the stadium/venue is in relation to him/her, and instead of seeing the environment around him/her, the stadium/venue will appear on the screen, as if the fan were transported to the outside of the stadium/venue. Here the fan will see the facade of the stadium/venue and some historic moments of the team/artist. By clicking on the stadium/venue, the fan will have a clean view from that side as if it were a window into the stadium/venue. Here the fan will have the opportunity to select the camera and see what is currently happening inside the stadium/venue. This can connect to any of the cameras that have been placed inside the stadium, as long as it has the same angle of view. For instance, if coming from the southwest (SW), only the SW cameras will show.
This part of the app works according to the fan's place in the world: the closer the fan is to the stadium/venue, the more cameras the fan can access. As a result, if the fan is in the city, as he is circling the stadium/venue, he can see the different cameras and get all the points of view.
From the moment the system goes live in relation to a particular team, artist, venue or stadium, the cameras will become a journal of matches/events and important moments, recording these moments so that they can be transmitted while browsing the different cameras.
For example, if fans using the system are recording a football match, the goals can then be seen from all the different angles of the cameras, so a fan-zone spectator can see the goal from the SW or east (E) and see and hear how the fans experienced the moment.
COLLECTIVE MEMORY
Memories are recorded in our minds, although they become collective memory since they are experienced by many fans at the same time. In order to create a bigger experience, as fans "check in" to the venue, the system will know where they are sitting or seeing the match/show from (i.e. the pit, seat, stand, etc.), as their location is already known through its use for "Connect a Fan" while they are using their camera. They will have the option of taking a picture and/or writing a message about their experience, which, added to those of all the other fans, will then create a 360-degree picture of the match/show. Fans will then be able to go back to the app & website, and see the different pictures as well as read the different messages that fans have left for that day. For the fans that are watching through "Connect a Fan", it will be possible to tag and write a message according to who they were viewing, thus making a bigger network of experiences that have now become a collective memory.
SOCIAL MEDIA
As part of the creation of this experience, the system may have its own social media account, which will allow fans around the world to be closer to other fans and their clubs, bands, etc. It will be compatible with the already existing social media platforms (Twitter (RTM), Facebook (RTM), tumblr (RTM), etc.), with the difference that only members of the website will be allowed to trade points, images, videos and related uploads.
EXAMPLE - REAL MADRID
Facade and Stadium
The facade is designed to be a reflection of the passion and history of the club and its fans. The way it communicates and evolves enables the stadium, the heart of Real Madrid (RM), to become a living fan that transcends time and distance. The facade is 43 million pixels, making it by far the largest screen in the world. The idea of the lighting is to produce an ever-evolving sculpture, where the east and the west facades will run content that shows the history and passion of the club, while the north and the south will work more as atmospheric pieces.
On match day, all the facades will beautifully coagulate to transform into one massive screen, on which the names of fans that have checked in or arrived at the stadium will be projected.
This merging of all of the screens brings together all of the elements of Real Madrid - its fans, history and future - creating one screen, one community and one Real Madrid, of which the stadium is the heart.
East and West Screens
On non-match nights, the east and west screens will show content of former players, goals and historic moments. They will also be able to show sponsorships and brands that want to be associated with RM. All the content will be displayed in white so as to create a cohesive enthusiasm for the team colours (Los Blancos). The facade will be interactive with fans that are both close to and far away from the stadium, giving people the feeling that they are close to what's happening in the stadium. This allows the fans and the stadium to have a mutually influential bond: regardless of how close they are to one another, fans can still connect and send messages, and the stadium can send exclusive content and generate excitement and atmosphere. The opacity of the facade will vary depending on whether it's the day after a match or 3 days away, etc.; this will be part of the impact the stadium will have.
North & South Screens
On non-match days these screens will act as a sculpture based on James Turrell. The facade will not light up altogether, with some spaces lit and others not, thus giving an architectural feel to the stadium. Spaces will be lit at 40-50% and a second layer will be lit at 60%, thus giving depth to the space.
On match day the facade will be produced using louvers containing 24mm LED screens. These louvers will have three positions: closed, mid and open, providing different angles to the light. In the open position the entire screen becomes available; in the closed position only the middle two sections will be available; in the mid position they will act as a wash. Behind each of these louvers there will be two lamps that will help to light the internal space. These louvers will have independent motion/movement, which will be affected by the sun, temperature and various other sensors or factors. The old stadium will be lit with LED fixtures that will help accentuate the space while adding depth to the building.
Commercialising the facade
The facade could potentially be utilised to run advertisements, games and movies, making it the building that people go to visit when a special moment happens in the country, city or club.
As the facade/stadium becomes a distinct character in the fans' lives, one that delivers messages and connects with them, it will be possible to access them as consumers as well as fans.
App
The facade in combination with the app will allow for an ever-evolving building that connects to the fans and becomes the heart of the team and the city, making it a living human being.
The app is the tool that allows fans in all different locations to become part of the action. Instead of viewing distance as potentially negative, this app embraces the distance, interacting with RM fans all over the world in a way that evolves to show different parts of the history and passion that the fans have for the club, depending on location and proximity to the stadium.
The app gives fans from all over the world the opportunity to interact with the stadium and the content of the facade, other fans and connect to the match in an inspiring new way; through the eyes of the fans at the stadium.
Features
The app delivers an augmented reality (AR) 'fan finder' that allows fans to see the global audience simply by holding up their device. They can also hold up their device to view an exclusive AR player intro with videos, statistics, goals scored, etc. The app can be used as a guide to find their seat, the bar, a restaurant or a friend. As well as receiving a slice of the action on their device, fans can create a fan pixel, much like the devices in the London 2012 Olympics. This is where fan devices are all held in the air to create a giant screen on which messages and effects can be broadcast.
Fans will be able to unlock badges and collectables for checking in, participating in activities and using the app. These points will be scored and can be redeemed against digital goodies such as ringtones or skins/themes for their PCs and devices.
This new Real Madrid Experience brings the atmosphere, the joy and the community to any fan that uses the app, making the match and club feel tantalisingly close even if you are five thousand kilometres away.
User Example
USER 1: CARLOS - SEASON TICKET HOLDER
Day 1 - 2 days before the match:
• Receives notification about the match - and links to news items;
• Opens app to buy the digital program in advance, to read and get in the mood;
• Watches clips of last game;
• Posts event to Facebook to meet for a meal with friends after match.
Day 2 - Match Day:
• AM: notification of the match - team line up and travel news;
• Takes the app quiz on the way to the stadium - earns more star points;
• Messages his mates, whose GPS locations he can see, so they can meet up;
• Checks the bar queues and arranges to meet in the less busy area;
• He holds his device in the air and can see a giant image of Ronaldo performing tricks over the stadium;
• The stadium is pulsing with light and colour telling him the players are arriving;
• Arrives at the stadium, checks in and sees his avatar in the top 10% of players appearing on the video wall, earning him 10 more star points;
• Watches the on app view of players arriving, shows this to his friends while chatting outside;
• Goes into the stadium and uses the app to guide him to his seat; he watches the match and, with a cue from his device, holds it in the air and sees a huge message go out to celebrate a winning goal;
• He then accesses the live stream of dug-out/dressing room photos that are pushed to his phone, and he can see Mourinho chatting to his team;
• He takes photos and videos during the game, which are instantly shared to the 'fan cams';
• After the game he goes to the stadium store, buys an exclusive top only available to those with over 100 star points.
Day 3 - the day after:
• Views stats of the players - and the set of infographics about the match, which he can also see displayed on the stadium;
• He watches the highlights of the match and sees the comments and twitter feed from his friends around the game;
• Highlights are interspersed with exclusive photos from behind the scenes;
• He rates the players and then submits questions to Ronaldo to be answered in a few days' time.
Referring to Figure 1, a media content distribution system 1 is schematically illustrated. The top portion of Figure 1 shows elements of the system 1 which are disposed inside an event venue 5, while the bottom portion of Figure 1 shows elements of the system 1 which are disposed outside the event venue 5. The event venue in the present case is a football stadium, but it will be appreciated that it could instead be a cricket ground, a race track, a theatre or any other event venue, sporting or otherwise. Within the event venue, visitors are using personal electronic devices 62a, 62b, which may be mobile telephones, iPods, iPads or camera devices, to capture footage of the event, in this case the football match. The footage captured by the devices 62a, 62b is transmitted via a wireless hub 20 to a media content distribution device 10. It will therefore be understood that the personal electronic devices will be required to have a wireless capability. The footage is provided by the transmitting personal electronic device in association with the user ID of the user, a device ID indicating the type or identity of the transmitting device, and in some embodiments the location of the user (e.g. seat identifier or GPS location). Some or all of the footage may optionally be provided to an external content store 70, for access by users at a later time (for example subsequently to the event).
The media content distribution device 10 is operable to stream the received footage to another device on request. The requesting device may be a device 62c, also present within the event venue, one of devices 64a, 64b immediately outside the event venue, or one of devices 66a, 66b some distance away from the event venue. In the case of the devices 62c, 64a and 64b, these are all within range of the wireless hub 20, and therefore the footage may be streamed from the media content distribution device 10 to these devices via the wireless hub 20. This permits visitors to the event venue (inside or outside) to receive footage reliably, at high quality, and in real-time. The devices 66a and 66b are outside the range of the wireless hub, and are therefore required to access the footage via a telecommunications network 50 and the Internet 30.
Each of the devices 62a, 62b, 62c, 64a, 64b, 66a, 66b has installed thereon the app described above, and is subscribed to the service. The app may cause the personal electronic device to register itself with the media handling device when the personal electronic device enters the event venue, or the vicinity thereof. This registration may take place automatically (based for example on GPS location, or by detection of the presence of the wireless hub 20), or when the app is launched. Alternatively, there may be a registration function selectable by the user, which causes the device to register itself and the user as present at the event venue.
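Purely by way of illustration, the following Python sketch shows one way the automatic registration decision described above might be taken; the venue coordinates, vicinity radius, hub identifier and function names are assumptions made for this example only.

```python
import math

VENUE_LOCATION = (40.4531, -3.6883)  # hypothetical venue latitude/longitude
VENUE_RADIUS_KM = 0.5                # hypothetical "vicinity" threshold
HUB_SSID = "venue-hub-20"            # hypothetical identifier for wireless hub 20

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def should_auto_register(gps_position, visible_ssids):
    """Register the device when it is near the venue or can see the venue hub."""
    near_venue = distance_km(gps_position, VENUE_LOCATION) <= VENUE_RADIUS_KM
    hub_detected = HUB_SSID in visible_ssids
    return near_venue or hub_detected
```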
The requesting device may actively select a particular media source to watch. While this selection could be made from a list for example, preferably the selection is made using an event venue representation which permits the user of the requesting device to see where within the event venue currently transmitting media sources are present. This may be in the form of a plan or isometric view of the event venue, in which selectable media sources are visibly presented (for example by flashing icons or highlighted portions). The user of the requesting device is able to select one of these media sources using the event venue representation, for example by clicking or tapping on an icon/highlighted portion. As a result, if the user of the requesting device would like to watch a football game from a location near the goal for example, a media source from this area can be selected. Whether or not an event venue representation is used, a media source may be chosen either by the user specifically selecting a particular transmitting media source, or by the user selecting an area or zone within the event venue, from which a specific media source is allocated by the media handling device. In this way, the user is able to choose from where they would like to be able to watch the event, while the media handling device is able to take care of choosing a specific device - for example taking into account image capture quality, or favoured image capturing users.
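The allocation of a specific device within a user-selected zone might, purely as an illustrative sketch, look as follows; the dictionary keys and the favourite-then-quality ordering are assumptions for this example.

```python
def pick_source_for_zone(sources, zone, favourites=frozenset()):
    """Allocate one transmitting source within the requested zone, preferring
    favourite capturing users first and higher capture quality second."""
    candidates = [s for s in sources if s["zone"] == zone]
    if not candidates:
        return None  # no transmitting source currently in this zone
    return max(candidates,
               key=lambda s: (s["user_id"] in favourites, s["quality"]))
```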
The event venue representation is populated with media sources based on the currently received footage from transmitting devices, the location (seat identifier or GPS location) of the users/transmitting devices, and optionally a quality indicator providing some indication of the quality of the footage in terms of resolution/frame rate etc. The quality of the footage may be determined from a subscriber database, as will be explained below.
A subscriber database 40 is provided which stores details of users of the service, as well as details of any devices which those users have registered to the service. The database may use the following fields:
• User ID (unique)
• User name
• Password
• Current GPS location (or position determined by an alternative positioning service or feature)
• Current seat position
• Device 1 ID
o Device 1 Specification
• Device 2 ID
o Device 2 Specification
• Accrued credit (e.g. Star points)
The User ID is a unique identifier for the user. The User name is also stored. Access to the service may be password protected, and accordingly the database stores a password to be checked against a password entered by the user (or automatically entered by an app on the user's device) when the user accesses the service using their device. The current GPS location may optionally be stored in the database, but might instead be provided directly to the media content distribution device 10 along with the footage. Similarly, the current seat location may optionally be stored in the database, but again might instead be stored on the user's device and provided directly to the media content distribution device along with the footage. Each device which a user registers to the service is also identified (by a Device ID) in the database in association with the user. Each device may have an associated device specification entry indicating relevant video image capture characteristics of the device. For example, the frame rate, resolution or optical characteristics of the device could be stored, or alternatively a "quality rating" could be tagged against the device specification entry. Finally, accrued credit, for example the star rating, for the user is stored. This credit may be spent or used to access certain features of the service.
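The subscriber record described above could be modelled as follows; this is a minimal sketch, and the field names and types (including storing a password hash rather than a plain password) are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DeviceRecord:
    device_id: str
    specification: dict                    # e.g. {"resolution": "1080p", "fps": 30}
    quality_rating: Optional[int] = None   # optional pre-computed "quality rating"

@dataclass
class SubscriberRecord:
    user_id: str                           # unique identifier for the user
    user_name: str
    password_hash: str                     # assumed hashed rather than stored in the clear
    gps_location: Optional[tuple] = None   # current (lat, lon), if held in the database
    seat_position: Optional[str] = None    # current seat, e.g. "D1-A7"
    devices: list = field(default_factory=list)  # registered DeviceRecord entries
    star_points: int = 0                   # accrued credit (e.g. star points)
```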
When the media content distribution device 10 receives footage from a device/user, a user ID and device ID may also be provided, permitting the media content distribution device 10 to obtain the quality rating for the device from the subscriber database, or to determine the probable quality of the footage by referring to the subscriber database using the user ID and device ID and accessing the associated device specification. The location of the user providing the footage may be based on a GPS position - either provided directly to the media content distribution device 10, or provided first to the database 40 and then accessed from there. Alternatively, the location of the user providing the footage may be based on a seat position associated with the user. The seat position may be determined by pre-allocating the seat to the user at the time of purchase, or the time of entry into the event venue, and recording the seat allocation in association with the user on the database 40. Alternatively, the user may enter his seat number into the app on his portable electronic device (on purchase, or while at the event venue for example), which will in turn provide this information to the database via the Internet. Again, the seat position may instead be stored on the user's device and provided directly to the media content distribution device along with the footage. In any of these cases, the position information is used to set the position of the media source on the event venue representation.
Referring to Figure 2, an example event venue representation for a football stadium is schematically illustrated. The football pitch is provided in the centre of the figure, while the various seating areas are shown around the outside of the pitch, designated by alphanumeric codes. Optionally the position of the viewing user could be represented on the event venue representation, for example by an icon (for example the user's avatar) or a flashing dot, permitting the user to identify their own position at the event, and the position of other media sources relative to themselves. The view of Figure 2 is the top level event venue representation showing the whole event venue. Each seating area shown in the top level representation of Figure 2 corresponds to an area of seating, such as that shown schematically in Figure 3. Any seating area shown in the top level representation within which a currently transmitting media content source (that is, a user/device currently transmitting footage to the media content distribution device 10) is present may be highlighted to indicate the presence of a transmitting source to a viewer of the event venue representation. Any seating area within which a currently transmitting media content source is present can be selected, resulting in the display switching to show a close up (zoomed) view of that seating area. A close up of the seating area D1 of Figure 2 is shown in Figure 3. As can be seen in Figure 3, seat rows A to G form part of the seating area D1, each row having either 15 or 16 seats. It can also be seen that certain seats within the zoomed view of the seating area D1 are highlighted - this includes seats A2, A7, B12, C5, C10, D14, D15, E4, E9, F3, G6 and G13. These highlighted seats can be selected, for example by clicking or tapping on them, to access the footage being transmitted from the users/devices associated with these seats.
It will be appreciated that the event venue representation is a real-time representation of where footage is available from at any given time. Accordingly, seating areas may become highlighted and cease to be highlighted throughout the event, as users turn the capture and transmit functions of their devices on and off.
It will be understood that streamed footage from a given device may suddenly terminate when the user of that device decides to stop capturing and streaming the footage. In this case, one of a number of things may happen. In one example, the media content distribution device may intelligently and smoothly switch to streaming footage from a different image capture device nearby to the image capture device which has ceased to transmit footage. In this way the viewing user should continue to experience a similar view of the event. In another example, termination of a footage stream may cause the event venue representation to be displayed, with the viewing user being prompted to select another source.
In some cases the media content distribution device may switch streaming from a first device to a second device even if the first device does not discontinue its transmission. One example is where the second device is in a similar location to the first device but has higher quality image capture characteristics. Another example might be if a "favourite" source of the viewing user (e.g. a friend) starts to transmit footage - in this case the viewing user's preferences might dictate that transmissions from certain users always take precedence, or take precedence over other nearby users and devices.
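A minimal sketch of this switching logic, under the assumption that each source is a dictionary carrying its seating area, user and quality, might be:

```python
def choose_replacement(current, active_sources, favourites):
    """Pick the stream a viewer should be switched to when the current source
    stops or a preferred source appears; None means show the venue map."""
    nearby = [s for s in active_sources
              if s["seat_area"] == current["seat_area"] and s is not current]
    preferred = [s for s in nearby if s["user_id"] in favourites]
    if preferred:  # favourite users take precedence over other nearby sources
        return max(preferred, key=lambda s: s["quality"])
    if nearby:     # otherwise fall back to the best nearby stream
        return max(nearby, key=lambda s: s["quality"])
    return None    # prompt the viewer with the event venue representation
```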
The quality of the devices providing the footage may be indicated for example by having different coloured highlighting for seats associated with footage of different quality. It will be appreciated that other visual indicators could instead be provided.
"Favourite" sources may be distinguished in the event venue representation from other sources, either by a different colour, or an annotation or any other visual indication. This enables a viewing user to preferentially select footage from their friends, from participating celebrities at the event, or from other users whom they have found to capture event footage in a manner which they like.
In another example, at a sporting event media sources could be tagged (in the database for example) as being associated with a particular team. An indication of the team could then be visually identified on the event venue representation, permitting the user to select media sources associated with a particular team. Either as an alternative or an adjunct to providing visual representations of e.g. favourite sources, the quality of sources or a team association of sources, filtering could be provided, permitting the user to view only favourite sources, high quality sources or sources associated with a particular team, for example.
The event venue representation may indicate a direction in which the transmitting image capture devices are facing, giving the user an idea of what the associated footage will contain. This indication could be provided by a directional arrow originating at the location of the media source and pointing in the direction which the image capturing device is pointing. This feature would require the image capture device to transmit an indication of its current facing to the media handling device along with the footage. Hardware facilitating the self-determination of the orientation of personal devices such as smartphones is readily available, and can be used to provide this function.
In some embodiments relating to sporting events such as football, the direction of facing of image capture devices may also be used in combination with techniques such as goal line tracking or ball location tracking to permit the auto-selection or filtering of camera devices directed at a specific area of interest at the event. In other words, the media handling device may compare a direction of facing of each camera device at a particular time with the location of (for example) the ball at that same time to determine whether the ball is within the field of view of the camera device, or preferably in the centre of the field of view of the camera device. A similar principle could be applied to other objects of interest in other contexts, for example the location of racing cars on a racing circuit.
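One possible (purely illustrative) geometric test for this, assuming planar pitch coordinates and a device bearing measured clockwise from the +y axis, is sketched below; the field-of-view angle and data layout are assumptions.

```python
import math

def bearing_to(point_from, point_to):
    """Bearing from one (x, y) position to another, clockwise from +y, in degrees."""
    dx, dy = point_to[0] - point_from[0], point_to[1] - point_from[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def ball_in_view(camera_pos, camera_bearing, ball_pos, fov_deg=60.0):
    """True if the ball lies within the camera's assumed horizontal field of view."""
    offset = abs((bearing_to(camera_pos, ball_pos) - camera_bearing + 180) % 360 - 180)
    return offset <= fov_deg / 2

def rank_cameras_on_ball(cameras, ball_pos):
    """Order in-view cameras so those with the ball nearest centre-frame come first."""
    in_view = [c for c in cameras if ball_in_view(c["pos"], c["bearing"], ball_pos)]
    return sorted(in_view, key=lambda c: abs(
        (bearing_to(c["pos"], ball_pos) - c["bearing"] + 180) % 360 - 180))
```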
In addition to providing the location of user-originating media sources, the event venue representation may also indicate the location of fixed cameras installed within the event venue, or mobile cameras utilised by professional cameramen at the event. These media sources may also be selected by users for playback on their personal device via the event venue representation.
When footage is captured, it may simply be streamed (only stored temporarily) from the image capture device to the viewing device via the media content distribution centre, or it may be stored at the content store 70 in addition to being streamed to requesting users. If it is stored in the content store 70 then it can also be accessed at a later time. In one example, most footage is streamed, but some footage is marked by the user capturing that footage as being for permanent storage. This footage, when transmitted to the media content distribution device, is both streamed to any requesting users and also stored in the content store 70. In another example, at least some of the footage being transmitted to the media content distribution device is stored at the image capture device (but not at or by the media distribution device) at the time of capture, and is then uploaded to the content store 70 at a later time, for example after the event has finished. Each user may only be permitted to store a limited amount of footage (in terms of one or both of data capacity and duration), in order that the storage facility of the content store is not overwhelmed. In contrast, in some cases a user may be encouraged or prompted to store footage, for example if there are too few media sources in their area - with this being rewarded by extra points.
Footage stored in the content store 70 is stored in association with the location (e.g. GPS location or seat location, or derived location) from which the footage was captured, the identity of the user who captured the footage, the quality of the footage (e.g. characteristics of the image capture device), and the time of capture of the footage. This information may be provided from the image capture device to the media content distribution device or from the subscriber database, or a combination of the two. For example, the time of capture, location and footage may be provided by the image capture device along with the user ID and device ID, and the user ID and device ID may be used to obtain the quality information from the subscriber database. Alternatively, the quality information may be stored at the image capture device and provided to the media content distribution device along with the footage and other data. Rather than storing a specific GPS location, the GPS location may be converted by the media content distribution device into a location within the event venue. This derived location might be a seat position, or alternatively a seating area (e.g. D1) or a zone within a seating area. This will be sufficient for the event venue representation to highlight an appropriate position for the footage. The uploading user may also store additional content, for example text describing their personal memories of the event, or photographs or sound clips captured during the event.
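The derivation of a venue location from a GPS fix, and the bundle of information stored alongside each item of footage, might be sketched as follows; the bounding-box zone model and field names are assumptions for illustration.

```python
def derive_venue_location(gps, seating_zones):
    """Map a raw GPS fix to a seating area code (e.g. "D1"), or None if the
    fix falls outside every zone; seating_zones maps area codes to
    (min_lat, min_lon, max_lat, max_lon) bounding boxes."""
    lat, lon = gps
    for area, (lat0, lon0, lat1, lon1) in seating_zones.items():
        if lat0 <= lat <= lat1 and lon0 <= lon <= lon1:
            return area
    return None

def make_stored_record(footage_ref, user_id, device_id, location, captured_at,
                       quality, extras=None):
    """Bundle the footage with everything the content store keeps with it."""
    return {
        "footage": footage_ref,
        "user_id": user_id,
        "device_id": device_id,
        "location": location,       # GPS fix, seat position or derived area
        "captured_at": captured_at,
        "quality": quality,
        "extras": extras or [],     # text memories, photographs, sound clips
    }
```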
Subsequent to the upload of footage and any related content to the content store 70, other users are able to access the footage via the service. Access may be via the app, or via a website. Users are provided with a non-real-time event venue representation. This is similar to the event venue representation described above, but it provides navigation tools for navigating the event venue representation with respect to time, occurrences, or periods within the event. For example, an interactive timeline may be provided on the user interface as schematically illustrated in Figure 4. In Figure 4, a timeline 100 for a football match is shown. The timeline 100 is broken down into periods, these being pre-match, first half, half time, second half, extra time and post-match. It will be appreciated that this is only one example breakdown of a football match. It will be further appreciated that a similar principle applies to other sporting and non-sporting events. For example, motor racing could be broken down into laps, and cricket matches into overs. A slider button 110 is provided which can be dragged along the timeline to access content associated with any given moment in time. When the slider button 110 is dragged to a particular position on the timeline 100, the event venue representation is updated to highlight those media sources for which stored content captured at the time indicated by the slider button 110 is available. In an alternative embodiment illustrated schematically in Figure 5, a series of buttons are provided for selecting different periods during the event. When a button is selected, the event venue representation is updated to highlight those media sources for which stored content captured during the period indicated by the selected button is available. It will be appreciated that multiple buttons could be selected, resulting in media sources which captured footage during some or all of the selected periods being highlighted. It will further be appreciated that the interactive timeline of Figure 4 and the period selection buttons of Figure 5 could be provided in combination, permitting a user to access content either from a particular moment in time, or during a particular period.
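Filtering stored sources against the slider time or the selected period buttons could, as a minimal sketch (the period boundaries and clip layout are assumptions), be implemented as:

```python
PERIODS = {  # assumed period boundaries, in match minutes
    "pre-match": (-30, 0), "first half": (0, 45), "half time": (45, 60),
    "second half": (60, 105), "extra time": (105, 135), "post-match": (135, 180),
}

def sources_at_time(clips, minute):
    """Sources with stored content covering the instant picked on the slider."""
    return {c["source"] for c in clips if c["start"] <= minute <= c["end"]}

def sources_in_periods(clips, selected_periods):
    """Sources with content overlapping any selected period button."""
    hits = set()
    for name in selected_periods:
        lo, hi = PERIODS[name]
        hits |= {c["source"] for c in clips if c["start"] < hi and c["end"] > lo}
    return hits
```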
Also in Figure 5, several thematic buttons are shown, these being a "Goals!" button, a "Saves!" button and a "Fouls!" button. These buttons can be used to access footage relating to each of these themes. The footage associated with each theme may be identified in the database with tags - manually entered by the uploading user for example, or by associating with a theme footage captured at a time during the event at which an occurrence corresponding to that theme was known to have taken place. This could be determined automatically by the system based on event-related information entered by the system operator. For example, the system operator may associate with an event a dataset indicating times during the event of various key occurrences, such as goals, saves or fouls for example, resulting in a list of time instants or periods at which each given event type occurs. An example is shown in the table below. Based on this information, the selection of "goals" by the viewing user will cause the event venue representation to be populated with identifiers of the availability of media sources which were being captured at the times 17m32s, 27m02s and 91m11s.
[Table: occurrence types mapped to times during the event - Goals: 17m32s, 27m02s, 91m11s; Saves and Fouls: corresponding times as entered by the system operator]
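Selection of a theme could then be resolved against the operator's occurrence dataset along the following lines; the goal times are those of the example above, while the save and foul entries, the clip layout and the matching window are assumptions.

```python
OCCURRENCES = {  # occurrence times in seconds from kick-off
    "goals": [17 * 60 + 32, 27 * 60 + 2, 91 * 60 + 11],  # 17m32s, 27m02s, 91m11s
    "saves": [],  # times not given in the example
    "fouls": [],
}

def sources_for_theme(clips, theme, window_s=30):
    """Sources that were capturing within window_s seconds of any occurrence
    of the selected theme (e.g. the "Goals!" button)."""
    hits = set()
    for t in OCCURRENCES.get(theme, []):
        hits |= {c["source"] for c in clips
                 if c["start_s"] <= t + window_s and c["end_s"] >= t - window_s}
    return hits
```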
Once the user has navigated temporally (using the slider or buttons) and spatially (using the event venue representation) and has identified a content source he is interested in viewing, he is able to select that content source, whereupon playback of the footage commences. Playback may start from the beginning of the clip, or from the point in time selected by the user or derived from the user's input.
In addition to watching the stored footage, the viewing user is also able to attach text, sound clips or images to the footage, which are then accessible to other users subsequently accessing that item of footage. In particular, the attached text, sound or image data is then stored at the content store 70 in association with the footage itself. In this way, a viewing user is able to associate their own memories of the event with footage captured by someone else. Other viewers accessing the same footage at a subsequent time will have access not only to the footage and related content uploaded by the user who captured the footage, but also to any additional materials uploaded in association with the footage by previous viewers.
Referring now to Figure 6, an alternative event venue, in this case a theatre hall, is schematically illustrated. It can be seen that Figure 6 shows a seating plan for a theatre hall, and this may constitute an event venue representation provided to a user wishing to access footage of the event (either in real-time or in relation to a past event, in each case as described above). It can be seen from Figure 6 that certain seats are shaded - these seats represent the location of available media sources within the theatre hall. In the context of real-time (during event) viewing, these shaded seats represent the location of currently transmitting image capture devices, while in the context of viewing a historical (ended) event, the shaded seats represent the location of media sources which captured data at a particular selected time, period or occurrence within the event. It will be appreciated that the detail present in Figure 6 may be inappropriate for a small hand held device. In this regard, a simplified top level event venue representation such as that shown in Figure 7 may be used instead.
In Figure 7, a simplified view of the theatre hall is provided, broken down into multiple selectable seating areas 210, 220, 230, 240, 250, 260 and 270. A user is able to select one of these seating areas which is of interest, resulting in a close up view of that seating area being shown. For example, selecting the seating area 260 in Figure 7 may result in the close up of Figure 8 being presented to the user. In Figure 8, three blocks of seating are shown, with some seats within each block being shaded to represent that footage is available in relation to each of those seats. The user is able to select these shaded seats to gain access to the footage associated with that location. In particular, the selection of a shaded seat would trigger the streaming of footage from an image capture device in the case of real-time operation (connect-a-fan) or the playback of stored footage in the case of viewing after the event.
Referring to Figure 9, another alternative event venue is schematically illustrated, in this case a motor racing circuit. It can be seen that Figure 9 shows a plan view of a racing track, and this may constitute an event venue representation provided to a user wishing to access footage of the event (either in real-time or in relation to a past event, in each case as described above). The event venue representation is marked with 20 zones, each of which is selectable to gain access to media content associated with that zone. In some cases, selection of a zone may trigger the playback of a random or best available (based on quality information for example) item of footage associated geographically with that zone. Alternatively, the selection of a zone may result in a list of available content sources being presented to the user, or a close up view of that zone from which a user can select a desired media source or even a further closer view. It will be understood that where an event venue is particularly large, a hierarchical multi-level event venue representation may be required in order to provide effective access to media sources. Figure 10 is an example of a close up view within the event venue representation of Figure 9. In Figure 10, an indication of the location of the track is provided, along with an indication of the direction in which cars are travelling (in this case provided by the car representations and related arrows). Various seating and standing areas are shown in Figure 10, identified by numbering. These areas may be selectable to obtain access to more detailed seating plans similar to that shown in Figure 8, or the location of available media sources may be indicated by icons or highlighting directly on the representation of Figure 10. Again, such media sources are selectable in order to initiate playback of footage relating to these.
Referring back to Figure 9, it will be understood that users capturing footage of the race may be positioned all around the track at many or all of the 20 identified locations. A particular viewer will be located only at a single location at any given time. However, that viewer may wish to track the action at pole position throughout a lap of the race. The user can achieve this by selecting appropriate media sources around the track as the car in pole position progresses. In one embodiment, the system is operable to sequentially select media sources at different positions around the track to follow the progress of a particular car, for example the car in pole position or a car selected by the viewing user. The system in this case is able to monitor the location of the selected car (for example based on GPS trackers affixed to each car), and repeatedly switch the media source presented to the viewing user such that the selected car is always likely to be in view. The media sources selected could be those closest to the current position of the selected car, and/or those oriented towards the current position of the selected car (based on internal sensors of the image capture devices). In one embodiment the event venue representation identifies where key events are happening, for example the current position of the car in pole position, the location of a crash or other incident, or a pit stop. In addition, on-car cameras may also be provided, and may again be accessible to a user using the event venue representation.
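A minimal sketch of this car-following switching, assuming each source carries a GPS fix and that comparing squared coordinate differences is adequate at track scale, might be:

```python
def follow_car(car_gps, active_sources):
    """Pick the media source most likely to have the tracked car in shot:
    the transmitting source nearest the car's current GPS fix."""
    def squared_distance(source):
        dx = car_gps[0] - source["gps"][0]
        dy = car_gps[1] - source["gps"][1]
        return dx * dx + dy * dy
    return min(active_sources, key=squared_distance) if active_sources else None

# Called on each update from the car's GPS tracker; whenever the returned
# source differs from the one currently playing, the media handling device
# switches the viewer's stream (orientation could additionally be checked,
# as in the ball-tracking sketch above).
```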
In both the theatre hall and racing track embodiments, the event venue representations can be navigated after the event both with respect to location and with respect to time/occurrence in like manner to the football stadium embodiment described above.
Referring to Figure 11, a schematic flow diagram of the connect-a-fan method is shown. At a step S1, an image capture device, in this case device 62a, transmits footage to the media content distribution device 10 via the hub 20 (not shown). The footage received also includes metadata such as the user ID and device ID of the user of the image capture device and the device itself respectively, and optionally location information (for example GPS coordinates) and information regarding the current orientation of the image capture device, if this is not to be retrieved from the database 40. At a step S2, the media content distribution device 10 obtains information about the user of the image capture device and the device itself from the database 40. The obtained information may include quality information about the image capture device 62a (based on the device ID) and in some cases the location of the user where this has been stored in advance - as might be the case for allocated seating associated with tickets booked via the app or seat numbers entered into the app and uploaded to the database 40. Based on this information (and based on similar information obtained in relation to other footage received from other image capture devices) a data structure representing the location and optionally quality of available footage streams is formed. Optionally at a step S3, at least some of the received footage is stored in the content store 70 along with the identity of the user which provided the footage, the time of capture, the location from which the footage was taken, and quality information about the device which generated the footage. At a step S4, a viewer device 62c requests access to an event venue representation displaying the location (and optionally quality) of footage streams which it is able to currently access. In return, an event venue representation is generated based on the data structure and passed back to the viewer device 62c at a step S5. At a step S6, the user of the viewer device 62c selects desired footage using the event venue representation, consequently sending a request for this footage to the media content distribution device. At a step S7, the requested footage is provided (streamed) from the media content distribution device 10 to the viewer device 62c.
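The S1 to S7 flow may be easier to follow as a toy model; the database layout and method names below are illustrative assumptions only, not part of the described system.

```python
class MediaDistributionDevice:
    """Toy model of the S1-S7 flow: ingest streams, answer requests for the
    event venue representation, and route a selected stream to a viewer."""

    def __init__(self, subscriber_db):
        self.db = subscriber_db   # assumed: user_id -> profile dictionary
        self.active = {}          # source_id -> metadata for live streams

    def on_footage(self, source_id, user_id, device_id, gps=None):
        # Steps S1-S2: receive footage plus metadata, enrich from the database.
        profile = self.db[user_id]
        self.active[source_id] = {
            "location": gps or profile.get("seat_position"),
            "quality": profile["devices"][device_id]["quality"],
        }

    def venue_representation(self):
        # Steps S4-S5: describe currently available streams for the viewer map.
        return [(sid, m["location"], m["quality"]) for sid, m in self.active.items()]

    def request_stream(self, source_id):
        # Steps S6-S7: stand-in for starting the relay of the chosen stream.
        return self.active.get(source_id)
```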
Referring to Figure 12, a schematic flow diagram of the collective memory method is shown. At a step U1, captured content is uploaded from an image capture device 62a to the content store 70. The captured content is uploaded in association with the time of capture, the user ID of the user and the device ID of the image capture device. The location of the user may also be uploaded. This step may take place at the time of capture during an event, or subsequently. At a step U2, the content store 70 obtains information about the user of the image capture device and the device itself from the database 40. The obtained information may include quality information about the image capture device 62a (based on the device ID) and in some cases the location of the user where this has been stored in advance - as might be the case for allocated seating associated with tickets booked via the app or seat numbers entered into the app and uploaded to the database 40. The captured content and the other uploaded and acquired information referred to above is stored in association in the content store. In addition, the upload step U1 may include the association of additional text, sound or image content with the captured content and the upload of this additional content to the content store for storage.
At a step U3, a user requests an event venue representation from the content store 70. This step may be triggered by the user opening the app, tapping an icon or an external view representing the event venue and/or event, or similar.
At a step U4, the content store 70 generates an event venue representation based on the uploaded content corresponding to a particular event at a particular venue. The event venue representation is populated with visual indications of content which is available at particular times during the event and at particular locations within the venue. The locations and times for each item of content are derivable from the time of capture information and location information stored at the content store 70 in association with the footage. The event venue representation may also indicate whether a particular item of content has additional information ("memories") associated with it, for example in the form of text-based descriptions of recollections of the event, or audio and/or images associated with the uploading user's or another user's memory of the event. This indication may be given visually, for example by colour coding, icon shaping or any other technique.
At a step U5, the viewer uses the event venue representation to identify and select a desired item of content, whereupon this item of content is requested from the content store 70. Then, at a step U6 the content store 70 provides the requested item of content to the viewer, along with any additional information ("memories") associated with that item of content. The viewer is then able to watch the content, and examine the additional information. Then, at a step U7 the user is able to upload their own additional information ("memories"), in the form of text, audio or image, to be associated with the item of content. The viewer's additional information will then be accessible to subsequent viewers accessing the item of content.
It will be appreciated that, while the content store has been described, for simplicity, in Figure 12 as including the logic for receiving, handling and providing content, and the logic for generating and providing an event venue representation on demand, a separate entity may be provided for this purpose in some embodiments.
Referring to Figure 13, an augmented reality system for accessing media content is schematically illustrated. In Figure 13, a user 310 possessing an electronic device 315 is able to view an augmented reality image of various event venues 320, 325, 330, 335 by aiming his electronic device 315 at the geographical location of one of the event venues, wherever in the world those event venues are. In the present case the event venues 320 and 335 are sports stadiums, the event venue 325 is a theatre hall and the event venue 330 is a racing circuit. Each of the event venues in the example of Figure 13 is a different distance from the user 310. In particular, the event venue 320 is 4700km from the user, the event venue 325 is 179km from the user, the event venue 330 is 23km from the user, and the event venue 335 is 1km from the user. Figure 14 schematically illustrates the electronic device 315 of Figure 13 showing an augmented reality image of the event venue 320, at which the electronic device 315 is aimed. For event venues beyond a certain predetermined range of the viewing user, this may be the only content available to the user. For closer event venues, access to additional content may be provided, for example by clicking or tapping on the external view.
Referring to Figure 15, a schematic flow diagram of the augmented reality method is shown. At a step W1, footage is captured and transmitted from image capture devices to a media distribution device. The image capture devices may be portable electronic devices of users at the event venue, or fixed or portable camera devices associated with the event venue itself. The footage received also includes metadata such as the user ID and device ID of the user of the image capture device and the device itself respectively, and optionally location information (for example GPS coordinates). Where image capture devices associated with the event venue are used, a database may store a predetermined location for each of these devices. At a step W2, data associated with the captured footage is obtained from the database. The obtained information may include quality information about the image capture device (based on the device ID) and in some cases the location of the transmitting user where this has been stored in advance. Based on this information (and based on similar information obtained in relation to other footage received from other image capture devices) a data structure representing the location and optionally quality of available footage streams is formed. Optionally at a step W3, at least some of the received footage is stored in the content store 70 along with the identity of the user which provided the footage, the time of capture, the location from which the footage was taken, and quality information about the device which generated the footage. At a step W4, a viewer device 62c requests access to an event venue representation displaying the location (and optionally quality) of footage streams which it is able to currently access. In return, an event venue representation is generated based on the data structure and passed back to the viewer device at a step W5. It will be appreciated that the data structure, and thus the event venue representation, includes only the media sources which the viewer device is permitted to access based on its location relative to the event venue.
At a step W6, the user of the viewer device selects desired footage using the event venue representation, consequently sending a request for this footage to the media content distribution device. If the desired footage is stored (non-real-time) footage, then at a step W7 a request for the footage is sent to the content store, and returned to the media distribution device at a step W8. The retrieved footage is then provided to the viewer device at a step W9. If on the other hand the footage is real-time footage then the steps W7 and W8 are not required and instead footage from the selected source is streamed from the capture device to the viewer device at the step W9.

Referring to Figures 16A and 16B, an access restriction based on distance from the event venue referred to above is schematically illustrated. In Figure 16A, a viewer device 620 is shown at a relatively large distance from the event venue 610. At this distance, the viewer device 620 is only able to access a single internal camera 630 at the event venue 610. It will be appreciated that at an even greater distance the viewer device 620 may not be able to access any internal cameras at all, limiting the display to an exterior view of the event venue as per Figure 14. In Figure 16B, the viewer device 620 is much closer to the event venue 610. In this case, the viewer device 620 is not only able to access the internal camera 630, but also additional cameras 640a, 640b, 640c, 640d. In other words, as the viewer device 620 approaches the event venue 610, more internal cameras are available to view. The cameras 630, 640 may be user devices used by fans within the event venue, or fixed or mobile camera devices installed within the event venue and operated by the event organisers. The camera devices available to the user are selected from cameras closest to the viewer device, providing the experience of a "window" into the event venue from the user's real world location.
Referring to Figures 17A and 17B, an access restriction based on position around the event venue referred to above is schematically illustrated. In Figure 17A, the viewer device 620 is located just outside the event venue 610. The viewer device 620 has access to the internal cameras 650a, 650b, 650c closest to the user's location outside the event venue. By accessing footage from cameras closest to the viewer device, the experience of a "window" into the event venue from the user's real world location is provided. When the user walks around the event venue and attempts to access footage from a different position, such as the position shown in Figure 17B, a different set of internal cameras 660a, 660b, 660c are available to the user. It will be appreciated that, since the augmented reality system can operate on stored footage after the event as well as or instead of live footage being streamed from the event, the references to internal cameras in relation to Figures 16 and 17 can be replaced with media sources, and the position within the event venue from which they were captured. In other words, a user can move around an event venue after the event and "look in" using their viewer device as if the event was currently occurring. In one example, the user could turn up at an event venue after an event has finished (potentially weeks, months or years later) and trigger playback of footage associated with the event.
In addition to selecting internal cameras based on the position of the viewer device, the selection of cameras may be based in part on the direction in which the viewer device is facing - determined for example by existing functionality of the device itself (electronic compass technology). An internal camera may be available for access if it is facing in the same or a similar direction as the viewer device. It will be appreciated that the cameras available for access may be determined based on a combination two or more of the distance, position, and orientation of the viewer device with respect to the position and/or orientation of the camera devices.
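Combining these criteria, a purely illustrative access filter (reusing the distance_km helper and VENUE_LOCATION constant from the registration sketch above; all thresholds and dictionary keys are assumptions) might be:

```python
def accessible_cameras(cameras, viewer_pos, viewer_bearing,
                       max_km=50.0, base_count=1, step_km=10.0,
                       bearing_tolerance_deg=45.0):
    """Cameras a remote viewer may access: none beyond max_km (exterior view
    only), progressively more as the viewer approaches, always the nearest
    ones, and only those facing roughly the same way as the viewer."""
    d = distance_km(viewer_pos, VENUE_LOCATION)
    if d > max_km:
        return []  # exterior view of the venue only
    allowed = base_count + int((max_km - d) // step_km)  # more cameras when closer
    aligned = [c for c in cameras
               if abs((c["bearing"] - viewer_bearing + 180) % 360 - 180)
               <= bearing_tolerance_deg]
    aligned.sort(key=lambda c: distance_km(viewer_pos, c["pos"]))
    return aligned[:allowed]  # the "window into the venue" nearest the viewer
```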
It will be understood that personal electronic devices such as mobile phones, cameras, ipods and ipads often capture audio data along with video image data. However, the quality of the audio capture achieved by these devices is often poor. In each of the above described examples (connect-a-fan, collective memory and augmented reality) a user may wish to play back captured video with a higher quality than can be provided by the capture devices. This is particularly likely to be the case for collective memory and augmented reality, where the user's entire experience is based on the video footage they are experiencing on their playback device. In order to provide higher quality audio, the audio data captured by the image capture devices may be replaced with audio data of a higher quality. The audio data of a higher quality may be audio data professionally captured at the event venue and provided to the media handling device, or audio data from another image capture device having a better audio capture capability. In some examples the user may be able to selectively switch between the actual audio associated with a media source, or higher quality substituted audio. In relation to connect-a-fan, or real-time augmented reality, the audio substitution could be achieved at the media handling device by stripping the audio from audio/video data received from an image capture device, and inserting a higher quality audio stream captured in parallel. It will be appreciated that appropriate synchronisation between the audio/video stream and the substitute audio stream would need to be provided. In relation to collective memory or non-real-time augmented reality, it will be recalled from the above that the video data is stored in association with date/time information indicating the time at which the video data was captured. A substitute audio stream could be stored separately from the stored video data, with its own date/time of capture information. The date/time of capture information of each of the video data and substitute audio data can be used to correlate the two streams together, for combination either for playback only, or for storage as a higher audio quality version of the video file.
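The timestamp-based correlation of a substitute audio stream with stored video might be sketched as follows; the track dictionaries and datetime-based capture stamps are assumptions for illustration.

```python
def substitute_audio(video_track, audio_track):
    """Compute how much of the separately captured high-quality audio to skip
    so that it lines up with the user video, using each stream's date/time of
    capture (datetime objects), and return the aligned pair for muxing."""
    offset_s = (video_track["captured_at"] - audio_track["captured_at"]).total_seconds()
    if offset_s < 0:
        raise ValueError("audio capture began after the video - cannot align start")
    return {
        "video": video_track["data"],
        "audio": audio_track["data"],
        "audio_trim_s": offset_s,   # audio to discard before the video begins
        "duration_s": video_track["duration_s"],
    }
```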

Claims

1. A media content distribution system, comprising:
a media handling device for receiving and distributing media data; and a plurality of user devices, each user device having a camera function and being operable in a media transmit mode to transmit media data of an event occurring at an event venue to the media handling device, each user device being operable in a media receive mode to receive media data from the media handling device; wherein
when operating in the media receive mode, a first user device from amongst the plurality of user devices is operable to select a media source from a second user device among the user devices which are currently operating in the media transmit mode and the media handling device is operable to stream the media data of the event being received from the second user device to the first user device.
2. A media content distribution system according to claim 1, wherein the first user device is operable to select the media source using an event venue representation indicating the location within the event venue of the user devices operating in the media transmit mode.
3. A media content distribution system according to claim 2, wherein the location within the event venue of each user device operating in the media transmit mode is determined from one of a GPS position determined by the user device and a seat or area identifier entered by the user or otherwise allocated to the user.
4. A media content distribution system according to claim 2 or claim 3, wherein the event venue representation comprises a first representation indicating the location of plural selectable areas within the event venue, selection of an area within the event venue representation causing the display of a second representation indicating the location within the selected area of user devices operating in the media transmit mode.
5. A media content distribution system according to claim 2 or claim 3, wherein the event venue representation comprises a hierarchy of representations, a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area of user devices operating in the media transmit mode or a further intermediate level representation.
6. A media content distribution system according to any one of claims 2 to 5, comprising a user database of device users, each user having a user profile associated with one or more user devices, the user profile comprising one or more of a current location of a user device associated with the user, a specification for the one or more user devices and quality information indicating the image capture capability of the one or more user devices.
7. A media content distribution system according to claim 6, wherein the user database comprises device information regarding at least some of the user devices, the device information indicating the media capture capabilities of the user device, wherein an indicator of media quality is displayed on the event venue representation in relation to at least some of the media sources, the indicator of media quality being based on the device information stored in the user database.
8. A media distribution system according to claim 2, wherein if the user device transmitting the selected media source exits the media transmit mode, the event venue representation is automatically presented on the first user device to enable the selection of an alternative media source.
9. A media distribution system according to claim 2, wherein if the user device transmitting the selected media source exits the media transmit mode, an alternative media source is automatically selected by the media handling device based on its position relative to that of the originally selected media source.
10. A media content distribution method, comprising the steps of:
transmitting, from first user devices having a camera function and being in a media transmit mode, media data of an event occurring at an event venue to a media handling device;
selecting, at a second user device, a media source from among the media sources transmitting from the first user devices which are currently operating in the media transmit mode;
streaming the media data of the event being received from the selected one of the first user devices to the second user device.
11. A media handling device for receiving and distributing media data, the media handling device being arranged
to receive, from a plurality of user devices each having a camera function, media data transmitted by the user devices when operating in a media transmit mode, the received media data being of an event occurring at an event venue; and
to receive, from a user device operating in a media receive mode, a selection of a media source from among the user devices which are currently operating in the media transmit mode; and
to stream the media data of the event being received from the selected media source to the user device in the media receive mode which made the selection.
12. A user device having a camera function, the user device being operable in a media transmit mode to transmit media data of an event occurring at an event venue to a media handling device; and
in a media receive mode to select a media source from among a plurality of media sources corresponding to other user devices which are currently operating in the media transmit mode and transmitting media data to the media handling device;
in the media receive mode to receive the media data of the event being received from the selected media source and streamed via the media handling device to the user device.
13. A media content distribution system, comprising:
a media handler for receiving, storing and distributing media data; and a plurality of camera devices, each camera device having a camera function and being operable to upload media data of an event occurring at an event venue to the media handler, the media handler being operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data; and
a playback device operable
to access and display media data stored by the media handler;
to submit additional data for association with media data stored by the media handler, the media handler being operable to store the additional data in association with the media data; wherein
when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
14. A media content distribution system according to claim 13, wherein the playback device is operable to display a user interface comprising an event venue representation indicating the locations within the event venue at which uploaded media data was captured; and
wherein the playback device is operable to receive a user selection of uploaded media data via the user interface, and to display the selected media data along with any additional data stored in association with the selected media data.
15. A media content distribution system according to claim 14, wherein the playback device is operable to select stored media data to which additional data is to be associated using the event venue representation.
16. A media content distribution system according to any one of claims 13 to
15, wherein the playback device is a camera device.
17. A media content distribution system according to any one of claims 13 to
16, wherein the camera devices are operable to store media data locally during the event at the time of image capture, and to upload the media data to the media handler after the event.
18. A media content distribution system according to claim 14, wherein the event venue representation comprises a first representation indicating the location of plural selectable areas within the event venue, selection of an area within the event venue representation on the playback device causing the display of a second representation indicating the location within the selected area at which uploaded media data was captured.
19. A media content distribution system according to claim 14, wherein the event venue representation comprises a hierarchy of representations, a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area at which uploaded media data was captured or a further intermediate level representation.
20. A media content distribution system according to any one of claims 13 to 19, wherein the location within the event venue of each media source is determined from one of a GPS position determined by the user device which generated the media source, and a seat or area identifier entered by the user of the device which generated the media source, or otherwise allocated to the user of the device.
21. A media content distribution system according to claim 14, wherein an indicator of the media quality of at least some of the uploaded media data is displayed on the event venue representation.
22. A media content distribution system according to claim 14, wherein an indicator of whether an item of uploaded media data has additional data associated therewith is displayed on the event venue representation.
23. A media content distribution system according to claim 21, comprising a user database which stores device information regarding at least some of the camera devices used to capture the media data stored at the media handler, the device information indicating the media capture capabilities of the user device, wherein the indicator of media quality displayed on the event venue representation is based on the device information stored in the user database.
24. A media content distribution system according to claim 14, wherein the event venue representation indicates the availability of different uploaded media data for different time periods or occurrences during the event, the event venue representation being navigable at the playback device with respect to time or occurrences to present the user with access to the different uploaded media data.
25. A media content distribution system according to claim 24, wherein the playback device is operable to select a particular time period or key occurrence during the event, and the indication of media data shown on the event venue representation is updated to reflect uploaded media data corresponding to the selected time or key occurrence.
26. A media content distribution system according to claim 14, wherein in an event play mode the indications of media data are continuously updated with respect to a progression in time through the event.
27. A media content distribution system according to any one of claims 13 to 26, wherein the additional data is one of description data, image data or audio data.
28. A media content distribution method, comprising the steps of:
uploading, from a plurality of camera devices to a media handler, media data of an event occurring at an event venue;
storing the uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data;
accessing and displaying media data at a playback device;
submitting from the playback device to the media handler additional data for association with an item of the stored media data, the playback device being a different device than the device used to capture and/or upload the media data;
storing the additional data in association with the media data; wherein
when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
29. A media handler for receiving, storing and distributing media data, the media handler being operable
to receive, from a plurality of camera devices each having a camera function, uploaded media data of an event occurring at an event venue;
to store the uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data;
to provide stored media data to a playback device;
to receive, from the playback device, additional data for association with media data stored by the media handler;
to store the additional data in association with the media data;
wherein when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
30. A playback device for accessing media data stored by a media handler according to claim 29; wherein the playback device is operable
to access and display media data stored by the media handler;
to submit additional data for association with media data stored by the media handler, the media handler being operable to store the additional data in association with the media data; wherein
when the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
31. A media content distribution system, comprising: a media handler for receiving and distributing media data; and
a plurality of camera devices, each camera device having a camera function and being operable to capture and transmit media data of an event occurring at an event venue to the media handler, the media handler being operable to stream the captured media data to device users or store the captured media data for later access; and
a playback device operable to access streamed or stored media data of an event via the media handler and display the accessed media data to a user; wherein
the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
32. A media content distribution system according to claim 31, wherein the media handler is operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device at the time of capture of the uploaded media data.
33. A media content distribution system according to claim 31 or claim 32, wherein the number of media data sources made available to the playback device is dependent on how close the playback device is to the event venue.
34. A media content distribution system according to claim 33, wherein the media handler is operable to provide access to more media data sources for playback devices relatively closer to the event venue than for playback devices relatively further from the event venue.
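The proximity tiers of claims 33 and 34 could, for example, be expressed as a simple step function; the boundaries and counts below are invented, since the claims leave them unspecified.

```python
def max_sources(distance_to_venue_m: float) -> int:
    """More feeds for relatively closer playback devices, fewer for distant ones."""
    if distance_to_venue_m < 100:      # inside or at the gates
        return 32
    if distance_to_venue_m < 1_000:    # the surrounding neighbourhood
        return 8
    if distance_to_venue_m < 10_000:   # same city
        return 2
    return 1                           # far away: a single view
```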
35. A media content distribution system according to any one of claims 31 to 34, wherein the subset of media data made available to the playback device is dependent on a current orientation of the playback device with respect to the event venue.
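The orientation-dependent selection of claim 35 might, as one possibility, keep only cameras whose bearing from the device falls inside its field of view. The bearing_deg() helper and the 60-degree field of view are assumptions made for illustration.

```python
import math

def bearing_deg(frm: tuple[float, float], to: tuple[float, float]) -> float:
    """Initial compass bearing, in degrees, from one (lat, lon) point to another."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*frm, *to))
    y = math.sin(lon2 - lon1) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(lon2 - lon1))
    return math.degrees(math.atan2(y, x)) % 360

def facing_subset(cameras: list[dict], device_pos, heading_deg: float, fov_deg=60):
    """Subset of cameras within the device's current field of view."""
    def in_view(cam) -> bool:
        diff = (bearing_deg(device_pos, cam["pos"]) - heading_deg + 180) % 360 - 180
        return abs(diff) <= fov_deg / 2
    return [c for c in cameras if in_view(c)]
```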
36. A media content distribution system according to any one of claims 31 to 35, wherein the playback device is operable:
to display a user interface comprising an event venue representation indicating the locations within the event venue at which media data was captured;
to receive a user selection of uploaded media data via the user interface;
to present to the user the selected media data and any associated description data.
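One possible model of the claim-36 interface is sketched below: a list of venue markers built from the stored capture locations, and a selection handler returning the media together with its description data. The dict layout is an assumption, not something the claims define.

```python
def venue_map(items: list[dict]) -> list[dict]:
    """One marker per stored item: where within the venue it was captured."""
    return [{"media_id": i["media_id"], "at": i["venue_location"]} for i in items]

def on_user_select(items: list[dict], media_id: str) -> tuple:
    """Resolve a marker selection into the media itself plus its description data."""
    item = next(i for i in items if i["media_id"] == media_id)
    return item["payload"], item.get("description", "")
```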
37. A media content distribution system according to any one of claims 31 to 36, wherein a playback device further than a predetermined distance from the event venue is provided access only to one or more exterior views of the event venue.
38. A media content distribution system according to any one of claims 31 to 37, wherein a playback device within or relatively nearby the event venue is provided with access to media data captured at locations within the event venue closest to the current location of the playback device.
39. A media content distribution system according to any one of claims 31 to 38, wherein an exterior view of a particular event venue is displayed at the playback device when the playback device is aimed towards the geographical location of that event venue.
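Claims 37 to 39 can be read together as a single selection policy, sketched here under invented thresholds and planar (x, y) coordinates; nothing in the claims fixes these details.

```python
import math

def select_views(cameras: list[dict], device_pos, venue_pos,
                 aimed_at_venue: bool, far_m: float = 2_000, k: int = 3):
    d = math.dist(device_pos, venue_pos)
    if d > far_m:
        if aimed_at_venue:
            # claim 39: a device pointed at the venue sees that venue's exterior view
            return [c for c in cameras if c.get("exterior")][:1]
        return [c for c in cameras if c.get("exterior")]   # claim 37: exterior only
    # claim 38: in or near the venue, the k closest capture locations
    return sorted(cameras, key=lambda c: math.dist(device_pos, c["pos"]))[:k]
```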
40. A media content distribution method, comprising:
capturing media data of an event occurring at an event venue;
transmitting the captured media data to a media handler;
streaming the captured media data to device users or storing the captured media data for later access;
at a playback device, accessing streamed or stored media data of the event via the media handler and displaying the accessed media data to a user; wherein
the playback device is restricted to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
41. A media handler for receiving and distributing media data, operable
to receive, from a plurality of camera devices each having a camera function, media data of an event occurring at an event venue;
to stream the captured media data to device users or store the captured media data for later access;
to provide the captured media data to a playback device for display to a user;
wherein the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
42. A playback device for accessing media data from a plurality of camera devices via a media handler, the camera devices capturing media data of an event occurring at an event venue, the playback device being restricted to accessing only a subset of the media data being generated in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
43. A computer program which when executed on a computer is operable to cause the computer to perform a process according to any one of claims 10, 28 or 40.
44. A media content distribution system substantially as hereinbefore described with reference to the accompanying drawings.
45. A media content distribution method substantially as hereinbefore described with reference to the accompanying drawings.
46. A media handler substantially as hereinbefore described with reference to the accompanying drawings.
47. A user device substantially as hereinbefore described with reference to the accompanying drawings.
PCT/GB2013/052384 2012-09-13 2013-09-12 Media content distribution WO2014041353A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP13771566.0A EP2896210A2 (en) 2012-09-13 2013-09-12 Media content distribution

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1216360.6 2012-09-13
GBGB1216360.6A GB201216360D0 (en) 2012-09-13 2012-09-13 System and method for improving a supporter's experience
GB1306151.0A GB2505978A (en) 2012-09-13 2013-04-05 Media content distribution system
GB1306151.0 2013-04-05

Publications (2)

Publication Number Publication Date
WO2014041353A2 true WO2014041353A2 (en) 2014-03-20
WO2014041353A3 WO2014041353A3 (en) 2014-05-22

Family

ID=47144227

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2013/052384 WO2014041353A2 (en) 2012-09-13 2013-09-12 Media content distribution

Country Status (3)

Country Link
EP (1) EP2896210A2 (en)
GB (2) GB201216360D0 (en)
WO (1) WO2014041353A2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9888296B2 (en) 2015-03-27 2018-02-06 Bygge Technologies Inc. Real-time wireless synchronization of live event audio stream with a video recording
US11271648B2 (en) 2017-07-11 2022-03-08 Supreme Architecture Ltd. Spatial optical wireless communication system
US11812084B2 (en) 2020-03-31 2023-11-07 Northwest Instrument Inc. Method and device for content recording and streaming
US11395049B2 (en) * 2020-03-31 2022-07-19 Northwest Instrument Inc. Method and device for content recording and streaming

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2852769B1 (en) * 2003-03-20 2005-09-16 Eastman Kodak Co METHOD FOR SHARING MULTIMEDIA DATA
FR2959372A1 (en) * 2010-04-23 2011-10-28 Orange Vallee METHOD AND SYSTEM FOR MANAGING A CONTINUOUS BROADCAST SESSION OF A LIVE VIDEO STREAM

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050078170A1 (en) * 2003-10-08 2005-04-14 Cisco Technology, Inc. System and method for performing distributed video conferencing
US20070127508A1 (en) * 2003-10-24 2007-06-07 Terry Bahr System and method for managing the transmission of video data
US20050188399A1 (en) * 2004-02-24 2005-08-25 Steven Tischer Methods, systems, and storage mediums for providing multi-viewpoint media sharing of proximity-centric content
US20090148124A1 (en) * 2007-09-28 2009-06-11 Yahoo!, Inc. Distributed Automatic Recording of Live Event
US20100080163A1 (en) * 2008-09-30 2010-04-01 Qualcomm Incorporated Apparatus and methods of providing and receiving venue level transmissions and services
EP2403236A1 (en) * 2010-06-29 2012-01-04 Stockholms Universitet Holding AB Mobile video mixing system

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111512119A (en) * 2018-01-18 2020-08-07 电子湾有限公司 Augmented reality, computer vision and digital ticketing system
US11830249B2 (en) 2018-01-18 2023-11-28 Ebay Inc. Augmented reality, computer vision, and digital ticketing systems
CN110213596 (A) * 2018-03-28 2019-09-06 腾讯科技(深圳)有限公司 Live broadcast switching method, device, computer equipment and storage medium
WO2020212761A1 (en) * 2019-04-19 2020-10-22 Orange Method for assisting the acquisition of media content at a scene
US11825191B2 (en) 2019-04-19 2023-11-21 Orange Method for assisting the acquisition of media content at a scene
CN113094011A (en) * 2021-03-26 2021-07-09 联想(北京)有限公司 Screen sharing method, device and equipment and computer readable storage medium
CN113094011B (en) * 2021-03-26 2023-12-26 联想(北京)有限公司 Screen sharing method, device, equipment and computer readable storage medium
EP4322534A1 (en) * 2022-08-12 2024-02-14 Rami Khatib Methods for viewing recorded events

Also Published As

Publication number Publication date
EP2896210A2 (en) 2015-07-22
WO2014041353A3 (en) 2014-05-22
GB201216360D0 (en) 2012-10-31
GB201306151D0 (en) 2013-05-22
GB2505978A (en) 2014-03-19

Similar Documents

Publication Publication Date Title
EP2896210A2 (en) Media content distribution
US11546566B2 (en) System and method for presenting and viewing a spherical video segment
US10701448B2 (en) Video delivery method for delivering videos captured from a plurality of viewpoints, video reception method, server, and terminal device
US10020025B2 (en) Methods and systems for customizing immersive media content
CN109416931A (en) Device and method for eye tracking
US9743060B1 (en) System and method for presenting and viewing a spherical video segment
US9026596B2 (en) Sharing of event media streams
KR102194222B1 (en) Event enhancement using augmented reality effects
US11216166B2 (en) Customizing immersive media content with embedded discoverable elements
US20170142486A1 (en) Information processing device, display device, information processing method, program, and information processing system
US10770113B2 (en) Methods and system for customizing immersive media content
KR20150100795A (en) Image capture, processing and delivery at group events
US20180256980A1 (en) Media system and method
US9973746B2 (en) System and method for presenting and viewing a spherical video segment
US11528467B1 (en) System and method for messaging channels, story challenges, and augmented reality
CN104904195A (en) Augmented reality apparatus and method
Mase et al. Socially assisted multi-view video viewer
CN106464773A (en) Augmented reality apparatus and method
US20190012834A1 (en) Augmented Content System and Method
Nelson Enhancing Fan Connection to the Seattle Seahawks Through Digital Media

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 13771566
    Country of ref document: EP
    Kind code of ref document: A2

REEP Request for entry into the european phase
    Ref document number: 2013771566
    Country of ref document: EP
