WO2015166095A1 - Portable processing apparatus, media distribution system and method - Google Patents

Portable processing apparatus, media distribution system and method

Info

Publication number
WO2015166095A1
Authority
WO
WIPO (PCT)
Prior art keywords
media content
operable
display
marker
media
Prior art date
Application number
PCT/EP2015/059613
Other languages
French (fr)
Inventor
Neil Harrison
Original Assignee
Neil Harrison
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neil Harrison filed Critical Neil Harrison
Publication of WO2015166095A1 publication Critical patent/WO2015166095A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41: Structure of client; Structure of client peripherals
    • H04N 21/414: Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N 21/41407: Specialised client platforms embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N 21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42202: Environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
    • H04N 21/4223: Cameras
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/44008: Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N 21/47: End-user applications
    • H04N 21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/47805: Electronic banking

Definitions

  • the present disclosure relates to a portable processing apparatus, media distribution system and method.
  • Live music and other live performances are perennially popular and performing artists are increasingly looking for novel and diverse ways of promoting their music and/or performances to the public.
  • not all venues may have a public performing licence, or be able to provide live music or live performances.
  • performers (such as a band or other artist) with limited funds may find it difficult to promote their performances so as to become known to the public.
  • the present disclosure seeks to address the above problems.
  • the present disclosure seeks to provide a portable processing apparatus, media distribution system and method which may help provide a more interactive user experience with performers and help performers promote their performances to a wider audience.
  • a portable processing apparatus comprising: a video camera operable to capture images of a scene comprising an augmented reality marker; a content reproduction unit operable to reproduce audio and visual content, the reproduction unit comprising a display operable to display media content associated with the augmented reality marker in combination with the images captured by the camera; a processor operable to detect the augmented reality marker within the captured images; a receiver operable to receive the media content from a media content source; in which: in response to detection of an augmented reality marker, the processor is operable to cause the display to present a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced; and in response to user selection of media content from the list, the processor is operable to cause the selected media content to be received by the receiver from the media content source and cause reproduction of the selected media content by the reproduction unit as augmented reality media within the captured images.
  • a method performed by a portable processing apparatus comprising a video camera, and a reproduction unit comprising a display operable to display media content associated with an augmented reality marker in combination with the images captured by the camera, the method comprising: capturing, using the video camera, images of a scene comprising an augmented reality marker; detecting the augmented reality marker within the captured images; and in response to detection of an augmented reality marker, displaying a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced; and in response to user selection of media content from the list, receiving the selected media content from the media content source and reproducing the selected media content by the reproduction unit as augmented reality media within the captured images.
  • the augmented reality marker can thus act as a virtual stage located at a venue such as a coffee shop, metro station, street corner.
  • the user may then view a performance as an augmented reality for example as if a performer or performers were located at the venue.
  • a media distribution system comprising: a first reproduction unit comprising a first display unit operable to display media content comprising three dimensional video data; a media server operable to communicate media content with the first reproduction unit; a portable processing apparatus comprising: a video camera operable to capture images of a scene comprising the first display; a second reproduction unit operable to reproduce audio and visual content, the second reproduction unit comprising a second display operable to display information associated with the media content; a communication unit operable to communicate bidirectionally with the media server; a user input device arranged so that the user can input one or more commands associated with the media content.
  • a media distribution method for implementation by a media distribution system, the media distribution system comprising a first reproduction unit comprising a first display unit, a media server, and portable processing apparatus comprising a video camera, a second reproduction unit, a communication unit, and a user input device, the method comprising: displaying, on the first display unit, media content comprising three dimensional video data; communicating media content between the media server and the first reproduction unit; capturing, using the video camera, images of a scene comprising the first display; reproducing, using the second reproduction unit, audio and visual content, the second reproduction unit comprising a second display, and the method comprising displaying information on the second display, the information being associated with the media content displayed on the first display; communicating bidirectionally with the media server using the communication unit; and inputting one or more commands associated with the media content using the user input device.
  • Figure 1 is a schematic view of elements of a portable processing apparatus
  • Figure 2 is a schematic view of the portable processing apparatus and an augmented reality marker
  • Figure 3 is a schematic view of an augmented reality marker
  • Figure 4 is a schematic view of the portable processing apparatus comprising a performance page
  • Figure 5 is a schematic view of the portable processing apparatus comprising a list of virtual stages
  • Figure 6 is a schematic view of the portable processing apparatus comprising a map view of locations of augmented reality markers
  • Figure 7 is a schematic view of the portable processing apparatus comprising a performance page for user donation to a performer
  • Figure 8 is a schematic view of the portable processing apparatus for operation of a performer donation action
  • Figures 9 and 10 are schematic views of operation of a performer donation action and response
  • Figure 11 is a schematic view of the portable processing apparatus comprising a configuration page
  • Figure 12 is a schematic view of the portable processing apparatus comprising a performer preferences page
  • Figures 13 and 14 are schematic views of the portable processing apparatus illustrating locations of augmented reality markers with respect to the apparatus;
  • Figure 15 is a schematic view of a media distribution system
  • Figure 16 is a flow chart of a method of operation of a portable processing apparatus.
  • Figure 17 is a flow chart of a method of operation of a media distribution system.
  • FIG. 1 schematically shows a portable processing apparatus 1.
  • the apparatus is a so-called smart phone although it will be appreciated that other portable processing apparatus having the functionality described herein could be used.
  • the apparatus 1 comprises a video camera 2 operable to capture images of a scene comprising an augmented reality marker and a content reproduction unit 4 operable to reproduce audio and visual content.
  • the reproduction unit 4 comprises a display 6 operable to display media content associated with the augmented reality marker in combination with images captured by the camera.
  • the reproduction unit 4 comprises an audio output unit 8 operable to output audio from the apparatus 1, for example to reproduce audio of the media content.
  • the display 6 comprises a touch sensor for user input to the apparatus.
  • the touch sensor is a capacitive touch sensor such as those known in the art, although it will be appreciated that any other type of sensor or touch screen could be used.
  • the touch sensor can act as a user input device although it will be appreciated that the apparatus 1 may comprise other input elements such as one or more user input buttons.
  • the apparatus 1 comprises a processor 10, which acts to control the apparatus 1.
  • the processor 10 is arranged to be in communication with the video camera 2 and the content reproduction unit 4.
  • the processor 10 is operable to act as an augmented reality marker detector operable to detect one or more augmented reality markers within the captured images.
  • the processor is operable to apply known techniques to detect the one or more markers.
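The disclosure leaves marker detection to "known techniques". As a purely illustrative sketch of one such technique, the snippet below uses OpenCV's ArUco module to find square fiducial markers in a captured frame; the dictionary choice, camera index and use of the legacy (pre-4.7) aruco API are assumptions rather than details taken from the patent.

```python
# Illustrative only: detecting square fiducial markers with OpenCV's ArUco
# module (pip install opencv-contrib-python). A proprietary 2D marker such as
# the one shown in Figure 3 would need its own detector.
import cv2

ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
PARAMS = cv2.aruco.DetectorParameters_create()  # legacy API (OpenCV < 4.7)

def detect_markers(frame):
    """Return a list of (marker_id, corner_points) found in one captured frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, ARUCO_DICT, parameters=PARAMS)
    if ids is None:
        return []
    return list(zip(ids.flatten().tolist(), corners))

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default camera, standing in for the video camera 2
    ok, frame = cap.read()
    if ok:
        for marker_id, _corners in detect_markers(frame):
            print("Detected augmented reality marker id:", marker_id)
    cap.release()
```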
  • the apparatus 1 comprises a memory 11 in communication with the processor 10 for storing program code and other data for the apparatus, such as media content.
  • the memory 11 comprises a solid state storage device although it will be appreciated that any other suitable storage device could be used.
  • the apparatus 1 comprises a transmitter/receiver 12 for transmitting data to and/or receiving data from a media content source.
  • the media content source is a media server operable to communicate with the apparatus 1 over a wireless network such as wifi, Bluetooth, or mobile telecommunications such as 3G or 4G.
  • the apparatus 1 is operable to communicate with the internet and other mobile or portable devices such as other smart phones via the transmitter/receiver 12.
  • the apparatus 1 comprises a motion sensor 14 operable to detect motion of the apparatus 1 (for example by detecting acceleration of the apparatus about 6 axes using an accelerometer - three translational axes and three rotational axes) using known techniques such as those commonly found in smart phones and other portable devices.
  • the apparatus 1 comprises a position detector 16 (location detector) operable to detect the geographical position of the apparatus 1.
  • the position detector 16 is operable to detect the geographical position of the apparatus 1 using global positioning satellite (GPS) signals although it will be appreciated that any other suitable system could be used such as location detection based on mobile telephone network signals.
  • the position detector is operable to detect the geographical position of the apparatus with respect to the marker.
  • In response to detection of a predetermined augmented reality marker, the processor 10 (augmented reality marker detector) is operable to cause the display 6 to present a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced. Additionally, in response to user selection of media content from the list, the processor 10 is operable to cause the selected media content to be received by the transmitter/receiver 12 from the media content source and cause reproduction of the selected media content by the reproduction unit as augmented reality media within the captured images.
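A minimal sketch of this detect, list, select and stream flow is given below, assuming a hypothetical HTTP API on the media content source; the host name and endpoint paths are invented for illustration only.

```python
# Hypothetical client-side flow: marker detected -> fetch the content list for
# that marker -> resolve a user-selected item to a stream URL for playback.
import requests

MEDIA_SERVER = "https://media-server.example.invalid/api"  # placeholder host

def list_content_for_marker(marker_id: int) -> list:
    """Fetch the list presented to the user for a detected marker."""
    resp = requests.get(f"{MEDIA_SERVER}/markers/{marker_id}/content", timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. [{"id": 7, "artist": "...", "title": "..."}, ...]

def stream_url_for(content_id: int) -> str:
    """Resolve the selected media item to a URL the reproduction unit can stream."""
    resp = requests.get(f"{MEDIA_SERVER}/content/{content_id}/stream", timeout=5)
    resp.raise_for_status()
    return resp.json()["url"]
```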
  • FIG 2 is a schematic view of the portable processing apparatus 1 and an augmented reality marker 20.
  • the augmented reality marker 20 is located on a table 22, for example in a coffee shop.
  • the augmented reality marker 20 is fixedly attached to the table, for example by lamination onto the table's surface, although it will be appreciated that it could be attached in any other suitable manner.
  • the marker 20 is detachable from the table 22. In other words, for example the marker 20 is separate from the table 22.
  • the marker 20 could be attached to or form all or part of any other suitable object such as a bar, counter top, wall, advertisement and the like.
  • the camera 2 is arranged to capture an image of the marker 20 as indicated by dashed lines 24.
  • the processor 10 is operable to carry out augmented reality processing so as to cause media content such as video images to be combined with the captured images.
  • Figure 3 is a schematic view of the augmented reality marker 20.
  • the augmented reality marker is two-dimensional (2D) and comprises a plurality of symbols or shapes which can be detected by the processor 10 within the captured images using known techniques.
  • the marker 20 could be three dimensional (3D), linear or any other configuration suitable for use as an augmented reality marker.
  • Figure 4 is a schematic view of the portable processing apparatus 1 comprising a performance page.
  • the processor 10 is operable to cause the performance page to be displayed on the display 6, in response to a user command.
  • the performance page comprises a playback control area 28, a performance area 30, and a performance settings area 32.
  • the playback control area 28 comprises a plurality of input icons (such as input icons 34) for controlling reproduction of media items.
  • the playback control area 28 comprises information relating to the media item or items that are to be reproduced in the performance area 30.
  • the performance area 30 comprises a donate icon 36 for donating credit to a performer. This functionality will be described in more detail later below.
  • the performance control area comprises a credit icon 38, an audio stage icon 40, an artist list icon 42, and a stage location icon 44, the functionality of which will be described in more detail later below.
  • the augmented reality marker 20 can act as a virtual stage on which a performer or performers such as musicians, dancers and/or actors can perform.
  • a user could walk into a coffee shop, capture an image of the augmented reality marker 20 using the camera 2 of the apparatus 1 so that a list of the media content is caused to be displayed on the display 6.
  • the user could then select their desired performer from the list via a suitable user interface such as the playback control area 28.
  • the processor 10 is operable to cause the selected media content to be, for example, streamed or otherwise received from the media content source so that it can be displayed as augmented reality content within the performance area 30, for example together with reproduction of associated audio by the audio output unit 8.
  • the augmented reality marker 20 can be thought of as a virtual stage on which one or more performers can be caused to appear as if within the coffee shop for example.
  • one or more performers could be displayed and reproduced.
  • the processor is operable to cause the display 6 to display a list of artists (performers) associated with an augmented reality marker in response to user input to the artist list icon 42. The user can thus select a performer or artist that they wish to view within the performance area 30.
  • the reproduction of the media content can be controlled by user input to the playback control area 28, for example by touch input to the input icons 34 so as to perform playback operations such as play, pause, fast forward, rewind, although it will be appreciated that any other suitable playback operations could be implemented.
  • one or more markers may be located at different locations. Therefore, a user may travel around a city, for example, and be able to access a performance as if they were attending a gig or the like by the operation of the augmented reality functionality of the apparatus 1. Additionally, it will be appreciated that the markers could be the same or they could be different from each other, for example different markers at different locations, or for association with different respective performers.
  • Figure 5 is a schematic view of the portable processing apparatus comprising a list of virtual stages
  • Figure 6 is a schematic view of the portable processing apparatus comprising a map view of locations of augmented reality markers.
  • the processor 10 is operable to cause a location list area 46 comprising a list of locations of augmented reality markers to be displayed on the display 6 in response to user input to the stage icon 40.
  • the location list comprises a predetermined number of marker locations that are geographically closest to the apparatus 1.
  • the location of the apparatus 1 is detected by the location detector 16 using known techniques.
  • the apparatus 1 is operable to communicate with a server comprising an augmented reality marker location database so as to determine the location of the marker or markers nearest to the apparatus 1 , although it will be appreciated that the locations of the marker(s) could be stored in the apparatus 1 or determined by any other appropriate technique.
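As an illustration of this nearest-stages lookup, the sketch below ranks known marker locations by great-circle distance from the apparatus and keeps the closest few; the marker records and the use of the haversine formula are assumptions, not details from the disclosure.

```python
# Rank augmented reality marker ("virtual stage") locations by distance from
# the position reported by the location detector.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearest_markers(device_pos, markers, n=8):
    """Return the n marker records closest to the device position."""
    lat, lon = device_pos
    return sorted(markers, key=lambda m: haversine_km(lat, lon, m["lat"], m["lon"]))[:n]

markers = [  # illustrative data only
    {"name": "Coffee shop stage", "lat": 51.5155, "lon": -0.0922},
    {"name": "Metro station stage", "lat": 51.5033, "lon": -0.1196},
]
print(nearest_markers((51.5100, -0.1000), markers))
```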
  • the location list comprises eight locations, although it will be appreciated that any suitable number of locations could be displayed and/or that the user could scroll through the list using known user input techniques.
  • the apparatus is operable to display a map of locations of one or more markers (such as a location of the marker 20).
  • the apparatus 1 is operable to cause a map view page to be displayed on the display 6.
  • the map view page comprises a map view area 48 together with the performance control area 32.
  • the processor 10 is operable to cause a map to be displayed within the map view area 48 on the display 6 indicating the position of the apparatus (as shown by star icon 50) together with the positions of a first augmented reality marker 52 and a second augmented reality marker 54.
  • the apparatus 1 is operable to cause the map view and/or location list to be displayed in response to user input at the find stage icon 44.
  • the apparatus is operable to display the map view area 48 so that the user can navigate around it using known user interface techniques such as panning, and zooming.
  • the user can thus use the map view page and/or use the location list area to find their way to an augmented reality marker for example located at a coffee shop, restaurant, or other public or private location, so as to be able to access media content associated with the augmented reality marker.
  • the apparatus is operable to cause a financial transfer to occur between a user's account and a performer's account, for example so that the user can donate money to the performer, in a virtual analogue of, for example, throwing money into a busker's hat or donating money to a street artist. This will now be described in more detail with reference to Figures 7 to 10.
  • Figure 7 is a schematic view of the portable processing apparatus comprising a performance page for user donation to a performer.
  • the performance page is substantially the same as that described above with reference to Figure 4.
  • the display is operable to display the performance area 30 associated with the media content together with a donation area associated with a donation to a performer displayed in the performance area.
  • an augmented reality performance could be playing in the performance area 30.
  • an augmented reality performance is taken to be reproduction of media content associated with an augmented reality marker detected in the captured images for example using known augmented reality techniques.
  • the donate icon 36 can be considered to be a donation area, although it will be appreciated that the donation area of the display could be larger or smaller than this and dynamically sized and positioned depending on user input and the functionality of the apparatus 1 .
  • the apparatus 1 is operable to cause a financial transfer to occur from a user's account to a performer's account in response to user input at the donation area (e.g. donate icon 36) of the display.
  • a user has a user account associated with the apparatus and each performer has a performer account. It will be appreciated that each performer could have their own account or a band or group could have a joint account. In other words, a performer should be taken to mean one or more performers.
  • the accounts are operable to communicate with each other using known online banking and/or mobile telephone banking and financial techniques for example so that money can be transferred from a user's account to a performer's account.
  • in response to user input at the donate icon 36, the processor is operable to cause a plurality of donation amount icons 56a-d to be displayed on the display 6 in the performance area 30.
  • the donation amount icons 56a-d can be thought of as corresponding to the donation area.
  • each donation amount icon is associated with a different monetary amount for donation to the performer(s).
  • the donation amount icons 56a-d are represented as coins or notes although it will be appreciated that any other visual representation could be used for the donation amount icons.
  • a user may choose to donate an amount corresponding to the amount indicated by the donation amount icon by touching the appropriate donation amount icon 56a-d.
  • the apparatus 1 is operable to cause a monetary amount corresponding to the value indicated by the donation amount icon to be transferred from the user's account to the performer's account, for example by communicating with an appropriate banking server.
  • the apparatus is operable to cause the financial transfer to occur in response to a user gesture between the donation area and the performance area.
  • Figure 8 is a schematic view of the portable processing apparatus 1 for operation of a performer donation action.
  • a user may select to donate an amount corresponding to that indicated by donation amount icon 56c by touching the display 6 at the position of the donation amount icon 56c.
  • the user can perform a user gesture, such as a flicking gesture towards the performance area where the performer appears to be within the augmented reality images.
  • the apparatus 1 is operable to detect the user gesture and cause the monetary transfer to occur between the user's account and the performer's account.
  • other user input gestures such as a slide gesture, multitouch input gesture or any other appropriate user input gesture could be used. Accordingly, for example, the user can easily donate money to the performer by a simple touch gesture.
  • the apparatus 1 comprises a motion sensor operable to detect motion of the apparatus.
  • the apparatus 1 is operable to cause a financial transfer to occur from the user's account to a performer's account associated with the media content in response to a detection of a predetermined movement of the apparatus by the motion sensor.
  • the apparatus is operable to detect a user input motion gesture such as shaking of the apparatus 1 by a user and the number of shakes.
  • the apparatus 1 is operable to detect the number of times that the apparatus 1 is shaken, with the number of shakes corresponding to the donation amount to be transferred from the user's account to the performer's account.
  • any detected shaking of the apparatus 1 will cause a predetermined amount to be transferred from the user's account to the performer's account. Therefore, in examples of the disclosure, the user may donate money to the performer simply by shaking the apparatus 1. This may provide a simpler and more intuitive interface for the user as well as making donation quicker and so more likely to occur.
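A rough sketch of this shake-to-donate behaviour, assuming the donation scales with the number of shakes: count threshold crossings in the accelerometer magnitude as individual shakes and map the count to an amount. The threshold and per-shake value are illustrative assumptions.

```python
# Count "shakes" from motion sensor readings and map them to a donation amount.
def count_shakes(accel_magnitudes, threshold=2.5):
    """Count upward crossings above a magnitude threshold (in g) as shakes."""
    shakes, above = 0, False
    for a in accel_magnitudes:
        if a > threshold and not above:
            shakes += 1
            above = True
        elif a <= threshold:
            above = False
    return shakes

def donation_for_shakes(shakes, amount_per_shake=0.50):
    """One possible mapping from number of shakes to a monetary donation."""
    return round(shakes * amount_per_shake, 2)

samples = [1.0, 3.1, 1.2, 2.9, 1.1, 3.4, 0.9]   # illustrative magnitudes
print(donation_for_shakes(count_shakes(samples)))  # 3 shakes -> 1.5
```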
  • Figures 9 and 10 are schematic views of operation of a performer donation action and response.
  • in response to user input which can cause a donation to occur (for example, the user input touch gesture or user input motion gesture), the apparatus 1 is operable to cause an animation to be displayed on the display 6 corresponding to a coin or note being tossed towards the performer or performers.
  • the apparatus 1 is operable to cause predetermined media content to be reproduced in response to reception of the financial transfer by the performer's account, for example in a performance area as indicated by Figure 10.
  • a performer can pre-record media content such as singing or saying "thank you" or visual acknowledgement of the donation for audio and/or visual reproduction on receipt of a donation into the performer's account.
  • a performer or performers can record a plurality of responses to receipt of donations and store them on the content source (which can be thought of as a content server).
  • the content of the predetermined media content is dependent on the monetary amount of the donation. In other words, for example, a larger donation amount could cause a longer media clip to be reproduced, or media content in which the performer is more effusive with their thanks to be played, whereas a smaller donation amount might cause a shorter media clip to be reproduced.
  • any predetermined media content could be reproduced in response to reception of the financial transfer and/or the donation amount. This functionality may improve the interaction between the performer and user and help build more of a fan base for the performer.
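One way such a response could be selected is sketched below, choosing a pre-recorded clip according to the size of the donation; the tiers and file names are hypothetical, since the disclosure only says that larger donations may trigger longer or more effusive responses.

```python
# Pick a pre-recorded "thank you" clip based on the donated amount (tiers are
# sorted from the most generous downwards).
RESPONSE_CLIPS = [
    (10.00, "thanks_long_effusive.mp4"),
    (2.00, "thanks_medium.mp4"),
    (0.01, "thanks_short.mp4"),
]

def response_clip_for(donation_amount):
    """Return the clip to reproduce on receipt of a donation, or None."""
    for minimum, clip in RESPONSE_CLIPS:
        if donation_amount >= minimum:
            return clip
    return None

print(response_clip_for(5.00))  # -> thanks_medium.mp4
```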
  • Figure 11 is a schematic view of the portable processing apparatus comprising a configuration page 58.
  • the apparatus 1 is operable to cause the configuration page to be displayed in response to user input at a logo icon 60 displayed on the display 6 although it will be appreciated that the configuration page could be displayed in response to other suitable user input(s).
  • the configuration page 58 comprises a configuration list of attributes of the apparatus that can be configured.
  • the configuration list comprises a plurality of list items relating to an attribute or attributes of the apparatus 1 .
  • the list items comprise one or more of the following items: favourite performers 64; purchased songs 66; stage heroes 68; stage preferences 70; performer preferences 72; stage stroller 74; visual performances 76; audio performances 78; settings 80; account info 82; credit settings 84; and stage notifications 86.
  • in response to user input at a position corresponding to favourite performers 64, the apparatus is operable to cause a configuration page to be displayed which allows a user to configure which performers they like so that the media content can be easily found on the apparatus or content server.
  • in response to user input at a position corresponding to purchased songs 66, the apparatus is operable to cause a configuration page to be displayed which allows a user to view and manage songs or other media content they have purchased.
  • in response to user input at a position corresponding to stage heroes 68, the apparatus is operable to cause a page to be displayed which allows a user to view top performers of the hour, week and month, for example those sponsored by brands, so as to allow people to quickly see who the best performers are.
  • in response to user input at a position corresponding to stage preferences 70, the apparatus is operable to cause a configuration page to be displayed which allows preferences in relation to the augmented reality appearance to be configured.
  • in response to user input at a position corresponding to performer preferences 72, the apparatus is operable to cause a configuration page to be displayed which allows a user to set preferences for performances, such as preferred music style, as will be described later below with respect to Figure 12.
  • in response to user input at a position corresponding to stage stroller 74, the apparatus is operable to cause a configuration page to be displayed in relation to a stage stroller setting; the stage stroller setting will be described in more detail below with reference to Figures 12 to 15.
  • in response to user input at a position corresponding to visual performances 76, the apparatus is operable to allow a user to switch reproduction of visual performances of the media content on or off.
  • in response to user input at a position corresponding to audio performances 78, the apparatus is operable to allow a user to switch audio performances of the media content on or off.
  • in response to user input at a position corresponding to settings 80, the apparatus is operable to cause a configuration page to be displayed which allows a user to configure general settings of the apparatus, such as those relating to the application running the augmented reality processing.
  • in response to user input at a position corresponding to account info 82, the apparatus is operable to cause a configuration page to be displayed which allows a user to configure and/or edit information in relation to the user account, for example bank details for financial transfers for donations to performers' accounts.
  • in response to user input at a position corresponding to credit settings 84, the apparatus is operable to cause a configuration page to be displayed which allows a user to configure credit settings, such as the amount or amounts to donate to a performer or to be displayed as donation amount icons 56a-d.
  • in response to user input at a position corresponding to stage notifications, the apparatus is operable to allow the user to switch stage notifications on or off.
  • the apparatus is operable to display a notification and/or output an audio alarm if the apparatus is within a threshold distance of an augmented reality marker (a "stage") as a stage notification.
  • Figure 12 is a schematic view of the portable processing apparatus comprising a performer preferences page.
  • the apparatus 1 is operable to display a list of musical genres comprising rock and roll, arabic, singer-songwriter, rap, funk, jazz, classical, and r&b although it will be appreciated that one or more of these genres could be included in the list and that other musical genres could be used.
  • the user is able to select which genres are of interest to them by input to tick boxes in the list using known techniques.
  • the apparatus 1 is operable to allow a user to scroll through the list of genres in the performer preferences page for example using known techniques. However, it will be appreciated that other performer preferences could be included in the performer preferences page.
  • Figures 13 and 14 are schematic views of the portable processing apparatus illustrating locations of augmented reality markers with respect to the apparatus.
  • Figure 13 schematically shows a performance page which is substantially the same as that described above with respect to Figure 4.
  • the performance settings area 32 comprises an audio mode icon 88 which is operable to switch the stage stroller mode on or off depending on user input to the audio mode icon 88.
  • user input to the audio mode icon causes a sub-window 90 to be displayed on the display 6 which allows the user to select an audio mode such as the stage stroll mode.
  • In response to selection of the stage stroll mode, the apparatus 1 is operable to cause the stage stroll mode to be implemented.
  • Figure 14 schematically shows a map view in a similar manner to that described above with respect to Figure 6.
  • the apparatus 1 is operable to display a stage stroll control area 92 for controlling operation of the stage stroll mode.
  • the stage stroll mode for example allows reproduction of audio content associated with one or more augmented reality markers in dependence upon the relative position between the one or more markers and the apparatus 1 .
  • the location detector 16 is operable to detect the relative distance between the marker and the apparatus, and, if the relative distance is less than a threshold distance, cause the reproduction unit to reproduce audio content of the media content associated with the marker. For example, a user could be walking around a city and if they are within the threshold distance of an augmented reality marker then the apparatus is operable to cause audio content associated with that marker to be reproduced. This helps facilitate discovering performances in a similar manner to traveling around a city and overhearing for example live or recorded performances from music venues or other performance venues. The user can then walk in the direction of the augmented reality marker for example if they are interested in the media content that is being reproduced.
  • the location detector is operable to detect a relative distance between the apparatus 1 and a plurality of augmented reality markers, each marker having an associated threshold distance.
  • the first threshold distance (indicated by the dashed line 94) is associated with the marker 54 and the second threshold distance (indicated by the dashed line 96) is associated with the marker 52.
  • any suitable number of markers could be used, each with their own associated threshold distance.
  • the position of the apparatus 1 as indicated by the star icon 50 is within a first threshold distance (indicated by dashed line 94) of the augmented reality marker 54. Therefore, for example the apparatus 1 is operable to reproduce audio content associated with the augmented reality marker 54. However, the position of the apparatus 1 as indicated by the star icon 50 is outside a second threshold distance (indicated by dashed line 96) associated with the augmented reality marker 52.
  • the location detector 16 is operable to cause reproduction of media content associated with those markers whose relative distance between the respective marker and the apparatus is less than the respective threshold distance for that marker.
  • the location detector 16 is operable to cause reproduction of media content associated with the marker detected as having the closest relative distance between the marker and the apparatus.
  • the position of the apparatus 1 (indicated by the star icon 50) is closer to the marker 54 than the marker 52 and so the apparatus is operable to reproduce media content associated with the marker 54.
  • the reproduction unit 4 is operable to reproduce audio of the media content associated with a respective marker so that the volume of the audio of the media content is dependent on the relative distance between the apparatus and the respective marker. For example, the closer the apparatus 1 is to a marker, the louder the volume of the reproduced audio content, although it will be appreciated that any other appropriate dependence of volume on distance could be used. In examples, the volume is linearly dependent on the distance between the marker and the apparatus 1, although it will be appreciated that any other scaling such as logarithmic scaling could be used.
  • the reproduction unit 4 is operable to reproduce audio of the media content associated with the plurality of the markers (such as markers 52 and 54) so as to mix the audio content associated with the markers in dependence upon the relative distance between the apparatus and the respective marker.
  • audio content associated with both the markers 52 and 54 can be reproduced and mixed together.
  • the relative volumes of the respective audio content associated with the markers 52 and 54 is dependent upon the relative distance between the apparatus 1 and the markers 52 and 54.
  • the apparatus 1 is closer to the marker 54 than the marker 52 and so the audio content associated with the marker 54 will be mixed with the audio content associated with the marker 52 so that the audio content associated with the marker 54 is louder than that associated with the marker 52.
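The stage stroll behaviour described in the preceding paragraphs can be sketched as follows: every marker within its threshold distance contributes audio at a level that falls off with distance, and the contributions are mixed. The linear fall-off and the normalisation step are illustrative choices; as noted above, other scalings such as logarithmic could equally be used.

```python
# Per-marker gain for the "stage stroll" mode: silent beyond the threshold,
# louder as the apparatus approaches the marker.
def volume_for(distance_m, threshold_m):
    """Linear fall-off: full volume at the marker, silent at the threshold."""
    if distance_m >= threshold_m:
        return 0.0
    return 1.0 - distance_m / threshold_m

def mix_levels(markers_in_range):
    """markers_in_range: list of (marker_id, distance_m, threshold_m).
    Returns per-marker gains, normalised so the loudest stage plays at 1.0."""
    gains = {m_id: volume_for(d, t) for m_id, d, t in markers_in_range}
    loudest = max(gains.values(), default=0.0)
    if loudest > 0:
        gains = {m_id: g / loudest for m_id, g in gains.items()}
    return gains

# Apparatus 40 m from marker 54 (threshold 100 m) and 180 m from marker 52
# (threshold 150 m): only marker 54 is audible.
print(mix_levels([(54, 40.0, 100.0), (52, 180.0, 150.0)]))
```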
  • a user can stroll around a city or other location in which one or more augmented reality markers are located and experience audio as if they were wandering around an area where live or reproduced music is being played in venues in the city or other location.
  • FIG 15 is a schematic view of a media distribution system 93 according to examples of the disclosure.
  • the media distribution system 93 comprises the portable processing apparatus 1 , a plurality of augmented reality markers 20 and a media server 95.
  • the media server 95 can act as the media source.
  • the system 93 comprises a reproduction unit 97 comprising a display unit operable to display media content comprising three dimensional (3D) video data.
  • the reproduction unit 97 comprises a so-called 3D TV although it will be appreciated that any other suitable apparatus for displaying 3D video content could be used.
  • the media server 95 is operable to communicate media content with the reproduction unit 97 as indicated by the dashed line 100. In other words, for example, the media server 95 is operable to transmit media content to the reproduction unit 97 for reproduction by the display unit of the reproduction unit 97. In examples, the media server 95 is operable to transmit media content comprising a performance by a performer to the reproduction unit 97 although it will be appreciated that the media content could comprise any other suitable media content.
  • the video camera 2 is operable to capture images of a scene comprising the display of the reproduction unit 97 and the display 6 of the content reproduction unit 4 is operable to display information associated with the media content such as artist name, track name and the like.
  • the apparatus 1 can be used to display information related to the 3D media content, for example by detection of images displayed by the reproduction unit 97.
  • the reproduction unit 97 is operable to display an augmented reality so as to provide similar or same functionality to that described above with respect to the augmented reality markers 20, 52, and 54.
  • the apparatus 1 is operable to communicate bidirectionally with the media server 95 as indicated by the dashed line 102, for example, to receive metadata relating to the media content being reproduced by the reproduction unit 97 and/or to receive media content for example as described above with respect to Figures 1 to 14.
  • the apparatus 1 comprises a user input device, for example a touch screen, although it will be appreciated that the apparatus 1 could comprise any suitable user input device.
  • the user input device is arranged so that the user can input one or more commands associated with the media content.
  • the user could use the apparatus 1 to control playback functions of the media content reproduced by the reproduction unit 97.
  • the reproduction unit 97 can act as an augmented reality display, for example by displaying video and reproducing audio content recorded by a performer or performers.
  • the user input device is operable to allow other functions, such as donation to a performer's account, voting on quality, purchase of media content (such as the currently playing performance) and leaving messages for the performer, to be carried out.
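As a sketch of the kinds of command the apparatus might send back over this bidirectional link, the snippet below serialises donate and vote commands as JSON messages; the message shape and field names are assumptions, since the disclosure does not define a protocol.

```python
# Serialise user commands relating to the currently reproduced media content.
import json
import time

def make_command(command, content_id, **payload):
    """Build one command message, e.g. a donate, vote, purchase or message command."""
    return json.dumps({
        "command": command,
        "content_id": content_id,   # item currently shown on the first display
        "timestamp": time.time(),
        "payload": payload,
    })

print(make_command("donate", content_id=42, amount=2.00, currency="GBP"))
print(make_command("vote", content_id=42, rating=5))
```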
  • the display 6 comprises the user input device and the user input device comprises a touch sensor for user input to the apparatus 1 .
  • the display 6 is operable to display a performance area associated with the media content together with a donation area associated with a donation to a performer displayed in the performance area.
  • the video camera 2 could capture images of the media content reproduced by the reproduction apparatus 97 and display the media content in the performance area 30.
  • the apparatus 1 is operable to cause a financial transfer to occur from a user's account to a performer's account in response to user command comprising user input at the donation area of the display in a similar manner to that described above.
  • the apparatus 1 is operable to cause the financial transfer to occur in response to a user gesture between the donation area (such as donate icon 36) and the performance area 30.
  • the apparatus 1 comprises a motion sensor operable to detect motion of the apparatus 1 and the apparatus 1 is operable to cause a financial transfer to occur from the user's account to a performer's account associated with the media content in response to a detection of a predetermined movement of the apparatus by the motion sensor.
  • the media distribution system 93 described with respect to Figure 15 can allow donations from a user's account to one or more performers' accounts in a similar manner to that described above with respect to Figures 7 to 10. Therefore, for example, the media distribution system described herein can help provide a simplified and intuitive system which can for example allow a user to donate money to a performer or performers.
  • the media distribution system comprises a plurality of augmented reality markers (such as augmented reality markers 20).
  • each marker is associated with respective media content.
  • respective media content should be taken to mean that each marker can be associated with the same media content or different media content.
  • each marker is associated with different media content although in other examples, each marker is associated with one or more media items that are the same for each marker. Alternatively, some markers may be associated with the same media content and some may be associated with different media content.
  • the apparatus is operable to provide a list of media items for user selection.
  • the number of markers associated with media content generated by a particular performer is dependent upon the popularity of the performer. For example, more popular media content can have more markers associated with it and so that media content is more likely to be viewed by a user.
  • the media distribution system comprises a ranking element operable to rank the media content according to user popularity.
  • the media server 95 comprises the ranking element although it will be appreciated that this functionality could be implemented by any suitable device such as a separate server operable to communicate with the media server 95 and the apparatus 1 .
  • the ranking element is operable to rank the media content based on a total cumulative time that each media item has been viewed. For example, if a first media item has been viewed for a total of 500 minutes by all users and a second media item has been viewed for a total of 250 minutes by all users, then the first media item will be ranked higher than the second media item.
  • the media items are ranked according to user popularity as indicated by user votes and/or the number or value of monetary donations associated with that media item.
  • ranking according to cumulative time, user vote, and donations could be combined as appropriate.
  • other appropriate techniques for ranking the media items could be used.
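A simple illustration of such a ranking element is sketched below, scoring each media item by a weighted combination of cumulative viewing time, votes and donation value; the weights and the example catalogue are invented for illustration and could be combined in any appropriate way.

```python
# Rank media items by a weighted popularity score.
def rank_media(items, w_time=1.0, w_votes=50.0, w_donations=10.0):
    """items: dicts with 'title', 'minutes_viewed', 'votes' and 'donations'."""
    def score(item):
        return (w_time * item["minutes_viewed"]
                + w_votes * item["votes"]
                + w_donations * item["donations"])
    return sorted(items, key=score, reverse=True)

catalogue = [
    {"title": "First performance", "minutes_viewed": 500, "votes": 12, "donations": 8.0},
    {"title": "Second performance", "minutes_viewed": 250, "votes": 5, "donations": 5.0},
]
for item in rank_media(catalogue):
    print(item["title"])   # "First performance" ranks above "Second performance"
```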
  • Figure 16 is a flow chart of a method of operation of the portable processing apparatus 1 according to examples of the disclosure such as those described above with reference to Figure 4.
  • the video camera 2 captures images of a scene comprising an augmented reality marker such as augmented reality marker 20.
  • the augmented reality marker is detected within the captured images for example by the processor 10.
  • a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced is displayed.
  • the selected media content is received from the media content source and, at a step s10, the selected media content is reproduced by the reproduction unit as augmented reality media within the captured images.
  • the flow chart of Figure 16 can be thought of as relating to the operation of the apparatus 1 for example as described above such as with respect to Figure 4.
  • Figure 17 is a flow chart of a method of operation of a media distribution system such as the media distribution system 93.
  • media content comprising three dimensional video data is displayed on a first display unit (such as the reproduction unit 97).
  • media content is communicated between the media server 95 and the first display unit.
  • the media server 95 may transmit 3D media content to the first display unit, for example to update the media content displayed at the step s12.
  • the video camera 2 captures images of a scene comprising a first display such as that of the reproduction unit 97.
  • the content reproduction unit 4 reproduces audio and visual content such as images and audio relating to the 3D media content as captured within the images captured by the video camera 2 at the step s16.
  • information associated with the media content displayed on the first display is displayed on the display 6 of the apparatus 1 .
  • bidirectional communication between the media server 95 and the portable processing apparatus 1 is carried out, for example so as to allow information relating to the media content to be displayed or to allow a donation operation (such as that described above with reference to Figure 15 or Figures 7 to 10) to be performed.
  • At a step s24, one or more commands associated with the media content are input (for example to the apparatus 1) using the user input device.
  • a user command might relate to a donate operation, or a vote operation, such as those described above, although it will be appreciated that any other suitable user input could be used.
  • elements of the methods may be implemented in a portable processing apparatus and/or media distribution system in any suitable manner.
  • the required adaptation to a conventional portable processing apparatus (such as a smart phone) may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signal on a network such as an ethernet, a wireless network, the internet, or any combination of these or other networks.
  • the media distribution system could comprise one or more portable processing apparatuses having the same or similar functionality to the portable processing apparatus 1.
  • the media content comprises a musical performance or a dramatic performance.
  • the media content and/or a performance could comprise any type of media content such as that relating to a dance performance, video installation art performance, mixed media performance, recorded audio content and/or video, film, television program and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Emergency Management (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A portable processing apparatus comprises a video camera operable to capture images of a scene comprising an augmented reality marker. The apparatus comprises a content reproduction unit operable to reproduce audio and visual content. The reproduction unit comprises a display operable to display media content associated with the augmented reality marker in combination with the images captured by the camera. The apparatus comprises a processor operable to detect the augmented reality marker within the captured images and a receiver operable to receive the media content from a media content source. In response to detection of an augmented reality marker, the processor is operable to cause the display to present a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced. In response to user selection of media content from the list, the processor is operable to cause the selected media content to be received by the receiver from the media content source and cause reproduction of the selected media content by the reproduction unit as augmented reality media within the captured images.

Description

PORTABLE PROCESSING APPARATUS, MEDIA DISTRIBUTION SYSTEM AND METHOD
The present disclosure relates to a portable processing apparatus, media distribution system and method.
This application claims priority from UK Patent application No GB1407586.5, the contents of which are hereby incorporated.
Live music and other live performances are perennially popular and performing artists are increasingly looking for novel and diverse ways of promoting their music and/or performances to the public. However, not all venues may have a public performing licence, or be able to provide live music or live performances. Additionally, performers (such as a band or other artist) with limited funds may find it difficult to promote their performances so as to become known to the public.
The present disclosure seeks to address the above problems. In particular, the present disclosure seeks to provide a portable processing apparatus, media distribution system and method which may help provide a more interactive user experience with performers and help performers promote their performances to a wider audience.
In a first aspect, there is provided a portable processing apparatus, comprising: a video camera operable to capture images of a scene comprising an augmented reality marker; a content reproduction unit operable to reproduce audio and visual content, the reproduction unit comprising a display operable to display media content associated with the augmented reality marker in combination with the images captured by the camera; a processor operable to detect the augmented reality marker within the captured images; a receiver operable to receive the media content from a media content source; in which: in response to detection of an augmented reality marker, the processor is operable to cause the display to present a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced; and in response to user selection of media content from the list, the processor is operable to cause the selected media content to be received by the receiver from the media content source and cause reproduction of the selected media content by the reproduction unit as augmented reality media within the captured images.
In a second aspect, there is provided a method performed by a portable processing apparatus comprising a video camera, and a reproduction unit comprising a display operable to display media content associated with an augmented reality marker in combination with the images captured by the camera, the method comprising: capturing, using the video camera, images of a scene comprising an augmented reality marker; detecting the augmented reality marker within the captured images; and in response to detection of an augmented reality marker, displaying a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced; and in response to user selection of media content from the list, receiving the selected media content from the media content source and reproducing the selected media content by the reproduction unit as augmented reality media within the captured images.
For example, the augmented reality marker can thus act as a virtual stage located at a venue such as a coffee shop, metro station or street corner. The user may then view a performance as augmented reality, for example as if a performer or performers were located at the venue.
In a third aspect, there is provided a media distribution system, comprising: a first reproduction unit comprising a first display unit operable to display media content comprising three dimensional video data; a media server operable to communicate media content with the first reproduction unit; a portable processing apparatus comprising: a video camera operable to capture images of a scene comprising the first display; a second reproduction unit operable to reproduce audio and visual content, the second reproduction unit comprising a second display operable to display information associated with the media content; a communication unit operable to communicate bidirectionally with the media server; a user input device arranged so that the user can input one or more commands associated with the media content.
In a fourth aspect, there is provided a media distribution method for implementation by a media distribution system, the media distribution system comprising a first reproduction unit comprising a first display unit, a media server, and a portable processing apparatus comprising a video camera, a second reproduction unit, a communication unit, and a user input device, the method comprising: displaying, on the first display unit, media content comprising three dimensional video data; communicating media content between the media server and the first reproduction unit; capturing, using the video camera, images of a scene comprising the first display; reproducing, using the second reproduction unit, audio and visual content, the second reproduction unit comprising a second display, and the method comprising displaying information on the second display, the information being associated with the media content displayed on the first display; communicating bidirectionally with the media server using the communication unit; and inputting one or more commands associated with the media content using the user input device.
Various other aspects and features of the invention are defined in the appended claims.
Examples of the disclosure will now be described by way of example with reference to the accompanying drawings, throughout which like references refer to like parts, and in which:
Figure 1 is a schematic view of elements of a portable processing apparatus;
Figure 2 is a schematic view of the portable processing apparatus and an augmented reality marker;
Figure 3 is a schematic view of an augmented reality marker;
Figure 4 is a schematic view of the portable processing apparatus comprising a performance page;
Figure 5 is a schematic view of the portable processing apparatus comprising a list of virtual stages;
Figure 6 is a schematic view of the portable processing apparatus comprising a map view of locations of augmented reality markers;
Figure 7 is a schematic view of the portable processing apparatus comprising a performance page for user donation to a performer;
Figure 8 is a schematic view of the portable processing apparatus for operation of a performer donation action;
Figures 9 and 10 are schematic views of operation of a performer donation action and response;
Figure 11 is a schematic view of the portable processing apparatus comprising a configuration page;
Figure 12 is a schematic view of the portable processing apparatus comprising a performer preferences page;
Figures 13 and 14 are schematic views of the portable processing apparatus illustrating locations of augmented reality markers with respect to the apparatus;
Figure 15 is a schematic view of a media distribution system;
Figure 16 is a flow chart of a method of operation of a portable processing apparatus; and
Figure 17 is a flow chart of a method of operation of a media distribution system.
Examples of a portable processing apparatus, media distribution system and method will now be described. In the following description, a number of specific details are presented in order to provide a thorough understanding of the examples of the disclosure. It will be apparent however to a person skilled in the art that these specific details need not be employed to practise the present disclosure. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity in presenting the examples.
Figure 1 schematically shows a portable processing apparatus 1. In examples, the apparatus is a so-called smart phone although it will be appreciated that other portable processing apparatuses having the functionality described herein could be used.
The apparatus 1 comprises a video camera 2 operable to capture images of a scene comprising an augmented reality marker and a content reproduction unit 4 operable to reproduce audio and visual content. The reproduction unit 4 comprises a display 6 operable to display media content associated with the augmented reality marker in combination with images captured by the camera. In examples, the reproduction unit 4 comprises an audio output unit 8 operable to output audio from the apparatus 1, for example to reproduce audio of the media content. In examples, the display 6 comprises a touch sensor for user input to the apparatus. In examples, the touch sensor is a capacitive touch sensor such as those known in the art, although it will be appreciated that any other type of sensor or touch screen could be used. In other words, for example, the touch sensor can act as a user input device although it will be appreciated that the apparatus 1 may comprise other input elements such as one or more user input buttons. The apparatus 1 comprises a processor 10, which acts to control the apparatus 1. The processor 10 is arranged to be in communication with the video camera 2 and the content reproduction unit 4. In examples, the processor 10 is operable to act as an augmented reality marker detector operable to detect one or more augmented reality markers within the captured images. In examples, the processor is operable to apply known techniques to detect the one or more markers. The apparatus 1 comprises a memory 11 in communication with the processor 10 for storing program code and other data for the apparatus, such as media content. In examples, the memory 11 comprises a solid state storage device although it will be appreciated that any other suitable storage device could be used.
The apparatus 1 comprises a transmitter/receiver 12 for transmitting data to and/or receiving data from a media content source. In examples, the media content source is a media server operable to communicate with the apparatus 1 over a wireless network such as Wi-Fi, Bluetooth, or mobile telecommunications such as 3G or 4G. However, it will be appreciated that any suitable communication system could be used. Additionally, in examples, the apparatus 1 is operable to communicate with the internet and other mobile or portable devices such as other smart phones via the transmitter/receiver 12.
In examples, the apparatus 1 comprises a motion sensor 14 operable to detect motion of the apparatus 1 (for example by detecting acceleration of the apparatus about 6 axes using an accelerometer - three translational axes and three rotational axes) using known techniques such as those commonly found in smart phones and other portable devices.
In examples, the apparatus 1 comprises a position detector 16 (location detector) operable to detect the geographical position of the apparatus 1. In examples, the position detector 16 is operable to detect the geographical position of the apparatus 1 using global positioning satellite (GPS) signals although it will be appreciated that any other suitable system could be used such as location detection based on mobile telephone network signals. In examples, the position detector is operable to detect the geographical position of the apparatus with respect to the marker.
The operation of the motion sensor 14 and the position detector 16 will be described in more detail later below.
In response to detection of a predetermined augmented reality marker, the processor 10 (augmented reality marker detector) is operable to cause the display 6 to present a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced. Additionally, in response to user selection of media content from the list, the processor 10 is operable to cause the selected media content to be received by the transmitter/receiver 12 from the media content source and cause reproduction of the selected media content by the reproduction unit as augmented reality media within the captured images.
This functionality will now be described in more detail with reference to Figures 2 to 4. Figure 2 is a schematic view of the portable processing apparatus 1 and an augmented reality marker 20. In particular, in the example of Figure 2, the augmented reality marker 20 is located on a table 22, for example in a coffee shop. In examples, the augmented reality marker 20 is fixedly attached to the table, for example by lamination onto the table's surface, although it will be appreciated that it could be attached in any other suitable manner. In some examples, the marker 20 is detachable from the table 22. In other words, for example the marker 20 is separate from the table 22. Additionally, it will be appreciated that the marker 20 could be attached to or form all or part of any other suitable object such as a bar, counter top, wall, advertisement and the like.
In examples, the camera 2 is arranged to capture an image of the marker 20 as indicated by dashed lines 24. The processor 10 is operable to carry out augmented reality processing so as to cause media content such as video images to be combined with the captured images.
Figure 3 is a schematic view of the augmented reality marker 20. In examples, the augmented reality marker is two-dimensional (2D) and comprises a plurality of symbols or shapes which can be detected by the processor 10 within the captured images using known techniques. However, it will be appreciated that the marker 20 could be three dimensional (3D), linear or any other configuration suitable for use as an augmented reality marker.
Figure 4 is a schematic view of the portable processing apparatus 1 comprising a performance page. In examples, the processor 10 is operable to cause the performance page to be displayed on the display 6, in response to a user command. The performance page comprises a playback control area 28, a performance area 30, and a performance settings area 32. The playback control area 28 comprises a plurality of input icons (such as input icons 34) for controlling reproduction of media items. In examples, the playback control area 28 comprises information relating to the media item or items that are to be reproduced in the performance area 30. In examples, the performance area 30 comprises a donate icon 36 for donating credit to a performer. This functionality will be described in more detail later below. In examples, the performance control area comprises a credit icon 38, an audio stage icon 40, an artist list icon 42, and a stage location icon 44, the functionality of which will be described in more detail later below.
In examples, the augmented reality marker 20 can act as a virtual stage on which a performer or performers such as musicians, dancers and/or actors can perform. For example, a user could walk into a coffee shop, capture an image of the augmented reality marker 20 using the camera 2 of the apparatus 1 so that a list of the media content is caused to be displayed on the display 6. The user could then select their desired performance from the list via a suitable user interface such as the playback control area 28. For example, the processor 10 is operable to cause the selected media content to be, for example, streamed or otherwise received from the media content source so that it can be displayed as augmented reality content within the performance area 30, for example together with reproduction of associated audio by the audio output unit 8. In other words, for example, the augmented reality marker 20 can be thought of as a virtual stage on which one or more performers can be caused to appear as if within the coffee shop. However, it will be appreciated that one or more performers could be displayed and reproduced.
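By way of illustration only, the following Kotlin sketch outlines one possible realisation of the detect-marker, present-list, select and stream flow described above. All identifiers (MediaItem, MediaSource, AugmentedRealityController and so on) are hypothetical names introduced for this sketch and do not form part of the disclosure; marker detection, networking and rendering are abstracted away.

```kotlin
// Hypothetical sketch of the detect -> list -> select -> reproduce flow.
data class MediaItem(val id: String, val title: String, val performer: String)

interface MediaSource {
    fun listForMarker(markerId: String): List<MediaItem>  // media items associated with a marker
    fun stream(item: MediaItem): ByteArray                // selected content from the content source
}

class AugmentedRealityController(private val source: MediaSource) {
    // Called when the processor detects a marker within the captured images.
    fun onMarkerDetected(markerId: String): List<MediaItem> =
        source.listForMarker(markerId)                    // presented to the user as a selectable list

    // Called when the user selects an item from that list.
    fun onUserSelection(item: MediaItem, reproduce: (ByteArray) -> Unit) {
        val content = source.stream(item)                 // received via the transmitter/receiver
        reproduce(content)                                // rendered as AR media over the captured images
    }
}
```

In such a sketch the controller is deliberately unaware of how the list is rendered or how the content is displayed; those steps correspond to the display 6 and reproduction unit 4 described above.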
In examples, the processor is operable to cause the display 6 to display a list of artists (performers) associated with an augmented reality marker in response to user input to the artist list icon 42. The user can thus select a performer or artist that they wish to view within the performance area 30.
In examples, the reproduction of the media content can be controlled by user input to the playback control area 28, for example by touch input to the input icons 34 so as to perform playback operations such as play, pause, fast forward, rewind, although it will be appreciated that any other suitable playback operations could be implemented.
Additionally, in examples, one or more markers may be located at different locations. Therefore, for example, a user may travel around a city and be able to access a performance, as if they were attending a gig or the like, by operation of the augmented reality functionality of the apparatus 1. Additionally, it will be appreciated that the markers could be the same or they could be different from each other, for example different markers at different locations, or for association with different respective performers.
Figure 5 is a schematic view of the portable processing apparatus comprising a list of virtual stages, and Figure 6 is a schematic view of the portable processing apparatus comprising a map view of locations of augmented reality markers. In examples, referring to Figure 5, the processor 10 is operable to cause a location list area 46 comprising a list of locations of augmented reality markers to be displayed on the display 6 in response to user input to the stage icon 40. In examples, the location list comprises a predetermined number of marker locations that are geographically closest to the apparatus 1. In examples, the location of the apparatus 1 is detected by the location detector 16 using known techniques. In examples, the apparatus 1 is operable to communicate with a server comprising an augmented reality marker location database so as to determine the location of the marker or markers nearest to the apparatus 1, although it will be appreciated that the locations of the marker(s) could be stored in the apparatus 1 or determined by any other appropriate technique. In the example shown in Figure 5, the location list comprises eight locations, although it will be appreciated that any suitable number of locations could be displayed and/or that the user could scroll through the list using known user input techniques.
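A minimal sketch of how such a "nearest stages" list could be assembled from a marker location database is given below, assuming a great-circle (haversine) distance is an acceptable approximation; the function and type names are hypothetical and introduced only for illustration.

```kotlin
import kotlin.math.*

data class MarkerLocation(val id: String, val lat: Double, val lon: Double)

// Great-circle distance between two latitude/longitude points, in metres.
fun haversineMetres(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val rad = PI / 180.0
    val earthRadius = 6_371_000.0
    val dLat = (lat2 - lat1) * rad
    val dLon = (lon2 - lon1) * rad
    val a = sin(dLat / 2).pow(2) + cos(lat1 * rad) * cos(lat2 * rad) * sin(dLon / 2).pow(2)
    return 2 * earthRadius * asin(sqrt(a))
}

// Return the 'count' markers geographically closest to the apparatus position,
// mirroring the eight-entry location list of Figure 5.
fun nearestMarkers(deviceLat: Double, deviceLon: Double,
                   markers: List<MarkerLocation>, count: Int = 8): List<MarkerLocation> =
    markers.sortedBy { haversineMetres(deviceLat, deviceLon, it.lat, it.lon) }.take(count)
```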
In examples, the apparatus is operable to display a map of locations of one or more markers (such as a location of the marker 20). For example, referring to Figure 6, the apparatus 1 is operable to cause a map view page to be displayed on the display 6. In examples, the map view page comprises a map view area 48 together with the performance control area 32. The processor 10 is operable to cause a map to be displayed within the map view area 48 on the display 6 indicating the position of the apparatus (as shown by star icon 50) together with the positions of a first augmented reality marker 52 and a second augmented reality marker 54. In examples, the apparatus 1 is operable to cause the map view and/or location list to be displayed in response to user input at the find stage icon 44. In examples, the apparatus is operable to display the map view area 48 so that the user can navigate around it using known user interface techniques such as panning, and zooming. The user can thus use the map view page and/or use the location list area to find their way to an augmented reality marker for example located at a coffee shop, restaurant, or other public or private location, so as to be able to access media content associated with the augmented reality marker.
In examples, the apparatus is operable to cause a financial transfer to occur between a user's account and a performer's account, for example so that the user can donate money to the performer, in virtual similarity to that which often occurs, for example, when throwing money into a busker's hat or donating money to a street artist. This will now be described in more detail with reference to Figures 7 to 10.
Figure 7 is a schematic view of the portable processing apparatus comprising a performance page for user donation to a performer. In the example shown in Figure 7, the performance page is substantially the same as that described above with reference to Figure 4. In examples, the display is operable to display the performance area 30 associated with the media content together with a donation area associated with a donation to a performer displayed in the performance area.
For example, referring to Figure 7, an augmented reality performance could be playing in the performance area 30. For example, in the context of the disclosure an augmented reality performance is taken to be reproduction of media content associated with an augmented reality marker detected in the captured images, for example using known augmented reality techniques. In examples, the donate icon 36 can be considered to be a donation area, although it will be appreciated that the donation area of the display could be larger or smaller than this and dynamically sized and positioned depending on user input and the functionality of the apparatus 1.
In examples, the apparatus 1 is operable to cause a financial transfer to occur from a user's account to a performer's account in response to user input at the donation area (e.g. donate icon 36) of the display. In examples, a user has a user account associated with the apparatus and each performer has a performer account. It will be appreciated that each performer could have their own account or a band or group could have a joint account. In other words, a performer should be taken to mean one or more performers. The accounts are operable to communicate with each other using known online banking and/or mobile telephone banking and financial techniques, for example so that money can be transferred from a user's account to a performer's account.
In examples, in response to user input at the donate icon 36, the processor is operable to cause a plurality of donation amount icons 56a-d to be displayed on the display 6 in the performance area 30. In examples, the donation amount icons 56a-d can be thought of as corresponding to the donation area. In examples, each donation amount icon is associated with a different monetary amount for donation to the performer(s). In the example of Figure 7 there are four donation amount icons although it will be appreciated that any appropriate number of donation amount icons could be used. In examples, the donation amount icons 56a-d are represented as coins or notes although it will be appreciated that any other visual representation could be used for the donation amount icons.
In an example, a user may choose to donate an amount corresponding to the amount indicated by the donation amount icon by touching the appropriate donation amount icon 56a-d. In response to user input to a donation amount icon (donation area), the apparatus 1 is operable to cause a monetary amount corresponding to the value indicated by the donation amount icon to be transferred from the user's account to the performer's account, for example by communicating with an appropriate banking server. In other examples, the apparatus is operable to cause the financial transfer to occur in response to a user gesture between the donation area and the performance area.
Figure 8 is a schematic view of the portable processing apparatus 1 for operation of a performer donation action. For example, a user may select to donate an amount corresponding to that indicated by donation amount icon 56c by touching the display 6 at the position of the donation amount icon 56c. In examples, to cause the transfer to occur between the user's account and the performer's account, the user can perform a user gesture, such as a flicking gesture towards the performance area where the performer appears to be within the augmented reality images. The apparatus 1 is operable to detect the user gesture and cause the monetary transfer to occur between the user's account and the performer's account. However, it will be appreciated that other user input gestures such as a slide gesture, multitouch input gesture or any other appropriate user input gesture could be used. Accordingly, for example, the user can easily donate money to the performer by a simple touch gesture.
As mentioned above, in examples, the apparatus 1 comprises a motion sensor operable to detect motion of the apparatus. In examples, the apparatus 1 is operable to cause a financial transfer to occur from the user's account to a performer's account associated with the media content in response to a detection of a predetermined movement of the apparatus by the motion sensor. In examples, the apparatus is operable to detect a user input motion gesture, such as shaking of the apparatus 1 by a user, and the number of shakes. In examples, the apparatus 1 is operable to detect the number of times that the apparatus 1 is shaken, with the number of shakes corresponding to the donation amount to be transferred from the user's account to the performer's account. In other examples, any detected shaking of the apparatus 1 will cause a predetermined amount to be transferred from the user's account to the performer's account. Therefore, in examples of the disclosure, the user may donate money to the performer simply by shaking the apparatus 1. This may provide a simpler and more intuitive interface for the user as well as making donation quicker and so more likely to occur.
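The following sketch illustrates one possible mapping from a detected shake count to a transferred amount, under the assumption of a fixed per-shake unit; the Account type, the unit value and the in-memory transfer are illustrative only, since in practice the transfer would be carried out via a banking server as described above.

```kotlin
// Hypothetical sketch of the shake-to-donate mapping described above.
data class Account(val id: String, var balancePence: Long)  // balance held in pence/cents

fun donateForShakes(user: Account, performer: Account, shakeCount: Int, unitPence: Long = 50L) {
    val amount = shakeCount * unitPence                      // e.g. 3 detected shakes -> 150 pence
    require(user.balancePence >= amount) { "insufficient credit" }
    user.balancePence -= amount
    performer.balancePence += amount                         // in practice performed by a banking server
}
```

A fixed-amount variant (any detected shake transfers a single predetermined amount) would simply ignore the shake count and pass 1 for shakeCount.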
Figures 9 and 10 are schematic views of operation of a performer donation action and response. In examples, in response to user input which can cause a donation to occur (for example, the user input touch gesture or user input motion gesture) the apparatus 1 is operable to cause an animation to be displayed on the display 6 corresponding to a coin or note being tossed towards the performer or performers. However, it will be appreciated that any other animation could be generated and displayed. More generally, in examples, the apparatus 1 is operable to cause predetermined media content to be reproduced in response to reception of the financial transfer by the performer's account, for example in a performance area as indicated by Figure 10. In examples, a performer can pre-record media content such as singing or saying "thank you" or visual acknowledgement of the donation for audio and/or visual reproduction on receipt of a donation into the performer's account.
In examples, a performer or performers can record a plurality of responses to receipt of donations and store them on the content source (which can be thought of as a content server). In examples, the content of the predetermined media content is dependent on the monetary amount of the donation. In other words, for example, a larger donation amount could cause a longer media clip to be reproduced, or media content in which the performer is more effusive with their thanks to be played, whereas a smaller donation amount might cause a shorter media clip to be reproduced. However, it will be appreciated that any predetermined media content could be reproduced in response to reception of the financial transfer and/or the donation amount. This functionality may improve the interaction between the performer and user and help build more of a fan base for the performer.
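One way the acknowledgement clip could be selected according to the donated amount is sketched below; the tiering scheme, thresholds and names are assumptions made purely for illustration.

```kotlin
// Hypothetical sketch: pick the most generous pre-recorded response tier the donation reaches.
data class ThankYouClip(val uri: String, val minPence: Long)

fun clipForDonation(amountPence: Long, clips: List<ThankYouClip>): ThankYouClip? =
    clips.filter { amountPence >= it.minPence }   // tiers the donation qualifies for
         .maxByOrNull { it.minPence }             // the longest / most effusive tier reached

// Example usage with illustrative tiers at 50, 200 and 500 pence.
fun main() {
    val clips = listOf(
        ThankYouClip("short_thanks.mp4", 50),
        ThankYouClip("long_thanks.mp4", 200),
        ThankYouClip("effusive_thanks.mp4", 500)
    )
    println(clipForDonation(250, clips))          // -> long_thanks.mp4 tier
}
```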
Figure 11 is a schematic view of the portable processing apparatus comprising a configuration page 58. In examples, the apparatus 1 is operable to cause the configuration page to be displayed in response to user input at a logo icon 60 displayed on the display 6 although it will be appreciated that the configuration page could be displayed in response to other suitable user input(s). In examples, the configuration page 58 comprises a configuration list of attributes of the apparatus that can be configured. In examples, the configuration list comprises a plurality of list items relating to an attribute or attributes of the apparatus 1. In examples, the list items comprise one or more of the following items: favourite performers 64; purchased songs 66; stage heroes 68; stage preferences 70; performer preferences 72; stage stroller 74; visual performances 76; audio performances 78; settings 80; account info 82; credit settings 84; and stage notifications 86.
In examples, in response to user input at a position corresponding to favourite performers 64 the apparatus is operable to cause a configuration page to be displayed which allows a user to configure which performers they like so that the media content can be easily found on the apparatus or content server. In examples, in response to user input at a position corresponding to purchased songs 66 the apparatus is operable to cause a configuration page to be displayed which allows a user to view and manage songs or other media content they have purchased. In examples, in response to user input at a position corresponding to stage heroes 68, the apparatus is operable to cause a page to be displayed which allows a user to view top performers of the hour, week and month, for example those sponsored by brands, so as to allow people to quickly see who the best performers are.
In examples, in response to user input at a position corresponding to stage preferences 70 the apparatus is operable to cause a configuration page to be displayed which allows preferences in relation to the augmented reality appearance to be configured. In examples, in response to user input at a position corresponding to performer preferences 72 the apparatus is operable to cause a configuration page to be displayed which allows a user to set preferences for performance such as preferred music style as will be described later below with respect to Figure 12.
In examples, in response to user input at a position corresponding to stage stroller 74 the apparatus is operable to cause a configuration page to be displayed in relation to a stage stroller setting - the stage stroller setting will be described in more detail below with reference to Figures 13 and 14. In examples, in response to user input at a position corresponding to visual performances 76 the apparatus is operable to allow a user to switch reproduction of visual performances of the media content on or off. In examples, in response to user input at a position corresponding to audio performances 78 the apparatus is operable to allow a user to switch audio performances of the media content on or off. In examples, in response to user input at a position corresponding to settings 80 the apparatus is operable to cause a configuration page to be displayed which allows a user to configure general settings of the apparatus such as in relation to the application running the augmented reality processing.
In examples, in response to user input at a position corresponding to account info 82 the apparatus is operable to cause a configuration page to be displayed which allows a user to configure and/or edit information in relation to the user account, for example in relation to bank details for financial transfers for donation to performers' accounts. In examples, in response to user input at a position corresponding to credit settings 84 the apparatus is operable to cause a configuration page to be displayed which allows a user to configure credit settings such as the amount or amounts to donate to a performer or to be displayed as donation amount icons 56a-d. In examples, in response to user input at a position corresponding to stage notifications the apparatus is operable to allow the user to switch stage notifications on or off. In examples, the apparatus is operable to display a notification and/or output an audio alarm, as a stage notification, if the apparatus is within a threshold distance of an augmented reality marker (a "stage").
Figure 12 is a schematic view of the portable processing apparatus comprising a performer preferences page. In examples, the apparatus 1 is operable to display a list of musical genres comprising rock and roll, Arabic, singer-songwriter, rap, funk, jazz, classical, and R&B although it will be appreciated that one or more of these genres could be included in the list and that other musical genres could be used. In examples, the user is able to select which genres are of interest to them by input to tick boxes in the list using known techniques. Additionally, in examples, the apparatus 1 is operable to allow a user to scroll through the list of genres in the performer preferences page, for example using known techniques. However, it will be appreciated that other performer preferences could be included in the performer preferences page.
The functionality of the stage stroller mode will now be described in more detail with reference to Figures 13 and 14.
Figures 13 and 14 are schematic views of the portable processing apparatus illustrating locations of augmented reality markers with respect to the apparatus. In particular, Figure 13 schematically shows a performance page which is substantially the same as that described above with respect to Figure 4. However, in the example shown in Figure 13, the performance settings area 32 comprises an audio mode icon 88 which is operable to switch the stage stroller mode on or off depending on user input to the audio mode icon 88. In examples, user input to the audio mode icon causes a sub-window 90 to be displayed on the display 6 which allows the user to select an audio mode such as the stage stroll mode.
In response to selection of the stage stroll mode, the apparatus 1 is operable to cause a stage stroll mode to be implemented. In particular, Figure 14 schematically shows a map view in a similar manner to that described above with respect to Figure 6. However, in the example of Figure 14, the apparatus 1 is operable to display a stage stroll control area 92 for controlling operation of the stage stroll mode. In examples, the stage stroll mode allows reproduction of audio content associated with one or more augmented reality markers in dependence upon the relative position between the one or more markers and the apparatus 1.
In examples, the location detector 16 is operable to detect the relative distance between the marker and the apparatus, and, if the relative distance is less than a threshold distance, cause the reproduction unit to reproduce audio content of the media content associated with the marker. For example, a user could be walking around a city and if they are within the threshold distance of an augmented reality marker then the apparatus is operable to cause audio content associated with that marker to be reproduced. This helps facilitate discovering performances in a similar manner to travelling around a city and overhearing, for example, live or recorded performances from music venues or other performance venues. The user can then walk in the direction of the augmented reality marker, for example if they are interested in the media content that is being reproduced.
In examples, the location detector is operable to detect a relative distance between the apparatus 1 and a plurality of augmented reality markers, each marker having an associated threshold distance. For example, the first threshold distance (indicated by the dashed line 94) is associated with the marker 54 and the second threshold distance (indicated by the dashed line 96) is associated with the marker 52. However, it will be appreciated that any suitable number of markers could be used, each with their own associated threshold distance.
For example referring to Figure 14, the position of the apparatus 1 as indicated by the star icon 50 is within a first threshold distance (indicated by dashed line 94) of the augmented reality marker 54. Therefore, for example the apparatus 1 is operable to reproduce audio content associated with the augmented reality marker 54. However, the position of the apparatus 1 as indicated by the star icon 50 is outside a second threshold distance (indicated by dashed line 96) associated with the augmented reality marker 52.
In other words, more generally for example, the location detector 16 is operable to cause reproduction of media content associated with those markers whose relative distance between the respective marker and the apparatus is less than the respective threshold distance for that marker.
In other examples, the location detector 16 is operable to cause reproduction of media content associated with the marker detected as having the closest relative distance between the marker and the apparatus. For example, referring to Figure 14, the position of the apparatus 1 (indicated by the star icon 50) is closer to the marker 54 than the marker 52 and so the apparatus is operable to reproduce media content associated with the marker 54.
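The two stage-stroll behaviours just described (reproducing every marker within its own threshold distance, or only the closest such marker) could be expressed as in the following sketch; the distances are assumed to be supplied by the location detector, and the type and function names are illustrative only.

```kotlin
// Hypothetical sketch of the per-marker threshold check used in the stage stroll mode.
data class StageDistance(val markerId: String, val distanceMetres: Double, val thresholdMetres: Double)

// All markers whose relative distance is below their own threshold distance.
fun markersWithinThreshold(stages: List<StageDistance>): List<StageDistance> =
    stages.filter { it.distanceMetres < it.thresholdMetres }

// Alternatively, only the closest marker that is within its threshold distance.
fun closestAudibleMarker(stages: List<StageDistance>): StageDistance? =
    markersWithinThreshold(stages).minByOrNull { it.distanceMetres }
```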
In examples, the reproduction unit 4 is operable to reproduce audio of the media content associated with a respective marker so that the volume of the audio of the media content is dependent on the relative distance between the apparatus and the respective marker. For example, the closer the apparatus 1 is to a marker the louder the volume of the reproduced audio content. However, it will be appreciated that any other appropriate dependence of the volume on the distance between the apparatus and the marker could be used. In examples, the volume is linearly dependent on the distance between the marker and the apparatus 1, although it will be appreciated that any other scaling such as logarithmic scaling could be used.
In some examples, the reproduction unit 4 is operable to reproduce audio of the media content associated with the plurality of the markers (such as markers 52 and 54) so as to mix the audio content associated with the markers in dependence upon the relative distance between the apparatus and the respective marker. For example, referring to Figure 14, audio content associated with both the markers 52 and 54 can be reproduced and mixed together. In examples, the relative volumes of the respective audio content associated with the markers 52 and 54 are dependent upon the relative distance between the apparatus 1 and the markers 52 and 54. In the example shown in Figure 14, the apparatus 1 is closer to the marker 54 than the marker 52 and so the audio content associated with the marker 54 will be mixed with the audio content associated with the marker 52 so that the audio content associated with the marker 54 is louder than that associated with the marker 52. However, it will be appreciated that other techniques for mixing the audio content associated with the markers could be used. Therefore, for example, a user can stroll around a city or other location in which one or more augmented reality markers are located and experience audio as if they were wandering around an area where live or reproduced music is being played in venues in the city or other location.
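A possible distance-dependent gain and mixing rule consistent with the above is sketched below, assuming a linear fall-off of volume to zero at each marker's threshold distance and a simple normalisation of the resulting gains; other scalings (for example logarithmic) could equally be used, and the names are hypothetical.

```kotlin
// Hypothetical sketch: linear distance-to-gain mapping and relative mixing of several markers.
fun gainFor(distanceMetres: Double, thresholdMetres: Double): Double =
    ((thresholdMetres - distanceMetres) / thresholdMetres).coerceIn(0.0, 1.0)  // 1.0 at the marker, 0.0 at the threshold

fun mixGains(distances: Map<String, Double>, thresholds: Map<String, Double>): Map<String, Double> {
    val raw = distances.mapValues { (markerId, d) -> gainFor(d, thresholds.getValue(markerId)) }
    val total = raw.values.sum()
    // Normalise so the nearer marker dominates the mix; leave everything silent if all gains are zero.
    return if (total == 0.0) raw else raw.mapValues { it.value / total }
}
```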
A media distribution system will now be described with reference to Figure 15. Figure 15 is a schematic view of a media distribution system 93 according to examples of the disclosure. In examples, the media distribution system 93 comprises the portable processing apparatus 1, a plurality of augmented reality markers 20 and a media server 95. In examples, the media server 95 can act as the media source. In examples, the system 93 comprises a reproduction unit 97 comprising a display unit operable to display media content comprising three dimensional (3D) video data. In examples, the reproduction unit 97 comprises a so-called 3D TV although it will be appreciated that any other suitable apparatus for displaying 3D video content could be used.
In examples, the media server 95 is operable to communicate media content with the reproduction unit 97 as indicated by the dashed line 100. In other words, for example, the media server 95 is operable to transmit media content to the reproduction unit 97 for reproduction by the display unit of the reproduction unit 97. In examples, the media server 95 is operable to transmit media content comprising a performance by a performer to the reproduction unit 97 although it will be appreciated that the media content could comprise any other suitable media content.
In examples, the video camera 2 is operable to capture images of a scene comprising the display of the reproduction unit 97 and the display 6 of the content reproduction unit 4 is operable to display information associated with the media content such as artist name, track name and the like. In other words, the apparatus 1 can be used to display information related to the 3D media content, for example by detection of images displayed by the reproduction unit 97. In examples, the reproduction unit 97 is operable to display an augmented reality marker so as to provide similar or the same functionality to that described above with respect to the augmented reality markers 20, 52, and 54. In examples, the apparatus 1 is operable to communicate bidirectionally with the media server 95 as indicated by the dashed line 102, for example, to receive metadata relating to the media content being reproduced by the reproduction unit 97 and/or to receive media content for example as described above with respect to Figures 1 to 14.
As mentioned above, in examples, the apparatus 1 comprises a user input device, for example a touch screen, although it will be appreciated that the apparatus 1 could comprise any suitable user input device. In examples, the user input device is arranged so that the user can input one or more commands associated with the media content. For example, the user could use the apparatus 1 to control playback functions of the media content reproduced by the reproduction unit 97. In other words, for example, the reproduction unit 97 can act as an augmented reality display, for example by displaying video and reproducing audio content recorded by a performer or performers. In examples, the user input device is operable to allow other functions to be carried out, such as donation to a performer's account, voting on quality, purchase of media content (such as the currently playing performance), leaving messages for the performer, and the like.
As mentioned above, in examples, the display 6 comprises the user input device and the user input device comprises a touch sensor for user input to the apparatus 1. In examples, the display 6 is operable to display a performance area associated with the media content together with a donation area associated with a donation to a performer displayed in the performance area. For example, the video camera 2 could capture images of the media content reproduced by the reproduction apparatus 97 and display the media content in the performance area 30. In examples, the apparatus 1 is operable to cause a financial transfer to occur from a user's account to a performer's account in response to a user command comprising user input at the donation area of the display in a similar manner to that described above.
In examples, the apparatus 1 is operable to cause the financial transfer to occur in response to a user gesture between the donation area (such as donate icon 36) and the performance area 30. Additionally, as mentioned above, the apparatus 1 comprises a motion sensor operable to detect motion of the apparatus 1 and the apparatus 1 is operable to cause a financial transfer to occur from the user's account to a performer's account associated with the media content in response to a detection of a predetermined movement of the apparatus by the motion sensor. In other words, for example, the media distribution system 93 described with respect to Figure 15 can allow donations from a user's account to one or more performers' accounts in a similar manner to that described above with respect to Figures 7 to 10. Therefore, for example, the media distribution system described herein can help provide a simplified and intuitive system which can for example allow a user to donate money to a performer or performers.
As mentioned above, in examples, the media distribution system comprises a plurality of augmented reality markers (such as augmented reality markers 20). In examples, each marker is associated with respective media content. In this context, respective media content should be taken to mean that each marker can be associated with the same media content or different media content. In some examples, each marker is associated with different media content although in other examples, each marker is associated with one or more media items that are the same for each marker. Alternatively, some markers may be associated with the same media content and some may be associated with different media content. As mentioned above with respect to Figures 1 to 4, in examples, the apparatus is operable to provide a list of media items for user selection.
In examples, the number of markers associated with media content generated by a particular performer is dependent upon the popularity of the performer. For example, more popular media content can have more markers associated with it and so that media content is more likely to be viewed by a user. More generally, in examples, the media distribution system comprises a ranking element operable to rank the media content according to user popularity. In examples, the media server 95 comprises the ranking element although it will be appreciated that this functionality could be implemented by any suitable device such as a separate server operable to communicate with the media server 95 and the apparatus 1. In examples, the ranking element is operable to rank the media content based on a total accumulative time that each media item has been viewed. For example, if a first media item has been viewed for a total of 500 minutes by all users and a second media item has been viewed for a total of 250 minutes by all users, then the first media item will be ranked higher than the second media item.
In other examples, the media items are ranked according to user popularity as indicated by user votes and/or the number or value of monetary donations associated with that media item. However, it will be appreciated that one or more of ranking according to cumulative time, user vote, and donations could be combined as appropriate. Additionally it will be appreciated that other appropriate techniques for ranking the media items could be used.
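The ranking element could, for example, be realised as in the sketch below, ordering media items by total accumulated viewing time and optionally combining viewing time, votes and donation value; the weightings shown are arbitrary assumptions made only for illustration.

```kotlin
// Hypothetical sketch of the ranking element.
data class MediaStats(val id: String, val viewedMinutes: Long, val votes: Long, val donationsPence: Long)

// Rank purely by accumulated viewing time: 500 viewed minutes ranks above 250.
fun rankByViewTime(items: List<MediaStats>): List<MediaStats> =
    items.sortedByDescending { it.viewedMinutes }

// Combined ranking; the weights (10 per vote, 1 per 100 pence donated) are illustrative.
fun rankCombined(items: List<MediaStats>): List<MediaStats> =
    items.sortedByDescending { it.viewedMinutes + 10 * it.votes + it.donationsPence / 100 }
```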
Figure 16 is a flow chart of a method of operation of the portable processing apparatus 1 according to examples of the disclosure such as those described above with reference to Figure 4.
At a step s2, the video camera 2 captures images of a scene comprising an augmented reality marker such as augmented reality marker 20. At a step s4, the augmented reality marker is detected within the captured images for example by the processor 10.
At a step s6, in response to detection of an augmented reality marker, a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced is displayed.
At a step s8, in response to user selection of media content from the list, the selected media content is received from the media content source and, at a step s10, the selected media content is reproduced by the reproduction unit as augmented reality media within the captured images.
In other words, for example, the flow chart of Figure 16 can be thought of as relating to the operation of the apparatus 1 as described above, for example with respect to Figure 4.
Figure 17 is a flow chart of a method of operation of a media distribution system such as the media distribution system 93.
At a step s12, media content comprising three dimensional video data is displayed on a first display unit (such as the reproduction unit 97). At a step s14, media content is communicated between the media server 95 and the first display unit. For example, the media server 95 may transmit 3D media content to the first display unit, for example to update the media content displayed at the step s12.
At a step s16, the video camera 2 captures images of a scene comprising a first display such as that of the reproduction unit 97. At a step s18, the content reproduction unit 4 reproduces audio and visual content such as images and audio relating to the 3D media content as captured within the images captured by the video camera 2 at the step s16. At a step s20, information associated with the media content displayed on the first display is displayed on the display 6 of the apparatus 1.
At a step s22, bidirectional communication between the media server 95 and the portable processing apparatus 1 is carried out, for example so as to allow information relating to the media content to be displayed or to allow a donation operation (such as that described above with reference to Figure 15 or Figures 7 to 10) to be performed. At a step s24, one or more commands associated with the media content are input (for example to the apparatus 1) using the user input device. For example, a user command might relate to a donate operation, or a vote operation, such as those described above, although it will be appreciated that any other suitable user input could be used.
It will be appreciated that in examples of the disclosure, elements of the methods may be implemented in a portable processing apparatus and/or media distribution system in any suitable manner. Thus adapting existing parts of a conventional portable processing apparatus (such as a smart phone) may comprise for example reprogramming one or more processors therein. As such the required adaptation may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signal on a network such as an ethernet, a wireless network, the internet, or any combination of these or other networks.
Although the examples have been described herein with respect to one portable processing apparatus, it will be appreciated that the media distribution system could comprise one or more portable processing apparatuses having the same or similar functionality to the portable processing apparatus 1.
In examples, the media content comprises a musical performance or a dramatic performance. However, it will be appreciated that the media content and/or a performance could comprise any type of media content such as that relating to a dance performance, video installation art performance, mixed media performance, recorded audio content and/or video, film, television programme and the like.
It will be appreciated that the features of one or more of any of the different examples described above may be combined together as appropriate with changes as appropriate which will be apparent to the skilled person from the present disclosure.
Insofar as various examples have been described with reference to the present disclosure, it will be appreciated that various modifications and alterations may be made without departing from the spirit and scope of the invention as defined in the appended claims and their equivalents.

Claims

1. A portable processing apparatus, comprising:
a video camera operable to capture images of a scene comprising an augmented reality marker;
a content reproduction unit operable to reproduce audio and visual content, the reproduction unit comprising a display operable to display media content associated with the augmented reality marker in combination with the images captured by the camera;
a processor operable to detect the augmented reality marker within the captured images; a receiver operable to receive the media content from a media content source;
in which:
in response to detection of an augmented reality marker, the processor is operable to cause the display to present a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced; and
in response to user selection of media content from the list, the processor is operable to cause the selected media content to be received by the receiver from the media content source and cause reproduction of the selected media content by the reproduction unit as augmented reality media within the captured images.
2. An apparatus according to claim 1, comprising a location detector operable to detect the geographical location of the apparatus with respect to the marker.
3. An apparatus according to claim 2, in which the location detector is operable to detect the relative distance between the marker and the apparatus, and, if the relative distance is less than a threshold distance, cause the reproduction unit to reproduce audio content of the media content associated with the marker.
4. An apparatus according to claim 3, in which:
the location detector is operable to detect a relative distance between the apparatus and a plurality of augmented reality markers, each marker having an associated threshold distance.
5. An apparatus according to claim 4, in which:
the location detector is operable to cause reproduction of media content associated with those markers whose relative distance between the respective marker and the apparatus is less than the respective threshold distance for that marker.
6. An apparatus according to claim 4, in which the location detector is operable to cause reproduction of media content associated with the marker detected as having the closest relative distance between the marker and the apparatus.
7. An apparatus according to any of claims 4 to 6, in which the reproduction unit is operable to reproduce audio of the media content associated with a respective marker so that the volume of the audio of the media content is dependent on the relative distance between the apparatus and the respective marker.
8. An apparatus according to any of claims 4 to 7, in which the reproduction unit is operable to reproduce audio of the media content associated with the plurality of the markers so as to mix the audio content associated with the markers in dependence upon the relative distance between the apparatus and the respective marker.
9. An apparatus according to any preceding claim, in which:
the display comprises a touch sensor for user input to the apparatus;
the display is operable to display a performance area associated with the media content together with a donation area associated with a donation to a performer displayed in the performance area;
the apparatus is operable to cause a financial transfer to occur from a user's account to a performer's account in response to user input at the donation area of the display.
10. An apparatus according to claim 9, in which the apparatus is operable to cause the financial transfer to occur in response to a user gesture between the donation area and the performance area.
11. An apparatus according to any preceding claim, comprising:
a motion sensor operable to detect motion of the apparatus;
in which the apparatus is operable to cause a financial transfer to occur from the user's account to a performer's account associated with the media content in response to a detection of a predetermined movement of the apparatus by the motion sensor.
12. An apparatus according to any of claims 9 to 11, in which the apparatus is operable to cause predetermined media content to be reproduced in response to reception of the financial transfer by the performer's account.
13. An apparatus according to claim 12, in which the content of the predetermined media content is dependent on the monetary amount of the donation.
14. An apparatus according to any preceding claim, in which the media content comprises a musical performance.
15. An apparatus according to any preceding claim, in which the media content comprises a dramatic performance.
16. A media distribution system comprising:
an apparatus according to any preceding claim;
a plurality of augmented reality markers, each marker being associated with respective media content.
17. A system according to claim 16, in which the number of markers associated with media content generated by a particular performer is dependent upon the popularity of the performer.
18. A system according to claim 16 or 17, comprising a ranking element operable to rank the media content according to user popularity.
19. A media system according to claim 18, in which the ranking element is operable to rank the media content based on a total accumulative time that each media item has been viewed.
20. A method performed by a portable processing apparatus comprising a video camera, and a reproduction unit comprising a display operable to display media content associated with an augmented reality marker in combination with the images captured by the camera, the method comprising:
capturing, using the video camera, images of a scene comprising an augmented reality marker;
detecting the augmented reality marker within the captured images; and
in response to detection of an augmented reality marker, displaying a list comprising information relating to media content associated with that marker for user selection of media content to be reproduced; and
in response to user selection of media content from the list, receiving the selected media content from the media content source and reproducing the selected media content by the reproduction unit as augmented reality media within the captured images.
21. A computer program for carrying out a method according to claim 20.
22. A recording medium comprising a computer program according to claim 21.
23. A media distribution system, comprising:
a first reproduction unit comprising a first display unit operable to display media content comprising three dimensional video data;
a media server operable to communicate media content with the first reproduction unit; a portable processing apparatus comprising:
a video camera operable to capture images of a scene comprising the first display; a second reproduction unit operable to reproduce audio and visual content, the second reproduction unit comprising a second display operable to display information associated with the media content;
a communication unit operable to communicate bidirectionally with the media server;
a user input device arranged so that the user can input one or more commands associated with the media content.
24. A system according to claim 23, in which:
the second display comprises the user input device and the user input device comprises a touch sensor for user input to the apparatus;
the second display is operable to display a performance area associated with the media content together with a donation area associated with a donation to a performer displayed in the performance area;
the apparatus is operable to cause a financial transfer to occur from a user's account to a performer's account in response to a user command comprising user input at the donation area of the display.
25. A system according to claim 24, in which the apparatus is operable to cause the financial transfer to occur in response to a user gesture between the donation area and the performance area.
26. A system according to any of claims 23 to 25, in which:
the apparatus comprises a motion sensor operable to detect motion of the apparatus; and the apparatus is operable to cause a financial transfer to occur from the user's account to a performer's account associated with the media content in response to a detection of a predetermined movement of the apparatus by the motion sensor.
27. A media distribution method for implementation by a media distribution system, the media distribution system comprising a first reproduction unit comprising a first display unit, a media server, and a portable processing apparatus comprising a video camera, a second reproduction unit, a communication unit, and a user input device, the method comprising:
displaying, on the first display unit, media content comprising three dimensional video data; communicating media content between the media server and the first reproduction unit; capturing, using the video camera, images of a scene comprising the first display;
reproducing, using the second reproduction unit, audio and visual content, the second reproduction unit comprising a second display, and the method comprising displaying information on the second display, the information being associated with the media content displayed on the first display;
communicating bidirectionally with the media server using the communication unit; and
inputting one or more commands associated with the media content using the user input device.
28. A computer program for carrying out a method according to claim 27.
29. A recording medium comprising a computer program according to claim 28.
30. A portable processing apparatus, media distribution system, or method substantially as described herein with reference to the accompanying drawings.
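The marker-triggered media selection recited in claim 20 can be pictured with a minimal, purely illustrative sketch that is not part of the claimed subject matter. It assumes the classic cv2.aruco.detectMarkers API from opencv-contrib-python (newer OpenCV releases expose an ArucoDetector class instead), and the MARKER_CATALOGUE mapping and the overlay/streaming behaviour described in the comments are hypothetical placeholders for the apparatus and media server.

```python
# Illustrative only (cf. claim 20); the catalogue and server-side fetch are assumptions.
import cv2

MARKER_CATALOGUE = {  # hypothetical mapping: marker id -> media content titles
    7: ["Backstage interview (3D)", "Alternative camera angle", "Programme notes"],
}

def media_list_for_marker(frame_bgr):
    """Return (marker_id, titles) for the first marker detected in a frame, else None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)  # classic API
    if ids is None:
        return None
    marker_id = int(ids.flatten()[0])
    return marker_id, MARKER_CATALOGUE.get(marker_id, [])

cap = cv2.VideoCapture(0)  # stands in for the apparatus's video camera
ok, frame = cap.read()
if ok and (hit := media_list_for_marker(frame)):
    marker_id, titles = hit
    # On the apparatus this list would be overlaid on the captured images;
    # the selected item would then be fetched from the media server and
    # reproduced as augmented reality media within those images.
    for index, title in enumerate(titles):
        print(f"[{index}] {title}  (marker {marker_id})")
cap.release()
```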
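The donation gesture of claims 24 and 25 amounts to testing whether a touch gesture links the donation area with the performance area. The sketch below is illustrative only: the screen rectangles, coordinates and the point at which a transfer would be requested are assumptions rather than features of any particular implementation.

```python
# Illustrative only (cf. claims 24-25); screen regions and the transfer hook are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

PERFORMANCE_AREA = Rect(0, 0, 1080, 1500)   # assumed layout, in display pixels
DONATION_AREA = Rect(0, 1500, 1080, 420)

def gesture_triggers_donation(start: tuple[float, float], end: tuple[float, float]) -> bool:
    """True when a drag gesture links the donation area to the performance area."""
    return DONATION_AREA.contains(*start) and PERFORMANCE_AREA.contains(*end)

if gesture_triggers_donation(start=(540, 1650), end=(540, 700)):
    # A real apparatus would here instruct the media server to transfer funds
    # from the user's account to the performer's account.
    print("Donation gesture recognised; initiate financial transfer")
```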
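The predetermined-movement trigger of claim 26 could, under one simple assumption, be a threshold test on accelerometer magnitude. The sample readings, threshold value and print statement below are illustrative placeholders for the apparatus's motion sensor and whatever gesture classifier an implementation might use.

```python
# Illustrative only (cf. claim 26); the samples and threshold are assumptions.
import math

GRAVITY = 9.81           # m/s^2
SHAKE_THRESHOLD = 2.5    # assumed multiple of gravity treated as a deliberate shake

def is_predetermined_movement(samples) -> bool:
    """samples: iterable of (ax, ay, az) accelerometer readings in m/s^2."""
    return any(
        math.sqrt(ax * ax + ay * ay + az * az) > SHAKE_THRESHOLD * GRAVITY
        for ax, ay, az in samples
    )

recent_samples = [(0.1, 9.7, 0.3), (14.0, 21.5, 6.2), (0.2, 9.8, 0.1)]  # hypothetical readings
if is_predetermined_movement(recent_samples):
    # A real apparatus would here request the donation transfer via the media server.
    print("Predetermined movement detected; initiate financial transfer")
```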
PCT/EP2015/059613 2014-04-30 2015-04-30 Portable processing apparatus, media distribution system and method WO2015166095A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1407586.5 2014-04-30
GBGB1407586.5A GB201407586D0 (en) 2014-04-30 2014-04-30 Portable processing apparatus, media distribution system and method

Publications (1)

Publication Number Publication Date
WO2015166095A1 (en) 2015-11-05

Family

ID=50972094

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/059613 WO2015166095A1 (en) 2014-04-30 2015-04-30 Portable processing apparatus, media distribution system and method

Country Status (2)

Country Link
GB (1) GB201407586D0 (en)
WO (1) WO2015166095A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050289590A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Marketing platform
US20120229624A1 (en) * 2011-03-08 2012-09-13 Bank Of America Corporation Real-time image analysis for providing health related information
US20120275755A1 (en) * 2011-04-26 2012-11-01 Sony Computer Entertainment Europe Limited Entertainment device and method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180270541A1 (en) * 2016-04-22 2018-09-20 Tencent Technology (Shenzhen) Company Limited Program interaction system, method, client, and backend server
US10701451B2 (en) * 2016-04-22 2020-06-30 Tencent Technology (Shenzhen) Company Limited Program interaction system, method, client, and backend server
WO2017217713A1 (en) 2016-06-15 2017-12-21 Samsung Electronics Co., Ltd. Method and apparatus for providing augmented reality services
EP3427233A4 (en) * 2016-06-15 2019-01-16 Samsung Electronics Co., Ltd. Method and apparatus for providing augmented reality services
US10368212B2 (en) 2016-06-15 2019-07-30 Samsung Electronics Co., Ltd. Method and apparatus for providing augmented reality services
CN108024134A (en) * 2017-11-08 北京密境和风科技有限公司 Live broadcast-based data analysis method, device and terminal equipment
CN108024134B (en) * 2017-11-08 2020-01-21 北京密境和风科技有限公司 Live broadcast-based data analysis method and device and terminal equipment
CN111144144A (en) * 2018-11-05 全光勋 Information providing apparatus for providing information based on a marker

Also Published As

Publication number Publication date
GB201407586D0 (en) 2014-06-11

Similar Documents

Publication Publication Date Title
US11864285B2 (en) Digital jukebox device with improved user interfaces, and associated methods
US11775146B2 (en) Digital jukebox device with improved karaoke-related user interfaces, and associated methods
KR101971624B1 (en) Method and mobile terminal for displaying information, method and display device for providing information, and method and mobile terminal for generating control signal
US20120120296A1 (en) Methods and Systems for Dynamically Presenting Enhanced Content During a Presentation of a Media Content Instance
CN103577063A (en) Mobile tmerinal and control method thereof
KR20130132886A (en) Method and system for providing additional content related to a displayed content
CN106663115A (en) Continuation of playback of media content by different output devices
CN103403655A (en) Contextual user interface
KR20150026367A (en) Method for providing services using screen mirroring and apparatus thereof
US9137560B2 (en) Methods and systems for providing access to content during a presentation of a media content instance
CN103270769A (en) Method and system for providing media recommendations
WO2015166095A1 (en) Portable processing apparatus, media distribution system and method
WO2010131333A1 (en) Content search device, content search method, content search program, and recording medium
US11837250B2 (en) Audio playout report for ride-sharing session
US20220397411A1 (en) Systems and methods for delivering content to a user based on geolocation
US10402152B2 (en) Media sharing community
US20230156885A1 (en) Digital jukebox device with improved user interfaces, and associated methods
KR102104498B1 (en) System and method for providing slide show
KR20180016805A (en) Display device and operating method thereof
WO2014150958A2 (en) Digital jukebox device with improved karaoke-related user interfaces, and associated methods
KR20150145499A (en) Display device and operating method thereof
JP2013236282A (en) Information communication program, information communication device, and distribution server
US11936702B2 (en) Methods and systems for geolocation-based media streaming
CN108021591A (en) File store path method and device for planning
TWM518792U (en) Media interactive device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15722686

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15722686

Country of ref document: EP

Kind code of ref document: A1