US20170024093A1 - System and method for playback of media content with audio touch menu functionality - Google Patents
- Publication number
- US20170024093A1 (application US 15/289,689)
- Authority
- US
- United States
- Prior art keywords
- media
- options
- media content
- playback
- playlist
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/44—Browsing; Visualisation therefor
- G06F17/30058—
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G06F9/453—Help systems
- G11B20/10527—Audio or video recording; Data buffering arrangements
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/34—Indicating arrangements
- G11B2020/10657—Cache memories for random data access, e.g. buffers wherein the data output is controlled by a priority parameter other than retention time
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
Definitions
- Embodiments of the invention are generally related to means for providing music, video, or other media content, and are particularly related to features for improving user interaction, such as use of audible notifications, media caching, and touch menus.
- a media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements. Each media option can be associated with one or more media content items that can be streamed to and/or played on the device.
- the system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- additional features can be provided that improve user interaction, for example the use of audible notifications, media caching, or touch menus, including support for force-sensitive touch input.
- FIG. 2 illustrates playback of media content, in accordance with an embodiment.
- FIG. 3 illustrates an example of a playback volume function, in accordance with an embodiment.
- FIG. 7 further illustrates a user interface, in accordance with an embodiment.
- FIG. 8 further illustrates a user interface, in accordance with an embodiment.
- FIG. 9 further illustrates a user interface, in accordance with an embodiment.
- FIG. 10 further illustrates a user interface, in accordance with an embodiment.
- FIG. 11 further illustrates a user interface, in accordance with an embodiment.
- FIG. 12 illustrates use of the system to append playback of media content, in accordance with an embodiment.
- FIG. 13 illustrates a system which includes support for audible notifications, in accordance with an embodiment.
- FIG. 14 illustrates the use of audible notifications, in accordance with an embodiment.
- FIG. 15 further illustrates the use of audible notifications, in accordance with an embodiment.
- FIG. 16 is a flowchart of a method for providing audible notifications, in accordance with an embodiment.
- FIG. 17 illustrates a system which includes media caching, in accordance with an embodiment.
- FIG. 18 further illustrates a system which includes media caching, in accordance with an embodiment.
- FIG. 19 further illustrates a system which includes media caching, in accordance with an embodiment.
- FIG. 20 is a flowchart of a method for providing media caching, in accordance with an embodiment.
- FIG. 21 illustrates a system which includes touch menus, in accordance with an embodiment.
- FIGS. 22A-22B further illustrate a system which includes a touch menu, in accordance with an embodiment.
- FIGS. 23A-23B further illustrate a system which includes a touch menu, in accordance with an embodiment.
- FIG. 24 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 25 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 26 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 27 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 28 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 29 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 30 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 31 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 32 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 33 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 34 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 35 further illustrates the use of a touch menu, in accordance with an embodiment.
- FIG. 36 is a flowchart of a method for providing a touch menu, in accordance with an embodiment.
- FIG. 37 is a flowchart of a method for use by a server in providing a touch menu, in accordance with an embodiment.
- FIG. 38 is a flowchart of a method for use by a client device in providing a touch menu, in accordance with an embodiment.
- FIGS. 39A-39C illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment.
- FIGS. 40A-40C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment.
- FIGS. 41A-41C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment.
- FIG. 1 illustrates a system for playback of media content, in accordance with an embodiment.
- a media device or player 100 for example a computing system, handheld entertainment device, smartphone, or other type of media device capable of playing media content, can be used to play media content that is provided by a computer system operating as a media server 102 , or from another system or peer device.
- Each of the media device and the computer system operating as the media server can include, respectively, one or more physical computer or hardware resources 104 , 106 , such as one or more processors (CPU), physical memory, network components, or other types of hardware resources.
- the media server can include an operating system or other processing environment which supports execution of a software application environment 110 , including a media server application 114 which can be used, for example, to stream music, video, or other forms of media content.
- a media stream service 120 can be used to buffer media content, for streaming to one or more streams 122 , 124 , 126 .
- a media application interface 128 can receive requests from media devices or other systems, to retrieve media content from the media server.
- Media content or items 131 (generally referred to herein as media content items), and/or samples 132 associated with the media content items, can be provided, for example, within a database or repository, or can be received at the media server from another source.
- each media content item that can be provided by the media server can be associated with one or more samples.
- a particular media content item can be associated with a plurality of samples of different playing durations, each of which can be taken from different segments (for example, the beginning, or the middle) of the media content item.
- the samples can be similarly stored at, and thereafter provided by, the media server.
- the determination of which of the plurality of samples to use at a particular point in time depends on the particular implementation, and may also take into account realtime considerations such as balancing network bandwidth usage versus providing a smoother user experience.
- the samples can be snippets or fragments of an associated media content that are determined by a media content producer (e.g., a record label) to reflect that particular media content (e.g., a particular song track) created by that content producer.
- a song snippet may be a particularly recognizable portion of a particular song.
- a video content snippet may be a particularly recognizable portion of a particular video content.
- other types of samples or snippets can be used to provide a preview of an associated media content.
- the system can use 30, 60, or 90 second audio-preview snippets for every song track. Longer snippets can provide a sufficient audio impression for the user of tuning in to a particular track and being able to hear it through to its end, after which the player can continue playing whatever is next in that context, providing an "on-demand" experience.
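As an illustration of the snippet-length tradeoff described above (longer previews feel more "on-demand" but cost more bandwidth), a minimal sketch of one possible selection heuristic; the function name, bandwidth thresholds, and fallback order are hypothetical assumptions, not taken from the patent:

```python
def choose_snippet_duration(bandwidth_kbps, durations=(30, 60, 90)):
    """Pick a preview-snippet duration (seconds) for a track.

    Hypothetical heuristic: prefer the longest snippet when bandwidth
    is plentiful, falling back to shorter snippets on slower links.
    The kbps thresholds are illustrative.
    """
    if bandwidth_kbps >= 1024:
        return max(durations)                          # richest preview
    if bandwidth_kbps >= 256:
        return sorted(durations)[len(durations) // 2]  # middle option
    return min(durations)                              # constrained link
```

A real implementation would likely also weigh how many snippets are being pre-buffered at once, not just the raw link speed.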
- a media streaming logic 130 can be used to retrieve or otherwise access the media content items, and/or the samples associated with the media content items, in response to requests from media devices or other systems, and populate the media stream service with streams of corresponding media content data that can be returned to the requesting device.
- the media device can be a computing system, handheld entertainment device, smartphone, or other type of device that can playback media content.
- the media server can support the simultaneous use of multiple media devices, and/or the media device can simultaneously access media content at multiple media servers.
- the media device can include a user interface 140 , which is adapted to display or otherwise provide a visual array of media options 142 , for example as a two-dimensional grid or list of card elements, or another visual array format, and determine a user input. Examples of various embodiments of visual arrays are described in further detail below.
- Selecting a particular media option, e.g., a particular card element, within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- the software application environment at the media server can be used to stream or otherwise communicate music, video, or other forms of media content to the media device, wherein the user interface at the media device is adapted to display a plurality of music or video titles that correspond to music or videos stored as media content items in a database or repository at the media server.
- the media device can include a media playback application 143 , together with a multi-track playback logic 144 , prebuffering logic 145 , and playback volume function 146 , which can be used to control the playback and crossfading of media content items and/or samples that are received from the media server application, for playback by the media device, as described in further detail below.
- the prebuffering logic is configured to load or pre-buffer a portion of each media content item, sample, or snippet, at the media device, as determined by the multi-track playback logic. While media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
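The pre-buffering behavior described above, where snippets are fetched while their tiles are being prepared for display, can be sketched as follows; `fetch_tile_art` and `prebuffer_snippet` are hypothetical stand-ins for the device's display-preparation and media-fetch routines:

```python
from concurrent.futures import ThreadPoolExecutor

def prepare_media_options(options, fetch_tile_art, prebuffer_snippet):
    """Prepare media options for display while pre-buffering their snippets.

    Snippet fetches run in background threads so that, by the time the
    tiles are on screen, playback can feel immediate from the user's
    perception. A sketch, not the patent's actual implementation.
    """
    with ThreadPoolExecutor() as pool:
        # Start pre-buffering each option's snippet concurrently.
        buffers = {opt: pool.submit(prebuffer_snippet, opt) for opt in options}
        # Meanwhile, prepare the visual tiles on the current thread.
        tiles = [fetch_tile_art(opt) for opt in options]
    # Exiting the pool waits for all fetches; collect the buffered data.
    return tiles, {opt: fut.result() for opt, fut in buffers.items()}
```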
- a user 150 can interact 152 with the application user interface and issue requests, for example the playing of a selected music or video item on their media device.
- the user's selection of a particular media option can be communicated 153 to the media server application, via the media application interface.
- the media server application can then stream corresponding media content 155 , including one or more streams of media content data 160 , 161 , 162 , and subsequently stream 164 or otherwise communicate the, e.g., selected music, video, or other form of media content, to the user's media device.
- pre-buffering requests from the media device can also be communicated to the media server application via the media application interface.
- the media playback application, multi-track playback logic, and playback volume function can combine, crossfade, or otherwise play 165 the requested media content to the user, for example by playing back one or more music or videos on the media device, as described in further detail below.
- FIG. 2 illustrates playback of media content, in accordance with an embodiment.
- a user interface can display a visual array of media options arranged as a two-dimensional grid, with rows and columns of media options visualized as tile-like card elements, wherein each of the media options is associated with one or more media content items that can be played on the device.
- As shown in FIG. 2, the visual array can include media options A 170, B 171, C 172, and D 173. Each media option can include a media option center 174 (illustrated in the figure as a point), a relatively smaller media preview/select area 175 that is centered on the media option center, and a relatively larger media play/crossfade area 176 that generally covers the media option and, depending on the particular implementation, can also extend to cover portions of other card elements or media options.
- a plurality of media options, for example a set of song tracks, a music playlist, or the contents of an album or a media library, can be represented on the user interface as an array of tiles, wherein each tile can be associated with a particular visualization, for example cover art identifying a represented song track.
- other forms of visualization can be used for the media options, for example text, colors, images, or animations. While a selected point or region is moved within the grid of media options, the visualization or appearance of those media options that are proximate to the selected point or region can be modified, for example by varying their opacity, to reflect their status as proximate media options.
- the opacity of a particular point or region can be modified to render the closest or proximate media options to a selected point or region in a more visible manner than other (not selected, or not proximate) options/tiles.
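One way such a proximity-based opacity could be computed is sketched below; the linear falloff, radius, and floor constants are illustrative assumptions, not values from the patent:

```python
def tile_opacity(distance, radius=1.5, floor=0.3):
    """Opacity (0.0-1.0) for a tile, given its distance from the selection.

    Tiles at the selected point render fully opaque; opacity falls off
    linearly down to a dim floor for tiles beyond `radius` tile-widths,
    so the closest/proximate options stand out from unselected ones.
    """
    if distance >= radius:
        return floor
    return floor + (1.0 - floor) * (1.0 - distance / radius)
```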
- a user can provide input as a user selection of a point or region 180 .
- the user interface can be a touch-sensitive user interface, which recognizes input in the form of touch—for example the position of a user's finger or a stylus upon the user interface—to determine the selected point or region as it is being moved, in response to a user input, within the visual array of media options (referred to herein in some embodiments as “audio touch” or “audiotouch”).
- the input can be provided by a mouse-down event.
- the system can, upon receiving the user input, initialize playback of those media options associated with the selected point or region.
- Selected media options (e.g., music or song tracks) can be played back with playback parameters, such as the playback volume, that depend on the distance between the point of input and a specified point of the media option's (e.g., the song track's) array or tile visualization.
- FIG. 3 illustrates an example of a playback volume function 182 , in accordance with an embodiment.
- the playback volume of a media content item can be determined as a function of distance, for example as illustrated in FIG. 3 .
- different volume/distance behaviors 185 can be defined, which can be used to determine the size of the media preview/select area 175 , the size of the media play/crossfade area 176 , and the crossfading behavior, to address the needs of a particular implementation, or to suit a desired user experience.
- FIG. 3 is provided for purposes of illustration. In accordance with other embodiments, or to address the needs of other implementations, other types of playback volume functions, including the use of different functions, criteria, and constants, can be used.
- the system can use the middle point of a tile as a point of calculating distance 179 from the selection point or region. In accordance with an embodiment, if the distance is zero, then the system considers that determination to be an actual selection by the user of that media option (e.g., that song track). In accordance with an embodiment, since it may be difficult for a user to precisely select the center of a tile, an area (e.g., 20-50%) of each tile, generally corresponding to the media preview/select area in FIG. 2 , can be considered as centered on that particular option.
- a two-dimensional grid can measure relative distance along both x and y axes. In the case of a one-dimensional array, for example a vertical list, then the system need only determine relative distance along one axis (e.g., the y axis), since it will not matter where the finger is along the x axis within a particular tile.
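As an illustrative sketch of the distance determination described above (Python; the function name, parameter names, and the 20% preview radius are assumptions chosen for illustration, within the 20-50% range given above), the relative distance from a selection point to a tile's center can be computed in tile units, with any selection landing inside the media preview/select area treated as distance zero, i.e., an actual selection:

```python
import math

def relative_distance(selection, tile_center, tile_size, preview_radius=0.2):
    """Distance from a selection point to a tile center, in tile units.

    A selection within the media preview/select area (here a radius of
    20% of the tile size) is treated as distance zero, i.e., an actual
    selection of that media option.
    """
    dx = (selection[0] - tile_center[0]) / tile_size
    dy = (selection[1] - tile_center[1]) / tile_size
    d = math.hypot(dx, dy)
    return 0.0 if d <= preview_radius else d
```

For a one-dimensional array, such as a vertical list, the same sketch would use only the y component, since the finger's x position within a tile would not matter.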
- media content items (e.g., song tracks) that are assigned a playback volume value of zero are not audibly played.
- the system will recalculate the relative combination of the media content in the output, providing an audio crossfading effect that is controllable by the user.
- the media content item that is nearest the last movement input may either continue to play, or the playback can stop.
- the media device, while displaying a grid of tile-like card elements, can pre-buffer a specified amount of data from each audio snippet, for example the first 1 to 5 seconds. This enables the system, upon receiving a user input, to play back the track immediately using the pre-buffered data, and continue fetching the rest of it. This allows for minimal latency in starting playback, which results in a compelling user experience.
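The pre-buffering step might be sketched as follows (an illustrative Python sketch; the `fetch` callable stands in for the media streaming logic, and the names and byte rate are assumptions, not part of the specification):

```python
def prebuffer_snippets(fetch, option_ids, seconds=2, byte_rate=16000):
    """Pre-buffer the first few seconds of each visible option's snippet.

    `fetch(option_id, n)` is a hypothetical function returning up to `n`
    bytes of that option's audio snippet. Holding these bytes locally
    lets playback start immediately on user input while the remainder
    is still being fetched.
    """
    n = seconds * byte_rate
    return {oid: fetch(oid, n) for oid in option_ids}
```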
- Listing 1 provides an exemplary pseudocode of a method for determining multi-track playback of media content, in accordance with an embodiment.
- the pseudocode shown therein is provided for purposes of illustration. In accordance with other embodiments, other methods of determining multi-track playback of media content can be used.
- Number cappedDistance = 0.2
- return clamp((distance - 0.5) / (2 * (cappedDistance - 0.5)) + 0.5)
- the system can determine whether the user selection of a particular point or region is less distant from the center of a media option than a defined distance. If it is, then the playback volume for that media option is set to a relative value of 1 (within a range of 0 to 1), which makes it easier for the user to select a media option preview point without media noise from nearby media options.
- the playback volume can be determined to be 1 when the user selection is within the preview/select area, and taper off linearly to 0 at a distance generally corresponding to the play/crossfade area.
- the system can then determine relative playback volume based on that distance, with shorter distances having higher playback volume, and longer distances having lower playback volume.
- when the distance is less than cappedDistance, the volume will be 1, which makes it easier to hit the preview point without hearing noise from tracks nearby.
- the volume must be zero when distance is more than 1-cappedDistance; otherwise the media element could be playing when another element should be the only one being played back.
- the value for cappedDistance must be within (0, 0.5).
- Listing 1 illustrates a clamped linear function that meets the preceding requirements.
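A runnable version of the clamped linear function of Listing 1 might look as follows (Python; the function and parameter names are assumptions, but the constants and formula follow the listing):

```python
def clamp(x, lo=0.0, hi=1.0):
    """Constrain x to the range [lo, hi]."""
    return max(lo, min(hi, x))

def playback_volume(distance, capped_distance=0.2):
    """Clamped linear volume for a media option at a given distance.

    Volume is 1 inside the preview/select area (distance <= cappedDistance),
    0 beyond 1 - cappedDistance, and linearly interpolated in between.
    cappedDistance must lie within (0, 0.5).
    """
    assert 0.0 < capped_distance < 0.5
    return clamp((distance - 0.5) / (2 * (capped_distance - 0.5)) + 0.5)
```

With cappedDistance = 0.2, the function returns 1 at distances up to 0.2, 0.5 at distance 0.5, and 0 at distances of 0.8 and beyond, matching the three requirements above.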
- the opacity of the tiles can also be modified using a distance-based function similar to the one used to calculate playback volume.
- FIGS. 4A-4B further illustrate playback of media content, in accordance with an embodiment.
- a visual array of media options A 1 through A 4 ( 202 - 205 ) is shown, each of which is associated with one or more media content items.
- a user can make a selection of a particular media option, for example by placing a mouse cursor or some other selector, at a point or region within the visual array.
- the user interface can be a touch-sensitive user interface, which recognizes input in the form of touch, for example the position of a user's finger or a stylus upon the user interface, to determine the selected point or region within the visual array grid of media options.
- the multi-track playback logic can determine a set of one or more of the plurality of media options that are proximate to the selected point or region (in this example, media options A 1 , A 2 and A 3 ), and, together with its playback volume function, adjust playback parameters, such as the playback volume 190 of the set of media content items associated with those media options, by crossfading or otherwise combining the playback of the set of media content items to reflect their relative distances from the selected point or region.
- An output can then be provided to the user as played-back or crossfaded media, e.g., a set of crossfaded songs.
- the user may perceive A 2 as being dominantly played, with some crossfading from sources A 1 and A 3 .
- the system can determine a new point or region or selection, and a plurality of media options that are proximate to the new point or region (in this example, media options A 2 , A 3 and A 4 ).
- in response to receiving the input from the user interface, the multi-track playback logic, together with its playback volume function, can again adjust playback parameters, such as the playback volume of the set of media content items associated with those media options, by crossfading or otherwise combining the playback of the set of media content items to reflect their relative distances from the newly selected point or region.
- the output can then be provided as different played-back or crossfaded media to the user, e.g., as a different set of crossfaded songs.
- the relative playback volume of media content item A 1 and A 2 are decreased (in this example the playback volume of A 1 is reduced almost to zero), while the relative playback volume of media content item A 3 and A 4 are increased, reflecting their relative distances from the selected point or region.
- the relative playback volume of A 3 is increased almost to the exclusion of other media content items, reflecting the much shorter distance between A 3 and the user's selected point or region.
- the user may perceive A 3 as being dominantly played, with little or no contribution or crossfading from any other sources.
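The crossfading described above can be sketched as a distance-weighted mix of the proximate media content items (an illustrative Python sketch; the names, the list-of-samples representation, and the mixing by simple summation are assumptions — a real implementation would apply these gains inside the device's audio pipeline):

```python
def mix_output(tracks, distances, capped_distance=0.2):
    """Crossfade proximate media content items by distance-based gains.

    `tracks` is a list of equal-length sample buffers (floats), one per
    proximate media option; `distances` gives each option's relative
    distance from the selected point or region. Nearer options receive
    higher gains, so the user hears the nearest option dominantly.
    """
    def gain(d):
        return max(0.0, min(1.0, (d - 0.5) / (2 * (capped_distance - 0.5)) + 0.5))
    gains = [gain(d) for d in distances]
    n = len(tracks[0])
    # Weighted sum of the per-track samples produces the crossfaded output.
    return [sum(g * t[i] for g, t in zip(gains, tracks)) for i in range(n)]
```

As the selection point moves, recomputing the gains from the new distances yields the user-controllable crossfading effect described above.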
- FIG. 5 illustrates a media device with an exemplary user interface which supports playback of media content, in accordance with an embodiment.
- a user interface can display, for example, on a media device, a visual array of media options arranged as a two-dimensional grid of card elements, with rows and columns of media options visualized as tiles, here illustrated as A 1 -A n through E 1 -E n ( 202 - 249 ).
- Each of the media options is associated with one or more media content items that can be played on the device.
- each of the media options can be associated with a status 260 that reflects, from the user's perspective, whether that particular option's associated media content item is playing or not, and, if it is playing, whether other media content items are being played at the same time.
- media content items can be either not selected and not playing 262 ; proximate to a selected point or region and playing simultaneously with other media content items 264 (i.e., from the perspective of the user, with some perceptible crossfading of other media content items); proximate to a selected point or region but playing dominantly (i.e., from the perspective of the user, with little or no contribution or crossfading of other media content items) 266 ; or selected and playing (i.e., by itself with no other media content items playing simultaneously) 268 .
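The four statuses might be derived from the relative playback volumes as in the following sketch (the enum names and the 0.9 dominance threshold are illustrative assumptions, not values from the specification):

```python
from enum import Enum

class OptionStatus(Enum):
    NOT_PLAYING = "not selected, not playing"           # status 262
    CROSSFADING = "proximate, playing with others"      # status 264
    DOMINANT = "proximate, playing dominantly"          # status 266
    SELECTED = "selected, playing alone"                # status 268

def status_for(volume, other_volumes, dominant_threshold=0.9):
    """Map an option's volume, and the other options' volumes, to a status."""
    if volume == 0.0:
        return OptionStatus.NOT_PLAYING
    if all(v == 0.0 for v in other_volumes):
        return OptionStatus.SELECTED          # playing by itself
    if volume >= dominant_threshold and all(v < 0.1 for v in other_volumes):
        return OptionStatus.DOMINANT          # little perceptible crossfading
    return OptionStatus.CROSSFADING           # perceptible crossfading
```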
- FIG. 6 further illustrates a user interface, in accordance with an embodiment.
- a user may initially select a region ( 180 ) of the user interface generally located in the region of, but not precisely upon, media option B 2 ( 213 ), whose neighboring or proximate media options include A 1 ( 202 ), A 2 ( 203 ), A 3 ( 204 ), B 1 ( 212 ), B 3 ( 214 ), C 1 ( 222 ), C 2 ( 223 ) and C 3 ( 224 ).
- the multi-track playback logic can adjust the playback volume of media content items associated with each of these proximate media options, by crossfading or otherwise combining their playback, to reflect their relative distances from the selected point or region.
- the crossfaded or combined result can then be provided as a played-back media to the user.
- the user may perceive an output from their media device in which media content B 2 is being dominantly played, with some perceptible combination of one or more of its neighboring or proximate media options as illustrated in FIG. 6 .
- FIG. 7 further illustrates a user interface, in accordance with an embodiment.
- a user may move their finger, mouse cursor, or other selector, to select or explore a new point or region of the user interface generally located in the region of media option C 3 , but which is also proximate to media options B 2 , B 3 , B 4 , C 2 , C 4 ( 225 ), D 2 ( 233 ), D 3 ( 234 ) and D 4 ( 235 ).
- the multi-track playback logic can again adjust playback parameters, such as the playback volume of the set of media content items associated with these media options, by crossfading or otherwise combining their playback, to reflect their relative distances from the selected point or region.
- the user may perceive an output from their media device in which media content C 3 is being dominantly played, with some perceptible combination of one or more of its neighboring or proximate media options as illustrated in FIG. 7 .
- the user may perceive a crossfading of media output as the multi-track playback logic gradually adjusts the playback volume of media content items from the initial output in which B 2 is being dominantly played, with some perceptible combination of one or more of its neighboring or proximate media options, to the subsequent output in which C 3 is being dominantly played, with some perceptible combination of one or more of its neighboring or proximate media options.
- a particular number of media options proximate to the selection point or region can be used, for example a window of nine (i.e., 3×3 tiles) proximate media options.
- different numbers of media options proximate to the selection point or region can be used, and the chosen media options need not necessarily be in a square or other uniform pattern.
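The selection of a window of proximate media options, clipped at the edges of the grid (which is why an option such as B 3 above can have only seven proximate options), can be sketched as follows (Python; the function and parameter names are assumptions):

```python
def proximate_window(row, col, n_rows, n_cols, radius=1):
    """Grid indices of media options in a square window around a tile.

    radius=1 yields up to a 3x3 window (nine options, including the
    selected tile itself); the window is clipped at the grid edges, so
    corner and edge tiles have fewer proximate options.
    """
    return [(r, c)
            for r in range(max(0, row - radius), min(n_rows, row + radius + 1))
            for c in range(max(0, col - radius), min(n_cols, col + radius + 1))]
```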
- FIG. 8 further illustrates a user interface, in accordance with an embodiment.
- a user may again move their finger, mouse cursor, or other selector, to select or explore another new point or region of the user interface generally located at media option B 3 , but which is also proximate to media options A 3 ( 204 ), B 2 , B 4 ( 215 ), C 2 , C 3 and C 4 .
- the multi-track playback logic can adjust playback parameters, such as the playback volume of the set of media content items associated with these media options, by crossfading or otherwise combining their playback, to reflect their relative distances from the selected point or region, in this example using just seven media options.
- the user may perceive a crossfading of media output as the multi-track playback logic gradually adjusts the playback volume of media content items from the original output in which C 3 is being dominantly played, to the subsequent output in which B 3 is being dominantly played.
- FIG. 9 further illustrates a user interface, in accordance with an embodiment.
- if the user moves their finger, mouse cursor, or other selector, to select the center of a point or region of the user interface generally located at a media option, and leaves it there for a period of time, then, in accordance with an embodiment, that media content item can be selected, and played by itself (i.e., from the perspective of the user, with no other media content items playing simultaneously).
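The dwell-to-select behavior described above might be sketched as follows (an illustrative Python sketch; the class name and the 1.0-second dwell period are assumptions, as the specification does not give a specific value):

```python
import time

class DwellSelector:
    """Select a media option once the selection point dwells on it."""

    def __init__(self, dwell_seconds=1.0, clock=time.monotonic):
        self.dwell = dwell_seconds
        self.clock = clock
        self.option = None   # option currently under the selection point
        self.since = None    # time the selection point arrived there

    def update(self, option):
        """Report the option under the selection point.

        Returns the option once the dwell period has elapsed on the same
        option (i.e., it should now play by itself), else None.
        """
        now = self.clock()
        if option != self.option:
            self.option, self.since = option, now
            return None
        if option is not None and now - self.since >= self.dwell:
            return option
        return None
```

An injectable `clock` makes the timing behavior testable without real delays.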
- FIG. 10 further illustrates a user interface, in accordance with an embodiment.
- the grid can be automatically scrolled or repositioned 315 , both generally centering the currently selected point or region, and in this example displaying on the user interface new or additional media options X 1 -X 8 ( 302 - 309 ), which can be subsequently selected by the user.
- FIG. 11 further illustrates a user interface, in accordance with an embodiment.
- a user may select a region of the user interface generally located at media option A 1 ( 202 ), with neighboring or proximate media options A 2 , B 1 , X 1 ( 302 ), X 2 ( 303 ) and X 6 ( 307 ).
- the process can generally continue as described above, with the user continuing to move the selected point or region within the visual array, to further explore media options, and the playback of proximate media content items continually adjusted, by crossfading or otherwise combining their playback. For example, when the grid is automatically scrolled or repositioned, the new or additional media options can be explored by the user, or offered as suggestions to browse and experience new media content with which they had not previously been familiar.
- the system can be configured so that, if it determines a media content has been selected, for example by detecting that the user's finger is lifted from the user interface while playing a sample or snippet, the system can, for example, play the remainder of that media content item to its end, by transitioning or otherwise appending the media content, at an appropriate point within its stored content, to the previously-played sample or snippet. Playback flows from the end of the previously-played sample or snippet into the remainder of the media content item.
- FIG. 12 illustrates use of the system to append playback of media content, in accordance with an embodiment.
- a media content item 320 can be associated with a sample that generally comprises a region 322 of the media content, for example a snippet of a song.
- the media content item also includes a remainder 324 of the media content that follows the sample.
- the system can stream/pre-buffer a portion of each of a plurality of media content items, samples, or snippets, at the media device, as determined by the multi-track playback logic 340 .
- the media streaming logic can stream or pre-buffer a sample by beginning streaming at the beginning 342 of the sample region. Subsequently, a user can make a selection of a particular media content/item 350 . When the sample has completed playback, the media streaming logic can immediately continue streaming at the beginning 352 of the remainder region, to append and stream or pre-buffer the remainder of the selected media content item 354 .
- playback continues or flows seamlessly from the end of the previously-played sample or snippet, into the remainder of the media content item.
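The append behavior can be sketched as a plan of ranges to stream (an illustrative sketch; a real implementation would issue the corresponding range requests to the media streaming logic, and the name and tuple representation are assumptions):

```python
def stream_plan(sample_start, sample_end, track_end, selected):
    """Ranges (e.g., byte or time offsets) to stream for one content item.

    The sample region is streamed first; if the item was selected while
    the sample played, the remainder is appended starting exactly where
    the sample ends, so playback flows seamlessly into the full track.
    """
    ranges = [(sample_start, sample_end)]
    if selected:
        ranges.append((sample_end, track_end))
    return ranges
```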
- the device can simply play the sample to its end; at which point a next song can be chosen according to one or more shuffle rules, a playlist, or other means.
- the system when the system appends media content to a previously-played sample or snippet, the system can first begin playback of pre-buffered content associated with that media content item, and, upon determining a selected point or region remaining within a particular region of the visual array for a period of time, subsequently begin playback of associated media content items.
- the sample or snippet can stop playing, and an original media content item, e.g., a song, can return to being played. If the user immediately taps the same location, the device can play the last-selected media content from its beginning.
- additional features can be provided that improve user interaction, for example the use of audible notifications (referred to herein in some embodiments as an audio spinner).
- a media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements.
- Each media option can be associated with one or more media content items that can be streamed to and/or played on the device.
- the system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
- once the visual array of media options is displayed and the associated media content pre-buffered, the touch preview can begin.
- the overall period of delay may be affected by factors such as the network or cellular bandwidth available for data transfer to the device; the type of media content being transferred (for example, music versus high-definition video); and the complexity of the visual array (for example, a large two-dimensional grid of card elements as displayed on a tablet-type device may include many more options than a short list of card elements as displayed on a smartphone-type device, and correspondingly require much more data to be transferred to prebuffer and prepare the interface for use).
- the controlled playback and crossfading techniques described above also lend themselves to use cases in which the user might not be watching the interface, for example casually perusing a list of song media options while listening to the crossfaded output, to determine a song they might like to hear. In such cases, some feedback to the user that reflects something is happening (for example, that media content is currently being transferred to their device to populate the visual array) is useful. Even a short period of delay or hesitation in playing a particular media content can be irritating to some users, or lead them to believe their device or application is not operating correctly.
- the system can be adapted so that, when the period of delay exceeds a maximum (acceptable) delay time (for example, 500 ms), then an audible notification, such as a small generic sound (e.g., bibodibeep . . . ) can be played for a period of time (for example, 700 ms), and subsequently crossfaded with (into) the actual portion of media content, sample, or snippet, using the controlled playback and crossfading techniques described above.
- FIG. 13 illustrates a system which includes support for audible notifications, in accordance with an embodiment.
- a media device or player (for example, a computing system, handheld entertainment device, smartphone, or other type of media device capable of playing media content) can be used with the system.
- a media stream service can be used to buffer media content, for streaming to one or more streams; while a media streaming logic can be used to retrieve or otherwise access media content items, and/or samples associated with the media content items, in response to requests from media devices or other systems, and populate the media stream service with streams of corresponding media content data that can be returned to the requesting device.
- the media device can have a media playback application and include a touch-sensitive user interface adapted to display a visual array of media options.
- a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate. Selecting a particular media option within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- the media playback application can include an audible notification logic 450 , which is adapted to provide audible notifications.
- the audible notification logic can perform a delay determination 452 that is reflective of an actual or anticipated delay prior to playback of the associated media content item or sample. If the actual or anticipated delay prior to playback of an associated media content item or sample is determined to be greater than a maximum (acceptable) delay time, then an audible notification functionality is invoked, to play an audible notification (audio spinner) 458 , while prebuffering associated media content items or samples. Otherwise the process can continue to prebuffer and play the associated media content items, without playing the audible notification.
- the maximum (acceptable) delay time can be configured to be 500 ms, and the playing duration of the audible notification can be configured to be 700 ms.
- other values can be used to address the needs of a particular implementation or use case.
- the system can be configured to wait before determining that the actual or anticipated delay prior to playback of the associated media content item or sample is greater than the maximum (acceptable) delay time, and then invoking its audible notification functionality.
- the system can be configured to presume that the actual or anticipated delay prior to playback of an associated media content item or sample may be greater than the maximum (acceptable) delay time, and invoke its audible notification functionality regardless, in which case the audible notification may be initiated, and then cross-faded almost immediately and imperceptibly with the associated media content item or sample.
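The audible-notification decision might be sketched as follows (Python; the step labels and function name are assumptions, while the 500 ms maximum delay and 700 ms spinner duration are the example values given above):

```python
def plan_playback(anticipated_delay_ms, max_delay_ms=500, spinner_ms=700):
    """Decide whether to play the audio spinner before the media content.

    If the actual or anticipated delay exceeds the maximum acceptable
    delay, the audible notification plays while prebuffering continues,
    then crossfades into the content; otherwise content plays directly.
    """
    if anticipated_delay_ms > max_delay_ms:
        return [("audio_spinner", spinner_ms),
                ("crossfade_into_content", None)]
    return [("play_content", None)]
```

Under the presumptive variant described above, the caller would simply skip the delay check and always start with the spinner step, crossfading almost immediately if the content is ready.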
- FIG. 14 illustrates the use of audible notifications, in accordance with an embodiment.
- a user interface can display a visual array of media options arranged as a two-dimensional grid, with rows and columns of media options visualized as tile-like card elements, wherein each of the media options is associated with one or more media content items that can be played on the device.
- each of the media options can also be associated with a status that reflects, from the user's perspective, whether that particular option's associated media content item is playing or not, and, if it is playing, whether other media content items are being cross-faded or played at the same time.
- a user can select a particular point or region 460 . If the system determines that the actual or anticipated delay prior to playback of an associated media content item or sample is greater than a maximum (acceptable) delay time, then an audible notification functionality is invoked, to play an audible notification (audio spinner) 462 , while prebuffering associated media content items or samples.
- the system can determine whether the actual or anticipated delay prior to playback of media option A 2 is greater than a maximum (acceptable) delay time.
- FIG. 15 further illustrates the use of audible notifications, in accordance with an embodiment.
- the audible notification can be played for a period of time (for example, 700 ms), and subsequently crossfaded with (into) the playing of media option A 2 , or a sample associated with media option A 2 , using the controlled playback and crossfading techniques described above.
- playing of the audible notification can be terminated earlier if the media option is ready for play. If there is no delay in preparing the media content for play, then it can begin playing almost immediately (or be cross-faded almost immediately) from the perspective of the user.
- FIG. 16 is a flowchart of a method for providing audible notifications, in accordance with an embodiment.
- a computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device.
- playback parameters of the set of one or more media options are adjusted, including crossfading and/or determining a delay prior to start of playback of the associated media content items, or samples thereof.
- an audible notification functionality is invoked, to play an audible notification while prebuffering associated media content items or samples. Otherwise the process can continue to prebuffer and play the associated media content items or samples, without playing the audible notification.
- the system can be configured to wait before determining that the actual or anticipated delay prior to playback of the associated media content item or sample will be greater than the maximum (acceptable) delay time; or alternatively presume that the actual or anticipated delay prior to playback of an associated media content item or sample may be greater than the maximum (acceptable) delay time, and invoke its audible notification functionality regardless, in which case the audible notification may be initiated, and then cross-faded almost immediately and imperceptibly with the associated media content item or sample.
- the media content items or samples associated with the set of one or more media options are played.
- the system can subsequently determine selection of a new point or region within the visual array, and a plurality of media options that are proximate to the new point or region, and adjust the playback parameters of media content items proximate to the new point or region.
- additional features can be provided that improve user interaction, for example the use of media caching (referred to herein in some embodiments as audio touch caching).
- a media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements.
- Each media option can be associated with one or more media content items that can be streamed to and/or played on the device.
- the system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
- the system can take into account behavioral characteristics and other settings which imply a user's data plan preferences, and use such information to determine a caching policy by which the data necessary for prebuffering and preparing the interface will be cached at the media device.
- caching policies can include cache-all, cache-aggressively, cache-casually, and/or don't-cache policies, each of which can have its own features as described in further detail below. In this manner, the caching policy can be used to optimize the user experience without requiring backend changes, for example at the media server, while respecting the user's data plan.
- the media device can have a media playback application and include a touch-sensitive user interface adapted to display a visual array of media options.
- a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate. Selecting a particular media option within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- a cache logic 500 maintains a cache 502 or repository of media content.
- the cache logic is adapted to make cache requests 504 to the media server application, to cache media content 506 within the cache, as content 508 , which can then be used to support the operation of the visual array or other functionalities as described above, including prebuffering and preparing the interface for use.
- the cache logic can determine, based on behavioral characteristics and other settings, whether the user is likely to be using a lower-bandwidth or lower-allowance data plan, versus a higher-bandwidth or higher-allowance data plan, and operate accordingly.
- FIG. 18 further illustrates a system which includes media caching, in accordance with an embodiment.
- the system can take into account user settings 510 , user behavior 512 , and cache settings 514 , and can additionally receive 516 visible option information 518 describing the currently visible options on the visual array. Based on this information, a policy determination component 520 can determine an appropriate caching policy 522 to be used in making cache request to the media server.
- one or more user settings such as a user-specified streaming quality, or synchronization over a 3G network; user behaviors such as use of a number of streams, or online/offline playlists; or cache settings can be used to determine that the user is likely to be using a lower-bandwidth or lower-allowance data plan, or alternatively a higher-bandwidth or higher-allowance data plan.
- FIG. 19 further illustrates a system which includes media caching, in accordance with an embodiment.
- current and/or historical usage information 528 such as the total amount of data transferred to the device within a current or previous temporal interval (for example, within a current or previous monthly billing cycle) can be determined.
- the cache logic including its policy determination component can select 530 from available caching policies 531 , examples of which can include a cache-all policy 532 , a cache-aggressive policy 534 , a cache-casual policy 536 , or a no-caching policy 538 , some features of which are described below.
- the policy descriptions are provided by way of example to illustrate various embodiments, and that in accordance with other embodiments or other implementations other types of caching policies can be used.
- the system can determine to use a cache-all policy when the user settings, user behavior and/or cache settings indicate that Wi-Fi is turned on, and for personalized (e.g., “Your Music”) views during a touch session. Since personalized views have relatively slow data dynamics, caching of media options within such views need occur only seldom compared with other views.
- when a cache-aggressive policy is used, the device and/or media application is adapted to cache media content or sample data which is associated with each of the media options currently displayed and visible on the user interface, but with a strict, e.g., monthly, caching limit, or a caching limit based on some other temporal factor.
- the system can determine to use a cache-aggressive policy when current and/or historical usage information indicates that the user has not cached much data during the current month (for example, less than 10-50 Mb), and/or the user settings suggest that the user does not care that much about data plans (for example, setting a high or extreme streaming quality, or synchronization over 3G), and/or the detected user behavior similarly suggests that the user does not care that much about data plans (for example, a lot of streams versus offline mode, or a larger number of bytes streamed monthly).
- when a cache-casual policy is used, the device and/or media application is adapted to not cache media content or sample data associated with all of the media options currently displayed and visible on the user interface, but instead a subset of those media options that have a higher usage probability per view. For example, in a track list view, users may tend to start by previewing the first two tracks in a list, and so these media options have a higher usage probability than others further down the list.
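Under such a cache-casual policy, the subset selection can be as simple as the following sketch; the choice of the leading items, and the `top_k` parameter, are assumptions reflecting the track-list example above.

```python
# Illustrative sketch: under a cache-casual policy, cache only the visible
# options with the highest assumed usage probability, e.g. the first tracks
# of a list, which users tend to preview first.
def casual_cache_subset(visible_options, top_k=2):
    """Return the leading subset of visible options to cache."""
    return visible_options[:top_k]
```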
- the system can determine to use a cache-casual policy in those situations in which the user has already cached much data during the month, and/or the detected user settings suggest that the user does care about their data plan (for example, setting a low streaming quality, or not allowing synchronization over 3G), and/or the detected user behavior similarly suggests that the user does care about their data plan (for example, a lot of streams from offline playlists compared to streaming).
- when a no-caching policy is used, the device and/or media application is adapted to not cache anything at all until the user starts a touch session.
- the system can determine to use a no-caching policy when the user has already cached a particular amount of data within a particular temporal interval, for example within a monthly billing cycle.
- other caching policies can be used, in addition to different rules for determining, based on user settings, user behavior, cache settings and/or visible option information, which particular caching policy may best suit a particular situation or session.
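The policy-determination step described above can be sketched as a simple mapping from settings, behavior, and usage onto one of the four example policies. The thresholds (e.g., 50 MB) and field names below are illustrative assumptions, not values taken from the specification.

```python
# Hypothetical sketch of a policy determination component: choose among the
# four example caching policies based on user settings, observed behavior,
# and current-cycle usage. All thresholds and keys are assumptions.
CACHE_ALL, CACHE_AGGRESSIVE, CACHE_CASUAL, NO_CACHING = (
    "cache-all", "cache-aggressive", "cache-casual", "no-caching")

def choose_caching_policy(settings, behavior, usage):
    # On Wi-Fi, or in slowly changing personalized views, cache everything.
    if settings.get("wifi") or settings.get("personalized_view"):
        return CACHE_ALL
    # Past a hard per-cycle cap, stop caching until a touch session starts.
    if usage["bytes_cached_this_cycle"] > usage["monthly_cap_bytes"]:
        return NO_CACHING
    # Little data cached so far, and settings/behavior suggest the user is
    # not data-plan conscious: cache every visible media option.
    careless = (settings.get("streaming_quality") == "extreme"
                or settings.get("sync_over_3g")
                or behavior.get("streams_per_month", 0)
                   > behavior.get("offline_plays", 0))
    if usage["bytes_cached_this_cycle"] < 50_000_000 and careless:
        return CACHE_AGGRESSIVE
    # Otherwise cache only the subset of visible options most likely used.
    return CACHE_CASUAL
```

The returned policy name would then drive how many of the currently visible media options are cached per view.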
- FIG. 20 is a flowchart of a method for providing media caching, in accordance with an embodiment.
- a computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device.
- the user interface is prepared for use during a user session, which allows selection of points or regions within the visual array, and determination of media options proximate to the selected points or regions.
- one or more cache settings, user settings, user behavior, and/or visible media option information for media options displayed on the user interface are determined.
- the system can determine, from within a plurality of caching policies, based on current usage information and/or historical usage information, a particular caching policy to be used for the user session.
- cache requests can then be communicated by the media playback application to a media server, to retrieve media content to be cached at the computer system or device for subsequent display at the user interface, in accordance with the determined caching policy.
- Embodiments of the system can be used to support a “What can I preview?” environment, in which media content items can be previewed from different presentation formats, so that a user can easily browse and move between, for example, a music album and a playlist, and then on to an individual song, or to a radio station, or other source of media content or presentation format thereof.
- a media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements.
- Each media option can be associated with one or more media content items that can be streamed to and/or played on the device.
- the system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
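The prebuffering step above can be sketched as fetching a short sample for each visible media option concurrently, while the options are being laid out for display. This is a minimal sketch under assumed names: `fetch_sample` stands in for a request to the media server, and the worker count and sample length are arbitrary.

```python
# Minimal prebuffering sketch: fetch a short sample of every currently
# visible media option in parallel, so a later touch can start playback
# that, from the user's perception, seems immediate.
from concurrent.futures import ThreadPoolExecutor

def prebuffer(visible_options, fetch_sample, sample_seconds=5):
    """Return {option_id: sample_data} for every currently visible option."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {opt: pool.submit(fetch_sample, opt, sample_seconds)
                   for opt in visible_options}
        # Collect results; in a real client this would feed the playback buffer.
        return {opt: fut.result() for opt, fut in futures.items()}
```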
- Such an environment is not restricted to displaying a static set of media content, or one purely determined by, e.g., the media server.
- a user may want to select various media content items (e.g., songs) and add them to a playlist and/or to a collection in different ways.
- a touch logic allows a user to define and/or select both client-side playlists and server-side playlists, so that a selected playlist can be associated with and viewed or played using a touch menu. For example, upon selection of a playlist media content item, a list of media options within that playlist can be displayed. The user can continue to select media options from the playlist, including the use of cross-fading, audible notification (audio spinner), or other features as described above.
- FIG. 21 illustrates a system which includes touch menus, in accordance with an embodiment.
- a media device or player, for example a computing system, handheld entertainment device, smartphone, or other type of media device capable of playing media content, can be provided.
- a media stream service can be used to buffer media content, for streaming to one or more streams; while a media streaming logic can be used to retrieve or otherwise access media content items, and/or samples associated with the media content items, in response to requests from media devices or other systems, and populate the media stream service with streams of corresponding media content data that can be returned to the requesting device.
- the media device can include a media playback application and touch-sensitive user interface adapted to display a visual array of media options.
- a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate. Selecting a particular media option within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- a touch logic 545 allows a user to define and/or select both client-side playlists 546 and server-side playlists 547 .
- the touch logic enables a selected playlist 548 to be associated with and viewed or played using a touch menu 549 , as described in further detail below.
- FIGS. 22A-22B further illustrate a system which includes a touch menu, in accordance with an embodiment.
- the device can display a first screen mode 550 in which a visual array of media options is arranged as a two-dimensional grid, with rows and columns of media options visualized as tile-like card elements, wherein each of the media options is associated with one or more media content items that can be played on the device.
- the multi-track playback logic can adjust the playback volume of media content items associated with proximate media options, by crossfading or otherwise combining their playback, to reflect their relative distances from a selected point or region.
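The crossfading described above can be sketched as a volume function of distance: options nearer the selected point play louder, and the volumes are normalized so the combined playback keeps a constant overall level. The linear falloff below is one assumed volume function among the many the specification allows.

```python
# Sketch of the multi-track crossfade step: map each media option near the
# selected point to a playback volume that reflects its relative distance,
# normalized so the combined output level stays constant.
import math

def crossfade_volumes(point, option_positions, radius=1.0):
    """Return {option_id: volume in (0, 1]} for options within `radius` of `point`."""
    raw = {}
    for opt, (x, y) in option_positions.items():
        d = math.hypot(point[0] - x, point[1] - y)
        if d < radius:
            raw[opt] = 1.0 - d / radius   # louder when closer
    total = sum(raw.values())
    return {opt: v / total for opt, v in raw.items()} if total else {}
```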
- the interface can change to show subcontent or component media content items for the particular media content item, in this example subcontents A-F ( 553 - 558 respectively).
- a user can make a selection of a song album, which causes the system to display as subcontent the songs within that album.
- a user can make a selection of a song playlist, which causes the system to display as subcontent the songs within that playlist.
- FIGS. 23A-23B further illustrate a system which includes a touch menu, in accordance with an embodiment.
- the media options can be associated with playlists.
- a user can browse a named playlist 552 , and make a selection 554 .
- the touch logic can cause the display 559 of the media options to switch as shown, from a first screen mode as described above to a list screen mode 560 .
- the user can continue to select 562 songs from the playlist, including discretely selecting songs, or incorporating the use of crossfade features as described above.
- FIG. 24 further illustrates the use of a touch menu, in accordance with an embodiment.
- an interface can be displayed 564 , including media content items and/or playlists, optionally with cover art 566 for each media content item.
- FIG. 25 further illustrates the use of a touch menu, in accordance with an embodiment.
- an instruction can be received 570 from a user to store one or more selected media options to a playlist, and/or to follow a selected media option.
- Selected songs can optionally be displayed with additional information for the selected media option 571 .
- the user can toggle on/off a follow button 572 , or can swipe 574 a media option to save it to a playlist, in which case the media option can be identified by a suitable indicator 576 .
- FIG. 26 further illustrates the use of a touch menu, in accordance with an embodiment.
- an instruction can be received 580 from a user to display playlists, and a list of available playlist options can be displayed, wherein each playlist option is associated with a playlist of one or more media options and media content items.
- the device can display user personal/playlist information 581 , optionally with buttons to select playlists 582 , songs 583 , or albums 584 .
- the device can display a plurality of playlists, including in this example a playlist A ( 590 ), playlist B ( 591 ), playlist C ( 592 ), playlist D ( 593 ), playlist E ( 594 ), playlist F ( 595 ), playlist G ( 596 ) through playlist N ( 597 ), for selection thereof.
- FIG. 27 further illustrates the use of a touch menu, in accordance with an embodiment.
- the user has selected playlist D.
- in response to receiving a selection 600 of a particular playlist option, the system can display a visual array of media options associated with the selected playlist option, wherein each media option is associated with one or more media content items that can be selected for playing.
- a card element displayed within the visual array can contain other card elements, with subcontent or component media content items associated therewith.
- when the system determines that a user has selected a card element within the visual array which includes other card elements, it causes the resultant grid to zoom in, with the effect of bringing the user closer to, and further immersed in, the content, while revealing a selection of what is inside that card element.
- the user can interact with the contents of the card using the techniques described above, for example by sliding their finger or a cursor over the visual array, changing the audio output and visible representation in a preview-like manner. While the content is zoomed closer to the user, the cover art can be blurred while it scales to fill the entire card element.
- the grid can then be formed, building up from the position of the user's finger and then outwardly.
- Grid items can scale and fade in as they appear. While cover art is being pre-fetched the system can display the custom song placeholder image, and then fade in the cover art when it becomes available.
- the user's selected playlist D is associated with media options X 1 -X 3 ( 602 - 604 ), and with potentially other media options not currently displayed. Since a visual array of media options associated with the selected playlist option is displayed, the visual array can be scrolled or repositioned as described above, to allow new or additional media options associated with the selected playlist option, to be explored by the user.
- a cell can include cover art to its left; when long-pressing a cell that represents content, the cover art to the left can scale up and blur to fill the entire cell or a portion of the screen. At the same time, a representative section of what is inside can be revealed within a horizontal visual array or grid, within which the user can then further interact.
- FIG. 28 further illustrates the use of a touch menu, in accordance with an embodiment.
- the user has now moved the selected point or region downwards and has selected playlist E.
- the system can display a visual array of media options associated with the selected new playlist option.
- a crossfaded or combined result of the playlists or media options therein can be provided as a played-back media content to the user, using the above techniques and based on the position of the selected point or region.
- the user's selected new playlist E is associated with media options Y 1 -Y 3 ( 612 - 614 ), and with potentially other media options not currently displayed.
- the visual array can be scrolled or repositioned, to allow new or additional media options associated with the selected playlist option, to be explored by the user.
- FIGS. 29-30 further illustrate the use of a touch menu, in accordance with an embodiment.
- the selected option can scale up 615 to fill a portion of the screen, while other array components can be similarly adjusted in their presentation size.
- the visual array can again be scrolled or repositioned, to allow new or additional media options to be explored by the user, including asymmetric adjustment in their presentation size 616 to better reflect the position of the user's finger and guide them in their previewing.
- FIGS. 31-33 further illustrate the use of a touch menu, in accordance with an embodiment.
- an instruction can be received 617 from a user to display playlists, and a list of available playlist options can be displayed, wherein each playlist option is associated with a playlist of one or more media options and media content items.
- in response to receiving a selection 618 of a particular playlist option, the system can display a visual array of media options associated with the selected playlist option, wherein each media option is associated with one or more media content items that can be selected for playing.
- FIGS. 34-35 further illustrate the use of a touch menu, in accordance with an embodiment.
- an instruction can be received 620 from a user to display playlists, and a list of available playlist options can be displayed, wherein each playlist option is associated with a playlist of one or more media options and media content items.
- a user can select a next song to be played 621 , for example by tapping the preview, or by holding their finger on a media option until a selection symbol (e.g., circle) is rendered.
- the previewed (and now selected) media content can then be made the active media content (e.g., active song) 622 .
- the user can tap off the preview, i.e., quickly lift and tap on the same spot, to trigger the preview as the current song.
- Such an embodiment is useful for environments in which the user is allowed to select the next song.
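The "tap off" gesture above can be sketched as a small check on the lift and tap events: a tap that re-hits approximately the same spot within a short window promotes the previewed track to the active track. The time and distance thresholds below are illustrative assumptions.

```python
# Hypothetical sketch of "tap off" detection: the user quickly lifts and
# taps on (about) the same spot to trigger the preview as the current song.
import math

def is_tap_off(lift_pos, lift_time, tap_pos, tap_time,
               max_delay=0.3, max_distance=20.0):
    """True if the tap re-hits roughly the lift point quickly enough.

    Positions are (x, y) in screen points; times are in seconds.
    """
    dt = tap_time - lift_time
    dist = math.hypot(tap_pos[0] - lift_pos[0], tap_pos[1] - lift_pos[1])
    return 0.0 <= dt <= max_delay and dist <= max_distance
```

When this returns true, the client would make the previewed media content the active media content.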
- FIG. 36 is a flowchart of a method for providing a touch menu, in accordance with an embodiment.
- a computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device.
- an input is received from a user, at the user interface, including a selection of a card element, point or region within the visual array corresponding to a selected media option.
- the system can determine one or more media options associated with or proximate to the selected card element, point or region, for use in adjusting playback of those media options.
- the system can display a visual array of additional media options associated with the selected media options, wherein each media option in the visual array of additional media options is similarly associated with one or more media content items that can be selected for playing.
- at step 634, in response to receiving a selection from a user, at the user interface, of a new media option, the system can display a revised visual array of media options that are associated with the selected new media option.
- FIG. 37 is a flowchart of a method for use by a server in providing a touch menu, in accordance with an embodiment.
- a plurality of media content items, and/or samples associated with the media content items that can be streamed to and/or played on a client computer system or device are provided at a media server, within a database or repository.
- at step 642, input is received from a client computer system or device having a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more of the media content items, including a selection of a card element, point or region within the visual array corresponding to a selected media option.
- one or more media options associated with or proximate to the selected card element, point or region are determined, for use in adjusting playback of those media options.
- media content associated with the one or more media options associated with or proximate to the selected card element is streamed to the client computer system or device, for use by the client computer system or device in displaying a visual array of additional media options associated with the selected media options, wherein each media option in the visual array of additional media options is similarly associated with one or more media content items that can be selected for playing.
- FIG. 38 is a flowchart of a method for use by a client device in providing a touch menu, in accordance with an embodiment.
- a client computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device.
- at step 652, input is received from a user, at the user interface, including a selection of a card element, point or region within the visual array corresponding to a selected media option.
- the selection of card element, point or region within the visual array corresponding to a selected media option is communicated to a media server, for use by the server in determining one or more media options associated with or proximate to the selected card element, point or region, for use in adjusting playback of those media options.
- streamed media content associated with the one or more media options associated with or proximate to the selected card element is received from the media server.
- a visual array of additional media options associated with the selected media options is displayed, wherein each media option in the visual array of additional media options is similarly associated with one or more media content items that can be selected for playing.
- the system can include support for force-sensitive touch input in selection, playback, or other interaction with media options.
- a media device can include an interface that responds differently to variations in input pressure exerted by a user, wherein the device is configured so that an amount of pressure applied by the user in touch-selecting a particular media option or other input region can be used to affect the operation of the device, interface or menu options; or the selection, playback, or other interaction with an associated media content item.
- a less-forceful or light touch by the user upon a particular media option can be used to cause the system to provide an audio preview of an associated song; while a more-forceful, hard, or firm touch by the user can cause the system to switch to playing that song.
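The light-versus-firm distinction above reduces to a pressure threshold. The sketch below assumes a normalized pressure value and an arbitrary threshold; real devices expose pressure through their platform touch APIs.

```python
# Sketch of force-sensitive input handling: a light touch previews the
# associated song, while a firm touch past a pressure threshold switches
# playback to that song. The threshold value is an assumption.
def action_for_pressure(pressure, firm_threshold=0.6):
    """Map a normalized touch pressure in [0, 1] to 'preview' or 'play'."""
    return "play" if pressure >= firm_threshold else "preview"
```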
- FIGS. 39A-39C illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment.
- the system can be configured so that, if it determines a media content item has been selected during preview, the system can, for example, play the remainder of that media content item to its end, or otherwise allow the previewed media content to be made the active media content (e.g., active song).
- a user interface 700 can display media options, and can respond to variations in the amount of pressure applied by the user in touch-selecting a particular media option or other input region, such that a user can use a light touch 704 on a media option to cause the system to provide a preview, e.g., an audio preview of a song 706 . If the user continues to use a light touch, they can similarly continue to preview other media options.
- FIGS. 40A-40C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment.
- if, upon previewing a song, the user instead uses a firm touch 708 , then upon the system detecting the firm touch, the previewed media content can be selected for play as an active media content (e.g., song) 710 .
- the user interface can be updated to indicate the selection.
- FIGS. 41A-41C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment.
- a menu or information can be displayed for the previewed or selected media content 712 , for example as a menu 714 of options to add the selected media content to a playlist or queue, or to provide other information or options.
- Embodiments of the present invention can be conveniently implemented using one or more conventional general-purpose or specialized digital computers, computing devices, machines, or microprocessors, including one or more processors, memory and/or computer readable storage media programmed according to the teachings of the present disclosure.
- Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art.
- the present invention includes a computer program product which is a non-transitory storage medium or computer readable medium (media) having instructions stored thereon/in which can be used to program a computer to perform any of the processes of the present invention.
- the storage medium can include, but is not limited to, any type of disk including floppy disks, optical discs, DVD, CD-ROMs, microdrive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data.
Abstract
Description
- This application is a continuation of U.S. patent application titled “SYSTEM AND METHOD FOR PLAYBACK OF MEDIA CONTENT WITH SUPPORT FOR AUDIO TOUCH FUNCTIONALITY”, application Ser. No. 14/879,774, filed on Oct. 9, 2015; which claims the benefit of priority to U.S. Provisional Applications “SYSTEM AND METHOD FOR PLAYBACK OF MEDIA CONTENT WITH AUDIO SPINNER FUNCTIONALITY”, Application No. 62/062,573, filed Oct. 10, 2014; “SYSTEM AND METHOD FOR PLAYBACK OF MEDIA CONTENT WITH SUPPORT FOR AUDIO TOUCH CACHING”, Application No. 62/062,580, filed Oct. 10, 2014; “SYSTEM AND METHOD FOR PLAYBACK OF MEDIA CONTENT WITH AUDIO TOUCH MENU FUNCTIONALITY”, Application No. 62/062,582, filed Oct. 10, 2014; and “SYSTEM AND METHOD FOR PLAYBACK OF MEDIA CONTENT WITH SUPPORT FOR FORCE-SENSITIVE TOUCH INPUT”, Application No. 62/217,767, filed Sep. 11, 2015; each of which applications is herein incorporated by reference.
- This application is related to U.S. patent application titled “SYSTEM AND METHOD FOR MULTI-TRACK PLAYBACK OF MEDIA CONTENT”, application Ser. No. 14/228,605, filed Mar. 28, 2014, which application is herein incorporated by reference.
- A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
- Embodiments of the invention are generally related to means for providing music, video, or other media content, and are particularly related to features for improving user interaction, such as use of audible notifications, media caching, and touch menus.
- The digital media industry has evolved greatly within the past several years. Today's consumers enjoy the ability to access a tremendous amount of media content, such as music and videos, at any location or time of day, using a wide variety of computing systems, handheld entertainment devices, smartphones, or other types of media device. With the availability of reliable high-speed Internet connectivity, and advances in digital rights management, many users can now stream media content, on demand, from peer devices or remote servers.
- However, with the increase in the amount of media content available, there exists the challenge of how to best provide access to that content. Much interest has been directed to techniques that enable users to interact with media content libraries in a user-friendly and intuitive manner which does not interfere with their enjoyment of the content. These are generally the types of environment in which embodiments of the invention can be used.
- In accordance with an embodiment, described herein is a system and method for playback of media content, for example music, video, or other media content. A media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements. Each media option can be associated with one or more media content items that can be streamed to and/or played on the device. The system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate. In accordance with various embodiments, additional features can be provided that improve user interaction, for example the use of audible notifications, media caching, or touch menus, including support for force-sensitive touch input.
-
FIG. 1 illustrates a system for playback of media content, in accordance with an embodiment. -
FIG. 2 illustrates playback of media content, in accordance with an embodiment. -
FIG. 3 illustrates an example of a playback volume function, in accordance with an embodiment. -
FIGS. 4A-4B further illustrate playback of media content, in accordance with an embodiment. -
FIG. 5 illustrates a media device with an exemplary user interface which supports playback of media content, in accordance with an embodiment. -
FIG. 6 further illustrates a user interface, in accordance with an embodiment. -
FIG. 7 further illustrates a user interface, in accordance with an embodiment. -
FIG. 8 further illustrates a user interface, in accordance with an embodiment. -
FIG. 9 further illustrates a user interface, in accordance with an embodiment. -
FIG. 10 further illustrates a user interface, in accordance with an embodiment. -
FIG. 11 further illustrates a user interface, in accordance with an embodiment. -
FIG. 12 illustrates use of the system to append playback of media content, in accordance with an embodiment. -
FIG. 13 illustrates a system which includes support for audible notifications, in accordance with an embodiment. -
FIG. 14 illustrates the use of audible notifications, in accordance with an embodiment. -
FIG. 15 further illustrates the use of audible notifications, in accordance with an embodiment. -
FIG. 16 is a flowchart of a method for providing audible notifications, in accordance with an embodiment. -
FIG. 17 illustrates a system which includes media caching, in accordance with an embodiment. -
FIG. 18 further illustrates a system which includes media caching, in accordance with an embodiment. -
FIG. 19 further illustrates a system which includes media caching, in accordance with an embodiment. -
FIG. 20 is a flowchart of a method for providing media caching, in accordance with an embodiment. -
FIG. 21 illustrates a system which includes touch menus, in accordance with an embodiment. -
FIGS. 22A-22B further illustrate a system which includes a touch menu, in accordance with an embodiment. -
FIGS. 23A-23B further illustrate a system which includes a touch menu, in accordance with an embodiment. -
FIG. 24 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 25 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 26 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 27 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 28 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 29 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 30 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 31 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 32 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 33 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 34 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 35 further illustrates the use of a touch menu, in accordance with an embodiment. -
FIG. 36 is a flowchart of a method for providing a touch menu, in accordance with an embodiment. -
FIG. 37 is a flowchart of a method for use by a server in providing a touch menu, in accordance with an embodiment. -
FIG. 38 is a flowchart of a method for use by a client device in providing a touch menu, in accordance with an embodiment. -
FIGS. 39A-39C illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment. -
FIGS. 40A-40C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment. -
FIGS. 41A-41C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment. - As described above, today's consumers of digital media enjoy the ability to access a tremendous amount of media content, such as music and videos, at any location or time of day, using a wide variety of computing systems, handheld entertainment devices, smartphones, or other types of media devices. However, with the increase in the amount of media content available, there exists the challenge of how best to provide access to that content in a user-friendly and intuitive manner.
- To address this, in accordance with an embodiment, described herein is a system and method for playback of media content, for example music, video, or other media content. A media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements. Each media option can be associated with one or more media content items that can be streamed to and/or played on the device. The system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- In accordance with various embodiments, additional features can be provided that improve user interaction, for example the use of audible notifications, media caching, or touch menus, including support for force-sensitive touch input.
-
FIG. 1 illustrates a system for playback of media content, in accordance with an embodiment. As shown in FIG. 1, in accordance with an embodiment, a media device or player 100, for example a computing system, handheld entertainment device, smartphone, or other type of media device capable of playing media content, can be used to play media content that is provided by a computer system operating as a media server 102, or from another system or peer device.
- Each of the media device and the computer system operating as the media server can include, respectively, one or more physical computer or hardware resources.
- In accordance with an embodiment, the media server can include an operating system or other processing environment which supports execution of a software application environment 110, including a media server application 114 which can be used, for example, to stream music, video, or other forms of media content. A media stream service 120 can be used to buffer media content, for streaming to one or more streams. A media application interface 128 can receive requests from media devices or other systems, to retrieve media content from the media server.
- Media content or items 131 (generally referred to herein as media content items), and/or samples 132 associated with the media content items, can be provided, for example, within a database or repository, or can be received at the media server from another source.
- In accordance with an embodiment, each media content item that can be provided by the media server can be associated with one or more samples. For example, in accordance with an embodiment, a particular media content item can be associated with a plurality of samples of different playing durations, each of which can be taken from different segments (for example, the beginning, or the middle) of the media content item. The samples can be similarly stored at, and thereafter provided by, the media server. The determination of which of the plurality of samples to use at a particular point in time depends on the particular implementation, and may also take into account realtime considerations such as balancing network bandwidth usage versus providing a smoother user experience.
- In accordance with an embodiment, the samples can be snippets or fragments of an associated media content that are determined by a media content producer (e.g., a record label) to reflect that particular media content (e.g., a particular song track) created by that content producer. For example, a song snippet may be a particularly recognizable portion of a particular song. Similarly, a video content snippet may be a particularly recognizable portion of a particular video content. In accordance with various embodiments, other types of samples or snippets can be used to provide a preview of an associated media content.
- For example, in accordance with an embodiment, the system can use 30-, 60-, or 90-second audio-preview snippets for every song track. Longer snippets can give the user a sufficient audio impression of tuning into a particular track and being able to hear it through to its end, after which the player can continue playing whatever is next in that context, providing an "on-demand" experience.
- In accordance with an embodiment, a media streaming logic 130 can be used to retrieve or otherwise access the media content items, and/or the samples associated with the media content items, in response to requests from media devices or other systems, and populate the media stream service with streams of corresponding media content data that can be returned to the requesting device.
- As described above, in accordance with an embodiment, the media device can be a computing system, handheld entertainment device, smartphone, or other type of device that can play back media content. Although in FIG. 1 only a single media device and media server are shown, in accordance with an embodiment, the media server can support the simultaneous use of multiple media devices, and/or the media device can simultaneously access media content at multiple media servers.
- In accordance with an embodiment, the media device can include a user interface 140, which is adapted to display or otherwise provide a visual array of media options 142, for example as a two-dimensional grid or list of card elements, or another visual array format, and determine a user input. Examples of various embodiments of visual arrays are described in further detail below.
- Selecting a particular media option, e.g., a particular card element, within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- For example, in accordance with various embodiments, the software application environment at the media server can be used to stream or otherwise communicate music, video, or other forms of media content to the media device, wherein the user interface at the media device is adapted to display a plurality of music or video titles that correspond to music or videos stored as media content items in a database or repository at the media server.
- In accordance with an embodiment, the media device can include a media playback application 143, together with a multi-track playback logic 144, prebuffering logic 145, and playback volume function 146, which can be used to control the playback and crossfading of media content items and/or samples that are received from the media server application, for playback by the media device, as described in further detail below.
- In accordance with an embodiment, the prebuffering logic is configured to load or pre-buffer a portion of each media content item, sample, or snippet, at the media device, as determined by the multi-track playback logic. While media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
- In accordance with an embodiment, a user 150 can interact 152 with the application user interface and issue requests, for example the playing of a selected music or video item on their media device.
- In accordance with an embodiment, the user's selection of a particular media option can be communicated 153 to the media server application, via the media application interface. The media server application can then stream corresponding media content 155, including one or more streams of media content data, to the media device.
- At the media device, in response to the user's interaction with the user interface, the media playback application, multi-track playback logic, and playback volume function can combine, crossfade, or otherwise play 165 the requested media content to the user, for example by playing back one or more music or video items on the media device, as described in further detail below.
-
FIG. 2 illustrates playback of media content, in accordance with an embodiment. In accordance with an embodiment, a user interface can display a visual array of media options arranged as a two-dimensional grid, with rows and columns of media options visualized as tile-like card elements, wherein each of the media options is associated with one or more media content items that can be played on the device. As shown in FIG. 2, in the example illustrated therein, four media options A (170), B (171), C (172), and D (173) are visualized as tile-like card elements, each of which has a media option center 174 (illustrated in the figure as a point), a relatively smaller media preview/select area 175 that is centered on the media option center, and a relatively larger media play/crossfade area 176 that generally covers the media option, and, depending on the particular implementation, can also extend to cover portions of other card elements or media options. - In accordance with an embodiment, a plurality of media options, for example a set of song tracks, a music playlist, or the contents of an album or a media library, can be represented on the user interface as an array of tiles, wherein each tile can be associated with a particular visualization, for example cover art identifying a represented song track.
- In accordance with other embodiments, other forms of visualization can be used for the media options, for example texts, colors, images, or animations. While a selected point or region is moved within the grid of media options, the visualization or appearance of those media options that are proximate to the selected point or region can be modified, for example by varying their opacity, to reflect their status as proximate media options.
- For example, in accordance with an embodiment, the opacity of a particular point or region, including the closest media option/tile and/or proximate media options/tiles, can be modified to render the closest or proximate media options to a selected point or region in a more visible manner than other (not selected, or not proximate) options/tiles.
- In accordance with an embodiment, a user can provide input as a user selection of a point or region 180. In accordance with an embodiment, the user interface can be a touch-sensitive user interface, which recognizes input in the form of touch, for example the position of a user's finger or a stylus upon the user interface, to determine the selected point or region as it is being moved, in response to a user input, within the visual array of media options (referred to herein in some embodiments as "audio touch" or "audiotouch"). In the case of a mouse-based interface, the input can be provided by a mouse-down event. - In accordance with an embodiment, the system can, upon receiving the user input, initialize playback of those media options associated with the selected point or region. Selected media options (e.g., music or song tracks) can be played simultaneously according to a playback volume function, wherein playback parameters, such as the playback volume, depend on the distance between the point of input and a specified point of the media option's (e.g., the song track's) array or tile visualization.
-
FIG. 3 illustrates an example of a playback volume function 182, in accordance with an embodiment. In accordance with an embodiment, the playback volume of a media content item can be determined as a function of distance, for example: -
y = max(0, min(1, ((x − 0.5) / (2(C − 0.5))) + 0.5)) - wherein
-
- x represents the distance 183 between the selected point or region and a particular media option element, such as its center (for example, when x=0, the user's finger is considered directly on the center of the particular media option, whereas when x=1, the user's finger is on a media option that is adjacent to the particular media option);
- y represents a playback volume 184 (for example, 0 being silent, and 1 being a maximum or full playback volume); and
- C is a constant which reflects a distance from a particular media option's center that still results in full playback volume, such as the preview/select area described above.
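The playback volume function above can be sketched as follows (an illustrative implementation; the function name is ours, and C=0.2 is one of the example constants shown in FIG. 3):

```python
def playback_volume(x, c=0.2):
    """Playback volume y = max(0, min(1, (x - 0.5)/(2*(c - 0.5)) + 0.5)),
    where x is the normalized distance from a media option's center and
    c is the radius of the full-volume preview/select area."""
    return max(0.0, min(1.0, (x - 0.5) / (2 * (c - 0.5)) + 0.5))

# Directly on the media option's center: full volume.
print(playback_volume(0.0))  # 1.0
# Still within the preview/select area (distance <= C): full volume.
print(playback_volume(0.2))  # 1.0
# Midway between two adjacent options: half volume from each.
print(playback_volume(0.5))  # 0.5
# On an adjacent option's center: this option is silent.
print(playback_volume(1.0))  # 0.0
```

Larger values of C widen the plateau where the option plays at full volume, and correspondingly narrow the crossfade band between adjacent options.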
- As shown in FIG. 3, depending on the value used for C (examples of which for C=0.0, C=0.2, and C=0.4 are illustrated), different volume/distance behaviors 185 can be defined, which can be used to determine the size of the media preview/select area 175, the size of the media play/crossfade area 176, and the crossfading behavior, to address the needs of a particular implementation, or to suit a desired user experience. - The example illustrated in
FIG. 3 is provided for purposes of illustration. In accordance with other embodiments, or to address the needs of other implementations, other types of playback volume functions, including the use of different functions, criteria, and constants, can be used. - In accordance with an embodiment, in the case of a visual array representation, such as a grid, that uses a plurality of tiles, the system can use the middle point of a tile as the point for calculating distance 179 from the selection point or region. In accordance with an embodiment, if the distance is zero, then the system considers that determination to be an actual selection by the user of that media option (e.g., that song track). In accordance with an embodiment, since it may be difficult for a user to precisely select the center of a tile, an area (e.g., 20-50%) of each tile, generally corresponding to the media preview/select area in FIG. 2, can be treated as selecting the center of that particular option. - In accordance with an embodiment, if the selected point or region is more than one tile size away from a particular media option, then the playback volume of that particular media option is set to zero. In accordance with an embodiment, a two-dimensional grid can measure relative distance along both x and y axes. In the case of a one-dimensional array, for example a vertical list, the system need only determine relative distance along one axis (e.g., the y axis), since it will not matter where the finger is along the x axis within a particular tile.
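The distance computations just described can be sketched as follows (a minimal illustration in normalized tile units; the function names are ours, not from the embodiment):

```python
import math

def grid_distance(selection, center):
    """Distance in tile units between a selected (x, y) point and a tile
    center, measured along both axes for a two-dimensional grid."""
    return math.hypot(selection[0] - center[0], selection[1] - center[1])

def list_distance(selection_y, center_y):
    """For a one-dimensional vertical list, only the y axis matters; the
    finger's x position within a tile is ignored."""
    return abs(selection_y - center_y)

# A tile two columns away is more than one tile size from the selection,
# so its playback volume would be set to zero.
assert grid_distance((0.0, 0.0), (2.0, 0.0)) > 1.0
# A distance of zero is treated as an actual selection of that option.
assert grid_distance((1.0, 1.0), (1.0, 1.0)) == 0.0
# In a list, horizontal position does not change the computed distance.
assert list_distance(0.5, 1.5) == 1.0
```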
- In accordance with an embodiment, media content items (e.g., song tracks), that are assigned a playback volume value of zero, are not being played. In accordance with an embodiment, while the triggering user input still lasts (for example, the user explores the available media options by keeping their finger touching the screen while moving, or the mouse button is held down), changing the input position (e.g., moving the finger, or moving the mouse cursor respectively), the system will recalculate the relative combination of the media content in the output, providing an audio crossfading effect that is controllable by the user.
- In accordance with an embodiment, after ending the triggering input (e.g., the user releasing their finger, or releasing the mouse button respectively), then depending on the particular implementation, the media content item that is nearest the last movement input may either continue to play, or the playback can stop.
- In accordance with an embodiment, while displaying a grid of tile-like card elements, the media device can pre-buffer a specified number of bytes from each audio snippet, for example 1 to 5 seconds' worth. This enables the system, upon receiving a user input, to play back the track immediately using the pre-buffered data, and continue fetching the rest of it. This allows for minimal latency in starting playback, which results in a compelling user experience.
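One way such pre-buffering might be structured is sketched below; the class, the fetch callback, and the byte-rate figures are illustrative assumptions, not part of the embodiment:

```python
class SnippetPrebuffer:
    """Caches the first few seconds of each visible snippet so playback
    can start immediately from local data while the rest is fetched."""

    def __init__(self, seconds=3, bytes_per_second=16000):
        self.prefix_bytes = seconds * bytes_per_second
        self.cache = {}  # snippet_id -> leading bytes of the snippet

    def prebuffer(self, snippet_id, fetch):
        # `fetch(snippet_id, length)` is a hypothetical transport call
        # returning up to `length` leading bytes of the snippet.
        if snippet_id not in self.cache:
            self.cache[snippet_id] = fetch(snippet_id, self.prefix_bytes)

    def start_playback(self, snippet_id):
        # Return the cached prefix for minimal-latency playback; the
        # remainder of the snippet would continue streaming in parallel.
        return self.cache.get(snippet_id, b"")

# Simulated transport: returns `length` bytes of placeholder audio data.
fake_fetch = lambda snippet_id, length: bytes(length)

pb = SnippetPrebuffer(seconds=2, bytes_per_second=10)
pb.prebuffer("track-1", fake_fetch)
assert len(pb.start_playback("track-1")) == 20
```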
-
Listing 1 provides exemplary pseudocode of a method for determining multi-track playback of media content, in accordance with an embodiment. The pseudocode shown therein is provided for purposes of illustration. In accordance with other embodiments, other methods of determining multi-track playback of media content can be used.
Listing 1
    Number clamp(Number x) {
        return max(0, min(1, x))
    }

    # Calculate the volume for a given media element and a selection
    # position. The coordinate system is assumed to be normalized so
    # that the distance between different media elements is 1.
    Number calculateMediaPlaybackVolume(
            Vector mediaPosition, Vector selectionPosition) {
        Number distance = |mediaPosition − selectionPosition|
        Number cappedDistance = 0.2
        return clamp((distance − 0.5) / 2(cappedDistance − 0.5) + 0.5)
    }
- In accordance with an embodiment, the system can determine whether the user's selection of a particular point or region is less distant from the center of a media option than a defined distance. If it is, then the playback volume for that media option is set to a relative value of 1 (within a range of 0 to 1), which makes it easier for the user to select a media option preview point without media noise from nearby media options. The playback volume can be determined to be 1 when the user selection is within the preview/select area, tapering off to 0 at a distance generally corresponding to the play/crossfade area.
- Having calculated a clamped distance of the user selection with respect to each of a plurality of media options, the system can then determine relative playback volume based on that distance, with shorter distances having higher playback volume, and longer distances having lower playback volume.
- For example, as shown in Listing 1:
-
Number distance = |mediaPosition − selectionPosition| - In accordance with an embodiment, if the distance from a selection to the media element's position is less than cappedDistance, the volume will be 1, which makes it easier to hit the preview point without hearing noise from tracks nearby. In order for this to work properly, the volume must be zero when the distance is more than 1 − cappedDistance; otherwise the media element could be playing when another element should be the only one being played back. In this example, the value for cappedDistance must be within (0, 0.5).
-
Listing 1 illustrates a clamped linear function that meets the following requirements: -
- 0 ≤ f(x) ≤ 1 (i.e., the volume is never less than silent, nor more than full volume);
- f(x)+f(1−x)=1 (i.e., while the user continually moves the selection between the adjacent media elements, the sum of the volumes of those two media elements is full volume);
- f is monotonically decreasing (i.e., while the user moves the selection away from a media element the volume never increases); and
- f(cappedDistance)=1 (i.e., the volume at this distance is at max even when the selection isn't exactly at the very center of the media element).
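These four requirements can be verified numerically against the clamped linear function of Listing 1 (an illustrative check, using the example value cappedDistance = 0.2):

```python
CAPPED = 0.2  # cappedDistance; must lie within (0, 0.5)

def f(x):
    # Clamped linear volume function from Listing 1.
    return max(0.0, min(1.0, (x - 0.5) / (2 * (CAPPED - 0.5)) + 0.5))

xs = [i / 100.0 for i in range(101)]
# Volume never leaves [0, 1].
assert all(0.0 <= f(x) <= 1.0 for x in xs)
# Adjacent media elements always crossfade to a combined full volume.
assert all(abs(f(x) + f(1.0 - x) - 1.0) < 1e-9 for x in xs)
# Moving away from an element never increases its volume.
assert all(f(a) >= f(b) for a, b in zip(xs, xs[1:]))
# Volume is already at maximum at the cap distance, not only at the center.
assert f(CAPPED) == 1.0
```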
- In accordance with an embodiment, as a visualization feature, while the user moves the selected point or region, for example by moving their finger over the media options, the opacity of the tiles can also be modified using a distance-based function similar to the one used to calculate playback volume.
-
FIGS. 4A-4B further illustrate playback of media content, in accordance with an embodiment. As shown in FIG. 4A, in the example user interface illustrated therein, a visual array of media options A1 through A4 (202-205) is shown, each of which is associated with one or more media content items. A user can make a selection of a particular media option, for example by placing a mouse cursor or some other selector at a point or region within the visual array. - As described above, in accordance with an embodiment, the user interface can be a touch-sensitive user interface, which recognizes input in the form of touch, for example the position of a user's finger or a stylus upon the user interface, to determine the selected point or region within the visual array grid of media options.
- In response to receiving an input from the user interface, the multi-track playback logic can determine a set of one or more of the plurality of media options that are proximate to the selected point or region (in this example, media options A1, A2 and A3), and, together with its playback volume function, adjust playback parameters, such as the playback volume 190 of the set of media content items associated with those media options, by crossfading or otherwise combining the playback of the set of media content items to reflect their relative distances from the selected point or region.
FIG. 4A , the user may perceive A2 as being dominantly played, with some crossfading from sources A1 and A3. - In accordance with an embodiment, as shown in
FIG. 4B , while the user moves their, e.g., finger, stylus, mouse cursor or other selector, to change their selection, the system can determine a new point or region or selection, and a plurality of media options that are proximate to the new point or region (in this example, media options A2, A3 and A4). - In response to receiving the input from the user interface, the multi-track playback logic, together with its playback volume function, can again adjust playback parameters, such as the playback volume of the set of media content items associated with those media options, by crossfading or otherwise combining the playback of the set of media content items to reflect their relative distances from the newly selected point or region. The output can then be provided as different played-back or crossfaded media to the user, e.g., as a different set of crossfaded songs.
- For example, while the selected point or region is moved from that shown in
FIG. 4A to that shown inFIG. 4B , the relative playback volume of media content item A1 and A2 are decreased (in this example the playback volume of A1 is reduced almost to zero), while the relative playback volume of media content item A3 and A4 are increased, reflecting their relative distances from the selected point or region. Particularly, the relative playback volume of A3 is increased almost to the exclusion of other media content items, reflecting the much shorter distance between A3 and the user's selected point or region. In the example illustrated inFIG. 4B , the user may perceive as A3 being dominantly played, with little or no contribution or crossfading from any other sources. - In accordance with an embodiment if, as shown in
FIG. 4B , the user's finger is still held down and a sample associated with the media content for A3 ends, then playback of that media content can be repeated from the beginning of the sample. -
FIG. 5 illustrates a media device with an exemplary user interface which supports playback of media content, in accordance with an embodiment. As shown in FIG. 5, in accordance with an embodiment, a user interface can display, for example on a media device, a visual array of media options arranged as a two-dimensional grid of card elements, with rows and columns of media options visualized as tiles, here illustrated as A1-An through E1-En (202-249). Each of the media options is associated with one or more media content items that can be played on the device.
- For example, in accordance with an embodiment, media content items can be either not selected and not playing 262; proximate to a selected point or region and playing simultaneously with other media content items 264 (i.e., from the perspective of the user, with some perceptible crossfading of other media content items); proximate to a selected point or region but playing dominantly (i.e., from the perspective of the user, with little or no contribution or crossfading of other media content items) 266; or selected and playing (i.e., by itself with no other media content items playing simultaneously) 268.
-
FIG. 6 further illustrates a user interface, in accordance with an embodiment. As shown in FIG. 6, for example, a user may initially select a region (180) of the user interface generally located in the region of, but not precisely upon, media option B2 (213), whose neighboring or proximate media options include A1 (202), A2 (203), A3 (204), B1 (212), B3 (214), C1 (222), C2 (223) and C3 (224).
- In this example, the user may perceive an output from their media device in which media content B2 is being dominantly played, with some perceptible combination of one or more of its neighboring or proximate media options as illustrated in
FIG. 6 . -
FIG. 7 further illustrates a user interface, in accordance with an embodiment. As shown in FIG. 7, for example, a user may move their finger, mouse cursor, or other selector, to select or explore a new point or region of the user interface generally located in the region of media option C3, but which is also proximate to media options B2, B3, B4, C2, C4 (225), D2 (233), D3 (234) and D4 (235). The multi-track playback logic can again adjust playback parameters, such as the playback volume of the set of media content items associated with these media options, by crossfading or otherwise combining their playback, to reflect their relative distances from the selected point or region.
FIG. 7 . - Additionally, while the user moves their finger, mouse cursor, or other selector from the position shown in
FIG. 6, to the position shown in FIG. 7, they may perceive a crossfading of media output as the multi-track playback logic gradually adjusts the playback volume of media content items from the initial output, in which B2 is being dominantly played with some perceptible combination of one or more of its neighboring or proximate media options, to the subsequent output, in which C3 is being dominantly played with some perceptible combination of one or more of its neighboring or proximate media options.
-
FIG. 8 further illustrates a user interface, in accordance with an embodiment. - As shown in
FIG. 8 , for example, a user may again move their finger, mouse cursor, or other selector, to select or explore another new point or region of the user interface generally located at media option B3, but which is also proximate to media options A3 (204), B2, B4 (215), C2, C3 and C4. - Again, the multi-track playback logic can adjust playback parameters, such as the playback volume of the set of media content items associated with these media options, by crossfading or otherwise combining their playback, to reflect their relative distances from the selected point or region, in this example using just seven media options.
- Again also, the user may perceive a crossfading of media output as the multi-track playback logic gradually adjusts the playback volume of media content items from the original output in which C3 is being dominantly played, to the subsequent output in which B3 is being dominantly played.
-
FIG. 9 further illustrates a user interface, in accordance with an embodiment. As shown in FIG. 9, in accordance with an embodiment, if the user moves their finger, mouse cursor, or other selector, to select the center of a point or region of the user interface generally located at a media option, and leaves it there for a period of time, then that media content item can be selected, and played by itself (i.e., from the perspective of the user, with no other media content items playing simultaneously).
FIG. 10 further illustrates a user interface, in accordance with an embodiment. As shown in FIG. 10, in accordance with an embodiment, while a media content item is selected and being played, the grid can be automatically scrolled or repositioned 315, both generally centering the currently selected point or region, and in this example displaying on the user interface new or additional media options X1-X8 (302-309), which can be subsequently selected by the user.
FIG. 11 further illustrates a user interface, in accordance with an embodiment. As shown in FIG. 11, for example, a user may select a region of the user interface generally located at media option A1 (202), with neighboring or proximate media options A2, B1, X1 (302), X2 (303) and X6 (307). -
The process can generally continue as described above, with the user continuing to move the selected point or region within the visual array, to further explore media options, and the playback of proximate media content items continually adjusted, by crossfading or otherwise combining their playback. For example, when the grid is automatically scrolled or repositioned, the new or additional media options can be explored by the user, or offered as suggestions to browse and experience new media content with which they had not previously been familiar.
- In accordance with an embodiment, the system can be configured so that, if it determines that a media content item has been selected, for example by detecting that the user's finger is lifted from the user interface while playing a sample or snippet, the system can, for example, play the remainder of that media content item to its end, by transitioning or otherwise appending the media content, at an appropriate point within its stored content, to the previously-played sample or snippet. Playback then flows from the end of the previously-played sample or snippet into the remainder of the media content item.
-
FIG. 12 illustrates use of the system to append playback of media content, in accordance with an embodiment. As shown in FIG. 12, in accordance with an embodiment, a media content item 320 can be associated with a sample that generally comprises a region 322 of the media content, for example a snippet of a song. The media content item also includes a remainder 324 of the media content that follows the sample. During use, the system can stream/pre-buffer a portion of each of a plurality of media content items, samples, or snippets, at the media device, as determined by the multi-track playback logic 340. - For example, the media streaming logic can stream or pre-buffer a sample by beginning streaming at the beginning 342 of the sample region. Subsequently, a user can make a selection of a particular media content item 350. When the sample has completed playback, the media streaming logic can immediately continue streaming at the beginning 352 of the remainder region, to append and stream or pre-buffer the remainder of the selected
media content item 354. - From the user's perspective, playback continues or flows seamlessly from the end of the previously-played sample or snippet, into the remainder of the media content item. Depending on the particular implementation, if the sample is relatively long (e.g., 90 seconds) and is located near the end of the song, the device can simply play the sample to its end, at which point a next song can be chosen according to one or more shuffle rules, a playlist, or other means.
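- By way of illustration only, the appending of the remainder region to a previously-played sample can be sketched as follows (Python; offsets are in seconds, and the function name and error handling are illustrative assumptions):

```python
def playback_plan(sample_start, sample_end, track_length):
    """Return the ordered (start, end) regions to stream so that playback
    flows from the end of the sample into the remainder of the track."""
    if not (0 <= sample_start < sample_end <= track_length):
        raise ValueError("sample region must lie within the track")
    plan = [(sample_start, sample_end)]          # the pre-buffered snippet
    if sample_end < track_length:
        plan.append((sample_end, track_length))  # appended remainder
    # If the sample already reaches the end of the track, there is no
    # remainder to append; the caller can move on per shuffle/playlist rules.
    return plan
```

For a 200-second song with a sample spanning 30-120 seconds, the plan streams the sample region and then continues seamlessly into the remainder.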
- In accordance with an embodiment, when the system appends media content to a previously-played sample or snippet, the system can first begin playback of pre-buffered content associated with that media content item, and, upon determining that a selected point or region remains within a particular region of the visual array for a period of time, subsequently begin playback of associated media content items.
- In accordance with another embodiment, if the user's finger is lifted from the user interface, the sample or snippet can stop playing, and an original item, e.g., a song, can return to being played. If the user immediately taps the same location, the device can play the last-selected media content from its beginning.
- The examples provided above of various user interaction techniques are provided for purposes of illustration. In accordance with other embodiments, or to address the needs of other implementations, other types of user interaction techniques can be supported.
- In accordance with an embodiment, additional features can be provided that improve user interaction, for example the use of audible notifications (referred to herein in some embodiments as an audio spinner).
- As illustrated above, in accordance with an embodiment, a media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements. Each media option can be associated with one or more media content items that can be streamed to and/or played on the device. The system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- As further illustrated above, in accordance with an embodiment, a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
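- By way of illustration only, such prebuffering of a portion of each visible media option can be sketched as follows (Python; the 10-second snippet length, worker count, and `fetch_snippet` callable are illustrative assumptions standing in for the actual media streaming logic):

```python
from concurrent.futures import ThreadPoolExecutor

SNIPPET_SECONDS = 10  # illustrative length of the pre-buffered portion

def prebuffer_visible(option_ids, fetch_snippet, workers=4):
    """Pre-buffer a portion of each visible media option in parallel,
    while the visual array itself is being prepared for display.

    `fetch_snippet(option_id, seconds)` stands in for whatever call
    retrieves partial media data from the media server."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {oid: pool.submit(fetch_snippet, oid, SNIPPET_SECONDS)
                   for oid in option_ids}
    # Leaving the executor context waits for all fetches to complete.
    return {oid: f.result() for oid, f in futures.items()}
```

Because the fetches run concurrently with interface preparation, the first touch preview can appear, from the user's perception, immediate.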
- When such an interface is used, there may be a short period of delay before the portion of each media content item, sample, or snippet, which is needed to populate the visual array, is actually received and loaded at the media device and the touch preview can begin. The overall period of delay may be affected by factors such as the network or cellular bandwidth available for data transfer to the device; the type of media content being transferred (for example, music versus high-definition video); and the complexity of the visual array (for example, a large two-dimensional grid of card elements as displayed on a tablet-type device may include many more options than a short list of card elements as displayed on a smartphone-type device, and correspondingly require much more data to be transferred to prebuffer and prepare the interface for use).
- Additionally, since the controlled playback and crossfading techniques described above also lend themselves to use cases in which the user might not be watching the interface (for example, they may be casually perusing a list of song media options while listening to the crossfaded output, to determine a song they might like to hear), some feedback to the user reflecting that something is happening (for example, that media content is currently being transferred to their device to populate the visual array) is useful. Even a short period of delay or hesitation in playing a particular media content item can be irritating to some users, or lead them to believe their device or application is not operating correctly.
- In accordance with an embodiment, the system can be adapted so that, when the period of delay exceeds a maximum (acceptable) delay time (for example, 500 ms), then an audible notification, such as a small generic sound (e.g., bibodibeep . . . ) can be played for a period of time (for example, 700 ms), and subsequently crossfaded with (into) the actual portion of media content, sample, or snippet, using the controlled playback and crossfading techniques described above. The use of such audible notification can provide reassurance that something is happening, and an overall more enjoyable user experience, particularly when combined with the crossfading techniques described above.
-
FIG. 13 illustrates a system which includes support for audible notifications, in accordance with an embodiment. As shown in FIG. 13, and as described above, in accordance with an embodiment, a media device or player, for example a computing system, handheld entertainment device, smartphone, or other type of media device capable of playing media content, can be provided for use in playing media content provided by a media server, or another system or peer device. A media stream service can be used to buffer media content, for streaming to one or more streams; while a media streaming logic can be used to retrieve or otherwise access media content items, and/or samples associated with the media content items, in response to requests from media devices or other systems, and populate the media stream service with streams of corresponding media content data that can be returned to the requesting device. - As further described above, in accordance with an embodiment, the media device can have a media playback application and include a touch-sensitive user interface adapted to display a visual array of media options. A prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate. Selecting a particular media option within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- In accordance with an embodiment, the media playback application can include an
audible notification logic 450, which is adapted to provide audible notifications. In accordance with an embodiment, the audible notification logic can perform a delay determination 452 that is reflective of an actual or anticipated delay prior to playback of the associated media content item or sample. If the actual or anticipated delay prior to playback of an associated media content item or sample is determined to be greater than a maximum (acceptable) delay time, then an audible notification functionality is invoked, to play an audible notification (audio spinner) 458, while prebuffering associated media content items or samples. Otherwise the process can continue to prebuffer and play the associated media content items, without playing the audible notification. - By way of example, in accordance with an embodiment, the maximum (acceptable) delay time can be configured to be 500 ms, and the playing duration of the audible notification can be configured to be 700 ms. However, in accordance with other embodiments, other values can be used to address the needs of a particular implementation or use case.
- Depending on the particular embodiment or implementation, the system can be configured to wait before determining that the actual or anticipated delay prior to playback of the associated media content item or sample is greater than the maximum (acceptable) delay time, and then invoking its audible notification functionality.
- Alternatively, in accordance with an embodiment, the system can be configured to presume that the actual or anticipated delay prior to playback of an associated media content item or sample may be greater than the maximum (acceptable) delay time, and invoke its audible notification functionality regardless, in which case the audible notification may be initiated, and then cross-faded almost immediately and imperceptibly with the associated media content item or sample.
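- By way of illustration only, the audible notification (audio spinner) decision described above, including the presumptive variant, can be sketched as follows (Python; the action tuples and function name are illustrative assumptions; the 500 ms and 700 ms values are the example values given above):

```python
MAX_DELAY_MS = 500  # maximum acceptable delay before audible feedback
SPINNER_MS = 700    # playing duration of the audible notification

def plan_playback(expected_delay_ms, presume_slow=False):
    """Decide whether to play the audio spinner before the content.

    `expected_delay_ms` is an estimate of the prebuffering delay; with
    `presume_slow` the spinner is started unconditionally and crossfaded
    away as soon as the content is ready."""
    if presume_slow or expected_delay_ms > MAX_DELAY_MS:
        # Play the spinner, then crossfade it into the actual content.
        return [("spinner", SPINNER_MS), ("crossfade_to_content", 0)]
    # Delay is acceptable: play the content directly, no notification.
    return [("content", 0)]
```

In the presumptive mode, a fast network simply causes the spinner to be crossfaded almost immediately and imperceptibly into the media content.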
-
FIG. 14 illustrates the use of audible notifications, in accordance with an embodiment. As shown in FIG. 14, in accordance with an embodiment, a user interface can display a visual array of media options arranged as a two-dimensional grid, with rows and columns of media options visualized as tile-like card elements, wherein each of the media options is associated with one or more media content items that can be played on the device. - The example illustrated in
FIG. 14 includes an array of media options similar to that previously illustrated in FIGS. 10-11. As described above, for purposes of illustration, each of the media options can also be associated with a status that reflects, from the user's perspective, whether that particular option's associated media content item is playing or not, and, if its associated media content item is playing, then whether other media content items are being cross-faded or played at the same time. - In accordance with an embodiment, a user can select a particular point or region 460. If the system determines that the actual or anticipated delay prior to playback of an associated media content item or sample is greater than a maximum (acceptable) delay time, then an audible notification functionality is invoked, to play an audible notification (audio spinner) 462, while prebuffering associated media content items or samples.
- For example, as shown in
FIG. 14 , when the user moves the selected point or region from media option A1 (202) to media option A2 (203), the system can determine whether the actual or anticipated delay prior to playback of media option A2 is greater than a maximum (acceptable) delay time. -
FIG. 15 further illustrates the use of audible notifications, in accordance with an embodiment. As shown in FIG. 15, if the actual or anticipated delay prior to playback of media option A2, or a sample associated therewith, is determined to be greater than the maximum (acceptable) delay time (for example, 500 ms), then the audible notification can be played for a period of time (for example, 700 ms), and subsequently crossfaded with (into) the playing of media option A2, or a sample associated with media option A2, using the controlled playback and crossfading techniques described above. - In accordance with an embodiment, once the audible notification has completed, then the media content (in this example media option A2, or some combination of media option A2 with one or more other media options as appropriate) can be played 464. Alternatively, playing of the audible notification can be terminated earlier if the media option is ready for play. If there is no delay in preparing the media content for play, then it can begin playing almost immediately (or be cross-faded almost immediately) from the perspective of the user.
-
FIG. 16 is a flowchart of a method for providing audible notifications, in accordance with an embodiment. - As shown in
FIG. 16, at step 470, a computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device. - At
step 472, an input is received, at the user interface, including a selection of a point or region within the visual array, which the system uses to determine a set of one or more media options proximate to the selected point or region. - At
step 474, playback parameters of the set of one or more media options are adjusted, including crossfading and/or determining a delay prior to start of playback of the associated media content items, or samples thereof. - If, at
step 476, an actual or anticipated delay prior to playback of an associated media content item or sample is determined to be greater than a maximum (acceptable) delay time, then, at step 478, an audible notification functionality is invoked, to play an audible notification while prebuffering associated media content items or samples. Otherwise the process can continue to prebuffer and play the associated media content items or samples, without playing the audible notification. - As described above, depending on the particular embodiment or implementation, the system can be configured to wait before determining that the actual or anticipated delay prior to playback of the associated media content item or sample will be greater than the maximum (acceptable) delay time; or alternatively presume that the actual or anticipated delay prior to playback of an associated media content item or sample may be greater than the maximum (acceptable) delay time, and invoke its audible notification functionality regardless, in which case the audible notification may be initiated, and then cross-faded almost immediately and imperceptibly with the associated media content item or sample.
- At
step 482, the media content items or samples associated with the set of one or more media options are played. - At
step 484, the system can subsequently determine selection of a new point or region within the visual array, and plurality of media options that are proximate to the new point or region, and adjust the playback parameters of media content items proximate the new point or region. - In accordance with an embodiment, additional features can be provided that improve user interaction, for example the use of media caching (referred to herein in some embodiments as audio touch caching).
- As illustrated above, in accordance with an embodiment, a media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements. Each media option can be associated with one or more media content items that can be streamed to and/or played on the device. The system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- As further illustrated above, in accordance with an embodiment, a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
- The display of such visual array of media options necessarily requires the transfer of considerable amounts of data to the device, which as described above depends on factors such as the type of media content being transferred (for example, music versus high-definition video); and the complexity of the visual array (for example, a large two-dimensional grid as displayed on a tablet-type device may include many more options than a short list as displayed on a smartphone-type device, and correspondingly require much more data to be transferred to prebuffer and prepare the interface for use).
- However, such transfer of data can be negatively affected if the device is associated with a limited network or cellular data plan, or must otherwise adhere to restrictions that limit the network or cellular bandwidth available to the device for transfer of media content, or the total amount of data that can be transferred to the device within a particular temporal interval (for example, within a monthly billing cycle). To address this, some manner of intelligently utilizing the available network or cellular bandwidth, or data allowance, associated with a device can provide a more enjoyable user experience.
- In accordance with an embodiment, the system can take into account behavioral characteristics and other settings which imply a user's data plan preferences, and use such information to determine a caching policy by which the data necessary for prebuffering and preparing the interface will be cached at the media device. Examples of caching policies can include cache-all, cache-aggressively, cache-casually, and/or don't-cache policies, each of which can have their own features as described in further detail below. In this manner, the caching policy can be used to optimize the user experience without requiring backend changes, say at the media server, and while respecting the user's data plan.
-
FIG. 17 illustrates a system which includes media caching, in accordance with an embodiment. As shown in FIG. 17, and as described above, in accordance with an embodiment, a media device or player, for example a computing system, handheld entertainment device, smartphone, or other type of media device capable of playing media content, can be provided for use in playing media content provided by a media server, or another system or peer device. A media stream service can be used to buffer media content, for streaming to one or more streams; while a media streaming logic can be used to retrieve or otherwise access media content items, and/or samples associated with the media content items, in response to requests from media devices or other systems, and populate the media stream service with streams of corresponding media content data that can be returned to the requesting device. - As further described above, in accordance with an embodiment, the media device can have a media playback application and include a touch-sensitive user interface adapted to display a visual array of media options. A prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate. Selecting a particular media option within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- In accordance with an embodiment, a
cache logic 500 maintains a cache 502 or repository of media content. The cache logic is adapted to make cache requests 504 to the media server application, to cache media content 506 within the cache, as content 508, which can then be used to support the operation of the visual array or other functionalities as described above, including prebuffering and preparing the interface for use. - For example, the cache logic can determine, based on behavioral characteristics and other settings, whether the user is likely to be using a lower-bandwidth or lower-allowance data plan, versus a higher-bandwidth or higher-allowance data plan, and operate accordingly.
-
FIG. 18 further illustrates a system which includes media caching, in accordance with an embodiment. As shown in FIG. 18, the system can take into account user settings 510, user behavior 512, and cache settings 514, and can additionally receive 516 visible option information 518 describing the currently visible options on the visual array. Based on this information, a policy determination component 520 can determine an appropriate caching policy 522 to be used in making cache requests to the media server. - For example, in accordance with an embodiment, one or more user settings such as a user-specified streaming quality, or synchronization over a 3G network; user behaviors such as use of a number of streams, or online/offline playlists; or cache settings can be used to determine that the user is likely to be using a lower-bandwidth or lower-allowance data plan, or alternatively a higher-bandwidth or higher-allowance data plan.
-
FIG. 19 further illustrates a system which includes media caching, in accordance with an embodiment. As shown in FIG. 19, while the user interacts with the software application, current and/or historical usage information 528, such as the total amount of data transferred to the device within a current or previous temporal interval (for example, within a current or previous monthly billing cycle), can be determined. - In accordance with an embodiment, the cache logic, including its policy determination component, can select 530 from available caching
policies 531, examples of which can include a cache-all policy 532, a cache-aggressive policy 534, a cache-casual policy 536, or a no-caching policy 538, some features of which are described below. However, it will be evident that the policy descriptions are provided by way of example to illustrate various embodiments, and that in accordance with other embodiments or other implementations other types of caching policies can be used. - In accordance with an embodiment, when a cache-all policy is used, the device and/or media application is adapted to cache media content or sample data which is associated with each of the media options currently displayed and visible on the user interface, without any data capping.
- For example, in accordance with an embodiment, the system can determine to use a cache-all policy when the user settings, user behavior and/or cache settings indicate that Wi-Fi is turned on, and for personalized (e.g., “Your Music”) views during a touch session. Since personalized views have relatively slow data dynamics, caching of media options within such views need occur only seldom compared with other views.
- In accordance with an embodiment, when a cache-aggressively policy is used, the device and/or media application is adapted to cache media content or sample data which is associated with each of the media options currently displayed and visible on the user interface, but with a strict, e.g., monthly, caching limit, or a caching limit based on some other temporal factor.
- For example, in accordance with an embodiment, the system can determine to use a cache-aggressively policy when current and/or historical usage information indicates that the user has not cached much data during the current month (for example, less than 10-50 Mb), and/or the user settings suggest that the user does not care that much about data plans (for example, setting a high (e.g., extreme) streaming quality, or allowing synchronization over 3G), and/or the user behavior detected similarly suggests that the user does not care that much about data plans (for example, a lot of streams versus offline mode, or a larger number of bytes streamed monthly).
- In accordance with an embodiment, when a cache-casually policy is used, the device and/or media application is adapted to not cache media content or sample data associated with all of the media options currently displayed and visible on the user interface, but instead a subset of those media options that have higher usage probability per view. For example, in a track list view, users may tend to start previewing the first two tracks in a list, and so these media options have a higher usage probability than others further down the list.
- For example, in accordance with an embodiment, the system can determine to use a cache-casually policy in those situations in which the user has already cached much data during the month, and/or the user settings suggest that the user does care about their data plan (for example, setting a low streaming quality, or not allowing synchronization over 3G), and/or the user behavior detected similarly suggests that the user does care about their data plan (for example, a lot of streams from offline playlists compared to streaming).
- In accordance with an embodiment, when a don't-cache policy is used, the device and/or media application is adapted to not cache anything at all until the user starts a touch session.
- For example, in accordance with an embodiment, the system can determine to use a don't-cache policy when the user has already cached a particular amount of data within a particular temporal interval, for example within a monthly billing cycle.
- The above are provided by way of example to illustrate various caching policies. It will be evident that in various embodiments different caching policies can be used, in addition to different rules for determining, based on user settings, user behavior, and cache settings and/or visible option information, which particular caching policy may best suit a particular situation or session.
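- By way of illustration only, the selection among such caching policies can be sketched as follows (Python; the signal names, thresholds, and 100 Mb monthly cap are illustrative assumptions, and a real implementation would weigh user settings and behavior in more detail):

```python
def choose_policy(wifi, personalized_view, month_cached_mb,
                  cares_about_data, monthly_cap_mb=100):
    """Pick a caching policy from the signals described above."""
    if month_cached_mb >= monthly_cap_mb:
        return "dont-cache"          # monthly allowance already exhausted
    if wifi or personalized_view:
        return "cache-all"           # no data capping on Wi-Fi / "Your Music"
    if not cares_about_data and month_cached_mb < 50:
        return "cache-aggressively"  # cache all visible options, capped monthly
    return "cache-casually"          # cache only high-probability options
```

The same decision can then be revisited as the touch session progresses and current usage information changes.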
-
FIG. 20 is a flowchart of a method for providing media caching, in accordance with an embodiment. - As shown in
FIG. 20, at step 540, a computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device. - At
step 541, the user interface is prepared for use during a user session, which allows selection of points or regions within the visual array, and determination of media options proximate to the selected points or regions. - At
step 542, one or more cache settings, user settings, user behavior, and/or visible media option information for media options displayed on the user interface, are determined. - At
step 543, the system can determine, from within a plurality of caching policies, based on current usage information and/or historical usage information, a particular caching policy to be used for the user session. - At
step 544, cache requests can then be communicated by the media playback application to a media server, to retrieve media content to be cached at the computer system or device for subsequent display at the user interface, in accordance with the determined caching policy. - In accordance with an embodiment, additional features can be provided that improve user interaction, for example the use of touch menus (referred to herein in some embodiments as audio touch menus). Embodiments of the system can be used to support a “What can I preview?” environment, in which media content items can be previewable from different presentation formats, so that a user can easily browse and move between, for example, a music album and a playlist, and then on to an individual song, or to a radio station, or other source of media content or presentation format thereof.
- As illustrated above, in accordance with an embodiment, a media device having a media playback application and including a touch-sensitive user interface can be adapted to display a visual array of media options, for example as a grid or list of card elements. Each media option can be associated with one or more media content items that can be streamed to and/or played on the device. The system can determine a selected card element, or media options that are proximate to a selected point or region of the visual array, and play or crossfade media content as appropriate.
- As further illustrated above, in accordance with an embodiment, a prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate.
- However, such environment is not restricted to displaying a static set of media content, or one purely determined by, e.g., the media server. For example, a user may want to select various media content items (e.g., songs) and add them to a playlist and/or to a collection in different ways.
- In accordance with an embodiment, a touch logic allows a user to define and/or select both client-side playlists and server-side playlists, so that a selected playlist can be associated with and viewed or played using a touch menu. For example, upon selection of a playlist media content item, a list of media options within that playlist can be displayed. The user can continue to select media options from the playlist, including the use of cross-fading, audible notification (audio spinner), or other features as described above.
-
FIG. 21 illustrates a system which includes touch menus, in accordance with an embodiment. As shown in FIG. 21, and as described above, in accordance with an embodiment, a media device or player, for example a computing system, handheld entertainment device, smartphone, or other type of media device capable of playing media content, can be provided for use in playing media content provided by a media server, or another system or peer device. A media stream service can be used to buffer media content, for streaming to one or more streams; while a media streaming logic can be used to retrieve or otherwise access media content items, and/or samples associated with the media content items, in response to requests from media devices or other systems, and populate the media stream service with streams of corresponding media content data that can be returned to the requesting device. - As further described above, in accordance with an embodiment, the media device can include a media playback application and touch-sensitive user interface adapted to display a visual array of media options. A prebuffering logic can enable a portion of each media content item, sample, or snippet, to be pre-buffered at the media device, as determined by a multi-track playback logic, so that, while media options are being prepared for display, their related media content can be pre-buffered at the same time, allowing for a playback experience that, from the user's perception, seems immediate. Selecting a particular media option within the visual array can be used as a request or instruction to the media server application to stream or otherwise return a corresponding particular item of media content.
- In accordance with an embodiment, a
touch logic 545 allows a user to define and/or select both client-side playlists 546 and server-side playlists 547. The touch logic enables a selected playlist 548 to be associated with and viewed or played using a touch menu 549, as described in further detail below. -
FIGS. 22A-22B further illustrate a system which includes a touch menu, in accordance with an embodiment. As shown in FIG. 22A, in accordance with an embodiment, the device can display a first screen mode 550 in which a visual array of media options is arranged as a two-dimensional grid, with rows and columns of media options visualized as tile-like card elements, wherein each of the media options is associated with one or more media content items that can be played on the device. - As described above, in accordance with an embodiment, the multi-track playback logic can adjust the playback volume of media content items associated with proximate media options, by crossfading or otherwise combining their playback, to reflect their relative distances from a selected point or region.
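One way to derive such distance-weighted playback volumes can be sketched as below. This is an illustrative computation under assumptions not taken from the patent: options are reduced to (x, y) center points, and gains fall off as the reciprocal of distance from the touch point.

```python
import math

def crossfade_gains(touch_point, option_centers, falloff=1.0):
    """Compute per-option playback gains from the distance between the
    selected point and each media option's center, normalized so the gains
    sum to 1. touch_point and option_centers are (x, y) tuples; the inverse
    falloff curve is an assumed choice, not specified by the source."""
    weights = []
    for cx, cy in option_centers:
        d = math.hypot(touch_point[0] - cx, touch_point[1] - cy)
        # Closer options get higher weight; falloff controls how sharply
        # volume drops with distance.
        weights.append(1.0 / (1.0 + falloff * d))
    total = sum(weights)
    return [w / total for w in weights]
```

Feeding these gains to a mixer would crossfade the proximate options as the finger moves, with the nearest option dominating the output.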
- As further shown in
FIG. 22B, in accordance with an embodiment, when a user makes a selection 551 of a particular media content item, the interface can change to show subcontent or component media content items for the particular media content item, in this example subcontents A-F (553-558 respectively). For example, a user can make a selection of a song album, which causes the system to display as subcontent the songs within that album. As another example, a user can make a selection of a song playlist, which causes the system to display as subcontent the songs within that playlist. -
FIGS. 23A-23B further illustrate a system which includes a touch menu, in accordance with an embodiment. - In accordance with an embodiment, the media options can be associated with playlists. For example, a user can browse a named
playlist 552, and make a selection 554. As shown in FIG. 23B, upon selecting a media option, the touch logic can cause the display 559 of the media options to switch as shown, from a first screen mode as described above to a list screen mode 560. In accordance with an embodiment, the user can continue to select 562 songs from the playlist, including discretely selecting songs, or incorporating the use of crossfade features as described above. -
FIG. 24 further illustrates the use of a touch menu, in accordance with an embodiment. As shown in FIG. 24, in accordance with an embodiment an interface can be displayed 564, including media content items and/or playlists, optionally with cover art 566 for each media content item. -
FIG. 25 further illustrates the use of a touch menu, in accordance with an embodiment. As shown in FIG. 25, optionally an instruction can be received 570 from a user to store one or more selected media options to a playlist, and/or to follow a selected media option. Selected songs can optionally be displayed with additional information for the selected media option 571. For example, the user can toggle on/off a follow button 572, or can swipe 574 a media option to save it to a playlist, in which case the media option can be identified by a suitable indicator 576. -
FIG. 26 further illustrates the use of a touch menu, in accordance with an embodiment. As shown in FIG. 26, an instruction can be received 580 from a user to display playlists, and a list of available playlist options can be displayed, wherein each playlist option is associated with a playlist of one or more media options and media content items. In accordance with an embodiment, the device can display user personal/playlist information 581, optionally with buttons to select playlists 582, songs 583, or albums 584. Upon request from a user, the device can display a plurality of playlists, including in this example a playlist A (590), playlist B (591), playlist C (592), playlist D (593), playlist E (594), playlist F (595), playlist G (596) through playlist N (597), for selection thereof. In the example illustrated in FIG. 26, the user has selected playlist D. -
FIG. 27 further illustrates the use of a touch menu, in accordance with an embodiment. As shown in FIG. 27, continuing the example from above, the user has selected playlist D. In accordance with an embodiment, in response to receiving a selection 600 of a particular playlist option, the system can display a visual array of media options associated with the selected playlist option, wherein each media option is associated with one or more media content items that can be selected for playing. - As illustrated above, in accordance with an embodiment, a card element displayed within the visual array can contain other card elements, with subcontent or component media content items associated therewith. For example, in accordance with an embodiment, when the system determines that a user has selected a card element within the visual array which includes other card elements, it causes the resultant grid to zoom in, with the effect of bringing the user closer to, and further immersing them in, the content, while revealing a selection of what is inside that card element. The user can interact with the contents of the card using the techniques described above, for example by sliding their finger or a cursor over the visual array, changing the audio output and visible representation in a preview-like manner. While the content is zoomed closer to the user, the cover art can be blurred while it scales to fill the entire card element. The grid can then be formed, building up from the position of the user's finger and then outwardly. Grid items can scale and fade in as they appear. While cover art is being pre-fetched, the system can display the custom song placeholder image, and then fade in the cover art when it becomes available.
- In the example illustrated in
FIG. 27, the user's selected playlist D is associated with media options X1-X3 (602-604), and with potentially other media options not currently displayed. Since a visual array of media options associated with the selected playlist option is displayed, the visual array can be scrolled or repositioned as described above, to allow new or additional media options associated with the selected playlist option, to be explored by the user. - In accordance with an embodiment, a cell can include cover art to its left; when long-pressing a cell that represents content, the cover art to the left can scale up and blur to fill the entire cell or a portion of the screen. At the same time, a representative section of what's inside can be revealed within a horizontal visual array or grid, within which the user can then further interact.
-
FIG. 28 further illustrates the use of a touch menu, in accordance with an embodiment. As shown in FIG. 28, continuing the example from above, the user has now moved the selected point or region downwards and has selected playlist E. In response to receiving a selection of a new playlist option 610, the system can display a visual array of media options associated with the selected new playlist option. Additionally, a crossfaded or combined result of the playlists or media options therein can be provided as a played-back media content to the user, using the above techniques and based on the position of the selected point or region. - In the example illustrated in
FIG. 28, the user's selected new playlist E is associated with media options Y1-Y3 (612-614), and with potentially other media options not currently displayed. Again, the visual array can be scrolled or repositioned, to allow new or additional media options associated with the selected playlist option, to be explored by the user. -
FIGS. 29-30 further illustrate the use of a touch menu, in accordance with an embodiment. As shown in FIG. 29, in accordance with an embodiment, while the user interacts with the visual array, the selected option can scale up 615 to fill a portion of the screen, while other array components can be similarly adjusted in their presentation size. As shown in FIG. 30, the visual array can again be scrolled or repositioned, to allow new or additional media options to be explored by the user, including asymmetric adjustment in their presentation size 616 to better reflect the position of the user's finger and guide them in their previewing. -
FIGS. 31-33 further illustrate the use of a touch menu, in accordance with an embodiment. As shown in FIG. 31, in accordance with an embodiment, an instruction can be received 617 from a user to display playlists, and a list of available playlist options can be displayed, wherein each playlist option is associated with a playlist of one or more media options and media content items. As shown in FIG. 32, in accordance with an embodiment, in response to receiving a selection 618 of a particular playlist option, the system can display a visual array of media options associated with the selected playlist option, wherein each media option is associated with one or more media content items that can be selected for playing. As shown in FIG. 33, in accordance with an embodiment, when the user stops browsing a playlist using the touch menu 619, for example by lifting their finger from the interface, the touch menu disappears and the interface returns to its normal appearance with the original playlists displayed. Such an embodiment is useful for environments in which the user is not allowed to select the next song, but instead is allowed to choose only the playlist they have just sampled. -
FIGS. 34-35 further illustrate the use of a touch menu, in accordance with an embodiment. As shown in FIG. 34, in accordance with an embodiment, an instruction can be received 620 from a user to display playlists, and a list of available playlist options can be displayed, wherein each playlist option is associated with a playlist of one or more media options and media content items. As shown in FIG. 35, in accordance with an embodiment, a user can select a next song to be played 621, for example by tapping the preview, or by holding their finger on a media option until a selection symbol (e.g., circle) is rendered. The previewed (and now selected) media content can then be made the active media content (e.g., active song) 622. Alternatively, the user can tap off the preview, i.e., quickly lift and tap on the same spot, to trigger the preview as the current song. Such an embodiment is useful for environments in which the user is allowed to select the next song. -
FIG. 36 is a flowchart of a method for providing a touch menu, in accordance with an embodiment. - As shown in
FIG. 36, at step 623, a computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device. - At
step 624, an input is received from a user, at the user interface, including a selection of a card element, point or region within the visual array corresponding to a selected media option. - At
step 626, the system can determine one or more media options associated with or proximate to the selected card element, point or region, for use in adjusting playback of those media options. - At
step 632, the system can display a visual array of additional media options associated with the selected media options, wherein each media option in the visual array of additional media options is similarly associated with one or more media content items that can be selected for playing. - At
step 634, in response to receiving a selection from a user, at the user interface, of a new media option, display a revised visual array of media options that are associated with the selected new media option. -
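The client-side steps above (steps 623-634) can be condensed into a sketch like the following. It is an illustration under simplifying assumptions not found in the patent: the visual array is reduced to a single row, "proximate" means adjacent within that row, and `grid` is a hypothetical mapping from each option to its subcontent.

```python
def handle_touch_selection(grid, selected_index, proximity=1):
    """Given a selection within a one-dimensional row of the visual array,
    return the selected media option, the proximate options whose playback
    should be adjusted (step 626), and the subcontent to display next
    (step 632). grid maps option id -> list of subcontent ids."""
    options = list(grid)  # insertion order of the displayed options
    selected = options[selected_index]
    lo = max(0, selected_index - proximity)
    hi = min(len(options), selected_index + proximity + 1)
    proximate = [o for o in options[lo:hi] if o != selected]
    return {
        "selected": selected,
        "adjust_playback": proximate,
        "display_subcontent": grid[selected],
    }
```

Step 634 would then simply re-run this handler against the revised visual array whenever a new media option is selected.
-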
FIG. 37 is a flowchart of a method for use by a server in providing a touch menu, in accordance with an embodiment. - As shown in
FIG. 37, at step 640, a plurality of media content items, and/or samples associated with the media content items that can be streamed to and/or played on a client computer system or device, are provided at a media server, within a database or repository. - At
step 642, input is received from a client computer system or device having a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more of the media content items, including a selection of a card element, point or region within the visual array corresponding to a selected media option. - At
step 644, one or more media options associated with or proximate to the selected card element, point or region, are determined, for use in adjusting playback of those media options. - At
step 646, media content associated with the one or more media options associated with or proximate to the selected card element, is streamed to the client computer system or device, for use by the client computer system or device in displaying a visual array of additional media options associated with the selected media options, wherein each media option in the visual array of additional media options is similarly associated with one or more media content items that can be selected for playing. -
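The server side (steps 640-646) can be sketched as below. All names here are illustrative assumptions: `repository` stands in for the media server's database of items and samples, and the client is assumed to report both its selection and the proximate options it computed.

```python
def serve_touch_selection(repository, selection, proximate):
    """Given a selected option id and the proximate option ids reported by
    the client (steps 642-644), return the media streams to send back
    (step 646). repository maps option id -> media sample bytes; unknown
    ids are simply skipped."""
    wanted = [selection] + list(proximate)
    return {oid: repository[oid] for oid in wanted if oid in repository}
```

A real media server would return buffered streams rather than whole samples, but the selection logic is the same: the response covers the selected option plus its neighbors, so the client can crossfade between them locally.
-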
FIG. 38 is a flowchart of a method for use by a client device in providing a touch menu, in accordance with an embodiment. - As shown in
FIG. 38, at step 650, a client computer system or device is provided with a media playback application and user interface which enables display of a visual array of media options, wherein each media option is associated with one or more media content items that can be streamed to and/or played on the computer system or device. - At
step 652, input is received from a user, at the user interface, including a selection of a card element, point or region within the visual array corresponding to a selected media option. - At
step 654, the selection of a card element, point or region within the visual array corresponding to a selected media option, is communicated to a media server, for use by the server in determining one or more media options associated with or proximate to the selected card element, point or region, for use in adjusting playback of those media options. - At
step 656, streamed media content associated with the one or more media options associated with or proximate to the selected card element, is received from the media server. - At
step 658, a visual array of additional media options associated with the selected media options is displayed, wherein each media option in the visual array of additional media options is similarly associated with one or more media content items that can be selected for playing. - In accordance with an embodiment, the system can include support for force-sensitive touch input in selection, playback, or other interaction with media options. A media device can include an interface that responds differently to variations in input pressure exerted by a user, wherein the device is configured so that an amount of pressure applied by the user in touch-selecting a particular media option or other input region can be used to affect the operation of the device, interface or menu options; or the selection, playback, or other interaction with an associated media content item.
- For example, in accordance with an embodiment, a less-forceful or light touch by the user upon a particular media option can be used to cause the system to provide an audio preview of an associated song; while a more-forceful, hard, or firm touch by the user can cause the system to switch to playing that song.
-
FIGS. 39A-39C illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment. - As described above, in accordance with various embodiments, the system can be configured so that, if it determines a media content item has been selected during preview, the system can, for example, play the remainder of that media content item to its end, or otherwise allow the previewed media content to be made the active media content (e.g., active song).
- As shown in
FIGS. 39A-39C, in accordance with an embodiment, with support for force-sensitive touch input, a user interface 700 can display media options, and can respond to variations in the amount of pressure applied by the user in touch-selecting a particular media option or other input region, such that a user can use a light touch 704 on a media option to cause the system to provide a preview, e.g., an audio preview of a song 706. If the user continues to use a light touch, they can similarly continue to preview other media options. -
FIGS. 40A-40C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment. As shown in FIGS. 40A-40C, if, upon previewing a song, the user instead uses a firm touch 708, then upon the system detecting the firm touch, the previewed media content can be selected for play as an active media content (e.g., song) 710. Optionally, the user interface can be updated to indicate the selection. -
FIGS. 41A-41C further illustrate the use of a touch menu including support for force-sensitive touch input, in accordance with an embodiment. As shown in FIGS. 41A-41C, alternatively, upon the system detecting a firm touch, a menu or information can be displayed for the previewed or selected media content 712, for example as a menu 714 of options to add the selected media content to a playlist or queue, or to provide other information or options. - Embodiments of the present invention can be conveniently implemented using one or more conventional general-purpose or specialized digital computers, computing devices, machines, or microprocessors, including one or more processors, memory and/or computer readable storage media programmed according to the teachings of the present disclosure. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art.
- In some embodiments, the present invention includes a computer program product which is a non-transitory storage medium or computer readable medium (media) having instructions stored thereon/in which can be used to program a computer to perform any of the processes of the present invention. Examples of the storage medium can include, but are not limited to, any type of disk including floppy disks, optical discs, DVD, CD-ROMs, microdrive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data.
- The foregoing description of embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations will be apparent to the practitioner skilled in the art.
- For example, although the above examples generally describe the providing of music media content, such as songs, and the use of song cover art as a visualization to be used with the media options, the systems, methods and techniques described herein can be used with other forms of media content, including but not limited to video media content.
- The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention for various embodiments and with various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/289,689 US20170024093A1 (en) | 2014-03-28 | 2016-10-10 | System and method for playback of media content with audio touch menu functionality |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/228,605 US20150277707A1 (en) | 2014-03-28 | 2014-03-28 | System and method for multi-track playback of media content |
US201462062573P | 2014-10-10 | 2014-10-10 | |
US201462062582P | 2014-10-10 | 2014-10-10 | |
US201462062580P | 2014-10-10 | 2014-10-10 | |
US201562217767P | 2015-09-11 | 2015-09-11 | |
US14/879,774 US9489113B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with audio touch menu functionality |
US15/289,689 US20170024093A1 (en) | 2014-03-28 | 2016-10-10 | System and method for playback of media content with audio touch menu functionality |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/879,774 Continuation US9489113B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with audio touch menu functionality |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170024093A1 true US20170024093A1 (en) | 2017-01-26 |
Family
ID=55655455
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/879,737 Active US9423998B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with audio spinner functionality |
US14/879,774 Active US9489113B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with audio touch menu functionality |
US14/879,743 Active US9483166B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with support for audio touch caching |
US15/289,684 Abandoned US20170024092A1 (en) | 2014-03-28 | 2016-10-10 | System and method for playback of media content with support for audio touch caching |
US15/289,689 Abandoned US20170024093A1 (en) | 2014-03-28 | 2016-10-10 | System and method for playback of media content with audio touch menu functionality |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/879,737 Active US9423998B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with audio spinner functionality |
US14/879,774 Active US9489113B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with audio touch menu functionality |
US14/879,743 Active US9483166B2 (en) | 2014-03-28 | 2015-10-09 | System and method for playback of media content with support for audio touch caching |
US15/289,684 Abandoned US20170024092A1 (en) | 2014-03-28 | 2016-10-10 | System and method for playback of media content with support for audio touch caching |
Country Status (1)
Country | Link |
---|---|
US (5) | US9423998B2 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10467998B2 (en) | 2015-09-29 | 2019-11-05 | Amper Music, Inc. | Automated music composition and generation system for spotting digital media objects and event markers using emotion-type, style-type, timing-type and accent-type musical experience descriptors that characterize the digital music to be automatically composed and generated by the system |
US10854180B2 (en) | 2015-09-29 | 2020-12-01 | Amper Music, Inc. | Method of and system for controlling the qualities of musical energy embodied in and expressed by digital music to be automatically composed and generated by an automated music composition and generation engine |
CN112104689A (en) * | 2019-06-18 | 2020-12-18 | 明日基金知识产权控股有限公司 | Location-based application activation |
CN112104595A (en) * | 2019-06-18 | 2020-12-18 | 明日基金知识产权控股有限公司 | Location-based application flow activation |
US20200401281A1 (en) * | 2018-03-01 | 2020-12-24 | Huawei Technologies Co., Ltd. | Information Display Method, Graphical User Interface, and Terminal |
US10964299B1 (en) | 2019-10-15 | 2021-03-30 | Shutterstock, Inc. | Method of and system for automatically generating digital performances of music compositions using notes selected from virtual musical instruments based on the music-theoretic states of the music compositions |
US11024275B2 (en) | 2019-10-15 | 2021-06-01 | Shutterstock, Inc. | Method of digitally performing a music composition using virtual musical instruments having performance logic executing within a virtual musical instrument (VMI) library management system |
US11037538B2 (en) | 2019-10-15 | 2021-06-15 | Shutterstock, Inc. | Method of and system for automated musical arrangement and musical instrument performance style transformation supported within an automated music performance system |
US11270513B2 (en) | 2019-06-18 | 2022-03-08 | The Calany Holding S. À R.L. | System and method for attaching applications and interactions to static objects |
US11341727B2 (en) | 2019-06-18 | 2022-05-24 | The Calany Holding S. À R.L. | Location-based platform for multiple 3D engines for delivering location-based 3D content to a user |
US11379180B2 (en) * | 2018-09-04 | 2022-07-05 | Beijing Dajia Internet Information Technology Co., Ltd | Method and device for playing voice, electronic device, and storage medium |
US11455777B2 (en) | 2019-06-18 | 2022-09-27 | The Calany Holding S. À R.L. | System and method for virtually attaching applications to and enabling interactions with dynamic objects |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150277707A1 (en) * | 2014-03-28 | 2015-10-01 | Spotify Ab | System and method for multi-track playback of media content |
US9423998B2 (en) | 2014-03-28 | 2016-08-23 | Spotify Ab | System and method for playback of media content with audio spinner functionality |
US20150293681A1 (en) * | 2014-04-09 | 2015-10-15 | Google Inc. | Methods, systems, and media for providing a media interface with multiple control interfaces |
US20150356877A1 (en) * | 2014-06-06 | 2015-12-10 | Catherine Ann Downey | Visual organization architecture system |
CN113821143A (en) * | 2014-06-24 | 2021-12-21 | 苹果公司 | Music playing user interface |
US9606620B2 (en) | 2015-05-19 | 2017-03-28 | Spotify Ab | Multi-track playback of media content during repetitive motion activities |
US10193943B2 (en) * | 2015-11-09 | 2019-01-29 | T-Mobile Usa, Inc. | Data-plan-based quality setting suggestions and use thereof to manage content provider services |
US10305952B2 (en) * | 2015-11-09 | 2019-05-28 | T-Mobile Usa, Inc. | Preference-aware content streaming |
US10372410B2 (en) * | 2015-12-21 | 2019-08-06 | Facebook, Inc. | Systems and methods to optimize music play in a scrolling news feed |
US10728152B2 (en) | 2016-02-08 | 2020-07-28 | T-Mobile Usa, Inc. | Dynamic network rate control |
US9798514B2 (en) | 2016-03-09 | 2017-10-24 | Spotify Ab | System and method for color beat display in a media content environment |
US10747423B2 (en) * | 2016-12-31 | 2020-08-18 | Spotify Ab | User interface for media content playback |
US10489106B2 (en) | 2016-12-31 | 2019-11-26 | Spotify Ab | Media content playback during travel |
US11514098B2 (en) | 2016-12-31 | 2022-11-29 | Spotify Ab | Playlist trailers for media content playback during travel |
US10691329B2 (en) * | 2017-06-19 | 2020-06-23 | Simple Design Ltd. | User interface of media player application for controlling media content display |
JP7026900B2 (en) * | 2017-10-16 | 2022-03-01 | 株式会社スマイルテレビ | Content distribution system and content distribution method |
CN110149539A (en) * | 2019-05-21 | 2019-08-20 | 北京字节跳动网络技术有限公司 | Method for broadcasting multimedia file, device, electronic equipment and storage medium |
US11210057B2 (en) * | 2020-01-10 | 2021-12-28 | James Matthew Gielarowski | Multi-User Media Player GUI |
WO2023090831A1 (en) * | 2021-11-18 | 2023-05-25 | 주식회사버시스 | Electronic device for providing sound on basis of user input and method for operating same |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040261040A1 (en) * | 2003-06-23 | 2004-12-23 | Microsoft Corporation | Method and apparatus for media access control |
US20070192739A1 (en) * | 2005-12-02 | 2007-08-16 | Hillcrest Laboratories, Inc. | Scene transitions in a zoomable user interface using a zoomable markup language |
US20080086687A1 (en) * | 2006-10-06 | 2008-04-10 | Ryutaro Sakai | Graphical User Interface For Audio-Visual Browsing |
US20090249222A1 (en) * | 2008-03-25 | 2009-10-01 | Square Products Corporation | System and method for simultaneous media presentation |
US20100262938A1 (en) * | 2009-04-10 | 2010-10-14 | Rovi Technologies Corporation | Systems and methods for generating a media guidance application with multiple perspective views |
US20110035705A1 (en) * | 2009-08-05 | 2011-02-10 | Robert Bosch Gmbh | Entertainment media visualization and interaction method |
US20110234480A1 (en) * | 2010-03-23 | 2011-09-29 | Apple Inc. | Audio preview of music |
US20130113737A1 (en) * | 2011-11-08 | 2013-05-09 | Sony Corporation | Information processing device, information processing method, and computer program |
US20130227463A1 (en) * | 2012-02-24 | 2013-08-29 | Research In Motion Limited | Electronic device including touch-sensitive display and method of controlling same |
US20150062052A1 (en) * | 2012-05-09 | 2015-03-05 | Apple Inc. | Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture |
US20150334204A1 (en) * | 2014-05-15 | 2015-11-19 | Google Inc. | Intelligent auto-caching of media |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6034689A (en) | 1996-06-03 | 2000-03-07 | Webtv Networks, Inc. | Web browser allowing navigation between hypertext objects using remote control |
US6654367B1 (en) | 1998-08-19 | 2003-11-25 | Lucent Technologies Inc. | Internet audio appliance |
US20010030660A1 (en) | 1999-12-10 | 2001-10-18 | Roustem Zainoulline | Interactive graphical user interface and method for previewing media products |
US7346698B2 (en) | 2000-12-20 | 2008-03-18 | G. W. Hannaway & Associates | Webcasting method and system for time-based synchronization of multiple, independent media streams |
US7631088B2 (en) | 2001-02-27 | 2009-12-08 | Jonathan Logan | System and method for minimizing perceived dead air time in internet streaming media delivery |
WO2002078328A1 (en) | 2001-03-26 | 2002-10-03 | Fujitsu Limited | Multi-channel information processor |
US8453175B2 (en) | 2003-05-29 | 2013-05-28 | Eat.Tv, Llc | System for presentation of multimedia content |
US7571014B1 (en) | 2004-04-01 | 2009-08-04 | Sonos, Inc. | Method and apparatus for controlling multimedia players in a multi-zone system |
US8028323B2 (en) | 2004-05-05 | 2011-09-27 | Dryden Enterprises, Llc | Method and system for employing a first device to direct a networked audio device to obtain a media item |
US8682722B1 (en) * | 2005-01-28 | 2014-03-25 | Advertising.Com Llc | Controlling user experience |
US7509593B2 (en) | 2005-05-12 | 2009-03-24 | Microsoft Corporation | Mouse sound volume control |
US7613736B2 (en) | 2005-05-23 | 2009-11-03 | Resonance Media Services, Inc. | Sharing music essence in a recommendation system |
US7730405B2 (en) | 2005-12-07 | 2010-06-01 | Iac Search & Media, Inc. | Method and system to present video content |
US9148628B2 (en) | 2007-08-16 | 2015-09-29 | Yahoo! Inc. | Intelligent media buffering based on input focus proximity |
US20090193465A1 (en) | 2008-01-25 | 2009-07-30 | Sony Corporation | Expanded playlist for tv video player |
US8903525B2 (en) | 2010-09-28 | 2014-12-02 | Sony Corporation | Sound processing device, sound data selecting method and sound data selecting program |
US20120105367A1 (en) * | 2010-11-01 | 2012-05-03 | Impress Inc. | Methods of using tactile force sensing for intuitive user interface |
CN103348312A (en) | 2010-12-02 | 2013-10-09 | 戴斯帕克有限公司 | Systems, devices and methods for streaming multiple different media content in a digital container |
US20120311444A1 (en) | 2011-06-05 | 2012-12-06 | Apple Inc. | Portable multifunction device, method, and graphical user interface for controlling media playback using gestures |
US9549012B2 (en) | 2011-07-14 | 2017-01-17 | Sirius Xm Radio Inc. | Content caching services in satellite and satellite/IP content delivery systems content caching |
US10706096B2 (en) | 2011-08-18 | 2020-07-07 | Apple Inc. | Management of local and remote media items |
US9021355B2 (en) | 2011-09-08 | 2015-04-28 | Imagine Communications Corp. | Graphical user interface to facilitate managing media operations |
SG11201401773XA (en) | 2011-10-24 | 2014-08-28 | Omnifone Ltd | Method, system and computer program product for navigating digital media content |
JP2013117869A (en) | 2011-12-02 | 2013-06-13 | Sony Corp | Display control device, display control method, and program |
US8855798B2 (en) * | 2012-01-06 | 2014-10-07 | Gracenote, Inc. | User interface to media files |
US9547437B2 (en) | 2012-07-31 | 2017-01-17 | Apple Inc. | Method and system for scanning preview of digital media |
US20140123006A1 (en) | 2012-10-25 | 2014-05-01 | Apple Inc. | User interface for streaming media stations with flexible station creation |
US9843607B2 (en) | 2012-11-01 | 2017-12-12 | Blackberry Limited | System and method of transferring control of media playback between electronic devices |
US9002991B2 (en) | 2013-04-06 | 2015-04-07 | Miranda Technologies Partnership | System and methods for cloud-based media play out |
US9495076B2 (en) | 2013-05-29 | 2016-11-15 | Sonos, Inc. | Playlist modification |
US10715973B2 (en) | 2013-05-29 | 2020-07-14 | Sonos, Inc. | Playback queue control transition |
US9423998B2 (en) | 2014-03-28 | 2016-08-23 | Spotify Ab | System and method for playback of media content with audio spinner functionality |
US10462505B2 (en) | 2014-07-14 | 2019-10-29 | Sonos, Inc. | Policies for media playback |
2015
- 2015-10-09 US US14/879,737 patent/US9423998B2/en active Active
- 2015-10-09 US US14/879,774 patent/US9489113B2/en active Active
- 2015-10-09 US US14/879,743 patent/US9483166B2/en active Active
2016
- 2016-10-10 US US15/289,684 patent/US20170024092A1/en not_active Abandoned
- 2016-10-10 US US15/289,689 patent/US20170024093A1/en not_active Abandoned
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040261040A1 (en) * | 2003-06-23 | 2004-12-23 | Microsoft Corporation | Method and apparatus for media access control |
US20070192739A1 (en) * | 2005-12-02 | 2007-08-16 | Hillcrest Laboratories, Inc. | Scene transitions in a zoomable user interface using a zoomable markup language |
US20080086687A1 (en) * | 2006-10-06 | 2008-04-10 | Ryutaro Sakai | Graphical User Interface For Audio-Visual Browsing |
US20090249222A1 (en) * | 2008-03-25 | 2009-10-01 | Square Products Corporation | System and method for simultaneous media presentation |
US20100262938A1 (en) * | 2009-04-10 | 2010-10-14 | Rovi Technologies Corporation | Systems and methods for generating a media guidance application with multiple perspective views |
US20110035705A1 (en) * | 2009-08-05 | 2011-02-10 | Robert Bosch Gmbh | Entertainment media visualization and interaction method |
US20110234480A1 (en) * | 2010-03-23 | 2011-09-29 | Apple Inc. | Audio preview of music |
US20130113737A1 (en) * | 2011-11-08 | 2013-05-09 | Sony Corporation | Information processing device, information processing method, and computer program |
US20130227463A1 (en) * | 2012-02-24 | 2013-08-29 | Research In Motion Limited | Electronic device including touch-sensitive display and method of controlling same |
US20150062052A1 (en) * | 2012-05-09 | 2015-03-05 | Apple Inc. | Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture |
US20150334204A1 (en) * | 2014-05-15 | 2015-11-19 | Google Inc. | Intelligent auto-caching of media |
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11037539B2 (en) | 2015-09-29 | 2021-06-15 | Shutterstock, Inc. | Autonomous music composition and performance system employing real-time analysis of a musical performance to automatically compose and perform music to accompany the musical performance |
US11037540B2 (en) | 2015-09-29 | 2021-06-15 | Shutterstock, Inc. | Automated music composition and generation systems, engines and methods employing parameter mapping configurations to enable automated music composition and generation |
US10467998B2 (en) | 2015-09-29 | 2019-11-05 | Amper Music, Inc. | Automated music composition and generation system for spotting digital media objects and event markers using emotion-type, style-type, timing-type and accent-type musical experience descriptors that characterize the digital music to be automatically composed and generated by the system |
US12039959B2 (en) | 2015-09-29 | 2024-07-16 | Shutterstock, Inc. | Automated music composition and generation system employing virtual musical instrument libraries for producing notes contained in the digital pieces of automatically composed music |
US11776518B2 (en) | 2015-09-29 | 2023-10-03 | Shutterstock, Inc. | Automated music composition and generation system employing virtual musical instrument libraries for producing notes contained in the digital pieces of automatically composed music |
US11430418B2 (en) | 2015-09-29 | 2022-08-30 | Shutterstock, Inc. | Automatically managing the musical tastes and preferences of system users based on user feedback and autonomous analysis of music automatically composed and generated by an automated music composition and generation system |
US11657787B2 (en) | 2015-09-29 | 2023-05-23 | Shutterstock, Inc. | Method of and system for automatically generating music compositions and productions using lyrical input and music experience descriptors |
US11011144B2 (en) | 2015-09-29 | 2021-05-18 | Shutterstock, Inc. | Automated music composition and generation system supporting automated generation of musical kernels for use in replicating future music compositions and production environments |
US11017750B2 (en) | 2015-09-29 | 2021-05-25 | Shutterstock, Inc. | Method of automatically confirming the uniqueness of digital pieces of music produced by an automated music composition and generation system while satisfying the creative intentions of system users |
US11651757B2 (en) | 2015-09-29 | 2023-05-16 | Shutterstock, Inc. | Automated music composition and generation system driven by lyrical input |
US11030984B2 (en) | 2015-09-29 | 2021-06-08 | Shutterstock, Inc. | Method of scoring digital media objects using musical experience descriptors to indicate what, where and when musical events should appear in pieces of digital music automatically composed and generated by an automated music composition and generation system |
US11037541B2 (en) | 2015-09-29 | 2021-06-15 | Shutterstock, Inc. | Method of composing a piece of digital music using musical experience descriptors to indicate what, when and how musical events should appear in the piece of digital music automatically composed and generated by an automated music composition and generation system |
US11468871B2 (en) | 2015-09-29 | 2022-10-11 | Shutterstock, Inc. | Automated music composition and generation system employing an instrument selector for automatically selecting virtual instruments from a library of virtual instruments to perform the notes of the composed piece of digital music |
US11430419B2 (en) | 2015-09-29 | 2022-08-30 | Shutterstock, Inc. | Automatically managing the musical tastes and preferences of a population of users requesting digital pieces of music automatically composed and generated by an automated music composition and generation system |
US10854180B2 (en) | 2015-09-29 | 2020-12-01 | Amper Music, Inc. | Method of and system for controlling the qualities of musical energy embodied in and expressed by digital music to be automatically composed and generated by an automated music composition and generation engine |
US10672371B2 (en) | 2015-09-29 | 2020-06-02 | Amper Music, Inc. | Method of and system for spotting digital media objects and event markers using musical experience descriptors to characterize digital music to be automatically composed and generated by an automated music composition and generation engine |
US11635873B2 (en) * | 2018-03-01 | 2023-04-25 | Huawei Technologies Co., Ltd. | Information display method, graphical user interface, and terminal for displaying media interface information in a floating window |
US20200401281A1 (en) * | 2018-03-01 | 2020-12-24 | Huawei Technologies Co., Ltd. | Information Display Method, Graphical User Interface, and Terminal |
US11379180B2 (en) * | 2018-09-04 | 2022-07-05 | Beijing Dajia Internet Information Technology Co., Ltd | Method and device for playing voice, electronic device, and storage medium |
CN112104595A (en) * | 2019-06-18 | 2020-12-18 | 明日基金知识产权控股有限公司 | Location-based application flow activation |
US11455777B2 (en) | 2019-06-18 | 2022-09-27 | The Calany Holding S. À R.L. | System and method for virtually attaching applications to and enabling interactions with dynamic objects |
US11516296B2 (en) | 2019-06-18 | 2022-11-29 | THE CALANY Holding S.ÀR.L | Location-based application stream activation |
US11546721B2 (en) * | 2019-06-18 | 2023-01-03 | The Calany Holding S.À.R.L. | Location-based application activation |
US11341727B2 (en) | 2019-06-18 | 2022-05-24 | The Calany Holding S. À R.L. | Location-based platform for multiple 3D engines for delivering location-based 3D content to a user |
US11270513B2 (en) | 2019-06-18 | 2022-03-08 | The Calany Holding S. À R.L. | System and method for attaching applications and interactions to static objects |
CN112104689A (en) * | 2019-06-18 | 2020-12-18 | 明日基金知识产权控股有限公司 | Location-based application activation |
US11037538B2 (en) | 2019-10-15 | 2021-06-15 | Shutterstock, Inc. | Method of and system for automated musical arrangement and musical instrument performance style transformation supported within an automated music performance system |
US11024275B2 (en) | 2019-10-15 | 2021-06-01 | Shutterstock, Inc. | Method of digitally performing a music composition using virtual musical instruments having performance logic executing within a virtual musical instrument (VMI) library management system |
US10964299B1 (en) | 2019-10-15 | 2021-03-30 | Shutterstock, Inc. | Method of and system for automatically generating digital performances of music compositions using notes selected from virtual musical instruments based on the music-theoretic states of the music compositions |
Also Published As
Publication number | Publication date |
---|---|
US20160103595A1 (en) | 2016-04-14 |
US20160103589A1 (en) | 2016-04-14 |
US20160103656A1 (en) | 2016-04-14 |
US9483166B2 (en) | 2016-11-01 |
US20170024092A1 (en) | 2017-01-26 |
US9489113B2 (en) | 2016-11-08 |
US9423998B2 (en) | 2016-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9489113B2 (en) | System and method for playback of media content with audio touch menu functionality | |
US20170075468A1 (en) | System and method for playback of media content with support for force-sensitive touch input | |
US12026296B2 (en) | Multi-track playback of media content during repetitive motion activities | |
US11956291B2 (en) | Station creation | |
US20170039028A1 (en) | User interface for streaming media stations with virtual playback | |
JP7195426B2 (en) | Display page interaction control method and apparatus | |
CA3004231C (en) | Enhancing video content with extrinsic data | |
US20150277707A1 (en) | System and method for multi-track playback of media content | |
US20150205511A1 (en) | Systems And Methods For An Animated Graphical User Interface | |
EP2925008A1 (en) | System and method for multi-track playback of media content | |
US11209972B2 (en) | Combined tablet screen drag-and-drop interface | |
US10338799B1 (en) | System and method for providing an adaptive seek bar for use with an electronic device | |
US10021156B2 (en) | Method and an electronic device for performing playback and sharing of streamed media | |
US20160249091A1 (en) | Method and an electronic device for providing a media stream | |
CN105122826B (en) | System and method for displaying annotated video content by a mobile computing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |