EP2599304A1 - Method and apparatus for determining and equalizing one or more segments of a media track - Google Patents

Method and apparatus for determining and equalizing one or more segments of a media track

Info

Publication number
EP2599304A1
Authority
EP
European Patent Office
Prior art keywords
media
output settings
segments
settings
corresponding output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11811885.0A
Other languages
German (de)
French (fr)
Other versions
EP2599304A4 (en)
Inventor
Maulik Sailor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Technologies Oy
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Oyj
Publication of EP2599304A1
Publication of EP2599304A4

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25866Management of end-user data
    • H04N21/25891Management of end-user data being end-user preferences
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal

Definitions

  • Wireless (e.g., cellular) service providers and device manufacturers are continually challenged to deliver value and convenience to consumers by, for example, providing compelling services and features.
  • One area of interest has been in the development of services, applications, and the like for facilitating media playback.
  • consumers typically utilize various electronic devices and media players to playback media such as audio and video tracks.
  • Each of these devices and/or players may have different output characteristics (e.g., audio or video quality). Because of these differences, many consumers often customize or adjust output settings during media playback to optimize media reproduction.
  • conventional media players and/or devices allow at least some user control over playback settings using, for instance, an equalizer.
  • a method comprises determining one or more segments of a media track.
  • the method further comprises associating the one or more segments with one or more corresponding output settings.
  • the method further comprises causing, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
  • an apparatus comprising at least one processor, and at least one memory including computer program code for one or more programs, the at least one memory and the computer program code configured to, with the at least one processor, cause, at least in part, the apparatus to determine one or more segments of a media track.
  • the apparatus is also caused, at least in part, to associate the one or more segments with one or more corresponding output settings.
  • the apparatus is also caused, at least in part, to cause, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
  • a computer-readable storage medium carrying one or more sequences of one or more instructions which, when executed by one or more processors, cause, at least in part, an apparatus to determine one or more segments of a media track.
  • the apparatus is also caused, at least in part, to associate the one or more segments with one or more corresponding output settings.
  • the apparatus is also caused, at least in part, to cause, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
  • an apparatus comprises means for determining one or more segments of a media track.
  • the apparatus further comprises means for causing, at least in part, associating the one or more segments with one or more corresponding output settings.
  • the apparatus further comprises means for causing, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
  • a method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on (including derived at least in part from) any one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
  • a method comprising facilitating access to at least one interface configured to allow access to at least one service, the at least one service configured to perform any one or any combination of network or service provider methods (or processes) disclosed in this application.
  • a method comprising facilitating creating and/or facilitating modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based, at least in part, on data and/or information resulting from one or any combination of methods or processes disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
  • a method comprising creating and/or modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based at least in part on data and/or information resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
  • the methods can be accomplished on the service provider side or on the mobile device side or in any shared way between service provider and mobile device with actions being performed on both sides.
  • An apparatus comprising means for performing the method of any of originally filed claims 1-14 and 27-29.
  • FIG. 1 is a diagram of a system capable of customizing media output settings on a user device, according to one embodiment
  • FIG. 2A is a diagram of a system for determining one or more output settings for one or more segments of one or more media tracks, according to one embodiment
  • FIG. 2B is a diagram showing an example of a segmented media track and corresponding output settings for each segment, according to one embodiment
  • FIG. 3 is a diagram of components of a media player capable of determining and/or utilizing media output settings for media playback, according to one embodiment
  • FIG. 4 is a flowchart of a process for determining one or more segments of a media track, determining one or more output settings and associating the one or more segments with the one or more output settings, according to one embodiment
  • FIG. 5 is a flowchart of a process for associating one or more segments of a media track with one or more output setting metadata, according to one embodiment
  • FIG. 6 is a flowchart of a process for retrieving one or more output setting metadata, according to one embodiment
  • FIG. 8 is a flowchart of a process for determining output capabilities of a device and applying one or more output setting metadata, according to one embodiment
  • FIG. 9 is a diagram of a user interface for determining one or more segments of one or more media tracks and/or specifying one or more corresponding output settings, according to one embodiment
  • FIG. 10 is a diagram of hardware that can be used to implement an embodiment of the invention.
  • FIG. 11 is a diagram of a chip set that can be used to implement an embodiment of the invention.
  • FIG. 12 is a diagram of a mobile terminal (e.g., handset) that can be used to implement an embodiment of the invention. DESCRIPTION OF SOME EMBODIMENTS
  • content or media includes, for example, digital sound, songs, digital images, digital games, digital maps, point of interest information, digital videos, such as music videos, news clips and theatrical videos, advertisements, program files or objects, any other digital media or content, or any combination thereof.
  • the term rendering indicates, for instance, any method for presenting the content at a device, including playing music through speakers, displaying images on a screen or in a projection or on tangible media such as photographic or plain paper, showing videos on a suitable display device with sound, graphing game or map data, or any other term of art for presentation, or any combination thereof.
  • a player is an example of a rendering module.
  • a play list is information about content rendered on one or more players in response to input by a user, and is associated with that user.
  • a play history is information about the time sequence of content rendered on one or more players in response to input by a user, and is associated with that user.
  • FIG. 1 is a diagram of a system capable of customizing media output settings on a user device, according to one embodiment.
  • an equalizer to customize the sound output settings for playing back media tracks, particularly when using devices and/or players that have different output characteristics.
  • media output settings apply to an entire media track (e.g., a musical track in a musical album) at a time.
  • the same equalizer setting is specified for the entire track and is not applied to the track at a finer granularity (e.g., to segments or portions of the track).
  • the content creator, content provider and/or content user is not able to fine-tune, encode and associate a more granular output setting scheme such as one or more output settings for one or more segments of a media track.
  • This shortcoming may make the media output less than optimal for the user.
  • the system 100 of FIG. 1 introduces the capability to specify media output settings (e.g., equalizer settings) for individual segments of a media track or file, and then to playback the media track according to those settings. More specifically, the system 100 provides, for instance, at least the following capabilities: (1) to allow a content creator, a content producer, a content provider and/or a content user to identify one or more segments of a media track, encode and associate one or more output settings with the one or more segments of the media track; and (2) for a media player to utilize the one or more output settings when rendering the output.
  • media output settings (e.g., equalizer settings)
  • a content creator/provider can divide a media track into any number of segments and then assign individual output settings to each segment.
  • an equalizer (e.g., included in a media player or provided as a standalone module)
  • the equalizer can then specify or direct the output of the media track according to the settings.
  • the equalizer can be implemented by using passive and/or active electronic elements or by using digital algorithms for altering one or more output settings such as amplitude, frequency, phase, time delay, and/or the like.
  • preset output parameters may be specified based on media genre such as classical music, pop music, movie videos, sports videos, etc. For example, for a music track that includes a medley of many different styles, the content creator can specify classical equalizer settings for the introduction, a pop equalizer setting for a first verse, a rock equalizer setting for a second verse, etc.
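  • As an illustration of the genre-based presets described above, the following minimal Python sketch (not part of the patent; all preset names and timings are hypothetical) shows one way per-segment presets could be represented for such a medley track.

```python
# Minimal sketch (hypothetical values): per-segment, genre-based equalizer
# presets for a medley track, as described above.
from dataclasses import dataclass

@dataclass
class Segment:
    start_s: float   # segment start time within the track, in seconds
    length_s: float  # segment length, in seconds
    preset: str      # genre-based equalizer preset name (illustrative)

# A medley divided into three segments, each with its own preset.
medley_segments = [
    Segment(start_s=0.0,  length_s=30.0, preset="classical"),  # introduction
    Segment(start_s=30.0, length_s=45.0, preset="pop"),        # first verse
    Segment(start_s=75.0, length_s=60.0, preset="rock"),       # second verse
]

for seg in medley_segments:
    print(f"{seg.start_s:6.1f}s  +{seg.length_s:5.1f}s  ->  {seg.preset}")
```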
  • the individual output settings may be associated with corresponding segments of the media track as metadata.
  • the metadata may be encoded or otherwise embedded in the media track file.
  • the output settings may be associated with the corresponding segments in a lookup table or other database. Accordingly, in another embodiment, during playback of a media track, the system 100 can retrieve the output settings for the media track from either the metadata, a lookup table, or a database, and then initiate playback based on the retrieved settings.
  • a user can specify segment output settings for a particular track.
  • the user specified-settings can then be used to override the settings specified by the content creator/provider or the user can select which (e.g., either the user-specified or creator-specified settings) to use.
  • the user-specified settings can then be associated with the media track for use in subsequent playbacks.
  • the user-specified settings may be shared and/or retrieved from other users or devices.
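  • As a sketch of the override behaviour described above (an assumption about one possible implementation, not the patent's method), the following snippet shows user-specified settings taking precedence over creator-specified settings when both are available.

```python
# Sketch of one possible override rule (an assumption, not the patent's
# specification): user-specified settings take precedence when present,
# otherwise the creator/provider settings are used.
def effective_settings(creator_settings, user_settings, prefer_user=True):
    """Return the settings to apply for a segment."""
    if prefer_user and user_settings is not None:
        return user_settings
    return creator_settings

creator = {"preset": "classical"}
user = {"preset": "rock"}
print(effective_settings(creator, user))   # user override wins
print(effective_settings(creator, None))   # falls back to creator settings
```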
  • the settings may define at least in part a set of frequencies and corresponding amplitudes.
  • the settings may specify any other media characteristics (e.g., brightness, color, saturation, etc. for video) and their corresponding values.
  • the system 100 includes a service platform 101 with connectivity to a web server 103 and user equipment (UE) 105a-105b over the communication network 107.
  • the web server 103 provides access to one or more services of the service platform 101.
  • the UEs 105a-105b may also access the service platform 101 directly (e.g., using client applications).
  • FIG. 1 depicts only two UEs (e.g., UEs 105a-105b) in the system 100.
  • the system may support any number of UEs 105 up to the maximum capacity of the communication network 107.
  • the network capacity may be determined based on available bandwidth, available connection points, and/or the like.
  • the web server 103 further includes one or more web pages 109 including one or more content portals 111 to facilitate automatic and efficient sharing of content such as different types of media and/or related services.
  • the content and any related segment output settings are provided by one or more of the media services 113a-113n of the service platform 101.
  • the service platform 101 includes one or more services 113a-113n (e.g., music service, mapping service, video service, social network service, etc.), a user account manager 115, and a user account database 117.
  • the services 113a-113n are connected directly or indirectly to network 107 and may be associated to share login credentials for granting user access.
  • one or more of the services 113a-113n are managed services provided by a service provider or operator of the network 107.
  • the user account manager 115 manages the sharing of login credentials by tracking which of the services 113a-113n share login credentials and then linking the credentials to the user accounts created with the various services 113a-113n by a particular user.
  • the user account manager 115 can manage or provide access to premium or subscription-based content for playback at the UE 105.
  • the user account manager 115 may store the tracking information for the login credentials and the user account information in the user account database 117.
  • the user account database 117 can reside on one or more nodes connected directly or indirectly to one or more of the services 113a-113n. In other embodiments, the user account database 117 resides on one or more nodes in network 107.
  • the user account database 117 includes one or more processes (not shown) and one or more data structures that store information about registered users of each of the services 113a-113n, including login credentials and related information, as well as data, configurations, user profiles, variables, conditions, and the like associated with using any of the services 113a-113n.
  • the service platform 101 and the web server 103 can be implemented via shared, partially shared, or different computer hardware (e.g., the hardware described with respect to FIG. 10).
  • the communication network 107 of the system 100 includes one or more networks such as a data network (not shown), a wireless network (not shown), a telephony network (not shown), or any combination thereof.
  • the data network may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), a public data network (e.g., the Internet), or any other suitable packet-switched network, such as a commercially owned, proprietary packet- switched network, e.g., a proprietary cable or fiber-optic network.
  • the wireless network may be, for example, a cellular network and may employ various technologies including enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), satellite, mobile ad-hoc network (MANET), and the like.
  • EDGE enhanced data rates for global evolution
  • GPRS general packet radio service
  • GSM global system for mobile communications
  • IMS Internet protocol multimedia subsystem
  • UMTS universal mobile telecommunications system
  • WiMAX worldwide interoperability for microwave access
  • LTE Long Term Evolution
  • CDMA code division multiple access
  • WCDMA wideband code division multiple access
  • WiFi wireless fidelity
  • MANET mobile ad-hoc network
  • the UEs 105 are any type of mobile terminal, fixed terminal, or portable terminal including a mobile handset, station, unit, device, multimedia tablet, multimedia computer, Internet node, communicator, desktop computer, laptop computer, Personal Digital Assistants (PDAs), or any combination thereof. It is also contemplated that the UE 105 can support any type of interface to the user (such as "wearable" circuitry, etc.).
  • the UE 105 may also be equipped with one or more sensors (e.g., a global positioning satellite (GPS) sensor, accelerometer, light sensor, etc.) for use with the services 113a-l 13n.
  • GPS global positioning satellite
  • a protocol includes a set of rules defining how the network nodes within the communication network 107 interact with each other based on information sent over the communication links.
  • the protocols are effective at different layers of operation within each node, from generating and receiving physical signals of various types, to selecting a link for transferring those signals, to the format of information indicated by those signals, to identifying which software application executing on a computer system sends or receives the information.
  • the conceptually different layers of protocols for exchanging information over a network are described in the Open Systems Interconnection (OSI) Reference Model.
  • Each packet typically comprises (1) header information associated with a particular protocol, and (2) payload information that follows the header information and contains information that may be processed independently of that particular protocol.
  • the packet includes (3) trailer information following the payload and indicating the end of the payload information.
  • the header includes information such as the source of the packet, its destination, the length of the payload, and other properties used by the protocol.
  • the data in the payload for the particular protocol includes a header and payload for a different protocol associated with a different, higher layer of the OSI Reference Model.
  • the header for a particular protocol typically indicates a type for the next protocol contained in its payload.
  • the higher layer protocol is said to be encapsulated in the lower layer protocol.
  • the headers included in a packet traversing multiple heterogeneous networks, such as the Internet typically include a physical (layer 1) header, a data- link (layer 2) header, an internetwork (layer 3) header and a transport (layer 4) header, and various application headers (layer 5, layer 6 and layer 7) as defined by the OSI Reference Model.
  • the media portal 111 and the corresponding service 113 interact according to a client-server model.
  • client-server model of computer process interaction is widely known and used.
  • a client process sends a message including a request to a server process, and the server process responds by providing a service.
  • the server process may also return a message with a response to the client process.
  • client process and server process execute on different computer devices, called hosts, and communicate via a network using one or more protocols for network communications.
  • the term "server" is conventionally used to refer to the process that provides the service, or the host computer on which the process operates.
  • the term "client" is conventionally used to refer to the process that makes the request, or the host computer on which the process operates.
  • the terms "client" and "server" refer to the processes, rather than the host computers, unless otherwise clear from the context.
  • process performed by a server can be broken up to run as multiple processes on multiple hosts (sometimes called tiers) for reasons that include reliability, scalability, and redundancy, among others.
  • FIG. 2A is a diagram of a system 200 for determining one or more output settings for one or more segments of one or more media tracks, according to one embodiment.
  • the system 200 enables a user (e.g., an end-user or a content creator/provider) to determine one or more segments of a media track 201, encode one or more output settings at an equalizer 203, and associate the one or more output settings with the one or more segments of the media track 205.
  • the equalizer 203 is a component of the media player 123, the UE 105, or a combination thereof.
  • the equalizer 203 may be a separate component of the system 100 with connectivity to the UE 105 and the media player 123 over the communication network 107.
  • the output settings metadata 207 is available without having been associated with a media track.
  • one or more creators of the media track provide the encoding and the association of the output settings.
  • one or more users provide the encoding and the association of the output settings.
  • a content provider and/or a service provider provide the encoding and the association of the output settings.
  • the output settings are in the form of a metadata file, which specifies one or more parameters as output settings.
  • the metadata file is stored in one or more databases and can be accessed by one or more users.
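  • Purely for illustration, the snippet below shows one plausible shape for such an output-settings metadata file, assuming a JSON encoding; the patent does not prescribe a particular format, and all field names and values are hypothetical.

```python
# Illustrative only: one plausible shape for an output-settings metadata file,
# assuming a JSON encoding.  Field names, frequencies and gain values are
# hypothetical and not taken from the patent.
import json

settings_metadata = {
    "track_id": "example-track-001",   # hypothetical track identifier
    "segments": [
        {"start_s": 0.0,  "length_s": 30.0,
         "settings": {"60Hz": 3.0, "1kHz": 0.0, "8kHz": -2.0}},   # gains in dB
        {"start_s": 30.0, "length_s": 45.0,
         "settings": {"60Hz": 6.0, "1kHz": 1.5, "8kHz": 2.0}},
    ],
}

print(json.dumps(settings_metadata, indent=2))
```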
  • FIG. 2B is a diagram showing an example of a segmented media track and corresponding output settings for each segment, according to one embodiment.
  • a graphical representation shows segments 221a-221n, an amplitude and frequency graph of the segments, and the output settings 223a-223n corresponding to the segments 221a-221n, respectively.
  • a media track can be segmented into any number of segments and the graph shows boundaries of each segment in this example. Accordingly, as each segment 221a-221n is played back, the media player 123 retrieves and applies the corresponding output settings 223a-223n.
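  • A minimal sketch of the playback behaviour shown in FIG. 2B follows; the segment list and the apply_output_settings helper are illustrative stand-ins for the media player 123 and its equalizer, not an actual implementation.

```python
# Sketch of the FIG. 2B playback behaviour: as playback reaches each segment,
# the player looks up and applies that segment's output settings.  The segment
# list and apply_output_settings() are illustrative stand-ins.
segments = [
    {"start_s": 0.0,  "length_s": 30.0, "settings": {"preset": "classical"}},
    {"start_s": 30.0, "length_s": 45.0, "settings": {"preset": "pop"}},
]

def apply_output_settings(settings: dict) -> None:
    # A real player would drive its equalizer here.
    print("applying settings:", settings)

def on_playback_position(position_s: float) -> None:
    """Called periodically with the current playback position (in seconds)."""
    for seg in segments:
        if seg["start_s"] <= position_s < seg["start_s"] + seg["length_s"]:
            apply_output_settings(seg["settings"])
            break

on_playback_position(12.0)   # -> classical settings
on_playback_position(40.0)   # -> pop settings
```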
  • FIG. 3 is a diagram of components of a media player capable of determining and/or utilizing media output settings for media playback, according to one embodiment.
  • the media player 123 is capable of playing back different types of media and/or content such as audio, video, photos and/or the like.
  • the media can be stored in an analogue format on an analogue medium, such as analogue storage tape, or in a digital format on different storage devices such as a portable disc, a compact disc (CD), a digital video disc (DVD), a memory unit, a hard drive, a digital tape and/or the like.
  • the media player can be a substantially standalone unit, such as a tape player, a CD player, a DVD player, a video player, or a memory-based player (such as an MP3 player), and/or can be substantially included in another device such as a computer, a mobile phone, an electronic notebook, an electronic reader and/or the like.
  • the media player 123 includes one or more components for obtaining, creating, determining and/or utilizing one or more output settings for reproducing media content. It is contemplated that the functions of these components may be combined in one or more components or performed by other components of equivalent functionality.
  • the media player 123 includes a media interface 301, an applications module 303, a runtime module 305, a user interface 307, a user profile 309, a storage device 311, a communication interface 313 and an output module 315. Also, output devices 317 and 319 are shown in the system 300.
  • the communication interface 313 can be used to communicate with one or more content providers, service providers, other user devices and/or the like. Certain communications can be via methods such as an internet protocol, messaging, or any other communication method.
  • one or more users can send a query or a request to obtain content and/or output settings from one or more online services and/or from one or more users via the communication interface 313.
  • the media interface 301 can accept one or more media inputs, such as an audio tape, a video tape, a compact disc, a memory card, a memory stick and/or the like, and can cause one or more content items from the one or more media to be stored on the storage device 311.
  • the one or more content are then executed on the runtime module 305. For example, a musical track is obtained from a compact disc and executed on the runtime module 305, which then begins other processes in the media player 123.
  • the applications module 303 includes one or more applications such as a digital equalizer, an internet browser, a media editing application and/or the like.
  • one or more users of the UE 105 are able to edit a media track to, at least, determine one or more segments for the media track, encode and associate one or more output settings with the one or more segments.
  • the one or more users of the UE 105 are able to search for and obtain one or more output setting records for one or more media tracks and then associate the one or more output settings with the one or more segments of the one or more media tracks.
  • the one or more users of the UE 105 can cause one or more output settings to be shared with one or more other users of one or more UEs 105.
  • one or more of the obtained, the created and/or the edited output settings are caused to be stored onto the storage device 311 and/or in the user profile 309.
  • the output settings may be stored in another storage device or component (not shown) (e.g., a cloud computing server) accessible over the communication network 107.
  • the user interface (UI) 307 can include various methods of communication.
  • the UI 307 can have outputs including a visual component (e.g., a screen), an audio component, a physical component (e.g., vibrations), and other methods of communication.
  • User inputs can include a touch-screen interface, voice input, gesture based interface, a scroll-and-click interface, a button interface, a microphone, biometric based inputs (for example human electronic brain signals, eye parameters, etc.) and/or the like.
  • the UI 307 may be used to prompt the user to enter local credentials (e.g., a PIN code, biometric sensor input, etc.) and receive local credentials from the user.
  • the communication can be via a wired or a wireless interface.
  • the UI 307 can be located on one device, multiple devices, at one location, at multiple locations and can be shared by one or more users.
  • the output module 315 utilizes the one or more output settings associated with the one or more segments of the media track to cause output to the output device 317 and/or the output device 319.
  • an output device, such as 317, is substantially included in the media player 123.
  • FIG. 4 is a flowchart of a process for determining one or more segments of a media track, determining one or more output settings and associating the one or more segments with the one or more output settings, according to one embodiment.
  • the equalizer 203 performs the process 400 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11.
  • one or more segments of a media track such as an audio track, are determined.
  • the determination may include specifying start/stop times for each segment within the media track. This specification may be made visually (e.g., sliders for the start/stop times of the segments), explicitly (e.g., by direct input of the start/stop times), or a combination thereof.
  • the one or more creators of the media track can determine how a media track should be segmented and/or perform the segmentation.
  • one or more users, one or more service providers and/or one or more devices can determine and/or perform the segmentation.
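  • The sketch below illustrates the start/stop-time specification described above; the validation rules are an assumption about one reasonable implementation rather than the patent's algorithm.

```python
# Hypothetical validation of explicitly entered start/stop times; the rules
# (no overlap, within the track duration) are an assumption about one
# reasonable implementation.
def build_segments(boundaries, track_length_s):
    """boundaries: list of (start_s, stop_s) pairs entered by the user."""
    segments = []
    for start_s, stop_s in sorted(boundaries):
        if not (0.0 <= start_s < stop_s <= track_length_s):
            raise ValueError(f"invalid segment ({start_s}, {stop_s})")
        if segments and start_s < segments[-1][1]:
            raise ValueError("segments must not overlap")
        segments.append((start_s, stop_s))
    return segments

print(build_segments([(0.0, 30.0), (30.0, 75.0)], track_length_s=180.0))
```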
  • one or more output settings for the one or more segments are determined.
  • the one or more creators of the media track may determine one or more specific output settings for the one or more segments, which can cause an optimal rendering of the output.
  • the one or more users can determine the one or more output settings for the one or more segments.
  • one or more content providers and/or one or more service providers provide the one or more output settings as a service for free and/or for a fee.
  • the one or more output settings are shared via one or more internet sites, such as one or more social network sites, and/or via one or more artist network sites.
  • the one or more segments are associated with the one or more corresponding output settings.
  • each segment may have one or more alternate output settings from which the user can select.
  • one or more applications and/or one or more modules in the media player 123 cause the association of the one or more segments with the one or more corresponding output settings.
  • one or more media tracks are already associated with one or more corresponding output settings.
  • the one or more output setting metadata for the one or more segments of the one or more media tracks are available as a metadata record.
  • FIG. 5 is a flowchart of process 500 for associating one or more segments of a media track with one or more output setting metadata, according to one embodiment.
  • the media player 123 performs the process 500 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11.
  • the process 500 assumes that the media player is to playback a media track such as a music track.
  • the user of the UE 105 downloads one or more media tracks from one or more content providers.
  • the user of the UE 105 inserts a CD into the media interface 301.
  • the media player determines if one or more output settings are included with the media track.
  • the media track is in the form of a digital file such as in MP3 format.
  • if one or more output settings are included with the media track, then at step 505, one or more segments of the media track are determined.
  • the media player applications module 303 determines the one or more segments by analyzing the one or more settings metadata record.
  • the one or more segments of the media track are associated with the one or more corresponding output settings. In one embodiment, the one or more segments are associated with the one or more output settings.
  • the media player 123 causes, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings. In one embodiment, the one or more output settings are utilized as provided. In another embodiment, the one or more users further edit the one or more output setting metadata. However, at step 503, if it was determined that output settings are not included with the media track, then at step 511, the media player causes a determination whether one or more output settings are available for the media track of step 501.
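  • The decision flow of process 500 can be summarized by the following hedged sketch, in which read_embedded_settings and find_external_settings are hypothetical helpers standing in for steps 503 and 511.

```python
# Hedged sketch of the process 500 decision flow: use settings embedded in the
# media track when present (steps 503-509); otherwise check whether settings
# are available elsewhere (step 511).  Both helpers are hypothetical.
def read_embedded_settings(track):
    return track.get("settings_metadata")   # None when nothing is embedded

def find_external_settings(track):
    return None                              # e.g. query a service or database

def settings_for_playback(track):
    settings = read_embedded_settings(track)   # step 503
    if settings is not None:
        return settings                        # steps 505-509
    return find_external_settings(track)       # step 511

print(settings_for_playback({"title": "demo", "settings_metadata": None}))
```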
  • FIG. 6 is a flowchart of process 600 for associating one or more segments of a media track with one or more output setting metadata, according to one embodiment.
  • the media player 123 and/or the equalizer 203 perform the process 600 and are implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11.
  • one or more segments of a media track are encoded with association of one or more corresponding output settings specifying at least a start time, a length, and the one or more corresponding output settings for the one or more segments.
  • substantially, a device performs the encoding and the association.
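  • As one illustrative (and purely hypothetical) encoding, each segment record carrying a start time, a length, and its output settings could be serialized into a compact metadata payload that accompanies the media track, as sketched below.

```python
# Purely illustrative encoding: segment records carrying a start time, a
# length, and output settings, serialized into a compact metadata payload.
# The JSON format is an assumption, not the patent's encoding.
import json

records = [
    {"start_s": 0.0,  "length_s": 30.0, "settings": {"preset": "classical"}},
    {"start_s": 30.0, "length_s": 45.0, "settings": {"preset": "pop"}},
]

encoded = json.dumps(records, separators=(",", ":"))   # payload to embed/ship
decoded = json.loads(encoded)                          # what the player reads back
assert decoded == records
print(encoded)
```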
  • FIG. 7 is a flowchart of process 700 for retrieving one or more output setting metadata, according to one embodiment.
  • the media player 123 and/or the equalizer 203 perform the process 700 and are implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11.
  • one or more corresponding output settings are retrieved from one or more service providers, one or more content providers, one or more social network sites, one or more devices, one or more users and/or a combination thereof.
  • the user of UE 105 downloads a metadata file from a social network site, which contains one or more output settings for a media track on the user device.
  • the user downloads a media track from another user device and then downloads an output settings metadata file for the media track from the same and/or another user device.
  • the user downloads a media track from a content provider, which already contains output setting metadata.
  • a fingerprint associated with a media track is determined and the retrieving of the one or more corresponding output setting metadata is based, at least in part, on the fingerprint.
  • a fingerprint identifies, at least in part, information about a media track, about one or more media tracks, about an album, about the creator of the media track and/or the like.
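  • The following sketch illustrates fingerprint-based retrieval of output-settings metadata; the hashing scheme and the settings store are illustrative assumptions and not the fingerprinting method of the patent.

```python
# Sketch of fingerprint-based retrieval of output-settings metadata.  A plain
# hash stands in for a real audio fingerprinting algorithm; the settings store
# is an illustrative in-memory dictionary.
import hashlib

settings_store = {}   # fingerprint -> per-segment output settings

def fingerprint(audio_bytes: bytes) -> str:
    return hashlib.sha256(audio_bytes).hexdigest()

def retrieve_settings(audio_bytes: bytes):
    return settings_store.get(fingerprint(audio_bytes))   # None if unknown

track_bytes = b"example audio payload"
settings_store[fingerprint(track_bytes)] = [
    {"start_s": 0.0, "length_s": 30.0, "settings": {"preset": "pop"}},
]
print(retrieve_settings(track_bytes))
```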
  • FIG. 8 is a flowchart of process 800 for determining output capabilities of a device and applying one or more output setting metadata, according to one embodiment.
  • the media player 123 and/or the equalizer 203 perform the process 800 and are implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11.
  • a media output capability of a device is determined and playback of a media track is further based, at least in part, on the media output capability.
  • the output capability of a device is determined from information identifying the type of the device and the type of one or more output devices being utilized by the device.
  • a media player can be utilizing one or more external speakers, one or more internal speakers and/or one or more headsets all of which can be wired and/or wireless devices.
  • the media player can be utilizing an amplifier device.
  • the media player is paired with and/or connected to another media player device.
  • the one or more corresponding output settings include at least one of frequencies and corresponding amplitude, audio effects, video settings, video effects and/or the like.
  • one or more output settings specify one or more amplitude settings for one or more frequencies.
  • one or more output settings are specified for the audio track of a video media.
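  • As a final sketch (assumed behaviour, with illustrative capability names and adjustment rules), the snippet below adapts retrieved output settings to the reported output capability of the device, as described for process 800.

```python
# Assumed behaviour: adapt retrieved output settings to the device's reported
# output capability (e.g. internal speaker vs. wired headset).  Capability
# names and the adjustment rule are illustrative only.
def adapt_settings(settings: dict, output_device: str) -> dict:
    adapted = dict(settings)
    if output_device == "internal_speaker":
        # Example rule: cap a deep bass boost that a small speaker cannot reproduce.
        adapted["bass_gain_db"] = min(adapted.get("bass_gain_db", 0.0), 3.0)
    return adapted

print(adapt_settings({"bass_gain_db": 9.0, "treble_gain_db": 2.0}, "internal_speaker"))
print(adapt_settings({"bass_gain_db": 9.0, "treble_gain_db": 2.0}, "wired_headset"))
```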
  • FIG. 9 is a diagram of a user interface 900 for determining one or more segments of one or more media tracks and/or specifying one or more corresponding output settings, according to one embodiment.
  • one or more users determine one or more segments of a media track by identifying a starting point and an ending point for one or more segments in UI 900 section 901.
  • the one or more segments are already determined in a metadata file and information about the one or more segments is shown in section 901.
  • amplitude levels for one or more frequencies can be determined for the one or more segments in section 901.
  • the one or more users can determine and set a range of options in section 905, for example, a media name, a media track No.
  • a media track is loaded into the media player where the media track is already associated with one or more output settings in which case, the UI 900 displays the one or more output settings.
  • one or more users can utilize UI 900 to modify one or more output setting metadata for one or more media tracks.
  • the processes described herein for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output may be advantageously implemented via software, hardware (e.g., general processor, Digital Signal Processing (DSP) chip, an Application Specific Integrated Circuit (ASIC), Field Programmable Gate Arrays (FPGAs), etc.), firmware or a combination thereof.
  • DSP Digital Signal Processing
  • ASIC Application Specific Integrated Circuit
  • FPGAs Field Programmable Gate Arrays
  • FIG. 10 illustrates a computer system 1000 upon which an embodiment of the invention may be implemented.
  • Although computer system 1000 is depicted with respect to a particular device or equipment, it is contemplated that other devices or equipment (e.g., network elements, servers, etc.) within FIG. 10 can deploy the illustrated hardware and components of system 1000.
  • Computer system 1000 is programmed (e.g., via computer program code or instructions) to determine one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output as described herein and includes a communication mechanism such as a bus 1010 for passing information between other internal and external components of the computer system 1000.
  • Information is represented as a physical expression of a measurable phenomenon, typically electric voltages, but including, in other embodiments, such phenomena as magnetic, electromagnetic, pressure, chemical, biological, molecular, atomic, sub-atomic and quantum interactions.
  • a measurable phenomenon typically electric voltages, but including, in other embodiments, such phenomena as magnetic, electromagnetic, pressure, chemical, biological, molecular, atomic, sub-atomic and quantum interactions.
  • north and south magnetic fields, or a zero and non-zero electric voltage represent two states (0, 1) of a binary digit (bit).
  • Other phenomena can represent digits of a higher base.
  • a superposition of multiple simultaneous quantum states before measurement represents a quantum bit (qubit).
  • a sequence of one or more digits constitutes digital data that is used to represent a number or code for a character.
  • information called analog data is represented by a near continuum of measurable values within a particular range.
  • Computer system 1000 or a portion thereof, constitutes a means for performing one or more steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • a bus 1010 includes one or more parallel conductors of information so that information is transferred quickly among devices coupled to the bus 1010.
  • One or more processors 1002 for processing information are coupled with the bus 1010.
  • a processor (or multiple processors) 1002 performs a set of operations on information as specified by computer program code related to determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • the computer program code is a set of instructions or statements providing instructions for the operation of the processor and/or the computer system to perform specified functions.
  • the code for example, may be written in a computer programming language that is compiled into a native instruction set of the processor.
  • the code may also be written directly using the native instruction set (e.g., machine language).
  • the set of operations include bringing information in from the bus 1010 and placing information on the bus 1010.
  • the set of operations also typically include comparing two or more units of information, shifting positions of units of information, and combining two or more units of information, such as by addition or multiplication or logical operations like OR, exclusive OR (XOR), and AND.
  • Each operation of the set of operations that can be performed by the processor is represented to the processor by information called instructions, such as an operation code of one or more digits.
  • a sequence of operations to be executed by the processor 1002, such as a sequence of operation codes, constitute processor instructions, also called computer system instructions or, simply, computer instructions.
  • Processors may be implemented as mechanical, electrical, magnetic, optical, chemical or quantum components, among others, alone or in combination.
  • Computer system 1000 also includes a memory 1004 coupled to bus 1010.
  • the memory 1004 such as a random access memory (RAM) or other dynamic storage device, stores information including processor instructions for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • Dynamic memory allows information stored therein to be changed by the computer system 1000.
  • RAM allows a unit of information stored at a location called a memory address to be stored and retrieved independently of information at neighboring addresses.
  • the memory 1004 is also used by the processor 1002 to store temporary values during execution of processor instructions.
  • the computer system 1000 also includes a read only memory (ROM) 1006 or other static storage device coupled to the bus 1010 for storing static information, including instructions, that is not changed by the computer system 1000. Some memory is composed of volatile storage that loses the information stored thereon when power is lost. Also coupled to bus 1010 is a nonvolatile (persistent) storage device 1008, such as a magnetic disk, optical disk or flash card, for storing information, including instructions, that persists even when the computer system 1000 is turned off or otherwise loses power.
  • ROM read only memory
  • static storage device such as a magnetic disk, optical disk or flash card
  • Information including instructions for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output, is provided to the bus 1010 for use by the processor from an external input device 1012, such as a keyboard containing alphanumeric keys operated by a human user, or a sensor.
  • an external input device 1012 such as a keyboard containing alphanumeric keys operated by a human user, or a sensor.
  • a sensor detects conditions in its vicinity and transforms those detections into physical expression compatible with the measurable phenomenon used to represent information in computer system 1000.
  • Other external devices coupled to bus 1010 used primarily for interacting with humans, include a display device 1014, such as a cathode ray tube (CRT) or a liquid crystal display (LCD), or plasma screen or printer for presenting text or images, and a pointing device 1016, such as a mouse or a trackball or cursor direction keys, or motion sensor, for controlling a position of a small cursor image presented on the display 1014 and issuing commands associated with graphical elements presented on the display 1014.
  • a display device 1014 such as a cathode ray tube (CRT) or a liquid crystal display (LCD), or plasma screen or printer for presenting text or images
  • a pointing device 1016 such as a mouse or a trackball or cursor direction keys, or motion sensor, for controlling a position of a small cursor image presented on the display 1014 and issuing commands associated with graphical elements presented on the display 1014.
  • special purpose hardware such as an application specific integrated circuit (ASIC) 1020
  • ASIC application specific integrated circuit
  • the special purpose hardware is configured to perform operations not performed by processor 1002 quickly enough for special purposes.
  • application specific ICs include graphics accelerator cards for generating images for display 1014, cryptographic boards for encrypting and decrypting messages sent over a network, speech recognition, and interfaces to special external devices, such as robotic arms and medical scanning equipment that repeatedly perform some complex sequence of operations that are more efficiently implemented in hardware.
  • Computer system 1000 also includes one or more instances of a communications interface 1070 coupled to bus 1010.
  • Communication interface 1070 provides a one-way or two-way communication coupling to a variety of external devices that operate with their own processors, such as printers, scanners and external disks. In general the coupling is with a network link 1078 that is connected to a local network 1080 to which a variety of external devices with their own processors are connected.
  • communication interface 1070 may be a parallel port or a serial port or a universal serial bus (USB) port on a personal computer.
  • communications interface 1070 is an integrated services digital network (ISDN) card or a digital subscriber line (DSL) card or a telephone modem that provides an information communication connection to a corresponding type of telephone line.
  • ISDN integrated services digital network
  • DSL digital subscriber line
  • a communication interface 1070 is a cable modem that converts signals on bus 1010 into signals for a communication connection over a coaxial cable or into optical signals for a communication connection over a fiber optic cable.
  • communications interface 1070 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, such as Ethernet.
  • LAN local area network
  • Wireless links may also be implemented.
  • the communications interface 1070 sends or receives or both sends and receives electrical, acoustic or electromagnetic signals, including infrared and optical signals, that carry information streams, such as digital data.
  • the communications interface 1070 includes a radio band electromagnetic transmitter and receiver called a radio transceiver.
  • the communications interface 1070 enables connection to the communication network 107 for the UE 105.
  • Non-transitory media such as non- volatile media, include, for example, optical or magnetic disks, such as storage device 1008.
  • Volatile media include, for example, dynamic memory 1004.
  • Transmission media include, for example, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves.
  • Signals include man-made transient variations in amplitude, frequency, phase, polarization or other physical properties transmitted through the transmission media.
  • Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, CDRW, DVD, any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read.
  • the term computer-readable storage medium is used herein to refer to any computer-readable medium except transmission media.
  • Logic encoded in one or more tangible media includes one or both of processor instructions on a computer-readable storage media and special purpose hardware, such as ASIC 1020.
  • Network link 1078 typically provides information communication using transmission media through one or more networks to other devices that use or process the information.
  • network link 1078 may provide a connection through local network 1080 to a host computer 1082 or to equipment 1084 operated by an Internet Service Provider (ISP).
  • ISP Internet Service Provider
  • a computer called a server host 1092 connected to the Internet hosts a process that provides a service in response to information received over the Internet.
  • server host 1092 hosts a process that provides information representing video data for presentation at display 1014. It is contemplated that the components of system 1000 can be deployed in various configurations within other computer systems, e.g., host 1082 and server 1092.
  • At least some embodiments of the invention are related to the use of computer system 1000 for implementing some or all of the techniques described herein. According to one embodiment of the invention, those techniques are performed by computer system 1000 in response to processor 1002 executing one or more sequences of one or more processor instructions contained in memory 1004. Such instructions, also called computer instructions, software and program code, may be read into memory 1004 from another computer-readable medium such as storage device 1008 or network link 1078. Execution of the sequences of instructions contained in memory 1004 causes processor 1002 to perform one or more of the method steps described herein. In alternative embodiments, hardware, such as ASIC 1020, may be used in place of or in combination with software to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware and software, unless otherwise explicitly stated herein.
  • Computer system 1000 can send and receive information, including program code, through the networks 1080, 1090 among others, through network link 1078 and communications interface 1070.
  • a server host 1092 transmits program code for a particular application, requested by a message sent from computer 1000, through Internet 1090, ISP equipment 1084, local network 1080 and communications interface 1070.
  • the received code may be executed by processor 1002 as it is received, or may be stored in memory 1004 or in storage device 1008 or other non-volatile storage for later execution, or both.
  • computer system 1000 may obtain application program code in the form of signals on a carrier wave.
  • Various forms of computer readable media may be involved in carrying one or more sequence of instructions or data or both to processor 1002 for execution.
  • instructions and data may initially be carried on a magnetic disk of a remote computer such as host 1082.
  • the remote computer loads the instructions and data into its dynamic memory and sends the instructions and data over a telephone line using a modem.
  • a modem local to the computer system 1000 receives the instructions and data on a telephone line and uses an infra-red transmitter to convert the instructions and data to a signal on an infra-red carrier wave serving as the network link 1078.
  • An infrared detector serving as communications interface 1070 receives the instructions and data carried in the infrared signal and places information representing the instructions and data onto bus 1010.
  • Bus 1010 carries the information to memory 1004 from which processor 1002 retrieves and executes the instructions using some of the data sent with the instructions.
  • the instructions and data received in memory 1004 may optionally be stored on storage device 1008, either before or after execution by the processor 1002.
  • FIG. 11 illustrates a chip set or chip 1100 upon which an embodiment of the invention may be implemented.
  • Chip set 1100 is programmed to determine one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output as described herein and includes, for instance, the processor and memory components described with respect to FIG. 10 incorporated in one or more physical packages (e.g., chips).
  • a physical package includes an arrangement of one or more materials, components, and/or wires on a structural assembly (e.g., a baseboard) to provide one or more characteristics such as physical strength, conservation of size, and/or limitation of electrical interaction.
  • chip set 1100 can be implemented in a single chip. It is further contemplated that in certain embodiments the chip set or chip 1100 can be implemented as a single "system on a chip." It is further contemplated that in certain embodiments a separate ASIC would not be used, for example, and that all relevant functions as disclosed herein would be performed by a processor or processors. Chip set or chip 1100, or a portion thereof, constitutes a means for performing one or more steps of providing user interface navigation information associated with the availability of services.
  • Chip set or chip 1100 constitutes a means for performing one or more steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • the chip set or chip 1100 includes a communication mechanism such as a bus 1101 for passing information among the components of the chip set 1100.
  • a processor 1103 has connectivity to the bus 1101 to execute instructions and process information stored in, for example, a memory 1105.
  • the processor 1103 may include one or more processing cores with each core configured to perform independently.
  • a multi-core processor enables multiprocessing within a single physical package.
  • Examples of a multi-core processor include two, four, eight, or greater numbers of processing cores.
  • the processor 1103 may include one or more microprocessors configured in tandem via the bus 1101 to enable independent execution of instructions, pipelining, and multithreading.
  • the processor 1103 may also be accompanied with one or more specialized components to perform certain processing functions and tasks such as one or more digital signal processors (DSP) 1107, or one or more application- specific integrated circuits (ASIC) 1109.
  • DSP 1107 typically is configured to process real-world signals (e.g., sound) in real time independently of the processor 1103.
  • an ASIC 1109 can be configured to perform specialized functions not easily performed by a more general purpose processor.
  • Other specialized components to aid in performing the inventive functions described herein may include one or more field programmable gate arrays (FPGA) (not shown), one or more controllers (not shown), or one or more other special-purpose computer chips.
  • the chip set or chip 1100 includes merely one or more processors and some software and/or firmware supporting and/or relating to and/or for the one or more processors.
  • the processor 1103 and accompanying components have connectivity to the memory 1105 via the bus 1101.
  • the memory 1105 includes both dynamic memory (e.g., RAM, magnetic disk, writable optical disk, etc.) and static memory (e.g., ROM, CD-ROM, etc.) for storing executable instructions that when executed perform the inventive steps described herein for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • the memory 1105 also stores the data associated with or generated by the execution of the inventive steps.
  • FIG. 12 is a diagram of exemplary components of a mobile terminal (e.g., handset) for communications, which is capable of operating in the system of FIG. 1, according to one embodiment.
  • mobile terminal 1200 or a portion thereof, constitutes a means for performing one or more steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • a radio receiver is often defined in terms of front-end and back-end characteristics. The front-end of the receiver encompasses all of the Radio Frequency (RF) circuitry whereas the back-end encompasses all of the base-band processing circuitry.
  • As used in this application, the term “circuitry” refers to both: (1) hardware-only implementations (such as implementations in only analog and/or digital circuitry), and (2) combinations of circuitry and software (and/or firmware) (such as, if applicable to the particular context, a combination of processor(s), including digital signal processor(s), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions).
  • This definition of "circuitry” applies to all uses of this term in this application, including in any claims.
  • the term “circuitry” would also cover an implementation of merely a processor (or multiple processors) and its (or their) accompanying software and/or firmware.
  • circuitry would also cover if applicable to the particular context, for example, a baseband integrated circuit or applications processor integrated circuit in a mobile phone or a similar integrated circuit in a cellular network device or other network devices.
  • Pertinent internal components of the telephone include a Main Control Unit (MCU) 1203, a Digital Signal Processor (DSP) 1205, and a receiver/transmitter unit including a microphone gain control unit and a speaker gain control unit.
  • a main display unit 1207 provides a display to the user in support of various applications and mobile terminal functions that perform or support the steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • the display 1207 includes display circuitry configured to display at least a portion of a user interface of the mobile terminal (e.g., mobile telephone).
  • An audio function circuitry 1209 includes a microphone 1211 and microphone amplifier that amplifies the speech signal output from the microphone 1211. The amplified speech signal output from the microphone 1211 is fed to a coder/decoder (CODEC) 1213.
  • a radio section 1215 amplifies power and converts frequency in order to communicate with a base station, which is included in a mobile communication system, via antenna 1217.
  • the power amplifier (PA) 1219 and the transmitter/modulation circuitry are operationally responsive to the MCU 1203, with an output from the PA 1219 coupled to the duplexer 1221 or circulator or antenna switch, as known in the art.
  • the PA 1219 also couples to a battery interface and power control unit 1220.
  • a user of mobile terminal 1201 speaks into the microphone 1211 and his or her voice along with any detected background noise is converted into an analog voltage.
  • the analog voltage is then converted into a digital signal through the Analog to Digital Converter (ADC) 1223.
  • the control unit 1203 routes the digital signal into the DSP 1205 for processing therein, such as speech encoding, channel encoding, encrypting, and interleaving.
  • the processed voice signals are encoded, by units not separately shown, using a cellular transmission protocol such as enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), satellite, and the like.
  • the encoded signals are then routed to an equalizer 1225 for compensation of any frequency-dependent impairments that occur during transmission through the air, such as phase and amplitude distortion.
  • After equalizing the bit stream, the modulator 1227 combines the signal with an RF signal generated in the RF interface 1229. The modulator 1227 generates a sine wave by way of frequency or phase modulation. In order to prepare the signal for transmission, an up-converter 1231 combines the sine wave output from the modulator 1227 with another sine wave generated by a synthesizer 1233 to achieve the desired frequency of transmission. The signal is then sent through a PA 1219 to increase the signal to an appropriate power level. In practical systems, the PA 1219 acts as a variable gain amplifier whose gain is controlled by the DSP 1205 from information received from a network base station.
  • the signal is then filtered within the duplexer 1221 and optionally sent to an antenna coupler 1235 to match impedances to provide maximum power transfer. Finally, the signal is transmitted via antenna 1217 to a local base station.
  • An automatic gain control (AGC) can be supplied to control the gain of the final stages of the receiver.
  • the signals may be forwarded from there to a remote telephone which may be another cellular telephone, other mobile phone or a land-line connected to a Public Switched Telephone Network (PSTN), or other telephony networks.
  • Voice signals transmitted to the mobile terminal 1201 are received via antenna 1217 and immediately amplified by a low noise amplifier (LNA) 1237.
  • a down-converter 1239 lowers the carrier frequency while the demodulator 1241 strips away the RF leaving only a digital bit stream.
  • the signal then goes through the equalizer 1225 and is processed by the DSP 1205.
  • a Digital to Analog Converter (DAC) 1243 converts the signal and the resulting output is transmitted to the user through the speaker 1245, all under control of a Main Control Unit (MCU) 1203, which can be implemented as a Central Processing Unit (CPU) (not shown).
  • the MCU 1203 receives various signals including input signals from the keyboard 1247.
  • the keyboard 1247 and/or the MCU 1203 in combination with other user input components comprise a user interface circuitry for managing user input.
  • the MCU 1203 runs user interface software to facilitate user control of at least some functions of the mobile terminal 1201 to determine one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
  • the MCU 1203 also delivers a display command and a switch command to the display 1207 and to the speech output switching controller, respectively.
  • the MCU 1203 exchanges information with the DSP 1205 and can access an optionally incorporated SIM card 1249 and a memory 1251.
  • the MCU 1203 executes various control functions required of the terminal.
  • the DSP 1205 may, depending upon the implementation, perform any of a variety of conventional digital processing functions on the voice signals. Additionally, DSP 1205 determines the background noise level of the local environment from the signals detected by microphone 1211 and sets the gain of microphone 1211 to a level selected to compensate for the natural tendency of the user of the mobile terminal 1201.
  • the CODEC 1213 includes the ADC 1223 and DAC 1243.
  • the memory 1251 stores various data including call incoming tone data and is capable of storing other data including music data received via, e.g., the global Internet.
  • the software module could reside in RAM memory, flash memory, registers, or any other form of writable storage medium known in the art.
  • the memory device 1251 may be, but not limited to, a single memory, CD, DVD, ROM, RAM, EEPROM, optical storage, or any other nonvolatile storage medium capable of storing digital data.
  • An optionally incorporated SIM card 1249 carries, for instance, important information, such as the cellular phone number, the carrier supplying service, subscription details, and security information.
  • the SIM card 1249 serves primarily to identify the mobile terminal 1201 on a radio network.
  • the card 1249 also contains a memory for storing a personal telephone number registry, text messages, and user specific mobile terminal settings.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

An approach is provided for customizing output settings for media playback. An equalizer or media player determines one or more segments of a media track and associates the one or more segments with one or more corresponding output settings. Playback of the media track at a device is then based, at least in part, on the one or more output settings.

Description

METHOD AND APPARATUS FOR DETERMINING AND EQUALIZING ONE OR MORE SEGMENTS OF A MEDIA TRACK
BACKGROUND
[0001] Wireless (e.g., cellular) service providers and device manufacturers are continually challenged to deliver value and convenience to consumers by, for example, providing compelling services and features. One area of interest has been in the development of services, applications, and the like for facilitating media playback. By way of example, consumers typically utilize various electronic devices and media players to playback media such as audio and video tracks. Each of these devices and/or players may have different output characteristics (e.g., audio or video quality). Because of these differences, many consumers often customize or adjust output settings during media playback to optimize media reproduction. Traditionally, conventional media players and/or devices allow at least some user control over playback settings using, for instance, an equalizer. However, conventional equalizer settings and other playback controls generally apply to the entirety of a media item (e.g., an entire music track, an entire video file). This coarse level of control over playback settings can limit the ability of consumers to customize media playback. As a result, service providers and device manufacturers face significant technical challenges in enabling greater control over media playback by both consumers and content creators/providers (e.g., artists, publishers, etc.).
SOME EXAMPLE EMBODIMENTS
[0002] Therefore, there is a need for an approach for efficiently customizing output settings for media playback.
[0003] According to one embodiment, a method comprises determining one or more segments of a media track. The method further comprises associating the one or more segments with one or more corresponding output settings. The method further comprises causing, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
[0004] According to another embodiment, an apparatus comprises at least one processor, and at least one memory including computer program code for one or more programs, the at least one memory and the computer program code configured to, with the at least one processor, cause, at least in part, the apparatus to determine one or more segments of a media track. The apparatus is also caused, at least in part, to associate the one or more segments with one or more corresponding output settings. The apparatus is also caused, at least in part, to cause, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
[0005] According to another embodiment, a computer-readable storage medium carrying one or more sequences of one or more instructions which, when executed by one or more processors, cause, at least in part, an apparatus to determine one or more segments of a media track. The apparatus is also caused, at least in part, to associate the one or more segments with one or more corresponding output settings. The apparatus is also caused, at least in part, to cause, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
[0006] According to another embodiment, an apparatus comprises means for determining one or more segments of a media track. The apparatus further comprises means for causing, at least in part, associating the one or more segments with one or more corresponding output settings. The apparatus further comprises means for causing, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
[0007] In addition, for various example embodiments of the invention, the following is applicable: a method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on (including derived at least in part from) any one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0008] For various example embodiments of the invention, the following is also applicable: a method comprising facilitating access to at least one interface configured to allow access to at least one service, the at least one service configured to perform any one or any combination of network or service provider methods (or processes) disclosed in this application.
[0009] For various example embodiments of the invention, the following is also applicable: a method comprising facilitating creating and/or facilitating modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based, at least in part, on data and/or information resulting from one or any combination of methods or processes disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0010] For various example embodiments of the invention, the following is also applicable: a method comprising creating and/or modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based at least in part on data and/or information resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0011] In various example embodiments, the methods (or processes) can be accomplished on the service provider side or on the mobile device side or in any shared way between service provider and mobile device with actions being performed on both sides.
[0012] For various example embodiments, the following is applicable: An apparatus comprising means for performing the method of any of originally filed claims 1- 14 and 27-29.
[0013] Still other aspects, features, and advantages of the invention are readily apparent from the following detailed description, simply by illustrating a number of particular embodiments and implementations, including the best mode contemplated for carrying out the invention. The invention is also capable of other and different
embodiments, and its several details can be modified in various obvious respects, all without departing from the spirit and scope of the invention. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings:
[0015] FIG. 1 is a diagram of a system capable of customizing media output settings on a user device, according to one embodiment;
[0016] FIG. 2A is a diagram of a system for determining one or more output settings for one or more segments of one or more media tracks, according to one embodiment;
[0017] FIG. 2B is a diagram showing an example of a segmented media track and corresponding output settings for each segment, according to one embodiment;
[0018] FIG. 3 is a diagram of components of a media player capable of determining and/or utilizing media output settings for media playback, according to one embodiment;
[0019] FIG. 4 is a flowchart of a process for determining one or more segments of a media track, determining one or more output settings and associating the one or more segments with the one or more output settings, according to one embodiment;
[0020] FIG. 5 is a flowchart of a process for associating one or more segments of a media track with one or more output setting metadata, according to one embodiment;
[0021] FIG. 6 is a flowchart of a process for encoding one or more segments of a media track with one or more corresponding output settings, and FIG. 7 is a flowchart of a process for retrieving one or more output setting metadata, according to one embodiment;
[0022] FIG. 8 is a flowchart of a process for determining output capabilities of a device and applying one or more output setting metadata, according to one embodiment;
[0023] FIG. 9 is diagram of a user interface for determining one or more segments of one or more media tracks and/or specifying one or more corresponding output settings, according to one embodiment;
[0024] FIG. 10 is a diagram of hardware that can be used to implement an embodiment of the invention;
[0025] FIG. 11 is a diagram of a chip set that can be used to implement an embodiment of the invention; and
[0026] FIG. 12 is a diagram of a mobile terminal (e.g., handset) that can be used to implement an embodiment of the invention.
DESCRIPTION OF SOME EMBODIMENTS
[0027] Examples of a method, apparatus, and computer program for customizing output settings for media playback are disclosed. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It is apparent, however, to one skilled in the art that the embodiments of the invention may be practiced without these specific details or with an equivalent arrangement. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.
[0028] Although several embodiments of the invention are discussed with respect to audio media content, it is recognized that the embodiments of the inventions have applicability to any type of content rendering, e.g., music or video playback or streaming, games playing, image or map displaying, radio or television content broadcasting or streaming, involving any device, e.g., wired and wireless local device or both local and remote wired or wireless devices, capable of rendering audio media content.
[0029] As used herein, content or media includes, for example, digital sound, songs, digital images, digital games, digital maps, point of interest information, digital videos, such as music videos, news clips and theatrical videos, advertisements, program files or objects, any other digital media or content, or any combination thereof. The term rendering indicates, for instance, any method for presenting the content at a device, including playing music through speakers, displaying images on a screen or in a projection or on tangible media such as photographic or plain paper, showing videos on a suitable display device with sound, graphing game or map data, or any other term of art for presentation, or any combination thereof. In many illustrated embodiments, a player is an example of a rendering module. A play list is information about content rendered on one or more players in response to input by a user, and is associated with that user. A play history is information about the time sequence of content rendered on one or more players in response to input by a user, and is associated with that user.
[0030] FIG. 1 is a diagram of a system capable of customizing media output settings on a user device, according to one embodiment. As discussed above, consumers often use, for instance, an equalizer to customize the sound output settings for playing back media tracks, particularly when using devices and/or players that have different output characteristics. However, traditionally, media output settings apply to an entire media track (e.g., a musical track in a musical album) at a time. In other words, the same equalizer setting is specified for the entire track and is not applied to the track at a finer granularity (e.g., to segments or portions of the track). Under this traditional approach, the content creator, content provider and/or content user is not able to fine-tune, encode and associate a more granular output setting scheme such as one or more output settings for one or more segments of a media track. This shortcoming may make the media output less than optimal for the user.
[0031] To address the problems described above, the system 100 of FIG. 1 introduces the capability to specify media output settings (e.g., equalizer settings) for individual segments of a media track or file, and then to playback the media track according to those settings. More specifically, the system 100 provides, for instance, at least the following capabilities: (1) to allow a content creator, a content producer, a content provider and/or a content user to identify one or more segments of a media track, encode and associate one or more output settings with the one or more segments of the media track; and (2) for a media player to utilize the one or more output settings when rendering the output.
[0032] In one embodiment, a content creator/provider can divide a media track into any number of segments and then assign individual output settings to each segment. Further, an equalizer (e.g., included in a media player or provided as a standalone module) can then specify or direct the output of the media track according to the settings. By way of example, the equalizer can be implemented by using passive and/or active electronic elements or by using digital algorithms for altering one or more output settings such as amplitude, frequency, phase, time delay, and/or the like. In some embodiments, preset output parameters may be specified based on media genre such as classical music, pop music, movie videos, sports videos, etc. For example, for a music track that includes a medley of many different styles, the content creator can specify classical equalizer settings for the introduction, a pop equalizer setting for a first verse, a rock equalizer setting for a second verse, etc.
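By way of a non-limiting illustration of the per-segment assignment described above, the following Python sketch models a medley track whose sections carry different genre presets; the names EqualizerSettings, Segment, and GENRE_PRESETS, as well as the preset values and timings, are assumptions made only for this sketch and are not part of the described system.

from dataclasses import dataclass, field
from typing import Dict

@dataclass
class EqualizerSettings:
    # Mapping of centre frequency in Hz to gain in dB.
    gains_db: Dict[int, float] = field(default_factory=dict)

@dataclass
class Segment:
    start_ms: int                 # start time of the segment within the track
    length_ms: int                # duration of the segment
    settings: EqualizerSettings   # output settings applied while this segment plays

# Hypothetical genre presets a content creator might assign per segment.
GENRE_PRESETS = {
    "classical": EqualizerSettings({60: -2.0, 1000: 0.0, 8000: 1.0}),
    "pop":       EqualizerSettings({60: 3.0, 1000: 1.0, 8000: 2.0}),
    "rock":      EqualizerSettings({60: 4.0, 1000: -1.0, 8000: 3.0}),
}

# A medley track segmented with a different preset per section.
medley = [
    Segment(0,      30_000, GENRE_PRESETS["classical"]),  # introduction
    Segment(30_000, 60_000, GENRE_PRESETS["pop"]),        # first verse
    Segment(90_000, 45_000, GENRE_PRESETS["rock"]),       # second verse
]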
[0033] In some embodiments, the individual output settings may be associated with corresponding segments of the media track as metadata. For example, the metadata may be encoded or otherwise embedded in the media track file. In addition or
alternatively, the output settings may be associated with the corresponding segments in a lookup table or other database. Accordingly, in another embodiment, during playback of media track, the system 100 can retrieve the output settings for the media track from either metadata, lookup table, or database, and then initiate playback based on the retrieved settings.
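As a non-limiting illustration of the lookup-table alternative mentioned above, the sketch below keeps per-segment output settings in a simple in-memory table keyed by a track identifier; the table and function names are assumptions, and a deployed system could equally use a database. The track identifier reuses the value from the metadata example later in this description.

from typing import Dict, List, Optional

# Keyed by a track identifier; each value is a list of per-segment settings records.
SETTINGS_TABLE: Dict[str, List[dict]] = {
    "174827412joakdahjkg9821974ba": [
        {"start": 0,    "length": 898,  "equalizer": {400: 8, 500: 0, 600: -8}},
        {"start": 1343, "length": 3898, "equalizer": {400: 3, 500: -2, 600: 5}},
    ],
}

def retrieve_output_settings(track_id: str) -> Optional[List[dict]]:
    # During playback the player first checks metadata embedded in the media
    # track file (not shown here) and otherwise falls back to this table.
    return SETTINGS_TABLE.get(track_id)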
[0034] In one embodiment, a user can specify segment output settings for a particular track. In some cases, the user specified-settings can then be used to override the settings specified by the content creator/provider or the user can select which (e.g., either the user-specified or creator-specified settings) to use. The user-specified settings can then be associated with the media track for use in subsequent playbacks. In some embodiments, the user-specified settings may be shared and/or retrieved from other users or devices. By way of example, the settings may define at least in part a set of frequencies and
corresponding amplitude covering the entire audible frequency spectrum. It is also contemplated that the settings may specify any other media characteristics (e.g., brightness, color, saturation, etc. for video) and their corresponding values.
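A minimal sketch of the override behaviour described above follows; the function name and the rule that user-specified settings take precedence when the user prefers them are assumptions of this sketch.

from typing import Dict, Optional

def effective_settings(creator: Dict[int, float],
                       user: Optional[Dict[int, float]],
                       prefer_user: bool = True) -> Dict[int, float]:
    # Use the user-specified settings for a segment when they exist and are
    # preferred; otherwise fall back to the creator-specified settings.
    if user is not None and prefer_user:
        return user
    return creator

# Example: the user flattens the creator's bass boost for one segment.
creator_eq = {60: 6.0, 1000: 0.0, 8000: 2.0}
user_eq    = {60: 0.0, 1000: 0.0, 8000: 2.0}
print(effective_settings(creator_eq, user_eq))  # the user's values win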
[0035] As shown in FIG. 1, the system 100 includes a service platform 101 with connectivity to a web server 103 and user equipment (UE) 105a-105b over the communication network 107. In one embodiment, the web server 103 provides access to one or more services of the service platform 101. In other embodiments, the UEs 105a-105b may also access the service platform 101 directly (e.g., using client applications). For the sake of simplicity, FIG. 1 depicts only two UEs (e.g., UEs 105a-105b) in the system 100. However, it is contemplated that the system may support any number of UEs 105 up to the maximum capacity of the communication network 107. In one embodiment, the network capacity may be determined based on available bandwidth, available connection points, and/or the like. The web server 103 further includes one or more web pages 109 including one or more content portals 111 to facilitate automatic and efficient sharing of content such as different types of media and/or related services.
[0036] The content and any related segment output settings (if available), for example, are provided by one or more of the media services 113a-113n of the service platform 101. In one embodiment, the service platform 101 includes one or more services 113a-113n (e.g., music service, mapping service, video service, social network service, etc.), a user account manager 115, and a user account database 117. The services 113a-113n are connected directly or indirectly to network 107 and may be associated to share login credentials for granting user access. In another embodiment, one or more of the services 113a-113n are managed services provided by a service provider or operator of the network 107. The user account manager 115, for instance, manages the sharing of login credentials by tracking which of the services 113a-113n share login credentials and then linking the credentials to the user accounts created with the various services 113a-113n for a particular user. In one embodiment, the user account manager 115 can manage or provide access to premium or subscription-based content for playback at the UE 105. By way of example, the user account manager 115 may store the tracking information for the login credentials and the user account information in the user account database 117. In addition or alternatively, the user account database 117 can reside on one or more nodes connected directly or indirectly to one or more of the services 113a-113n. In other embodiments, user account database 117 resides on one or more nodes in network 107. More specifically, the user account database 117 includes one or more processes (not shown) and one or more data structures that store information about registered users of each of the services 113a-113n including login credentials and related information as well as data, configurations, user profiles, variables, conditions, and the like associated with using any of the services 113a-113n.
[0037] In one embodiment, the service platform 101 and the web server 103 can be implemented via shared, partially shared, or different computer hardware (e.g., the hardware described with respect to FIG. 10).
[0038] By way of example, the communication network 107 of the system 100 includes one or more networks such as a data network (not shown), a wireless network (not shown), a telephony network (not shown), or any combination thereof. It is contemplated that the data network may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), a public data network (e.g., the Internet), or any other suitable packet-switched network, such as a commercially owned, proprietary packet- switched network, e.g., a proprietary cable or fiber-optic network. In addition, the wireless network may be, for example, a cellular network and may employ various technologies including enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), satellite, mobile ad-hoc network (MANET), and the like.
[0039] The UEs 105 are any type of mobile terminal, fixed terminal, or portable terminal including a mobile handset, station, unit, device, multimedia tablet, multimedia computer, Internet node, communicator, desktop computer, laptop computer, Personal Digital Assistants (PDAs), or any combination thereof. It is also contemplated that the UE 105 can support any type of interface to the user (such as "wearable" circuitry, etc.). The UE 105 may also be equipped with one or more sensors (e.g., a global positioning satellite (GPS) sensor, accelerometer, light sensor, etc.) for use with the services 113a-l 13n.
[0040] By way of example, the UEs 105, the service platform 101, and the web server 103 communicate with each other and other components of the communication network 107 using well known, new or still developing protocols. In this context, a protocol includes a set of rules defining how the network nodes within the communication network 107 interact with each other based on information sent over the communication links. The protocols are effective at different layers of operation within each node, from generating and receiving physical signals of various types, to selecting a link for transferring those signals, to the format of information indicated by those signals, to identifying which software application executing on a computer system sends or receives the information. The conceptually different layers of protocols for exchanging information over a network are described in the Open Systems Interconnection (OSI) Reference Model.
[0041] Communications between the network nodes are typically effected by exchanging discrete packets of data. Each packet typically comprises (1) header information associated with a particular protocol, and (2) payload information that follows the header information and contains information that may be processed independently of that particular protocol. In some protocols, the packet includes (3) trailer information following the payload and indicating the end of the payload information. The header includes information such as the source of the packet, its destination, the length of the payload, and other properties used by the protocol. Often, the data in the payload for the particular protocol includes a header and payload for a different protocol associated with a different, higher layer of the OSI Reference Model. The header for a particular protocol typically indicates a type for the next protocol contained in its payload. The higher layer protocol is said to be encapsulated in the lower layer protocol. The headers included in a packet traversing multiple heterogeneous networks, such as the Internet, typically include a physical (layer 1) header, a data-link (layer 2) header, an internetwork (layer 3) header and a transport (layer 4) header, and various application headers (layer 5, layer 6 and layer 7) as defined by the OSI Reference Model.
[0042] In one embodiment, the media portal 111 and the corresponding service 113 interact according to a client-server model. It is noted that the client-server model of computer process interaction is widely known and used. According to the client-server model, a client process sends a message including a request to a server process, and the server process responds by providing a service. The server process may also return a message with a response to the client process. Often the client process and server process execute on different computer devices, called hosts, and communicate via a network using one or more protocols for network communications. The term "server" is conventionally used to refer to the process that provides the service, or the host computer on which the process operates. Similarly, the term "client" is conventionally used to refer to the process that makes the request, or the host computer on which the process operates. As used herein, the terms "client" and "server" refer to the processes, rather than the host computers, unless otherwise clear from the context. In addition, the process performed by a server can be broken up to run as multiple processes on multiple hosts (sometimes called tiers) for reasons that include reliability, scalability, and redundancy, among others.
[0043] FIG. 2A is a diagram of a system 200 for determining one or more output settings for one or more segments of one or more media tracks, according to one embodiment. In one embodiment, the system 200 enables a user (e.g., an end-user or a content creator/provider) to determine one or more segments of a media track 201, encode one or more output settings at an equalizer 203, and associate the one or more output settings with the one or more segments of the media track 205. In one embodiment, the equalizer 203 is a component of the media player 123, the UE 105, or a combination thereof. In addition or alternatively, the equalizer 203 may be a separate component of the system 100 with connectivity to the UE 105 and the media player 123 over the
communication network 107. In another embodiment, the output settings metadata 207 is available without having been associated with a media track. In another embodiment, one or more creators of the media track provide the encoding and the association of the output settings. In another embodiment, one or more users provide the encoding and the association of the output settings. In another embodiment, a content provider and/or a service provider provide the encoding and the association of the output settings. In one embodiment, the output settings are in the form of a metadata file, which specifies one or more parameters as output settings. In one embodiment, the metadata file is stored in one or more databases and can be accessed by one or more users. In one example, the following is metadata specifying one or more output setting parameters that can be utilized by a media player.
<schema>
  <track id="174827412joakdahjkg9821974ba" comment="unique identifier">
    <segment id="03843uil" start="00000" length="00898">
      <equalizer>
        <frequency value="400" db="8"/>
        <frequency value="500" db="0"/>
        <frequency value="600" db="-8"/>
      </equalizer>
    </segment>
    <segment id="030797" start="01343" length="03898">
      <equalizer>
        <frequency value="400" db="3"/>
        <frequency value="500" db="-2"/>
        <frequency value="600" db="5"/>
      </equalizer>
    </segment>
  </track>
</schema>
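By way of illustration only, metadata of the shape shown above could be read with the following Python sketch using only the standard library; the function name parse_track_metadata and the returned structure are assumptions of this sketch.

import xml.etree.ElementTree as ET

def parse_track_metadata(xml_text: str):
    root = ET.fromstring(xml_text)            # the <schema> element
    track = root.find("track")
    segments = []
    for seg in track.findall("segment"):
        bands = {
            int(f.attrib["value"]): float(f.attrib["db"])
            for f in seg.find("equalizer").findall("frequency")
        }
        segments.append({
            "id": seg.attrib["id"],
            "start": int(seg.attrib["start"]),
            "length": int(seg.attrib["length"]),
            "equalizer": bands,
        })
    return track.attrib["id"], segments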
[0044] FIG. 2B is a diagram showing an example of a segmented media track and corresponding output settings for each segment, according to one embodiment. In this example, a graphical representation shows segments 221a-221n, an amplitude and frequency graph of the segments, and the output settings 223a-223n corresponding to the segments 221a-221n, respectively. A media track can be segmented into any number of segments and the graph shows the boundaries of each segment in this example. Accordingly, as each segment 221a-221n is played back, the media player 123 retrieves and applies the corresponding output settings 223a-223n.
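As a non-limiting illustration of this per-segment playback behaviour, the following Python sketch resolves the output settings for the current playback position; the segment representation follows the parsing sketch above, and the function name and the bisect-based lookup are assumptions of this sketch.

import bisect

def active_settings(segments, position_ms):
    # 'segments' is a list of segment records sorted by their "start" time
    # (in milliseconds), as produced by the parsing sketch above.
    starts = [s["start"] for s in segments]
    i = bisect.bisect_right(starts, position_ms) - 1
    if i < 0:
        return None                                 # before the first segment
    seg = segments[i]
    if position_ms < seg["start"] + seg["length"]:
        return seg["equalizer"]                     # position falls inside this segment
    return None                                     # position falls in a gap between segments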
[0045] FIG. 3 is a diagram of components of a media player capable of determining and/or utilizing media output settings for media playback, according to one embodiment. The media player 123 is capable of playing back different types of media and/or content such as audio, video, photos and/or the like. The media can be stored in an analogue format on an analogue medium, such as analogue storage tape, or in a digital format on different storage devices such as a portable disc, a compact disc (CD), a digital video disc (DVD), a memory unit, a hard drive, a digital tape and/or the like. The media player can be a substantially standalone unit; such as a tape player, a CD player, a DVD player, a video player, a memory-based player (such as a MP3 player); and/or can be substantially included in another device such as a computer, a mobile phone, an electronic notebook, an electronic reader and/or the like.
[0046] By way of example, the media player 123 includes one or more components for obtaining, creating, determining and/or utilizing one or more output settings for reproducing media content. It is contemplated that the functions of these components may be combined in one or more components or performed by other components of equivalent functionality. In this embodiment, the media player 123 includes a media interface 301, an applications module 303, a runtime module 305, a user interface 307, a user profile 309, a storage device 311, a communication interface 313 and an output module 315. Also, output devices 317 and 319 are shown in the system 300.
[0047] In one embodiment, the communication interface 313 can be used to communicate with one or more content providers, service providers, other user devices and/or the like. Certain communications can be via methods such as an internet protocol, messaging, or any other communication method. In some examples, one or more users can send a query or a request to obtain content and/or output settings from one or more online services and/or from one or more users via the communication interface 313.
[0048] The media interface 301 can accept one or more media inputs; such as an audio tape, a video tape, a compact disc, a memory card, a memory stick and/or the like; and can cause to store one or more content from the one or more media onto the storage device 311. The one or more content are then executed on the runtime module 305. For example, a musical track is obtained from a compact disc and executed on the runtime module 305, which then begins other processes in the media player 123.
[0049] The applications module 303 includes one or more applications such as a digital equalizer, an internet browser, a media editing application and/or the like. In one embodiment, one or more users of the UE 105 are able to edit a media track to, at least, determine one or more segments for the media track, encode and associate one or more output settings with the one or more segments. In another embodiment, the one or more users of the UE 105 are able to search for and obtain one or more output setting records for one or more media tracks and then associate the one or more output settings with the one or more segments of the one or more media tracks. In another embodiment, the one or more users of the UE 105 can cause to share one or more output settings with one or more other users of one or more UE 105.
[0050] In another embodiment one or more of the obtained, the created and/or the edited output settings are caused to be stored onto the storage device 311 and/or in the user profile 309. In yet another embodiment, the output settings may be stored in another storage device or component (not shown) (e.g., a cloud computing server) accessible over the communication network 107.
[0051] The user interface (UI) 307 can include various methods of communication.
For example, the UI 307 can have outputs including a visual component (e.g., a screen), an audio component, a physical component (e.g., vibrations), and other methods of communication. User inputs can include a touch-screen interface, voice input, gesture based interface, a scroll-and-click interface, a button interface, a microphone, biometric based inputs (for example human electronic brain signals, eye parameters, etc.) and/or the like. Moreover, the UI 307 may be used to prompt the user to enter local credentials (e.g., a PIN code, biometric sensor input, etc.) and receive local credentials from the user. The communication can be via a wired or a wireless interface. The UI 307 can be located on one device, multiple devices, at one location, at multiple locations and can be shared by one or more users.
[0052] In another embodiment, the output module 315 utilizes the one or more output settings associated with the one or more segments of the media track to cause output to the output device 317 and/or the output device 319. In another embodiment, an output device, such as 317, is substantially included in the media player 123.
[0053] FIG. 4 is a flowchart of a process for determining one or more segments of a media track, determining one or more output settings and associating the one or more segments with the one or more output settings, according to one embodiment. In one embodiment, the equalizer 203 performs the process 400 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11. At step 401, one or more segments of a media track, such as an audio track, are determined. By way of example, the determination may include specifying start/stop times for each segment within the media track. This specification may be made visually (e.g., sliders for the start/stop times of the segments), explicitly (e.g., by direct input of the start/stop times), or a combination thereof.
[0054] In one embodiment, the one or more creators of the media track can determine how a media track should be segmented and/or perform the segmentation. In other embodiments, it is contemplated that one or more users, one or more service providers and/or one or more devices can determine and/or perform the segmentation.
[0055] At step 403, one or more output settings for the one or more segments are determined. In one embodiment, the one or more creators of the media track may determine one or more specific output settings for the one or more segments, which can cause an optimal rendering of the output. In another embodiment, the one or more users can determine the one or more output settings for the one or more segments. In another embodiment, one or more content providers and/or one or more service providers provide the one or more output settings as a service for free and/or for a fee. In another
embodiment, the one or more output settings are shared via one or more internet sites, such as one or more social network sites, and/or via one or more artist network sites. At step 405, the one or more segments are associated with the one or more corresponding output settings. For example, it is contemplated that each segment may have one or more alternate output settings from which the user can select. In one embodiment, one or more applications and/or one or more modules in the media player 123 cause the association of the one or more segments with the one or more corresponding output settings. In another embodiment, one or more media tracks are already associated with one or more corresponding output settings. In another embodiment, the one or more output setting metadata for the one or more segments of the one or more media tracks are available as a metadata record.
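A minimal sketch of steps 401 to 405, assuming segments are specified as explicit start/stop times in milliseconds, is given below; the function name and the validation rules are assumptions of this sketch.

def associate_segments(track_length_ms, boundaries_ms, settings_per_segment):
    # boundaries_ms: list of (start, stop) pairs entered via sliders or direct
    # input (step 401); settings_per_segment: one settings record per segment
    # (step 403). The return value is the association of step 405.
    if len(boundaries_ms) != len(settings_per_segment):
        raise ValueError("one settings record is expected per segment")
    associations = []
    for (start, stop), settings in zip(boundaries_ms, settings_per_segment):
        if not (0 <= start < stop <= track_length_ms):
            raise ValueError("invalid segment boundaries: %d-%d" % (start, stop))
        associations.append({"start": start,
                             "length": stop - start,
                             "equalizer": settings})
    return associations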
[0056] FIG. 5 is a flowchart of process 500 for associating one or more segments of a media track with one or more output setting metadata, according to one embodiment. In one embodiment, the media player 123 performs the process 500 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11. The process 500 assumes that the media player is to playback a media track such as a music track. In one embodiment, the user of the UE 105 downloads one or more media tracks from one or more content providers. In another embodiment, the user of the UE 105 inserts a CD into the media interface 301. At step 501, the media player determines if one or more output settings are included with the media track. In one embodiment, the media track is in the form of a digital file such as in MP3 format. At step 503, if one or more output settings are included with the media track, then at step 505, one or more segments of the media track are determined. In one embodiment, the media player applications module 303 determines the one or more segments by analyzing the one or more settings metadata record.
[0057] At step 507, the one or more segments of the media track are associated with the one or more corresponding output settings. In one embodiment, the one or more segments are associated with the one or more output settings. At step 509, the media player 123 causes, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings. In one embodiment, the one or more output settings are utilized as provided. In another embodiment, the one or more users further edit the one or more output setting metadata. However, at step 503, if it was determined that output settings are not included with the media track, then at step 511, the media player causes a determination whether one or more output settings are available for the media track of step 501. At step 513, if one or more output settings are available, then at step 515, the output setting metadata are retrieved and the process proceeds to step 505; otherwise, the process proceeds to step 517 where the media player causes playback of the media track. In one embodiment, at step 517, the user of UE 105 is given an option of creating one or more output settings before the playback. In another embodiment, one or more output settings from the user profile 309 and/or the storage unit 311 are utilized for the playback at step 517.
[0058] FIG. 6 is a flowchart of process 600 for encoding one or more segments of a media track with one or more corresponding output setting metadata, according to one embodiment. In one embodiment, the media player 123 and/or the equalizer 203 perform the process 600 and are implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11. At step 601, one or more segments of a media track are encoded with association of one or more corresponding output settings specifying at least a start time, a length, and the one or more corresponding output settings for the one or more segments. At step 603, an artist, a producer, a service provider, one or more users, and/or
substantially a device perform the encoding and the association.
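As a non-limiting illustration of step 601, the sketch below writes a start time, a length, and the corresponding settings for each segment into metadata of the shape shown in the earlier example; the function name and the zero-padded formatting are assumptions of this sketch.

import xml.etree.ElementTree as ET

def encode_track_metadata(track_id, associations):
    schema = ET.Element("schema")
    track = ET.SubElement(schema, "track", id=track_id)
    for n, assoc in enumerate(associations):
        seg = ET.SubElement(track, "segment", id=str(n),
                            start="%05d" % assoc["start"],
                            length="%05d" % assoc["length"])
        eq = ET.SubElement(seg, "equalizer")
        for freq, db in sorted(assoc["equalizer"].items()):
            ET.SubElement(eq, "frequency", value=str(freq), db=str(db))
    return ET.tostring(schema, encoding="unicode")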
[0059] FIG. 7 is a flowchart of process 700 for retrieving one or more output setting metadata, according to one embodiment. In one embodiment, the media player 123 and/or the equalizer 203 perform the process 700 and are implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11. At step 701, one or more corresponding output settings are retrieved from one or more service providers, one or more content providers, one or more social network sites, one or more devices, one or more users and/or a combination thereof. In one embodiment, the user of UE 105 downloads a metadata file from a social network site, which contains one or more output settings for a media track on the user device. In another embodiment, the user downloads a media track from another user device and then downloads an output settings metadata file for the media track from the same and/or another user device. In another embodiment, the user downloads a media track from a content provider, which already contains output setting metadata. At step 703, a fingerprint associated with a media track is determined and the retrieving of the one or more corresponding output setting metadata is based, at least in part, on the fingerprint. In one embodiment, a fingerprint identifies, at least in part, information about a media track, about one or more media tracks, about an album, about the creator of the media track and/or the like.
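A minimal sketch of fingerprint-based retrieval follows, assuming a plain content hash stands in for a real acoustic fingerprint and an in-memory store stands in for a remote provider or social network site; all names here are assumptions of this sketch.

import hashlib

def fingerprint(track_path: str) -> str:
    # A simple content hash; a production system would more likely use an
    # acoustic fingerprint that survives re-encoding of the media track.
    with open(track_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical store of shared output setting metadata keyed by fingerprint.
SHARED_SETTINGS = {}

def retrieve_by_fingerprint(track_path: str):
    return SHARED_SETTINGS.get(fingerprint(track_path))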
[0060] FIG. 8 is a flowchart of process 800 for determining output capabilities of a device and applying one or more output setting metadata, according to one embodiment. In one embodiment, the media player 123 and/or the equalizer 203 perform the process 800 and are implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 11. At step 801, a media output capability of a device is determined and playback of a media track is further based, at least in part, on the media output capability. In one embodiment, the output capability of a device is determined from information identifying the type of the device and the type of one or more output devices being utilized by the device. For example, a media player can be utilizing one or more external speakers, one or more internal speakers and/or one or more headsets, all of which can be wired and/or wireless devices. In another embodiment, the media player can be utilizing an amplifier device. In another embodiment, the media player is paired with and/or connected to another media player device. At step 803, the one or more corresponding output settings include at least one of frequencies and corresponding amplitude, audio effects, video settings, video effects and/or the like. In one embodiment, one or more output settings specify one or more amplitude settings for one or more frequencies. In another embodiment, one or more output settings are specified for the audio track of a video media.
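As a non-limiting illustration of step 801, the sketch below adapts retrieved equalizer settings to a device's media output capability, modelled here simply as a reproducible frequency range and a maximum boost; this capability model and the function name are assumptions of this sketch.

def adapt_to_device(equalizer, min_hz=20, max_hz=20_000, max_boost_db=12.0):
    adapted = {}
    for freq, db in equalizer.items():
        if min_hz <= freq <= max_hz:               # drop bands the device cannot reproduce
            adapted[freq] = min(db, max_boost_db)  # clamp excessive boost
    return adapted

# Example: a small speaker that cannot reproduce very low frequencies.
print(adapt_to_device({60: 8.0, 400: 3.0, 8000: 2.0}, min_hz=100))
# -> {400: 3.0, 8000: 2.0}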
[0061] FIG. 9 is a diagram of a user interface 900 for determining one or more segments of one or more media tracks and/or specifying one or more corresponding output settings, according to one embodiment. In one embodiment, one or more users determine one or more segments of a media track by identifying a starting point and an ending point for one or more segments in UI 900 section 901. In another embodiment, the one or more segments are already determined in a metadata file and the one or more segment information are shown in section 901. In another embodiment, amplitude levels for one or more frequencies can be determined for the one or more segments in section 901. In another embodiment, the one or more users can determine and set a range of options in section 905, for example, a media name, a media track No. (number), reset the output settings to default settings of the media player and/or the device, enable or disable equalizer options, select an amplifier option, save current settings and/or the like. In another embodiment, a media track is loaded into the media player where the media track is already associated with one or more output settings, in which case the UI 900 displays the one or more output settings. In another embodiment, one or more users can utilize UI 900 to modify one or more output setting metadata for one or more media tracks.
[0062] The processes described herein for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output may be advantageously implemented via software, hardware (e.g., general processor, Digital Signal Processing (DSP) chip, an Application Specific Integrated Circuit (ASIC), Field Programmable Gate Arrays (FPGAs), etc.), firmware or a combination thereof. Such exemplary hardware for performing the described functions is detailed below.
[0063] FIG. 10 illustrates a computer system 1000 upon which an embodiment of the invention may be implemented. Although computer system 1000 is depicted with respect to a particular device or equipment, it is contemplated that other devices or equipment (e.g., network elements, servers, etc.) within FIG. 10 can deploy the illustrated hardware and components of system 1000. Computer system 1000 is programmed (e.g., via computer program code or instructions) to determine one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output as described herein and includes a communication mechanism such as a bus 1010 for passing information between other internal and external components of the computer system 1000. Information (also called data) is represented as a physical expression of a measurable phenomenon, typically electric voltages, but including, in other embodiments, such phenomena as magnetic, electromagnetic, pressure, chemical, biological, molecular, atomic, sub-atomic and quantum interactions. For example, north and south magnetic fields, or a zero and non-zero electric voltage, represent two states (0, 1) of a binary digit (bit). Other phenomena can represent digits of a higher base. A superposition of multiple simultaneous quantum states before measurement represents a quantum bit (qubit). A sequence of one or more digits constitutes digital data that is used to represent a number or code for a character. In some embodiments, information called analog data is represented by a near continuum of measurable values within a particular range. Computer system 1000, or a portion thereof, constitutes a means for performing one or more steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
[0064] A bus 1010 includes one or more parallel conductors of information so that information is transferred quickly among devices coupled to the bus 1010. One or more processors 1002 for processing information are coupled with the bus 1010.
[0065] A processor (or multiple processors) 1002 performs a set of operations on information as specified by computer program code related to determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output. The computer program code is a set of instructions or statements providing instructions for the operation of the processor and/or the computer system to perform specified functions. The code, for example, may be written in a computer programming language that is compiled into a native instruction set of the processor. The code may also be written directly using the native instruction set (e.g., machine language). The set of operations include bringing information in from the bus 1010 and placing information on the bus 1010. The set of operations also typically include comparing two or more units of information, shifting positions of units of information, and combining two or more units of information, such as by addition or multiplication or logical operations like OR, exclusive OR (XOR), and AND. Each operation of the set of operations that can be performed by the processor is represented to the processor by information called instructions, such as an operation code of one or more digits. A sequence of operations to be executed by the processor 1002, such as a sequence of operation codes, constitute processor instructions, also called computer system instructions or, simply, computer instructions. Processors may be implemented as mechanical, electrical, magnetic, optical, chemical or quantum components, among others, alone or in combination.
[0066] Computer system 1000 also includes a memory 1004 coupled to bus 1010.
The memory 1004, such as a random access memory (RAM) or other dynamic storage device, stores information including processor instructions for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output. Dynamic memory allows information stored therein to be changed by the computer system 1000. RAM allows a unit of information stored at a location called a memory address to be stored and retrieved independently of information at neighboring addresses. The memory 1004 is also used by the processor 1002 to store temporary values during execution of processor instructions. The computer system 1000 also includes a read only memory (ROM) 1006 or other static storage device coupled to the bus 1010 for storing static information, including instructions, that is not changed by the computer system 1000. Some memory is composed of volatile storage that loses the information stored thereon when power is lost. Also coupled to bus 1010 is a nonvolatile (persistent) storage device 1008, such as a magnetic disk, optical disk or flash card, for storing information, including instructions, that persists even when the computer system 1000 is turned off or otherwise loses power.
[0067] Information, including instructions for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output, is provided to the bus 1010 for use by the processor from an external input device 1012, such as a keyboard containing alphanumeric keys operated by a human user, or a sensor. A sensor detects conditions in its vicinity and transforms those detections into physical expression compatible with the measurable phenomenon used to represent information in computer system 1000. Other external devices coupled to bus 1010, used primarily for interacting with humans, include a display device 1014, such as a cathode ray tube (CRT) or a liquid crystal display (LCD), or plasma screen or printer for presenting text or images, and a pointing device 1016, such as a mouse or a trackball or cursor direction keys, or motion sensor, for controlling a position of a small cursor image presented on the display 1014 and issuing commands associated with graphical elements presented on the display 1014. In some embodiments, for example, in embodiments in which the computer system 1000 performs all functions automatically without human input, one or more of external input device 1012, display device 1014 and pointing device 1016 is omitted.
[0068] In the illustrated embodiment, special purpose hardware, such as an application specific integrated circuit (ASIC) 1020, is coupled to bus 1010. The special purpose hardware is configured to perform operations not performed by processor 1002 quickly enough for special purposes. Examples of application specific ICs include graphics accelerator cards for generating images for display 1014, cryptographic boards for encrypting and decrypting messages sent over a network, speech recognition, and interfaces to special external devices, such as robotic arms and medical scanning equipment that repeatedly perform some complex sequence of operations that are more efficiently implemented in hardware.
[0069] Computer system 1000 also includes one or more instances of a communications interface 1070 coupled to bus 1010. Communication interface 1070 provides a one-way or two-way communication coupling to a variety of external devices that operate with their own processors, such as printers, scanners and external disks. In general the coupling is with a network link 1078 that is connected to a local network 1080 to which a variety of external devices with their own processors are connected. For example, communication interface 1070 may be a parallel port or a serial port or a universal serial bus (USB) port on a personal computer. In some embodiments, communications interface 1070 is an integrated services digital network (ISDN) card or a digital subscriber line (DSL) card or a telephone modem that provides an information communication connection to a corresponding type of telephone line. In some
embodiments, a communication interface 1070 is a cable modem that converts signals on bus 1010 into signals for a communication connection over a coaxial cable or into optical signals for a communication connection over a fiber optic cable. As another example, communications interface 1070 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, such as Ethernet. Wireless links may also be implemented. For wireless links, the communications interface 1070 sends or receives or both sends and receives electrical, acoustic or electromagnetic signals, including infrared and optical signals, that carry information streams, such as digital data. For example, in wireless handheld devices, such as mobile telephones like cell phones, the communications interface 1070 includes a radio band electromagnetic transmitter and receiver called a radio transceiver. In certain embodiments, the communications interface 1070 enables connection to the communication network 105 for the UE 101.
[0070] The term "computer-readable medium" as used herein refers to any medium that participates in providing information to processor 1002, including instructions for execution. Such a medium may take many forms, including, but not limited to, computer-readable storage medium (e.g., non-volatile media, volatile media), and transmission media. Non-transitory media, such as non-volatile media, include, for example, optical or magnetic disks, such as storage device 1008. Volatile media include, for example, dynamic memory 1004. Transmission media include, for example, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves. Signals include man-made transient variations in amplitude, frequency, phase, polarization or other physical properties transmitted through the transmission media. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, CDRW, DVD, any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read. The term computer-readable storage medium is used herein to refer to any computer-readable medium except transmission media.
[0071] Logic encoded in one or more tangible media includes one or both of processor instructions on a computer-readable storage media and special purpose hardware, such as ASIC 1020.
[0072] Network link 1078 typically provides information communication using transmission media through one or more networks to other devices that use or process the information. For example, network link 1078 may provide a connection through local network 1080 to a host computer 1082 or to equipment 1084 operated by an Internet
Service Provider (ISP). ISP equipment 1084 in turn provides data communication services through the public, world-wide packet-switching communication network of networks now commonly referred to as the Internet 1090.
[0073] A computer called a server host 1092 connected to the Internet hosts a process that provides a service in response to information received over the Internet. For example, server host 1092 hosts a process that provides information representing video data for presentation at display 1014. It is contemplated that the components of system 1000 can be deployed in various configurations within other computer systems, e.g., host 1082 and server 1092.
[0074] At least some embodiments of the invention are related to the use of computer system 1000 for implementing some or all of the techniques described herein. According to one embodiment of the invention, those techniques are performed by computer system 1000 in response to processor 1002 executing one or more sequences of one or more processor instructions contained in memory 1004. Such instructions, also called computer instructions, software and program code, may be read into memory 1004 from another computer-readable medium such as storage device 1008 or network link 1078. Execution of the sequences of instructions contained in memory 1004 causes processor 1002 to perform one or more of the method steps described herein. In alternative embodiments, hardware, such as ASIC 1020, may be used in place of or in combination with software to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware and software, unless otherwise explicitly stated herein.
[0075] The signals transmitted over network link 1078 and other networks through communications interface 1070 carry information to and from computer system 1000. Computer system 1000 can send and receive information, including program code, through the networks 1080, 1090 among others, through network link 1078 and
communications interface 1070. In an example using the Internet 1090, a server host 1092 transmits program code for a particular application, requested by a message sent from computer 1000, through Internet 1090, ISP equipment 1084, local network 1080 and communications interface 1070. The received code may be executed by processor 1002 as it is received, or may be stored in memory 1004 or in storage device 1008 or other non-volatile storage for later execution, or both. In this manner, computer system 1000 may obtain application program code in the form of signals on a carrier wave.
[0076] Various forms of computer readable media may be involved in carrying one or more sequences of instructions or data or both to processor 1002 for execution. For example, instructions and data may initially be carried on a magnetic disk of a remote computer such as host 1082. The remote computer loads the instructions and data into its dynamic memory and sends the instructions and data over a telephone line using a modem. A modem local to the computer system 1000 receives the instructions and data on a telephone line and uses an infra-red transmitter to convert the instructions and data to a signal on an infra-red carrier wave serving as the network link 1078. An infrared detector serving as communications interface 1070 receives the instructions and data carried in the infrared signal and places information representing the instructions and data onto bus 1010. Bus 1010 carries the information to memory 1004 from which processor 1002 retrieves and executes the instructions using some of the data sent with the instructions. The instructions and data received in memory 1004 may optionally be stored on storage device 1008, either before or after execution by the processor 1002.
[0077] FIG. 11 illustrates a chip set or chip 1100 upon which an embodiment of the invention may be implemented. Chip set 1100 is programmed to determine one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output as described herein and includes, for instance, the processor and memory components described with respect to FIG. 10 incorporated in one or more physical packages (e.g., chips). By way of example, a physical package includes an arrangement of one or more materials, components, and/or wires on a structural assembly (e.g., a baseboard) to provide one or more characteristics such as physical strength, conservation of size, and/or limitation of electrical interaction. It is contemplated that in certain embodiments the chip set 1100 can be implemented in a single chip. It is further contemplated that in certain embodiments the chip set or chip 1100 can be implemented as a single "system on a chip." It is further contemplated that in certain embodiments a separate ASIC would not be used, for example, and that all relevant functions as disclosed herein would be performed by a processor or processors. Chip set or chip 1100, or a portion thereof, constitutes a means for performing one or more steps of providing user interface navigation information associated with the availability of services. Chip set or chip 1100, or a portion thereof, constitutes a means for performing one or more steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output.
[0078] In one embodiment, the chip set or chip 1100 includes a communication mechanism such as a bus 1101 for passing information among the components of the chip set 1100. A processor 1103 has connectivity to the bus 1101 to execute instructions and process information stored in, for example, a memory 1105. The processor 1103 may include one or more processing cores with each core configured to perform independently. A multi-core processor enables multiprocessing within a single physical package.
Examples of a multi-core processor include two, four, eight, or greater numbers of processing cores. Alternatively or in addition, the processor 1103 may include one or more microprocessors configured in tandem via the bus 1101 to enable independent execution of instructions, pipelining, and multithreading. The processor 1103 may also be accompanied with one or more specialized components to perform certain processing functions and tasks such as one or more digital signal processors (DSP) 1107, or one or more application-specific integrated circuits (ASIC) 1109. A DSP 1107 typically is configured to process real-world signals (e.g., sound) in real time independently of the processor 1103.
Similarly, an ASIC 1109 can be configured to perform specialized functions not easily performed by a more general purpose processor. Other specialized components to aid in performing the inventive functions described herein may include one or more field programmable gate arrays (FPGA) (not shown), one or more controllers (not shown), or one or more other special-purpose computer chips.
[0079] In one embodiment, the chip set or chip 1100 includes merely one or more processors and some software and/or firmware supporting and/or relating to and/or for the one or more processors.
[0080] The processor 1103 and accompanying components have connectivity to the memory 1105 via the bus 1101. The memory 1105 includes both dynamic memory (e.g., RAM, magnetic disk, writable optical disk, etc.) and static memory (e.g., ROM, CD-ROM, etc.) for storing executable instructions that when executed perform the inventive steps described herein for determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output. The memory 1105 also stores the data associated with or generated by the execution of the inventive steps.
[0081] FIG. 12 is a diagram of exemplary components of a mobile terminal (e.g., handset) for communications, which is capable of operating in the system of FIG. 1, according to one embodiment. In some embodiments, mobile terminal 1200, or a portion thereof, constitutes a means for performing one or more steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output. Generally, a radio receiver is often defined in terms of front-end and back-end characteristics. The front-end of the receiver
encompasses all of the Radio Frequency (RF) circuitry whereas the back-end encompasses all of the base-band processing circuitry. As used in this application, the term "circuitry" refers to both: (1) hardware-only implementations (such as implementations in only analog and/or digital circuitry), and (2) to combinations of circuitry and software (and/or firmware) (such as, if applicable to the particular context, to a combination of processor(s), including digital signal processor(s), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions). This definition of "circuitry" applies to all uses of this term in this application, including in any claims. As a further example, as used in this application and if applicable to the particular context, the term "circuitry" would also cover an implementation of merely a processor (or multiple processors) and its (or their) accompanying software and/or firmware. The term
"circuitry" would also cover if applicable to the particular context, for example, a baseband integrated circuit or applications processor integrated circuit in a mobile phone or a similar integrated circuit in a cellular network device or other network devices.
[0082] Pertinent internal components of the telephone include a Main Control Unit (MCU) 1203, a Digital Signal Processor (DSP) 1205, and a receiver/transmitter unit including a microphone gain control unit and a speaker gain control unit. A main display unit 1207 provides a display to the user in support of various applications and mobile terminal functions that perform or support the steps of determining one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output. The display 1207 includes display circuitry configured to display at least a portion of a user interface of the mobile terminal (e.g., mobile telephone). Additionally, the display 1207 and display circuitry are configured to facilitate user control of at least some functions of the mobile terminal. An audio function circuitry 1209 includes a microphone 1211 and microphone amplifier that amplifies the speech signal output from the microphone 1211. The amplified speech signal output from the microphone 1211 is fed to a coder/decoder (CODEC) 1213.
[0083] A radio section 1215 amplifies power and converts frequency in order to communicate with a base station, which is included in a mobile communication system, via antenna 1217. The power amplifier (PA) 1219 and the transmitter/modulation circuitry are operationally responsive to the MCU 1203, with an output from the PA 1219 coupled to the duplexer 1221 or circulator or antenna switch, as known in the art. The PA 1219 also couples to a battery interface and power control unit 1220.
[0084] In use, a user of mobile terminal 1201 speaks into the microphone 1211 and his or her voice along with any detected background noise is converted into an analog voltage. The analog voltage is then converted into a digital signal through the Analog to Digital Converter (ADC) 1223. The control unit 1203 routes the digital signal into the DSP 1205 for processing therein, such as speech encoding, channel encoding, encrypting, and interleaving. In one embodiment, the processed voice signals are encoded, by units not separately shown, using a cellular transmission protocol such as global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile
telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), satellite, and the like.
[0085] The encoded signals are then routed to an equalizer 1225 for
compensation of any frequency-dependent impairments that occur during transmission through the air such as phase and amplitude distortion. After equalizing the bit stream, the modulator 1227 combines the signal with an RF signal generated in the RF interface 1229. The modulator 1227 generates a sine wave by way of frequency or phase modulation. In order to prepare the signal for transmission, an up-converter 1231 combines the sine wave output from the modulator 1227 with another sine wave generated by a synthesizer 1233 to achieve the desired frequency of transmission. The signal is then sent through a PA 1219 to increase the signal to an appropriate power level. In practical systems, the PA 1219 acts as a variable gain amplifier whose gain is controlled by the DSP 1205 from information received from a network base station. The signal is then filtered within the duplexer 1221 and optionally sent to an antenna coupler 1235 to match impedances to provide maximum power transfer. Finally, the signal is transmitted via antenna 1217 to a local base station. An automatic gain control (AGC) can be supplied to control the gain of the final stages of the receiver. The signals may be forwarded from there to a remote telephone which may be another cellular telephone, other mobile phone or a land-line connected to a Public Switched Telephone Network (PSTN), or other telephony networks.
[0086] Voice signals transmitted to the mobile terminal 1201 are received via antenna 1217 and immediately amplified by a low noise amplifier (LNA) 1237. A down-converter 1239 lowers the carrier frequency while the demodulator 1241 strips away the RF leaving only a digital bit stream. The signal then goes through the equalizer 1225 and is processed by the DSP 1205. A Digital to Analog Converter (DAC) 1243 converts the signal and the resulting output is transmitted to the user through the speaker 1245, all under control of a Main Control Unit (MCU) 1203, which can be implemented as a Central Processing Unit (CPU) (not shown).
[0087] The MCU 1203 receives various signals including input signals from the keyboard 1247. The keyboard 1247 and/or the MCU 1203 in combination with other user input components (e.g., the microphone 1211) comprise a user interface circuitry for managing user input. The MCU 1203 runs a user interface software to facilitate user control of at least some functions of the mobile terminal 1201 to determine one or more segments of a media track, encoding and associating one or more output settings with the one or more segments and a media player system capable of utilizing the one or more output settings for rendering an output. The MCU 1203 also delivers a display command and a switch command to the display 1207 and to the speech output switching controller, respectively. Further, the MCU 1203 exchanges information with the DSP 1205 and can access an optionally incorporated SIM card 1249 and a memory 1251. In addition, the MCU 1203 executes various control functions required of the terminal. The DSP 1205 may, depending upon the implementation, perform any of a variety of conventional digital processing functions on the voice signals. Additionally, DSP 1205 determines the background noise level of the local environment from the signals detected by microphone 1211 and sets the gain of microphone 1211 to a level selected to compensate for the natural tendency of the user of the mobile terminal 1201.
[0088] The CODEC 1213 includes the ADC 1223 and DAC 1243. The memory
1251 stores various data including call incoming tone data and is capable of storing other data including music data received via, e.g., the global Internet. The software module could reside in RAM memory, flash memory, registers, or any other form of writable storage medium known in the art. The memory device 1251 may be, but is not limited to, a single memory, CD, DVD, ROM, RAM, EEPROM, optical storage, or any other nonvolatile storage medium capable of storing digital data.
[0089] An optionally incorporated SIM card 1249 carries, for instance, important information, such as the cellular phone number, the carrier supplying service, subscription details, and security information. The SIM card 1249 serves primarily to identify the mobile terminal 1201 on a radio network. The card 1249 also contains a memory for storing a personal telephone number registry, text messages, and user specific mobile terminal settings.
[0090] While the invention has been described in connection with a number of embodiments and implementations, the invention is not so limited but covers various obvious modifications and equivalent arrangements, which fall within the purview of the appended claims. Although features of the invention are expressed in certain combinations among the claims, it is contemplated that these features can be arranged in any
combination and order.

Claims

WHAT IS CLAIMED IS:
1. A method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on the following:
at least one determination of one or more segments of a media track;
at least one association of the one or more segments with one or more
corresponding output settings; and
a playback of the media track at a device based, at least in part, on the one or more output settings.
2. A method of claim 1, wherein the (1) data and/or (2) information and/or (3) at least one signal are further based, at least in part, on the following:
an encoding of the at least one association of the one or more segments with the one or more corresponding output settings in metadata specifying at least a start time, a length, and the one or more corresponding output settings for the one or more segments.
3. A method according to claim 2, wherein the encoding is performed by an artist, a third party, a user, or a combination thereof.
4. A method according to any of claims 1-3, wherein the (1) data and/or (2) information and/or (3) at least one signal are further based, at least in part, on the following:
at least one retrieving of the one or more corresponding output settings from a service, a content provider, a social network, another device, or combination thereof.
5. A method according to any of claims 1-4, wherein the (1) data and/or (2) information and/or (3) at least one signal are further based, at least in part, on the following:
a determination of a fingerprint associated with the media track,
wherein the retrieving of the one or more corresponding output settings is based, at least in part, on the fingerprint.
6. A method according to any of claims 1-5, wherein the (1) data and/or (2) information and/or (3) at least one signal are further based, at least in part, on the following:
a determination of a media output capability of the device,
wherein the playback of the media track is further based, at least in part, on the media output capability.
7. A method according to any of claims 1-6, wherein the one or more corresponding output settings include at least one of frequencies and corresponding amplitude, audio effects, video settings, and video effects.
8. A method comprising:
determining one or more segments of a media track;
associating the one or more segments with one or more corresponding output settings; and
causing, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
9. A method of claim 8, further comprising:
encoding the association of the one or more segments with the one or more corresponding output settings in metadata specifying at least a start time, a length, and the one or more corresponding output settings for the one or more segments.
10. A method of claim 9, wherein the encoding is performed by an artist, a third party, a user, or a combination thereof.
11. A method according to any of claims 8-10, further comprising:
retrieving the one or more corresponding output settings from a service, a content provider, a social network, another device, or combination thereof.
12. A method according to any of claims 8-11, further comprising: determining a fingerprint associated with the media track,
wherein the retrieving of the one or more corresponding output settings is based, at least in part, on the fingerprint.
13. A method according to any of claims 8-12, further comprising:
determining a media output capability of the device,
wherein the playback of the media track is further based, at least in part, on the media output capability.
14. A method according to any of claims 8-13, wherein the one or more corresponding output settings include at least one of frequencies and corresponding amplitude, audio effects, video settings, and video effects.
15. An apparatus comprising:
at least one processor; and
at least one memory including computer program code,
the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following,
determine one or more segments of a media track;
associate the one or more segments with one or more corresponding output settings; and
cause, at least in part, playback of the media track at a device based, at least in part, on the one or more output settings.
16. An apparatus of claim 15, wherein the apparatus is further caused, at least in part, to:
encode the association of the one or more segments with the one or more corresponding output settings in metadata specifying at least a start time, a length, and the one or more corresponding output settings for the one or more segments.
17. An apparatus of claim 16, wherein the encoding is performed by an artist, a third party, a user, or a combination thereof.
18. An apparatus according to any of claims 15-17, wherein the apparatus is further caused, at least in part, to:
retrieve the one or more corresponding output settings from a service, a content provider, a social network, another device, or combination thereof.
19. An apparatus of claim 18, wherein the apparatus is further caused, at least in part, to:
determine a fingerprint associated with the media track,
wherein the retrieving of the one or more corresponding output settings is based, at least in part, on the fingerprint.
20. An apparatus according to any of claims 15-19, wherein the apparatus is further caused, at least in part, to:
determine a media output capability of the device,
wherein the playback of the media track is further based, at least in part, on the media output capability.
21. An apparatus according to any of claims 15-20, wherein the one or more corresponding output settings include at least one of frequencies and corresponding amplitude, audio effects, video settings, and video effects.
22. An apparatus according to any of claims 15-21, wherein the apparatus is a mobile phone further comprising:
user interface circuitry and user interface software configured to facilitate user control of at least some functions of the mobile phone through use of a display and configured to respond to user input; and
a display and display circuitry configured to display at least a portion of a user interface of the mobile phone, the display and display circuitry configured to facilitate user control of at least some functions of the mobile phone.
23. A computer-readable storage medium carrying one or more sequences of one or more instructions which, when executed by one or more processors, cause an apparatus to perform at least a method of any of claims 8-14.
24. An apparatus comprising means for performing a method of any of claims 8-14.
25. An apparatus of claim 24, wherein the apparatus is a mobile phone further comprising:
user interface circuitry and user interface software configured to facilitate user control of at least some functions of the mobile phone through use of a display and configured to respond to user input; and
a display and display circuitry configured to display at least a portion of a user interface of the mobile phone, the display and display circuitry configured to facilitate user control of at least some functions of the mobile phone.
26. A computer program including one or more sequences of one or more instructions which, when executed by one or more processors, cause an apparatus to at least perform the steps of a method of any of claims 8-14.
27. A method comprising facilitating access to at least one interface configured to allow access to at least one service, the at least one service configured to perform a method of any of claims 8-14.
28. A method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on the method of any of claims 8-14.
29. A method comprising facilitating creating and/or facilitating modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based, at least in part, on the method of any of claims 8-14.
EP11811885.0A 2010-07-30 2011-07-19 Method and apparatus for determining and equalizing one or more segments of a media track Withdrawn EP2599304A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US36930810P 2010-07-30 2010-07-30
PCT/FI2011/050663 WO2012013858A1 (en) 2010-07-30 2011-07-19 Method and apparatus for determining and equalizing one or more segments of a media track

Publications (2)

Publication Number Publication Date
EP2599304A1 true EP2599304A1 (en) 2013-06-05
EP2599304A4 EP2599304A4 (en) 2014-01-08

Family

ID=45529458

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11811885.0A Withdrawn EP2599304A4 (en) 2010-07-30 2011-07-19 Method and apparatus for determining and equalizing one or more segments of a media track

Country Status (3)

Country Link
EP (1) EP2599304A4 (en)
CN (1) CN103053157B (en)
WO (1) WO2012013858A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9654757B2 (en) * 2013-03-01 2017-05-16 Nokia Technologies Oy Method, apparatus, and computer program product for including device playback preferences in multimedia metadata
US9380383B2 (en) 2013-09-06 2016-06-28 Gracenote, Inc. Modifying playback of content using pre-processed profile information
CN104754178B (en) * 2013-12-31 2018-07-06 广州励丰文化科技股份有限公司 audio control method
CN104754241B (en) * 2013-12-31 2017-10-13 广州励丰文化科技股份有限公司 Panorama multi-channel audio control method based on variable domain acoustic image
US10516718B2 (en) * 2015-06-10 2019-12-24 Google Llc Platform for multiple device playout
US9832590B2 (en) 2015-09-12 2017-11-28 Dolby Laboratories Licensing Corporation Audio program playback calibration based on content creation environment
CN106937021B (en) * 2015-12-31 2019-12-13 上海励丰创意展示有限公司 performance integrated control method based on time axis multi-track playback technology
CN106937023B (en) * 2015-12-31 2019-12-13 上海励丰创意展示有限公司 multi-professional collaborative editing and control method for film, television and stage
CN106937022B (en) * 2015-12-31 2019-12-13 上海励丰创意展示有限公司 multi-professional collaborative editing and control method for audio, video, light and machinery
US11611605B2 (en) * 2016-10-21 2023-03-21 Microsoft Technology Licensing, Llc Dynamically modifying an execution environment for varying data

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999043111A1 (en) * 1998-02-23 1999-08-26 Personal Audio, Inc. System for distributing personalized audio programming
JP3835554B2 (en) 2003-09-09 2006-10-18 ソニー株式会社 FILE RECORDING DEVICE, FILE REPRODUCTION DEVICE, FILE RECORDING METHOD, FILE RECORDING METHOD PROGRAM, RECORDING MEDIUM RECORDING THE FILE RECORDING METHOD PROGRAM, FILE PLAYING METHOD, FILE PLAYING METHOD PROGRAM, AND RECORDING MEDIUM RECORDING THE PROGRAM
US20050132293A1 (en) 2003-12-10 2005-06-16 Magix Ag System and method of multimedia content editing
WO2006035438A1 (en) * 2004-09-28 2006-04-06 Dvtel Inc. Media player and method for operating a media player
CN101667814B (en) * 2009-09-24 2012-08-15 华为终端有限公司 Audio playing method and audio player

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040255340A1 (en) * 2000-03-28 2004-12-16 Gotuit Audio, Inc. Methods and apparatus for playing different programs to different listeners using a compact disk player
US20070154190A1 (en) * 2005-05-23 2007-07-05 Gilley Thomas S Content tracking for movie segment bookmarks
US20090079833A1 (en) * 2007-09-24 2009-03-26 International Business Machines Corporation Technique for allowing the modification of the audio characteristics of items appearing in an interactive video using rfid tags

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2012013858A1 *

Also Published As

Publication number Publication date
CN103053157B (en) 2017-05-24
EP2599304A4 (en) 2014-01-08
WO2012013858A1 (en) 2012-02-02
CN103053157A (en) 2013-04-17

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130117

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20131205

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 21/485 20110101ALI20131129BHEP

Ipc: G11B 27/00 20060101ALI20131129BHEP

Ipc: G11B 27/32 20060101ALI20131129BHEP

Ipc: H04N 21/439 20110101ALI20131129BHEP

Ipc: H04N 5/91 20060101AFI20131129BHEP

Ipc: H04N 9/82 20060101ALI20131129BHEP

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NOKIA CORPORATION

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NOKIA TECHNOLOGIES OY

17Q First examination report despatched

Effective date: 20170616

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180103