WO2019191291A1 - Method and apparatus for providing an application user interface for generating color-encoded music - Google Patents

Method and apparatus for providing an application user interface for generating color-encoded music

Info

Publication number
WO2019191291A1
Authority
WO
WIPO (PCT)
Prior art keywords
musical
user interface
interface element
representation
keyboard
Prior art date
Application number
PCT/US2019/024370
Other languages
French (fr)
Inventor
Zi Hao QIU
Original Assignee
Qiu Zi Hao
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qiu Zi Hao filed Critical Qiu Zi Hao
Publication of WO2019191291A1 publication Critical patent/WO2019191291A1/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/06 Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B15/00 Teaching music
    • G09B15/02 Boards or like means for providing an indication of notes

Definitions

  • a method comprises presenting a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition.
  • the method also comprises assigning one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note.
  • the method further comprises receiving, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys.
  • the method also comprises rendering a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
  • an apparatus comprises at least one processor, and at least one memory including computer program code for one or more computer programs, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to present a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition.
  • the apparatus is also caused to assign one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note.
  • the apparatus is further caused to receive, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys.
  • the apparatus is also caused to render a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
  • a computer-readable storage medium carries one or more sequences of one or more instructions which, when executed by one or more processors, present a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition.
  • the instructions also cause the one or more processors to assign one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note.
  • the instructions further cause the one or more processors to receive, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys.
  • the instructions also cause the one or more processors to render a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
  • an apparatus comprises means for presenting a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition.
  • the apparatus also comprises means for assigning one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note.
  • the apparatus further comprises means for receiving, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys.
  • the apparatus also comprises means for rendering a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
  • a method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on (or derived at least in part from) any one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
  • a method comprising facilitating access to at least one interface configured to allow access to at least one service, the at least one service configured to perform any one or any combination of network or service provider methods (or processes) disclosed in this application.
  • a method comprising facilitating creating and/or facilitating modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based, at least in part, on data and/or information resulting from one or any combination of methods or processes disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
  • a method comprising creating and/or modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based at least in part on data and/or information resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
  • the methods can be accomplished on the service provider side or on the mobile device side or in any shared way between service provider and mobile device with actions being performed on both sides.
  • An apparatus comprising means for performing a method of any of the claims.
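The claimed flow above (present a keyboard and a composition area, assign colors to keys, receive a key press, render the active slot) can be sketched in a few lines. This is an illustrative sketch only; the class names, the seven-color palette, and the slot-advance behavior are hypothetical and are not taken from the application text:

```python
# Hypothetical palette: each note name maps to an assumed hex color.
NOTE_COLORS = {"C4": "#FF0000", "D4": "#FF7F00", "E4": "#FFFF00",
               "F4": "#00FF00", "G4": "#0000FF", "A4": "#4B0082",
               "B4": "#8F00FF"}

class ColorKeyboard:
    """First user interface element: keys correspond to notes, notes to colors."""
    def __init__(self, note_colors):
        self.note_colors = dict(note_colors)

    def play(self, note):
        # Return the color assigned to the played key's note.
        return self.note_colors[note]

class CompositionArea:
    """Second user interface element: a row of slots filled with note colors."""
    def __init__(self, num_slots):
        self.slots = [None] * num_slots
        self.active = 0  # index of the active slot

    def render(self, color):
        # Color the active slot, then advance to the next one.
        self.slots[self.active] = color
        self.active += 1

kb = ColorKeyboard(NOTE_COLORS)
area = CompositionArea(8)
for note in ["C4", "C4", "G4", "G4"]:  # opening of "Twinkle, Twinkle"
    area.render(kb.play(note))
print(area.slots[:4])
```

Each key press colors the active slot, so the composition area accumulates a visual representation of the melody as it is played.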
  • FIG. 1 is a diagram of a system capable of composing one or more musical notes in a color-encoded musical system, according to one embodiment
  • FIG. 2 is a diagram of the components of the color processing platform 103, according to one example embodiment
  • FIG. 3 is a flowchart of a process for converting color data into one or more musical notes, according to one example embodiment
  • FIG. 4 is a flowchart of a process for uploading an audio data and generating a visual representation of the audio data, according to one example embodiment
  • FIGs. 5A and 5B are diagrams that represent a first user interface element, according to one example embodiment
  • FIG. 6A is a diagram that represents a second user interface, according to one embodiment
  • FIG. 6B is a diagram of a process for determining an active slot, according to one example embodiment
  • FIG. 6C is a diagram of a process for changing the length of at least one slot and duration for the corresponding note, according to one example embodiment
  • FIG. 7 is a flowchart of a process for playing a color-encoded musical game, according to one embodiment
  • FIG. 8 is a diagram that represents a game screen, according to one example embodiment.
  • FIG. 9 is a user interface diagram that represents a game screen based, at least in part, on the orientation of UE 101, according to one example embodiment
  • FIG. 10A is a user interface diagram that represents a screen to set up at least one game, according to one example embodiment
  • FIG. 10B is a user interface diagram that represents a screen for saving a game, according to one example embodiment
  • FIG. 10C is a user interface diagram that represents a screen for saving one or more pieces of composed music and/or the visual representations corresponding to the composed music, according to one example embodiment
  • FIG. 10D is a user interface diagram that represents a screen for loading a game, according to one example embodiment
  • FIG. 10E is a user interface diagram that represents a catalogue of various genres of music, according to one example embodiment
  • FIG. 10F is a user interface diagram that represents a screen for loading music, according to one example embodiment
  • FIG. 11A is a user interface diagram that represents a screen for recording a song, according to one example embodiment
  • FIG. 11B is a user interface diagram that represents a screen during song recording, according to one example embodiment
  • FIG. 11C is a user interface diagram that represents a screen for saving a recording, according to one example embodiment
  • FIG. 11D is a user interface diagram that represents a screen after saving a recording, according to one example embodiment
  • FIG. 12 is a diagram of hardware that can be used to implement an embodiment of the invention.
  • FIG. 13 is a diagram of a chip set that can be used to implement an embodiment of the invention.
  • FIG. 14 is a diagram of a mobile terminal (e.g., handset) that can be used to implement an embodiment of the invention.
  • FIG. 1 is a diagram of a system capable of providing an application user interface for composing one or more musical notes in a color-encoded musical notation system, according to one embodiment.
  • composing music traditionally requires mastery of traditional musical notation systems (e.g., ability to read notes in staff notation) as well as musical skill and talent to achieve good results.
  • An average user may experience difficulty in understanding written music due to the counter-intuitive nature of traditional music notation.
  • young users may have difficulty in understanding written music due to the complex nature of the musical structures.
  • system 100 associates one or more visual representations (e.g., colors) with one or more musical notes.
  • colors are associated according to a predefined color scheme based on the tonal or rhythmic qualities of the musical notes (e.g., the color of a shape can represent the musical note, and the size or length of the shape can represent the length of the musical note).
  • This allows inexperienced users as well as young users to compose music by using colors, while intuitively becoming aware of musical structures. Subsequently, the colors are converted into musical notes.
  • This process of converting color data to musical notes traditionally has relied on expertise in cross-disciplinary artistic concepts (e.g., music theory, composition, etc.) combined with artistic and musical skill to achieve subjectively pleasing or “good” results.
  • this knowledge and skill is often out of reach for average users, thereby limiting their ability to convert color data to musical notes.
  • system 100 of FIG. 1 introduces a unique application user interface (e.g., a game-based application user interface) that enables users to learn and compose music using a color-based musical notation system.
  • the application user interface includes a first user interface element 121 (e.g., color-encoded piano keyboard with one or more colors associated with each key) including a full-length keyboard and a zoomed keyboard.
  • the application user interface provided by system 100 also includes a second user interface element 123 with a composition area comprising a plurality of slots for filling in the colors; the colors may then be read or scanned by a color-reading device 125.
  • a user may select colors that are associated with keys in the keyboard to fill in the template slots in the second user interface element 123 through an input module 127.
  • the input module 127 includes a touch-based input (e.g., a user may use a stylus or his/her fingers), a voice-based input through a microphone, or a combination thereof.
  • the color-reading device 125 produces color data indicating the colors filled in the composition area in the second user interface element 123 by using, e.g., an optical scanner or equivalent technology to measure the color wavelength or other indicator of the colors applied to the composition area in the second user interface element 123.
  • the color reading device 125 is part of the application or device that is presenting the application user interface. In other words, the color reading device 125 can directly capture an image of the composition area within an application or device setting without using an external capture device such as an optical scanner.
  • color accuracy and the ability to accurately differentiate between a large number of colors is particularly important when trying to map the colors to the full range of musical notes available to, for instance, a full-length keyboard. In the case of a full-length keyboard, the system 100 would use at least 88 colors in the keyboard in the first user interface element 121 to represent each of the 88 notes.
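One simple way to obtain 88 mutually distinguishable colors, one per key of a full-length piano, is to spread hues evenly around the color wheel. This is a sketch under an assumption the application does not specify (the application only requires at least 88 distinct colors, not this scheme); the function name and the MIDI numbering of keys 21 through 108 are illustrative:

```python
import colorsys

def key_colors(num_keys=88, lowest_midi=21):
    """Assign an evenly spaced hue to each piano key (MIDI 21..108)."""
    colors = {}
    for i in range(num_keys):
        hue = i / num_keys  # evenly spaced hues keep colors maximally distinct
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        colors[lowest_midi + i] = (round(r * 255), round(g * 255), round(b * 255))
    return colors

palette = key_colors()
print(len(palette))   # one color per key
print(palette[21])    # lowest key (A0) gets the first hue, pure red
```

In practice a scheme like this might also vary saturation or brightness per octave so that adjacent octaves of the same scale degree remain visually related.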
  • the color processing platform 103 can then output the musical composition in any format or media selected by a user.
  • the composition can then be played through an audio output as audible music.
  • the composition can be converted into another musical notation system (e.g., staff notation or any other system of musical notation).
  • the color processing platform 103 uses an algorithmic process based on certain parameters (e.g., color level, shapes and/or sizes of the color appearing on the composition area in the second user interface element 123, associated symbols/drawings/patterns, etc.) to determine one or more musical characteristics for generating the composition including, but not limited to, the sound level, pitch, or duration for the one or more musical notes.
  • the size of the color in the composition area in the second user interface element 123 determines the duration of the musical note.
  • the system 100 comprises UE 101 that may include or be associated with applications 107, sensors 111, first user interface element 121 and second user interface element 123.
  • the UE 101 has connectivity to a color processing platform 103 via a communication network 105, e.g., a wireless communication network.
  • the color processing platform 103 performs one or more functions associated with converting color data into one or more musical notes.
  • the conversion from color data to musical notes using this method includes receiving a second user interface element 123 on which a user has applied a visual arrangement of colors through the input module 127 (e.g., colored pens or the user's fingers), which have been mapped to correspond to particular musical notes.
  • the color processing platform 103 can pre-determine which colors correspond to which musical notes, and which sizes of the respective color shapes correspond to which lengths of musical notes (e.g., eighth notes, quarter notes, half notes, etc.). In one embodiment, the color processing platform 103 can provide one or more color sets to be applied to the keyboard in the first user interface element 121. In another embodiment, the system 100 can enable one or more users to dynamically map the colors to the corresponding musical notes in the keyboard of the first user interface element 121. In one example embodiment, the color processing platform 103 can present a user interface in which the user can manually specify one or more colors to correspond to one or more musical notes in the keyboard of the first user interface element 121.
  • this manual correlation can occur on a color by color basis.
  • the color processing platform 103 can shift colors along a musical scale based on a change in a single note or color combination. In one example embodiment, if red is matched to middle C by default and the user changes the mapping so that green is matched to middle C, the color processing platform 103 can use the same initial color sequence but shift all other default colors in the same sequence so that green matches or corresponds to middle C on the musical scale.
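The shift described above amounts to rotating the default color sequence until the user's chosen color lands on the chosen note, with the relative order of all other colors preserved. A minimal sketch, using a hypothetical seven-color default and one octave of note names:

```python
# Assumed default mapping: red -> C, orange -> D, ... (rainbow order).
DEFAULT_COLORS = ["red", "orange", "yellow", "green", "blue", "indigo", "violet"]
NOTES = ["C", "D", "E", "F", "G", "A", "B"]

def remap(colors, notes, color, note):
    """Rotate `colors` so that `color` is matched to `note`, keeping order."""
    shift = colors.index(color) - notes.index(note)
    rotated = colors[shift:] + colors[:shift]
    return dict(zip(notes, rotated))

mapping = remap(DEFAULT_COLORS, NOTES, "green", "C")
print(mapping["C"])  # green now corresponds to middle C
print(mapping["D"])  # the rest of the sequence shifts with it
```

Because the whole sequence rotates together, changing a single color-to-note pairing re-derives the entire keyboard's color scheme consistently.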
  • the system 100 comprises UE 101.
  • the UE 101 is any type of mobile terminal, fixed terminal, or portable terminal including a navigation unit (e.g., in-vehicle or standalone), a mobile handset, station, unit, device, multimedia computer, multimedia tablet, Internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system (PCS) device, personal navigation device, personal digital assistants (PDAs), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, game device, or any combination thereof, including the accessories and peripherals of these devices, or any combination thereof.
  • the UE 101 can support any type of interface to the user (such as "wearable" circuitry, etc.).
  • the color processing platform 103 may be a platform with multiple interconnected components.
  • the color processing platform 103 may include one or more servers, intelligent networking devices, computing devices, components and corresponding software for converting color data into one or more musical notes (and vice versa).
  • the color processing platform 103 may be a separate entity of the system 100, a part of the one or more services 115a-115n (collectively referred to as services 115) of the service platform 113, or the UE 101. Any known or still developing methods, techniques or processes for converting color data into one or more musical notes may be employed by the color processing platform 103.
  • the color processing platform 103 may read respective colors applied to a composition area by a plurality of drawing instruments as color data using a color reading device 125. Then, the color processing platform 103 may process the color data to generate a composition of the one or more musical notes from the color data based on the set of musical notes that correspond to the respective colors in the color data. In other words, in one embodiment, the color processing platform 103 may convert color data into one or more musical notes by the following means: 1) corresponding or mapping one or more features (e.g.
  • the color processing platform 103 may generate a legend for correlating the one or more colors, their patterns and sizes to the one or more musical notes and their respective lengths. In one example embodiment, the color processing platform 103 may use the features extracted from the color data to match the musical notes to its appropriate partner. In another embodiment, the color processing platform 103 may process one or more color data to determine one or more element (e.g., shades, ranges, hues, brightness, contrasts, purity) of the color data. In a further embodiment, the color processing platform 103 may determine the size of the one or more colors, wherein the size of the colors can be used to represent the duration or length of the musical notes.
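The legend described above, once generated, lets scanned marks be decoded directly: each mark's color looks up a note and its size looks up a length. A sketch with a hypothetical legend (the specific colors, sizes, and function names are illustrative, not from the application):

```python
# Assumed legend: color -> note, and shape size (in slot widths) -> note length.
COLOR_TO_NOTE = {"red": "C", "orange": "D", "yellow": "E", "green": "F"}
SIZE_TO_LENGTH = {1: "eighth note", 2: "quarter note", 4: "half note",
                  8: "whole note"}

def decode(marks, color_to_note, size_to_length):
    """Convert scanned (color, size) marks into (note, length) pairs."""
    return [(color_to_note[c], size_to_length[s]) for c, s in marks]

score = decode([("red", 2), ("yellow", 2), ("green", 4)],
               COLOR_TO_NOTE, SIZE_TO_LENGTH)
print(score)
```

The decoded pairs are the platform's internal representation of the composition, ready for audio playback or conversion into staff notation.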
  • the communication network 105 of system 100 includes one or more networks such as a data network, a wireless network, a telephony network, or any combination thereof.
  • the data network may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), a public data network (e.g., the Internet), short range wireless network, or any other suitable packet-switched network, such as a commercially owned, proprietary packet-switched network, e.g., a proprietary cable or fiber-optic network, and the like, or any combination thereof.
  • the wireless network may be, for example, a cellular communication network and may employ various technologies including enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (Wi-Fi), wireless LAN (WLAN), Bluetooth®, Internet Protocol (IP) data casting, satellite, mobile ad-hoc network (MANET), vehicle controller area network (CAN bus), and the like, or any combination thereof.
  • one or more content providers 117a-117n (collectively referred to as content provider 117) enable musical notes and visual representations to be derived from existing songs.
  • the existing songs may be uploaded from database 119 or downloaded from service platform 113.
  • an existing song, "Twinkle, Twinkle, Little Star," may be fed into the program.
  • the UE 101 can generate one or more musical notes derived from the song.
  • the output device 109 and the second user interface element 123 can generate an audio output and a corresponding visual representation of the one or more musical notes.
  • the generated composition of one or more musical notes and corresponding visual representations can be saved in database 119 or uploaded and shared in service platform 113.
  • the UE 101 may further include applications 107 to perform one or more functions of converting color data into one or more musical notes.
  • the applications 107 and the color processing platform 103 interact according to a client-server model.
  • the client-server model of computer process interaction is widely known and used.
  • a client process sends a message including a request to a server process, and the server process responds by providing a service.
  • the server process may also return a message with a response to the client process.
  • the client process and server process execute on different computer devices, called hosts, and communicate via a network using one or more protocols for network communications.
  • the term "server" is conventionally used to refer to the process that provides the service, or the host computer on which the process operates.
  • the term "client" is conventionally used to refer to the process that makes the request, or the host computer on which the process operates.
  • as used herein, "server" refers to the processes, rather than the host computers, unless otherwise clear from the context.
  • a process performed by a server can be broken up to run as multiple processes on multiple hosts (sometimes called tiers) for reasons that include reliability, scalability, and redundancy, among others.
  • the UE 101 further has connectivity to one or more input modules 127.
  • the input module may include a microphone for capturing one or more musical notes.
  • the input module 127 can capture a user’s singing or a song playing.
  • the input module 127 includes a color-reading device.
  • the input module 127 may include a camera or a scanner for capturing image data or a quick response code (QR code), wherein the QR code is associated with color data or musical notes. It is contemplated that the input module may be configured with any sensor suitable for sampling or capturing visual data into digital format for processing by the system 100.
  • the UE 101 further has connectivity to one or more audio output devices 109.
  • the output device 109 can be configured with any number of suitable output modules.
  • the output device 109 may be configured with displays (e.g., monitors, projectors, televisions, etc.) for visual representation of the one or more musical notes.
  • the output device 109 may include devices for creating physical versions (e.g., paper, canvas, and/or other media such as wood, stone, etc.) of the one or more musical notes converted from the color data or uploaded from the input module 127. These devices include, but are not limited to, printers, three-dimensional printers, computerized numerical control (CNC) machines, printing presses, and the like.
  • CNC computerized numerical control
  • the system 100 also includes one or more sensors 111, which can be implemented, embedded or connected to the UE 101. It is contemplated that the UE 101 may be configured with any sensors suitable for sampling or capturing an image data into digital format for processing by the system 100. It is also contemplated that the UE 101 may be configured with any sensors suitable for sampling or capturing music data into digital format for processing by the system 100.
  • the sensors 111 may be any type of sensor. In one embodiment, the type of sensors 111 configured can be based on the type of source data. For example, it is contemplated that image data can include color data presented in any form.
  • the UE 101 can use a microphone sensor to capture a song or singing for conversion into one or more musical notes and corresponding visual representations.
  • a color-reading device is configured with sensors 111 capable of reading color values. In this way, a user can create musical notes by using the color-reading device to read different colors (e.g., from a drawing, an existing image, painting, or other visual representation). The colors that are read by the color-reading device are then converted into musical notes using the processes discussed with respect to the various embodiments described herein.
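A physical scan rarely returns an exact palette color, so a color-reading pipeline like the one above would plausibly snap each sampled value to the nearest known palette entry before looking up its note. This snapping step is an assumption on my part, not something the application spells out; the palette and function name are illustrative:

```python
# Hypothetical palette: exact RGB values the notes were printed/drawn in.
PALETTE = {(255, 0, 0): "C", (255, 255, 0): "E", (0, 0, 255): "G"}

def nearest_note(rgb, palette):
    """Match a sampled RGB value to the closest palette color's note."""
    def dist2(a, b):
        # Squared Euclidean distance in RGB space.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(palette, key=lambda p: dist2(rgb, p))
    return palette[best]

print(nearest_note((250, 10, 8), PALETTE))   # slightly-off red
print(nearest_note((20, 30, 240), PALETTE))  # slightly-off blue
```

With 88 palette entries, a perceptually uniform color space (e.g., CIELAB) would give more reliable matches than raw RGB distance, at the cost of a color-space conversion per sample.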
  • the UE 101 and/or the color processing platform 103 also have connectivity to a service platform 113 that includes one or more services 115 for providing other services that support the color processing platform 103.
  • the service platform 113 may include social networking services/application, content (e.g., audio, video, images, etc.) provisioning services/application, application services/application, storage services/application, etc.
  • the service platform 113 may interact with the UE 101, the color processing platform 103 and the content provider 117 to supplement or aid in the processing of the content information.
  • the service platform 113 may be implemented or embedded in the color processing platform 103 or in its functions.
  • the services 115 may be an online service that reflects interests and/or activities of one or more users.
  • the services 115 allow users to share activities information, historical user information and interests (e.g., musical interest) within communication network 105 and their individual networks, and provides for data portability.
  • the service platform 113 and/or services 115 interact with one or more content providers 117a-117n (also collectively referred to as content providers 117) to provide musical notes and/or other related information to the color processing platform 103.
  • the content provided may be any type of content, such as, image content, textual content, audio content (e.g., audio notification), video content (e.g., visual notification), etc.
  • the content provider 117 may also store content associated with the UE 101, the color processing platform 103, and the services 115 of the service platform 113.
  • the system 100 also includes database 119.
  • the database 119 stores one or more musical notes corresponding to one or more colors.
  • the information may be any of multiple types of information that can provide means for aiding in the content provisioning and sharing process.
  • the system 100 also can generate an application user interface that includes a first user interface element 121.
  • the first user interface element 121 includes a full-length keyboard and a zoomed keyboard.
  • the first user interface element 121 is configured to associate a plurality of colors with a plurality of keys of a keyboard, wherein the plurality of keys corresponds to one or more respective musical notes.
  • the full-length keyboard is a color-encoded 88-note piano keyboard.
  • the zoomed keyboard presents an active octave of the 88-note piano keyboard in a zoomed representation.
  • one or more colors are assigned to each key in the zoomed keyboard.
  • the first user interface element 121 includes two sets of keyboard controls for easy navigation between octaves.
  • the color-encoded musical module 121 is a virtual instrument in a computer application or a mobile phone application. It is noted that a piano keyboard is provided by way of illustration and not as a limitation. It is contemplated that any type of musical instrument can be represented and used in the embodiments of the application user interface described herein.
  • the application user interface of system 100 can also include a second user interface element 123.
  • the second user interface element 123 includes a composition area comprising a plurality of slots for creating a visual representation of the musical composition.
  • the second user interface element 123 is configured to apply a plurality of colors.
  • One of the plurality of slots is an active slot. A user can fill the active slot with colors by interacting with keys in the zoomed keyboard.
  • the second user interface 123 can be a virtual user interface in a computer application (e.g., the same computer application supporting the first user interface 121 described above). In this case, the user can change the size of the slots by dragging the slot left or right.
  • the system 100 also includes color-reading device 125.
  • the color-reading device 125 may read respective colors applied to a physical composition area via an image sensor, a scanner, or a combination thereof.
  • the color-reading device 125 may scan or otherwise detect a shape or a size of the respective colors applied to the physical composition area as part of the color data.
  • the color-reading device 125 may be a color-reading module of the computer application that supports or provides the virtual first user interface 121 and virtual second user interface 123.
  • a protocol includes a set of rules defining how the network nodes within the communication network 105 interact with each other based on information sent over the communication links.
  • the protocols are effective at different layers of operation within each node, from generating and receiving physical signals of various types, to selecting a link for transferring those signals, to the format of information indicated by those signals, to identifying which software application executing on a computer system sends or receives the information.
  • the conceptually different layers of protocols for exchanging information over a network are described in the Open Systems Interconnection (OSI) Reference Model.
  • Each packet typically comprises (1) header information associated with a particular protocol, and (2) payload information that follows the header information and contains information that may be processed independently of that particular protocol.
  • the packet includes (3) trailer information following the payload and indicating the end of the payload information.
  • the header includes information such as the source of the packet, its destination, the length of the payload, and other properties used by the protocol.
  • the data in the payload for the particular protocol includes a header and payload for a different protocol associated with a different, higher layer of the OSI Reference Model.
  • the header for a particular protocol typically indicates a type for the next protocol contained in its payload.
  • the higher layer protocol is said to be encapsulated in the lower layer protocol.
  • the headers included in a packet traversing multiple heterogeneous networks, such as the Internet typically include a physical (layer 1) header, a data-link (layer 2) header, an internetwork (layer 3) header and a transport (layer 4) header, and various application (layer 5, layer 6 and layer 7) headers as defined by the OSI Reference Model.
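The header/payload/trailer layering described above can be sketched as follows; the layer tags and the "|" delimiters are purely illustrative assumptions, not drawn from any real protocol:

```python
# Minimal sketch of protocol encapsulation: each layer's payload carries the
# next higher-layer header and payload, optionally followed by a trailer.

def encapsulate(payload: bytes, header: bytes, trailer: bytes = b"") -> bytes:
    """Wrap a higher-layer packet in this layer's header (and optional trailer)."""
    return header + payload + trailer

# An application message is wrapped first by a transport-layer header,
# then by a network-layer header and trailer.
app_data = b"NOTE:C4"
transport_packet = encapsulate(app_data, header=b"TPT|")
network_packet = encapsulate(transport_packet, header=b"NET|", trailer=b"|END")

# The network-layer payload is the entire transport-layer packet.
assert network_packet == b"NET|TPT|NOTE:C4|END"
```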
  • FIG. 2 is a diagram of the components of the color processing platform 103, according to one example embodiment.
  • the color processing platform 103 includes one or more components for converting color data into one or more musical notes. It is contemplated that the functions of these components may be combined in one or more components or performed by other components of equivalent functionality.
  • the color processing platform 103 comprises one or more configuration modules 201, mapping modules 203, color processing modules 205, and presentation modules 207, or any combination thereof.
  • the configuration module 201 may configure a color-reading device or application module to scan respective colors applied to a composition area.
  • the configuration module 201 may configure an application of a mobile device to read respective colors in the composition area of the application.
  • the color-reading device and/or the application of a mobile device includes an image sensor, a scanner, or a combination thereof to read respective colors.
  • the configuration module 201 may configure color sets to a plurality of keys in the color-encoded musical module.
  • the mapping module 203 may associate at least one color with at least one musical note. In another embodiment, the mapping module 203 may associate at least one color pattern with at least one set of musical notes. In a further embodiment, the mapping module 203 may correlate the size of the color drawn on the composition area, the composition area of an application, or a combination thereof to the duration of the one or more musical notes.
  • in one embodiment, the color processing module 205 may process the color data to generate a composition of the one or more musical notes from the color data based on the set of musical notes that correspond to the respective colors in the color data.
  • the color processing module 205 may generate a composition of musical notes that correspond to the respective colors based, at least in part, on the sequence of colors applied to a composition area, a composition area of an application, or a combination thereof. In another embodiment, the color processing module 205 may be configured to determine note duration information for the one or more musical notes in the composition based on the shape or the size of the respective colors. In one example embodiment, the color processing module 205 may generate a composition based, at least in part, on the duration information for the one or more musical notes. In a further embodiment, the color processing module 205 is configured to generate a representation of the composition in staff notation and to output the composition in the staff notation via an output device.
  • the color processing module 205 may process a drawing to determine a sequence for one or more colors. Then, the color processing module 205 may select one or more musical notes that correlate to the one or more colors based, at least in part, on the sequence. Subsequently, the color processing module 205 may convert the one or more colors into the one or more musical notes.
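The mapping and processing steps above can be sketched as follows; the color-to-note table (red → C follows the rainbow assignment described for the zoomed keyboard, the rest are assumptions) and the 1:1 size-to-duration rule are illustrative, since the specification leaves the exact mapping configurable:

```python
# Hypothetical color-to-note mapping for illustration.
COLOR_TO_NOTE = {"red": "C", "orange": "D", "yellow": "E", "green": "F",
                 "blue": "G", "indigo": "A", "violet": "B"}

def colors_to_composition(color_regions):
    """color_regions: list of (color_name, size) in the order drawn.
    Returns (note, duration) pairs; duration is proportional to region size."""
    return [(COLOR_TO_NOTE[color], size) for color, size in color_regions]

# A drawing of red, blue, red regions becomes a C-G-C phrase.
assert colors_to_composition([("red", 1), ("blue", 2), ("red", 1)]) == \
    [("C", 1), ("G", 2), ("C", 1)]
```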
  • the presentation module 207 may represent the composition in staff notation in at least one user interface of at least one device.
  • the representation includes a visual representation, an aural representation, or a combination thereof.
  • the visual representation includes color representations of the composition in the composition area.
  • the visual representation also includes a music visualizer.
  • the default aural representation of the composition is piano style.
  • a user can select a different instrument from an instrument list to play the composition.
  • the instrument list includes violin, guitar, trumpet, flute, or a combination thereof.
  • the system sends a prompt for payment to the user. When the user’s payment information is authorized, the system presents the instrument list for the user to select.
  • the presentation module 207 employs various application programming interfaces (APIs) or other function calls corresponding to the applications 107 of UE 101 and/or output device 109, thus enabling the display of graphics primitives such as menus, buttons, data entry fields, etc., for generating the user interface elements.
  • the presentation module 207 enables a presentation of a graphical user interface (GUI) for displaying one or more colors to the users for drawing on a canvas of an application.
  • the above presented modules and components of the color processing platform 103 can be implemented in hardware, firmware, software, or a combination thereof. Though depicted as a separate entity in FIG. 1, it is contemplated that the color processing platform 103 may be implemented for direct operation by respective UE 101. As such, the color processing platform 103 may generate direct signal inputs by way of the operating system of the UE 101 for interacting with the applications 107. In another embodiment, one or more of the modules 201- 207 may be implemented for operation by respective UEs, as the color processing platform 103, or combination thereof. Still further, the color processing platform 103 may be integrated for direct operation with the services 115, such as in the form of a widget or applet, in accordance with an information and/or subscriber sharing arrangement. The various executions presented herein contemplate any and all arrangements and models.
  • FIG. 3 is a flowchart of a process for converting color data into one or more musical notes, according to one example embodiment.
  • the color processing platform 103 performs the process 300 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 10.
  • the color processing platform 103 and/or any of its modules can be means for performing the process 300 or any other processes described herein for providing an application user interface for composing color-encoded music.
  • system 100 may present a user interface to a user.
  • the user interface comprises a first user interface element and a second user interface element.
  • the first user interface element presents a representation of a musical keyboard for creating a musical composition.
  • the representation of the musical keyboard includes a full-length keyboard representation that highlights an active octave of a zoomed keyboard representation.
  • the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition.
  • a user can modify the plurality of slots in the composition area. In this case, the modification includes altering the total number of slots and changing the size of the slots in the composition area.
  • the user interface is a virtual user interface in the computer application.
  • the computer application is executable on a mobile device.
  • the mobile device is a mobile phone or a tablet.
  • the mobile phone or the tablet is operated under iOS or Android operating system.
  • the color processing platform 103 assigns one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note.
  • the one or more respective colors are assigned to a plurality of keys of the representation of the zoomed keyboard.
  • a user can replace the one or more respective colors with one or more different respective colors based on user preference.
  • system 100 assigns the one or more different respective colors to the plurality of keys of the representation of the musical keyboard.
  • the zoomed keyboard includes more than one octave. In such case, system 100 renders one or more respective colors with different saturation values to indicate different octaves of the respective musical notes.
  • system 100 receives a user input via the first user interface that indicates interaction with a played key of the plurality of keys of the representation of the musical keyboard.
  • the user input includes a touch-based input, a voice-based input, or a combination thereof.
  • the user input that indicates the interaction with the at least one key is received via the zoomed keyboard representation.
  • the color processing platform 103 may render a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
  • the color processing platform may use a color-reading device to read the one or more respective colors in the active slot.
  • a color-reading device includes an image sensor, a scanner, or a combination thereof for reading colors applied to the composition area.
  • a user can change size of the active slot. In this case, the user can extend or reduce a length of the active slot, wherein the length of the active slot represents a duration of the corresponding musical note for the active slot.
  • system 100 designates a next slot in the second user interface element as a next active slot for creating the musical composition.
  • a user can select any slot in the composition area as the active slot.
  • FIG. 4 is a flowchart of a process for recording and/or uploading audio data and generating a visual representation of the audio data, according to one example embodiment.
  • the color processing platform 103 performs the process 400 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 10.
  • a user may record and/or upload audio data into system 100.
  • the audio data is uploaded from one or more databases, downloaded from a network server via the Internet, recorded by a microphone of user device, or a combination thereof.
  • the color processing platform 103 can present a user interface that includes an option to upload a data file containing the audio (e.g., in any standard format known in the art such as, but not limited to, MP3, WAV, lossless audio format, etc.).
  • the user interface can include a recording option that enables a corresponding user device to capture audio data through a user device’s microphone, external microphone, connected instrument (e.g., through a MIDI interface, etc.), and/or the like.
  • the color processing platform 103 can define a minimum audio capture quality (e.g., sample rate, number of bits, etc.) for uploaded and/or recorded music.
  • system 100 determines one or more attributes of the audio data, wherein the one or more attributes include one or more musical notes of a song or composition represented in the audio data.
  • the color processing platform 103 processes the audio data to identify a sequence of musical notes and their respective durations as represented in the audio data.
  • other characteristics such as relative volume, instrumentation, stereo position, etc. can also be extracted from the audio data for each note.
  • the color processing platform 103 maps the one or more extracted musical notes to one or more respective colors.
  • the mapping is based on the association of the one or more colors with a plurality of keys of a keyboard, wherein each of the plurality of keys corresponds to a respective musical note.
  • the association can map each musical note directly to a different color instead of the keys of a keyboard. For example, each note in an octave can be assigned to different colors so that a musical note A corresponds to a first color (e.g., red), musical note B corresponds to a second color (e.g., orange), and so on.
  • system 100 generates a visual representation of the audio data in the composition area of the device user interface (e.g., as generated in the process 300 of FIG. 3).
  • the visual representation includes one or more colors and shapes of the one or more colors.
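The note-to-color inversion in the steps above can be sketched as follows, using the A → red, B → orange example assignment from the text; the remaining colors continue the rainbow order as an assumption, and the extracted notes are hard-coded here, since real extraction would require pitch detection:

```python
# Direct note-to-color assignment (A -> red, B -> orange per the example above;
# the remaining entries are illustrative assumptions).
NOTE_TO_COLOR = {"A": "red", "B": "orange", "C": "yellow", "D": "green",
                 "E": "blue", "F": "indigo", "G": "violet"}

def notes_to_slots(extracted_notes):
    """Map extracted (note, duration) pairs to (color, slot_length) pairs
    for rendering in the composition area."""
    return [(NOTE_TO_COLOR[note], duration) for note, duration in extracted_notes]

slots = notes_to_slots([("A", 1), ("C", 2)])
assert slots == [("red", 1), ("yellow", 2)]
```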
  • FIGs. 5A and 5B are diagrams that represent first user interface element, according to one example embodiment.
  • the first user interface element 500 includes a two-keyboard control system.
  • the first keyboard control system includes a full-length keyboard representation 502 having fifty-two white keys and thirty-six black keys. The white keys repeat every octave, thereby giving seven basic tones per octave.
  • the full-length keyboard representation 502 may generate a realistic piano tune, and a user may compose music by clicking the keys of the full-length keyboard 502.
  • the full-length keyboard representation 502 includes region 504.
  • a user may select the highlighted region 504 as an active octave in the full-length keyboard 502 by moving and/or changing the size of the frame 503. Subsequently, the highlighted region 504 is presented in the second keyboard control system as zoomed keyboard 505.
  • This zoomed keyboard 505 may include one octave, two octaves, or more.
  • system 100 may assign one or more respective colors to each key in the zoomed keyboard 505. As shown in FIG. 5A, at least one color may be assigned to at least one white key in the zoomed keyboard 505. In another embodiment, seven colors of a rainbow may be assigned to each of the seven white keys in the zoomed keyboard 505. In one example embodiment, for a gradient of red fading to violet, the system 100 may match the midrange sequence of basic tones C, D, E, F, G, A, B to a digitally encoded sequence of 1, 3, 5, 6, 8, 10, and 12. In another example embodiment, at least one color 507, e.g., red, may be assigned to key C1 of the zoomed keyboard 505.
  • black keys of the zoomed keyboard 505 are assigned multiple colors, e.g., the two colors associated with the two adjacent white keys.
  • red and orange colors, as represented by 509, may be assigned to key C♯ (D♭), which is marked as key 2 on the full-length keyboard 502.
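A minimal sketch of this single-octave coding: the seven rainbow colors on the white keys with the encoded positions 1, 3, 5, 6, 8, 10, 12, and each black key carrying the two colors of its adjacent white keys. The color names follow the text's red-to-violet gradient; everything else is illustrative:

```python
RAINBOW = ["red", "orange", "yellow", "green", "blue", "indigo", "violet"]
WHITE_KEYS = ["C", "D", "E", "F", "G", "A", "B"]
WHITE_POSITIONS = [1, 3, 5, 6, 8, 10, 12]  # encoded sequence from the text

key_colors = dict(zip(WHITE_KEYS, RAINBOW))
key_positions = dict(zip(WHITE_KEYS, WHITE_POSITIONS))

# Each black key is assigned the colors of its two adjacent white keys,
# e.g., C#/Db pairs the colors of C and D.
BLACK_KEYS = {"C#": ("C", "D"), "D#": ("D", "E"), "F#": ("F", "G"),
              "G#": ("G", "A"), "A#": ("A", "B")}
for black, (left, right) in BLACK_KEYS.items():
    key_colors[black] = (key_colors[left], key_colors[right])

assert key_colors["C"] == "red"
assert key_colors["C#"] == ("red", "orange")  # two colors on the black key
assert key_positions["B"] == 12
```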
  • Table 1 shows how timbre is encoded to correspond with certain colors:
  • a user may select one or more colors from a color database, and associate them to one or more keys in the zoomed keyboard 505.
  • the color of the played key in the first user interface changes.
  • key 510 is the played key.
  • system 100 plays musical notes associated with the played key.
  • piano tone D is played upon receiving the interaction.
  • the one or more colors associated with the played key are assigned to an active slot in composition area.
  • system 100 automatically designates a next slot in the composition area as a next active slot.
  • in FIG. 5B, the zoomed keyboard 505 includes two octaves.
  • a set of seven colors of a rainbow may be assigned to the seven white keys in one octave and another set of seven colors of the rainbow with different saturation values may be assigned to seven white keys in another octave.
  • for one octave, the system 100 may assign the midrange sequence of basic tones C, D, E, F, G, A, B to a digitally encoded sequence of 1, 3, 5, 6, 8, 10, and 12.
  • for the other octave, the system 100 may assign the sequence of basic tones C, D, E, F, G, A, B to a digitally encoded sequence of 13, 15, 17, 18, 20, 22, and 24.
  • a color 507, e.g., red, may be assigned to a key in one octave, and another color 511, e.g., a lighter red, may be assigned to the corresponding key in the other octave.
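One way to render "the same hues at different saturation values" for the second octave is via HLS lightness; the hue values and lightness levels below are assumptions, not values from the specification:

```python
import colorsys

# Approximate rainbow hues for the seven white keys (assumed values).
BASE_HUES = {"C": 0.0, "D": 0.08, "E": 0.17, "F": 0.33,
             "G": 0.58, "A": 0.75, "B": 0.83}

def octave_color(note: str, octave: int):
    """Return an RGB triple: the same hue, lightened for the higher octave."""
    lightness = 0.5 if octave == 0 else 0.7  # lighter shade marks octave 2
    return colorsys.hls_to_rgb(BASE_HUES[note], lightness, 1.0)

low_c = octave_color("C", 0)   # saturated red
high_c = octave_color("C", 1)  # lighter red
assert low_c == (1.0, 0.0, 0.0)
assert high_c != low_c and high_c[0] == 1.0  # same hue family, lighter
```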
  • FIG. 6A is a diagram that represents a second user interface, according to one embodiment.
  • the second user interface element 600 includes a composition area 601.
  • the composition area 601 includes a plurality of slots 611.
  • the plurality of slots 611 form a slot line 609, and a plurality of slot lines 609 comprise the composition area 601.
  • a user may associate each slot in the composition area with one or more colors, wherein the one or more colors are associated with one or more musical notes as shown in FIG. 5A-B.
  • the first empty slot is an active slot.
  • slot 611 is the active slot and a user may fill the active slot with one or more colors by selecting at least one key from the zoomed keyboard 505.
  • each slot in the composition area 601 may be filled with at least one color that corresponds to at least one white key of the zoomed keyboard 505, for example, slot 603 is filled with at least one color.
  • each slot may be filled with multiple colors that correspond to at least one black key in the zoomed keyboard 505, for example, slot 608 is filled with two colors.
  • the system 100 may consider the size of one or more slots to determine a time value or duration of the one or more musical notes.
  • common durations for notes include the whole note, half note, quarter note, eighth note, etc.
  • different notes have different time durations.
  • time duration in a musical score is used to express the relative duration between each bar. Time duration also determines how long a note lasts.
  • system 100 may match the time duration of a musical note to the size of the color slots and may make the duration of the musical note proportional to the size.
  • slot 605 represents a musical note that is two times longer than the musical note represented by slot 603.
  • slot 607 represents a musical note that is three times longer than the musical note represented by slot 603.
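The proportionality described above can be sketched as follows, taking the base slot (like slot 603) as one beat; the pixel width and base duration are assumed values:

```python
BASE_SLOT_LENGTH = 40       # assumed rendered width of a base slot, in pixels
BASE_DURATION_BEATS = 1.0   # assumed base duration, e.g., a quarter note

def slot_duration(slot_length_px: float) -> float:
    """Note duration in beats, proportional to the slot's rendered length."""
    return BASE_DURATION_BEATS * slot_length_px / BASE_SLOT_LENGTH

assert slot_duration(40) == 1.0   # base slot, like slot 603
assert slot_duration(80) == 2.0   # two times longer, like slot 605
assert slot_duration(120) == 3.0  # three times longer, like slot 607
```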
  • the second user interface element 600 includes a slot control area 615.
  • a user can upload a song and/or one or more musical notes through the “UPLOAD” button 617 to generate a corresponding visual representation.
  • a user may upload songs and/or one or more musical notes from database 119 and/or communication network 105.
  • a user can record a song and/or one or more musical notes through a microphone and may upload it to the system 100.
  • a user may play or stop the uploaded song and/or one or more musical notes through a “PLAY” button 619 and a “STOP” button 621.
  • the user may also save a composition and the corresponding visual representation through a “SAVE” button 623.
  • a user can adjust the composition area through a “TEMPLATE” button 625. In this example, the user can adjust the total number of slots in the composition area 601, the size of each slot, and the number of slots in each slot line 609.
  • the second user interface element 600 also includes a music visualizer 613.
  • the second user interface element 600 also includes a song control area 627.
  • a user can choose to play or pause the musical composition and the corresponding visual representation through a “PLAY” button 629 and a “PAUSE” button 631, respectively.
  • a user can also choose to record a song or one or more musical notes through the “RECORD” button 633 and play the recorded song or the one or more musical notes in system 100.
  • the second user interface element 600 also includes buttons representing other functions, for example, a delete or edit note button (not shown for illustrative convenience).
  • FIG. 6B is a diagram of a process for determining an active slot, according to one example embodiment.
  • the first empty slot 635 in the composition area 601 is a default active slot, i.e., the empty slot 635 is ready to be filled with a plurality of colors by the at least one user.
  • the empty slot 635 is represented as an active slot with a smooth blinking animation showing that the slot is in use.
  • a user may select another slot apart from the empty slot 635 of the plurality of slots in the composition area 601 as an active slot through a touch-based input, e.g., by using a pen or a finger, a voice-based input, or a combination thereof.
  • a user can fill the active slot with one or more colors by interacting with one key in the first user interface element. After receiving the user input that indicates the interaction, a next slot is automatically designated as a next active slot. In one embodiment, the changing of active slots is indicated with animation in the second user interface element 600. As shown in FIG. 6B, arrow 636 is used to indicate that the active slot has moved to the next slot.
  • FIG. 6C is a diagram of a process for changing the length of at least one slot and duration for the corresponding note, according to one example embodiment.
  • each and every slot can be extended or reduced in length.
  • only the active slot can be extended or reduced in length.
  • a user may click slot 637 and then drag the slot to the left or to the right to decrease or extend the length of slot 637.
  • the length of the slots determines the duration of the corresponding musical notes.
  • FIG. 7 is a flowchart of a process for playing a color-encoded musical game, according to one embodiment.
  • system 100 presents a splash screen to at least one player.
  • the splash screen is a colorful screen featuring a name for at least one game.
  • the splash screen may include an option of playing music during presentation to the at least one player.
  • a start screen is presented to the at least one player.
  • the start screen includes a plurality of buttons that navigate the at least one player to one or more other screens.
  • the start screen may comprise: a “PLAY” button, a “LOAD GAME” button, a “STORE” button and/or an “ACHIEVEMENT” button.
  • system 100 may present a game mode selection screen comprising a plurality of game mode buttons based, at least in part, on at least one player selecting the “LOAD GAME” button.
  • the game mode comprises a plurality of features, such as but not limited to, replicating a song, assigning colors in one or more slots to match a song, experimental play, and so on.
  • a game screen is presented to the at least one player upon selecting a game mode.
  • the game screen includes a first user interface element 500 and a second user interface element 600.
  • the game screen 707 includes a “SONG SCREEN” 709, a “LOAD GAME” button 719, a “SAVE GAME” button 721, a “STORE” button 723, a “GAME SETTING” button 727 and an “ACHIEVEMENT” button 733.
  • at least one player can upload a song and/or one or more musical notes to the game by selecting the “SONG SCREEN” 709.
  • the “SONG SCREEN” 709 may further include two buttons: a “RECORD SONG” button 711 and a “LOAD SONG” button 717.
  • the at least one player may record a song and/or one or more musical notes via a microphone by clicking the “RECORD SONG” button 711. After the recording, the at least one player can choose to rewrite the song by clicking the “REWRITE” button 713. The at least one player can also choose to save and upload the recorded song to system 100 by clicking the “SAVE” button 715. Subsequently, the player is redirected to the game screen 707. In another example embodiment, if the at least one player clicks the “LOAD SONG” button 717 in the “SONG SCREEN” 709, the player may upload a song and/or one or more musical notes to system 100 from the database 119 or the communication network 105. Once the song and/or one or more musical notes are uploaded, the player is redirected to the game screen 707.
  • At least one player can choose to reload the game by clicking the “LOAD GAME” button 719 in game screen 707.
  • the player may also save the game by clicking the “SAVE GAME” button 721.
  • the game screen 707 also includes a “STORE” button 723.
  • the at least one player may buy a song or a music pack comprising multiple songs by clicking the “STORE” button 723.
  • system 100 may send a notification regarding payment information to the at least one player. Once the at least one player validates the payment information and authorizes the payment, the purchased song or the music pack is uploaded to UE 101 or the system 100.
  • a pop-up message confirming the purchase is presented to the player.
  • the at least one player may upload a song or a music pack from the “STORE” for free.
  • the game screen 707 further includes a “GAME SETTING” button 727.
  • the player may adjust the composition area 601 in second user interface element 600 by clicking the “GAME SETTING” button 727.
  • the player may also adjust the total number of slots in the composition area and the number of slots in each slot line.
  • the player may also adjust the first user interface element settings.
  • the player may also change the color sets assigned to the keyboard.
  • the player can select one or more octaves as the zoomed keyboard 505. Upon completion, the at least one player is redirected to game screen 707.
  • the game screen 707 further includes an “ACHIEVEMENT” section 733.
  • achievements may include, but are not limited to: the at least one player has played for 10, 20, or 30 hours; the at least one player has used the ‘C’ note 50 times; the at least one player has placed 100 red slots; the at least one player has saved 5 songs; the at least one player has shared 5 songs and at least one other player has liked the shared songs. After viewing his/her achievements, the at least one player may be redirected to the game screen 707.
  • FIG. 8 is a diagram that represents a game screen, according to one example embodiment.
  • the game screen 800 comprises a first user interface element 500 and a second user interface element 600.
  • the game screen 800 includes a name section 801 for one or more players to enter a game name, a song name, or a painting name.
  • the first user interface element 500 includes two keyboard control systems: (i) a full-length keyboard 502 and (ii) a zoomed keyboard 505.
  • the second user interface element 600 includes a slot control area 615, a composition area 601, and a song control area 627.
  • the second user interface element 600 also includes buttons representing other functions, for example, a delete or edit note button (not shown for illustrative convenience).
  • FIG. 9 is a user interface diagram that represents a game screen based, at least in part, on the orientation of UE 101, according to one example embodiment.
  • display of a game screen changes from 800 to 803 when the physical orientation of the UE 101 is changed.
  • FIG. 10A is a user interface diagram that represents a screen to set up at least one game, according to one example embodiment.
  • game menu screen 1000 includes button 1001 for displaying the menu, button 1003 to start a new game, button 1005 to save the game, button 1007 to load a game, button 1009 for redirecting one or more players to an online store for purchasing one or more songs, and button 1011 for informing the one or more players of their achievements.
  • in one embodiment, button 1009 redirects one or more players to an online store for uploading one or more songs for free. One or more players may select one or more buttons displayed in screen 1000 through a touch-based input, e.g., by using a pen or a finger, a voice-based input, or a combination thereof.
  • FIG. 10B is a user interface diagram that represents a screen for saving a game, according to one example embodiment.
  • screen 1013 includes a back button 1015. At least one player may select the back button 1015 to be redirected to the game menu screen 1000.
  • the screen 1013 also includes a text field 1017 for one or more players to enter a name of their choice for the game. Then, the user may select button 1019 to save the game.
  • FIG. 10C is a user interface diagram that represents a screen for saving one or more musical compositions and/or the visual representation corresponding to the composed music, according to one example embodiment.
  • screen 1021 includes a back button 1023. At least one player may select the back button 1023 to be redirected to the game menu screen 1000.
  • the screen 1021 may include a music visualizer 1025 for visualizing music in a graphic display, e.g., by using pattern of colors and/or shapes, to provide an abstract interpretation of the music being played, a play button 1027 for playing the composition, a pause button 1029 for pausing the composition and a save button 1031 for saving the composition and/or the corresponding visual representation.
  • a pop-up message indicating that the composed music and/or the visual representation has been saved is presented to the player. Once the composition and the corresponding visual representation are saved, the player is redirected to the game screen 707.
  • FIG. 10D is a user interface diagram that represents a screen for loading a game, according to one example embodiment.
  • screen 1033 includes a back button 1035. At least one player may select the back button 1035 to be redirected to the game menu screen 1000.
  • the screen 1033 includes a “name of game” button 1037 that provides information pertaining to at least one song, e.g., song name, artist’s information and/or album information.
  • the screen 1033 also includes button 1039 for listening to a song for a specified duration, and once the user is convinced he/she may choose to buy the songs by clicking button 1041. Subsequently, the purchased song is uploaded to UE 101, and then the player is redirected to game menu screen 1000.
  • FIG.10E is a user interface diagram that represents a catalogue of various genres of music, according to one example embodiment.
  • screen 1043 includes a back button 1045. At least one player may select the back button 1045 to be redirected to the game menu screen 1000.
  • the screen 1043 includes buttons 1047; each button 1047 represents a plurality of songs for a particular genre of music. Further, button 1047 may provide one or more players with song information, e.g., duration of the song; artist's information, e.g., name of the artist and his/her background; number of songs included in a package; and/or payment information, e.g., total cost of each package represented by an individual button 1047.
  • the screen 1043 may also include button 1049 for listening to the songs in the package for a specified duration; the players may then choose to buy the songs by clicking button 1051. In another embodiment, the players may choose to upload the songs by clicking button 1051 for free. In one scenario, upon clicking button 1051, the player may be presented with payment information, and once the transaction is authorized by the player, the selected package including a particular genre of music is uploaded to the UE 101.
  • FIG.10F is a user interface diagram that represents a screen for loading music, according to one example embodiment.
  • the screen 1053, e.g., a screen for loading songs, includes a back button 1055.
  • at least one player may select the back button 1055, whereupon the player is redirected to the game menu screen 1000.
  • the screen 1053 includes button 1057 for recording new songs. The player may record a new song by using the microphone of the UE 101.
  • the screen 1053 may also include button 1059 for uploading one or more songs saved in the UE 101 and/or from the database 119.
  • the screen 1053 may also include button 1061 for purchasing songs.
  • the player may be redirected to a store screen 1043 upon clicking button 1061, and the player may purchase music of his/her choice.
  • FIG.11A is a user interface diagram that represents a screen for recording a song, according to one example embodiment.
  • screen 1101 includes a back button 1103. At least one player may select the back button 1103 to be redirected to the game menu screen 1000.
  • the screen 1101 includes a music visualizer 1105. Before recording is started, the music visualizer 1105 is disabled.
  • the screen 1101 also includes a play button 1107 for playing the recorded song, a record button 1109 for recording and a save button 1111 for saving the recorded song. Before the recording is started, the play button 1107 is disabled.
  • FIG.11B is a user interface diagram that represents a screen during song recording, according to one example embodiment.
  • screen 1113 includes a back button 1115. At least one player may select the back button 1115 to be redirected to the game menu screen 1000.
  • the screen 1113 includes a music visualizer 1117. During recording, the music visualizer 1117 is enabled.
  • the screen 1113 also includes a play button 1119 for playing the recorded song, a pause button 1121 for pausing the recording and a save button 1123 for saving the recorded song.
  • FIG.11C is a user interface diagram that represents a screen for saving a recording, according to one example embodiment.
  • a pop-up message 1127 is presented in screen 1125.
  • the pop-up message 1127 includes a text field 1128 for entering a name for the recording.
  • the user may select button 1131 to save the recording.
  • the user can also select button 1129 to rewrite the recording.
  • a pop-up message is presented to the user.
  • the rewrite pop-up message includes a "CANCEL" button and a "YES" button.
  • the user may be redirected to a screen 1053 upon clicking the "CANCEL" button, and the user may be redirected to recording screen 1101 upon clicking the "YES" button.
  • FIG.11D is a user interface diagram that represents a screen after saving a recording, according to one example embodiment.
  • screen 1133 includes a pop-up message 1135 indicating the song has been saved and added to the game.
  • the processes described herein for converting color data into one or more musical notes may be advantageously implemented via software, hardware, firmware or a combination of software and/or firmware and/or hardware.
  • the processes described herein may be advantageously implemented via processor(s), a Digital Signal Processing (DSP) chip, an Application Specific Integrated Circuit (ASIC), Field Programmable Gate Arrays (FPGAs), etc.
  • FIG. 12 illustrates a computer system 1200 upon which an embodiment of the invention may be implemented.
  • Although computer system 1200 is depicted with respect to a particular device or equipment, it is contemplated that other devices or equipment (e.g., network elements, servers, etc.) within FIG. 12 can deploy the illustrated hardware and components of system 1200.
  • Computer system 1200 is programmed (e.g., via computer program code or instructions) to convert color data into one or more musical notes as described herein and includes a communication mechanism such as a bus 1210 for passing information between other internal and external components of the computer system 1200.
  • Information is represented as a physical expression of a measurable phenomenon, typically electric voltages, but including, in other embodiments, such phenomena as magnetic, electromagnetic, pressure, chemical, biological, molecular, atomic, sub-atomic and quantum interactions.
  • north and south magnetic fields, or a zero and non-zero electric voltage represent two states (0, 1) of a binary digit (bit).
  • Other phenomena can represent digits of a higher base.
  • a superposition of multiple simultaneous quantum states before measurement represents a quantum bit (qubit).
  • a sequence of one or more digits constitutes digital data that is used to represent a number or code for a character.
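  • The bullet above can be illustrated with a short sketch (this Python snippet is illustrative only and not part of the patent text): a sequence of binary digits represents a number, and that number can in turn serve as a code for a character (here via the ASCII/Unicode code point).

```python
bits = "1000001"          # a sequence of digits (base 2)
number = int(bits, 2)     # the number the digit sequence represents
character = chr(number)   # the character that number codes for

print(number, character)  # 65 A
```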
  • information called analog data is represented by a near continuum of measurable values within a particular range.
  • Computer system 1200, or a portion thereof, constitutes a means for performing one or more steps of converting color data into one or more musical notes.
  • a bus 1210 includes one or more parallel conductors of information so that information is transferred quickly among devices coupled to the bus 1210.
  • One or more processors 1202 for processing information are coupled with the bus 1210.
  • a processor (or multiple processors) 1202 performs a set of operations on information as specified by computer program code related to converting color data into one or more musical notes.
  • the computer program code is a set of instructions or statements providing instructions for the operation of the processor and/or the computer system to perform specified functions.
  • the code for example, may be written in a computer programming language that is compiled into a native instruction set of the processor. The code may also be written directly using the native instruction set (e.g., machine language).
  • the set of operations include bringing information in from the bus 1210 and placing information on the bus 1210.
  • the set of operations also typically include comparing two or more units of information, shifting positions of units of information, and combining two or more units of information, such as by addition or multiplication or logical operations like OR, exclusive OR (XOR), and AND.
  • Each operation of the set of operations that can be performed by the processor is represented to the processor by information called instructions, such as an operation code of one or more digits.
  • a sequence of operations to be executed by the processor 1202, such as a sequence of operation codes, constitutes processor instructions, also called computer system instructions or, simply, computer instructions.
  • Processors may be implemented as mechanical, electrical, magnetic, optical, chemical, or quantum components, among others, alone or in combination.
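  • The primitive operations listed above (comparing, shifting, and combining units of information by addition or by logical OR, XOR, and AND) can be sketched directly in Python's integer operators. This is an illustrative example only, not part of the patent text:

```python
a, b = 0b1100, 0b1010

comparison = a > b    # comparing two units of information
shifted = a << 2      # shifting positions of units of information
combined_or = a | b   # combining by OR
combined_xor = a ^ b  # combining by exclusive OR (XOR)
combined_and = a & b  # combining by AND
summed = a + b        # combining by addition

print(comparison, bin(shifted), bin(combined_or),
      bin(combined_xor), bin(combined_and), summed)
```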
  • Computer system 1200 also includes a memory 1204 coupled to bus 1210.
  • the memory 1204 such as a random access memory (RAM) or any other dynamic storage device, stores information including processor instructions for converting color data into one or more musical notes. Dynamic memory allows information stored therein to be changed by the computer system 1200. RAM allows a unit of information stored at a location called a memory address to be stored and retrieved independently of information at neighboring addresses.
  • the memory 1204 is also used by the processor 1202 to store temporary values during execution of processor instructions.
  • the computer system 1200 also includes a read only memory (ROM) 1206 or any other static storage device coupled to the bus 1210 for storing static information, including instructions, that is not changed by the computer system 1200.
  • Computer system 1200 also includes a non-volatile (persistent) storage device 1208, such as a magnetic disk, optical disk or flash card, for storing information, including instructions, that persists even when the computer system 1200 is turned off or otherwise loses power.
  • Information, including instructions for converting color data into one or more musical notes, is provided to the bus 1210 for use by the processor from an external input device 1212, such as a keyboard containing alphanumeric keys operated by a human user, a microphone, an Infrared (IR) remote control, a joystick, a game pad, a stylus pen, a touch screen, or a sensor.
  • a sensor detects conditions in its vicinity and transforms those detections into physical expression compatible with the measurable phenomenon used to represent information in computer system 1200.
  • a display device 1214 such as a cathode ray tube (CRT), a liquid crystal display (LCD), a light emitting diode (LED) display, an organic LED (OLED) display, a plasma screen, or a printer for presenting text or images
  • a pointing device 1216 such as a mouse, a trackball, cursor direction keys, or a motion sensor, for controlling a position of a small cursor image presented on the display 1214 and issuing commands associated with graphical elements presented on the display 1214, and one or more camera sensors 1294 for capturing, recording and causing to store one or more still and/or moving images (e.g., videos, movies, etc.) which also may comprise audio recordings.
  • one or more of external input device 1212, display device 1214 and pointing device 1216 may be omitted.
  • special purpose hardware such as an application specific integrated circuit (ASIC) 1220
  • the special purpose hardware is configured to perform operations not performed by processor 1202 quickly enough for special purposes.
  • ASICs include graphics accelerator cards for generating images for display 1214, cryptographic boards for encrypting and decrypting messages sent over a network, speech recognition, and interfaces to special external devices, such as robotic arms and medical scanning equipment that repeatedly perform some complex sequence of operations that are more efficiently implemented in hardware.
  • Computer system 1200 also includes one or more instances of a communications interface 1270 coupled to bus 1210.
  • Communication interface 1270 provides a one-way or two-way communication coupling to a variety of external devices that operate with their own processors, such as printers, scanners and external disks. In general, the coupling is with a network link 1278 that is connected to a local network 1280 to which a variety of external devices with their own processors are connected.
  • communication interface 1270 may be a parallel port or a serial port or a universal serial bus (USB) port on a personal computer.
  • communications interface 1270 is an integrated services digital network (ISDN) card or a digital subscriber line (DSL) card or a telephone modem that provides an information communication connection to a corresponding type of telephone line.
  • a communication interface 1270 is a cable modem that converts signals on bus 1210 into signals for a communication connection over a coaxial cable or into optical signals for a communication connection over a fiber optic cable.
  • communications interface 1270 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, such as Ethernet.
  • Wireless links may also be implemented.
  • the communications interface 1270 sends and/or receives electrical, acoustic or electromagnetic signals, including infrared and optical signals, that carry information streams, such as digital data.
  • the communications interface 1270 includes a radio band electromagnetic transmitter and receiver called a radio transceiver.
  • the communications interface 1270 enables connection to the communication network 105 of the UE 101 for converting color data into one or more musical notes.
  • Non-transitory media such as non-volatile media, include, for example, optical or magnetic disks, such as storage device 1208.
  • Volatile media include, for example, dynamic memory 1204.
  • Transmission media include, for example, twisted pair cables, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves.
  • Signals include man-made transient variations in amplitude, frequency, phase, polarization or other physical properties transmitted through the transmission media.
  • Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, CDRW, DVD, any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, an EEPROM, a flash memory, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read.
  • the term computer-readable storage medium is used herein to refer to any computer- readable medium except transmission media.
  • Logic encoded in one or more tangible media includes one or both of processor instructions on a computer-readable storage media and special purpose hardware, such as ASIC 1220
  • Network link 1278 typically provides information communication using transmission media through one or more networks to other devices that use or process the information.
  • network link 1278 may provide a connection through local network 1280 to a host computer 1282 or to equipment 1284 operated by an Internet Service Provider (ISP).
  • ISP equipment 1284 in turn provides data communication services through the public, world-wide packet-switching communication network of networks now commonly referred to as the Internet 1290.
  • a computer called a server host 1292 connected to the Internet hosts a process that provides a service in response to information received over the Internet.
  • server host 1292 hosts a process that provides information representing video data for presentation at display 1214. It is contemplated that the components of system 1200 can be deployed in various configurations within other computer systems, e.g., host 1282 and server 1292.
  • At least some embodiments of the invention are related to the use of computer system 1200 for implementing some or all of the techniques described herein. According to one embodiment of the invention, those techniques are performed by computer system 1200 in response to processor 1202 executing one or more sequences of one or more processor instructions contained in memory 1204. Such instructions, also called computer instructions, software and program code, may be read into memory 1204 from another computer-readable medium such as storage device 1208 or network link 1278. Execution of the sequences of instructions contained in memory 1204 causes processor 1202 to perform one or more of the method steps described herein. In alternative embodiments, hardware, such as ASIC 1220, may be used in place of or in combination with software to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware and software, unless otherwise explicitly stated herein.
  • Computer system 1200 can send and receive information, including program code, through the networks 1280 and 1290, among others, through network link 1278, and communications interface 1270.
  • a server host 1292 transmits program code for a particular application, requested by a message sent from computer 1200, through Internet 1290, ISP equipment 1284, local network 1280 and communications interface 1270.
  • the received code may be executed by processor 1202 as it is received, or may be stored in memory 1204 or in storage device 1208 or any other non-volatile storage for later execution, or both. In this manner, computer system 1200 may obtain application program code in the form of signals on a carrier wave.
  • Various forms of computer readable media may be involved in carrying one or more sequence of instructions or data or both to processor 1202 for execution.
  • instructions and data may initially be carried on a magnetic disk of a remote computer such as host 1282.
  • the remote computer loads the instructions and data into its dynamic memory and sends the instructions and data over a telephone line using a modem.
  • a modem local to the computer system 1200 receives the instructions and data on a telephone line and uses an infrared transmitter to convert the instructions and data to a signal on an infrared carrier wave serving as the network link 1278.
  • An infrared detector serving as communications interface 1270 receives the instructions and data carried in the infrared signal and places information representing the instructions and data onto bus 1210.
  • Bus 1210 carries the information to memory 1204 from which processor 1202 retrieves and executes the instructions using some of the data sent with the instructions.
  • the instructions and data received in memory 1204 may optionally be stored on storage device 1208, either before or after execution by the processor 1202
  • FIG. 13 illustrates a chip set or chip 1300 upon which an embodiment of the invention may be implemented.
  • Chip set 1300 is programmed to convert color data into one or more musical notes as described herein and includes, for instance, the processor and memory components described with respect to FIG. 12 incorporated in one or more physical packages (e.g., chips).
  • a physical package includes an arrangement of one or more materials, components, and/or wires on a structural assembly (e.g., a baseboard) to provide one or more characteristics such as physical strength, conservation of size, and/or limitation of electrical interaction. It is contemplated that in certain embodiments the chip set 1300 can be implemented in a single chip.
  • chip set or chip 1300 can be implemented as a single "system on a chip." It is further contemplated that in certain embodiments a separate ASIC would not be used, for example, and that all relevant functions as disclosed herein would be performed by a processor or processors.
  • Chip set or chip 1300, or a portion thereof constitutes a means for performing one or more steps of providing user interface navigation information associated with the availability of functions.
  • Chip set or chip 1300, or a portion thereof constitutes a means for performing one or more steps of converting color data into one or more musical notes.
  • the chip set or chip 1300 includes a communication mechanism such as a bus 1301 for passing information among the components of the chip set 1300.
  • a processor 1303 has connectivity to the bus 1301 to execute instructions and process information stored in, for example, a memory 1305.
  • the processor 1303 may include one or more processing cores with each core configured to perform independently.
  • a multi-core processor enables multiprocessing within a single physical package. Examples of a multi-core processor include two, four, eight, or greater numbers of processing cores.
  • the processor 1303 may include one or more microprocessors configured in tandem via the bus 1301 to enable independent execution of instructions, pipelining, and multithreading.
  • the processor 1303 may also be accompanied with one or more specialized components to perform certain processing functions and tasks such as one or more digital signal processors (DSP) 1307, or one or more application-specific integrated circuits (ASIC) 1309.
  • a DSP 1307 typically is configured to process real-world signals (e.g., sound) in real time independently of the processor 1303.
  • an ASIC 1309 can be configured to perform specialized functions not easily performed by a more general purpose processor.
  • Other specialized components to aid in performing the inventive functions described herein may include one or more field programmable gate arrays (FPGA), one or more controllers, or one or more other special- purpose computer chips.
  • the chip set or chip 1300 includes merely one or more processors and some software and/or firmware supporting and/or relating to and/or for the one or more processors.
  • the processor 1303 and accompanying components have connectivity to the memory 1305 via the bus 1301.
  • the memory 1305 includes both dynamic memory (e.g., RAM, magnetic disk, writable optical disk, etc.) and static memory (e.g., ROM, CD-ROM, etc.) for storing executable instructions that when executed perform the inventive steps described herein to convert color data into one or more musical notes.
  • the memory 1305 also stores the data associated with or generated by the execution of the inventive steps.
  • FIG. 14 is a diagram of exemplary components of a mobile terminal (e.g., handset) for communications, which is capable of operating in the system of FIG. 1, according to one embodiment.
  • mobile terminal 1401, or a portion thereof constitutes a means for performing one or more steps of converting color data into one or more musical notes.
  • a radio receiver is often defined in terms of front-end and back-end characteristics. The front-end of the receiver encompasses all of the Radio Frequency (RF) circuitry whereas the back-end encompasses all of the base-band processing circuitry.
  • circuitry refers to both: (1) hardware-only implementations (such as implementations in only analog and/or digital circuitry), and (2) to combinations of circuitry and software (and/or firmware) (such as, if applicable to the particular context, to a combination of processor(s), including digital signal processor(s), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions).
  • This definition of “circuitry” applies to all uses of this term in this application, including in any claims.
  • the term “circuitry” would also cover an implementation of merely a processor (or multiple processors) and its (or their) accompanying software/or firmware.
  • the term "circuitry" would also cover, if applicable to the particular context, for example, a baseband integrated circuit or applications processor integrated circuit in a mobile phone or a similar integrated circuit in a cellular network device or other network devices.
  • Pertinent internal components of the telephone include a Main Control Unit (MCU) 1403, a Digital Signal Processor (DSP) 1405, and a receiver/transmitter unit including a microphone gain control unit and a speaker gain control unit.
  • a main display unit 1407 provides a display to the user in support of various applications and mobile terminal functions that perform or support the steps of converting color data into one or more musical notes.
  • the display 1407 includes display circuitry configured to display at least a portion of a user interface of the mobile terminal (e.g., mobile telephone). Additionally, the display 1407 and display circuitry are configured to facilitate user control of at least some functions of the mobile terminal.
  • An audio function circuitry 1409 includes a microphone 1411 and microphone amplifier that amplifies the speech signal output from the microphone 1411. The amplified speech signal output from the microphone 1411 is fed to a coder/decoder (CODEC) 1413.
  • a radio section 1415 amplifies power and converts frequency in order to communicate with a base station, which is included in a mobile communication system, via antenna 1417.
  • the power amplifier (PA) 1419 and the transmitter/modulation circuitry are operationally responsive to the MCU 1403, with an output from the PA 1419 coupled to the duplexer 1421 or circulator or antenna switch, as known in the art.
  • the PA 1419 also couples to a battery interface and power control unit 1420.
  • a user of mobile terminal 1401 speaks into the microphone 1411 and his or her voice along with any detected background noise is converted into an analog voltage.
  • the analog voltage is then converted into a digital signal through the Analog to Digital Converter (ADC) 1423.
  • the control unit 1403 routes the digital signal into the DSP 1405 for processing therein, such as speech encoding, channel encoding, encrypting, and interleaving.
  • the processed voice signals are encoded, by units not separately shown, using a cellular transmission protocol such as enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), satellite, and the like, or any combination thereof.
  • the encoded signals are then routed to an equalizer 1425 for compensation of any frequency-dependent impairments that occur during transmission through the air, such as phase and amplitude distortion.
  • the modulator 1427 combines the signal with a RF signal generated in the RF interface 1429.
  • the modulator 1427 generates a sine wave by way of frequency or phase modulation.
  • an up-converter 1431 combines the sine wave output from the modulator 1427 with another sine wave generated by a synthesizer 1433 to achieve the desired frequency of transmission.
  • the signal is then sent through a PA 1419 to increase the signal to an appropriate power level.
  • the PA 1419 acts as a variable gain amplifier whose gain is controlled by the DSP 1405 from information received from a network base station.
  • the signal is then filtered within the duplexer 1421 and optionally sent to an antenna coupler 1435 to match impedances to provide maximum power transfer. Finally, the signal is transmitted via antenna 1417 to a local base station.
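  • The up-conversion step described above (combining the modulator's sine wave with the synthesizer's sine wave to reach the transmission frequency) relies on the fact that multiplying two sinusoids yields components at their sum and difference frequencies. The following Python sketch is illustrative only (the function names and example frequencies are assumptions, not taken from the patent) and checks that product-to-sum identity numerically:

```python
import math

def mix(baseband_hz, carrier_hz, t):
    """Multiply a baseband tone by a carrier tone at time t (seconds)."""
    return math.sin(2 * math.pi * baseband_hz * t) * math.sin(2 * math.pi * carrier_hz * t)

def sum_difference(baseband_hz, carrier_hz, t):
    """Equivalent form: half the difference of two cosines at the
    difference (carrier - baseband) and sum (carrier + baseband) frequencies."""
    lo = math.cos(2 * math.pi * (carrier_hz - baseband_hz) * t)
    hi = math.cos(2 * math.pi * (carrier_hz + baseband_hz) * t)
    return 0.5 * (lo - hi)

# The two forms agree at every sample instant, which is why mixing
# shifts a baseband signal up to the desired transmission frequency.
for n in range(100):
    t = n / 44100.0
    assert abs(mix(1000, 10000, t) - sum_difference(1000, 10000, t)) < 1e-9
```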
  • An automatic gain control (AGC) can be supplied to control the gain of the final stages of the receiver.
  • the signals may be forwarded from there to a remote telephone which may be another cellular telephone, any other mobile phone or a land-line connected to a Public Switched Telephone Network (PSTN), or other telephony networks.
  • Voice signals transmitted to the mobile terminal 1401 are received via antenna 1417 and immediately amplified by a low noise amplifier (LNA) 1437.
  • a down-converter 1439 lowers the carrier frequency while the demodulator 1441 strips away the RF leaving only a digital bit stream.
  • the signal then goes through the equalizer 1425 and is processed by the DSP 1405.
  • a Digital to Analog Converter (DAC) 1443 converts the signal and the resulting output is transmitted to the user through the speaker 1445, all under control of a Main Control Unit (MCU) 1403 which can be implemented as a Central Processing Unit (CPU).
  • the MCU 1403 receives various signals including input signals from the keyboard 1447.
  • the keyboard 1447 and/or the MCU 1403 in combination with other user input components comprise a user interface circuitry for managing user input.
  • the MCU 1403 runs user interface software to facilitate user control of at least some functions of the mobile terminal 1401 to convert color data into one or more musical notes.
  • the MCU 1403 also delivers a display command and a switch command to the display 1407 and to the speech output switching controller, respectively. Further, the MCU 1403 exchanges information with the DSP 1405 and can access an optionally incorporated SIM card 1449 and a memory 1451. In addition, the MCU 1403 executes various control functions required of the terminal.
  • the DSP 1405 may, depending upon the implementation, perform any of a variety of conventional digital processing functions on the voice signals. Additionally, DSP 1405 determines the background noise level of the local environment from the signals detected by microphone 1411 and sets the gain of microphone 1411 to a level selected to compensate for the natural tendency of the user of the mobile terminal 1401.
  • the CODEC 1413 includes the ADC 1423 and DAC 1443.
  • the memory 1451 stores various data including call incoming tone data and is capable of storing other data including music data received via, e.g., the global Internet.
  • the software module could reside in RAM memory, flash memory, registers, or any other form of writable storage medium known in the art.
  • the memory device 1451 may be, but is not limited to, a single memory, CD, DVD, ROM, RAM, EEPROM, optical storage, magnetic disk storage, flash memory storage, or any other non-volatile storage medium capable of storing digital data.
  • An optionally incorporated SIM card 1449 carries, for instance, important information, such as the cellular phone number, the carrier supplying service, subscription details, and security information.
  • the SIM card 1449 serves primarily to identify the mobile terminal 1401 on a radio network.
  • the card 1449 also contains a memory for storing a personal telephone number registry, text messages, and user specific mobile terminal settings.
  • one or more camera sensors 1453 may be incorporated onto the mobile station 1401 wherein the one or more camera sensors may be placed at one or more locations on the mobile station.
  • the camera sensors may be utilized to capture, record, and cause to store one or more still and/or moving images (e.g., videos, movies, etc.) which also may comprise audio recordings.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Electrophonic Musical Instruments (AREA)
  • Auxiliary Devices For Music (AREA)

Abstract

An approach is provided herein for composing one or more musical notes in a color-encoded musical system. The approach involves presenting a musical keyboard for creating a musical composition and a composition area comprising a plurality of slots for creating a visual representation. The approach also involves assigning one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note. The approach further involves receiving a user input that indicates interaction with a played key of the plurality of keys. The approach also involves rendering a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and the respective musical note to represent a composed musical note of the musical composition.

Description

METHOD AND APPARATUS FOR
PROVIDING AN APPLICATION USER INTERFACE FOR GENERATING COLOR-
ENCODED MUSIC
CROSS-REFERENCES TO RELATED APPLICATIONS
[0001] This application claims priority from United States Provisional Application Serial No. 62/648,727, entitled “METHOD AND APPARATUS FOR PROVIDING AN APPLICATION USER INTERFACE FOR GENERATING COLOR-ENCODED MUSIC,” and filed March 27, 2018, the contents of which are hereby incorporated herein in their entirety by this reference.
BACKGROUND
[0002] Users enjoy composing music as well as listening to music. However, the difficulty in understanding and recognizing the overwhelming number of musical concepts makes it tough for users with limited music experience to compose music. Correspondingly, providing musical instruction to individuals with no music experience can be very difficult, and this difficulty is intensified when the musical instruction is directed to children. One big challenge for service providers is to offer an effective and efficient approach for identifying one or more musical notes, for instance, by colors that can be correlated individually with each of the musical notes. Compositions of one or more musical notes are then generated by converting these colors into the corresponding musical notes.
SOME EXAMPLE EMBODIMENTS
[0003] Therefore, there is a need for an approach for providing an application user interface (e.g., a gaming application user interface) for composing music in a color-encoded musical notation system. This application user interface, for instance, introduces a capability for users to learn to read and compose music using a color-encoded musical notation system.
[0004] According to one embodiment, a method comprises presenting a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition. The method also comprises assigning one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note. The method further comprises receiving, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys. The method also comprises rendering a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
[0005] According to another embodiment, an apparatus comprises at least one processor, and at least one memory including computer program code for one or more computer programs, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to present a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition. The apparatus is also caused to assign one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note. The apparatus is further caused to receive, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys. The apparatus is also caused to render a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
[0006] According to another embodiment, a computer-readable storage medium carries one or more sequences of one or more instructions which, when executed by one or more processors, present a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition. The apparatus is also caused to assign one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note. The apparatus is further caused to receive, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys. The apparatus is also caused to render a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
[0007] According to another embodiment, an apparatus comprises means for presenting a user interface comprising a first user interface element and a second user interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition. The apparatus also comprises means for assigning one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note. The apparatus further comprises means for receiving, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys. The apparatus also comprises means for rendering a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition.
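By way of a non-limiting illustration, the method steps recited above (presenting a keyboard element with color-assigned keys, receiving a played-key input, and rendering the active slot of the composition area in the played key's color) could be sketched as follows. All class, method, and variable names here are illustrative and are not part of the disclosure:

```python
class ColorKeyboardUI:
    """Minimal sketch of the claimed method: a keyboard element whose keys
    carry assigned colors, and a composition area whose active slot is
    rendered in the played key's color to represent a composed note."""

    def __init__(self, key_colors, num_slots):
        self.key_colors = key_colors     # note -> color (first user interface element)
        self.slots = [None] * num_slots  # composition area (second user interface element)
        self.active_slot = 0             # next slot to be filled

    def on_key_played(self, note):
        """Handle a user input indicating interaction with a played key:
        render the active slot in the key's assigned color."""
        color = self.key_colors[note]
        self.slots[self.active_slot] = (note, color)
        self.active_slot += 1
        return color

# Hypothetical three-key mapping for demonstration:
ui = ColorKeyboardUI({"C4": "red", "E4": "yellow", "G4": "blue"}, num_slots=8)
ui.on_key_played("C4")
ui.on_key_played("E4")
print(ui.slots[:2])  # [('C4', 'red'), ('E4', 'yellow')]
```

In this sketch each played key both identifies a musical note and supplies the color rendered into the composition area, mirroring the one-to-one key/note/color correspondence recited in the claims.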
[0008] In addition, for various example embodiments of the invention, the following is applicable: a method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on (or derived at least in part from) any one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0009] For various example embodiments of the invention, the following is also applicable: a method comprising facilitating access to at least one interface configured to allow access to at least one service, the at least one service configured to perform any one or any combination of network or service provider methods (or processes) disclosed in this application.
[0010] For various example embodiments of the invention, the following is also applicable: a method comprising facilitating creating and/or facilitating modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based, at least in part, on data and/or information resulting from one or any combination of methods or processes disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0011] For various example embodiments of the invention, the following is also applicable: a method comprising creating and/or modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based at least in part on data and/or information resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention, and/or at least one signal resulting from one or any combination of methods (or processes) disclosed in this application as relevant to any embodiment of the invention.
[0012] In various example embodiments, the methods (or processes) can be accomplished on the service provider side or on the mobile device side or in any shared way between service provider and mobile device with actions being performed on both sides.
[0013] For various example embodiments, the following is applicable: An apparatus comprising means for performing a method of any of the claims.
[0014] Still other aspects, features, and advantages of the invention are readily apparent from the following detailed description, simply by illustrating a number of particular embodiments and implementations, including the best mode contemplated for carrying out the invention. The invention is also capable of other and different embodiments, and its several details can be modified in various obvious respects, all without departing from the spirit and scope of the invention. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings:
[0016] FIG. 1 is a diagram of a system capable of composing one or more musical notes in a color-encoded musical system, according to one embodiment;
[0017] FIG. 2 is a diagram of the components of the color processing platform 103, according to one example embodiment;
[0018] FIG. 3 is a flowchart of a process for converting color data into one or more musical notes, according to one example embodiment;
[0019] FIG. 4 is a flowchart of a process for uploading audio data and generating a visual representation of the audio data, according to one example embodiment;
[0020] FIGs. 5A and 5B are diagrams that represent a first user interface element, according to one example embodiment;
[0021] FIG. 6A is a diagram that represents a second user interface, according to one embodiment;
[0022] FIG. 6B is a diagram of a process for determining an active slot, according to one example embodiment;
[0023] FIG. 6C is a diagram of a process for changing the length of at least one slot and duration for the corresponding note, according to one example embodiment;
[0024] FIG. 7 is a flowchart of a process for playing a color-encoded musical game, according to one embodiment;
[0025] FIG. 8 is a diagram that represents a game screen, according to one example embodiment;
[0026] FIG. 9 is a user interface diagram that represents a game screen based, at least in part, on the orientation of UE 101, according to one example embodiment;
[0027] FIG. 10A is a user interface diagram that represents a screen to set up at least one game, according to one example embodiment;
[0028] FIG. 10B is a user interface diagram that represents a screen for saving a game, according to one example embodiment;
[0029] FIG. 10C is a user interface diagram that represents a screen for saving composed music and/or a visual representation corresponding to the composed music, according to one example embodiment;
[0030] FIG. 10D is a user interface diagram that represents a screen for loading a game, according to one example embodiment;
[0031] FIG. 10E is a user interface diagram that represents a catalogue of various genres of music, according to one example embodiment;
[0032] FIG. 10F is a user interface diagram that represents a screen for loading music, according to one example embodiment;
[0033] FIG. 11A is a user interface diagram that represents a screen for recording a song, according to one example embodiment;
[0034] FIG. 11B is a user interface diagram that represents a screen during song recording, according to one example embodiment;
[0035] FIG. 11C is a user interface diagram that represents a screen for saving a recording, according to one example embodiment;
[0036] FIG. 11D is a user interface diagram that represents a screen after saving a recording, according to one example embodiment;
[0037] FIG. 12 is a diagram of hardware that can be used to implement an embodiment of the invention;
[0038] FIG. 13 is a diagram of a chip set that can be used to implement an embodiment of the invention; and
[0039] FIG. 14 is a diagram of a mobile terminal (e.g., handset) that can be used to implement an embodiment of the invention.
DESCRIPTION OF SOME EMBODIMENTS
[0040] Examples of a method, apparatus, and computer program for providing an application user interface for composing music using a color-encoded musical notation system are disclosed. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It is apparent, however, to one skilled in the art that the embodiments of the invention may be practiced without these specific details or with an equivalent arrangement. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.
[0041] FIG. 1 is a diagram of a system capable of providing an application user interface for composing one or more musical notes in a color-encoded musical notation system, according to one embodiment. As noted above, composing music traditionally requires mastery of traditional musical notation systems (e.g., the ability to read notes in staff notation) as well as musical skill and talent to achieve good results. An average user may experience difficulty in understanding written music due to the counter-intuitive nature of traditional music notation. In addition, young users may have difficulty in understanding written music due to the complex nature of the musical structures. As a result, system 100 associates one or more visual representations (e.g., colors) with one or more musical notes. These colors are associated according to a predefined color scheme based on the tonal or rhythmic qualities of the musical notes (e.g., the color of a shape can represent the musical note, and the size or length of the shape can represent the length of the musical note). This allows inexperienced users as well as young users to compose music by using colors, while intuitively becoming aware of musical structures. Subsequently, the colors are converted into musical notes. This process of converting color data to musical notes traditionally has relied on expertise in cross-disciplinary artistic concepts (e.g., music theory, composition, etc.) combined with artistic and musical skill to achieve subjectively pleasing or “good” results. However, this knowledge and skill often is out of reach for average users, thereby limiting the ability of these users to convert color data to musical notes.
[0042] In light of this problem, system 100 of FIG. 1 introduces a unique application user interface (e.g., a game-based application user interface) that enables users to learn and compose music using a color-based musical notation system. In one embodiment, the application user interface includes a first user interface element 121 (e.g., a color-encoded piano keyboard with one or more colors associated with each key) including a full-length keyboard and a zoomed keyboard. The application user interface provided by system 100 also includes a second user interface element 123 with a composition area which comprises a plurality of slots for filling in the colors; the colors may then be read or scanned by a color-reading device 125. In one embodiment, a user may select colors that are associated with keys in the keyboard to fill in the template slots in the second user interface element 123 through an input module 127. In one embodiment, the input module 127 includes a touch-based input (e.g., a stylus or the user's fingers), a voice-based input through a microphone, or a combination thereof. As noted above, for users who are not familiar or skilled in composing music using standard musical notation (e.g., staff notation), the user can more intuitively and advantageously draw or apply colors in an arrangement to create a composition with a visual representation that can then be converted by the system 100 into a musical composition or set of musical notes.
[0043] In one embodiment, the color-reading device 125 produces color data indicating the colors filled in the composition area in the second user interface element 123 by using, e.g., an optical scanner or equivalent technology to measure the color wavelength or other indicator of the colors applied to the composition area in the second user interface element 123. In one embodiment, the color-reading device 125 is part of the application or device that is presenting the application user interface. In other words, the color-reading device 125 can directly capture an image of the composition area within an application or device setting without using an external capture device such as an optical scanner. In one embodiment, color accuracy and the ability to accurately differentiate between a large number of colors is particularly important when trying to map the colors to the full range of musical notes available to, for instance, a full-length keyboard. In the case of a full-length keyboard, the system 100 would use at least 88 colors in the keyboard in the first user interface element 121 to represent each of the 88 notes.
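One non-limiting way to obtain 88 distinguishable colors for a full-length keyboard is to space hues evenly around the color wheel, one per key. The following sketch is illustrative only (the mapping of MIDI note 21 through 108 to an 88-key keyboard is standard; the hue-spacing scheme is an assumption, not part of the disclosure):

```python
import colorsys

def build_color_map(num_keys=88, lowest_midi_note=21):
    """Assign one distinct hue per key of an 88-key keyboard.

    Hues are spaced evenly around the HSV color wheel so that every
    MIDI note (21 = A0 through 108 = C8) gets a unique RGB color.
    """
    color_map = {}
    for i in range(num_keys):
        hue = i / num_keys  # evenly spaced hues in [0, 1)
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        color_map[lowest_midi_note + i] = (
            round(r * 255), round(g * 255), round(b * 255)
        )
    return color_map

colors = build_color_map()
print(len(colors))   # 88 note-to-color entries
print(colors[60])    # RGB tuple assigned to middle C (MIDI note 60)
```

A scheme with widely spaced hues also eases the differentiation problem noted above, since the optical reading step only needs to resolve colors that are several degrees apart on the hue wheel.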
[0044] In one embodiment, the color processing platform 103 can then output the musical composition in any format or media selected by a user. For example, the composition can be played through an audio output as audible music. In another embodiment, the composition can be converted into another musical notation system (e.g., staff notation or any other system of musical notation). In one embodiment, the color processing platform 103 uses an algorithmic process based on certain parameters (e.g., color level, shapes and/or sizes of the colors appearing on the composition area in the second user interface element 123, associated symbols/drawings/patterns, etc.) to determine one or more musical characteristics for generating the composition including, but not limited to, the sound level, pitch, or duration of the one or more musical notes. In one example embodiment, the size of the color in the composition area in the second user interface element 123 determines the duration of the musical note.
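The size-to-duration rule described in this example embodiment could be sketched as a simple proportional mapping. The specific scheme below (slot size 1 = eighth note, doubling size doubles duration) is an illustrative assumption, not a rule stated in the disclosure:

```python
def slot_duration(size, base_duration=0.5):
    """Map the rendered size/length of a color slot to a note duration.

    Illustrative scheme: a slot of size 1 is an eighth note, size 2 a
    quarter note, size 4 a half note, etc., where base_duration gives
    the eighth note's length in beats.
    """
    if size <= 0:
        raise ValueError("slot size must be positive")
    return base_duration * size

print(slot_duration(1))  # 0.5 -> eighth note
print(slot_duration(2))  # 1.0 -> quarter note
print(slot_duration(4))  # 2.0 -> half note
```

Other parameters mentioned above (color level, associated symbols or patterns) could feed analogous lookup rules for sound level or pitch.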
[0045] As shown in FIG. 1, the system 100 comprises UE 101 that may include or be associated with applications 107, sensors 111, first user interface element 121 and second user interface element 123.
[0046] In one embodiment, the UE 101 has connectivity to a color processing platform 103 via a communication network 105, e.g., a wireless communication network. In another embodiment, the color processing platform 103 performs one or more functions associated with converting color data into one or more musical notes. In one embodiment, the conversion from color data to musical notes using this method includes receiving a second user interface element 123 on which a user has applied a visual arrangement of colors through the input module 127 (e.g., colored pens or the user's fingers), the colors having been mapped to correspond to particular musical notes. As noted above, the color processing platform 103 can pre-determine which colors correspond to which musical notes, and which sizes of the respective color shapes correspond to which lengths of musical notes (e.g., eighth notes, quarter notes, half notes, etc.). In one embodiment, the color processing platform 103 can provide one or more color sets to be applied to the keyboard in the first user interface element 121.
[0047] In another embodiment, the system 100 can enable one or more users to dynamically map the colors to the corresponding musical notes in the keyboard of the first user interface element 121. In one example embodiment, the color processing platform 103 can present a user interface in which the user can manually specify one or more colors to correspond to one or more musical notes in the keyboard of the first user interface element 121. In one embodiment, this manual correlation can occur on a color-by-color basis. In addition or alternatively, the color processing platform 103 can shift colors along a musical scale based on a change in a single note or color combination. In one example embodiment, if red is matched to middle C by default and the user changes the mapping so that green is matched to middle C, the color processing platform 103 can use the same initial color sequence but shift all other default colors in the sequence so that green corresponds to middle C on the musical scale.
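The scale-shift behavior in this example (remapping one color and rotating the rest of the default sequence with it) could be sketched as follows. The seven-note scale and rainbow color sequence are illustrative assumptions:

```python
def shift_color_mapping(notes, colors, anchor_note, new_color):
    """Rotate a default color sequence so that `new_color` lands on
    `anchor_note`, preserving the relative order of all other colors."""
    offset = (colors.index(new_color) - notes.index(anchor_note)) % len(colors)
    shifted = colors[offset:] + colors[:offset]
    return dict(zip(notes, shifted))

notes = ["C", "D", "E", "F", "G", "A", "B"]
default_colors = ["red", "orange", "yellow", "green", "blue", "indigo", "violet"]

# Default mapping puts red on C. Remap so green now corresponds to middle C:
remapped = shift_color_mapping(notes, default_colors, "C", "green")
print(remapped["C"])  # green
print(remapped["D"])  # blue (the rest of the sequence shifts with it)
```

Because the rotation preserves the order of the original color sequence, a single user change propagates consistently across the whole keyboard, as described above.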
[0048] As shown in FIG. 1, the system 100 comprises UE 101. In one embodiment, the UE 101 is any type of mobile terminal, fixed terminal, or portable terminal including a navigation unit (e.g., in-vehicle or standalone), a mobile handset, station, unit, device, multimedia computer, multimedia tablet, Internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system (PCS) device, personal navigation device, personal digital assistants (PDAs), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, game device, or any combination thereof, including the accessories and peripherals of these devices, or any combination thereof. It is also contemplated that the UE 101 can support any type of interface to the user (such as “wearable” circuitry, etc.).
[0049] In one embodiment, the color processing platform 103 may be a platform with multiple interconnected components. The color processing platform 103 may include one or more servers, intelligent networking devices, computing devices, components and corresponding software for converting color data into one or more musical notes (and vice versa). In addition, it is noted that the color processing platform 103 may be a separate entity of the system 100, a part of the one or more services 115a-115n (collectively referred to as services 115) of the service platform 113, or the UE 101. Any known or still developing methods, techniques or processes for converting color data into one or more musical notes may be employed by the color processing platform 103.
[0050] In one embodiment, the color processing platform 103 may read respective colors applied to a composition area by a plurality of drawing instruments as color data using a color-reading device 125. Then, the color processing platform 103 may process the color data to generate a composition of the one or more musical notes from the color data based on the set of musical notes that correspond to the respective colors in the color data. In other words, in one embodiment, the color processing platform 103 may convert color data into one or more musical notes by: (1) corresponding or mapping one or more features (e.g., colors, color patterns, and color sizes) to one or more musical notes in a color-encoded musical module; (2) determining the one or more features applied to a composition area; and (3) generating a composition of one or more musical notes with a visual representation or an aural representation of the one or more musical notes that correspond to the respective colors in the composition area.
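Steps (1) through (3) above could be sketched, by way of non-limiting illustration, as a lookup from scanned slot colors to note names. The legend contents and note names below are illustrative assumptions:

```python
# Hypothetical step (1): a legend mapping features (here, colors) to notes.
LEGEND = {"red": "C4", "orange": "D4", "yellow": "E4", "green": "F4"}

def compose_from_colors(slot_colors, legend=LEGEND):
    """Steps (2) and (3): take the colors determined from the composition
    area and emit the corresponding sequence of musical notes, skipping
    empty (unfilled) slots."""
    notes = []
    for color in slot_colors:
        if color is None:  # unfilled slot
            continue
        if color not in legend:
            raise KeyError(f"no musical note mapped to color {color!r}")
        notes.append(legend[color])
    return notes

scanned = ["red", "red", "green", None, "yellow"]
print(compose_from_colors(scanned))  # ['C4', 'C4', 'F4', 'E4']
```

The resulting note sequence can then be rendered aurally or visually, corresponding to the aural or visual representation recited in step (3).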
[0051] In one embodiment, the color processing platform 103 may generate a legend correlating the one or more colors, their patterns, and their sizes to the one or more musical notes and their respective lengths. In one example embodiment, the color processing platform 103 may use the features extracted from the color data to match each color to its corresponding musical note. In another embodiment, the color processing platform 103 may process one or more color data to determine one or more elements (e.g., shades, ranges, hues, brightness, contrasts, purity) of the color data. In a further embodiment, the color processing platform 103 may determine the size of the one or more colors, wherein the size of the colors can be used to represent the duration or length of the musical notes.
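A legend of the kind described here could be rendered, for instance, as a simple two-part text table: one part correlating colors to notes, the other correlating slot sizes to note lengths. All mappings in this sketch are illustrative:

```python
def render_legend(color_to_note, size_to_length):
    """Render a textual legend correlating colors to musical notes and
    slot sizes to note lengths (all mappings here are illustrative)."""
    lines = ["Colors:"]
    for color, note in color_to_note.items():
        lines.append(f"  {color} -> {note}")
    lines.append("Sizes:")
    for size, length in size_to_length.items():
        lines.append(f"  {size} slot(s) -> {length} note")
    return "\n".join(lines)

legend = render_legend(
    {"red": "C", "blue": "G"},
    {1: "eighth", 2: "quarter", 4: "half"},
)
print(legend)
```

Such a legend gives users a human-readable key for both dimensions of the encoding: pitch (via color) and duration (via size).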
[0052] Further, various elements of the system 100 may communicate with each other through the communication network 105. The communication network 105 of system 100 includes one or more networks such as a data network, a wireless network, a telephony network, or any combination thereof. It is contemplated that the data network may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), a public data network (e.g., the Internet), short range wireless network, or any other suitable packet-switched network, such as a commercially owned, proprietary packet-switched network, e.g., a proprietary cable or fiber-optic network, and the like, or any combination thereof. In addition, the wireless network may be, for example, a cellular communication network and may employ various technologies including enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (Wi-Fi), wireless LAN (WLAN), Bluetooth®, Internet Protocol (IP) data casting, satellite, mobile ad-hoc network (MANET), vehicle controller area network (CAN bus), and the like, or any combination thereof.
[0053] In one embodiment, one or more content providers 117a-117n (collectively referred to as content provider 117) enable musical notes and visual representations to be derived from existing songs. The existing songs may be uploaded from database 119 or downloaded from service platform 113. For example, an existing song “Twinkle twinkle little star” may be fed into the program. The UE 101 can generate one or more musical notes derived from the song. The output device 109 and the second user interface element 123 can generate an audio output and a corresponding visual representation of the one or more musical notes.
[0054] In one embodiment, the generated composition of one or more musical notes and corresponding visual representations can be saved in database 119 or uploaded and shared in service platform 113.
[0055] The UE 101 may further include applications 107 to perform one or more functions of converting color data into one or more musical notes. In one embodiment, the applications 107 and the color processing platform 103 interact according to a client-server model. It is noted that the client-server model of computer process interaction is widely known and used. According to the client-server model, a client process sends a message including a request to a server process, and the server process responds by providing a service. The server process may also return a message with a response to the client process. Often the client process and server process execute on different computer devices, called hosts, and communicate via a network using one or more protocols for network communications. The term “server” is conventionally used to refer to the process that provides the service, or the host computer on which the process operates. Similarly, the term “client” is conventionally used to refer to the process that makes the request, or the host computer on which the process operates. As used herein, the terms “client” and “server” refer to the processes, rather than the host computers, unless otherwise clear from the context. In addition, the process performed by a server can be broken up to run as multiple processes on multiple hosts (sometimes called tiers) for reasons that include reliability, scalability, and redundancy, among others.
[0056] In one embodiment, the UE 101 further has connectivity to one or more input modules 127. In one embodiment, the input module may include a microphone for capturing one or more musical notes. For example, the input module 127 can capture a user’s singing or a song playing. In another embodiment, the input module 127 includes a color-reading device. For example, for ingesting color data or musical tones, the input module 127 may include a camera or a scanner for capturing image data or a quick response code (QR code), wherein the QR code is associated with color data or musical notes. It is contemplated that the input module may be configured with any sensor suitable for sampling or capturing visual data into digital format for processing by the system 100.
[0057] In one embodiment, the UE 101 further has connectivity to one or more audio output devices 109. In one embodiment, for outputting musical notes in an aural or visual representation, the output device 109 can be configured with any number of suitable output modules. For example, the output device 109 may be configured with displays (e.g., monitors, projectors, televisions, etc.) for aural or visual representation of the one or more musical notes. In addition, the output device 109 may include devices for creating physical versions (e.g., paper, canvas, and/or other media such as wood, stone, etc.) of the one or more musical notes converted from the color data or uploaded from the input module 127. These devices include, but are not limited to, printers, three-dimensional printers, computerized numerical control (CNC) machines, printing presses, and the like.
[0058] The system 100 also includes one or more sensors 111, which can be implemented, embedded or connected to the UE 101. It is contemplated that the UE 101 may be configured with any sensors suitable for sampling or capturing image data into digital format for processing by the system 100. It is also contemplated that the UE 101 may be configured with any sensors suitable for sampling or capturing music data into digital format for processing by the system 100. The sensors 111 may be any type of sensor. In one embodiment, the type of sensors 111 configured can be based on the type of source data. For example, it is contemplated that image data can include color data presented in any form, such as drawings. The UE 101 can also use a microphone sensor to capture a song or singing for conversion into one or more musical notes and corresponding visual representations. In one example embodiment, a color-reading device is configured with sensors 111 capable of reading color values. In this way, a user can create musical notes by using the color-reading device to read different colors (e.g., from a drawing, an existing image, painting, or other visual representation). The colors that are read by the color-reading device are then converted into musical notes using the processes discussed with respect to the various embodiments described herein.
[0059] In one embodiment, the UE 101 and/or the color processing platform 103 also have connectivity to a service platform 113 that includes one or more services 115 for providing other services that support the color processing platform 103. By way of example, the service platform 113 may include social networking services/applications, content (e.g., audio, video, images, etc.) provisioning services/applications, application services/applications, storage services/applications, etc. In one embodiment, the service platform 113 may interact with the UE 101, the color processing platform 103 and the content providers 117 to supplement or aid in the processing of the content information. In one embodiment, the service platform 113 may be implemented or embedded in the color processing platform 103 or in its functions.
[0060] By way of example, the services 115 may be an online service that reflects interests and/or activities of one or more users. The services 115 allow users to share activities information, historical user information and interests (e.g., musical interest) within communication network 105 and their individual networks, and provide for data portability. In one embodiment, the service platform 113 and/or services 115 interact with one or more content providers 117a-117n (also collectively referred to as content providers 117) to provide musical notes and/or other related information to the color processing platform 103. The content provided may be any type of content, such as, image content, textual content, audio content (e.g., audio notification), video content (e.g., visual notification), etc. In one embodiment, the content provider 117 may also store content associated with the UE 101, the color processing platform 103, and the services 115 of the service platform 113.
[0061] The system 100 also includes database 119. The database 119 stores one or more musical notes corresponding to one or more colors. The stored information may be of any type that aids in the content provisioning and sharing process.
[0062] The system 100 also can generate an application user interface that includes a first user interface element 121. The first user interface element 121 includes a full-length keyboard and a zoomed keyboard. The first user interface element 121 is configured to associate a plurality of colors with a plurality of keys of a keyboard, wherein the plurality of keys corresponds to one or more respective musical notes. In one embodiment, the full-length keyboard is a color-encoded 88-note piano keyboard. In one embodiment, the zoomed keyboard is an active octave of a zoomed representation of the 88-note piano keyboard. In one embodiment, one or more colors are assigned to each key in the zoomed keyboard. In addition or alternatively, the first user interface element 121 includes two keyboard control systems for easy navigation between octaves. In one embodiment, the color-encoded musical module 121 is a virtual instrument in a computer application or a mobile phone application. It is noted that a piano keyboard is provided by way of illustration and not as a limitation. It is contemplated that any type of musical instrument can be represented and used in the embodiments of the application user interface described herein.
[0063] The application user interface of system 100 can also include a second user interface element 123. In one embodiment, the second user interface element 123 includes a composition area comprising a plurality of slots for creating a visual representation of the musical composition. In one embodiment, the second user interface element 123 is configured to apply a plurality of colors. One of the plurality of slots is an active slot. A user can fill the active slot with colors by interacting with keys in the zoomed keyboard. In another embodiment, similar to the first user interface element 121 described above, the second user interface element 123 can be a virtual user interface in a computer application (e.g., the same computer application supporting the first user interface element 121 described above). In this case, the user can change the size of the slots by dragging the slot left or right.
[0064] The system 100 also includes color-reading device 125. In one embodiment, the color-reading device 125 may read respective colors applied to a physical composition area via an image sensor, a scanner, or a combination thereof. In another embodiment, the color-reading device 125 may scan or otherwise detect a shape or a size of the respective colors applied to the physical composition area as part of the color data. In a further embodiment wherein the system 100 is implemented as a computer application rather than physical components, the color-reading device 125 may be a color-reading module of the computer application that supports or provides the virtual first user interface 121 and virtual second user interface 123.
[0065] By way of example, the UE 101, the color processing platform 103, and the conversion application 107 communicate with each other and other components of the communication network 105 using well known, new or still developing protocols. In this context, a protocol includes a set of rules defining how the network nodes within the communication network 105 interact with each other based on information sent over the communication links. The protocols are effective at different layers of operation within each node, from generating and receiving physical signals of various types, to selecting a link for transferring those signals, to the format of information indicated by those signals, to identifying which software application executing on a computer system sends or receives the information. The conceptually different layers of protocols for exchanging information over a network are described in the Open Systems Interconnection (OSI) Reference Model.
[0066] Communications between the network nodes are typically effected by exchanging discrete packets of data. Each packet typically comprises (1) header information associated with a particular protocol, and (2) payload information that follows the header information and contains information that may be processed independently of that particular protocol. In some protocols, the packet includes (3) trailer information following the payload and indicating the end of the payload information. The header includes information such as the source of the packet, its destination, the length of the payload, and other properties used by the protocol. Often, the data in the payload for the particular protocol includes a header and payload for a different protocol associated with a different, higher layer of the OSI Reference Model. The header for a particular protocol typically indicates a type for the next protocol contained in its payload. The higher layer protocol is said to be encapsulated in the lower layer protocol. The headers included in a packet traversing multiple heterogeneous networks, such as the Internet, typically include a physical (layer 1) header, a data-link (layer 2) header, an internetwork (layer 3) header and a transport (layer 4) header, and various application (layer 5, layer 6 and layer 7) headers as defined by the OSI Reference Model.
[0067] FIG. 2 is a diagram of the components of the color processing platform 103, according to one example embodiment. By way of example, the color processing platform 103 includes one or more components for converting color data into one or more musical notes. It is contemplated that the functions of these components may be combined in one or more components or performed by other components of equivalent functionality. In one embodiment, the color processing platform 103 comprises one or more configuration modules 201, mapping modules 203, color processing modules 205, and presentation modules 207, or any combination thereof.
[0068] In one embodiment, the configuration module 201 may configure a color-reading device or application module to scan respective colors applied to a composition area. In another embodiment, the configuration module 201 may configure an application of a mobile device to read respective colors in the composition area of the application. In one example embodiment, the color-reading device and/or the application of a mobile device includes an image sensor, a scanner, or a combination thereof to read respective colors. In a further embodiment, the configuration module 201 may configure color sets to a plurality of keys in the color-encoded musical module.
[0069] In one embodiment, the mapping module 203 may associate at least one color with at least one musical note. In another embodiment, the mapping module 203 may associate at least one color pattern with at least one set of musical notes. In a further embodiment, the mapping module 203 may correlate the size of the color drawn on the composition area, the composition area of an application, or a combination thereof to the duration for the one or more musical notes.

[0070] In one embodiment, the color processing module 205 may process the color data to generate a composition of the one or more musical notes from the color data based on the set of musical notes that correspond to the respective colors in the color data. In one example embodiment, the color processing module 205 may generate a composition of musical notes that correspond to the respective colors based, at least in part, on the sequence of colors applied to a composition area, a composition area of an application, or a combination thereof. In another embodiment, the color processing module 205 may be configured to determine note duration information for the one or more musical notes in the composition based on the shape or the size of the respective colors. In one example embodiment, the color processing module 205 may generate a composition based, at least in part, on the duration information for the one or more musical notes. In a further embodiment, the color processing module 205 is configured to generate a representation of the composition in staff notation and to output the composition in the staff notation via an output device. In another example embodiment, the color processing module 205 may process a drawing to determine a sequence for one or more colors. Then, the color processing module 205 may select one or more musical notes that correlate to the one or more colors based, at least in part, on the sequence.
Subsequently, the color processing module 205 may convert the one or more colors into the one or more musical notes.
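By way of a non-limiting illustration, the conversion performed by the color processing module 205 can be sketched in a few lines of Python. The color-to-note table, the base duration, and the function names below are assumptions made for illustration only and are not the platform's actual implementation:

```python
# Illustrative sketch: convert an ordered sequence of (color, size) pairs
# read from a composition area into (note, duration) pairs. The lookup
# table and base duration are assumed values, not the platform's own.
COLOR_TO_NOTE = {"red": "C", "orange": "D", "yellow": "E", "green": "F"}
BASE_DURATION_BEATS = 1.0  # assumed duration of a size-1 slot

def colors_to_composition(color_data):
    """color_data: iterable of (color_name, relative_size) in drawn order."""
    composition = []
    for color, size in color_data:
        note = COLOR_TO_NOTE[color]
        # Note duration is proportional to the size of the colored region.
        composition.append((note, BASE_DURATION_BEATS * size))
    return composition

print(colors_to_composition([("red", 1), ("yellow", 2), ("green", 1)]))
# [('C', 1.0), ('E', 2.0), ('F', 1.0)]
```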
[0071] In one embodiment, the presentation module 207 may represent the composition in staff notation in at least one user interface of at least one device. In one embodiment, the representation includes a visual representation, an aural representation, or a combination thereof. In one embodiment, the visual representation includes color representations of the composition in the composition area. In another embodiment, the visual representation also includes a music visualizer. In one embodiment, the default aural representation of the composition is a piano style. In another embodiment, a user can select a different instrument from an instrument list to play the composition. In this case, the instrument list includes violin, guitar, trumpet, flute, or a combination thereof. In such case, the system sends a prompt for payment to the user. When the user’s payment information is authorized, the system presents the instrument list for the user to select. In a further embodiment, the presentation module 207 employs various application programming interfaces (APIs) or other function calls corresponding to the applications 107 of UE 101 and/or output device 109, thus enabling the display of graphics primitives such as menus, buttons, data entry fields, etc., for generating the user interface elements. In one embodiment, the presentation module 207 enables a presentation of a graphical user interface (GUI) for displaying one or more colors to the users for drawing on a canvas of an application.
[0072] The above presented modules and components of the color processing platform 103 can be implemented in hardware, firmware, software, or a combination thereof. Though depicted as a separate entity in FIG. 1, it is contemplated that the color processing platform 103 may be implemented for direct operation by respective UE 101. As such, the color processing platform 103 may generate direct signal inputs by way of the operating system of the UE 101 for interacting with the applications 107. In another embodiment, one or more of the modules 201-207 may be implemented for operation by respective UEs, as the color processing platform 103, or a combination thereof. Still further, the color processing platform 103 may be integrated for direct operation with the services 115, such as in the form of a widget or applet, in accordance with an information and/or subscriber sharing arrangement. The various executions presented herein contemplate any and all arrangements and models.
[0073] FIG. 3 is a flowchart of a process for converting color data into one or more musical notes, according to one example embodiment. In one embodiment, the color processing platform 103 performs the process 300 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 10. As such, the color processing platform 103 and/or any of its modules can be means for performing the process 300 or any other processes described herein for providing an application user interface for composing color-encoded music.
[0074] In step 301, system 100 may present a user interface to a user. The user interface comprises a first user interface element and a second user interface element. In one embodiment, the first user interface element presents a representation of a musical keyboard for creating a musical composition. In this case, the representation of the musical keyboard includes a full-length keyboard representation that highlights an active octave of a zoomed keyboard representation. In one embodiment, the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition. In one embodiment, a user can modify the plurality of slots in the composition area. In this case, the modification includes altering a total number of slots and changing the size of the slots in the composition area. In another embodiment, the user interface is a virtual user interface in the computer application. The computer application is executable on a mobile device. In one embodiment, the mobile device is a mobile phone or a tablet. In one example use case, the mobile phone or the tablet runs the iOS or Android operating system.
[0075] In step 303, the color processing platform 103 assigns one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note. In one embodiment, the one or more respective colors are assigned to a plurality of keys of the representation of the zoomed keyboard. In another embodiment, a user can replace the one or more respective colors with one or more different respective colors based on user preference. In this case, system 100 assigns the one or more different respective colors to the plurality of keys of the representation of the musical keyboard. In one embodiment, the zoomed keyboard includes more than one octave. In such case, system 100 renders one or more respective colors with different saturation values to indicate different octaves of the respective musical notes.
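The octave-dependent saturation rendering described above can be illustrated with Python's standard colorsys module; the saturation step size and reference octave below are assumptions for illustration, not values specified by the embodiments:

```python
# Illustrative sketch: render the same hue at different saturation values
# to distinguish octaves. The step size and base octave are assumed.
import colorsys

def octave_color(base_rgb, octave, base_octave=4, step=0.25):
    """Shift a key's base color saturation by its octave distance."""
    r, g, b = base_rgb
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    s = max(0.0, min(1.0, s - step * (octave - base_octave)))
    return colorsys.hls_to_rgb(h, l, s)

# Pure red for the reference octave; a desaturated red one octave higher.
print(octave_color((1.0, 0.0, 0.0), 4))
print(octave_color((1.0, 0.0, 0.0), 5))
```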
[0076] In step 305, system 100 receives a user input via the first user interface that indicates interaction with a played key of the plurality of keys of the representation of the musical keyboard. In one embodiment, the user input includes a touch-based input, a voice-based input, or a combination thereof. In one embodiment, the user input that indicates the interaction with the at least one key is received via the zoomed keyboard representation.
[0077] In step 307, the color processing platform 103 may render a color of an active slot of the composition area in the second user interface element based on the one or more respective colors assigned to the played key and its respective musical note to represent a composed musical note of the musical composition. In one embodiment, the color processing platform may use a color-reading device to read the one or more respective colors in the active slot. In one embodiment, a color-reading device includes an image sensor, a scanner, or a combination thereof for reading colors applied to the composition area. In one embodiment, a user can change the size of the active slot. In this case, the user can extend or reduce a length of the active slot, wherein the length of the active slot represents a duration of the corresponding musical note for the active slot. In another embodiment, after receiving the user input that indicates the interaction with the played key, system 100 designates a next slot in the second user interface element as a next active slot for creating the musical composition. In one embodiment, a user can select any slot in the composition area as the active slot.
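Steps 301 through 307 can be summarized as a small state sketch; the class and method names below are illustrative only and do not correspond to an actual API of system 100:

```python
# Illustrative sketch of the composition loop: pressing a colored key
# fills the active slot, then the next slot becomes active automatically.
class CompositionArea:
    def __init__(self, num_slots):
        self.slots = [None] * num_slots   # None marks an empty slot
        self.active = 0                   # index of the active slot

    def press_key(self, color, note):
        """Render the played key's color into the active slot, then advance."""
        self.slots[self.active] = (color, note)
        if self.active < len(self.slots) - 1:
            self.active += 1

area = CompositionArea(num_slots=4)
area.press_key("red", "C")
area.press_key("orange", "D")
print(area.slots)    # first two slots filled; slot index 2 is now active
```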
[0078] FIG. 4 is a flowchart of a process for recording and/or uploading audio data and generating a visual representation of the audio data, according to one example embodiment. In one embodiment, the color processing platform 103 performs the process 400 and is implemented in, for instance, a chip set including a processor and a memory as shown in FIG. 10.
[0079] In step 401, a user may record and/or upload audio data into system 100. In one embodiment, the audio data is uploaded from one or more databases, downloaded from a network server via the Internet, recorded by a microphone of user device, or a combination thereof. For example, the color processing platform 103 can present a user interface on a user device that includes an option to upload a data file containing the audio (e.g., in any standard format known in the art such as, but not limited to, MP3, WAV, lossless audio format, etc.). In addition or alternatively, the user interface can include a recording option that enables a corresponding user device to capture audio data through a user device’s microphone, external microphone, connected instrument (e.g., through a MIDI interface, etc.), and/or the like. In one embodiment, the color processing platform 103 can define a minimum audio capture quality (e.g., sample rate, number of bits, etc.) for uploaded and/or recorded music.
[0080] In step 403, system 100 determines one or more attributes of the audio data, wherein the one or more attributes include one or more musical notes of a song or composition represented in the audio data. In other words, the color processing platform 103 processes the audio data to identify a sequence of musical notes and their respective durations as represented in the audio data. In some embodiments, other characteristics such as relative volume, instrumentation, stereo position, etc. can also be extracted from the audio data for each note.
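The specification does not fix how the notes are extracted in step 403, but once a pitch detector has produced fundamental frequencies, one common approach is to snap each frequency to the nearest equal-tempered pitch (A4 = 440 Hz). The following is an illustrative sketch under that assumption:

```python
# Illustrative sketch: map a detected fundamental frequency to the
# nearest equal-tempered note name (assumes A4 = 440 Hz tuning).
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_note(freq_hz):
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))  # MIDI note number
    name = NOTE_NAMES[midi % 12]
    octave = midi // 12 - 1
    return f"{name}{octave}"

print(frequency_to_note(440.0))   # A4
print(frequency_to_note(261.63))  # C4 (middle C)
```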
[0081] In step 405, the color processing platform 103 maps the one or more extracted musical notes to one or more respective colors. In one embodiment, the mapping is based on the association of the one or more colors with a plurality of keys of a keyboard, wherein each of the plurality of keys corresponds to a respective musical note. In other embodiments, the association can map each musical note directly to a different color instead of the keys of a keyboard. For example, each note in an octave can be assigned to different colors so that a musical note A corresponds to a first color (e.g., red), musical note B corresponds to a second color (e.g., orange), and so on.
[0082] In step 407, system 100 generates a visual representation of the audio data in the composition area of the device user interface (e.g., as generated in the process 300 of FIG. 3). In one embodiment, the visual representation includes one or more colors and shapes of the one or more colors.
[0083] FIGs. 5A and 5B are diagrams that represent the first user interface element, according to one example embodiment. In one embodiment, the first user interface element 500 includes a two-keyboard control system. The first keyboard control system includes a full-length keyboard representation 502 having fifty-two white keys and thirty-six black keys. These keys may be repeated every octave, thereby giving seven basic tones. In one example embodiment, the full-length keyboard representation 502 may generate a realistic piano tune, and a user may compose music by clicking the keys of the full-length keyboard 502. In another embodiment, the full-length keyboard representation 502 includes region 504. In one scenario, a user may select the highlighted region 504 as an active octave in the full-length keyboard 502 by moving and/or changing the size of the frame 503. Subsequently, the highlighted region 504 is presented in the second keyboard control system as zoomed keyboard 505. This zoomed keyboard 505 may include one octave, two octaves, or more.
[0084] In one embodiment, system 100 may assign one or more respective colors to each key in the zoomed keyboard 505. As shown in FIG. 5A, at least one color may be assigned to the at least one white key in the zoomed keyboard 505. In another embodiment, seven colors of a rainbow may be assigned to each of the seven white keys in the zoomed keyboard 505. In one example embodiment, for a gradient of red fading to violet, the system 100 may match the midrange sequence of basic tones C, D, E, F, G, A, B, to a digitally encoded sequence of 1, 3, 5, 6, 8, 10, and 12. In another example embodiment, at least one color 507, e.g., red, may be assigned to key C1 of the zoomed keyboard 505. On the other hand, black keys of the zoomed keyboard 505 are assigned multiple colors, e.g., two colors associated with two adjacent white keys. In one scenario, red and orange colors as represented by 509 may be assigned to key C# (Db), which is marked as key 2 on full-length keyboard 502.
[0085] By way of example, the following Table 1 shows how timbre is encoded to correspond with certain colors:
Tone/Musical Note    Color
C1                   Red
C# (Db)              Red/Orange
D                    Orange
D# (Eb)              Orange/Yellow
E                    Yellow
F                    Green
F# (Gb)              Green/Blue
G                    Blue
G# (Ab)              Blue/Indigo
A                    Indigo
A# (Bb)              Indigo/Violet
B                    Violet

Table 1
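Table 1 can be expressed as a simple lookup table; representing a black key's blended color as a pair of color names is an illustrative choice, not a requirement of the embodiments:

```python
# Table 1 as a lookup table. Black keys blend the colors of the two
# adjacent white keys, represented here as a pair of color names.
TONE_TO_COLOR = {
    "C":  "red",
    "C#": ("red", "orange"),
    "D":  "orange",
    "D#": ("orange", "yellow"),
    "E":  "yellow",
    "F":  "green",
    "F#": ("green", "blue"),
    "G":  "blue",
    "G#": ("blue", "indigo"),
    "A":  "indigo",
    "A#": ("indigo", "violet"),
    "B":  "violet",
}

# Inverting the single-color entries recovers the note for any plain slot.
COLOR_TO_TONE = {v: k for k, v in TONE_TO_COLOR.items() if isinstance(v, str)}
print(COLOR_TO_TONE["blue"])   # G
```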
[0086] In one example embodiment, a user may select one or more colors from a color database and associate them with one or more keys in the zoomed keyboard 505.
[0087] In one embodiment, when a user input is received that indicates interaction with a played key, the color of the played key in the first user interface changes. For example, key 510 is the played key. In one embodiment, system 100 plays musical notes associated with the played key. In this example, piano tone D is played upon receiving the interaction. Moreover, the one or more colors associated with the played key are assigned to an active slot in the composition area. In addition, system 100 automatically designates a next slot in the composition area as a next active slot.

[0088] Adverting to FIG. 5B, FIG. 5B includes two octaves in the zoomed keyboard 505. In one example embodiment, a set of seven colors of a rainbow may be assigned to the seven white keys in one octave and another set of seven colors of the rainbow with different saturation values may be assigned to the seven white keys in another octave. In another example embodiment, for a gradient of red fading to violet, the system 100 may assign the midrange sequence of basic tones C, D, E, F, G, A, B, to a digitally encoded sequence of 1, 3, 5, 6, 8, 10, and 12. In addition, for a gradient of light red fading to light violet, the system 100 may assign the midrange sequence of basic tones C, D, E, F, G, A, B, to a digitally encoded sequence of 13, 15, 17, 18, 20, 22, and 24. As represented in FIG. 5B, a color 507, e.g., red, is assigned to key C1 and another color 511, e.g., a lighter red, may be assigned to key C13.
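The digitally encoded sequences above (1, 3, 5, 6, 8, 10, 12 for one octave; 13, 15, 17, 18, 20, 22, 24 for the next) are the semitone positions of the white keys C, D, E, F, G, A, B within consecutive twelve-step octaves, and can be reproduced as follows:

```python
# Illustrative sketch: reproduce the encoded white-key sequences of
# FIGs. 5A and 5B by offsetting semitone positions per octave.
WHITE_KEY_OFFSETS = [1, 3, 5, 6, 8, 10, 12]  # C D E F G A B in one octave

def encode_octave(octave_index):
    """Encoded numbers for the white keys of the given octave (0-based)."""
    return [offset + 12 * octave_index for offset in WHITE_KEY_OFFSETS]

print(encode_octave(0))  # [1, 3, 5, 6, 8, 10, 12]
print(encode_octave(1))  # [13, 15, 17, 18, 20, 22, 24]
```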
[0089] FIG. 6A is a diagram that represents a second user interface, according to one embodiment. In one embodiment, the second user interface element 600 includes a composition area 601. The composition area 601 includes a plurality of slots 611. The plurality of slots 611 forms a slot line 609, and a plurality of slot lines 609 forms the composition area 601. In one embodiment, a user may associate each slot in the composition area with one or more colors, wherein the one or more colors are associated with one or more musical notes as shown in FIGs. 5A-B. In another embodiment, the first empty slot is the active slot. In this example, slot 611 is the active slot and a user may fill the active slot with one or more colors by selecting at least one key from the zoomed keyboard 505. In one embodiment, a user may choose a key by clicking on the key, by a voice-based input, and/or by any other equivalent input means. In another embodiment, each slot in the composition area 601 may be filled with at least one color that corresponds to at least one white key of the zoomed keyboard 505; for example, slot 603 is filled with at least one color. In another example embodiment, each slot may be filled with multiple colors that correspond to at least one black key in the zoomed keyboard 505; for example, slot 608 is filled with two colors.
[0090] In one embodiment, the system 100 may consider the size of one or more slots to determine a time value or duration of the one or more musical notes. By way of example, common durations for notes include the whole note, half note, quarter note, eighth note, etc. In many cases, different notes have different time durations. For example, time duration in a musical score is used to express the relative duration between each bar. Time duration also determines how long a note lasts. In one embodiment, system 100 may match the time duration of a musical note to the size of the color slots and may make the duration of the musical note proportional to the size. In this example, slot 605 represents a musical note that is two times longer than the musical note represented by slot 603, and slot 607 represents a musical note that is three times longer than the musical note represented by slot 603.
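The proportionality rule illustrated by slots 603, 605, and 607 can be sketched as follows; the reference width and the one-beat base value are assumptions made for illustration:

```python
# Illustrative sketch: note duration grows in proportion to slot width.
BASE_SLOT_WIDTH = 1.0   # width of the reference slot (e.g., slot 603)
BASE_NOTE_BEATS = 1.0   # assumed duration of the reference slot, in beats

def slot_duration(slot_width):
    return BASE_NOTE_BEATS * (slot_width / BASE_SLOT_WIDTH)

# A slot twice as wide (605) and three times as wide (607):
print(slot_duration(2.0))  # 2.0
print(slot_duration(3.0))  # 3.0
```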
[0091] In one embodiment, the second user interface element 600 includes a slot control area 615. A user can upload a song and/or one or more musical notes through the “UPLOAD” button 617 to generate a corresponding visual representation. In another embodiment, a user may upload songs and/or one or more musical notes from database 119 and/or communication network 105. In a further embodiment, a user can record a song and/or one or more musical notes through a microphone and may upload it to the system 100. Further, a user may play or stop the uploaded song and/or one or more musical notes through a “PLAY” button 619 and a “STOP” button 621. The user may also save a composition and the corresponding visual representation through a “SAVE” button 623. A user can adjust the composition area through a “TEMPLATE” button 625. In this example, the user can adjust the total number of slots in the composition area 601, the size of each slot, and the number of slots in each slot line 609.
[0092] In one embodiment, the second user interface element 600 also includes a music visualizer 613. In one embodiment, the second user interface element 600 also includes a song control area 627. A user can choose to play or pause the musical composition and the corresponding visual representation through a “PLAY” button 629 and a “PAUSE” button 631, respectively. A user can also choose to record a song or one or more musical notes through a “RECORD” button 633 and play the recorded song or the one or more musical notes in system 100. In another embodiment, the second user interface element 600 also includes buttons representing other functions, for example, a delete or edit note button (not shown for illustrative convenience).
[0093] FIG. 6B is a diagram of a process for determining an active slot, according to one example embodiment. In one embodiment, the first empty slot 635 in the composition area 601 is a default active slot, i.e., the empty slot 635 is ready to be filled with a plurality of colors by the at least one user. The empty slot 635 is represented as an active slot with a smooth blinking animation showing that the slot is in use. In another embodiment, a user may select another slot apart from the empty slot 635 of the plurality of slots in the composition area 601 as an active slot through a touch-based input, e.g., by using a pen or a finger, a voice-based input, or a combination thereof. In one embodiment, a user can fill the active slot with one or more colors by interacting with one key in the first user interface element. After receiving the user input that indicates the interaction, a next slot is automatically designated as a next active slot. In one embodiment, the changing of active slots is indicated with animation in the second user interface element 600. As shown in FIG. 6B, arrow 636 is used to indicate that the active slot has moved to the next slot.
[0094] FIG. 6C is a diagram of a process for changing the length of at least one slot and the duration of the corresponding note, according to one example embodiment. In one example embodiment, each and every slot can be extended or reduced in length. In another embodiment, only the active slot can be extended or reduced in length. A user may click slot 637 and then drag the slot to the left or to the right to decrease or extend the length of slot 637. As discussed above, the length of a slot determines the duration of the corresponding musical note.
[0095] FIG. 7 is a flowchart of a process for playing a color-encoded musical game, according to one embodiment. In step 701, system 100 presents a splash screen to at least one player. The splash screen is a colorful screen featuring a name for at least one game. In one embodiment, the splash screen may include an option of playing music during presentation to the at least one player. In step 703, a start screen is presented to the at least one player. In one embodiment, the start screen includes a plurality of buttons that navigate the at least one player to one or more other screens. In one example embodiment, the start screen may comprise: a “PLAY” button, a “LOAD GAME” button, a “STORE” button and/or an “ACHIEVEMENT” button. In step 705, system 100 may present a game mode selection screen comprising a plurality of game mode buttons based, at least in part, on at least one player selecting the “LOAD GAME” button. In one embodiment, the game mode comprises a plurality of features, such as but not limited to, replicating a song, coloring in one or more slots to match a song, experimental play, and so on. In step 707, a game screen is presented to the at least one player upon selecting a game mode. In one embodiment, the game screen includes a first user interface element 500 and a second user interface element 600. In another embodiment, the game screen 707 includes a “SONG SCREEN” 709, a “LOAD GAME” button 719, a “SAVE GAME” button 721, a “STORE” button 723, a “GAME SETTING” button 727 and an “ACHIEVEMENT” button 733. In one example embodiment, at least one player can upload a song and/or one or more musical notes to the game by selecting the “SONG SCREEN” 709. In another embodiment, the “SONG SCREEN” 709 may further include two buttons: a “RECORD SONG” button 711 and a “LOAD SONG” button 717. In one example embodiment, the at least one player may record a song and/or one or more musical notes via a microphone by clicking the “RECORD SONG” button 711.
After the recording, the at least one player can choose to rewrite the song by clicking the “REWRITE” button 713. The at least one player can also choose to save and upload the recorded song to system 100 by clicking the “SAVE” button 715. Subsequently, the player is redirected to the game screen 707. In another example embodiment, if the at least one player clicks the “LOAD SONG” button 717 in the “SONG SCREEN” 709, the player may upload a song and/or one or more musical notes to system 100 from the database 119 or the communication network 105. Once the song and/or one or more musical notes are uploaded, the player is redirected to the game screen 707.
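The screen-to-screen transitions of steps 701–707 can be sketched as a simple dispatch table. Screen and button names follow the flowchart text; the table-driven design itself is an illustrative assumption, not the disclosed implementation.

```python
# Hypothetical sketch of the FIG. 7 navigation flow: each (screen, button)
# pair maps to a destination screen; actions that complete a song task
# return the player to the game screen, as described in the text.

NAVIGATION = {
    ("start", "PLAY"): "game_mode_selection",
    ("game", "SONG SCREEN"): "song_screen",
    ("song_screen", "RECORD SONG"): "recording",
    ("song_screen", "LOAD SONG"): "load_song",
    ("recording", "SAVE"): "game",   # saved recordings return to game screen 707
    ("load_song", "LOAD"): "game",   # uploaded songs return to game screen 707
    ("game", "SAVE GAME"): "game",   # saving redirects back to game screen 707
}

def navigate(screen: str, button: str) -> str:
    """Return the next screen, staying on the current screen for
    buttons with no defined transition."""
    return NAVIGATION.get((screen, button), screen)
```

Unrecognized buttons leave the player on the current screen rather than raising an error, which mirrors how the flowchart always returns the player to a known screen.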
[0096] In another embodiment, at least one player can choose to reload the game by clicking the “LOAD GAME” button 719 in game screen 707. The player may also save the game by clicking the “SAVE GAME” button 721. After the game is saved, the player is once again redirected to the game screen 707. The game screen 707 also includes a “STORE” button 723. The at least one player may buy a song or a music pack comprising multiple songs by clicking the “STORE” button 723. In a further embodiment, system 100 may send a notification regarding payment information to the at least one player. Once the at least one player validates the payment information and authorizes the payment, the purchased song or music pack is uploaded to UE 101 or the system 100. In one embodiment, a pop-up message confirming the purchase is presented to the player. In another embodiment, the at least one player may upload a song or a music pack from the “STORE” for free. [0097] In one embodiment, the game screen 707 further includes a “GAME SETTING” button 727. In step 729, the player may adjust the composition area 601 in the second user interface element 600 by clicking the “GAME SETTING” button 727. In one embodiment, the player may also adjust the total number of slots in the composition area and the number of slots in each slot line. Next, in step 731, the player may also adjust the first user interface element settings. In one embodiment, the player may also change the color sets assigned to the keyboard. In another embodiment, the player can select one or more octaves as the zoomed keyboard 505. Upon completion, the at least one player is redirected to game screen 707.
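The adjustable settings of steps 729–731 can be sketched as a small configuration object. The field names, default values, and the particular rainbow color set are illustrative assumptions; the disclosure only states that slot counts and keyboard color sets are adjustable.

```python
# Hypothetical sketch of the game settings adjusted via the
# "GAME SETTING" button 727: total slots and slots per line for the
# composition area (step 729), and the color set assigned to the
# keyboard (step 731).

from dataclasses import dataclass, field

@dataclass
class GameSettings:
    total_slots: int = 32     # assumed default size of the composition area
    slots_per_line: int = 8   # assumed default slots in each slot line
    keyboard_colors: dict = field(default_factory=lambda: {
        "C": "red", "D": "orange", "E": "yellow", "F": "green",
        "G": "blue", "A": "indigo", "B": "violet",
    })

    @property
    def slot_lines(self) -> int:
        """Number of slot lines implied by the slot counts (rounded up)."""
        return -(-self.total_slots // self.slots_per_line)
```

Changing `keyboard_colors` corresponds to selecting a different color set for the first user interface element; changing the two slot counts reshapes the composition area in the second user interface element.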
[0098] In one embodiment, the game screen 707 further includes an “ACHIEVEMENT” button 733. In one example embodiment, achievements may include, but are not limited to: the at least one player has played for 10, 20, or 30 hours; the at least one player has used the ‘C’ note 50 times; the at least one player has placed 100 red slots; the at least one player has saved 5 songs; the at least one player has shared 5 songs and at least one other player has liked the shared songs. After viewing his/her achievements, the at least one player may be redirected to the game screen 707.
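The achievement checks listed above can be sketched as predicates over player statistics. The threshold values follow the text; the statistics dictionary keys and the predicate-table design are illustrative assumptions.

```python
# Hypothetical sketch of the achievement checks behind the
# "ACHIEVEMENT" button 733: each achievement is a predicate over a
# dictionary of accumulated player statistics.

ACHIEVEMENTS = {
    "played_10_hours": lambda s: s.get("hours_played", 0) >= 10,
    "c_note_50_times": lambda s: s.get("c_note_uses", 0) >= 50,
    "placed_100_red_slots": lambda s: s.get("red_slots_placed", 0) >= 100,
    "saved_5_songs": lambda s: s.get("songs_saved", 0) >= 5,
    "shared_and_liked": lambda s: (s.get("songs_shared", 0) >= 5
                                   and s.get("likes_received", 0) >= 1),
}

def earned_achievements(stats: dict) -> list:
    """Return the names of all achievements the player's statistics satisfy."""
    return [name for name, check in ACHIEVEMENTS.items() if check(stats)]
```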
[0099] FIG. 8 is a diagram that represents a game screen, according to one example embodiment. In one embodiment, the game screen 800 comprises a first user interface element 500 and a second user interface element 600. In another embodiment, the game screen 800 includes a name section 801 for one or more players to enter a game name, a song name, or a painting name. In another embodiment, the first user interface element 500 includes two keyboard control systems: (i) a full-length keyboard 502 and (ii) a zoomed keyboard 505. In a further embodiment, the second user interface element 600 includes a slot control area 615, a composition area 601, and a song control area 627. In another embodiment, the second user interface element 600 also includes buttons representing other functions, for example, a delete or edit note button (not shown for illustrative convenience).
[0100] FIG. 9 is a user interface diagram that represents a game screen based, at least in part, on the orientation of UE 101, according to one example embodiment. In one scenario, the display of a game screen changes from 800 to 803 when the physical orientation of the UE 101 is changed. [0101] FIG. 10A is a user interface diagram that represents a screen to set up at least one game, according to one example embodiment. In one embodiment, game menu screen 1000 includes button 1001 for displaying the menu, button 1003 to start a new game, button 1005 to save the game, button 1007 to load a game, button 1009 for redirecting one or more players to an online store for purchasing one or more songs, and button 1011 for informing the one or more players of their achievements. In another embodiment, button 1009 redirects one or more players to an online store for uploading one or more songs for free. In one scenario, one or more players may select one or more buttons displayed in screen 1000 through a touch-based input, e.g., by using a pen or a finger, a voice-based input, or a combination thereof.
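The orientation-dependent layout of FIG. 9 can be sketched as a simple dimension check. The representation of the layout as a dictionary and the side-by-side/stacked arrangements are illustrative assumptions; the disclosure only states that the game screen changes from 800 to 803 with device orientation.

```python
# Hypothetical sketch: choose a game-screen layout for the keyboard
# element (500) and composition element (600) from the device's
# physical orientation, per FIG. 9.

def layout_for_orientation(width: int, height: int) -> dict:
    """Return an assumed arrangement of UI elements 500 and 600."""
    if width >= height:
        # landscape: elements placed side by side (e.g., screen 800)
        return {"arrangement": "side_by_side", "elements": [500, 600]}
    # portrait: elements stacked vertically (e.g., screen 803)
    return {"arrangement": "stacked", "elements": [500, 600]}
```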
[0102] FIG. 10B is a user interface diagram that represents a screen for saving a game, according to one example embodiment. In one embodiment, screen 1013 includes a back button 1015. At least one player may select the back button 1015 to be redirected to the game menu screen 1000. The screen 1013 also includes a text field 1017 for one or more players to enter a name of their choice for the game. Then, the user may select button 1019 to save the game.
[0103] FIG. 10C is a user interface diagram that represents a screen for saving one or more composed pieces of music and/or the visual representation corresponding to the composed music, according to one example embodiment. In one embodiment, screen 1021 includes a back button 1023; at least one player may select the back button 1023 to be redirected to the game menu screen 1000. In another embodiment, the screen 1021 may include a music visualizer 1025 for visualizing music in a graphic display, e.g., by using a pattern of colors and/or shapes, to provide an abstract interpretation of the music being played, a play button 1027 for playing the composition, a pause button 1029 for pausing the composition and a save button 1031 for saving the composition and/or the corresponding visual representation. In one embodiment, a pop-up message indicating that the composed music and/or the visual representation has been saved is presented to the player. Once the composition and the corresponding visual representation are saved, the player is redirected to the game screen 707.
[0104] FIG. 10D is a user interface diagram that represents a screen for loading a game, according to one example embodiment. In one embodiment, screen 1033 includes a back button 1035; at least one player may select the back button 1035 to be redirected to the game menu screen 1000. In another embodiment, the screen 1033 includes a “name of game” button 1037 that provides information pertaining to at least one song, e.g., song name, artist’s information and/or album information. In a further embodiment, the screen 1033 also includes button 1039 for listening to a song for a specified duration, and once the user is convinced, he/she may choose to buy the songs by clicking button 1041. Subsequently, the purchased song is uploaded to UE 101, and then the player is redirected to game menu screen 1000.
[0105] FIG. 10E is a user interface diagram that represents a catalogue of various genres of music, according to one example embodiment. In one embodiment, screen 1043 includes a back button 1045; at least one player may select the back button 1045 to be redirected to the game menu screen 1000. In another embodiment, the screen 1043 includes buttons 1047, and each button 1047 includes a plurality of songs for a particular genre of music. Further, button 1047 may provide one or more players with song information, e.g., duration of the song, artist’s information, e.g., name of the artist and his/her background, number of songs included in a package and/or payment information, e.g., total cost of each package represented by an individual button 1047. In a further embodiment, the screen 1043 may also include button 1049 for listening to the songs in the package for a specified duration, after which the players may choose to buy the songs by clicking button 1051. In another embodiment, the players may choose to upload the songs by clicking button 1051 for free. In one scenario, upon clicking button 1051, the player may be presented with payment information, and once the transaction is authorized by the player, the selected package including a particular genre of music is uploaded to UE 101.
[0106] FIG. 10F is a user interface diagram that represents a screen for loading music, according to one example embodiment. In one embodiment, the screen 1053, e.g., a screen for loading songs, includes a back button 1055. In one scenario, at least one player may select the back button 1055, whereupon the player is redirected to the game menu screen 1000. In another embodiment, the screen 1053 includes button 1057 for recording new songs. The player may record a new song by using the microphone of the UE 101. In a further embodiment, the screen 1053 may also include button 1059 for uploading one or more songs saved in the UE 101 and/or from the database 119. In another embodiment, the screen 1053 may also include button 1061 for purchasing songs. In one scenario, the player may be redirected to a store screen 1043 upon clicking button 1061, and the player may purchase music of his/her choice.
[0107] FIG. 11A is a user interface diagram that represents a screen for recording a song, according to one example embodiment. In one embodiment, screen 1101 includes a back button 1103; at least one player may select the back button 1103 to be redirected to the game menu screen 1000. In another embodiment, the screen 1101 includes a music visualizer 1105. Before recording is started, the music visualizer 1105 is disabled. In a further embodiment, the screen 1101 also includes a play button 1107 for playing the recorded song, a record button 1109 for recording and a save button 1111 for saving the recorded song. Before the recording is started, the play button 1107 is disabled.
[0108] FIG. 11B is a user interface diagram that represents a screen during song recording, according to one example embodiment. In one embodiment, screen 1113 includes a back button 1115; at least one player may select the back button 1115 to be redirected to the game menu screen 1000. In another embodiment, the screen 1113 includes a music visualizer 1117. During recording, the music visualizer 1117 is enabled. In a further embodiment, the screen 1113 also includes a play button 1119 for playing the recorded song, a pause button 1121 for pausing the recording and a save button 1123 for saving the recorded song.
[0109] FIG. 11C is a user interface diagram that represents a screen for saving a recording, according to one example embodiment. In one embodiment, a pop-up message 1127 is presented in screen 1125. The pop-up message 1127 includes a text field 1128 for entering a name for the recording. Then, the user may select button 1131 to save the recording. The user can also select button 1129 to rewrite the recording, in which case a rewrite pop-up message is presented to the user. The rewrite pop-up message includes a “CANCEL” button and a “YES” button. In one scenario, the user may be redirected to screen 1053 upon clicking the “CANCEL” button, and the user may be redirected to recording screen 1101 upon clicking the “YES” button.
[0110] FIG. 11D is a user interface diagram that represents a screen after saving a recording, according to one example embodiment. In one embodiment, screen 1133 includes a pop-up message 1135 indicating the song has been saved and added to the game. [0111] The processes described herein for converting color data into one or more musical notes may be advantageously implemented via software, hardware, firmware or a combination of software and/or firmware and/or hardware. For example, the processes described herein may be advantageously implemented via processor(s), a Digital Signal Processing (DSP) chip, an Application Specific Integrated Circuit (ASIC), Field Programmable Gate Arrays (FPGAs), etc. Such exemplary hardware for performing the described functions is detailed below.
[0112] FIG. 12 illustrates a computer system 1200 upon which an embodiment of the invention may be implemented. Although computer system 1200 is depicted with respect to a particular device or equipment, it is contemplated that other devices or equipment (e.g., network elements, servers, etc.) within FIG. 12 can deploy the illustrated hardware and components of system 1200. Computer system 1200 is programmed (e.g., via computer program code or instructions) to convert color data into one or more musical notes as described herein and includes a communication mechanism such as a bus 1210 for passing information between other internal and external components of the computer system 1200. Information (also called data) is represented as a physical expression of a measurable phenomenon, typically electric voltages, but including, in other embodiments, such phenomena as magnetic, electromagnetic, pressure, chemical, biological, molecular, atomic, sub-atomic and quantum interactions. For example, north and south magnetic fields, or a zero and non-zero electric voltage, represent two states (0, 1) of a binary digit (bit). Other phenomena can represent digits of a higher base. A superposition of multiple simultaneous quantum states before measurement represents a quantum bit (qubit). A sequence of one or more digits constitutes digital data that is used to represent a number or code for a character. In some embodiments, information called analog data is represented by a near continuum of measurable values within a particular range. Computer system 1200, or a portion thereof, constitutes a means for performing one or more steps of converting color data into one or more musical notes.
[0113] A bus 1210 includes one or more parallel conductors of information so that information is transferred quickly among devices coupled to the bus 1210. One or more processors 1202 for processing information are coupled with the bus 1210. [0114] A processor (or multiple processors) 1202 performs a set of operations on information as specified by computer program code related to converting color data into one or more musical notes. The computer program code is a set of instructions or statements providing instructions for the operation of the processor and/or the computer system to perform specified functions. The code, for example, may be written in a computer programming language that is compiled into a native instruction set of the processor. The code may also be written directly using the native instruction set (e.g., machine language). The set of operations include bringing information in from the bus 1210 and placing information on the bus 1210. The set of operations also typically include comparing two or more units of information, shifting positions of units of information, and combining two or more units of information, such as by addition or multiplication or logical operations like OR, exclusive OR (XOR), and AND. Each operation of the set of operations that can be performed by the processor is represented to the processor by information called instructions, such as an operation code of one or more digits. A sequence of operations to be executed by the processor 1202, such as a sequence of operation codes, constitute processor instructions, also called computer system instructions or, simply, computer instructions. Processors may be implemented as mechanical, electrical, magnetic, optical, chemical, or quantum components, among others, alone or in combination.
[0115] Computer system 1200 also includes a memory 1204 coupled to bus 1210. The memory 1204, such as a random access memory (RAM) or any other dynamic storage device, stores information including processor instructions for converting color data into one or more musical notes. Dynamic memory allows information stored therein to be changed by the computer system 1200. RAM allows a unit of information stored at a location called a memory address to be stored and retrieved independently of information at neighboring addresses. The memory 1204 is also used by the processor 1202 to store temporary values during execution of processor instructions. The computer system 1200 also includes a read only memory (ROM) 1206 or any other static storage device coupled to the bus 1210 for storing static information, including instructions, that is not changed by the computer system 1200. Some memory is composed of volatile storage that loses the information stored thereon when power is lost. Also coupled to bus 1210 is a non-volatile (persistent) storage device 1208, such as a magnetic disk, optical disk or flash card, for storing information, including instructions, that persists even when the computer system 1200 is turned off or otherwise loses power.
[0116] Information, including instructions for converting color data into one or more musical notes, is provided to the bus 1210 for use by the processor from an external input device 1212, such as a keyboard containing alphanumeric keys operated by a human user, a microphone, an Infrared (IR) remote control, a joystick, a game pad, a stylus pen, a touch screen, or a sensor. A sensor detects conditions in its vicinity and transforms those detections into physical expression compatible with the measurable phenomenon used to represent information in computer system 1200. Other external devices coupled to bus 1210, used primarily for interacting with humans, include a display device 1214, such as a cathode ray tube (CRT), a liquid crystal display (LCD), a light emitting diode (LED) display, an organic LED (OLED) display, a plasma screen, or a printer for presenting text or images, and a pointing device 1216, such as a mouse, a trackball, cursor direction keys, or a motion sensor, for controlling a position of a small cursor image presented on the display 1214 and issuing commands associated with graphical elements presented on the display 1214, and one or more camera sensors 1294 for capturing, recording and causing to store one or more still and/or moving images (e.g., videos, movies, etc.) which also may comprise audio recordings. In some embodiments, for example, in embodiments in which the computer system 1200 performs all functions automatically without human input, one or more of external input device 1212, display device 1214 and pointing device 1216 may be omitted.
[0117] In the illustrated embodiment, special purpose hardware, such as an application specific integrated circuit (ASIC) 1220, is coupled to bus 1210. The special purpose hardware is configured to perform operations not performed by processor 1202 quickly enough for special purposes. Examples of ASICs include graphics accelerator cards for generating images for display 1214, cryptographic boards for encrypting and decrypting messages sent over a network, speech recognition, and interfaces to special external devices, such as robotic arms and medical scanning equipment that repeatedly perform some complex sequence of operations that are more efficiently implemented in hardware. [0118] Computer system 1200 also includes one or more instances of a communications interface 1270 coupled to bus 1210. Communication interface 1270 provides a one-way or two-way communication coupling to a variety of external devices that operate with their own processors, such as printers, scanners and external disks. In general, the coupling is with a network link 1278 that is connected to a local network 1280 to which a variety of external devices with their own processors are connected. For example, communication interface 1270 may be a parallel port or a serial port or a universal serial bus (USB) port on a personal computer. In some embodiments, communications interface 1270 is an integrated services digital network (ISDN) card or a digital subscriber line (DSL) card or a telephone modem that provides an information communication connection to a corresponding type of telephone line. In some embodiments, a communication interface 1270 is a cable modem that converts signals on bus 1210 into signals for a communication connection over a coaxial cable or into optical signals for a communication connection over a fiber optic cable.
As another example, communications interface 1270 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, such as Ethernet. Wireless links may also be implemented. For wireless links, the communications interface 1270 sends and/or receives electrical, acoustic or electromagnetic signals, including infrared and optical signals, that carry information streams, such as digital data. For example, in wireless handheld devices, such as mobile telephones like cell phones, the communications interface 1270 includes a radio band electromagnetic transmitter and receiver called a radio transceiver. In certain embodiments, the communications interface 1270 enables connection to the communication network 105 for converting color data into one or more musical notes to the UE 101.
[0119] The term “computer-readable medium” as used herein refers to any medium that participates in providing information to processor 1202, including instructions for execution. Such a medium may take many forms, including, but not limited to, computer-readable storage media (e.g., non-volatile media, volatile media) and transmission media. Non-transitory media, such as non-volatile media, include, for example, optical or magnetic disks, such as storage device 1208. Volatile media include, for example, dynamic memory 1204. Transmission media include, for example, twisted pair cables, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves. Signals include man-made transient variations in amplitude, frequency, phase, polarization or other physical properties transmitted through the transmission media. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, CDRW, DVD, any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, an EEPROM, a flash memory, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read. The term computer-readable storage medium is used herein to refer to any computer-readable medium except transmission media.
[0120] Logic encoded in one or more tangible media includes one or both of processor instructions on a computer-readable storage medium and special purpose hardware, such as ASIC 1220.
[0121] Network link 1278 typically provides information communication using transmission media through one or more networks to other devices that use or process the information. For example, network link 1278 may provide a connection through local network 1280 to a host computer 1282 or to equipment 1284 operated by an Internet Service Provider (ISP). ISP equipment 1284 in turn provides data communication services through the public, world-wide packet-switching communication network of networks now commonly referred to as the Internet 1290.
[0122] A computer called a server host 1292 connected to the Internet hosts a process that provides a service in response to information received over the Internet. For example, server host 1292 hosts a process that provides information representing video data for presentation at display 1214. It is contemplated that the components of system 1200 can be deployed in various configurations within other computer systems, e.g., host 1282 and server 1292.
[0123] At least some embodiments of the invention are related to the use of computer system 1200 for implementing some or all of the techniques described herein. According to one embodiment of the invention, those techniques are performed by computer system 1200 in response to processor 1202 executing one or more sequences of one or more processor instructions contained in memory 1204. Such instructions, also called computer instructions, software and program code, may be read into memory 1204 from another computer-readable medium such as storage device 1208 or network link 1278. Execution of the sequences of instructions contained in memory 1204 causes processor 1202 to perform one or more of the method steps described herein. In alternative embodiments, hardware, such as ASIC 1220, may be used in place of or in combination with software to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware and software, unless otherwise explicitly stated herein.
[0124] The signals transmitted over network link 1278 and other networks through communications interface 1270, carry information to and from computer system 1200. Computer system 1200 can send and receive information, including program code, through the networks 1280 and 1290, among others, through network link 1278, and communications interface 1270. In an example using the Internet 1290, a server host 1292 transmits program code for a particular application, requested by a message sent from computer 1200, through Internet 1290, ISP equipment 1284, local network 1280 and communications interface 1270. The received code may be executed by processor 1202 as it is received, or may be stored in memory 1204 or in storage device 1208 or any other non-volatile storage for later execution, or both. In this manner, computer system 1200 may obtain application program code in the form of signals on a carrier wave.
[0125] Various forms of computer readable media may be involved in carrying one or more sequences of instructions or data or both to processor 1202 for execution. For example, instructions and data may initially be carried on a magnetic disk of a remote computer such as host 1282. The remote computer loads the instructions and data into its dynamic memory and sends the instructions and data over a telephone line using a modem. A modem local to the computer system 1200 receives the instructions and data on a telephone line and uses an infrared transmitter to convert the instructions and data to a signal on an infrared carrier wave serving as the network link 1278. An infrared detector serving as communications interface 1270 receives the instructions and data carried in the infrared signal and places information representing the instructions and data onto bus 1210. Bus 1210 carries the information to memory 1204 from which processor 1202 retrieves and executes the instructions using some of the data sent with the instructions. The instructions and data received in memory 1204 may optionally be stored on storage device 1208, either before or after execution by the processor 1202.
[0126] FIG. 13 illustrates a chip set or chip 1300 upon which an embodiment of the invention may be implemented. Chip set 1300 is programmed to convert color data into one or more musical notes as described herein and includes, for instance, the processor and memory components described with respect to FIG. 12 incorporated in one or more physical packages (e.g., chips). By way of example, a physical package includes an arrangement of one or more materials, components, and/or wires on a structural assembly (e.g., a baseboard) to provide one or more characteristics such as physical strength, conservation of size, and/or limitation of electrical interaction. It is contemplated that in certain embodiments the chip set 1300 can be implemented in a single chip. It is further contemplated that in certain embodiments the chip set or chip 1300 can be implemented as a single“system on a chip.” It is further contemplated that in certain embodiments a separate ASIC would not be used, for example, and that all relevant functions as disclosed herein would be performed by a processor or processors. Chip set or chip 1300, or a portion thereof, constitutes a means for performing one or more steps of providing user interface navigation information associated with the availability of functions. Chip set or chip 1300, or a portion thereof, constitutes a means for performing one or more steps of converting color data into one or more musical notes.
[0127] In one embodiment, the chip set or chip 1300 includes a communication mechanism such as a bus 1301 for passing information among the components of the chip set 1300. A processor 1303 has connectivity to the bus 1301 to execute instructions and process information stored in, for example, a memory 1305. The processor 1303 may include one or more processing cores with each core configured to perform independently. A multi-core processor enables multiprocessing within a single physical package. Examples of a multi-core processor include two, four, eight, or greater numbers of processing cores. Alternatively or in addition, the processor 1303 may include one or more microprocessors configured in tandem via the bus 1301 to enable independent execution of instructions, pipelining, and multithreading. The processor 1303 may also be accompanied with one or more specialized components to perform certain processing functions and tasks, such as one or more digital signal processors (DSP) 1307, or one or more application-specific integrated circuits (ASIC) 1309. A DSP 1307 typically is configured to process real-world signals (e.g., sound) in real time independently of the processor 1303. Similarly, an ASIC 1309 can be configured to perform specialized functions not easily performed by a more general purpose processor. Other specialized components to aid in performing the inventive functions described herein may include one or more field programmable gate arrays (FPGA), one or more controllers, or one or more other special-purpose computer chips.
[0128] In one embodiment, the chip set or chip 1300 includes merely one or more processors and some software and/or firmware supporting and/or relating to and/or for the one or more processors.
[0129] The processor 1303 and accompanying components have connectivity to the memory 1305 via the bus 1301. The memory 1305 includes both dynamic memory (e.g., RAM, magnetic disk, writable optical disk, etc.) and static memory (e.g., ROM, CD-ROM, etc.) for storing executable instructions that when executed perform the inventive steps described herein to convert color data into one or more musical notes. The memory 1305 also stores the data associated with or generated by the execution of the inventive steps.
[0130] FIG. 14 is a diagram of exemplary components of a mobile terminal (e.g., handset) for communications, which is capable of operating in the system of FIG. 1, according to one embodiment. In some embodiments, mobile terminal 1401, or a portion thereof, constitutes a means for performing one or more steps of converting color data into one or more musical notes. Generally, a radio receiver is often defined in terms of front-end and back-end characteristics. The front-end of the receiver encompasses all of the Radio Frequency (RF) circuitry whereas the back-end encompasses all of the base-band processing circuitry. As used in this application, the term“circuitry” refers to both: (1) hardware-only implementations (such as implementations in only analog and/or digital circuitry), and (2) to combinations of circuitry and software (and/or firmware) (such as, if applicable to the particular context, to a combination of processor(s), including digital signal processor(s), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions). This definition of “circuitry” applies to all uses of this term in this application, including in any claims. As a further example, as used in this application and if applicable to the particular context, the term “circuitry” would also cover an implementation of merely a processor (or multiple processors) and its (or their) accompanying software/or firmware. The term“circuitry” would also cover if applicable to the particular context, for example, a baseband integrated circuit or applications processor integrated circuit in a mobile phone or a similar integrated circuit in a cellular network device or other network devices.
[0131] Pertinent internal components of the telephone include a Main Control Unit (MCU) 1403, a Digital Signal Processor (DSP) 1405, and a receiver/transmitter unit including a microphone gain control unit and a speaker gain control unit. A main display unit 1407 provides a display to the user in support of various applications and mobile terminal functions that perform or support the steps of converting color data into one or more musical notes. The display 1407 includes display circuitry configured to display at least a portion of a user interface of the mobile terminal (e.g., mobile telephone). Additionally, the display 1407 and display circuitry are configured to facilitate user control of at least some functions of the mobile terminal. Audio function circuitry 1409 includes a microphone 1411 and a microphone amplifier that amplifies the speech signal output from the microphone 1411. The amplified speech signal output from the microphone 1411 is fed to a coder/decoder (CODEC) 1413.
[0132] A radio section 1415 amplifies power and converts frequency in order to communicate with a base station, which is included in a mobile communication system, via antenna 1417. The power amplifier (PA) 1419 and the transmitter/modulation circuitry are operationally responsive to the MCU 1403, with an output from the PA 1419 coupled to the duplexer 1421 or circulator or antenna switch, as known in the art. The PA 1419 also couples to a battery interface and power control unit 1420.
[0133] In use, a user of mobile terminal 1401 speaks into the microphone 1411 and his or her voice along with any detected background noise is converted into an analog voltage. The analog voltage is then converted into a digital signal through the Analog to Digital Converter (ADC) 1423. The control unit 1403 routes the digital signal into the DSP 1405 for processing therein, such as speech encoding, channel encoding, encrypting, and interleaving. In one embodiment, the processed voice signals are encoded, by units not separately shown, using a cellular transmission protocol such as enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), satellite, and the like, or any combination thereof.
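The analog-to-digital conversion step described above can be illustrated numerically. The sample rate and bit depth below are common narrowband-speech assumptions, not values taken from the patent:

```python
import numpy as np

fs = 8000                       # assumed narrowband speech sample rate (Hz)
t = np.arange(0, 0.01, 1 / fs)  # 10 ms of samples

# Stand-in for the analog microphone voltage (a 440 Hz tone at half scale).
voice = 0.5 * np.sin(2 * np.pi * 440 * t)

# 16-bit quantization, as an ADC such as ADC 1423 might perform.
bits = 16
levels = 2 ** (bits - 1)
digital = np.round(voice * (levels - 1)).astype(np.int16)
# 'digital' is the kind of bit stream the control unit would route to the
# DSP for speech encoding, channel encoding, encrypting, and interleaving.
```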
[0134] The encoded signals are then routed to an equalizer 1425 for compensation of any frequency-dependent impairments that occur during transmission through the air, such as phase and amplitude distortion. After equalizing the bit stream, the modulator 1427 combines the signal with an RF signal generated in the RF interface 1429. The modulator 1427 generates a sine wave by way of frequency or phase modulation. In order to prepare the signal for transmission, an up-converter 1431 combines the sine wave output from the modulator 1427 with another sine wave generated by a synthesizer 1433 to achieve the desired frequency of transmission. The signal is then sent through a PA 1419 to increase the signal to an appropriate power level. In practical systems, the PA 1419 acts as a variable gain amplifier whose gain is controlled by the DSP 1405 from information received from a network base station. The signal is then filtered within the duplexer 1421 and optionally sent to an antenna coupler 1435 to match impedances to provide maximum power transfer. Finally, the signal is transmitted via antenna 1417 to a local base station. An automatic gain control (AGC) can be supplied to control the gain of the final stages of the receiver. The signals may be forwarded from there to a remote telephone, which may be another cellular telephone, any other mobile phone, or a land-line connected to a Public Switched Telephone Network (PSTN) or other telephony networks.
[0135] Voice signals transmitted to the mobile terminal 1401 are received via antenna 1417 and immediately amplified by a low noise amplifier (LNA) 1437. A down-converter 1439 lowers the carrier frequency while the demodulator 1441 strips away the RF leaving only a digital bit stream. The signal then goes through the equalizer 1425 and is processed by the DSP 1405. A Digital to Analog Converter (DAC) 1443 converts the signal and the resulting output is transmitted to the user through the speaker 1445, all under control of a Main Control Unit (MCU) 1403 which can be implemented as a Central Processing Unit (CPU).
[0136] The MCU 1403 receives various signals including input signals from the keyboard 1447. The keyboard 1447 and/or the MCU 1403 in combination with other user input components (e.g., the microphone 1411) comprise user interface circuitry for managing user input. The MCU 1403 runs user interface software to facilitate user control of at least some functions of the mobile terminal 1401 to convert color data into one or more musical notes. The MCU 1403 also delivers a display command and a switch command to the display 1407 and to the speech output switching controller, respectively. Further, the MCU 1403 exchanges information with the DSP 1405 and can access an optionally incorporated SIM card 1449 and a memory 1451. In addition, the MCU 1403 executes various control functions required of the terminal. The DSP 1405 may, depending upon the implementation, perform any of a variety of conventional digital processing functions on the voice signals. Additionally, DSP 1405 determines the background noise level of the local environment from the signals detected by microphone 1411 and sets the gain of microphone 1411 to a level selected to compensate for the natural tendency of the user of the mobile terminal 1401.
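Paragraph [0136] describes the DSP estimating the background noise level and setting the microphone gain to compensate. A simplified sketch of such an adaptive gain rule follows; the target level and clamp limits are invented for illustration, since the patent discloses no specific rule:

```python
import math

def mic_gain_db(noise_rms, target_rms=0.1, min_gain=-12.0, max_gain=30.0):
    """Return a gain (dB) that nudges the input level toward target_rms,
    clamped to a plausible amplifier range. Illustrative assumption only."""
    if noise_rms <= 0:
        return max_gain                      # silence: apply maximum gain
    gain = 20 * math.log10(target_rms / noise_rms)
    return max(min_gain, min(max_gain, gain))
```

A quiet environment (low RMS) yields positive gain; a loud one is attenuated down to the -12 dB floor.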
[0137] The CODEC 1413 includes the ADC 1423 and DAC 1443. The memory 1451 stores various data including call incoming tone data and is capable of storing other data including music data received via, e.g., the global Internet. The software module could reside in RAM, flash memory, registers, or any other form of writable storage medium known in the art. The memory device 1451 may be, but is not limited to, a single memory, CD, DVD, ROM, RAM, EEPROM, optical storage, magnetic disk storage, flash memory storage, or any other non-volatile storage medium capable of storing digital data.
[0138] An optionally incorporated SIM card 1449 carries, for instance, important information, such as the cellular phone number, the carrier supplying service, subscription details, and security information. The SIM card 1449 serves primarily to identify the mobile terminal 1401 on a radio network. The card 1449 also contains a memory for storing a personal telephone number registry, text messages, and user specific mobile terminal settings.
[0139] Further, one or more camera sensors 1453 may be incorporated onto the mobile station 1401, wherein the one or more camera sensors may be placed at one or more locations on the mobile station. Generally, the camera sensors may be utilized to capture, record, and store one or more still and/or moving images (e.g., videos, movies, etc.), which also may comprise audio recordings.
[0140] While the invention has been described in connection with a number of embodiments and implementations, the invention is not so limited but covers various obvious modifications and equivalent arrangements, which fall within the purview of the appended claims. Although features of the invention are expressed in certain combinations among the claims, it is contemplated that these features can be arranged in any combination and order.

Claims

WHAT IS CLAIMED IS:
1. A computer-implemented method comprising:
presenting a user interface comprising a first user interface element and a second user
interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition;
assigning one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note;
receiving, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys; and
rendering a color of an active slot of the composition area in the second user interface
element based on the one or more respective colors assigned to the played key and the respective musical note to represent a composed musical note of the musical composition.
2. The method of claim 1, further comprising:
receiving another user input for extending or reducing a length of the active slot, wherein the length of the active slot represents a duration of the corresponding musical note for the active slot.
3. The method according to any of claims 1 and 2, further comprising:
after receiving the user input that indicates the interaction, designating a next slot in the second user interface element as a next active slot for creating the musical composition.
4. The method according to any of claims 1-3, wherein the representation of the musical keyboard includes a full-length keyboard representation that highlights an active octave of a zoomed keyboard representation, and wherein the user input that indicates the interaction with the played key is received via the zoomed keyboard representation.
5. The method of claim 4, further comprising:
rendering the one or more respective colors of the plurality of keys of the representation with different saturation values to indicate different octaves of the respective musical notes.
6. The method according to any of claims 1-5, wherein the user input includes a touch-based input, a voice-based input, or a combination thereof.
7. The method according to any of claims 1-6, further comprising:
replacing the one or more respective colors based, at least in part, on user preference; and assigning the replaced colors to the plurality of keys.
8. The method according to any of claims 1-7, further comprising:
modifying the plurality of slots in the composition area,
wherein the modification includes altering a total number of the plurality of slots in the composition area.
9. The method according to any of claims 1-8, further comprising:
uploading audio data;
determining one or more attributes of the audio data, wherein the one or more attributes include one or more musical notes;
mapping the one or more musical notes to one or more respective colors; and
generating a visual representation of the audio data in the composition area.
10. The method of claim 9, wherein the audio data is uploaded from one or more databases, a recording by a microphone, or a combination thereof.
11. The method according to any of claims 1-10, wherein the user interface is a virtual user interface in a computer application.
12. The method of claim 11, wherein the computer application is executable on a mobile device.
13. An apparatus comprising:
at least one processor; and
at least one memory including computer program code for one or more programs, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
present a user interface comprising a first user interface element and a second user
interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition;
assign one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note;
receive, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys; and
render a color of an active slot of the composition area in the second user interface
element based on the one or more respective colors assigned to the played key and the respective musical note to represent a composed musical note of the musical composition.
14. The apparatus of claim 13, wherein the apparatus is further caused to:
receive another user input for extending or reducing a length of the active slot, wherein the length of the active slot represents a duration of the corresponding musical note for the active slot.
15. The apparatus according to any of claims 13 and 14, wherein the apparatus is further caused to:
after receiving the user input that indicates the interaction, designate a next slot in the second user interface element as a next active slot for creating the musical composition.
16. The apparatus according to any of claims 13-15, wherein the representation of the musical keyboard includes a full-length keyboard representation that highlights an active octave of a zoomed keyboard representation, and wherein the user input that indicates the interaction with the played key is received via the zoomed keyboard representation.
17. The apparatus of claim 16, wherein the apparatus is further caused to:
render the one or more respective colors of the plurality of keys of the representation with different saturation values to indicate different octaves of the respective musical notes.
18. The apparatus according to any of claims 13-17, wherein the user input includes a touch- based input, a voice-based input, or a combination thereof.
19. The apparatus according to any of claims 13-18, wherein the apparatus is further caused to:
replace the one or more respective colors based, at least in part, on user preference; and assign the replaced colors to the plurality of keys.
20. The apparatus according to any of claims 13-19, wherein the apparatus is further caused to:
modify the plurality of slots in the composition area,
wherein the modification includes altering a total number of the plurality of slots in the composition area.
21. The apparatus according to any of claims 13-20, wherein the apparatus is further caused to:
upload audio data;
determine one or more attributes of the audio data, wherein the one or more attributes include one or more musical notes;
map the one or more musical notes to one or more respective colors; and
generate a visual representation of the audio data in the composition area.
22. The apparatus of claim 21, wherein the audio data is uploaded from one or more databases, a recording by a microphone, or a combination thereof.
23. The apparatus according to any of claims 13-22, wherein the user interface is a virtual user interface in a computer application.
24. The apparatus of claim 23, wherein the computer application is executable on a mobile device.
25. A computer-readable storage medium carrying one or more sequences of one or more instructions which, when executed by one or more processors, cause an apparatus to at least perform the following steps:
presenting a user interface comprising a first user interface element and a second user
interface element, wherein the first user interface element presents a representation of a musical keyboard for creating a musical composition, and wherein the second user interface element presents a composition area comprising a plurality of slots for creating a visual representation of the musical composition; assigning one or more respective colors to a plurality of keys of the representation of the musical keyboard, wherein each of the plurality of keys corresponds to a respective musical note;
receiving, via the first user interface element, a user input that indicates interaction with a played key of the plurality of keys; and
rendering a color of an active slot of the composition area in the second user interface
element based on the one or more respective colors assigned to the played key and the respective musical note to represent a composed musical note of the musical
composition.
26. A computer-readable storage medium of claim 25, wherein the apparatus is further caused to perform:
receiving another user input for extending or reducing a length of the active slot, wherein the length of the active slot represents a duration of the corresponding musical note for the active slot.
27. A computer-readable storage medium according to any of claims 25 and 26, wherein the apparatus is further caused to perform:
after receiving the user input that indicates the interaction, designating a next slot in the second user interface element as a next active slot for creating the musical composition.
28. A computer-readable storage medium according to any of claims 25-27, wherein the representation of the musical keyboard includes a full-length keyboard representation that highlights an active octave of a zoomed keyboard representation, and wherein the user input that indicates the interaction with the played key is received via the zoomed keyboard representation.
29. A computer-readable storage medium according to any of claims 25-28, wherein the apparatus is further caused to perform: rendering the one or more respective colors of the plurality of keys of the representation with different saturation values to indicate different octaves of the respective musical notes.
30. A computer-readable storage medium according to any of claims 25-29, wherein the user input includes a touch-based input, a voice-based input, or a combination thereof.
31. A computer-readable storage medium according to any of claims 25-30, wherein the apparatus is further caused to perform:
replacing the one or more respective colors based, at least in part, on user preference; and assigning the replaced colors to the plurality of keys.
32. A computer-readable storage medium according to any of claims 25-31, wherein the apparatus is further caused to perform:
modifying the plurality of slots in the composition area, wherein the modification includes altering a total number of the plurality of slots in the composition area.
33. A computer-readable storage medium according to any of claims 25-32, wherein the apparatus is further caused to perform:
uploading audio data;
determining one or more attributes of the audio data, wherein the one or more attributes include one or more musical notes;
mapping the one or more musical notes to one or more respective colors; and
generating a visual representation of the audio data in the composition area.
34. A computer-readable storage medium of claim 33, wherein the audio data is uploaded from one or more databases, a recording by a microphone, or a combination thereof.
35. A computer-readable storage medium according to any of claims 25-34, wherein the user interface is a virtual user interface in a computer application.
36. A computer-readable storage medium of claim 35, wherein the computer application is executable on a mobile device.
37. A computer program product including one or more sequences of one or more instructions which, when executed by one or more processors, cause an apparatus to at least perform the steps of the method of any of claims 1-12.
38. A method comprising facilitating access to at least one interface configured to allow access to at least one service, the at least one service configured to perform the method of any of claims 1-12.
39. A method comprising facilitating a processing of and/or processing (1) data and/or (2) information and/or (3) at least one signal, the (1) data and/or (2) information and/or (3) at least one signal based, at least in part, on the method of any of claims 1-12.
40. A method comprising facilitating creating and/or facilitating modifying (1) at least one device user interface element and/or (2) at least one device user interface functionality, the (1) at least one device user interface element and/or (2) at least one device user interface functionality based, at least in part, on the method of any of claims 1-12.
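The claimed user interface (claims 1-3 and 7) amounts to a small state machine: pressing a colored key paints the active composition slot with that key's color and then advances the active slot. The sketch below is an interpretive illustration of that data model; the class names, field names, palette, and 16-slot default are all invented assumptions, not disclosed by the patent:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative note-to-color palette; claim 7 allows the user to replace it.
PALETTE = {"C": "red", "D": "orange", "E": "yellow", "F": "green",
           "G": "blue", "A": "indigo", "B": "violet"}

@dataclass
class Slot:
    color: Optional[str] = None
    length: int = 1            # claim 2: slot length encodes note duration

@dataclass
class CompositionArea:
    slots: List[Slot] = field(default_factory=lambda: [Slot() for _ in range(16)])
    active: int = 0

    def play_key(self, note: str) -> None:
        """Claim 1: render the active slot in the played key's color,
        then (claim 3) designate the next slot as the next active slot."""
        self.slots[self.active].color = PALETTE[note]
        self.active += 1

area = CompositionArea()
for note in ["C", "E", "G"]:        # compose a C-major triad, one note per slot
    area.play_key(note)
# Slots 0-2 are now red, yellow, and blue; slot 3 is the next active slot.
```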
PCT/US2019/024370 2018-03-27 2019-03-27 Method and apparatus for providing an application user interface for generating color-encoded music WO2019191291A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862648727P 2018-03-27 2018-03-27
US62/648,727 2018-03-27

Publications (1)

Publication Number Publication Date
WO2019191291A1 true WO2019191291A1 (en) 2019-10-03

Family

ID=68060768

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/024370 WO2019191291A1 (en) 2018-03-27 2019-03-27 Method and apparatus for providing an application user interface for generating color-encoded music

Country Status (1)

Country Link
WO (1) WO2019191291A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190088237A1 (en) * 2017-09-10 2019-03-21 Rocco Anthony DePietro, III System and Method of Generating Signals from Images

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080289477A1 (en) * 2007-01-30 2008-11-27 Allegro Multimedia, Inc Music composition system and method
US7750224B1 (en) * 2007-08-09 2010-07-06 Neocraft Ltd. Musical composition user interface representation
US20120067195A1 (en) * 2010-09-22 2012-03-22 Skaggs Merrie L Educational method and apparatus to simultaneously teach reading and composing music
US20140260898A1 (en) * 2013-03-14 2014-09-18 Joshua Ryan Bales Musical Note Learning System
US20150107441A1 (en) * 2013-10-22 2015-04-23 National Chiao Tung University Color-based music output system and method thereof




Legal Events

Code	Description
121	Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19776555; Country of ref document: EP; Kind code of ref document: A1)
NENP	Non-entry into the national phase (Ref country code: DE)
122	Ep: pct application non-entry in european phase (Ref document number: 19776555; Country of ref document: EP; Kind code of ref document: A1)