WO2020154982A1 - System and method for composing music with physical cards - Google Patents

System and method for composing music with physical cards

Info

Publication number
WO2020154982A1
Authority
WO
WIPO (PCT)
Prior art keywords
music
interactive surface
physical
digital file
user
Application number
PCT/CN2019/074004
Other languages
French (fr)
Inventor
Zheng Shi
Peiyuan Yang
Original Assignee
Zheng Shi
Application filed by Zheng Shi
Priority to PCT/CN2019/074004
Publication of WO2020154982A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 15/00 Teaching music


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

A system and method for composing music with physical cards. An end-user creates an original music fragment by playing a keyboard on an interactive surface (101); the music fragment is then associated with a music card (103), and further incorporated into a coding pattern (520) with multiple coding cards (301). The coding cards (301) are marked with coding symbols (410, 420, 430, 440), and the interactive surface (101) generates a digital file that represents the fragment the end-user has just created with the music card (103) and the coding cards (301), based on the logic represented by the coding symbols (410, 420, 430, 440).

Description

SYSTEM AND METHOD FOR COMPOSING MUSIC WITH PHYSICAL CARDS
TECHNICAL FIELD
The present invention and its embodiments relate to the field of music application on electronic devices, and in particular, an interactive surface using physical cards to compose and play music.
BACKGROUND
Music is one of the most magnificent creations of the human race. Music is expressed in complex forms, syntaxes, colors and articulations that are rooted both in human biology and in elegant mathematics.
Learning, composing, accessing and playing music are among the most fundamental human activities. Beyond being an art form practiced by professional musicians and enjoyed by almost everyone else in the world, music has been shown by numerous studies to be greatly beneficial to the cognitive development of children.
Because of the advancement of technology over the last century, there is now little barrier to accessing and enjoying music. Music is performed in concert halls and recording studios; performances are recorded, stored and disseminated via a great variety of formats and channels. Consequently, almost any music is available at any time, in any place, to anyone with reasonable access.
Nonetheless, despite such technological advancement over the last century, there is still ample room for improvement in assisting the learning and composing of music by both professionals and novices, by the very senior and the very young, including children as young as 2 to 3 years of age who have already developed an interest in music.
Three major barriers to learning and composing music have been observed, as outlined in our previous inventions:
1. Skill required. The most commonly practiced format of learning music is the learning of a particular musical instrument. For example, parents often engage piano teachers to teach their children piano, while musicians play the piano to compose music. Needless to say, it takes years of vigorous practice to become good at playing the piano at the amateur level, a process quite often forced upon children by their parents, which costs time, money and possibly the children’s very interest in music. It takes tremendous practice and sacrifice to become a pianist.
2. Tools available. Being good at one musical instrument does not readily give one the ability to compose a music piece, with the full range of tunes and rhythms. For example, being good at piano does not readily enable someone to compose music with the drum, the trumpet, or the violin. The popular software program GarageBand by Apple Inc. allows the user to create music with elements of percussion, wind and string instruments. However, the functionality in GarageBand relating to the construction of original music is still highly complex.
3. Music syntax. The syntax of music is highly complex. Terms such as chord, diatonic chord, accidental, major and minor and their many types, variations, equivalencies and inversions are simply beyond the grasp of most children and of the vast majority of people who are not professional musicians and have not learned the intricacies of music theory. While anyone can “create music” by singing into a microphone or hitting a few keys on an electronic piano, without the precise language of music syntax such a rudimentary recording cannot be precisely described, nor can it be dissected, analyzed, or further improved upon.
We therefore see the need to create a system that greatly reduces the skills required for learning and creating music, that makes the entire range of musical instruments easily available, and that allows sophisticated structure and complex syntax to be created based more on one’s appreciation and imagination in music and less on one’s mastery of the music syntax or skills with respect to a particular musical instrument.
Many, including ourselves, have tried to take on the challenge of making music creation more accessible in the past. ReacTable is an electronic musical instrument that has been covered extensively in the public domain. With ReacTable, through the manipulation of physical objects, music is changed and played in real-time, allowing a person to experiment with different types of music sound. However, we see ReacTable as having several limitations, as follows, in terms of empowering a person to compose music:
● Purpose. ReacTable is for the purpose of real-time modification, suitable for live music performance. However, it is not well-suited for the purpose of composing music.
● Input. End-user experience with ReacTable does not start with creating an original music segment by the end-user. Instead, it starts with an existing music piece by the vendor.
● Processing. ReacTable transforms the music based on the nature of the blocks and the geometry of the blocks, such as distances and angles. The modified music is part of the real-time performance. When playing ReacTable, end-users push the blocks around to test new sound effects, with the underlying logic of music transformation not apparent or even well defined from the perspective of the end-user.
● Output. ReacTable does not yet provide means for a visual notation of the modified music.
In the case of our own invention US9183755B2, a system and method that allow an end-user to compose music are disclosed. However, one limitation of US9183755B2 is that it does not teach composing music based on what an end-user plays, on a keyboard for example. Nor does it teach associating such play with a definitive machine-readable physical card with a unique ID, thereby creating the possibility that such play is captured with a physical card and is later retrieved for further music composition purposes. Rather, US9183755B2 teaches composing music based on cards that have already been associated with definitive music elements. Another limitation of US9183755B2 is that it does not teach composing music by incorporating the rich expressive qualities of what an end-user plays in real-time; the real-time play or performance of the end-user was simply outside its scope. Yet another limitation of US9183755B2 is that it does not teach creating modifications to what an end-user plays in real-time, after such real-time play has been incorporated into a music composition in the MIDI format.
Our own invention US9299330B2 teaches the capture of the rich expressive qualities of a touch action of an end-user, and the transformation of such expressive qualities of the touch action into expressive qualities of music. However, composing music with physical cards or other types of physical manipulatives is beyond the scope of US9299330B2.
The combination of ReacTable, US9183755B2 and US9299330B2 does not resolve, or does not resolve well, the following challenges:
● capture real-time play by an end-user, associate what has been played with a physical card, and use what is captured and represented by the physical card as a source material for composing music;
● capture the rich expressive qualities of the touch actions of an end-user, and use what is captured as a source material for composing music;
● create modifications to what is played by an end-user, turning it into a new music piece.
Given all that has been invented and/or developed, it is believed that the present invention can resolve these and other related challenges.
SUMMARY OF INVENTION
The present invention resolves the challenges highlighted in the prior art by providing a system and method to compose music using physical cards. In accordance with one embodiment of the present invention, the system includes a physical music card, a physical sheet and an interactive surface. The physical music card has an identifier, and the physical sheet has an identifier and is visually marked with icons of music elements. The interactive surface recognizes the identifiers of the physical music card and the physical sheet as well as finger touches on the icon of a music element. Upon an end-user touching the icons of one or more music elements on the physical sheet placed on the interactive surface, the interactive surface generates a first digital file representing a user music fragment based on the touch action, and associates the first digital file with the identifier of the physical music card. And upon the physical music card being placed on the interactive surface for the second time, the interactive surface further retrieves the first digital file associated with the identifier of the physical music card. The first digital file is then further processed, transformed, or played, based on instructions from the end-user, for example, through the manipulation of the physical music card.
In accordance with another embodiment of the present invention, the interactive surface generates a digital profile of the touch action, and incorporates the digital profile into the first digital file.
In accordance with another embodiment of the present invention, the system further includes a physical coding card that has an identifier and is visually marked with a coding symbol. Upon the physical music card and the physical coding card being placed together on the interactive surface, the interactive surface transforms the first digital file that represents the user music fragment into a second digital file that represents an altered user music fragment, with the alteration being based on the logic represented by the coding symbol.
In accordance with another embodiment of the present invention, the system further includes multiple physical coding cards. Upon the physical music card being placed into a coding pattern comprising these physical coding cards, the interactive surface generates a third digital file that represents an altered user music fragment, with the alteration being based on the logic represented by the coding pattern.
In accordance with another embodiment of the present invention, the system further includes multiple physical music cards. Upon the multiple physical music cards being placed together on the interactive surface, the interactive surface generates multiple digital files, each associated with the identifier of one of the physical music cards, and combines the multiple digital files into a master digital file representing a single track music piece. In accordance with another embodiment of the present invention, the interactive surface is further instructed to organize the multiple digital files into yet another master digital file representing a music piece of multiple tracks.
In accordance with another embodiment of the present invention, the identifier is defined by a unique identification code (UID) , and the UID is encoded with a radio frequency identification chip, a pattern of capacitive tabs, or a pattern of magnetic tabs. The interactive surface recognizes the UID through a reader or a sensor array that is operatively linked to the interactive surface.
In accordance with another embodiment of the present invention, the identifier is defined by a unique optical pattern, and the interactive surface further includes a camera that is operatively linked to the interactive surface and recognizes the unique optical pattern.
In accordance with another embodiment of the present invention, the interactive surface is configured to precisely record the durations of each of the touch actions, and is further configured to modify the duration of a music note from the touch action of an end-user, when or after the first digital file is generated. The interactive surface is further configured to generate an accurate notation of the music note after the durations of music notes within the user music fragment have been modified.
The present invention provides a method for composing music with physical cards, including the following steps:
touching, by an end-user, upon one or more icons of music elements visually marked on a physical sheet placed on an interactive surface, wherein the interactive surface is configured to recognize the identifier of the physical sheet and finger touches upon the icons of the music elements;
generating, by the interactive surface, a first digital file representing a user music fragment based on such touch actions;
associating, by the interactive surface, the first digital file with the identifier of a physical music card placed on the interactive surface, wherein the interactive surface is configured to recognize the identifier of the physical music card;
retrieving, by the interactive surface, the first digital file associated with the identifier of the physical music card, once the physical music card is placed on the interactive surface for the second time.
BRIEF DESCRIPTION OF THE DRAWINGS
To better illustrate the technical features of the embodiments of the present invention, various embodiments of the present invention will be briefly described in conjunction with the accompanying drawings. It should be obvious that the drawings are only for exemplary embodiments of the present invention, and that a person of ordinary skill in the art may derive additional drawings without deviating from the principles of the present invention.
FIG. 1 is an exemplary schematic diagram illustrating a system for composing music with physical cards, using an interactive surface, a physical sheet and multiple physical cards in accordance with one embodiment of the present invention.
FIG. 2 is an exemplary schematic diagram illustrating designs of music elements in accordance with one embodiment of the present invention.
FIG. 3 is an exemplary schematic diagram illustrating the music card being placed together with one coding card in accordance with one embodiment of the present invention.
FIG. 4 is an exemplary schematic diagram illustrating designs of coding symbols in accordance with one embodiment of the present invention.
FIG. 5 is an exemplary schematic diagram illustrating the music card being placed into a coding pattern formed by multiple coding cards in accordance with one embodiment of the present invention.
FIG. 6 is an exemplary schematic diagram illustrating multiple music cards being used to generate a master digital file representing a single track music piece in accordance with one embodiment of the present invention.
FIG. 7 is an exemplary schematic diagram illustrating multiple music cards being used to generate a master digital file representing a music piece of multiple tracks in accordance with one embodiment of the present invention.
FIG. 8 is an exemplary schematic diagram illustrating multiple music elements being used to modify the duration of a music note in accordance with one embodiment of the present invention.
FIG. 9 is an exemplary schematic diagram illustrating the process flow in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Reference will now be made in detail to various embodiments of the invention illustrated in the accompanying drawings. While the invention will be described in conjunction with the embodiments, it will be understood that this is not intended to limit the scope of the invention to these specific embodiments. The invention is intended to cover all alternatives, modifications and equivalents within the spirit and scope of the invention, which is defined by the appended claims.
Furthermore, in the detailed description of the present invention, specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be obvious to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits are not described in detail to avoid unnecessarily obscuring a clear understanding of the present invention.
Fig. 1 is an exemplary schematic diagram illustrating a system for composing music with physical cards, using an interactive surface, a physical sheet and multiple physical cards in accordance with embodiments of the present invention.
In accordance with one embodiment of the present invention, as shown in Fig. 1, the system includes an interactive surface 101, a physical sheet 102, and a music card 103. The music card 103 has an identifier 110, and the physical sheet 102 has an identifier 120 and is visually marked with icons 130 of music elements.
In accordance with another embodiment of the present invention, the interactive surface 101 is operatively linked to a processor, though the processor is not shown in Fig. 1. The identifiers 110 and 120 are defined by a unique identification code (UID) that is embedded in the physical sheet 102 or the music card 103. The interactive surface 101 recognizes the UID through a reader or a sensor array that is operatively linked to the interactive surface 101. The UID is encoded with a radio frequency identification chip, a unique pattern of capacitive tabs, or a unique pattern of magnetic tabs. In these cases, the interactive surface 101 recognizes the UIDs of the physical sheet 102 or the music card 103 via a radio frequency antenna, an array of capacitive sensor switches, or an array of magnetic sensor switches respectively, and transmits that information to the processor.
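As a further illustration only, the decoding of such a tab pattern into a UID could be sketched in Python as follows; the bit-pattern representation of the sensor readings and the function name decode_uid are illustrative assumptions rather than part of the disclosed hardware:

# Illustrative sketch: a pattern of capacitive or magnetic tabs, read by the
# sensor array as a row of on/off switches, is decoded into a UID.
def decode_uid(tab_pattern):
    """tab_pattern: list of 0/1 readings, one per sensor switch."""
    uid = 0
    for bit in tab_pattern:
        uid = (uid << 1) | bit
    return uid

# Example: an 8-tab card read as 0, 0, 1, 0, 1, 1, 0, 1 resolves to UID 45.
assert decode_uid([0, 0, 1, 0, 1, 1, 0, 1]) == 45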
In accordance with another embodiment of the present invention, the identifiers 110 and 120 are defined by a unique optical pattern of the object. The unique optical pattern is a marking of music notation, a marking of linguistic or mathematical notation, a marking of icons or graphics, or a unique pattern of 2D or 3D shape, color and texture visible or invisible to the human eye. Such identifiers are either pre-fabricated on physical objects, or marked on the objects by a user, with such marking being either permanent or erasable. The interactive surface 101 further includes a camera that is operatively linked to the interactive surface 101 and recognizes the unique optical pattern through the camera. For example, the interactive surface 101 can be a tabletop on which the camera recognizes what is happening, which makes the tabletop an “interactive surface”.
In accordance with another embodiment of the present invention, the interactive surface 101 recognizes finger touches on the icon 130 of a music element. Upon an end-user touching the icons 130 of one or more music elements, e.g., a few icons of the white keys of the keyboard and one of the icons of the various musical instruments, on the physical sheet 102 placed on the interactive surface 101, the interactive surface 101 generates a first digital file representing such a user-generated music fragment based on the touch action, and associates the first digital file with the identifier 110 of the music card 103. The first digital file could be an audio file in any format, e.g., a MIDI file, and typically serves as either a melody or a chord. The first digital file could also be a single file or a set of files. Thus, the present invention provides a means for capturing an intangible music fragment played by an end-user onto a tangible physical card.
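For illustration only, the generation of the first digital file and its association with the identifier 110 could be sketched in Python as follows; the dictionary-based note structure, the card_library registry and the helper names are illustrative assumptions rather than the required implementation:

# Illustrative sketch: build a note list (the "first digital file") from the
# recognized touch actions, then associate it with the UID of the music card.
card_library = {}    # identifier 110 (UID) -> first digital file (note list)

def capture_fragment(touch_events, icon_to_pitch):
    """touch_events: dicts with an 'icon' key and 'down'/'up' timestamps in seconds."""
    return [{"pitch": icon_to_pitch[e["icon"]],
             "start": e["down"],
             "end": e["up"]}
            for e in touch_events]

def associate_with_card(card_uid, fragment):
    card_library[card_uid] = fragment

def retrieve_from_card(card_uid):
    # Invoked when the music card is placed on the interactive surface again.
    return card_library[card_uid]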
One of the advantages of the present invention is that the music card 103, having been associated with an end-user generated music fragment, can be further utilized by the end-user in composing music, or stored away for future use.
In accordance with another embodiment of the present invention, upon the music card 103 being placed on the interactive surface 101 for the second time, the interactive surface 101 further retrieves the first digital file associated with the identifier 110 of the music card 103. Through the manipulation of the physical music card 103, the user music fragment represented by the first digital file is further played by an output device operatively linked to the interactive surface, further modified, and/or incorporated into a larger music composition through the use of physical coding cards, according to other embodiments of the present invention. The first digital file and/or other digital files are sent to a remote server by the interactive surface 101, at the choice and instruction of the end-user. The retrieved first digital file may be played, paused or stopped by touching the UI elements 105 marked on the physical sheet 102.
In accordance with another embodiment of the present invention, upon capturing a touch action by the end-user, the interactive surface 101 generates a digital profile of the touch action, transforms it into MIDI parameters, and incorporates the digital profile into a digital file such as a MIDI file. Parameters of the digital profile may include the velocity, acceleration, deceleration, force and duration of the touch action. Such a digital profile is generated by capacitive sensing, or by a camera that analyzes in real-time the motion of the fingers of the end-user.
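For illustration only, the mapping of such a digital profile into MIDI-style parameters could be sketched in Python as follows; the sketch covers only force and duration, and the linear force-to-velocity scaling is an arbitrary assumption:

def touch_profile_to_midi(force, duration, max_force=1.0):
    """Map sensed touch force to a MIDI velocity (1-127), keeping the sensed duration."""
    velocity = max(1, min(127, round(127 * force / max_force)))
    return {"velocity": velocity, "duration": duration}

# Example: a firm, short tap.
print(touch_profile_to_midi(force=0.8, duration=0.25))
# {'velocity': 102, 'duration': 0.25}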
In accordance with another embodiment of the present invention, the interactive surface 101 further instructs a sensory accessory, operatively linked to a processor that is operatively linked to the interactive surface, to play the digital file that represents a user music fragment, in an original or modified form, to provide interactive feedback to the end-user. In one embodiment of the present invention, the sensory accessory is an audio device. In another embodiment of the present invention, the sensory accessory is a visual device that shows the notation of a music fragment or composition generated by the interactive surface 101, with such notation written with typical western music symbols such as ABCDEFG, staffs, clefs, notes, chords, rests, breaks, accidentals, and time signatures, or with typical Solfège music symbols such as Do, Re, Mi, Fa, Sol, La, Ti (or Si), or with simplified numerical music symbols such as 1, 2, 3, 4, 5, 6, 7.
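For illustration only, the alternative notations mentioned above can be derived from a scale degree as in the following Python sketch; the lookup tables cover only the seven degrees of a major scale, with the letter names assuming C major:

LETTER = ["C", "D", "E", "F", "G", "A", "B"]
SOLFEGE = ["Do", "Re", "Mi", "Fa", "Sol", "La", "Ti"]
NUMERIC = ["1", "2", "3", "4", "5", "6", "7"]

def notate(scale_degree, style="letter"):
    """scale_degree: 1-7 within the major scale."""
    table = {"letter": LETTER, "solfege": SOLFEGE, "numeric": NUMERIC}[style]
    return table[scale_degree - 1]

print(notate(5, "solfege"))   # Sol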
It should be noted that in the above embodiment of the present invention, a physical sheet is used as a user interface, for example, to serve the purpose of allowing an end-user to generate a music fragment. Our preference has been to create one embodiment of the present invention that does not involve a display screen. Nonetheless, a display screen is equally if not better suited to serve as a user interface for the present invention, from an ease of use perspective. Alternatively, an electronic keyboard that is operatively linked to the interactive surface may also provide a means for an end-user to generate a user music fragment, to be associated with a physical music card. Those who are skilled in the art will recognize that these and other types of variations are commonplace and are all within the scope of the present invention.
Fig. 2 provides examples of music elements. As shown in Fig. 2, a music element may be: white keys of a typical keyboard instrument (210), sometimes with the name of a music note marked on each white key; a chord (220), which may be a major chord, a minor chord, a triad chord or a seventh chord, etc.; a musical instrument (230), including percussion, wind and string instruments, as well as a symbol that represents human voices and sounds from nature; or an indicator of the duration of a music note (240), e.g. 1/4, 1/2, etc.
Besides the examples shown in Fig. 2, the music element may also be one of the following: a rest and the duration of the rest; accidentals that alter the pitch of a music note; dots or ties that modify the duration of a music note; key signatures that define whether the music piece is in a major or a minor scale; symbols that raise or lower the pitch of a music note within the same pitch class; and symbols that define the texture, dynamics and articulation of the music piece, including those that are commonly used and those custom designed to enhance the music for a particular audience, sentiment and purpose.
FIG. 3 is an exemplary schematic diagram illustrating the music card being placed together with one coding card in accordance with one embodiment of the present invention. As seen in Fig. 3, the coding card 301 has an identifier 302 and is visually marked with a coding symbol. Upon the music card 103 and the coding card 301 being placed together on the physical sheet 102 placed on the interactive surface 101, the interactive surface 101 retrieves the digital file that represents the original user music fragment and transforms the digital file into a new digital file that represents an altered user music fragment, with the alteration being based on the logic represented by the coding symbol. In Fig. 3, specifically, the new digital file corresponds to the original digital file played forward at double speed.
It should also be noted that a physical card described in the present invention, including both the music card 103 and the coding card 301, may also be a button, a block, a figurine, or another 2D or 3D structure that is amenable to being placed next to or on top of other such objects to form clearly recognizable 2D or 3D structural patterns, both to the human eye and to the sensors and antennas of the interactive surface.
The exemplary designs of a few coding symbols in various categories are disclosed in Fig. 4, in accordance with embodiments of the present invention.
Referring to Fig. 4, the coding symbols 410 represent the logic of transformation of a digital file, with respect to the speed and direction of the music notes being played. Sequentially, the coding symbols in 410 represent 1) playing backward at double speed; 2) playing forward at double speed; 3) playing backward at single speed; 4) playing forward at single speed; 5) playing backward at quadruple speed; 6) playing forward at quadruple speed; and 7) rest for a defined duration.
The following is an exemplary illustration of the transformation of an original digital file involving N music notes:
DigitalFile
{
Instrument
NoteList
[
1: Note (1)
{
Pitch: pitch_1
StartTime: starttime_1
EndTime: endtime_1
}
2: Note (2)
{
Pitch: pitch_2
StartTime: starttime_2
EndTime: endtime_2
}
.....
N: Note (n)
{
Pitch: pitch_n
StartTime: starttime_n
EndTime: endtime_n
}
]
}
being transformed by a coding card marked with the coding symbol 410 "playing backward at double speed" to generate a new digital file also containing N music notes:
Backward2XFile
{
Instrument
NoteList
[
1: Note (n)
{
Pitch: pitch_n
StartTime: (totalDuration - endtime_n) / 2
EndTime: (totalDuration - starttime_n) / 2
}
2: Note (n-1)
{
Pitch: pitch_n-1
StartTime: (totalDuration - endtime_n-1) / 2
EndTime: (totalDuration - starttime_n-1) / 2
}
.....
N: Note (1)
{
Pitch: pitch_1
StartTime: (totalDuration - endtime_1) / 2
EndTime: (totalDuration - starttime_1) / 2
}
]
}
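For illustration only, the Backward2XFile transformation above could be implemented as the following Python sketch, which mirrors the StartTime and EndTime arithmetic shown; the dictionary note structure and the function name are illustrative assumptions:

def backward_double_speed(notes, total_duration):
    """Reverse the note order and halve the time scale, as in Backward2XFile."""
    return [{"pitch": n["pitch"],
             "start": (total_duration - n["end"]) / 2,
             "end": (total_duration - n["start"]) / 2}
            for n in reversed(notes)]

# Example: a two-note, two-second fragment.
notes = [{"pitch": 60, "start": 0.0, "end": 1.0},
         {"pitch": 62, "start": 1.0, "end": 2.0}]
print(backward_double_speed(notes, total_duration=2.0))
# [{'pitch': 62, 'start': 0.0, 'end': 0.5}, {'pitch': 60, 'start': 0.5, 'end': 1.0}]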
Referring to Fig. 4, the coding symbols 420 represent coding logic related to loop functions, including the start of loop, the end of loop and the number of loops, etc.
For the original digital file mentioned above, i.e., DigitalFile, the following is exemplary code for its transformation into a new digital file by the coding symbol 410 "playing backward at double speed" (within the loop), in combination with three coding cards 420 related to loop functions (i.e., start of loop, end of loop and 4 loops):
main() {
    loop(4) {
        transform(DigitalFile, 2, backward);
    }
}
Referring to Fig. 4, the coding symbols 430 are all related to subroutine functions, which enable one music card to be assigned to a coding pattern.
For the original digital file mentioned above, i.e., DigitalFile, the following is exemplary code for its transformation into a new digital file by the coding symbol 410 "playing backward at double speed" (the subroutine being called four times), in combination with two coding cards 430 related to subroutine functions (i.e., the function card and the equal sign card):
main() {
    function();
    function();
    function();
    function();
}
function() {
    transform(DigitalFile, 2, backward);
}
Referring to the second line of Fig. 4, the coding symbols 440 are all related to conditional statements in programming, including IF, ELSEIF, ELSE and ENDIF, to represent pre-defined conditions within a digital file. An altered user music fragment progresses differently depending on whether a pre-defined condition has been met.
Fig. 5 is an exemplary schematic diagram illustrating the music card being placed into a coding pattern formed by multiple coding cards in accordance with one embodiment of the present invention.
As seen in Fig. 5, multiple coding cards 510 are placed together on the interactive surface to form a coding pattern 520. Upon the music card 103 being placed into the coding pattern 520, the interactive surface 101 retrieves the original digital file associated with the music card 103, and generates a new digital file representing an altered user music fragment, with the alteration being based on the logic represented by the coding pattern 520. In accordance with one embodiment of the present invention, the coding pattern represents a loop that is to be played four times. Within the loop, the digital file associated with the music card 103 is transformed to play backward at double speed, and then rests for a time period.
Referring to Fig. 5, for the digital file associated with the music card 103, i.e., DigitalFile, the following is exemplary code for its transformation into a new digital file that represents an altered user music fragment, with the alteration being based on the logic represented by the coding pattern 520:
MusicFile DigitalFile;
List<MusicFile> fileList;
generateTargetFile() {
    loop(4) {
        MusicFile forwardFile = transform(DigitalFile, 2, forward);
        fileList.add(forwardFile);
        MusicFile delayFile = transform(DigitalFile, delay);
        fileList.add(delayFile);
    }
    MusicFile NewDigitalFile = addFile(fileList);
    return NewDigitalFile;
}
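For illustration only, the exemplary code above could be rendered as the following runnable Python sketch; the dictionary note structure, the fixed rest length standing in for delayFile, and the helper names are illustrative assumptions:

def forward_double_speed(notes):
    # Coding symbol "playing forward at double speed": halve all timings.
    return [{"pitch": n["pitch"], "start": n["start"] / 2, "end": n["end"] / 2}
            for n in notes]

def add_files(fragments, rest=0.5):
    # addFile: concatenate the fragments end to end; `rest` stands in for delayFile.
    merged, offset = [], 0.0
    for fragment in fragments:
        for n in fragment:
            merged.append({"pitch": n["pitch"],
                           "start": n["start"] + offset,
                           "end": n["end"] + offset})
        if fragment:
            offset += max(n["end"] for n in fragment)
        offset += rest
    return merged

def generate_target_file(digital_file):
    file_list = []
    for _ in range(4):                            # loop (4)
        file_list.append(forward_double_speed(digital_file))
    return add_files(file_list)                   # NewDigitalFile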
Fig. 6 is an exemplary schematic diagram illustrating multiple music cards being used to generate a master digital file representing a single track music piece in accordance with one embodiment of the present invention. As seen in Fig. 6, multiple music cards 610 are placed together by end-users on the interactive surface 101. The interactive surface 101 retrieves the digital files associated with each of the music cards, and merges them into a single master digital file representing a single track music piece.
Fig. 7 is an exemplary schematic diagram illustrating multiple music cards being used to generate a master digital file representing a music piece of multiple tracks in accordance with one embodiment of the present invention. As shown in Fig. 7, three music cards 710 are separately placed into three tracks 730 on the interactive surface 101, together with multiple coding cards 720. The interactive surface first generates a digital file for each of the three music cards. These three digital files can then be further organized into three tracks of a master digital file representing a music piece of multiple tracks. The exemplary code for the process is as follows:
MusicFile DigitalFile1;
MusicFile DigitalFile2;
MusicFile DigitalFile3;
List<MusicFile> fileList;
generateTargetFile() {
    fileList.add(DigitalFile1);
    fileList.add(DigitalFile2);
    fileList.add(DigitalFile3);
    MusicFile MasterDigitalFile = mergeTrack(fileList);
    return MasterDigitalFile;
}
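The mergeTrack call above is left undefined in the exemplary code; one possible reading, sketched below in Python under the same list-of-notes assumption, keeps each digital file as a separate, simultaneously played track rather than concatenating the notes:
def merge_track(file_list):
    # Combine several digital files into one multi-track master file,
    # keeping each file as its own parallel track.
    return {"track_%d" % (i + 1): digital_file for i, digital_file in enumerate(file_list)}

master = merge_track([[(60, 1.0)], [(64, 1.0)], [(67, 1.0)]])
# -> {'track_1': [(60, 1.0)], 'track_2': [(64, 1.0)], 'track_3': [(67, 1.0)]}
Under this reading, the single-track merge of Fig. 6 concatenates note events in time, whereas the multi-track merge of Fig. 7 preserves simultaneity between the three fragments.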
Fig. 8 is an exemplary schematic diagram illustrating multiple music elements being used to modify the duration of a music note in accordance with one embodiment of the present invention. As mentioned earlier, the touch actions by the end-user, including the duration of each touch action upon the interactive surface 101, are precisely captured and recorded; such durations may or may not conform precisely to an intended tempo or rhythm. The interactive surface is further configured to modify the duration of a music note from the touch action of an end-user, when or after the first digital file is generated. For example, the durations may be "standardized" to conform to a precise tempo and rhythm by removing small inaccuracies introduced by the end-user; alternatively, the durations may be changed so as to create a new rhythm for the user music fragment. As seen in Fig. 8, the end-user may touch any of the multiple music elements 810 marked (on a physical sheet placed) on the interactive surface 101 to modify the duration of a music note of the first digital file. The interactive surface is further configured to generate an accurate notation of the music note after the durations of music notes within the user music fragment have been so modified, and the notation of the music note can be shown on a display device. Alternatively, we could provide a menu of durations in the form of rhythms from which the end-user can pick and choose.
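As a hedged illustration of the "standardizing" described above, the following Python sketch snaps each recorded duration to the nearest value in a small menu of rhythmic durations; the grid values and the helper name are assumptions made only for illustration:
def quantize_durations(notes, grid=(0.25, 0.5, 1.0, 2.0)):
    # Snap each recorded duration (in beats) to the nearest value in the grid,
    # removing the small timing inaccuracies introduced by the end-user's touch.
    quantized = []
    for pitch, duration in notes:
        nearest = min(grid, key=lambda g: abs(g - duration))
        quantized.append((pitch, nearest))
    return quantized

# A touch held for 0.47 beats snaps to 0.5; one held for 1.08 beats snaps to 1.0.
print(quantize_durations([(60, 0.47), (64, 1.08)]))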
Fig. 9 is an exemplary schematic diagram illustrating the process flow in accordance to one embodiment of the present invention.
The process flow for the present invention has the following steps:
Step 901: touching, by an end-user, upon one or more icons of music elements visually marked on a physical sheet placed on an interactive surface that is configured to recognize the identifier of the physical sheet and finger touches upon the icons of the music elements. This step allows an end-user to freely generate a user music fragment, in a way that is very similar to playing an actual piano keyboard. In addition, other types of arrangements of music elements on the physical sheet can be used, which allow an end-user to generate a user music fragment in a free-play fashion. The end-user does not even need to have any familiarity with the keyboard, so long as s/he enjoys the sound of his or her play; the present invention allows him or her to incorporate that play into a music fragment.
Step 902: generating, by the interactive surface, a first digital file representing a user music fragment based on such touch actions. This step captures what the end-user has played into a first digital file. In accordance with another embodiment of the present invention, the expressive qualities of the touch actions are captured as well, and are incorporated into the first digital file.
Step 903: associating, by the interactive surface, the first digital file with the identifier of a music card placed on the interactive surface, wherein the interactive surface is configured to recognize the identifier of the music card. This step creates a physical embodiment of what an end-user has played, turning the intangible music fragment that an end-user just played into a tangible physical card that represents such music fragment.
Step 904: retrieving, by the interactive surface, the first digital file associated with the identifier of the music card, upon the music card being placed on the interactive surface for the second time. This step describes how the originally intangible music fragment that an end-user just played can now be used in a tangible way. The retrieving of the first digital file allows the end-user to further work with the music fragment that s/he has just created. S/he can now listen to the music fragment, and place the music fragment into a composition.
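Purely as a hedged, end-to-end illustration of steps 901 through 904, the following Python sketch resolves touches on a recognized sheet, builds a first digital file, binds it to a music card identifier, and retrieves it when the card is placed again; the sheet layout table, the (x, start, duration, pressure) touch record, the pressure-to-velocity mapping, and the card UID values are all assumptions, not elements prescribed by this disclosure.
# Assumed layout: sheet identifier -> list of (x_min, x_max, MIDI pitch) icon zones.
SHEET_LAYOUTS = {
    "sheet-001": [(0, 40, 60), (40, 80, 62), (80, 120, 64)],
}

card_registry = {}   # music card identifier (UID) -> first digital file

def resolve_touch(sheet_id, x):
    # Step 901: map a touch coordinate to the music element icon beneath it.
    for x_min, x_max, pitch in SHEET_LAYOUTS.get(sheet_id, []):
        if x_min <= x < x_max:
            return pitch
    return None

def generate_digital_file(sheet_id, touch_actions):
    # Step 902: turn touch actions (x, start, duration, pressure) into note events,
    # carrying the expressive pressure value into a MIDI-style velocity.
    notes = []
    for x, start, duration, pressure in touch_actions:
        pitch = resolve_touch(sheet_id, x)
        if pitch is not None:
            notes.append({"pitch": pitch, "start": start, "duration": duration,
                          "velocity": int(round(pressure * 127))})
    return notes

def associate(card_uid, digital_file):
    # Step 903: bind the first digital file to the music card's identifier.
    card_registry[card_uid] = digital_file

def retrieve(card_uid):
    # Step 904: when the card is placed again, return the fragment it represents.
    return card_registry.get(card_uid)

fragment = generate_digital_file("sheet-001", [(10, 0.0, 0.5, 0.8), (50, 0.5, 1.0, 0.6)])
associate("card-0042", fragment)
print(retrieve("card-0042"))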
In accordance with one embodiment of the present invention, within the first digital file and subsequent (altered) digital files, the music pitch and duration are expressed in MIDI notation. In accordance with another embodiment of the present invention, the music described in the digital files is expressed in customized notations, so long as the interactive surface is configured to transform such notations into analogue sounds (for playing the music) and into standard MIDI notation (for the end-user to share the composition with others).
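For example, a customized notation that names pitches as "C4" or "D#4" could be mapped to standard MIDI note numbers as sketched below in Python; the notation format and the mapping helper are illustrative assumptions rather than notations prescribed by this disclosure:
NOTE_OFFSETS = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def to_midi_number(name):
    # Convert a custom pitch name such as "C4" or "D#4" into a MIDI note number.
    pitch, octave = name[:-1], int(name[-1])
    return 12 * (octave + 1) + NOTE_OFFSETS[pitch]

print(to_midi_number("C4"))    # -> 60
print(to_midi_number("D#4"))   # -> 63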

Claims (20)

  1. A system for composing music with physical cards, comprising:
    a physical music card, comprising an identifier;
    a physical sheet, comprising an identifier, and visually marked with icons of music elements;
    an interactive surface that is configured to recognize the identifiers of the physical music card and the physical sheet, and finger touches upon the icon of a music element;
    wherein, upon an end-user touching upon one or more icons of the music elements on the physical sheet placed on the interactive surface, the interactive surface is configured to generate a first digital file representing a user music fragment based on such touch action, and associate the first digital file with the identifier of the physical music card placed on the interactive surface, and
    wherein, upon the physical music card being placed on the interactive surface for the second time, the interactive surface is further configured to retrieve the first digital file associated with the identifier of the physical music card.
  2. The system of Claim 1, wherein, the interactive surface is configured to generate a digital profile of the touch action, and incorporate such digital profile into the first digital file.
  3. The system of Claim 1, further comprising a physical coding card, comprising an identifier and visually marked with a coding symbol, wherein, upon the physical music card and the physical coding card being placed together on the interactive surface, the interactive surface is configured to transform the first digital file representing the user music fragment into a second digital file representing an altered user music fragment, with the alteration being based on the logic represented by the coding symbol.
  4. The system of Claim 3, further comprising a plurality of physical coding cards, wherein, upon the physical music card being placed into a coding pattern comprising the plurality of physical coding cards, the interactive surface is configured to generate a third digital file that represents an altered user music fragment, with the alteration being based on the logic represented by the coding pattern.
  5. The system of Claim 4, further comprising a pre-defined condition in the third digital file, wherein the altered user music fragment progresses differently depending on whether the pre-defined condition has been met or not.
  6. The system of Claim 1, further comprising a plurality of physical music cards, wherein, the interactive surface is configured to generate multiple digital files, each associated with the identifier of a physical music card among the plurality of physical music cards, and combine the multiple digital files into a master digital file representing a single track music piece, upon the plurality of physical music cards having been placed together on the interactive surface by the end-user.
  7. The system of Claim 6, wherein an end-user instructs the interactive surface to organize the multiple digital files into a master digital file representing a music piece of multiple tracks.
  8. The system of Claim 1, wherein the identifier comprises a unique identification code (UID) , and the UID is encoded with a device selected from a group consisting of a radio frequency identification chip, a pattern of capacitive tabs, and a pattern of magnetic tabs, and wherein the interactive surface is configured to recognize the UID through a reader or a sensor array that is operatively linked to the interactive surface.
  9. The system of Claim 1, wherein the identifier comprises a unique optical pattern, and wherein the interactive surface further comprises a camera that is operatively linked to the interactive surface and is configured to recognize the unique optical pattern through the camera.
  10. The system of Claim 1, wherein, the interactive surface is configured to modify the duration of a music note from the touch action of the end-user, when or after the first digital file is generated, and to generate an accurate notation of the music note.
  11. A method for composing music with physical cards, comprising:
    touching, by an end-user, upon one or more icons of music elements visually marked on a physical sheet placed on an interactive surface, wherein the interactive surface is configured to recognize the identifier of the physical sheet and finger touches upon the icons of the music elements;
    generating, by the interactive surface, a first digital file representing a user music fragment based on such touch actions;
    associating, by the interactive surface, the first digital file with the identifier of a physical music card placed on the interactive surface, wherein the interactive surface is configured to recognize the identifier of the physical music card;
    retrieving, by the interactive surface, the first digital file associated with the identifier of the physical music card, upon the physical music card being placed on the interactive surface for the second time.
  12. The method of Claim 11, further comprising, generating a digital profile of the touch action, and incorporating such digital profile into the first digital file, by the interactive surface.
  13. The method of Claim 11, further comprising:
    placing a physical coding card and the physical music card together on the interactive surface, wherein the physical coding card comprises an identifier and is visually marked with a coding symbol; and
    transforming, by the interactive surface, the first digital file representing the user music fragment into a second digital file representing an altered user music fragment, with the alteration being based on the logic represented by the coding symbol.
  14. The method of Claim 13, further comprising:
    placing, by the end-user, the physical music card into a coding pattern comprising multiple physical coding cards on the interactive surface;
    generating, by the interactive surface, a third digital file that represents an altered user music fragment, with the alteration being based on the logic represented by the coding pattern.
  15. The method of Claim 14, further comprising:
    incorporating a pre-defined condition into the third digital file, wherein, the altered user music fragment progresses differently, depending on whether the pre-defined condition has been met or not.
  16. The method of Claim 11, further comprising:
    generating, by the interactive surface, multiple digital files, each associated with the identifier of a physical music card among a plurality of physical music cards;
    combining the multiple digital files into a master digital file representing a single track music piece, upon the plurality of physical music cards having been placed together on the interactive surface by the end-user.
  17. The method of Claim 16, further comprising:
    instructing, by an end user, the interactive surface to organize the multiple digital files into a master digital file representing a music piece of multiple tracks.
  18. The method of claim 11, wherein the identifier comprises a unique identification code (UID) , and the UID is encoded with a device selected from a group consisting of a radio frequency identification chip, a pattern of capacitive tabs, and a pattern of magnetic tabs, and the method further comprising:
    recognizing the UID through a reader or a sensor array that is operatively linked to the interactive surface.
  19. The method of Claim 11, wherein the identifier comprises a unique optical pattern, and wherein the interactive surface further comprises a camera that is operatively linked to the interactive surface, and the method further comprising:
    recognizing the unique optical pattern through the camera.
  20. The method of Claim 11, further comprising, by the interactive surface:
    modifying the duration of a music note from the touch action of the end-user; and
    generating an accurate notation of the first digital file, when or after the first digital file is generated.
PCT/CN2019/074004 2019-01-30 2019-01-30 System and method for composing music with physical cards WO2020154982A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/074004 WO2020154982A1 (en) 2019-01-30 2019-01-30 System and method for composing music with physical cards

Publications (1)

Publication Number Publication Date
WO2020154982A1 true WO2020154982A1 (en) 2020-08-06

Family

ID=71840659

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/074004 WO2020154982A1 (en) 2019-01-30 2019-01-30 System and method for composing music with physical cards

Country Status (1)

Country Link
WO (1) WO2020154982A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150068387A1 (en) * 2013-03-12 2015-03-12 Zheng Shi System and method for learning, composing, and playing music with physical objects
US20150279343A1 (en) * 2014-01-30 2015-10-01 Zheng Shi Apparatus and method to enhance the expressive qualities of digital music
CN108961611A (en) * 2018-06-05 2018-12-07 口碑(上海)信息技术有限公司 A kind of information processing system and method for commodity to be processed
CN110110147A (en) * 2017-12-27 2019-08-09 中兴通讯股份有限公司 A kind of method and device of video frequency searching

Similar Documents

Publication Publication Date Title
US9183755B2 (en) System and method for learning, composing, and playing music with physical objects
JP6429336B2 (en) System and method for learning, composing and playing music with physical objects
Marrin Toward an understanding of musical gesture: Mapping expressive intention with the digital baton
Holland Artificial intelligence, education and music: The use of artificial intelligence to encourage and facilitate music composition by novices
US20110146477A1 (en) String instrument educational device
WO2015113360A1 (en) System and method for learning,composing,and playing music with physical objects
CN106133824A (en) For rolling the method for music score, equipment and computer program
JP6459378B2 (en) Problem management apparatus and problem management program
Medina-Gray Modular structure and function in early 21st-century video game music
Weinberg et al. Robotic musicianship: embodied artificial creativity and mechatronic musical expression
CN101604486A (en) Musical instrument playing and practicing method based on speech recognition technology of computer
CN106952510B (en) Pitch calibrator
TWI274321B (en) System to enable the use of white keys of musical keyboards for scales
Bongers Interactivation: Towards an e-cology of People, our Technological Environment, and the Arts
JP4506175B2 (en) Fingering display device and program thereof
Clarke Expression and communication in musical performance
US11308926B2 (en) Method and system for composing music with chord accompaniment
WO2020154982A1 (en) System and method for composing music with physical cards
CN105489209A (en) Electroacoustic musical instrument rhythm controllable method and improvement of karaoke thereof
Weinberg Interconnected musical networks: bringing expression and thoughtfulness to collaborative group playing
Haenselmann et al. A zero-vision music recording paradigm for visually impaired people
JP7473781B2 (en) SOUND ELEMENT INPUT MEDIUM, READING AND CONVERSION DEVICE, MUSICAL INSTRUMENT SYSTEM, AND MUSIC SOUND GENERATION METHOD
Laws Beckett in New Musical Composition
Kanga " Building an instrument" in the collaborative composition and performance of works for piano and live electronics
Curry The Philosophy of Musical Instruments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 19913167
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 19913167
    Country of ref document: EP
    Kind code of ref document: A1