WO2018236962A1 - Autonomous page-turner for sheet music - Google Patents


Info

Publication number
WO2018236962A1
Authority
WO
WIPO (PCT)
Prior art keywords
music
page
pitch
pitches
sheet
Prior art date
Application number
PCT/US2018/038440
Other languages
French (fr)
Inventor
Po-Li SOONG
Original Assignee
Cattle Pasture Studio, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cattle Pasture Studio, LLC
Publication of WO2018236962A1

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10G REPRESENTATION OF MUSIC; RECORDING MUSIC IN NOTATION FORM; ACCESSORIES FOR MUSIC OR MUSICAL INSTRUMENTS NOT OTHERWISE PROVIDED FOR, e.g. SUPPORTS
    • G10G1/00 Means for the representation of music

Definitions

  • a method of operating a computing system includes displaying a music sheet page of a music sheet file on a user interface to the application.
  • the music sheet file comprises a sequence of pitches associated with each of a plurality of music sheet pages.
  • a pitch is identified from inputted audio data and responsively added to branches of a suffix tree associated with the music sheet page.
  • Each of the branches of the suffix tree is then compared to a sequence of pitches associated with the music sheet page to determine a location of the identified pitch in the sequence of pitches.
  • when the location of the identified pitch is determined to be at an end of the sequence of pitches, a subsequent music sheet page is displayed on the user interface.
  • Figure 1 illustrates an operational architecture for implementing an enhanced computing environment to autonomously turn a page for sheet music on an electric computing device according to an implementation.
  • Figure 2 illustrates a page turner process employed in implementations of an enhanced application to autonomously turn a page for sheet music on an electric computing device.
  • Figure 3 illustrates an exemplary user interface to autonomously turn a page of sheet music on a tablet according to an implementation.
  • Figure 4 illustrates an exemplary host computing system to autonomously turn a page for sheet music on an electric computing device according to an implementation.
  • Figure 5 illustrates a sequence diagram to autonomously turn a page for sheet music on an electric computing device according to an implementation.
  • Figure 6 illustrates an exemplary suffix tree to determine the location of the identified pitch on a page of sheet music according to an implementation.
  • Figure 7 illustrates an exemplary table to determine the location of the identified pitch on a page of sheet music according to an implementation.
  • Figure 8 illustrates an exemplary acoustic signal of a sound wave according to an implementation.
  • Figure 9 illustrates an exemplary energy density spectrum of an acoustic signal according to an implementation.
  • Figure 10 illustrates a computing system to autonomously turn a page for sheet music on an electric computing device according to an implementation.
  • Examples of the present disclosure describe an application for autonomously turning a page for sheet music on an electric computing device.
  • a music sheet page of a music sheet file is displayed on a user interface to the application.
  • the music sheet file comprises a sequence of pitches associated with each of a plurality of music sheet pages.
  • a pitch is identified from inputted audio data and responsively added to branches of a suffix tree associated with the music sheet page.
  • Each of the branches of the suffix tree is then compared to a sequence of pitches associated with the music sheet page to determine a location of the identified pitch in the sequence of pitches.
  • when the location of the identified pitch is determined to be at an end of the sequence of pitches, a subsequent music sheet page is displayed on the user interface.
  • a technical effect that may be appreciated from the present discussion is the increased efficiency in displaying a next page of sheet music to a user without the need for the user to disrupt the musical performance to manually turn the page of sheet music.
  • the application described herein also improves the accuracy of the timing with which the user sees the next page of sheet music, since a more precise location of the user's position on the page of sheet music may be determined.
  • the pitch from inputted audio data may be identified by determining an energy density spectrum of the inputted audio data.
  • the pitch may be identified based on the inputted audio data having a frequency with a spectral density within a tolerance level of a nominal frequency.
  • the inputted audio data is calibrated with each pitch of the sequence of pitches associated with each of the music sheet pages of the music sheet file.
  • the sheet music file comprises a Music Extensible Markup Language (MusicXML) file.
  • one or more branches of the suffix tree may be removed when the identified pitch added to the one or more branches does not match the next pitch in the sequence of pitches associated with the music sheet page.
  • a new branch may be added to the suffix tree when the identified pitch added to the one or more branches does not match the next pitch in the sequence of pitches associated with the music sheet page.
  • one or more branches of the suffix tree may be split when the identified pitch added to the one or more branches matches one or more locations in the sequence of pitches associated with the music sheet page.
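The three branch operations above can be modeled, in simplified form, as maintaining a set of candidate match positions in the page's pitch sequence: a mismatch removes a hypothesis, a fresh occurrence of the pitch adds one, and multiple occurrences split one hypothesis into several. The sketch below is an illustrative simplification of the suffix-tree bookkeeping with hypothetical names, not the patented implementation.

```python
def advance_candidates(candidates, pitch, page_pitches):
    """Advance a set of candidate match positions with a newly identified pitch.

    candidates: set of indices into page_pitches, each marking the position of
    the most recently matched pitch for one hypothesis (one suffix-tree branch).
    Returns the updated candidate set.
    """
    updated = set()
    for pos in candidates:
        # A branch survives only if the identified pitch matches the next
        # pitch in the page's sequence; otherwise the branch is removed.
        if pos + 1 < len(page_pitches) and page_pitches[pos + 1] == pitch:
            updated.add(pos + 1)
    # A new branch is started at every location where the pitch occurs,
    # covering the "new branch" and "split" cases described above.
    for i, p in enumerate(page_pitches):
        if p == pitch:
            updated.add(i)
    return updated
```

For example, with a page whose pitch sequence begins A, B, F, feeding the identified pitches A, B, F leaves a single candidate at index 2, mirroring how mismatched branches are pruned while matching branches advance.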
  • Figure 1 illustrates an exemplary operational architecture 100 related to processing operations for management of an exemplary enhanced system with which aspects of the present disclosure may be practiced.
  • Operational environment 100 includes computing system 101 comprising application 102.
  • Application 102 employs a page turner process 200 in the context of producing views in a user interface 103.
  • User interface 103 displays sheet music pages 110-111 of a sheet music file processed by application 102.
  • Computing system 101 is representative of any device capable of running an application natively or in the context of a web browser, streaming an application, or executing an application in any other manner.
  • Examples of computing system 101 include, but are not limited to, personal computers, mobile phones, tablet computers, desktop computers, laptop computers, wearable computing devices, or any other form factor, including any combination of computers or variations thereof.
  • Computing system 101 may include various hardware and software elements in a supporting architecture suitable for performing page turning process 200.
  • One such representative architecture is illustrated in Figure 10 with respect to computing system 1001.
  • Application 102 includes a software application or application component capable of displaying and autonomously turning sheet music pages in accordance with the processes described herein.
  • Examples of the software application include, but are not limited to, speech editing applications, music editing applications, video editing applications, and any other type of combination or variation thereof.
  • the software application may be
  • User interface 103 includes a representative view of sheet music pages 110-111 that may be produced by a sheet music application. As can be seen in sheet music page 110, musical notes may be illustrated which correspond to a sequence of pitches stored in a sheet music audio file. An end user may interface with application 102 to view sheet music pages 110-111. The user may interface with application 102 over user interface 103 using an input instrument such as a microphone, which allows application 102 to process inputted audio data.
  • Figure 2 illustrates page turner process 200 which, as mentioned, may be employed by application 102 to autonomously turn a page for sheet music on an electric computing device as described herein.
  • Some or all of the steps of page turner process 200 may be implemented in program instructions in the context of a component or components to the application used to carry out the visual representation display feature.
  • the program instructions direct application 102 to operate as follows, referring parenthetically to the steps in Figure 2 in the context of Figure 1.
  • application 102 displays music sheet page 110 of a music sheet file on user interface 103 to application 102 (step 201).
  • the music sheet file comprises a sequence of pitches 120 associated with each of a plurality of music sheet pages 110-111 of a music sheet file.
  • each page of sheet music in the sheet music file has a sequence of notes.
  • first page of sheet music 110 may begin with a sequence of notes, such as A, B, F. These notes would then be associated with sequence of pitches 120 and stored in the sheet music file for sheet music page 110.
  • each page of sheet music may contain different sequences of pitches.
  • the music sheet file may comprise a Music Extensible Markup Language (MusicXML) file.
  • the music sheet file may alternatively be in any other file format for representing musical notation.
  • Application 102 then identifies pitch 140 from inputted audio data 130 and responsively adds identified pitch 140 to one or more branches 160-162 of suffix tree 150 associated with music sheet page 110 (step 202). For example, application 102 identifies that inputted audio data 130 indicates that identified pitch 140 is the note C. The note is then added to each of branches 160-162 in suffix tree 150.
  • pitch 140 from inputted audio data 130 may be identified by determining an energy density spectrum of the inputted audio data.
  • pitch 140 may be identified based on inputted audio data 130 having a frequency with a spectral density within a tolerance level of a nominal frequency.
  • inputted audio data 130 is calibrated with each pitch of sequence of pitches 120-121 associated with each of music sheet pages 110-111 of the music sheet file.
  • application 102 compares each of one or more branches 160-162 of suffix tree 150 to sequence of pitches 120 associated with music sheet page 110 to determine a location of the identified pitch 140 in sequence of pitches 120 associated with music sheet page 110 (step 203).
  • One or more branches of suffix tree 150 may be removed when identified pitch 140 added to one or more branches 160-162 does not match the next pitch in sequence of pitches 120 associated with music sheet page 110.
  • branch 161 of suffix tree 150 has a sequence of A, F#, and C. However, since F# is not included in pitch sequence 120, branch 161 may be removed from suffix tree 150.
  • a new branch may be added to suffix tree 150 when the identified pitch added to one or more branches 160-162 does not match the next pitch in the sequence of pitches 120 associated with music sheet page 110.
  • one or more branches 160-162 of suffix tree 150 may be split when the identified pitch added to one or more branches 160-162 matches one or more locations in sequence of pitches 120 associated with music sheet page 110.
  • when the location of identified pitch 140 is determined to be at an end of sequence of pitches 120, application 102 displays subsequent music sheet page 111 of the music sheet file on user interface 103 (step 204). It should be understood that application 102 may determine that a pitch before the last pitch in sequence of pitches 120 indicates the end of sheet music page 110. For example, although the note B is the last pitch in pitch sequence 120, application 102 may determine to turn from sheet music page 110 to sheet music page 111 once branch 163 of suffix tree 150 reaches the second to last pitch (i.e., note C). This allows the page to seamlessly turn to the next page without causing delay to the user viewing the sheet music.
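The early-turn behavior described above, where the display advances at the second-to-last pitch so the performer never waits on the turn, can be reduced to a threshold test on the matched location. The helper below and its one-pitch margin are assumptions for illustration, not taken from the patent:

```python
def should_turn_page(location, sequence_length, margin=1):
    """Return True once the matched location is within `margin` pitches of the
    end of the current page's pitch sequence, so the subsequent page can be
    displayed without delaying the performer."""
    return location >= sequence_length - 1 - margin
```

With a five-pitch sequence and the default one-pitch margin, the turn fires at index 3 (the second-to-last pitch) rather than at the final index 4.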
  • FIG. 3 illustrates an exemplary tablet to autonomously turn a page of sheet music according to an implementation.
  • Tablet 301 includes graphical display 302 to present a sheet music page to a user.
  • the sheet music page contains notes which are each associated with pitches in a sequence of pitches.
  • the sequence of pitches is associated with each page of the music sheet file.
  • the user may interface with tablet 301 using an input instrument such as a stylus, mouse device, keyboard, touch gesture, as well as any other suitable input device.
  • Tablet 301 also includes microphone 303 and speaker 304.
  • tablet computer 301 graphically displays an initial page of sheet music to a user on graphical display 302. Once a user begins to play a musical instrument, tablet computer 301 begins detecting and receiving inputted audio data.
  • the inputted audio data may be generated by a musical instrument (e.g., a piano, violin, etc.), a voice, an additional computing device (e.g., electronic piano).
  • the audio data may be captured and input into tablet 301 for processing using microphone 303.
  • the audio data is converted and transmitted to the application executing on tablet computer 301 to identify pitches from the inputted audio data.
  • Tablet computer 301 then begins to build a set of branches off of a suffix tree based on the identified pitches.
  • the set of subsequences is then matched to a stored sequence of pitches associated with the sheet music page on display in graphical display 302.
  • tablet computer 301 determines the location of the pitches on the page of sheet music. While the user moves through the page, tablet computer 301 is enabled to determine when the user is nearing the end of the current page of sheet music. Therefore, when the application executing on tablet computer 301 determines that the suffix tree indicates that the location of the latest identified pitch is at the end of the page of sheet music, graphical display 302 displays a subsequent page of sheet music.
  • the user playing the musical instrument may view the subsequent page of sheet music without having to interrupt the performance of the song with a physical motion or audio command instructing tablet computer 301 to turn to the next page, such as by inputting additional noises into microphone 303 or touching graphical display 302.
  • alternatively, a physical motion or audio command instructing tablet computer 301 may be used to turn to the next page, such as by inputting additional noises into microphone 303 or touching graphical display 302.
  • FIG. 4 illustrates a computing environment 400 to autonomously turn a page for sheet music on an electric computing device according to an implementation.
  • Computing environment 400 includes computing system 401 comprising application 402, audio input 410, A/D converter 420, music processor 430, and sheet music display 440.
  • sheet music display 440 graphically displays an initial page of sheet music to a user.
  • the initial page of sheet music may be previously downloaded into computing system 401 or currently be streaming from an external computing system as executed by application 402.
  • the digital sheet music may be received and stored in various digital formats, such as the MusicXML format.
  • audio input 410 receives inputted audio data.
  • audio data created by playing a musical instrument may be inputted into computing system 401 via audio input 410, such as a microphone.
  • the audio data is then converted in A/D converter 420 and transferred to music processor 430.
  • Music processor 430 then identifies pitches from the inputted audio data.
  • the sound from the musical instrument is picked up from a microphone and quantized with A/D converter 420.
  • the quantized sound, in the form of a stream of binary data, is then processed with a Short-Time Fourier Transform (STFT) to derive the spectrum of the music being played. From the spectrum, the pitches played in the duration for the Fourier transform are extracted.
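A cheap way to probe the spectrum of the quantized stream at just the nominal pitch frequencies, rather than computing a full transform, is the Goertzel algorithm, a single-bin DFT evaluation. This pure-Python sketch is one assumed realization of the spectral step, not the patent's implementation:

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Spectral power of `samples` at the DFT bin nearest `target_freq` (Hz),
    computed with the Goertzel recurrence (a single-bin alternative to a
    full FFT over the analysis window)."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin index
    omega = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(omega)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Standard Goertzel magnitude-squared output for the selected bin.
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
```

A 440 Hz test tone sampled at 8 kHz shows far more energy at the A4 bin than at the C5 bin, which is the comparison the pitch-identification stage relies on.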
  • Music processor 430 then builds branches of the suffix tree and matches the identified pitches on the suffix tree with a known sequence of pitches associated with the music audio file, such as a MusicXML file. Music processor 430 may then determine the location of the pitches on the page of sheet music. In one implementation, the pitches are internally organized on the suffix tree and the pitches derived from the STFT are used to match the edges of the tree to determine the likely position on the current page of the sheet music.
  • music processor 430 determines that the suffix tree indicates that the location is at the end of the page of sheet music, music processor 430 directs sheet music display 440 to display a subsequent page of sheet music.
  • FIG. 6 illustrates an exemplary suffix tree 600 to determine the location of the identified pitch on a page of sheet music according to an implementation.
  • each pitch which is identified from the inputted audio data may be added to a branch on suffix tree 600.
  • C5 may be identified as a pitch from the inputted audio data.
  • C5 may then be added to branch 1.
  • C4 may be identified from the inputted audio data.
  • C4 would then be added to branch 1.
  • new branch 2 may be created.
  • branches 1 and 2 may be advanced and new branch 3 may be added.
  • Next pitches E4/E5 are identified. This results in the addition of branches 4 and 6 and the advancement of branches 1-3.
  • the application may then determine that the next pitch is identified to be G5.
  • branch 7 is split off of branch 2
  • branch 8 is split off of branch 3
  • branch 9 is split off of branch 4
  • branches 1 and 6 are removed from the suffix tree.
  • the suffix tree removes branches 7, 8, 9, and 10 and adds branch 11.
  • FIG. 7 illustrates exemplary table 700 to determine the location of the identified pitch on a page of sheet music according to an implementation.
  • the most prevalent tuning systems of contemporary world music divide an octave into 12 pitches. These tuning systems employ different methods of dividing the frequency range of an octave and result in frequency differences among the nominally equivalent pitches. The differences in frequency of the pitches, however, are audibly discernible only by trained tuning technicians, musicians, or individuals with a highly sensitive sense of hearing. The resulting frequency differences from the prevalent tuning systems are minute. Therefore, an application running on a computing system may use the frequencies from the standard 12-tone equal temperament tuning.
  • Table 700 in Figure 7 lists the frequencies of the pitches in the twelve-tone equal temperament tuning.
  • the scientific pitch notation is used for the pitches of the table entries.
  • C4 and A4 in cells of Table 700 are the pitches for middle C and A440.
  • the numbers in the parentheses are the Musical Instrument Digital Interface (MIDI) number corresponding to the pitches.
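The entries of Table 700 follow from the twelve-tone equal temperament relation f(m) = 440 · 2^((m − 69)/12), where m is the MIDI number and A4 (MIDI 69) is the 440 Hz reference. A minimal sketch (the function name is hypothetical):

```python
def midi_to_frequency(midi_number):
    """Nominal frequency in Hz of a MIDI note under twelve-tone equal
    temperament, referenced to A4 = MIDI number 69 = 440 Hz."""
    return 440.0 * 2.0 ** ((midi_number - 69) / 12.0)
```

For example, midi_to_frequency(69) yields 440.0 (A440) and midi_to_frequency(60) yields about 261.63 Hz (middle C), matching the A4 and C4 cells of Table 700.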
  • FIG. 8 illustrates exemplary acoustic signal graph 800 according to an implementation.
  • the sound wave encompasses the last one second of a first measure of music. Identification of pitches in the music being played is achieved by examining the computed energy density spectrum of the incoming audio data.
  • energy density spectra graph 900 corresponding to acoustic signal graph 800 illustrated in Figure 8 is presented.
  • the energy density spectrum is computed with FFT on the samples from the last time interval T.
  • the frequencies with spectral density greater than a predetermined level are compared against the nominal pitch frequencies using Table 700 from Figure 7. For example, between the two sound segments, two pitches may be identified as D4 and B4.
  • a pitch is affirmed if the frequency of a spectral peak falls within the tolerance of the nominal frequency.
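The affirmation rule above can be expressed by measuring how far a spectral peak sits from the nearest equal-temperament pitch and rejecting it beyond a tolerance. The cents-based bound and the helper below are assumptions for illustration; the patent does not specify the tolerance units:

```python
import math

def affirm_pitch(peak_freq, tolerance_cents=50.0):
    """Map a spectral-peak frequency (Hz) to the nearest equal-temperament
    MIDI number, or return None when the peak falls outside the tolerance
    of the nominal frequency (100 cents = one semitone)."""
    if peak_freq <= 0:
        return None
    # Signed distance from A4 (440 Hz) in semitones.
    semitones = 12.0 * math.log2(peak_freq / 440.0)
    nearest = round(semitones)
    cents_off = abs(semitones - nearest) * 100.0
    return nearest + 69 if cents_off <= tolerance_cents else None
```

A slightly flat 261.6 Hz peak is still affirmed as C4 (MIDI 60), while a 452 Hz peak, roughly 47 cents sharp of A4, is rejected under a tighter 20-cent tolerance.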
  • the set of all positively identified pitches is then passed onto the next functional stage for positioning them on the active sheet music page.
  • the time shift by Δt allows successive spectra to be compared and thus the arrival of new pitches to be readily detected.
  • Shaded area 810 may be shifted by Δt to a sound wave segment which ends 0.1 seconds into the second measure.
  • Computing system 1000 is shown in an exemplary implementation.
  • Computing system 1000 provides an example of computing system 101 and tablet computer 301, or any computing system that may be used to function as shown and described herein, although such systems could use alternative configurations.
  • Computing system 1000 comprises audio user interface 1001, graphical user interface 1002, and processing system 1003.
  • Processing system 1003 is linked to audio user interface 1001 and graphical user interface 1002.
  • Processing system 1003 includes processing circuitry 1004 and memory device 1005 that stores operating software 1006.
  • Computing system 1000 may include other well-known components such as a battery and enclosure that are not shown for clarity.
  • Computing system 1000 may be representative of any computing apparatus, system, or systems on which an application or variations thereof may be suitably implemented.
  • Computing system 1000 may reside in a single device or may be distributed across multiple devices. Examples of computing system 1000 include mobile computing devices, such as cell phones, tablet computers, laptop computers, notebook computers, and gaming devices, as well as any other type of mobile computing devices and any combination or variation thereof. Note that the features and functionality of computing system 1000 may apply as well to desktop computers, server computers, and virtual machines, as well as any other type of computing system, variation, or combination thereof.
  • processing system 1003 is operatively coupled with memory device 1005, audio user interface 1001, and graphical user interface 1002.
  • Processing system 1003 loads and executes software 1006 from memory device 1005.
  • Processing system 1003 may comprise a microprocessor and other circuitry that retrieves and executes software 1006 from memory device 1005. Processing system 1003 may be implemented within a single processing device but may also be distributed across multiple processing devices or sub-systems that cooperate in executing program instructions.
  • processing system 1003 examples include general purpose central processing units, application specific processors, and logic devices, as well as any other type of processing device, combinations, or variations thereof.
  • Memory device 1005 may comprise any computer readable media or storage media readable by processing system 1003 and capable of storing software 1006.
  • Memory device 1005 may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • Memory device 1005 may be implemented as a single storage device but may also be implemented across multiple storage devices or sub-systems co-located or distributed relative to each other.
  • Memory device 1005 may comprise additional elements, such as a controller, capable of
  • storage media include random- access memory, read-only memory, magnetic disks, optical disks, flash memory, virtual memory and non-virtual memory, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and that may be accessed by an instruction execution system, as well as any combination or variation thereof, or any other type of storage media.
  • in no case is the storage media a propagated signal.
  • processing system 1003 loads and executes portions of software 1006, such as application modules 1007-1010, to operate as described herein or variations thereof.
  • Software 1006 may be implemented in program instructions and among other functions may, when executed by computing system 1000 in general or processing system 1003 in particular, direct computing system 1000 or processing system 1003 to operate as described herein or variations thereof.
  • Software 1006 may include additional processes, programs, or components, such as operating system software or other application software.
  • Software 1006 may also comprise firmware or some other form of machine-readable processing instructions executable by processing system 1003.
  • software 1006 may, when loaded into processing system 1003 and executed, transform computing system 1000 overall from a general-purpose computing system into a special-purpose computing system customized to operate as described herein for each implementation or variations thereof.
  • encoding software 1006 on memory device 1005 may transform the physical structure of memory device 1005.
  • the specific transformation of the physical structure may depend on various factors in different implementations of this description. Examples of such factors may include, but are not limited to the technology used to implement the storage media of memory device 1005 and whether the computer-readable storage media are characterized as primary or secondary storage.
  • computing system 1000 is generally intended to represent a computing system with which software 1006 is deployed and executed in order to implement application modules 1007-1010 to operate as described herein for each implementation (and variations thereof).
  • computing system 1000 may also represent any computing system on which software 1006 may be staged and from where software 1006 may be distributed, transported, downloaded, or otherwise provided to yet another computing system for deployment and execution, or yet additional distribution.
  • computing system 1000 could be configured to deploy software 1006 over the internet to one or more client computing systems for execution thereon, such as in a cloud- based deployment scenario.
  • Audio user interface 1001 and graphical user interface 1002 may include a voice input device, a touch input device for receiving a gesture from a user, a motion input device for detecting non-touch gestures and other motions by a user, and other comparable input devices and associated processing elements capable of receiving user input from a user.
  • Output devices such as a display, speakers, haptic devices, and other types of output devices may also be included in audio user interface 1001 and graphical user interface 1002.
  • graphical user interface 1002 could include a touch screen capable of displaying a graphical user interface that also accepts user inputs via touches on its surface.
  • the aforementioned user input devices are well known in the art and need not be discussed at length here.
  • Audio user interface 1001 and graphical user interface 1002 may also each include associated user interface software executable by processing system 1003 in support of the various user input and output devices discussed above. Separately or in conjunction with each other and other hardware and software elements, the user interface software and devices may provide a graphical user interface, a natural user interface, or any other kind of user interface.
  • computing system 1000 may also include a communication interface comprising connections and devices that allow for communication between computing system 1000 and other computing systems (not shown) or services.
  • Examples of connections and devices that together allow for inter-system communication may include network interface cards, antennas, power amplifiers, RF circuitry, transceivers, and other communication circuitry.
  • the aforementioned network, connections, and devices are well known and need not be discussed at length here.

Abstract

Described herein are systems (101, 401, 1000), methods, and software (102, 402, 1006) to autonomously turn a page of sheet music (110). In one implementation, a method of operating a computing system (101, 401, 1000) includes displaying a music sheet page (110) of a music sheet file on a user interface (103, 302, 440, 1002) to the application (102, 402). The music sheet file comprises a sequence of pitches (120-121) associated with each of a plurality of music sheet pages (110-111). A pitch (140) is identified from inputted audio data (130) and responsively added to branches (160-162) of a suffix tree (150, 600) associated with the music sheet page (110). Each of the branches (160-162) of the suffix tree (150, 600) is then compared to a sequence of pitches (120) associated with the music sheet page (110) to determine a location of the identified pitch (140) in the sequence of pitches (120). When the location of the identified pitch (140) is determined to be at an end of the sequence of pitches (120), a subsequent music sheet page (111) is displayed on the user interface (103, 302, 440, 1002).

Description

AUTONOMOUS PAGE-TURNER FOR SHEET MUSIC
BACKGROUND
[0001] Page-turning activities occur during practice and on-stage performances.
While playing musical instruments, the continuity of the flow of music is briefly interrupted to turn the page before resuming. To avoid interruptions, a human page-turner may accompany the performer, or the performer may play the musical instrument from memory without sheet music. While current techniques enable a user to turn a page using touch and speech commands, these techniques still require the performer to be distracted from playing the musical instrument.
[0002] Musical performers commonly desire for pages of sheet music to be turned without physical motion, audio commands, or other external distractions. For example, performers want to seamlessly move from one page to another in a sequence of sheets in a hands-free and noise-free manner. Current electronic platforms for displaying a next page of sheet music involve a variety of disruptive cues that must be initiated by the performer. Unfortunately, turning pages during a performance often requires some sort of interruption to the performer and an audience.
OVERVIEW
[0003] Described herein are systems, methods, and software to autonomously turn a page of sheet music. In one implementation, a method of operating a computing system includes displaying a music sheet page of a music sheet file on a user interface to the application. The music sheet file comprises a sequence of pitches associated with each of a plurality of music sheet pages. A pitch is identified from inputted audio data and responsively added to branches of a suffix tree associated with the music sheet page. Each of the branches of the suffix tree is then compared to a sequence of pitches associated with the music sheet page to determine a location of the identified pitch in the sequence of pitches. When the location of the identified pitch is determined to be at an end of the sequence of pitches, a subsequent music sheet page is displayed on the user interface.
[0004] This Overview is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. It may be understood that this Overview is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The following description and associated figures teach the best mode of the invention. For the purpose of teaching inventive principles, some conventional aspects of the best mode may be simplified or omitted. The following claims specify the scope of the invention. Note that some aspects of the best mode may not fall within the scope of the invention as specified by the claims. Thus, those skilled in the art will appreciate variations from the best mode that fall within the scope of the invention. Those skilled in the art will appreciate that the features described below can be combined in various ways to form multiple variations of the invention. As a result, the invention is not limited to the specific examples described below, but only by the claims and their equivalents.
[0006] Figure 1 illustrates an operational architecture for implementing an enhanced computing environment to autonomously turn a page for sheet music on an electric computing device according to an implementation.
[0007] Figure 2 illustrates a page turner process employed in implementations of an enhanced application to autonomously turn a page for sheet music on an electric computing device.
[0008] Figure 3 illustrates an exemplary user interface to autonomously turn a page of sheet music on a tablet according to an implementation.

[0009] Figure 4 illustrates an exemplary host computing system to autonomously turn a page for sheet music on an electronic computing device according to an implementation.
[0010] Figure 5 illustrates a sequence diagram to autonomously turn a page for sheet music on an electronic computing device according to an implementation.
[0011] Figure 6 illustrates an exemplary suffix tree to determine the location of the identified pitch on a page of sheet music according to an implementation.
[0012] Figure 7 illustrates an exemplary table to determine the location of the identified pitch on a page of sheet music according to an implementation.
[0013] Figure 8 illustrates an exemplary acoustic signal of a sound wave according to an implementation.

[0014] Figure 9 illustrates an exemplary energy density spectrum of an acoustic signal according to an implementation.
[0015] Figure 10 illustrates a computing system to autonomously turn a page for sheet music on an electronic computing device according to an implementation.
DETAILED DESCRIPTION
[0016] Examples of the present disclosure describe an application for autonomously turning a page of sheet music on an electronic computing device. A music sheet page of a music sheet file is displayed on a user interface of the application. The music sheet file comprises a sequence of pitches associated with each of a plurality of music sheet pages. A pitch is identified from inputted audio data and responsively added to branches of a suffix tree associated with the music sheet page. Each of the branches of the suffix tree is then compared to a sequence of pitches associated with the music sheet page to determine a location of the identified pitch in the sequence of pitches. When the location of the identified pitch is determined to be at an end of the sequence of pitches, a subsequent music sheet page is displayed on the user interface.
[0017] A technical effect that may be appreciated from the present discussion is the increased efficiency in displaying a next page of sheet music to a user without the need for the user to disrupt the musical performance to manually turn the page of sheet music.
Additionally, the user does not need an additional person to be present to turn the page of music. The application described herein also improves the accuracy of the timing with which the user sees the next page of sheet music, since a more precise location of the user's position on the page of sheet music may be determined.
[0018] Further, examples herein describe that the pitch from inputted audio data may be identified by determining an energy density spectrum of the inputted audio data. In further examples, the pitch may be identified based on the inputted audio data having a frequency with a spectral density within a tolerance level of a nominal frequency. In some implementations, the inputted audio data is calibrated with each pitch of the sequence of pitches associated with each of the music sheet pages of the music sheet file. In another example, the sheet music file comprises a Music Extensible Markup Language (MusicXML) file.
[0019] In some scenarios, one or more branches of the suffix tree may be removed when the identified pitch added to the one or more branches does not match the next pitch in the sequence of pitches associated with the music sheet page. In yet another scenario, a new branch may be added to the suffix tree when the identified pitch added to the one or more branches does not match the next pitch in the sequence of pitches associated with the music sheet page. In other implementations, one or more branches of the suffix tree may be split when the identified pitch added to the one or more branches matches one or more locations in the sequence of pitches associated with the music sheet page.
[0020] Referring to the drawings, Figure 1 illustrates an exemplary operational architecture 100 related to processing operations for management of an exemplary enhanced system with which aspects of the present disclosure may be practiced. Operational architecture 100 includes computing system 101 comprising application 102. Application 102 employs a page turner process 200 in the context of producing views in a user interface 103. User interface 103 displays sheet music pages 110-111 of a sheet music file processed by application 102.
[0021] Computing system 101 is representative of any device capable of running an application natively or in the context of a web browser, streaming an application, or executing an application in any other manner. Examples of computing system 101 include, but are not limited to, personal computers, mobile phones, tablet computers, desktop computers, laptop computers, wearable computing devices, or any other form factor, including any combination of computers or variations thereof. Computing system 101 may include various hardware and software elements in a supporting architecture suitable for performing page turner process 200. One such representative architecture is illustrated in Figure 10 with respect to computing system 1001.
[0022] Application 102 includes a software application or application component capable of displaying and autonomously turning sheet music pages in accordance with the processes described herein. Examples of the software application include, but are not limited to, speech editing applications, music editing applications, video editing applications, and any other type of combination or variation thereof. The software application may be
implemented as a natively installed and executed application, a web application hosted in the context of a browser, a streamed or streaming application, a mobile application, or any variation or combination thereof.
[0023] User interface 103 includes a representative view of sheet music pages 110-111 that may be produced by a sheet music application. As can be seen in sheet music page 110, musical notes may be illustrated which correspond to a sequence of pitches stored in a sheet music audio file. An end user may interface with application 102 to view sheet music pages 110-111. The user may interface with application 102 over user interface 103 using an input instrument such as a microphone, which allows application 102 to process inputted audio data.
[0024] More particularly, Figure 2 illustrates page turner process 200 which, as mentioned, may be employed by application 102 to autonomously turn a page for sheet music on an electronic computing device as described herein. Some or all of the steps of page turner process 200 may be implemented in program instructions in the context of a component or components of the application used to carry out the autonomous page turning feature. The program instructions direct application 102 to operate as follows, referring parenthetically to the steps in Figure 2 in the context of Figure 1.
[0025] In operation, application 102 displays music sheet page 110 of a music sheet file on user interface 103 of application 102 (step 201). The music sheet file comprises a sequence of pitches 120 associated with each of a plurality of music sheet pages 110-111 of the music sheet file. For example, each page of sheet music in the sheet music file has a sequence of notes. The first page of sheet music 110 may begin with a sequence of notes, such as A, B, F. These notes would then be associated with sequence of pitches 120 and stored in the sheet music file for sheet music page 110. It should be understood that each page of sheet music may contain different sequences of pitches. The music sheet file may comprise a Music Extensible Markup Language (MusicXML) file. However, the music sheet file may alternatively be in any other file format for representing musical notation.
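As a rough illustration of this arrangement, the per-page pitch sequences of a music sheet file could be held in memory along the following lines (a hedged sketch; the structure and all names such as `music_sheet_file` and `pitches_for_page` are illustrative, not from the disclosure):

```python
# Hypothetical in-memory representation of a music sheet file: each page
# carries the ordered sequence of pitches printed on that page.
music_sheet_file = {
    "pages": [
        {"page": 1, "pitches": ["A4", "B4", "F4"]},   # e.g. sheet music page 110
        {"page": 2, "pitches": ["C5", "D5", "E5"]},   # e.g. sheet music page 111
    ]
}

def pitches_for_page(sheet, page_number):
    """Return the stored pitch sequence for a given page, or [] if absent."""
    for page in sheet["pages"]:
        if page["page"] == page_number:
            return page["pitches"]
    return []
```

In practice the sequences would be populated from the music sheet file (for example, a MusicXML file) rather than written out by hand.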
[0026] Application 102 then identifies pitch 140 from inputted audio data 130 and responsively adds identified pitch 140 to one or more branches 160-162 of suffix tree 150 associated with music sheet page 110 (step 202). For example, application 102 identifies that inputted audio data 130 indicates that identified pitch 140 is the note C. The note is then added to each of branches 160-162 in suffix tree 150. In some examples, pitch 140 from inputted audio data 130 may be identified by determining an energy density spectrum of the inputted audio data. In further examples, pitch 140 may be identified based on inputted audio data 130 having a frequency with a spectral density within a tolerance level of a nominal frequency. In some implementations, inputted audio data 130 is calibrated with each pitch of sequence of pitches 120-121 associated with each of music sheet pages 110-111 of the music sheet file.
[0027] In a next operation, application 102 compares each of one or more branches 160-162 of suffix tree 150 to sequence of pitches 120 associated with music sheet page 110 to determine a location of the identified pitch 140 in sequence of pitches 120 associated with music sheet page 110 (step 203). One or more branches of suffix tree 150 may be removed when identified pitch 140 added to one or more branches 160-162 does not match the next pitch in sequence of pitches 120 associated with music sheet page 110. For example, branch 161 of suffix tree 150 has a sequence of A, F#, and C. However, since the F# is not included in pitch sequence 120, branch 161 may be removed from suffix tree 150.
[0028] In other scenarios, a new branch may be added to suffix tree 150 when the identified pitch added to one or more branches 160-162 does not match the next pitch in the sequence of pitches 120 associated with music sheet page 110. In other implementations, one or more branches 160-162 of suffix tree 150 may be split when the identified pitch added to one or more branches 160-162 matches one or more locations in sequence of pitches 120 associated with music sheet page 110.
[0029] In a final operation, when the location of identified pitch 140 is determined to be at an end of sequence of pitches 120, application 102 displays subsequent music sheet page 111 of the music sheet file on user interface 103 (step 204). It should be understood that application 102 may determine that a pitch before the last pitch in sequence of pitches 120 indicates the end of sheet music page 110. For example, although the note B is the last pitch in pitch sequence 120, application 102 may determine to turn from sheet music page 110 to sheet music page 111 once branch 163 of suffix tree 150 reaches the second to last pitch (i.e., note C). This allows the page to seamlessly turn to the next page without causing delay to the user viewing the sheet music.
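The branch bookkeeping of steps 202-204 can be modeled compactly as a set of candidate positions within the page's pitch sequence. This is a simplification of the suffix tree described above, offered only as a sketch; the function names and the `lead` parameter (turning one pitch early, as in the branch 163 example) are illustrative assumptions:

```python
def advance(candidates, pitch, page_pitches):
    """One matching step: each candidate is the index of the most recently
    matched pitch on the page. A candidate survives (its branch is advanced)
    only if the next pitch on the page equals the newly identified pitch;
    every location where the pitch occurs also starts a fresh branch."""
    nxt = {pos + 1 for pos in candidates
           if pos + 1 < len(page_pitches) and page_pitches[pos + 1] == pitch}
    nxt |= {i for i, p in enumerate(page_pitches) if p == pitch}
    return nxt

def should_turn_page(candidates, page_pitches, lead=1):
    """Turn once some branch reaches within `lead` pitches of the end of the
    page's sequence; turning slightly early avoids delay for the performer."""
    return bool(candidates) and max(candidates) >= len(page_pitches) - 1 - lead
```

For a page whose pitch sequence ends with C then B, feeding the identified pitches in one at a time would trigger the page turn upon matching the second-to-last pitch C.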
[0030] Figure 3 illustrates an exemplary tablet to autonomously turn a page of sheet music according to an implementation. Tablet 301 includes graphical display 302 to present a sheet music page to a user. The sheet music page contains notes which are each associated with pitches in a sequence of pitches. Although not illustrated, the sequence of pitches is associated with each page of the music sheet file. The user may interface with tablet 301 using an input instrument such as a stylus, mouse device, keyboard, touch gesture, as well as any other suitable input device. Tablet 301 also includes microphone 303 and speaker 304.
[0031] At the bottom of graphical display 302, indicators are provided to illustrate tablet computer 301 toggling between displaying various different pages of the sheets of music. However, it should be noted that tablet computer 301 toggles between displaying various different pages without the use of inputted touch, audio, or any other signaling initiated by a user.

[0032] As shown in Figure 3, tablet computer 301 graphically displays an initial page of sheet music to a user on graphical display 302. Once a user begins to play a musical instrument, tablet computer 301 begins detecting and receiving inputted audio data. The inputted audio data may be generated by a musical instrument (e.g., a piano, violin, etc.), a voice, or an additional computing device (e.g., an electronic piano). The audio data may be captured and input into tablet 301 for processing using microphone 303.
[0033] Accordingly, the audio data is converted and transmitted to the application executing on tablet computer 301 to identify pitches from the inputted audio data. Tablet computer 301 then begins to build a set of branches off of a suffix tree based on the identified pitches. This set of branch subsequences is then matched to a stored sequence of pitches associated with the sheet music page on display in graphical display 302. By comparing the branches of the suffix tree currently being built by the application running on tablet computer 301 with the stored sequence of pitches associated with the sheet music page on display, tablet computer 301 determines the location of the pitches on the page of sheet music. As the user moves through the page, tablet computer 301 is able to determine when the user is nearing the end of the current page of sheet music. Therefore, when the application executing on tablet computer 301 determines that the suffix tree indicates that the location of the latest identified pitch is at the end of the page of sheet music, graphical display 302 displays a subsequent page of sheet music.
[0034] Advantageously, the user playing the musical instrument may view the subsequent page of sheet music without having to interrupt the performance of the song with a physical motion or audio command instructing tablet computer 301 to turn to the next page, such as by inputting additional noises into microphone 303 or touching graphical display 302. Although demonstrated in the example of Figure 3 as applying to a tablet computer, it should be understood that similar operations may also be employed to automatically turn a page for sheet music on other types of electronic computing devices.
[0035] Figure 4 illustrates a computing environment 400 to autonomously turn a page for sheet music on an electronic computing device according to an implementation. Computing environment 400 includes computing system 401 comprising application 402, audio input 410, A/D converter 420, music processor 430, and sheet music display 440.
[0036] Referring to Figure 5, sheet music display 440 graphically displays an initial page of sheet music to a user. The initial page of sheet music may be previously downloaded into computing system 401 or currently be streaming from an external computing system as executed by application 402. The digital sheet music may be received and stored in various digital formats, such as the MusicXML format.
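Assuming a MusicXML source, the pitch names on a page could be extracted along these lines. This is a minimal sketch using Python's standard `xml.etree.ElementTree`; real MusicXML documents carry parts, measures, durations, and other structure that is deliberately ignored here, and the function name is an illustrative assumption:

```python
import xml.etree.ElementTree as ET

def pitches_from_musicxml(xml_text):
    """Extract pitch names in scientific notation (e.g. 'C4', 'F#4') from a
    MusicXML string, in document order. MusicXML encodes each pitch as a
    <pitch> element with <step>, optional <alter>, and <octave> children."""
    root = ET.fromstring(xml_text)
    names = []
    for pitch in root.iter("pitch"):
        step = pitch.findtext("step")
        octave = pitch.findtext("octave")
        alter = pitch.findtext("alter")           # "1" = sharp, "-1" = flat
        suffix = {"1": "#", "-1": "b"}.get(alter, "")
        names.append(f"{step}{suffix}{octave}")
    return names
```

The resulting list would serve as the stored sequence of pitches against which identified pitches are matched.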
[0037] In a next operation, audio input 410 receives inputted audio data. For example, audio data created by playing a musical instrument may be inputted into computing system 401 via audio input 410, such as a microphone. The audio data is then converted in A/D converter 420 and transferred to music processor 430. Music processor 430 then identifies pitches from the inputted audio data. In one implementation, the sound from the musical instrument is picked up by a microphone and quantized with A/D converter 420. The quantized sound, in the form of a stream of binary data, is then processed with a Short-Time Fourier Transform (STFT) to derive the spectrum of the music being played. From the spectrum, the pitches played during the duration of the Fourier transform are extracted. The system then repeats the pitch extraction on a window at an interval delta-t following and overlapping the previous one. The new pitches found in the later window are the most recent ones being played.
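A minimal sketch of this STFT pipeline in Python with NumPy follows. The window length, hop size (the delta-t above), and energy threshold are illustrative choices, not values from the disclosure:

```python
import numpy as np

def spectral_peaks(samples, sample_rate, threshold):
    """Energy density spectrum of one analysis window: apply a Hann window,
    take the FFT, and return the frequencies whose energy exceeds the
    threshold (candidate pitch frequencies)."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples)))) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[spectrum > threshold]

def stft_windows(samples, window, hop):
    """Successive overlapping analysis windows: each window advances by
    `hop` samples and overlaps the previous one by window - hop samples."""
    for start in range(0, len(samples) - window + 1, hop):
        yield samples[start:start + window]
```

Pitches newly present in a later window relative to the previous one would then be treated as the most recently played notes.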
[0038] Music processor 430 then builds branches of the suffix tree and matches the identified pitches on the suffix tree with a known sequence of pitches associated with the music audio file, such as a MusicXML file. Music processor 430 may then determine the location of the pitches on the page of sheet music. In one implementation, the pitches are internally organized on the suffix tree and the pitches derived from the STFT are used to match the edges of the tree to determine the likely position on the current page of the sheet music.
[0039] This process continues until it is determined that the end of the page is reached.
In a final operation, when music processor 430 determines that the suffix tree indicates that the location is at the end of the page of sheet music, music processor 430 directs sheet music display 440 to display a subsequent page of sheet music.
[0040] Figure 6 illustrates an exemplary suffix tree 600 to determine the location of the identified pitch on a page of sheet music according to an implementation. As can be seen in Figure 6, each pitch which is identified from the inputted audio data may be added to a branch on suffix tree 600. For example, C5 may be identified as a pitch from the inputted audio data. C5 may then be added to branch 1. Next, C4 may be identified from the inputted audio data. C4 would then be added to branch 1. Additionally, new branch 2 may be created.
[0041] In response to determining G4 as the next pitch, branches 1 and 2 may be advanced and new branch 3 may be added. Next pitches E4/E5 are identified. This results in the addition of branches 4 and 6 and the advancement of branches 1-3. The application may then determine that the next pitch is identified to be G5. At this point, branch 7 is split off of branch 2, branch 8 is split off of branch 3, branch 9 is split off of branch 4, and branches 1 and 6 are removed from the suffix tree. Finally, in response to receiving pitches D4/B4, the suffix tree removes branches 7, 8, 9, and 10 and adds branch 11.
[0042] Figure 7 illustrates exemplary table 700 to determine the location of the identified pitch on a page of sheet music according to an implementation. The most prevalent tuning systems in contemporary world music divide an octave into 12 pitches. These tuning systems employ different methods of dividing the frequency range of an octave and result in frequency differences among the nominally equivalent pitches. The differences in frequency of the pitches, however, are audibly discernible only by trained tuning technicians, musicians, or individuals with a highly sensitive sense of hearing. The resulting frequency differences from the prevalent tuning systems are minute. Therefore, an application running on a computing system may use the frequencies from the standard 12-tone equal temperament tuning system with a small tolerance to pick out the music pitches.
[0043] Table 700 in Figure 7 lists the frequencies of the pitches in the twelve-tone equal temperament tuning. Scientific pitch notation is used for the pitches of the table entries. For example, C4 and A4 in cells of Table 700 are the pitches for middle C and A440. The numbers in the parentheses are the Musical Instrument Digital Interface (MIDI) numbers corresponding to the pitches. For the purpose of illustrating the algorithms of a music data processor, all pitches are written in scientific pitch notation for readability. MIDI numbers, on the other hand, lend themselves to use as the internal representation of the pitches in an implementation.
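The entries of a table like Table 700 need not be stored verbatim, since twelve-tone equal temperament frequencies follow directly from the MIDI number: f = 440 * 2 ** ((m - 69) / 12), with A4 anchored at MIDI 69 and 440 Hz. A sketch of that computation:

```python
def midi_to_frequency(midi_number):
    """Twelve-tone equal temperament: adjacent pitches differ by a factor of
    2 ** (1 / 12), anchored at A4 (MIDI number 69) = 440 Hz."""
    return 440.0 * 2.0 ** ((midi_number - 69) / 12.0)
```

For example, middle C (C4, MIDI 60) comes out at approximately 261.63 Hz, matching the standard equal temperament value.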
[0044] Figure 8 illustrates exemplary acoustic signal graph 800 according to an implementation. As illustrated in Figure 8, the sound wave encompasses the last one second of music of a first measure of music. Identification of pitches in the music being played is achieved by examining the computed energy density spectrum of the incoming audio data. Referring to Figure 9, energy density spectrum graph 900 corresponding to acoustic signal graph 800 illustrated in Figure 8 is presented. The energy density spectrum is computed with an FFT on the samples from the last time interval T. The frequencies with spectral density greater than a predetermined level are compared against the nominal pitch frequencies using Table 700 from Figure 7. For example, between the two sound segments, two pitches may be identified as D4 and B4.

[0045] A pitch is affirmed if the frequency of a spectral peak falls within the tolerance of the nominal frequency. The set of all positively identified pitches is then passed on to the next functional stage for positioning them on the active sheet music page. The pitch identification process repeats with a period Δt, Δt ≤ T, where T is the interval over which the samples are computed for the energy density spectrum. In other words, the two adjacent windows of samples overlap by the duration T - Δt. The time shift by Δt allows successive spectra to be compared and thus the arrival of new pitches to be readily detected. Shaded area 810 may be shifted by Δt to a sound wave window which ends 0.1 seconds into the second measure.
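The affirmation step can be sketched as follows: each spectral peak is snapped to the nearest equal temperament pitch and affirmed only if it lies within the tolerance of that pitch's nominal frequency. The tolerance here is expressed in cents, which is an illustrative choice; the function name and default value are assumptions:

```python
import math

def affirm_pitches(peak_freqs, tolerance_cents=50.0):
    """Map each spectral peak frequency to the nearest equal-temperament
    pitch and affirm it only if the peak lies within the tolerance (in
    cents) of the nominal frequency. Returns scientific pitch names."""
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    affirmed = []
    for f in peak_freqs:
        midi = round(69 + 12 * math.log2(f / 440.0))       # nearest MIDI number
        nominal = 440.0 * 2.0 ** ((midi - 69) / 12.0)      # nominal frequency
        if abs(1200.0 * math.log2(f / nominal)) <= tolerance_cents:
            affirmed.append(f"{names[midi % 12]}{midi // 12 - 1}")
    return affirmed
```

With the peaks of the example above, the two affirmed pitches would be D4 (approximately 293.66 Hz) and B4 (approximately 493.88 Hz).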
[0046] Referring now to Figure 10, a block diagram that illustrates computing system
1000 in an exemplary implementation is shown. Computing system 1000 provides an example of computing system 101 and tablet computer 301, or any computing system that may be used to function as shown and described herein, although such systems could use alternative configurations. Computing system 1000 comprises audio user interface 1001, graphical user interface 1002, and processing system 1003. Processing system 1003 is linked to audio user interface 1001 and graphical user interface 1002. Processing system 1003 includes processing circuitry 1004 and memory device 1005 that stores operating software 1006. Computing system 1000 may include other well-known components such as a battery and enclosure that are not shown for clarity.
[0047] Computing system 1000 may be representative of any computing apparatus, system, or systems on which an application or variations thereof may be suitably
implemented. Computing system 1000 may reside in a single device or may be distributed across multiple devices. Examples of computing system 1000 include mobile computing devices, such as cell phones, tablet computers, laptop computers, notebook computers, and gaming devices, as well as any other type of mobile computing devices and any combination or variation thereof. Note that the features and functionality of computing system 1000 may apply as well to desktop computers, server computers, and virtual machines, as well as any other type of computing system, variation, or combination thereof.
[0048] Referring still to Figure 10, processing system 1003 is operatively coupled with memory device 1005, audio user interface 1001, and graphical user interface 1002.
Processing system 1003 loads and executes software 1006 from memory device 1005.
Processing system 1003 may comprise a microprocessor and other circuitry that retrieves and executes software 1006 from memory device 1005. Processing system 1003 may be implemented within a single processing device but may also be distributed across multiple processing devices or sub-systems that cooperate in executing program instructions.
Examples of processing system 1003 include general purpose central processing units, application specific processors, and logic devices, as well as any other type of processing device, combinations, or variations thereof.
[0049] Memory device 1005 may comprise any computer readable media or storage media readable by processing system 1003 and capable of storing software 1006. Memory device 1005 may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Memory device 1005 may be implemented as a single storage device but may also be implemented across multiple storage devices or sub-systems co-located or distributed relative to each other. Memory device 1005 may comprise additional elements, such as a controller, capable of communicating with processing system 1003. Examples of storage media include random-access memory, read-only memory, magnetic disks, optical disks, flash memory, virtual memory and non-virtual memory, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and that may be accessed by an instruction execution system, as well as any combination or variation thereof, or any other type of storage media. In no case is the storage media a propagated signal.
[0050] In operation, in conjunction with audio user interface 1001 and graphical user interface 1002, processing system 1003 loads and executes portions of software 1006, such as application modules 1007-1010, to operate as described herein or variations thereof.
Software 1006 may be implemented in program instructions and among other functions may, when executed by computing system 1000 in general or processing system 1003 in particular, direct computing system 1000 or processing system 1003 to operate as described herein or variations thereof. Software 1006 may include additional processes, programs, or components, such as operating system software or other application software. Software 1006 may also comprise firmware or some other form of machine-readable processing instructions executable by processing system 1003.
[0051] In general, software 1006 may, when loaded into processing system 1003 and executed, transform computing system 1000 overall from a general-purpose computing system into a special-purpose computing system customized to operate as described herein for each implementation or variations thereof. For example, encoding software 1006 on memory device 1005 may transform the physical structure of memory device 1005. The specific transformation of the physical structure may depend on various factors in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the storage media of memory device 1005 and whether the computer-readable storage media are characterized as primary or secondary storage.
[0052] In some examples, if the computer-readable storage media are implemented as semiconductor-based memory, software 1006 may transform the physical state of the semiconductor memory when the program is encoded therein. For example, software 1006 may transform the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory. A similar transformation may occur with respect to magnetic or optical media. Other transformations of physical media are possible without departing from the scope of the present description, with the foregoing examples provided only to facilitate this discussion.

[0053] It should be understood that computing system 1000 is generally intended to represent a computing system with which software 1006 is deployed and executed in order to implement application modules 1007-1010 to operate as described herein for each implementation (and variations thereof). However, computing system 1000 may also represent any computing system on which software 1006 may be staged and from where software 1006 may be distributed, transported, downloaded, or otherwise provided to yet another computing system for deployment and execution, or yet additional distribution. For example, computing system 1000 could be configured to deploy software 1006 over the internet to one or more client computing systems for execution thereon, such as in a cloud-based deployment scenario.

[0054] Audio user interface 1001 and graphical user interface 1002 may include a voice input device, a touch input device for receiving a gesture from a user, a motion input device for detecting non-touch gestures and other motions by a user, and other comparable input devices and associated processing elements capable of receiving user input from a user. Output devices such as a display, speakers, haptic devices, and other types of output devices may also be included in audio user interface 1001 and graphical user interface 1002.
In some examples, graphical user interface 1002 could include a touch screen capable of displaying a graphical user interface that also accepts user inputs via touches on its surface. The aforementioned user input devices are well known in the art and need not be discussed at length here. Audio user interface 1001 and graphical user interface 1002 may also each include associated user interface software executable by processing system 1003 in support of the various user input and output devices discussed above. Separately or in conjunction with each other and other hardware and software elements, the user interface software and devices may provide a graphical user interface, a natural user interface, or any other kind of user interface.

[0055] Although not shown, computing system 1000 may also include a
communication interface and other communication connections and devices that allow for communication between computing system 1000 and other computing systems (not shown) or services. Examples of connections and devices that together allow for inter-system communication may include network interface cards, antennas, power amplifiers, RF circuitry, transceivers, and other communication circuitry. The aforementioned network, connections, and devices are well known and need not be discussed at length here.
[0056] The functional block diagrams, operational sequences, and flow diagrams provided in the Figures are representative of exemplary architectures, environments, and methodologies for performing novel aspects of the disclosure. While, for purposes of simplicity of explanation, methods included herein may be in the form of a functional diagram, operational sequence, or flow diagram, and may be described as a series of acts, it is to be understood and appreciated that the methods are not limited by the order of acts, as some acts may, in accordance therewith, occur in a different order and/or concurrently with other acts from that shown and described herein. For example, those skilled in the art will understand and appreciate that a method could alternatively be represented as a series of interrelated states or events, such as in a state diagram. Moreover, not all acts illustrated in a methodology may be required for a novel implementation.

Claims

What is claimed is:
1. A computer apparatus comprising:
one or more computer readable storage media;
one or more processors operatively coupled to the one or more computer readable storage media; and
an application program comprising instructions stored on the one or more computer readable storage media that, when executed by the one or more processors, direct the one or more processors to at least:
display a music sheet page of a music sheet file on a user interface to the application, wherein the music sheet file comprises a sequence of pitches associated with each of a plurality of music sheet pages of the music sheet file;
identify a pitch from inputted audio data and responsively add the identified pitch to one or more branches of a suffix tree associated with the music sheet page;
compare each of the one or more branches of the suffix tree to a sequence of pitches associated with the music sheet page to determine a location of the identified pitch in the sequence of pitches associated with the music sheet page; and
when the location of the identified pitch is determined to be at an end of the sequence of pitches, display a subsequent music sheet page of the music sheet file on the user interface.
2. The computer apparatus of claim 1 wherein to identify the pitch from inputted audio data, the program instructions direct the one or more processors to determine an energy density spectrum of the inputted audio data.
3. The computer apparatus of claims 1-2 wherein the program instructions further direct the one or more processors to identify the pitch based on the inputted audio data having a frequency with a spectral density within a tolerance level of a nominal frequency.
4. The computer apparatus of claims 1-3 wherein the program instructions further direct the one or more processors to calibrate the inputted audio data with each pitch of the sequence of pitches associated with each of the music sheet pages of the music sheet file.
5. The computer apparatus of claims 1-4 wherein the program instructions further direct the one or more processors to remove one or more branches of the suffix tree when the identified pitch added to the one or more branches does not match a next pitch in the sequence of pitches associated with the music sheet page.
6. The computer apparatus of claims 1-5 wherein the program instructions further direct the one or more processors to add a new branch to the suffix tree when the identified pitch added to the one or more branches does not match a next pitch in the sequence of pitches associated with the music sheet page.
7. The computer apparatus of claims 1-6 wherein the program instructions further direct the one or more processors to split one or more branches of the suffix tree when the identified pitch added to the one or more branches matches one or more locations in the sequence of pitches associated with the music sheet page.
8. The computer apparatus of claims 1-7 wherein the music sheet file comprises a Music Extensible Markup Language (MusicXML) file.
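The pitch-identification steps recited in claims 2 and 3 (determining an energy density spectrum and matching a frequency against a nominal frequency within a tolerance level) can be illustrated with a minimal sketch. The pitch table, tolerance value, and function names below are illustrative assumptions, not part of the claims:

```python
import numpy as np

# Hypothetical subset of nominal equal-temperament frequencies, in Hz.
NOMINAL_PITCHES = {"A4": 440.0, "B4": 493.88, "C5": 523.25, "E4": 329.63}

def identify_pitch(samples, sample_rate, tolerance=0.03):
    """Return the pitch whose nominal frequency lies within a relative
    tolerance of the strongest spectral peak, or None if no pitch matches."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2            # energy density spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak_freq = freqs[np.argmax(spectrum)]                  # dominant frequency
    for name, nominal in NOMINAL_PITCHES.items():
        if abs(peak_freq - nominal) <= tolerance * nominal:
            return name
    return None
```

For a one-second 440 Hz sine tone sampled at 8 kHz, the dominant spectral bin falls on 440 Hz and the sketch reports A4; audio whose peak lies outside every tolerance window yields no pitch.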
9. A method of operating a computing system to autonomously turn a page of sheet music, the method comprising:
displaying a music sheet page of a music sheet file on a user interface of an application, wherein the music sheet file comprises a sequence of pitches associated with each of a plurality of music sheet pages of the music sheet file;
identifying a pitch from inputted audio data and responsively adding the identified pitch to one or more branches of a suffix tree associated with the music sheet page;
comparing each of the one or more branches of the suffix tree to a sequence of pitches associated with the music sheet page to determine a location of the identified pitch in the sequence of pitches associated with the music sheet page; and
when the location of the identified pitch is determined to be at an end of the sequence of pitches, displaying a subsequent music sheet page of the music sheet file on the user interface.
10. The method of claim 9 wherein identifying the pitch from inputted audio data comprises determining an energy density spectrum of the inputted audio data.
11. The method of claims 9-10 wherein identifying the pitch from inputted audio data comprises identifying the pitch based on the inputted audio data having a frequency with a spectral density within a tolerance level of a nominal frequency.
12. The method of claims 9-11 further comprising calibrating the inputted audio data with each pitch of the sequence of pitches associated with each of the music sheet pages of the music sheet file.
13. The method of claims 9-12 further comprising removing one or more branches of the suffix tree when the identified pitch added to the one or more branches does not match a next pitch in the sequence of pitches associated with the music sheet page.
14. The method of claims 9-13 further comprising adding a new branch to the suffix tree when the identified pitch added to the one or more branches does not match a next pitch in the sequence of pitches associated with the music sheet page.
15. The method of claims 9-14 further comprising splitting one or more branches of the suffix tree when the identified pitch added to the one or more branches matches one or more locations in the sequence of pitches associated with the music sheet page.
16. The method of claims 9-15 wherein the music sheet file comprises a Music Extensible Markup Language (MusicXML) file.
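The branch operations recited in claims 13-15 (removing branches on a mismatch, adding a new branch, and splitting when a pitch matches multiple locations) can be sketched by representing each branch as a candidate position in the page's pitch sequence. This is a simplification of the claimed suffix tree, under assumed data shapes; all names are illustrative:

```python
def advance_branches(branches, page_pitches, identified_pitch):
    """Advance candidate match positions ('branches') through the page's
    pitch sequence. A branch is the index of the next expected pitch.
    Branches whose expected pitch does not match the identified pitch are
    dropped (removal); every location in the sequence where the identified
    pitch occurs seeds a fresh candidate (adding/splitting branches)."""
    survivors = set()
    for pos in branches:
        if pos < len(page_pitches) and page_pitches[pos] == identified_pitch:
            survivors.add(pos + 1)       # branch advances past the match
    for i, p in enumerate(page_pitches):
        if p == identified_pitch:
            survivors.add(i + 1)         # new branch at each occurrence
    return survivors

def at_page_end(branches, page_pitches):
    # Some branch has consumed the entire pitch sequence for the page.
    return len(page_pitches) in branches
```

Feeding the pitches C, D, G against a page sequence C D E C D G keeps two candidate branches alive after C D (positions 2 and 5), then prunes the branch that expects E, leaving only the branch that reaches the end of the page.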
17. One or more computer readable storage media having program instructions stored thereon, wherein the program instructions, when executed by a processing system, direct the processing system to at least:
display an initial page of sheet music on a user interface of an application;
receive inputted audio data and responsively identify pitches from the inputted audio data;
match the identified pitches with pitches on a suffix tree to determine the location of the pitches on the page of sheet music; and
when the location is at the end of the page of sheet music, display a subsequent page of sheet music on the user interface.
18. The computer readable storage media of claim 17 wherein, to identify the pitches from the inputted audio data, the program instructions direct the processing system to determine an energy density spectrum of the inputted audio data.
19. The computer readable storage media of claims 17-18 wherein the program instructions further direct the processing system to identify a pitch based on the inputted audio data having a frequency with a spectral density within a tolerance level of a nominal frequency.
20. The computer readable storage media of claims 17-19 wherein, to match the identified pitches with pitches on the suffix tree to determine the location of the pitches on the page of sheet music, the program instructions further direct the processing system to at least remove one or more branches of the suffix tree or add a new branch to the suffix tree when an identified pitch does not match a next pitch in a sequence of pitches associated with the page of sheet music.
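Claims 17-20 together describe a loop: display a page, identify pitches from audio, track the location, and turn the page when the location reaches the end. The self-contained sketch below illustrates that control flow; for brevity it replaces the claimed suffix-tree matching with a single expected-position cursor, and all names and callback shapes are assumptions:

```python
def run_page_turner(pages, pitch_stream, display):
    """Drive page turns from a stream of identified pitches. 'pages' is a
    list of per-page pitch sequences, 'display' is a callback that shows a
    page index. A cursor advances only on the next expected pitch; when it
    reaches the end of the page's sequence, the next page is displayed."""
    page_index, cursor = 0, 0
    display(page_index)                      # show the initial page
    for pitch in pitch_stream:
        expected = pages[page_index]
        if cursor < len(expected) and pitch == expected[cursor]:
            cursor += 1                      # pitch matched: advance location
        if cursor == len(expected):          # end of page: turn it
            page_index += 1
            cursor = 0
            if page_index < len(pages):
                display(page_index)
            else:
                break                        # no more pages to show
    return page_index
```

With two pages of pitches C E and G A and the played stream C, E, G, A, the sketch displays page 0, turns to page 1 after E, and stops after A.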
PCT/US2018/038440 2017-06-21 2018-06-20 Autonomous page-turner for sheet music WO2018236962A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762522982P 2017-06-21 2017-06-21
US62/522,982 2017-06-21

Publications (1)

Publication Number Publication Date
WO2018236962A1 true WO2018236962A1 (en) 2018-12-27

Family

ID=62986165

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/038440 WO2018236962A1 (en) 2017-06-21 2018-06-20 Autonomous page-turner for sheet music

Country Status (1)

Country Link
WO (1) WO2018236962A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111554250A (en) * 2020-04-26 2020-08-18 苏州缪斯谈谈科技有限公司 Automatic music score turning method, system, electronic equipment and computer readable storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
US20050081701A1 (en) * 2003-10-15 2005-04-21 Sunplus Technology Co., Ltd. Electronic musical score device
US20080156176A1 (en) * 2004-07-08 2008-07-03 Jonas Edlund System For Generating Music
US20150348523A1 (en) * 2014-05-27 2015-12-03 Terrence Bisnauth Musical Score Display Device and Accessory Therefor

Similar Documents

Publication Publication Date Title
US10770050B2 (en) Audio data processing method and apparatus
EP3047478B1 (en) Combining audio samples by automatically adjusting sample characteristics
EP3047479B1 (en) Automatically expanding sets of audio samples
US8309834B2 (en) Polyphonic note detection
US20140129235A1 (en) Audio tracker apparatus
US8392006B2 (en) Detecting if an audio stream is monophonic or polyphonic
US9852721B2 (en) Musical analysis platform
US11568244B2 (en) Information processing method and apparatus
US20170090860A1 (en) Musical analysis platform
US8614388B2 (en) System and method for generating customized chords
JP2023071787A (en) Method and apparatus for extracting pitch-independent timbre attribute from medium signal
CN109189975B (en) Music playing method and device, computer equipment and readable storage medium
CN114842820A (en) K song audio processing method and device and computer readable storage medium
WO2018236962A1 (en) Autonomous page-turner for sheet music
CN112309409A (en) Audio correction method and related device
US20160277864A1 (en) Waveform Display Control of Visual Characteristics
US20230260531A1 Intelligent audio processing
US10127897B2 (en) Key transposition
CN112992110A (en) Audio processing method, device, computing equipment and medium
US10832678B2 (en) Filtering audio-based interference from voice commands using interference information
US10811007B2 (en) Filtering audio-based interference from voice commands using natural language processing
CN113646756A (en) Information processing apparatus, method, and program
KR20190093268A (en) Method for controlling device and device thereof
JP6680029B2 (en) Acoustic processing method and acoustic processing apparatus
Kursa et al. Multi-label ferns for efficient recognition of musical instruments in recordings

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 18743887
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 18743887
Country of ref document: EP
Kind code of ref document: A1