WO2017037342A1 - System for teaching a user to play a musical instrument from musical notation via virtual exercises and a method thereof - Google Patents


Info

Publication number
WO2017037342A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
fingering
instrument
graphical
note
Prior art date
Application number
PCT/FI2016/050598
Other languages
French (fr)
Inventor
Senna LEHTONEN
Leena Lehtonen
Taina VÄNNI
Visa VÄNNI
Original Assignee
Pianorobot Oy
Priority date
Filing date
Publication date
Application filed by Pianorobot Oy filed Critical Pianorobot Oy
Publication of WO2017037342A1 publication Critical patent/WO2017037342A1/en

Classifications

    • G PHYSICS
      • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
        • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
          • G09B15/00 Teaching music
            • G09B15/02 Boards or like means for providing an indication of notes
              • G09B15/023 Electrically operated
            • G09B15/001 Boards or like means for providing an indication of chords
              • G09B15/002 Electrically operated systems
                • G09B15/003 Electrically operated systems with indication of the keys or strings to be played on instruments
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V40/20 Movements or behaviour, e.g. gesture recognition
              • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
      • G10 MUSICAL INSTRUMENTS; ACOUSTICS
        • G10G REPRESENTATION OF MUSIC; RECORDING MUSIC IN NOTATION FORM; ACCESSORIES FOR MUSIC OR MUSICAL INSTRUMENTS NOT OTHERWISE PROVIDED FOR, e.g. SUPPORTS
          • G10G1/00 Means for the representation of music
            • G10G1/02 Chord or note indicators, fixed or adjustable, for keyboard of fingerboards
        • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
          • G10H1/00 Details of electrophonic musical instruments
            • G10H1/0008 Associated control or indicating means
              • G10H1/0016 Means for indicating which keys, frets or strings are to be actuated, e.g. using lights or LEDs
          • G10H2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
            • G10H2210/031 Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
              • G10H2210/066 Musical analysis for pitch analysis as part of wider processing for musical purposes, e.g. transcription, musical performance evaluation; Pitch recognition, e.g. in polyphonic sounds; Estimation or use of missing fundamental
              • G10H2210/091 Musical analysis for performance evaluation, i.e. judging, grading or scoring the musical qualities or faithfulness of a performance, e.g. with respect to pitch, tempo or other timings of a reference performance
          • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
            • G10H2220/005 Non-interactive screen display of musical or status data
            • G10H2220/021 Indicator, i.e. non-screen output user interfacing, e.g. visual or tactile instrument status or guidance information using lights, LEDs, seven segments displays
              • G10H2220/026 Indicator associated with a key or other user input device, e.g. key indicator lights
                • G10H2220/041 Remote key fingering indicator, i.e. fingering shown on a display separate from the instrument itself or substantially disjoint from the keys
            • G10H2220/155 User input interfaces for electrophonic musical instruments
              • G10H2220/441 Image sensing, i.e. capturing images or optical patterns for musical purposes or musical control purposes
                • G10H2220/455 Camera input, e.g. analyzing pictures from a video camera and using the analysis results as control data
    • A HUMAN NECESSITIES
      • A63 SPORTS; GAMES; AMUSEMENTS
        • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F13/20 Input arrangements for video game devices
              • A63F13/21 Input arrangements characterised by their sensors, purposes or types
                • A63F13/213 Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
                • A63F13/215 Input arrangements comprising means for detecting acoustic signals, e.g. using a microphone
            • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
              • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
                • A63F13/428 Processing input control signals involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
            • A63F13/45 Controlling the progress of the video game
              • A63F13/46 Computing the game score
            • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
              • A63F13/67 Generating or modifying game content adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
            • A63F13/80 Special adaptations for executing a specific game genre or game mode
              • A63F13/814 Musical performances, e.g. by evaluating the player's ability to follow a notation

Definitions

  • The present invention relates to electronic arrangements and computing systems. Particularly, however not exclusively, the invention pertains to audio recognition techniques and music teaching methods.
  • Traditional approaches include using learning material, such as teaching videos or books. However, these largely lack interaction and customized teaching and learning as the user progresses.
  • Some other solutions comprise online lessons with teachers who teach in real time with the help of webcams and microphones. However, these are essentially extensions of traditional face-to-face teaching utilizing virtual means and therefore lack on-demand capability and virtual teaching feedback and teaching means.
  • The objective of the embodiments of the present invention is to at least alleviate the aforementioned drawbacks evident in prior art arrangements, particularly in the context of teaching music.
  • The objective is generally achieved with a system and a corresponding method in accordance with the present invention.
  • One of the present invention's main advantages is that it teaches the user to read actual musical notation, in contrast to tablature, color-coded keys, etc., which make reading music essentially easier in the short run but ignore many of musical notation's important aspects and in effect only teach the user to play an instrument by following patterns or the like. It is well known that knowing how to read musical notation is essential for playing any song in the world. Playing instruments and understanding musical theory also serve as the basis for improvisation, band playing and writing music.
  • Yet another advantage of the present invention is that it tracks the user's performance (scores) and development, and correspondingly determines further exercises as the user progresses, intelligently learning from the user's mistakes. Hence, the system follows the user's development like an actual teacher would and may provide the user with exercises in the areas where the user still has learning to do.
  • The present invention is arranged to present the user with notes or sets of notes (chords or intervals) one by one, such that it is easy to receive immediate feedback on whether the note or the set of notes was played correctly. Further, the present invention tracks the success rate of the notes that the user has produced (correctly and incorrectly) and determines which notes and/or intervals the user has problems with, so that the system carrying out the invention may present those notes to the user again. Consequently, the system learns which notes the user has problems with and provides exercises so that the user practices them more.
  • A system for teaching a user to play a musical instrument from musical notation via virtual exercises, comprising:
  • - at least one electronic device comprising at least a processing entity, a memory entity and a display,
  • - means for forming or receiving at least one play signal produced by a user on a musical instrument,
  • - the processing entity being arranged to provide at least graphical musical notation content, such as a single note or a chord, via the display,
  • - the processing entity being further arranged to obtain at least play signal data via said means for forming or receiving at least one play signal,
  • - the processing entity being further arranged to execute audio recognition to recognize the play signal data and compare it with data relating to the presented graphical content to at least determine whether the play signal corresponds to said graphical musical notation content,
  • - the processing entity being further arranged to assign a score representing the result of said comparison and store said score,
  • - the processing entity being further arranged to present the score to the user via the display.
  • The term "play signal" is herein used to refer to an electric signal (optionally digital) or an audio signal representing or corresponding to at least one note, i.e. at least the pitch and optionally the length of a sound.
  • The play signal may be produced by an instrument, voice or MIDI (Musical Instrument Digital Interface) and may be analog (originally acoustic or electric) or digital. After the signal is obtained, it is recognized and converted into a digital signal.
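Recognizing a play signal ultimately means mapping it to note information such as pitch. For an audio-derived fundamental frequency this is a standard conversion (A4 = 440 Hz = MIDI note 69); the helper names below are illustrative, not from the patent.

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_midi(freq_hz):
    """Nearest MIDI note number for a detected fundamental frequency."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_name(midi_note):
    """Scientific pitch name, e.g. 60 -> 'C4'."""
    return NOTE_NAMES[midi_note % 12] + str(midi_note // 12 - 1)
```

With this mapping, an acoustic input (via pitch detection) and a MIDI input both reduce to the same note representation that can be compared against the presented notation.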
  • The musical notation pertains to at least one note, optionally with a duration, presented in relation to a staff.
  • The musical notation may be a single note whose pitch is stated via position, accidental(s), key signature and/or a clef.
  • The musical notation may also comprise a plurality of notes, preferably played simultaneously as a chord.
  • The musical notation preferably presents the user with one note or chord at a time, to which the user gives their input by playing an instrument, after which the user is provided with new graphical content, such as another note or chord, and feedback on whether the user was able to produce the right note(s) on their instrument.
  • The score is data comprising such information as whether the user succeeded in producing the presented graphical content, such as a correct note or hand position, the time needed to produce the play signal, etc.
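A minimal record carrying the score information listed above might look like the following sketch; the field names are illustrative assumptions.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ScoreRecord:
    """One comparison result: whether the presented content was
    reproduced, and how long the user took to produce the play signal."""
    expected: str            # presented graphical content, e.g. "C4"
    played: str              # recognized play signal, e.g. "C4"
    response_time: float     # seconds from presentation to play signal
    timestamp: float = field(default_factory=time.time)

    @property
    def correct(self) -> bool:
        return self.played == self.expected
```

Storing records in this form keeps both the pass/fail outcome and the timing available for later statistics and exercise selection.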
  • The graphical content of the system also comprises note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument.
  • The system comprises image acquisition means for acquiring image data.
  • The image data may be image and/or video of the user's note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument.
  • The system comprises a wearable device, such as a hand-wearable device, that can recognize the user's note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument.
  • The device may comprise processing means or be arranged to utilize the processing means of the electronic device to execute recognition of the user's note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument, e.g. on the basis of the location of the wearable device or reference points thereof in relation to the instrument.
  • A wearable device as well as the image acquisition means may also be used for recognizing the user's body posture in relation to the instrument.
  • The processing entity is further arranged to execute image recognition for determining the user's note fingering, interval fingering, chord fingering or hand posture in relation to the instrument from the acquired image data, and to compare said determined fingering or posture with data relating to the presented graphical content to at least determine whether it corresponds with the correct note fingering, interval fingering, chord fingering or hand posture associated with the provided graphical musical notation content.
  • The processing entity may further be arranged to assign a score representing the result of said comparison involving image data and store said score.
  • The processing entity may also be arranged to present the score relating to image data to the user via the display.
  • The processing entity may further be arranged to determine, at least on the basis of said score relating to image data and other scores, further graphical musical notation content to be presented to the user via the display.
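Once image recognition has produced a fingering estimate, the comparison against the fingering associated with the presented notation is straightforward. The sketch below assumes fingering is represented as a mapping from finger number to key; this representation and the function name are illustrative, not taken from the patent.

```python
def compare_fingering(detected, expected):
    """Compare a detected fingering (finger number -> key name) against the
    fingering associated with the presented notation. Returns the fraction
    of fingers placed on the correct key, usable as a fingering score."""
    if not expected:
        return 1.0  # nothing required, trivially correct
    hits = sum(1 for finger, key in expected.items()
               if detected.get(finger) == key)
    return hits / len(expected)
```

The resulting fraction could be stored alongside the audio-based score, matching the separate image-data score described above.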
  • The system may also comprise at least one remote entity with which the electronic device may communicate at least unidirectionally.
  • The remote entity may be used as a source and/or storage of graphical content and score statistics for the system. Further, the remote entity may at least partially execute one or more of the processing entity's tasks, such as audio recognition, comparison of output (graphical content) and input (audio, etc.) data, and the determination of graphical content to be provided to the user, optionally in relation to stored scores.
  • Such a remote entity may e.g. comprise a database accessible via the Internet.
  • A method for teaching a user to play a musical instrument from musical notation via virtual exercises comprising:
  • The graphical musical notation is presented in relation to an instrument, such as presenting via the display the location of at least one note on the instrument.
  • The user is presented with graphical musical notation with corresponding instrument fingering, hand posture or body posture.
  • The user is provided with graphical content, such as advice and/or instructions, in accordance with the recognized note fingering, interval fingering, chord fingering, body posture or hand posture.
  • The method comprises determining graphical musical notation content to be presented to the user also in relation to historic score information. Such information may be obtained and/or aggregated from the user's previous results, from different users and/or from other sources.
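Selecting content from historic score information could, for instance, blend the user's own error rates with error rates aggregated from other users or sources. The following sketch and its 0.7/0.3 weighting are illustrative assumptions only.

```python
def rank_exercises(user_error_rate, aggregate_error_rate, candidates,
                   own_weight=0.7):
    """Order candidate notation items for the next exercise by a blend of
    the user's own historic error rate and error rates aggregated from
    other users/sources. Hardest (most error-prone) items come first."""
    def difficulty(item):
        own = user_error_rate.get(item, 0.0)
        agg = aggregate_error_rate.get(item, 0.0)
        return own_weight * own + (1 - own_weight) * agg
    return sorted(candidates, key=difficulty, reverse=True)
```

For a new user with no history, the ranking falls back to the aggregated statistics, which mirrors the idea of drawing on different users and sources.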
  • A computer program product embodied in a non-transitory computer readable medium, comprising computer code for causing the computer to execute the method items of the present invention.
  • The computer program product embodied in a non-transitory computer readable medium, comprising computer code for causing the computer to execute the method of the present invention, may be provided as an application program, web program or SaaS (Software as a Service) as part of the system of the present invention, in a standalone device or at least partly in a cloud server.
  • The terms "musical instrument" and "instrument" both refer to a musical instrument and are herein used interchangeably.
  • The term "exemplary" refers herein to an example or example-like feature, not the sole or only preferable option.
  • The term "user" is often used herein to refer to a user of the system and in many instances is synonymous with a player of an instrument or a person studying music utilizing the system.
  • Figure 2 illustrates an exemplary embodiment of presenting graphical content to the user in accordance with the present invention.
  • Figure 3 illustrates a use context of the system in accordance with the present invention.
  • Figure 4 illustrates the audio recognition in accordance with the present invention.
  • Figure 5 depicts feasible techniques of image recognition for determining the user's hand and finger position in relation to an instrument.
  • Figure 6 is a flow diagram illustrating one feasible embodiment of a method in accordance with the present invention.
  • Figure 1 illustrates an embodiment of the system 100 in accordance with the present invention.
  • The system comprises at least one electronic device 102, which further comprises at least a processing entity 104, a memory entity 106, a display 108, and means for forming or receiving at least one play signal 110 produced by a musical instrument 114.
  • The electronic device may also comprise an image acquisition device 112, such as a camera or similar means for obtaining image data.
  • The electronic device 102 may comprise, or at least be functionally connected with, sensors such as an IR (infrared) camera and pressure sensors.
  • The processing entity 104 is configured to at least process the play signals and execute the computer program product of the present invention. Additionally, the processing entity may be arranged to further process the comparison data and additionally store it, use it in the computer program and/or provide graphical representations of it.
  • The processing entity may comprise, e.g., at least one processing/controlling unit such as a microprocessor, a DSP (digital signal processor), a DSC (digital signal controller), a microcontroller or programmable logic chip(s), optionally comprising a plurality of co-operating or parallel (sub-)units.
  • The electronic device 102 also comprises a memory entity 106 for storing at least the play signal data and comparison data.
  • The memory entity may be divided between one or more physical memory chips and/or cards.
  • The memory entity may also comprise necessary code, e.g. in the form of a computer program/application, for enabling the control and operation of the device and provision of the related control data.
  • The memory may comprise e.g. ROM (read-only memory) or RAM-type (random access memory) implementations, such as disk storage or flash storage.
  • The memory may further comprise an advantageously detachable memory card/stick, a floppy disc, an optical disc, such as a CD-ROM, or a fixed/removable hard drive.
  • The means for forming or receiving at least one play signal 110 produced by an instrument 114 may comprise a microphone input port, a microphone, an electronic musical instrument or MIDI, which may be either external to the electronic device or integrated therein.
  • An audio amplifier may also be used therein.
  • The means are used to capture audio and data, such as electric signals representing audio or pitched acoustic sounds from instruments, and to receive and/or process said signals into play signal data to be compared with and/or stored.
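Turning captured audio into play signal data requires some form of pitch recognition. The patent does not specify an algorithm; the sketch below uses plain autocorrelation, one common basis for this kind of audio recognition, with illustrative parameter names and bounds.

```python
def estimate_pitch(samples, sample_rate, fmin=60.0, fmax=1000.0):
    """Rough fundamental-frequency estimate by autocorrelation: find the
    lag (within the fmin..fmax range) at which the signal best matches a
    shifted copy of itself, and convert that lag to a frequency in Hz."""
    n = len(samples)
    lag_min = int(sample_rate / fmax)
    lag_max = min(int(sample_rate / fmin), n - 1)
    best_lag, best_corr = lag_min, float("-inf")
    for lag in range(lag_min, lag_max + 1):
        corr = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag
```

A production system would use a more robust estimator (and handle polyphony), but the output of any such stage is the same: a fundamental frequency that can be mapped to a note and compared with the presented notation.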
  • Said means may be manually or electronically directed to capture image and/or video from a particular object, or said means may be arranged to follow the object such as moving hands on an instrument.
  • The image acquisition device 112 comprises a digital camera or a CCD (charge-coupled device), which may be chosen from the plurality of digital imaging devices capable of at least creating digital images and optionally additionally digital video.
  • The camera may be integrated into the electronic device or external to it.
  • Suitable digital cameras comprise compact cameras, DSLRs (digital single-lens reflex cameras), DSLTs (digital single-lens translucent cameras), webcams and other video cameras, as well as cameras directly integrated into mobile devices, laptops and tablet devices.
  • An IR camera may also be used for image acquisition.
  • A pressure sensor may be used to monitor how hard the user is playing the instrument (in terms of pressure via fingering), or the pressure sensor may be used to track the user's position and posture on a playing chair, such as a piano bench.
  • The pressure sensors may also be used to track pressure imposed on an electronic device interface, wherein the pressure sensor may sense the speed, pressure and tone of the playing.
  • The at least functional connection between the electronic device and the mentioned components may be provided via wires or wireless means.
  • The display 108 may comprise a touch screen, such as an essentially touch-based, contactless and/or three-dimensional touch screen, via which graphical content may be given as output, preferably via a GUI (graphical user interface).
  • The display may be used to give commands and control the software program.
  • The graphical user interface may be configured to visualize, or present as text, different data elements, indications, status information, control features, user instructions and user input indicators.
  • The system 100 may further comprise a remote entity 116, such as a cloud computing or remote server, comprising e.g. a computer program product 118 and a database 120.
  • The arrangement may be provided as SaaS, a web application or a mobile application, facilitated via a browser or similar software, or as an API (application programming interface) wherein the computer program product is essentially external but remotely accessible by the electronic device.
  • The remote entity may be arranged to process and/or analyze the data received and/or processed by the electronic device, such as the comparison data, and store it in the database and/or send it back to the electronic device.
  • The database may further comprise instructions, exercises, program modes, data collected from the device and/or stored score data.
  • The database may further be used to provide the device with updates and/or system functions.
  • The electronic device 102 may communicate with a remote entity 116 to send and/or receive data, such as comparison analysis data, stored score data and metadata (pertaining to the user, exercises, scores, etc.).
  • The remote entity may process and/or analyze said data and send it back to the electronic device. For example, the comparison of the graphical content output and the user input may be done on the remote entity, and the result of said comparison may then be communicated back to the electronic device.
  • The processing entity tasks may be divided between the electronic device and the remote entity, for example when processing external to the electronic device is preferred.
  • Such an application could include, inter alia, a browser-based web application.
  • An actual teacher may also be a part of the system and obtain user scores and related information via e.g. the remote entity.
  • The teacher may comment on or otherwise affect the feedback and exercises provided to the user.
  • The electronic device 102 may hence also comprise suitable connection means to communicate with the remote entity via e.g. WANs (wide area networks) and/or LANs (local area networks).
  • The computer program product 118 may be offered as application software to the at least one electronic device. It may hence be run by the electronic device essentially independently of any remote entity. Obviously, updates and such may be carried out by connecting to a remote entity.
  • The electronic device 102 may comprise or constitute a tablet, phablet, mobile phone, laptop, desktop or any such computing device well known in the prior art.
  • Additional elements and means, such as wireless modules (NFC, Bluetooth, WiFi, RFID, etc.), cables, conductors, electrodes, power sources, illumination devices, supports, encapsulation, fastening means, etc., well known to a person skilled in the art, may be incorporated appropriately according to various embodiments.
  • Figure 2 illustrates an exemplary embodiment of presenting graphical content to the user in accordance with the present invention.
  • Said view may be a part of program software, such as a mobile or a web application, run on an electronic device 202.
  • The user may be presented with graphical content of musical notation 204a, such as one note, and finger and hand position 204b in relation to the presented musical notation 204a, via a display 206.
  • The presented graphical content may of course, with similar considerations, comprise a representation of the correct hand posture, body posture, etc. via animations or the like. Any such graphical content that improves the user's playing technique may be presented.
  • The system may of course utilize other numeric, alphabetic, alphanumeric and graphical representations as well.
  • The representations may further be used to graphically represent the development or performance of the user.
  • Feedback and instructions may be presented to the user as descriptive rules or advice.
  • A user may be provided with instructions and examples of how to improve playing and whether the note played by the user was correct and/or correctly played (i.e. in terms of pitch, timing and/or playing technique).
  • Such representation may include images and/or video.
  • Figure 3 illustrates a use context of the system in accordance with the present invention.
• the user 304 may be for example a person studying musical notation, theory and the playing technique of an instrument by themselves at home using an embodiment of the electronic device 302 of the system.
  • the system utilizes play signals produced by the user 304 with their instrument 306.
  • the system may also utilize image or video information obtained via image acquisition means 308.
  • the system may utilize other information such as pressure as a source of input.
  • the one or more play signals may be a single pitched note or a combination of a plurality of pitched notes, optionally with the information of their timing and/or duration, produced essentially simultaneously by the user.
  • the play signal is used to refer to a note produced on an instrument, which may produce a sound and/or an electric signal representing the produced note.
  • the image or video input may be produced by a camera image of the user's grip of the instrument, note fingering, interval fingering, chord fingering, body posture or hand posture.
• the optical instrument may capture an image or video essentially simultaneously as the user produces the note and consequently the play signal.
• the image or video input may also be captured at any time during the use of the system, i.e. it may be set to be produced at intervals, essentially in real time, in response to a user command or in response to any other suitable input.
  • the system provides the user with a graphical representation of at least one musical note, which the user has to play on an instrument.
• the note may also be presented in relation to its place on a particular instrument, such as the one that the user is playing.
• the user may also be presented with textual and/or graphical information on the correct fingering, hand posture or body posture in relation to the particular instrument and optionally also in relation to the graphical musical notation.
• the user preferably utilizes an actual instrument for producing the play signal. A wide variety of such instruments is known, ranging from the piano and keyboard instruments in general to string instruments, brass and woodwind instruments, and other band and orchestra instruments as appropriate.
• sung notes, and hence the human voice, are also regarded as instruments herein and work well with the system.
  • devices and arrangements with instrument-like virtual or physical interfaces, such as gaming controllers, instrument software and other such arrangements for producing electric signals or acoustic sounds may be used.
  • the system then receives the play signal and determines whether the pitch of the play signal essentially corresponds to the musical notation or whether they differ. The system may then indicate this to the user and store the result of the comparison.
• the system may optionally recognize the pitch represented by the play signal to be out of tune and advise tuning of the instrument or the correct fingering and playing of the note.
• the system may also be able to determine if the play signal was close enough to the note, e.g. so that the note was probably specifically intended by the user rather than either of the two notes a semitone away from it. If the play signal produced by the user is the same as the one presented as graphical content on the system display, the user is marked with a score of one correct note, and additional information, e.g. time and date, time taken by the user to produce the note, the result of the comparison (right or wrong note), etc., may be stored.
• the additional information may be stored cumulatively and processed to elicit and determine the notes, chords or exercises that the user has had the most failed attempts at or difficulties with (in accordance with the amount of time taken to produce the note and/or the failure rate of particular notes, chords and exercises). Similarly said information may be compiled into score data for further provision, or presented to the user or another user of the system, such as the user's personal teacher.
• the system may then either provide the user with another graphical musical notation in accordance with predetermined rules of e.g. an exercise (program mode), or in accordance with the notations that the user has had the most failed attempts at and/or the longest answer times, as mentioned hereinbefore.
• the stored or historic score data may be used to direct the exercises and offer the user teaching within the areas where they have improved less.
• the next presented graphical content may be in accordance with the one or more previous scores (successes and failures to produce notes as presented by the system).
• the system may be controlled essentially via the play signal and/or with image or video input, i.e. the system may move between the graphical musical notations in accordance with the play signals and/or the image and/or video input. This way the user need not stop playing the instrument to e.g. give commands via the electronic device UI (user interface) but instead may just play continuously throughout the changing musical notation and/or exercises.
• the system user score data (constituted by a number of scores) is created and stored on the device and/or on the remote server.
• the score data may comprise e.g. scores wherein one score is at least the success or failure to produce one note, or optionally more notes, in accordance with the system output (the presented graphical content corresponding to the note). If whole songs or musical notation exercises are played, said score data may be recorded per song or per exercise. Also the timing and playing rhythm of individual notes may be stored and used as score data.
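One possible shape for a single score entry, as a sketch only: the field names below are illustrative assumptions mirroring the items listed in the text (result of the comparison, time and date, answer time, recording per song or exercise), not part of the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Score:
    """One score entry: the outcome of one note produced against the notation.

    Field names are hypothetical; they mirror the examples given in the text.
    """
    presented_note: str            # the note shown as graphical notation
    played_note: str               # the note recognized from the play signal
    correct: bool                  # result of the comparison (right or wrong)
    answer_time_s: float           # time taken by the user to produce the note
    timestamp: datetime = field(default_factory=datetime.now)  # time and date
    exercise_id: Optional[str] = None  # set when recorded per song or exercise

# Example record: the user correctly played the presented C4 in 1.2 seconds
score = Score("C4", "C4", True, 1.2, exercise_id="c-major-scale")
```

A list of such records per user profile would then support the cumulative processing described above (failure rates, answer times, per-exercise statistics).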
• a user profile, under which the user score data is preferably stored, may be created.
• the user score data may be analyzed and used to target a user with certain exercises and personalized feedback and instructions. For example, the user may be instructed which aspects of their playing still need improvement, how advanced they are or what songs they can already attempt to play (in accordance with their skill level).
• the system may compare a user's score data with another user's data to create feedback and instructions. For example, a user may be provided with the same feedback or exercises that another user with similar score data and/or improvement areas is provided with (either by their own will or via the analysis of the system). Further, gamification aspects, such as playing collaboration or competitions among the users, may be facilitated.
  • FIG. 4 illustrates the audio recognition in accordance with the present invention.
• the audio recognition is done by detecting a play signal, produced by a user with his/her musical instrument 402, comprising information corresponding to a note. After receiving the signal it is converted into a digital signal, if needed, and recognized as either an audio signal or a data signal 404.
• For an audio signal, e.g. a signal initially produced by an acoustic musical instrument and obtained as harmonic sound via a microphone or as an electric signal via a microphone port, the system detects the most harmonic signals and their duration, via which the frequency of the sound may be determined.
• digital filtering may also be used for noise suppression around the detected frequency.
• the detected frequency of the play signal may then be compared to the correct frequency and interpreted as a note that is compared to the represented musical note 412.
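As one way to realize this frequency-to-note comparison, the sketch below (a simplified NumPy example, not the patented implementation) picks the strongest spectral peak of a clean monophonic signal, converts it to the nearest note using MIDI numbering with A4 = 440 Hz (both numbering and tuning are assumptions; the text fixes neither), and reports the deviation in cents so a correct but out-of-tune note can be flagged for tuning advice.

```python
import numpy as np

A4 = 440.0  # assumed reference tuning

def dominant_frequency(samples, sample_rate):
    """Frequency of the strongest spectral peak (the 'most harmonic' component)."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def compare_to_note(freq, target_midi):
    """Classify a detected frequency against a target note.

    Returns (matches, nearest_midi, cents): the cents deviation lets the
    system separate a wrong note from a correct but out-of-tune one.
    """
    midi = 69 + 12 * np.log2(freq / A4)   # 69 = A4 in MIDI numbering
    nearest = int(round(midi))
    cents = 100.0 * (midi - nearest)       # 100 cents = one semitone
    return nearest == target_midi, nearest, cents

# Synthetic one-second A4 sine as a stand-in for a microphone capture
sr = 44100
t = np.arange(sr) / sr
ok, nearest, cents = compare_to_note(
    dominant_frequency(np.sin(2 * np.pi * 440.0 * t), sr), target_midi=69)
```

A real implementation would need windowing, onset detection and harmonic weighting to cope with overtones and noise; this only illustrates the comparison step.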
  • the notes that the user is playing may be recognized via supervised learning.
  • the algorithm utilized by the system is taught the sounds and/or signals corresponding to the graphical representations. This way the algorithm may model the output-input dependence for correct and wrong results of comparison.
• the graphical musical notation content may be gone through and matched with play signals, e.g. by an admin, developer or user playing the notes on an instrument for each graphical musical notation, after which the user's play signals may be compared to them.
  • the algorithm correspondingly uses the taught information and model to anticipate further correspondence between notes and play signals.
  • the audio signal is not given to the algorithm as raw data.
• Feature extraction 406 is carried out to process out the redundant information of the audio signal data, making the resultant representation of the complete input easier to compare with the data pertaining to the graphical content 412, and to determine the correspondence of the input in relation to the graphical content output of the system, i.e. whether the user has managed to give the correct input by pitching the correct note on an instrument.
  • different frequency and time domain methods such as STFT (Short-time Fourier transform), digital filters, PCA (Principal components analysis), SVD (Singular value decomposition), etc., may be used to implement and train the algorithm.
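A minimal illustration of such feature extraction, using only an STFT magnitude matrix (one of the methods named above); the window and hop sizes are arbitrary choices, and discarding phase is the sense in which redundant information is processed out.

```python
import numpy as np

def stft_features(samples, frame_size=2048, hop=512):
    """Short-time Fourier transform magnitudes as a compact feature matrix.

    Each frame is Hann-windowed and reduced to its magnitude spectrum;
    phase is discarded as redundant for pitch comparison, leaving a
    frames-by-bins matrix that is easier to match against reference data.
    """
    window = np.hanning(frame_size)
    frames = []
    for start in range(0, len(samples) - frame_size + 1, hop):
        frame = samples[start:start + frame_size] * window
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames)

# One second of a 440 Hz tone at an 8 kHz sample rate (illustrative values)
sr = 8000
t = np.arange(sr) / sr
features = stft_features(np.sin(2 * np.pi * 440.0 * t))
```

The peak bin of each frame sits near 440 Hz × frame_size / sr, so the matrix directly exposes the pitch trajectory over time; PCA or SVD could then reduce it further, as the text suggests.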
• the audio recognition then comprises only recognizing that it is a data signal and corresponding the data signal to a note 410b.
• Other examples include electronic musical instruments that produce digital signals. In these instances feature extraction and harmonic sound detection, as techniques explained hereinbefore, may not be necessary, since such a data signal produced by the instrument may be easily interpreted as a note 410b which may be directly compared with the data pertaining to the graphical content 412.
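For example, a MIDI note-on message already carries the note number, so interpreting the data signal as a note reduces to a lookup; the sketch below assumes the conventional MIDI numbering with 60 = middle C, which the text does not mandate.

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_note(note_number):
    """Interpret a MIDI note number directly as a pitch name and octave.

    MIDI assigns one number per semitone with 60 = middle C (C4), so no
    feature extraction is needed: the data signal already names the note.
    """
    octave = note_number // 12 - 1
    return f"{NOTE_NAMES[note_number % 12]}{octave}"

def matches_notation(note_number, expected):
    """Direct comparison of the interpreted note with the presented notation."""
    return midi_to_note(note_number) == expected
```

So a digital instrument sending note number 60 against a presented C4 yields a correct result with no harmonic analysis at all.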
  • the algorithm may be self-learning via the use of the system and collected data.
  • Figure 5 depicts feasible techniques of image recognition for determining the user's hand and finger position in relation to an instrument.
  • the techniques preferably incorporate at least image acquisition means as presented in detail hereinbefore.
• the techniques include essentially the parts of acquiring image or video data of the user 504, recognizing the location of the user's fingertips and recognizing the location of certain segments of the instrument, such as piano or woodwind instrument keys, brass instrument valves, string instrument fingerboard locations, guitar frets, etc. Basically the user's fingering on any such instrument while playing 502 may be determined and compared to an image or data representing the correct fingering. For simplicity, however, only the recognition technique for piano finger position will be described. A person skilled in the art will understand the teachings presented herein to be extendable to the other instruments as well, following the principles of the particular computer vision technique.
• Segmentation 506 of the hand position may be essentially feature- or model-based. Feature-based segmentation may be used to select a set of interest points 510 or areas of the image via feature extraction 508. Using these relevant interest points or areas 510 the position of the hand in relation to the piano keys may be determined 516. In the model-based segmentation, on the other hand, the image data is fitted to a model based on a priori information on how and where on the keyboard the musical notation should be played, e.g. like a professional pianist 512. The result of the segmentation is the hand position and the instrument location in a coordinate system 514. This may be further used to determine a number of hand position aspects in relation to the instrument 516. Supervised learning may be used to determine the location of the fingertips.
  • various image processing techniques may be used to extract the hand from the image and recognize the fingertips based on the features calculated from the hand segment.
• the locations of the keys may be determined similarly, after which the locations of the fingertips and the keys may be compared.
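Once fingertips and keys share the coordinate system produced by segmentation, the comparison can be as simple as point-in-box tests; the keyboard layout below is a made-up example, not a calibrated model.

```python
def key_under_fingertip(fingertip, key_boxes):
    """Return the name of the key whose bounding box contains the fingertip.

    `fingertip` is an (x, y) point and `key_boxes` maps key names to
    (x_min, y_min, x_max, y_max) boxes, all in the shared coordinate
    system produced by the segmentation step.
    """
    x, y = fingertip
    for name, (x0, y0, x1, y1) in key_boxes.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def fingering_is_correct(fingertips, key_boxes, expected_keys):
    """Compare each detected fingertip against the expected key for that finger."""
    pressed = [key_under_fingertip(tip, key_boxes) for tip in fingertips]
    return pressed == expected_keys

# Hypothetical layout: three white keys side by side, 40 px wide, 200 px tall
keys = {"C4": (0, 0, 40, 200), "D4": (40, 0, 80, 200), "E4": (80, 0, 120, 200)}
```

The same test generalizes to frets, valves or tone holes by swapping in the corresponding segment boxes for the instrument in question.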
  • the techniques for determining the user's hand posture and body posture may also utilize other sensors that either give a signal pertaining to their location and/or are utilized as marks that make their location easier to determine via image recognition.
  • Such sensors may be wearable devices, or at least functionally connected to the instrument, user, and/or the playing chair. Some such sensors may be worn in or essentially constitute such wearable devices as rings, gloves, implants and stickers.
  • Feasible techniques comprise background subtraction, object recognition, feature and model based segmentation as depicted with figure 4a, as well as supervised learning.
• the object of incorporating the techniques depicted with figures 4a and 4b is to recognize the user's playing posture, fingering, etc., as explained, to be able to advise the user to correct their playing and position.
• the correct finger and hand position, hand posture, body posture, etc. are well-known in the prior art and are utilized as the reference against which the comparison is made.
• the user may be given textual or graphical instructions about correcting their position.
• the finger and hand position, hand posture, body posture, etc., of the user may also be used to determine whether the user passes an exercise.
  • Figure 6 is a flow diagram illustrating one feasible embodiment of a method in accordance with the present invention.
  • the system performing the method may be configured.
  • a user may e.g. select their user profile, select the musical instrument on which the exercises are directed to and start an exercise.
  • the system may use the user's profile information and/or past scores to determine which graphical content is presented as presented in the method item 603.
• the user's profile information and scores may be retrieved and/or updated, for example from and/or according to a database on a remote entity. For example, if the aim of the exercise is to learn to play the notes of the C major scale on a piano and the user has done these exercises before, the system may retrieve the past scores from the exercises. If it appears that the user has had difficulties in playing the notes (pitches) F and B, for example, out of the total of C, D, E, F, G, A, B presented in a random order to the user, then the system may present and ask the user to produce the notes F and B more often than the other five notes.
• the system may present the user with the note B more often and weight the other notes C, D, E, G, A and the note F with similar frequencies.
• the frequencies with which particular notes or chords (i.e. two or more notes produced simultaneously) of a scale appear may initially be identical, and after the system gathers scores, the particular notes with low scores (a low rate of success to produce the note) may be presented with higher frequency than the others. This way the system uses the user's scores as his/her skill level and asks the user to produce the notes that he/she has the lowest scores with.
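One plausible way to sketch this score-weighted presentation is weighted random sampling, where a note's weight grows with its failure rate; the weighting formula below is an illustrative assumption, not the disclosed method.

```python
import random

def next_note(scores, notes, rng=random):
    """Pick the next note to present, weighted by past failure rate.

    `scores` maps each note to a (successes, failures) pair. Unseen notes
    get a neutral weight of 1.0, so the initial distribution over the
    scale is uniform; frequently missed notes are drawn more often.
    """
    weights = []
    for note in notes:
        successes, failures = scores.get(note, (0, 0))
        attempts = successes + failures
        # weight 1.0 for a clean record, approaching 2.0 for an always-missed note
        weights.append(1.0 + (failures / attempts if attempts else 0.0))
    return rng.choices(notes, weights=weights, k=1)[0]

scale = ["C", "D", "E", "F", "G", "A", "B"]
history = {"F": (1, 9), "B": (2, 8)}  # F and B mostly failed so far
```

With this history, F and B are drawn almost twice as often as the untroubled notes, matching the behavior described for the C major scale example above.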
  • Method item 605 represents the user producing at least one note on a musical instrument.
  • the instrument may produce acoustic sound or electric signals (that are initially analog or digital).
• the user preferably, to his/her best knowledge and skills, tries to produce the presented musical notation with the correct playing technique, as may be presented by the system.
• a play signal in accordance with the one or more notes produced by the user is obtained.
  • the play signal is processed and at least the note produced by the user on the instrument is recognized.
  • the source instrument of the play signal may be recognized, and image recognition for image or video input may be executed.
• the processed play signal data is compared with the graphical content data to determine if the play signal corresponds to the presented graphical content, i.e. whether the user has managed to produce the note(s) corresponding to the note(s) presented to them by the system, optionally with the correct duration, timing, rhythm, etc.
  • This information may be processed and/or stored as user score data.
  • the result of the comparison of item 610 is assigned a score, which is further preferably stored.
  • Method item 615 comprises stored score data (at least the one produced by the user at 605), which is used by the system at 603 to determine the next graphical content or exercise presented to the user.
• the method item may be carried out essentially at the same time as 612 or 614, but preferably such that the user has had time to view their score at 614 before the method moves to item 604 again via 603 and new graphical content is presented.
  • the method may be repeated again starting essentially from the method item 603.
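The repeated method items from 603 to 615 can be summarized as a loop; the sketch below injects the present/capture/recognize/select steps as stand-in callables rather than claiming any particular implementation of them.

```python
def run_exercise(present, capture, recognize, select_next, rounds=3):
    """One pass of the repeating method loop of Figure 6, with I/O injected.

    `present` shows the notation, `capture` obtains the play signal,
    `recognize` turns it into a note, and `select_next` picks the next
    notation from the accumulated scores. All four are stand-ins for the
    components described in the text.
    """
    scores = []
    note = select_next(scores)
    for _ in range(rounds):
        present(note)                          # item 604: show graphical content
        played = recognize(capture())          # items 605-608: signal to note
        scores.append((note, played == note))  # items 610-612: compare, score
        note = select_next(scores)             # items 615/603: next content
    return scores

# Minimal stand-ins: "C4" is always presented and always played correctly
log = run_exercise(
    present=lambda n: None,
    capture=lambda: "C4",
    recognize=lambda s: s,
    select_next=lambda scores: "C4",
)
```

Plugging in a score-weighted `select_next` instead of the constant one turns the same loop into the adaptive exercise behavior described above.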
  • graphical content to be presented to the user is determined.
• the graphical content presented to the user may be determined in accordance with their skill level. This way the system determines which musical notations, such as individual notes or chords, and optionally which hand positionings and postures, the user has had the most difficulties with, and determines the next musical notation in relation to that, providing the user with graphical content to produce play signals to, in order to improve the mentioned aspects in which the user has had the most difficulties.
• Success rate therein preferably pertains to the user's success in correctly producing a specific note or notes in relation to the presented musical notation.
• the hand positioning and postures may optionally be determined, and feedback provided on them, at the same time as the note is played, or in real time while the user is playing and/or moving in relation to an instrument.
• repeating the method items as described may constitute an exercise to practice notes, intervals, chords, timing, etc., in accordance with the predetermined objective of the exercise.
  • a song may be practiced using the method wherein the method repeats the notes that the user had difficulties with or played wrong.
• the method teaches the user step by step instead of only providing feedback on whether a note or a song was played correctly to some degree.
  • the method may constitute e.g. an exercise or a whole song gone through note-per-note, phrase-by-phrase, etc.
• the user may also be presented with the information of whether the note(s) produced by the user was correct, and additionally optionally whether the user's note fingering, interval fingering, chord fingering or hand posture was correct when producing the note(s) (represented as the play signal).
• the user terminates the method and the method ends, e.g. via a user command.
• the method may also incorporate aspects of gamification, comprising different goals, skill levels reached via the exercises, as well as collaboration or competitive games against other user(s).
  • Method items may be stored as and executed by a computer program product embodied in a non-transitory carrier medium, such as optical disc or memory card.

Abstract

A system (100) for teaching a user to play a musical instrument (114) from musical notation via virtual exercises comprising: at least one electronic device (102) comprising at least a processing entity (104), a memory entity (106) and a display (108), means for forming or receiving at least one play signal (110) produced by a user on a musical instrument (114), the processing entity (104) being arranged to provide at least graphical musical notation content, such as a note or a chord, via the display (108), the processing entity (104) being further arranged to obtain at least play signal data via said means for forming or receiving at least one play signal (110), the processing entity (104) being further arranged to execute audio recognition to recognize the play signal data and compare it with data relating to the presented graphical content to at least determine if the play signal corresponds to said graphical content, the processing entity (104) being further arranged to assign a score to represent the result of said comparison and store said score, the processing entity (104) being further arranged to present the score to the user via the display (108), and to determine, at least on the basis of said score and other stored scores, further at least graphical musical notation content, to be presented to the user via the display (108). Corresponding method and computer program product are also presented.

Description

SYSTEM FOR TEACHING A USER TO PLAY A MUSICAL INSTRUMENT FROM MUSICAL NOTATION VIA VIRTUAL EXERCISES AND A METHOD THEREOF

FIELD OF THE INVENTION
Generally the present invention relates to electronic arrangements and computing systems. Particularly, however not exclusively, the invention pertains to audio recognition techniques and music teaching methods.
BACKGROUND
Learning to play music has been the interest of many people for ages. Particularly learning to play from actual musical notation and finding the interest to study the theoretical aspects of music have been difficult to solve with solutions that practically exclude an inspiring teacher and face-to-face teaching sessions.
The traditional approaches include using learning material, such as teaching videos or books. However, they very much lack any interaction or customized teaching and learning as the user progresses. Some other solutions comprise online lessons with teachers that teach in real time with the help of webcams and microphones. However, they are very much only extensions of traditional face-to-face teaching utilizing virtual means and therefore lack on-demand capability and virtual teaching feedback and teaching means.
Other existing solutions rely on teaching users to play certain complete songs or pieces without reading musical notation. However, these solutions rely on comping and/or timing in relation to the shown structure, which is often a reference to the instrument interface, such as keys or fretboard position. Further, some other solutions, such as Guitar Hero®, rely on using a sort of simplified or otherwise modified mockup instrument to simplify the playing and to avoid the need for audio recognition. However, these lack the essential aspect of learning musical notation, crucial for actually learning to play an instrument. Even further, none of the previous solutions have considered teaching a user the correct playing posture and position of the fingers, hands and the whole body of the player when playing a particular instrument. These aspects are crucial for learning to play an instrument correctly, beyond handling it just through exercises.
With musical notation all musicians can play anything with their instruments all over the world, because the same system of reading and writing musical notes has spread universally from the 1600s to the present day. But the traditional way of learning notes is too difficult or slow for many people, because they try to learn whole pieces or songs instead of learning to read musical notation note by note, like the ABC. It is not enough to read musical notation if you cannot connect your instrument, fingers and hand posture to it.
There is hence obviously a need in the prior art for a system that enables self-teaching and learning to play a real instrument from musical notation.
SUMMARY OF THE INVENTION
The objective of the embodiments of the present invention is to at least alleviate the aforementioned drawbacks evident in the prior art arrangements particularly in the context of teaching music. The objective is generally achieved with a system and a corresponding method in accordance with the present invention.
One of the present invention's main advantages is that it teaches the user to read actual musical notation, in contrast to tablature, color-coded keys, etc., which make reading of music essentially easier in the short run but ignore many of musical notation's important aspects and actually only teach the user to play an instrument by following patterns or the like. It is well-known that knowing how to read musical notation is essential for playing any song in the world. Playing instruments and understanding musical theory also serve as the basis for improvisation, band playing and writing music.
One other advantage of the present invention is that it does not require any additional hardware or software for the instrument, i.e. the user plays the actual instrument that they want to learn. Additionally, the instrument is not limited to any certain instrument but essentially the user may choose any instrument of their preference. Another advantage of the present invention is that, in addition to the played notes, the present invention allows for tracking of the user's correct playing posture and position of the fingers, hands and the whole body of the player when playing a particular instrument and providing feedback and instructions thereof. This allows teaching the notes in relation to the correct hand posture and positioning of hands, body, etc.
Yet another advantage of the present invention is that it tracks the user's performance (scores) and development, and correspondingly determines further exercises as the user progresses, individually and intelligently learning from mistakes. Hence, the system follows the user's development like an actual teacher would and may provide the user exercises in the areas where the user still has learning to do.
Finally, the present invention is arranged to present the user with notes or a set of notes (chords or intervals) one by one, such that it is easy to receive immediate feedback on whether the note or the set of notes was played correctly. Further, the present invention tracks the success rate of the notes that the user has produced (correctly and incorrectly) and determines which notes and/or intervals the user has problems with, so that the system carrying out the invention may provide those notes back to the user. Consequently, the system learns the notes which the user has problems with and provides such exercises to the user so that the user has to practice them more. In accordance with one aspect of the present invention a system for teaching a user to play a musical instrument from musical notation via virtual exercises comprising:
-at least one electronic device comprising at least a processing entity, a memory entity and a display,
-means for forming or receiving at least one play signal produced by a user on a musical instrument,
-the processing entity being arranged to provide at least graphical musical notation content, such as a single note or a chord, via the display,
-the processing entity being further arranged to obtain at least play signal data via said means for forming or receiving at least one play signal,
-the processing entity being further arranged to execute audio recognition to recognize the play signal data and compare it with data relating to the presented graphical content to at least determine if the play signal corresponds to said at least graphical musical notation content,
-the processing entity being further arranged to assign a score to represent the result of said comparison and store said score,
-the processing entity being further arranged to present the score to the user via the display, and
-to determine, at least on the basis of said score and other stored scores, further at least graphical musical notation content, to be presented to the user via the display.
The play signal is herein used to refer to an electric signal, which is optionally digital, or an audio signal representing or corresponding to at least one note, i.e. at least the information of the pitch and optionally the length of a sound. E.g. the play signal may be produced by an instrument, voice or MIDI (Musical Instrument Digital Interface) and be either analog and originally acoustic or electric, or digital. After obtaining the signal it is recognized and converted into a digital signal.
The musical notation pertains to at least one note, optionally with a duration, presented in relation to staff. As such the musical notation may be a single note whose pitch is stated via position, accidental(s), key signature and/or a clef. The musical notation may also comprise a plurality of notes preferably played simultaneously as a chord. The musical notation preferably provides a user with a presentation of one note or chord at a time to which the user gives their input by playing an instrument after which the user is provided with new graphical content, such as another note or chord, and feedback on whether the user was able to produce the right note(s) on their instrument.
According to an exemplary embodiment of the present invention the score is data comprising such information as if the user succeeded to produce the presented graphical content, such as a correct note or hand position, the time needed to produce the play signal, etc.
According to an exemplary embodiment of the present invention the graphical content of the system comprises also note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument.
According to an exemplary embodiment of the present invention the system comprises image acquisition means for acquiring image data. The image data may be image and/or video of the user's note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument.
According to an exemplary embodiment of the present invention the system comprises a wearable device, such as a hand-wearable device that can recognize the user's note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument. The device may comprise processing means or be arranged to utilize the processing means of the electronic device to execute recognition of the user's note fingering, interval fingering, chord fingering and/or hand posture in relation to the instrument e.g. on the basis of location of the wearable device or reference points thereof in relation to the instrument. A wearable device as well as the image acquisition means may be also used for recognizing the user's body posture in relation to the instrument.
According to an exemplary embodiment of the present invention, the processing entity is further arranged to execute image recognition for determining the user's note fingering, interval fingering, chord fingering or hand posture in relation to the instrument from the acquired image data, and compare said determined user's note fingering, interval fingering, chord fingering or hand posture with data relating to the presented graphical content to at least determine if the user's note fingering, interval fingering, chord fingering or hand posture corresponds with the correct note fingering, interval fingering, chord fingering or hand posture associated with the provided graphical musical notation content. The processing entity may further be arranged to assign a score to represent the result of said comparison involving image data and store said score. The processing entity may also further be arranged to present the score relating to image data to the user via the display. The processing entity may further be arranged to determine, at least on the basis of said score relating to image data and other scores, further at least graphical musical notation content, to be presented to the user via the display.
According to an exemplary embodiment of the present invention the system may also comprise at least one remote entity with which the electronic device may communicate at least unidirectionally. The remote entity may be used as a source and/or storage of graphical content and score statistics for the system. Further, the remote entity may execute at least partially the one or more of the processing entity tasks, such as audio recognition, comparison of output (graphical content) and input (audio, etc.) data, and the determination of graphical content to be provided to the user, optionally in relation to stored scores. Such remote entity may e.g. comprise a database accessible via Internet.
In accordance with one aspect of the present invention a method for teaching a user to play a musical instrument from musical notation via virtual exercises comprises:
-presenting a user with graphical musical notation, such as a single note or a chord, optionally in a random order or in any appropriate order,
-obtaining at least one play signal produced by the user on a musical instrument,
-executing audio recognition to recognize the play signal,
-comparing the recognized play signal with the graphical musical notation and determining if the play signal corresponds to said graphical content,
-assigning a score to represent the result of said comparison,
-presenting the score to the user,
-determining, at least on the basis of said score and other stored scores, further at least graphical musical notation content, to be presented to the user.
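As an illustration, the method items above can be sketched as a simple loop. This is a minimal sketch under assumed interfaces: the `capture_play_signal` callable, the `run_exercise` name and the dictionary fields are hypothetical, not taken from the claims, and the audio recognition and next-content determination items are reduced to a direct comparison.

```python
def run_exercise(notation_queue, capture_play_signal, scores=None):
    """Present notations one by one, score the user's play signals and
    return the accumulated score list (the comparison and scoring items)."""
    scores = scores if scores is not None else []
    for notation in notation_queue:
        played = capture_play_signal()          # obtain the play signal
        correct = (played == notation["note"])  # compare with the notation
        scores.append({"note": notation["note"], "correct": correct})
    return scores
```

In a full implementation the stored scores would feed back into the selection of the next notation queue, as the last method item describes.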
According to an exemplary embodiment of the present invention the graphical musical notation is presented in relation to an instrument, such as presenting via the display the location of at least one note on the instrument.
According to an exemplary embodiment of the present invention the user is presented with graphical musical notation with corresponding instrument fingering, hand posture or body posture.
According to an exemplary embodiment of the present invention the user is provided with graphical content, such as advice and/or instructions, in accordance to the recognized user's note fingering, interval fingering, chord fingering, body posture or hand posture.
The abovementioned three embodiments allow for instructing the user to play correctly and advising on finding e.g. the notes represented by the graphical musical notation on the instrument that they are playing.

According to an exemplary embodiment of the present invention the method comprises determining at least graphical musical notation content to be presented to the user also in relation to historic score information. Such information may be obtained and/or aggregated from the user's previous results, different users and/or sources.
In accordance with one aspect of the present invention there is provided a computer program product embodied in a non-transitory computer readable medium, comprising computer code for causing the computer to execute the method items of the present invention.
According to an exemplary embodiment of the present invention the computer program product embodied in a non-transitory computer readable medium, comprising computer code for causing the computer to execute the method of the present invention may be provided as an application program, web program or SaaS (Software as a Service) as part of the system of the present invention in a standalone device or at least partly in a cloud server.
The previously presented considerations concerning the various embodiments of the system may be flexibly applied to the embodiments of the method and of the computer program product mutatis mutandis and vice versa, as being appreciated by a skilled person. Similarly, the provision and utilization of the method as well as the system are scalable in the limitations of the entities according to the system.
As briefly reviewed hereinbefore, the utility of the different aspects of the present invention arises from a plurality of issues depending on each particular embodiment.
The expression "a number of" may herein refer to any positive integer starting from one (1). The expression "a plurality of" may refer to any positive integer starting from two (2), respectively.
The terms "musical instrument" and "instrument" both refer to a musical instrument and are herein used interchangeably.
The term "exemplary" refers herein to an example or example-like feature, not the sole or only preferable option.
The term "user" is often used herein to refer to a user of the system and in many instances is synonymous with a player of an instrument or person studying music utilizing the system.
Different embodiments of the present invention are also disclosed in the attached dependent claims.
BRIEF DESCRIPTION OF THE RELATED DRAWINGS
Next, some exemplary embodiments of the present invention are reviewed more closely with reference to the attached drawings, wherein Figure 1 illustrates an embodiment of the system in accordance with the present invention
Figure 2 illustrates an exemplary embodiment of presenting graphical content to the user in accordance with the present invention
Figure 3 illustrates a use context of the system in accordance with the present invention
Figure 4 illustrates the audio recognition in accordance with the present invention
Figure 5 depicts feasible techniques of image recognition for determining the user's hand and finger position in relation to an instrument
Figure 6 is a flow diagram illustrating one feasible embodiment of a method in accordance with the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Figure 1 illustrates an embodiment of the system 100 in accordance with the present invention. The system comprises at least one electronic device 102, which further comprises at least a processing entity 104, a memory entity 106, a display 108, and means for forming or receiving at least one play signal 110 produced by a musical instrument 114. The electronic device may also comprise an image acquisition device 112, such as a camera or similar means for obtaining image data. Further, the electronic device 102 may comprise or at least be functionally connected with sensors such as an IR (infrared) camera and pressure sensors.
The processing entity 104 is configured to at least process the play signals, and execute the computer program product of the present invention. Additionally, the processing entity may be arranged to further process the comparison data and additionally store it, use it in the computer program and/or provide graphical representations of it. The processing entity may comprise e.g. at least one processing/controlling unit such as a microprocessor, DSP (a digital signal processor), DSC (a digital signal controller), a micro-controller or programmable logic chip(s), optionally comprising a plurality of co-operating or parallel (sub-)units.
The electronic device 102 comprises also a memory entity 106 for storing at least the play signal data and comparison data. The memory entity may be divided between one or more physical memory chips and/or cards. The memory entity may also comprise necessary code, e.g. in a form of the computer program/application, for enabling the control and operation of the device, and provision of the related control data. The memory may comprise e.g. ROM (read only memory) or RAM-type (random access memory) implementations as disk storage or flash storage. The memory may further comprise an advantageously detachable memory card/stick, a floppy disc, an optical disc, such as a CD-ROM, or a fixed/removable hard drive.

The means for forming or receiving at least one play signal 110 produced by an instrument 114 may comprise a microphone input port, a microphone, an electronic musical instrument or MIDI that may be either external to the electronic device or integrated therein. An audio amplifier may also be used therein. The means are used to capture audio and data, such as electric signals representing audio or pitched acoustic sounds from instruments, and receive and/or process said signals into play signal data to be compared with and/or stored. Said means may be manually or electronically directed to capture image and/or video from a particular object, or said means may be arranged to follow the object such as moving hands on an instrument.
The image acquisition device 112 comprises a digital camera or a CCD (charge-coupled device), which may be chosen from the plurality of digital imaging devices capable of at least creating digital images and optionally additionally digital video. The camera may be integrated to the electronic device or external to it. Some examples of suitable digital cameras comprise compact cameras, DSLRs (digital single-lens reflex cameras), DSLTs (digital single-lens translucent cameras), webcams and other video cameras, as well as cameras directly integrated to mobile devices, laptops and tablet devices. An IR camera may also be used for image acquisition.
A pressure sensor may be used to monitor how hard the user is playing the instrument (in terms of pressure via fingering) or the pressure sensor may be used to track the user's position and posture on a playing chair, such as a piano bench. The pressure sensors may be also used to track pressure imposed on an electronic device interface, wherein the pressure sensor may sense the speed, pressure and tone of the playing. The at least functional connection between the electronic device and mentioned components may be provided via wires or wireless means.
The display 108 may comprise a touch screen such as an essentially touch- based, contactless and/or three-dimensional touch screen via which graphical content may be given as output preferably via a GUI (graphical user interface). The display may be used to give commands and control the software program. The graphical user interface may be configured to visualize, or present as textual, different data elements, indications, status in- formation, control features, user instructions and user input indicators.
The system 100 may further comprise a remote entity 116 such as a cloud computing or remote server, comprising e.g. a computer program product 118 and a database 120. Optionally the arrangement may be provided as SaaS, a web application or a mobile application, facilitated via a browser or similar software, or as an API (application programming interface) wherein the computer program product is essentially external but remotely accessible by the electronic device. The remote entity may be arranged to process and/or analyze the data received and/or processed by the electronic device, such as the comparison data, and store it into the database and/or send it back to the electronic device. The database may further comprise instructions, exercises, program modes, data collected from the device and/or stored score data. The database may be further used to provide the device with updates and/or system functions.
The electronic device 102 may communicate with a remote entity 116 to send and/or receive data, such as comparison analysis data, stored score data and metadata (pertaining to the user, exercises, scores, etc.). As mentioned, the remote entity may process and/or analyze said data and further send it back to the electronic device. For example, the comparison of the graphical content output and user input may be done on the remote entity and the result of said comparison may be then communicated back to the electronic device. It is also possible to implement the invention essentially in a stand-alone device, such as a computer or mobile device, without using a network connection or a server. However, the processing entity tasks may be divided between the electronic device and the remote entity, for example when processing external to the electronic device is preferred. Such an application could include, i.a., a browser-based web application.
An actual teacher may also be a part of the system and obtain user scores and other information via e.g. the remote entity. The teacher may comment or otherwise affect the feedback and exercises provided to the user.
The electronic device 102 may hence comprise also suitable connection means to communicate with the remote entity via e.g. WANs (wide area networks) and/or LANs (local area networks).
The computer program product 118 may be offered as application software to the at least one electronic device. It may hence be run by the electronic device essentially independently of any remote entity. Obviously, updates and such may be carried out by connecting to a remote entity.
The electronic device 102 may comprise or constitute a tablet, phablet, mobile phone, laptop, desktop or any such computing device well known in the prior art.
Additional elements and means, such as wireless modules (NFC, Bluetooth, WiFi, RFID, etc.), cables, conductors, electrodes, power sources, illumination devices, supports, encapsulation, fastening means, etc., well-known to a person skilled in the art may be incorporated appropriately according to various embodiments.
Figure 2 illustrates an exemplary embodiment of presenting graphical content to the user in accordance with the present invention. Said view may be a part of program software, such as a mobile or a web application, run on an electronic device 202.
For example, the user may be presented with graphical content of musical notation 204a, such as one note, and finger and hand position 204b in relation to the presented musical notation 204a via a display 206. This way the user is demonstrated where and how the note should be played. The presented graphical content may of course with similar considerations comprise a representation of the correct hand posture, body posture, etc. via animations or the like. Any such graphical content that improves the user's playing technique may be presented.
The system may of course utilize other numeric, alphabetic, alphanumeric and graphical representations as well. The representations may further be used to graphically represent the development or performance of the user. Further, although not explicitly depicted, feedback and instructions may be presented to the user as descriptive rules or advice. For example, in addition to, or instead of, showing the user score, a user may be provided with instructions and examples of how to improve playing and whether the note played by the user was correct and/or correctly played (i.e. pitch, timing-wise and/or in terms of playing technique). Such representation may include images and/or video.

Figure 3 illustrates a use context of the system in accordance with the present invention. The user 304 may be for example a person studying musical notation, theory and the playing technique of an instrument by themselves at their home using an embodiment of the electronic device 302 of the system. The system utilizes play signals produced by the user 304 with their instrument 306. Optionally, the system may also utilize image or video information obtained via image acquisition means 308. Additionally optionally, the system may utilize other information such as pressure as a source of input. The one or more play signals may be a single pitched note or a combination of a plurality of pitched notes, optionally with the information of their timing and/or duration, produced essentially simultaneously by the user. The play signal is used to refer to a note produced on an instrument, which may produce a sound and/or an electric signal representing the produced note.
The image or video input may be produced by a camera image of the user's grip of the instrument, note fingering, interval fingering, chord fingering, body posture or hand posture. The optical instrument may capture an image or video essentially simultaneously as the user produces the note and consequently the play signal. However, the image or video input may also be captured at any time during the use of the system, meaning that it may be set to be produced in intervals, essentially in real time, in response to a user command or in response to any other suitable input.
The system provides the user with a graphical representation of at least one musical note, which the user has to play on an instrument. The note may be also presented in relation to its place on a particular instrument, such as the one that the user is playing.
Optionally, the user may be also presented with textual and/or graphical information on the correct fingering, hand posture or body posture in relation to the particular instrument and optionally also in relation to the graphical musical notation.
The user preferably utilizes an actual instrument for producing the play signal. In this context a wide variety of said instruments is known, ranging from the piano and keyboard instruments in general to string instruments, brasswind and woodwind instruments, and other band and orchestra instruments as appropriate. However, it should be noted that sung notes, and hence the human voice, are also regarded as instruments herein and work well with the system. Additionally, devices and arrangements with instrument-like virtual or physical interfaces, such as gaming controllers, instrument software and other such arrangements for producing electric signals or acoustic sounds, may optionally be used.
The system then receives the play signal and determines whether the pitch of the play signal essentially corresponds to the musical notation or whether they differ. The system may then indicate this to the user and store the result of the comparison.
Further, the system may optionally recognize the pitch represented by the play signal to be out of tune and advise tuning of the instrument or the correct fingering and playing of the note. Optionally or alternatively, the system may also be able to determine if the play signal was close enough to the note, e.g. so that the note was probably specifically intended by the user instead of either of the two notes a semitone away from it. If the play signal produced by the user is the same one as presented as the graphical content on the system display, the user is marked with a score of one correct note together with additional information about e.g. time and date, time taken to produce the note by the user, the result of the comparison (right or wrong note), etc. The additional information may be stored cumulatively and processed to elicit and determine notes, chords or exercises that the user has had the most failed attempts or difficulties with (in accordance with the amount of time taken to produce the note and/or the failure rate of particular notes, chords and exercises). Similarly, said information may be used for compilation into score data for further provision, or to be presented to the user or another user of the system, such as the user's personal teacher.
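The out-of-tune and "semitone away" distinctions described above can be expressed as a deviation in cents (100 cents = one semitone). The sketch below is illustrative only: the function names and the 20- and 50-cent thresholds are assumptions, not values from the specification.

```python
import math

def cents_off(freq, target_freq):
    """Deviation of a detected frequency from the target pitch, in cents."""
    return 1200.0 * math.log2(freq / target_freq)

def classify_attempt(freq, target_freq, tune_tolerance=20.0):
    """Classify a detected pitch: 'correct' within the tuning tolerance,
    'out_of_tune' if still nearer the target note than either neighbouring
    semitone, otherwise 'wrong'."""
    dev = abs(cents_off(freq, target_freq))
    if dev <= tune_tolerance:
        return "correct"
    if dev < 50.0:  # closer to the target than to a neighbouring semitone
        return "out_of_tune"
    return "wrong"
```

An out-of-tune classification would trigger the tuning advice, while a wrong classification would be scored as a failed attempt.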
The system may then either provide the user with another graphical musical notation in accordance with predetermined rules of e.g. an exercise (program mode) or in accordance with the ones that the user has had the most failed attempts and/or the longest answer times with, as mentioned hereinbefore. This way the stored or historic score data may be used to direct the exercises and offer the user teaching within the areas where they have improved less. In other words, the next presented graphical content may be in accordance with the one or more previous scores (successes and failures to produce notes as presented by the system).
Similar considerations apply for recognizing the user's note fingering, interval fingering, chord fingering, body posture or hand posture, and de- termining whether they correspond to their representation given via the display. The result of this comparison may also be stored, provisioned, presented, and/or used to determine the next graphical information type and/or the next graphical musical notation to be presented via the display to the user.
The system may be controlled essentially via the play signal and/or with image or video input. I.e. the system may move between the graphical musical notations in accordance with the play signals and/or the image and/or video input. This way the user need not stop playing the instrument to e.g. give commands via the electronic device UI (user interface) but instead may just continuously play throughout the changing musical notation and/or exercises.

As the user uses the system, user score data (constituted by a number of scores) is created and stored to the device and/or to the remote server. The score data may comprise e.g. the time and success of playing certain individual notes, the amount of wrong notes or failed attempts to produce a note as presented by the system or an exercise of the system, time for playing the right note after a failed attempt, time for recognizing and/or producing a presented note, the user's hand and finger position, the finger used to produce a certain note, the fingers used to produce a plurality of notes, and the fingers' location on the keyboard, valves, fretboard, etc., optionally in relation to a specific instrument. Essentially, one score is at least the success or failure to produce one note or optionally more notes in accordance with the system output (represented graphical content corresponding to the note). If whole songs or musical notation exercises are played, said score data may be recorded per song or per exercise. Also the timing and playing rhythm of individual notes may be stored and used as score data.
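A score of the kind listed above might be modelled as a small record, with a helper that derives the per-note failure rate used to direct later exercises. The field names and the `failure_rate` helper are illustrative assumptions, not taken from the specification.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Score:
    """One score record: the note asked for, whether the user produced
    it correctly, and timing information for later analysis."""
    note: str
    correct: bool
    response_time_s: float
    timestamp: float = field(default_factory=time.time)

def failure_rate(scores, note):
    """Fraction of failed attempts at a given note over the stored
    score data; 0.0 when the note has not been attempted yet."""
    attempts = [s for s in scores if s.note == note]
    if not attempts:
        return 0.0
    return sum(1 for s in attempts if not s.correct) / len(attempts)
```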
A user profile, under which the user's score data is preferably stored, may be created. The user's score data may be analyzed and used to target a user with certain exercises and personalized feedback and instructions. For example, the user may be instructed which aspects of their playing still need improvement, how advanced they are or what songs they can already attempt to play (in accordance with their skill level).
Further, the system may compare a user's score data with another user's data to create feedback and instructions. For example, a user may be provided with the same feedback or exercises as another user with similar score data and/or improvement areas (either by their own will or via the analysis of the system). Further, gamification aspects, such as playing collaboration or competitions among the users, may be facilitated.
Figure 4 illustrates the audio recognition in accordance with the present invention. The audio recognition is done by detecting a play signal, produced by a user with his/her musical instrument 402, comprising information corresponding to a note. After receiving the signal it is converted into a digital signal, if needed, and recognized as either an audio signal or a data signal 404. An audio signal (e.g. a signal initially produced by an acoustic musical instrument) comprising harmonic sound obtained e.g. via a microphone or as an electric signal via a microphone port may undergo at least the steps of feature extraction 406 and harmonic sound detection 408 before a note may be interpreted 410a from said signal. The system detects the most harmonic signals and their duration, via which the frequency of the sound may be determined. Herein, since the correct frequency of a presented note is known a priori, digital filtering may also be used for noise suppression around the given detected frequency. The detected frequency of the play signal may then be compared to the correct frequency and interpreted as a note that is compared to the represented musical note 412.
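Because the correct frequency of the presented note is known a priori, checking a play signal does not require a full spectrum: the signal's power can be probed at the target pitch and at the neighbouring semitones. The sketch below uses the Goertzel algorithm for this single-frequency probing; it is an illustrative stand-in, not the patented feature extraction 406 and harmonic sound detection 408, and the function names are assumptions.

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Signal power near one frequency via the Goertzel algorithm - a
    cheap way to test a play signal against an a-priori-known pitch."""
    n = len(samples)
    k = int(0.5 + n * freq / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def matches_note(samples, sample_rate, target_freq):
    """True if the play signal is stronger at the target pitch than at
    the semitones on either side of it."""
    semitone = 2 ** (1 / 12)
    candidates = [target_freq / semitone, target_freq, target_freq * semitone]
    powers = [goertzel_power(samples, sample_rate, f) for f in candidates]
    return powers.index(max(powers)) == 1
```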
Alternatively, the notes that the user is playing may be recognized via supervised learning. Therein, the algorithm utilized by the system is taught the sounds and/or signals corresponding to the graphical representations. This way the algorithm may model the output-input dependence for correct and wrong results of comparison. For example, the graphical musical notation content may be gone through and corresponded with play signals, such as an admin, developer or a user playing notes on an instrument for each graphical musical notation, after which the user's play signals may be compared to it. The algorithm correspondingly uses the taught information and model to anticipate further correspondence between notes and play signals. Essentially, the audio signal is not given to the algorithm as raw data. Feature extraction 406 is carried out for processing out the redundant information of the audio signal data, making the resultant representation of the complete input data easier to compare with the data pertaining to the graphical content 412 and to determine the correspondence of the input in relation to the graphical content output of the system, i.e. whether the user has managed to give the correct input by playing the correct note on an instrument. For this purpose, different frequency and time domain methods such as STFT (short-time Fourier transform), digital filters, PCA (principal component analysis), SVD (singular value decomposition), etc., may be used to implement and train the algorithm.
If the play signal is a digital signal produced e.g. by MIDI, the audio recognition comprises only recognizing that it is a data signal and corresponding the data signal to a note 410b. Other examples include electronic musical instruments that produce digital signals. In these instances feature extraction and harmonic sound detection, as techniques explained hereinbefore, may not be necessary since such a data signal produced by the instrument may be easily interpreted as a note 410b which may be directly compared with the data pertaining to the graphical content 412.
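For such data signals the interpretation 410b can be as simple as mapping a MIDI note-on number to a note name. A minimal sketch (the helper name is an assumption; the note numbering follows the MIDI convention where 60 is middle C):

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_note(midi_number):
    """Interpret a MIDI note number directly as a note name with octave;
    no feature extraction is needed for such data signals."""
    octave = midi_number // 12 - 1   # MIDI 60 is middle C (C4)
    return NOTE_NAMES[midi_number % 12] + str(octave)
```

The resulting note name can then be compared directly with the data pertaining to the presented graphical content.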
The algorithm may be self-learning via the use of the system and collected data.
Figure 5 depicts feasible techniques of image recognition for determining the user's hand and finger position in relation to an instrument. The techniques preferably incorporate at least image acquisition means as presented in detail hereinbefore.
Further, the techniques include essentially parts for acquiring image or video data of the user 504, recognizing the location of the user's fingertips and recognizing the location of certain segments of the instrument, such as piano or woodwind instrument keys, brass instrument valves, string instrument fingerboard locations, guitar frets, etc. Basically the user's fingering on any such instrument while playing 502 may be determined and compared to an image or data representing the correct fingering. For simplicity, however, only the recognition technique for piano finger position will be described. A person skilled in the art will understand the teachings presented herein to be extendable to the other instruments as well, following the principles of the particular computer vision technique.
Segmentation 506 of the hand position may be essentially feature- or model-based. Feature-based segmentation may be used to select a set of interest points 510 or areas of the image via feature extraction 508. Using these relevant interest points or areas 510 the position of the hand in relation to the piano keys may be determined 516. In the model-based segmentation, on the other hand, the image data is fitted to a model based on a priori information on how and where on the keyboard the musical notation should be played, e.g. like a professional pianist 512. The result of the segmentation is the hand position and the instrument location in a coordinate system 514. This may be further used to determine a number of hand position aspects in relation to the instrument 516. Supervised learning may be used to determine the location of the fingertips. For example, various image processing techniques (boundary detection, texture separation, etc.) may be used to extract the hand from the image and recognize the fingertips based on the features calculated from the hand segment. The locations of the keys may be determined similarly, after which the locations of the fingertips and the keys may be compared.
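Once segmentation has placed the fingertips and the keys in a shared coordinate system 514, the final comparison 516 can reduce to point-in-rectangle tests. A hypothetical sketch: the bounding-box representation of keys, the coordinate tuples and the function names are all assumptions for illustration, not the patented technique.

```python
def key_under_fingertip(fingertip, key_boxes):
    """Return the name of the key whose bounding box (x0, y0, x1, y1),
    in the shared image coordinate system, contains the fingertip point,
    or None if the fingertip is over no key."""
    x, y = fingertip
    for name, (x0, y0, x1, y1) in key_boxes.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def fingering_correct(fingertips, key_boxes, expected_keys):
    """Compare the set of keys found under the detected fingertips with
    the keys that the presented notation calls for."""
    pressed = {key_under_fingertip(p, key_boxes) for p in fingertips}
    pressed.discard(None)
    return pressed == set(expected_keys)
```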
The techniques for determining the user's hand posture and body posture may also utilize other sensors that either give a signal pertaining to their location and/or are utilized as marks that make their location easier to determine via image recognition. Such sensors may be wearable devices, or at least functionally connected to the instrument, user, and/or the playing chair. Some such sensors may be worn in or essentially constitute such wearable devices as rings, gloves, implants and stickers.
Feasible techniques comprise background subtraction, object recognition, feature- and model-based segmentation as depicted with Figure 5, as well as supervised learning.
The object of incorporating the depicted techniques is to recognize the user's playing posture, fingering, etc., as explained, to be able to advise the user to correct their playing and position. The correct finger and hand position, hand posture, body posture, etc. are well-known in the prior art and they are utilized as a reference to which the comparison is made. The user may be given textual or graphical instructions about correcting their position. The finger and hand position, hand posture, body posture, etc., of the user may also be used to determine whether the user passes an exercise.
Figure 6 is a flow diagram illustrating one feasible embodiment of a method in accordance with the present invention.
At 602, referring to the initial state of the method the system performing the method may be configured. A user may e.g. select their user profile, select the musical instrument on which the exercises are directed to and start an exercise.
The system may use the user's profile information and/or past scores to determine which graphical content is presented, as presented in the method item 603. The user's profile information and scores may be retrieved and/or updated for example from and/or according to a database on a remote entity. For example, if the aim of the exercise is learning to play the notes of the C major scale on a piano and the user has done these exercises before, the system may retrieve the past scores from the exercises. If it appears that the user has had difficulties in playing the notes (pitches) of F and B, for example, from the total of C, D, E, F, G, A, B presented in a random order to the user, then the system may present and ask the user to produce the notes F and B more than the other five notes. Similarly, if it appears that the user then succeeds in playing the note of F better and better, the system may present the user with the B note more and weight the other notes C, D, E, G, A and the F with similar frequencies. The frequencies with which particular notes or chords (i.e. producing two or more notes simultaneously) of a scale appear may be initially identical, and after the system gathers scores the particular notes with low scores (low rate of success to produce the note) may be presented with higher frequency than the others. This way the system uses the user's scores as his/her skill level and asks the user to produce the notes that he/she has the lowest score with.
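The frequency weighting described above can be sketched as a weighted random draw. This is an illustrative sketch: the `failure_counts` structure and the +1 smoothing (which keeps every note in rotation even with no recorded failures) are assumptions, not the specific weighting of the specification.

```python
import random

def next_note(failure_counts, rng=random):
    """Pick the next note to present, weighting each note by its stored
    failure count so that troublesome notes appear more often."""
    notes = list(failure_counts)
    weights = [failure_counts[n] + 1 for n in notes]
    return rng.choices(notes, weights=weights, k=1)[0]
```

With `{"C": 0, "F": 9}` the note F is drawn about ten times as often as C, mirroring the F-and-B example above.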
At 604, the user is presented with graphical content, such as musical notation. Method item 605 represents the user producing at least one note on a musical instrument. The instrument may produce acoustic sound or electric signals (that are initially analog or digital). The user preferably, to his/her best knowledge and skills, tries to produce the presented musical notation with correct playing techniques, as may be presented by the system.
At 606, a play signal in accordance with the one or more user-produced notes is obtained. At 608, the play signal is processed and at least the note produced by the user on the instrument is recognized. Optionally also the source instrument of the play signal may be recognized, and image recognition for image or video input may be executed.
At 610, the processed play signal data is compared with the graphical content data to determine if the play signal corresponds to the presented graphical content, i.e. whether the user has managed to produce the note(s) corresponding to the note(s) that were presented by the system to them, optionally in correct duration, timing, rhythm, etc. This information may be processed and/or stored as user score data.
At 612, the result of the comparison of item 610 is assigned a score, which is further preferably stored.
At 614, the score is presented to the user. Essentially, the result of the comparison, i.e., whether the user succeeded in producing the correct note or chords, etc., is presented to the user via the display for immediate feedback so that the user learns from their mistakes. Optionally, in addition to the score, instructions of the correct note, hand and finger position, etc., may be presented to the user. This way the user may learn notes and chords one by one with feedback after each try (i.e. execution of method steps). Method item 615 comprises stored score data (at least the one produced by the user at 605), which is used by the system at 603 to determine the next graphical content or exercise presented to the user. The method item may be carried out essentially at the same time as 612 or 614 but preferably such that the user has had time to view their score at 614 before the method moves to item 604 again via 603 and new graphical content is presented.
The method may be repeated, starting essentially from method item 603. Therein, the graphical content to be presented to the user is determined in relation to at least one stored score, such as the score of the previous execution of the method or optionally another previously stored result. Optionally, the graphical content presented to the user may be determined in accordance with their skill level. This way the system determines which musical notations, such as individual notes or chords, and optionally which hand positions and postures, the user has had the most difficulty with, and determines the next musical notation accordingly, providing the user with graphical content to play against so as to improve the aspects the user has struggled with most. The success rate therein preferably pertains to the user's success in correctly producing a specific note or notes in relation to the presented musical notation. Hand positioning and posture may optionally be determined, and feedback on them provided, at the same time as the note is played, or in real time while the user is playing and/or moving in relation to an instrument.
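The adaptive selection at 603 can be sketched as weighting each notation item by its failure rate, so that items the user struggles with are drawn more often. All names here are hypothetical, and the simple weighting stands in for whatever heuristic an actual implementation would use.

```python
import random

# score_history maps each notation item to its list of past attempt
# results (True = produced correctly), as accumulated at 615.
def next_exercise_item(score_history, rng=random):
    """Pick the next item, biased toward items with a low success rate."""
    weights = {}
    for item, attempts in score_history.items():
        success_rate = sum(attempts) / len(attempts) if attempts else 0.0
        weights[item] = 1.0 - success_rate + 0.1  # floor keeps mastered items in rotation
    items = list(weights)
    return rng.choices(items, weights=[weights[i] for i in items], k=1)[0]

history = {"C major chord": [True, True, True],
           "G7 chord": [False, False, True],
           "F major chord": [True, False, False]}
print(next_exercise_item(history))  # usually one of the two missed chords
```

The small constant floor on each weight reflects the description's point that content may be repeatedly presented: even well-mastered items occasionally recur rather than vanishing entirely.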
Various items of graphical content may be repeatedly presented to the user to improve the user's ability to produce them on the instrument. Altogether, repeating the method items as described may constitute an exercise to practice notes, intervals, chords, timing, etc., in accordance with the predetermined objective of the exercise. This way the user is presented with customized graphical content during the exercise. For example, a song may be practiced using the method, with the method repeating the notes that the user had difficulty with or played wrong. Thus the method teaches the user step by step, instead of only providing feedback on whether a note or a song was played correctly to some degree. As discussed, the method may constitute e.g. an exercise or a whole song gone through note by note, phrase by phrase, etc.
Obviously, the user may also be presented with information on whether the note(s) produced by the user were correct and, additionally and optionally, whether the user's note fingering, interval fingering, chord fingering or hand posture was correct when producing the note(s) (represented as the play signal).
At 616, referred to as the end phase, the method ends, e.g. via a user command terminating it.
The method may also incorporate aspects of gamification, comprising various goals and skill levels reached via the exercises, as well as collaborative or competitive play against other user(s). The method items may be stored as, and executed by, a computer program product embodied in a non-transitory carrier medium, such as an optical disc or a memory card.
The scope of the invention is determined by the attached claims together with the equivalents thereof. The skilled person will again appreciate that the disclosed embodiments were constructed for illustrative purposes only, and that the inventive concept reviewed herein covers further embodiments, embodiment combinations, variations and equivalents that better suit each particular use case of the invention.

Claims
1. A system for teaching a user to play a musical instrument from musical notation via virtual exercises comprising:
-at least one electronic device comprising at least a processing entity, a memory entity and a display,
-means for forming or receiving at least one play signal produced by a user on a musical instrument,
-the processing entity being arranged to provide at least graphical musical notation content, such as a single note or a chord, via the display,
-the processing entity being further arranged to obtain at least play signal data via said means for forming or receiving at least one play signal,
-the processing entity being further arranged to execute audio recognition to recognize the play signal data and compare it with data relating to the presented graphical content to at least determine if the play signal corresponds to said at least graphical musical notation content,
-the processing entity being further arranged to assign a score to represent the result of said comparison and store said score,
-the processing entity being further arranged to present the score to the user via the display, and
-to determine, at least on the basis of said score and other stored scores, further at least graphical musical notation content, to be presented to the user via the display.
2. The system of any preceding claim, further comprising providing graphical content selected from the group consisting of note fingering, interval fingering, chord fingering and hand posture in relation to the instrument.
3. The system of any preceding claim, comprising image acquisition means for acquiring image data of the user's note fingering, interval fingering, chord fingering or hand posture in relation to the instrument.
4. The system of any preceding claim, comprising a hand-wearable device, said device further comprising processing means to detect location in relation to an instrument or means to utilize the processing means of the electronic device for recognizing the user's note fingering, interval fingering, chord fingering or hand posture in relation to the instrument.
5. The system of claim 3, wherein the processing entity is further arranged to execute image recognition for determining the user's note fingering, interval fingering, chord fingering or hand posture in relation to the instrument from the acquired image data, and to compare said determined note fingering, interval fingering, chord fingering or hand posture with data relating to the presented graphical content to at least determine if the user's note fingering, interval fingering, chord fingering or hand posture corresponds with the correct note fingering, interval fingering, chord fingering or hand posture associated with the provided graphical musical notation content.
6. The system of claim 5, wherein the processing entity is further arranged to assign a score to represent the result of said comparison involving image data and store said score.
7. The system of claim 6, wherein the processing entity is further arranged to present the score relating to image data to the user via the display.
8. The system of claim 7, wherein the processing entity is further arranged to determine, at least on the basis of said score relating to image data and other scores, further at least graphical musical notation content, to be presented to the user via the display.
9. A method for teaching a user to play a musical instrument from musical notation via virtual exercises comprising:
-presenting a user with graphical musical notation, such as a single note or a chord, optionally in a random order or in any appropriate order,
-obtaining at least one play signal produced by the user on a musical instrument,
-executing audio recognition to recognize the play signal,
-comparing the recognized play signal with the graphical musical notation and determining if the play signal corresponds to said graphical content,
-assigning a score to represent the result of said comparison,
-presenting the score to the user,
-determining, at least on the basis of said score and other stored scores, further at least graphical musical notation content, to be presented to the user.
10. The method of claim 9, comprising the method item of presenting a user with at least graphical musical notation in relation to an instrument, optionally presenting the location of the at least one note of said musical notation on the instrument.
11. The method of any of claims 9-10, comprising the method item of presenting a user with graphical musical notation with corresponding instrument fingering, hand posture or body posture.
12. The method of any of claims 9-11, comprising the method item of determining graphical musical notation to be presented to the user also in relation to the recognized user's note fingering, interval fingering, chord fingering, body posture or hand posture.
13. A computer program product embodied in a non-transitory computer readable medium, comprising computer code for causing the computer to execute the method items of claim 9.
PCT/FI2016/050598 2015-09-04 2016-08-31 System for teaching a user to play a musical instrument from musical notation via virtual exercises and a method thereof WO2017037342A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20155639 2015-09-04
FI20155639A FI20155639A (en) 2015-09-04 2015-09-04 A system and method for teaching a user to play an instrument from musical notes through virtual exercises

Publications (1)

Publication Number Publication Date
WO2017037342A1 (en) 2017-03-09

Family

ID=58186793

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2016/050598 WO2017037342A1 (en) 2015-09-04 2016-08-31 System for teaching a user to play a musical instrument from musical notation via virtual exercises and a method thereof

Country Status (2)

Country Link
FI (1) FI20155639A (en)
WO (1) WO2017037342A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000056669A (en) * 1998-08-10 2000-02-25 Yamaha Corp Device and method for detecting musical instrument performance operation and instructing musical instrument performance operation, and recording medium stored with their program
JP2003066957A (en) * 2001-08-27 2003-03-05 Casio Comput Co Ltd Performance practice device and performance practice processing program
JP2006091633A (en) * 2004-09-27 2006-04-06 Casio Comput Co Ltd Musical performance evaluation system and program of performance evaluation processing
JP2006091631A (en) * 2004-09-27 2006-04-06 Casio Comput Co Ltd System and program for managing musical performance practice
US20090019990A1 (en) * 2007-07-16 2009-01-22 Industrial Technology Research Institute Method and apparatus for keyboard instrument learning
US20110003638A1 (en) * 2009-07-02 2011-01-06 The Way Of H, Inc. Music instruction system
WO2011030225A2 (en) * 2009-09-14 2011-03-17 Joytunes, Ltd. System and method for improving musical education
US20120057012A1 (en) * 1996-07-10 2012-03-08 Sitrick David H Electronic music stand performer subsystems and music communication methodologies
KR20140142794A (en) * 2013-06-04 2014-12-15 김부전 Keyboard apparatus for music lesson
US20150059556A1 (en) * 2013-09-05 2015-03-05 Keith Grafman System and method for learning to play a musical instrument

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018165773A1 (en) * 2017-03-13 2018-09-20 Asate Ag Device and method for training the muscles and connective tissue in the oral cavity and throat of a person, particularly for long-term avoidance of airway and sleep disorders and the consequences thereof
EP3376416A1 (en) * 2017-03-13 2018-09-19 Asate AG Device and method for training muscle and connective tissue in the oral cavity and throat of a person, in particular for long-term prevention of respiratory and sleep disorders and their consequences
CN111061365A (en) * 2019-11-28 2020-04-24 武汉渲奇数字科技有限公司 Bell set playing guide system based on VR technology
CN111061365B (en) * 2019-11-28 2023-05-30 武汉渲奇数字科技有限公司 Bell playing guiding system based on VR technology
CN111028615A (en) * 2019-11-29 2020-04-17 尤剑 Intelligent musical instrument playing teaching method, system and storage medium
CN112233497B (en) * 2020-10-23 2022-02-22 郑州幼儿师范高等专科学校 Piano playing finger force exercise device
CN112233497A (en) * 2020-10-23 2021-01-15 郑州幼儿师范高等专科学校 Piano playing finger force exercise device
US11670188B2 (en) 2020-12-02 2023-06-06 Joytunes Ltd. Method and apparatus for an adaptive and interactive teaching of playing a musical instrument
US11893898B2 (en) 2020-12-02 2024-02-06 Joytunes Ltd. Method and apparatus for an adaptive and interactive teaching of playing a musical instrument
US11900825B2 (en) 2020-12-02 2024-02-13 Joytunes Ltd. Method and apparatus for an adaptive and interactive teaching of playing a musical instrument
CN113255470A (en) * 2021-05-06 2021-08-13 李岱勋 Multi-mode piano partner training system and method based on hand posture estimation
CN113255470B (en) * 2021-05-06 2024-04-02 李岱勋 Multi-mode piano accompany training system and method based on hand gesture estimation
CN113257210A (en) * 2021-06-02 2021-08-13 南京邮电大学 Multi-mode music score transformation method and system for copper or wood musical instrument
CN113257210B (en) * 2021-06-02 2023-10-24 南京邮电大学 Multi-mode spectrum conversion method and system for copper or wooden musical instrument
US11972693B2 (en) 2021-11-18 2024-04-30 Joytunes Ltd. Method, device, system and apparatus for creating and/or selecting exercises for learning playing a music instrument

Also Published As

Publication number Publication date
FI20155639A (en) 2017-03-05

Similar Documents

Publication Publication Date Title
WO2017037342A1 (en) System for teaching a user to play a musical instrument from musical notation via virtual exercises and a method thereof
US10339829B2 (en) System and method for learning to play a musical instrument
Jensenius Action-sound: Developing methods and tools to study music-related body movement
US9218748B2 (en) System and method for providing exercise in playing a music instrument
JP7238794B2 (en) Information processing device, information processing method and program
US11557269B2 (en) Information processing method
Percival et al. Effective use of multimedia for computer-assisted musical instrument tutoring
JP7367690B2 (en) information processing equipment
US20220398937A1 (en) Information processing device, information processing method, and program
JPWO2020100671A1 (en) Information processing equipment, information processing methods and programs
US20150242797A1 (en) Methods and systems for evaluating performance
Hummel et al. Interactive sonification of German wheel sports movement
Allingham et al. Motor performance in violin bowing: Effects of attentional focus on acoustical, physiological and physical parameters of a sound-producing action
Cosentino et al. Natural human–robot musical interaction: understanding the music conductor gestures by using the WB-4 inertial measurement system
CN114170868A (en) Intelligent piano training method and system
CN110910712A (en) Zheng auxiliary teaching system and method based on AR
Rhodes et al. Towards Developing a Virtual Guitar Instructor through Biometrics Informed Human-Computer Interaction
Brown et al. A case study in collaborative learning via participatory music interactive systems: Interactive Tango Milonga
US20230024727A1 (en) System and method for learning to play a musical instrument
US20230386155A1 (en) Virtual, augmented or mixed reality instrument teaching system and method
KR102545185B1 (en) Apparatus for training and testing to improve cognitive auditory function by sensing, method and program of the same
KR102556571B1 (en) Apparatus for training and testing to improve cognitive and auditory function, method and program of the same
EP4332957A2 (en) Virtual, augmented or mixed reality instrument teaching system and method
Lo et al. Deep Violin: Deep Learning-Based Violin Bowing Training System
WO2023105601A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16840884; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16840884; Country of ref document: EP; Kind code of ref document: A1)