CN113870067A - Intelligent music learning method - Google Patents

Intelligent music learning method

Info

Publication number
CN113870067A
CN113870067A
Authority
CN
China
Prior art keywords
rhythm
music
file
exercise
standard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111010873.7A
Other languages
Chinese (zh)
Other versions
CN113870067B (en)
Inventor
林东姝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yiqi Network Technology Co ltd
Original Assignee
Beijing Yiqi Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yiqi Network Technology Co ltd filed Critical Beijing Yiqi Network Technology Co ltd
Priority to CN202111010873.7A priority Critical patent/CN113870067B/en
Priority claimed from CN202111010873.7A external-priority patent/CN113870067B/en
Publication of CN113870067A publication Critical patent/CN113870067A/en
Application granted granted Critical
Publication of CN113870067B publication Critical patent/CN113870067B/en
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25Integrating or interfacing systems involving database management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Electrically Operated Instructional Devices (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

The invention discloses an intelligent music learning method comprising the following steps: providing a music list, recording the exercise rhythm, correcting the exercise rhythm, and recording the exercise. The method assists students in learning music intelligently and interactively: it collects, counts, and screens the music types students like, provides corresponding rhythm learning, and corrects and collects errors made during the learning process.

Description

Intelligent music learning method
Technical Field
The invention relates to the field of online education, in particular to an intelligent music learning method.
Background
The audio-visual multi-modal teaching method is based on the observation that humans perceive the world through five senses (vision, hearing, touch, smell, and taste), and that each mode of interaction between a sense and the external environment constitutes a modality. The five sensory channels thus give rise to five communication modalities: visual, auditory, tactile, olfactory, and gustatory. A teaching method that uses multiple channels and multiple teaching means to engage several of the learner's senses cooperatively, so as to deepen impressions and strengthen memory, is an audio-visual multi-modal teaching mode.
Rhythm is a special form of expression in human language. The sense of musical rhythm is the ability to perceive the duration of sounds in music; it has both motor and emotional components, encompassing the ability to feel the emotion a rhythm expresses and to reproduce that rhythm accurately. The primary-school years are the golden period for absorbing musical rhythm, and for pupils, learning rhythm well is the premise and basis of learning music well. At present, lower-grade primary students tend to have unstable rhythm and to lack rhythm knowledge when learning musical rhythm, and one reason lies in teaching equipment: current rhythm teaching relies solely on a teacher's live demonstration.
Disclosure of Invention
To remedy the defects of the prior art, the invention provides the following technical solution:
an intelligent music learning method comprises the following steps:
providing a music list;
recording the exercise rhythm;
correcting the exercise rhythm;
and recording the exercise.
The intelligent music learning method further comprises a learning database. The learning database stores learning materials, which are music rhythm learning materials; each comprises a music audio file and the standard rhythm file corresponding to it. Each music file in the learning database at least comprises the music content (i.e. the audio data), a rhythm type, a music style, and the like. The standard rhythm file comprises time and rhythm point data.
In the above intelligent music learning method, the exercise rhythm record file is generated as follows: when the music audio is played, recording of the student's rhythm striking information begins; the striking information can be acquired through a striking unit, which includes a beat collector, a simulated drum kit, or the input unit of a computer or mobile device.
In the above intelligent music learning method, after the student completes the exercise an audio file is obtained, and a sound intensity map of the audio file can be generated with time as the x-axis and sound intensity as the y-axis. The sound intensity map is divided into time windows of length $T_w$; each time window is one time unit, the $i$-th time unit being $[(i-1)T_w,\, iT_w]$, where $i$ is a natural number.
In the above intelligent music learning method, the sound intensity map is represented by a sound intensity curve $V(t)$. The curve length integral of $V(t)$ over a time period $[t, t+w]$ is

$$Ly = \int_{t}^{t+w} \sqrt{1 + \big(V'(\tau)\big)^2}\, d\tau,$$

or, in discrete form,

$$Ly = \sum_{k} \sqrt{(\Delta V_k)^2 + (\Delta t)^2},$$

where $\Delta V_k = V_k - V_{k-1}$, $1 < k < N$, $N$ is the total number of sampling points, and $\Delta t$ is the integration window, i.e. the granularity of the integration.
In the above intelligent music learning method, the exercise rhythm record file is represented by $B(Ly(T_w, i))$ for each time unit $T_i$, computed as follows: starting from time 0, $Ly(T_w, i)$ is calculated for each time unit $T_w$ one by one; if the last time window is shorter than $T_w$, it is padded with zeros. A binary filtering is then applied to each $Ly(T_w, i)$: when $Ly(T_w, i) > H$, $B(Ly(T_w, i)) = 1$; when $Ly(T_w, i) \le H$, $B(Ly(T_w, i)) = 0$.
In the above intelligent music learning method, the time information in the standard rhythm file is likewise composed of a plurality of consecutive time units, and the granularity of the time units in the standard rhythm data is finer than that of the time units in the exercise standard rhythm data, that is: $T_w(\text{standard rhythm file}) < T_w(\text{exercise standard rhythm file})$.
In the above intelligent music learning method, the time units of the exercise standard rhythm file are generated according to the following rule. Denote the time unit of the standard rhythm file by $Twb$ and the time unit of the exercise standard rhythm file by $Twp$, with $Twp = M \times Twb$; that is, the duration of a time unit in the exercise standard rhythm data is $M$ times the duration of a time unit of the standard rhythm file. Then:

$$Twp_j = \big(Twb_{1+(j-1)M},\, Twb_{2+(j-1)M},\, \dots,\, Twb_{M+(j-1)M}\big).$$

A rhythm point is denoted $Beat$; the rhythm points of the standard rhythm file are denoted $Beatb$ and those of the exercise standard rhythm file $Beatp$. Then:

$$Beatp_j = \begin{cases} 1, & \text{if any of } Beatb_{1+(j-1)M}, \dots, Beatb_{M+(j-1)M} \text{ is "yes"},\\ 0, & \text{otherwise.} \end{cases}$$

If all rhythm points marked "yes" in the standard rhythm file are denoted $BeatY = \{BeatY_1, BeatY_2, \dots, BeatY_K\}$, $K$ being the total number of "yes" rhythm points in the standard rhythm file, and the times corresponding to the rhythm points $BeatY$ are denoted $TY = \{TY_1, TY_2, \dots, TY_K\}$, then $Twp \le \min_k \{TY_k - TY_{k-1}\}$, where $k$ is a natural number and $k \le K$.
In the above intelligent music learning method, the exercise judgment unit obtains the exercise standard rhythm data file corresponding to the music audio file played by the exercise content playing unit, together with the exercise rhythm record file generated after the student completes the exercise, and compares the two. The comparison proceeds as follows: the rhythm point values in each time unit of the exercise rhythm record file and of the exercise standard rhythm data file are compared one by one; if $Beatp = B(Ly)$, the student's rhythm strike is judged correct, otherwise it is judged wrong.
In the above intelligent music learning method, the exercise recording method comprises: collecting the rhythms the student struck correctly and incorrectly during learning; the correctly struck rhythms receive consolidating training in the next lesson, and the incorrectly struck rhythms receive focused training in the next lesson.
The intelligent music learning method provided by the invention assists students in learning music intelligently and interactively: it collects, counts, and screens the music types students like, provides corresponding rhythm learning, and corrects and collects errors made during the learning process.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a logic structure of an intelligent music learning method according to an embodiment of the present invention;
FIG. 2 is a graph of sound intensity of an audio file according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a recording file for practicing tempo provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of a standard tempo file according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides an intelligent music learning method, which is applied to music online education for pupils.
As shown in fig. 1, the intelligent music learning method includes the following steps:
step 1, providing a music list.
In this step, music is selected from the learning database and presented as a music list, so that the student can choose favorite pieces from it; the choices are then analyzed to obtain the music styles and rhythm types the student is interested in.
Specifically, the learning database stores learning materials, which are music rhythm learning materials; each comprises a music audio file and the standard rhythm file corresponding to it. Preferably, each music file in the learning database further includes at least the music content (i.e. the audio data), a rhythm type, a music style, and the like. The standard rhythm file comprises time and rhythm point data.
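For concreteness, here is a minimal Python sketch of such a learning-material record. All names (`StandardRhythmFile`, `LearningMaterial`, `audio_path`, `rhythm_type`, `music_style`, `time_unit_ms`, `beats`) are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StandardRhythmFile:
    """Standard rhythm file: time and rhythm-point data."""
    time_unit_ms: int                               # Twb: duration of one time unit
    beats: List[int] = field(default_factory=list)  # Beatb per time unit: 1 = "yes", 0 = no beat

@dataclass
class LearningMaterial:
    """One music rhythm learning material in the learning database."""
    audio_path: str                                 # the music audio file (music content)
    rhythm_type: str                                # e.g. "4/4"
    music_style: str                                # e.g. "march"
    standard_rhythm: Optional[StandardRhythmFile] = None
```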
Furthermore, when a student uses the method to learn music for the first time, the learning database randomly selects music for the music list, collects the music segments the student finds interesting and uninteresting during learning, and from these judgments derives the music styles and rhythm types of interest. For example, when the student hears a piece of music and finds it interesting, it can be labeled as such. Preferably, the intelligent music learning method of the present invention can be implemented on a tablet computer, a mobile phone, or another terminal device with an input function. When implemented on a computer, interest can be expressed through a keyboard or mouse by a corresponding key code or click position; for example, pressing Enter indicates interest, as does clicking a corresponding icon on the screen. When implemented on a mobile terminal with touch input, the student can express interest through a gesture (for example, swiping the screen left or right) or by double-tapping or multi-tapping the screen. Similarly, corresponding operations let the student indicate that the currently playing music is uninteresting. The interest judgment unit acquires the set of music the student is interested in and analyzes it to obtain the music styles and rhythm types (such as 4/4 time) of interest, as the sketch below illustrates.
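The patent does not prescribe how the liked set is analyzed; a simple frequency count is one plausible reading. A minimal sketch under that assumption, reusing the hypothetical `LearningMaterial` record above:

```python
from collections import Counter
from typing import List, Optional, Tuple

def preferred_styles_and_rhythms(
    liked: List[LearningMaterial],
) -> Tuple[Optional[str], Optional[str]]:
    """Most frequent music style and rhythm type among the liked materials."""
    if not liked:
        return None, None
    style = Counter(m.music_style for m in liked).most_common(1)[0][0]
    rhythm = Counter(m.rhythm_type for m in liked).most_common(1)[0][0]
    return style, rhythm
```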
Step 2: recording the exercise rhythm.
Specifically, in the music learning process the exercise rhythm record file is generated as follows: when the music audio is played, recording of the student's rhythm striking information begins. The striking information can be acquired through a striking unit, which may be something like a beat collector or a simulated drum kit, or the input unit of a computer or mobile device. Preferably, the striking unit may be shaped like a percussion instrument, such as a drum, xylophone, celesta, tubular bells, snare drum, bass drum, triangle, tambourine, rattle, or the like.
The exercise rhythm is recorded over the whole process, i.e. the student's rhythm striking information is recorded from the moment the music audio starts playing. For wider applicability, the recorded content is the audio signal of the student's practice environment. For example, when a student practices on a bus through a mobile phone, tapping some part of the phone such as the screen or the back shell, the exercise rhythm record file contains all the sounds on the bus, including the ambient noise as well as the student's taps. Likewise, if the student practices in a music room, beating a tambourine, all the sounds in the music room are contained in the exercise rhythm record file.
Preferably, to prevent the music audio file from interfering with the exercise rhythm record file, the music audio can be separated from the environmental sounds; for example, playing the music through headphones or in-ear earphones effectively prevents the played music from masking the student's rhythm strikes in the recording.
Step 3: correcting the exercise rhythm.
In this step, the exercise rhythm recorded during music learning in the step above is compared with the standard rhythm file in the learning database.
Specifically, after the student completes the exercise (of course, the student can actively stop or terminate the exercise), an audio file is obtained; as shown in fig. 2, a sound intensity map of the audio file may be generated with time as the x-axis and sound intensity as the y-axis.
Time windows are generated according to $T_w$; each time window can be understood as a sampling period, also referred to in the present invention as a time unit. For example, in the exercise standard rhythm data the time information is a sequence of small time periods, i.e. time units: {time unit 1, time unit 2, time unit 3, …}, where adjacent time units are consecutive in time. For example, time unit 1 $= [0, T_w]$, time unit 2 $= [T_w, 2T_w]$, time unit 3 $= [2T_w, 3T_w]$; in general the $i$-th time unit is $[(i-1)T_w,\, iT_w]$, where $i$ is a natural number.
Assume the sound intensity curve is denoted by $V(t)$. If $V(t)$ is continuously differentiable within a time period $[t_1, t_2]$, the length of the curve within $[t_1, t_2]$, denoted $Ly$, is

$$Ly = \int_{t_1}^{t_2} \sqrt{1 + \big(V'(\tau)\big)^2}\, d\tau.$$

In particular, the curve length integral of $V(t)$ over a time period $[t, t+w]$ is

$$Ly = \int_{t}^{t+w} \sqrt{1 + \big(V'(\tau)\big)^2}\, d\tau,$$

for example over $[0, T_w]$,

$$Ly = \int_{0}^{T_w} \sqrt{1 + \big(V'(\tau)\big)^2}\, d\tau.$$

In discrete form this is

$$Ly = \sum_{k} \sqrt{(\Delta V_k)^2 + (\Delta t)^2},$$

where $\Delta V_k = V_k - V_{k-1}$, $1 < k < N$, $N$ is the total number of sampling points, and $\Delta t$ is the integration window, i.e. the granularity of the integration; for example, over the time period $[0, T_w]$ the sum runs over the $T_w/\Delta t$ samples in that window.
Starting from time 0, $Ly(T_w, i)$ is calculated for each time unit $T_w$ one by one; if the last time window is shorter than $T_w$, it is padded with zeros. A binary filtering is applied to each $Ly(T_w, i)$ against a threshold $H$: when $Ly(T_w, i) > H$, $B(Ly(T_w, i)) = 1$; when $Ly(T_w, i) \le H$, $B(Ly(T_w, i)) = 0$. This yields $B(Ly(T_w, i))$ for each time unit $T_i$ and hence an exercise rhythm record file as illustrated in fig. 3. A sketch of the whole computation follows.
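A minimal sketch of the windowing, discrete curve-length, zero-padding, and binary-filtering steps just described. It assumes numpy and uniformly sampled intensity values; the function and parameter names are illustrative, and $H$ and $T_w$ must be chosen by the practitioner.

```python
import numpy as np

def exercise_rhythm_record(v: np.ndarray, dt: float, tw: float, h: float) -> np.ndarray:
    """Binary record B(Ly(Tw, i)), one value per time unit of length tw.

    v  -- sampled sound intensity values V_k, one sample every dt seconds
    dt -- sampling step (the integration granularity, delta t)
    tw -- time-unit length Tw in seconds
    h  -- binary-filtering threshold H
    """
    n = int(round(tw / dt))                 # samples per time unit
    pad = (-len(v)) % n                     # zero-pad the last, possibly short, window
    v = np.concatenate([v, np.zeros(pad)])
    record = []
    for i in range(len(v) // n):
        window = v[i * n:(i + 1) * n]
        dv = np.diff(window)
        # Discrete curve length: sum of sqrt((dV_k)^2 + (dt)^2) over the window.
        ly = float(np.sum(np.sqrt(dv * dv + dt * dt)))
        record.append(1 if ly > h else 0)
    return np.asarray(record)
```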
Note that the standard tempo file includes time and tempo point data as shown in fig. 4:
the time information in the standard rhythm file is also composed of a plurality of continuous time units, and generally, the granularity of the time units in the standard rhythm data is finer than that of the time units in the exercise standard rhythm data;
that is to say: t iswStandard rhythm file < TwTraining a standard rhythm file;
for example, the time unit in the standard tempo data is 2 msec, the time unit in the exercise standard tempo data may be 10 msec, 20 msec, 50 msec, or the like, and the duration of the time unit in the exercise standard tempo data is an integral multiple of the duration of the time unit of the standard tempo file. Generating a time unit of the exercise standard rhythm file according to the following rules:
the time unit of the standard tempo file is denoted by Twb, the time unit of the exercise standard tempo is denoted by Twp, which is M × Twb, and the time period of the time unit in the exercise standard tempo data is M times the time period of the time unit of the standard tempo file. Then:
Twpj=(Twbj+(j-1)*M,Twbj+1+(j-1)*M,...,TwbM+(j-1)*M);
the rhythm point is represented by Beat, the rhythm point of the standard rhythm file is represented by Beatb, and the rhythm point in the exercise standard rhythm file is represented by Beatp.
Then:
$$Beatp_j = \begin{cases} 1, & \text{if any of } Beatb_{1+(j-1)M},\, \dots,\, Beatb_{M+(j-1)M} \text{ is "yes"},\\ 0, & \text{otherwise.} \end{cases}$$
note that M should not be selected too large, otherwise (Beatb)j+(j-1)*M,Beatbj+1+(j-1)*M,...,BeatbM+(j-1)*M) There may be cases where two or more of the Beatb values are yes. Therefore, in a preferred case, if all the rhythm points that are "yes" in the standard rhythm file are represented by BeatY, { BeatY1, BeatY2,. and BeatK }, K is the total number of rhythm points that are all "yes" in the standard rhythm file; the time corresponding to the rhythm point BeatY is denoted by TY, and TY ═ TY1, TY2,.., TK }; twp is less than or equal to Min { Tyk-Tyk-1K is a natural number and not more than K. This ensures (Beatb)j+(j-1)*M,Beatbj+1+(j-1)*M,...,BeatbM+(j-1)*M) There is no case where two or more than two Beatb values are yes.
The exercise judgment unit acquires the exercise standard rhythm data file corresponding to the music audio file played by the exercise content playing unit, together with the exercise rhythm record file generated after the student completes the exercise, and compares the two files. The comparison proceeds as follows:
and comparing rhythm point values in each time unit in the exercise rhythm record file and the exercise standard rhythm data file one by one, if Beatp is B (Ly), judging that the student rhythm striking is correct, and otherwise, judging that the student rhythm striking is wrong.
Step 4: recording the exercise.
In this step, the rhythms the student struck correctly and incorrectly during learning are collected; correctly struck rhythms receive consolidating training in the next lesson, while incorrectly struck rhythms receive focused training in the next lesson.
furthermore, music segments which are interesting and uninteresting to students in the learning process of the students are collected and analyzed to obtain music styles and rhythm types which are interesting to the students, and the favorite music styles and rhythm types are selected for the students to practice in the next learning process, so that the effect of getting twice the result with half the effort is brought, and the students can be helped to master the beating of the rhythm as soon as possible.
While certain exemplary embodiments of the present invention have been described above by way of illustration only, it will be apparent to those of ordinary skill in the art that the described embodiments may be modified in various different ways without departing from the spirit and scope of the invention. Accordingly, the drawings and description are illustrative in nature and should not be construed as limiting the scope of the invention.

Claims (10)

1. An intelligent music learning method, the method comprising the steps of:
providing a music list;
recording the exercise rhythm;
correcting the exercise rhythm;
and recording the exercise.
2. The intelligent music learning method of claim 1, further comprising a learning database; the learning database stores learning materials, which are music rhythm learning materials, each comprising a music audio file and the standard rhythm file corresponding to it; each music file in the learning database at least comprises music content, a rhythm type, a music style, and the like, and the standard rhythm file comprises time and rhythm point data.
3. The intelligent music learning method according to claim 2, wherein the exercise rhythm record file is generated by: when the music audio is played, beginning to record the student's rhythm striking information, the striking information being acquirable through a striking unit, the striking unit including a beat collector, a simulated drum kit, or the input unit of a computer or mobile device.
4. The intelligent music learning method according to claim 3, wherein the student obtains an audio file after completing the exercise, and a sound intensity map of the audio file can be generated with time as the x-axis and sound intensity as the y-axis; the sound intensity map is divided into time windows of length $T_w$, each time window being one time unit, the $i$-th time unit being $[(i-1)T_w,\, iT_w]$, where $i$ is a natural number.
5. The intelligent music learning method of claim 4, wherein the sound intensity map is represented by a sound intensity curve $V(t)$, and the curve length integral of $V(t)$ over a time period $[t, t+w]$ is

$$Ly = \int_{t}^{t+w} \sqrt{1 + \big(V'(\tau)\big)^2}\, d\tau,$$

or, in discrete form,

$$Ly = \sum_{k} \sqrt{(\Delta V_k)^2 + (\Delta t)^2},$$

where $\Delta V_k = V_k - V_{k-1}$, $1 < k < N$, $N$ is the total number of sampling points, and $\Delta t$ is the integration window, i.e. the granularity of the integration.
6. The intelligent music learning method of claim 5, characterized in that the exercise rhythm record file is represented by $B(Ly(T_w, i))$ for each time unit $T_i$; $B(Ly(T_w, i))$ for each time unit $T_i$ is computed as follows: starting from time 0, $Ly(T_w, i)$ is calculated for each time unit $T_w$ one by one, the last time window being padded with zeros if it is shorter than $T_w$; a binary filtering is performed for each $Ly(T_w, i)$: when $Ly(T_w, i) > H$, $B(Ly(T_w, i)) = 1$, and when $Ly(T_w, i) \le H$, $B(Ly(T_w, i)) = 0$.
7. The intelligent music learning method according to claim 6, wherein the time information in the standard rhythm file is also composed of a plurality of consecutive time units, and the granularity of the time units in the standard rhythm data is finer than that of the time units in the exercise standard rhythm data, that is: $T_w(\text{standard rhythm file}) < T_w(\text{exercise standard rhythm file})$.
8. The intelligent music learning method according to claim 7, wherein the time units of the exercise standard rhythm file are generated according to the following rule: the time unit of the standard rhythm file is denoted by $Twb$ and the time unit of the exercise standard rhythm file by $Twp$, with $Twp = M \times Twb$, the duration of a time unit in the exercise standard rhythm data being $M$ times the duration of a time unit of the standard rhythm file; then:

$$Twp_j = \big(Twb_{1+(j-1)M},\, Twb_{2+(j-1)M},\, \dots,\, Twb_{M+(j-1)M}\big);$$

a rhythm point is denoted $Beat$, the rhythm points of the standard rhythm file are denoted $Beatb$, and the rhythm points of the exercise standard rhythm file are denoted $Beatp$; then:

$$Beatp_j = \begin{cases} 1, & \text{if any of } Beatb_{1+(j-1)M}, \dots, Beatb_{M+(j-1)M} \text{ is "yes"},\\ 0, & \text{otherwise;} \end{cases}$$

if all rhythm points marked "yes" in the standard rhythm file are denoted $BeatY = \{BeatY_1, BeatY_2, \dots, BeatY_K\}$, $K$ being the total number of "yes" rhythm points in the standard rhythm file, and the times corresponding to the rhythm points $BeatY$ are denoted $TY = \{TY_1, TY_2, \dots, TY_K\}$, then $Twp \le \min_k \{TY_k - TY_{k-1}\}$, where $k$ is a natural number and $k \le K$.
9. The intelligent music learning method according to claim 8, wherein the exercise judgment unit obtains the exercise standard rhythm data file corresponding to the music audio file played by the exercise content playing unit and the exercise rhythm record file generated after the student completes the exercise, and compares the exercise standard rhythm data file with the exercise rhythm record file; the comparison proceeds as follows: the rhythm point values in each time unit of the exercise rhythm record file and of the exercise standard rhythm data file are compared one by one; if $Beatp = B(Ly)$, the student's rhythm strike is judged correct, otherwise it is judged wrong.
10. The intelligent music learning method according to claim 9, wherein the exercise recording method comprises: collecting the rhythms the student struck correctly and incorrectly during learning; the correctly struck rhythms receive consolidating training in the next lesson, and the incorrectly struck rhythms receive focused training in the next lesson.
CN202111010873.7A 2021-08-31 Intelligent music learning method Active CN113870067B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111010873.7A CN113870067B (en) 2021-08-31 Intelligent music learning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111010873.7A CN113870067B (en) 2021-08-31 Intelligent music learning method

Publications (2)

Publication Number Publication Date
CN113870067A 2021-12-31
CN113870067B (en) 2024-10-25



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140100010A1 (en) * 2009-07-02 2014-04-10 The Way Of H, Inc. Music instruction system
TWM449336U (en) * 2012-09-26 2013-03-21 Univ Southern Taiwan Sci & Tec Rhythmic electronic drum
CN107093347A (en) * 2016-02-18 2017-08-25 起鼓音乐文化有限公司 Intelligence auxiliary percussion music learning system and its method
CN111128100A (en) * 2019-12-20 2020-05-08 网易(杭州)网络有限公司 Rhythm point detection method and device and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG BEIBEI (张贝贝): "Music Game Production Based on Unity3D" ("基于Unity3D的音乐游戏制作"), Modern Information Technology (《现代信息科技》), vol. 5, no. 8, 25 April 2021 (2021-04-25), pages 112-114 *

Similar Documents

Publication Publication Date Title
Hitron et al. Introducing children to machine learning concepts through hands-on experience
Frid Accessible digital musical instruments-a survey of inclusive instruments
CN105070298B (en) The methods of marking and device of polyphony musical instrument
KR101859268B1 (en) System for providing music synchronized with syllable of english words
CN106991852B (en) Online teaching method and device
CN112309365A (en) Training method and device of speech synthesis model, storage medium and electronic equipment
Wang et al. Real-time pitch training system for violin learners
Gerino et al. Towards large scale evaluation of novel sonification techniques for non visual shape exploration
KR100894866B1 (en) Piano tuturing system using finger-animation and Evaluation system using a sound frequency-waveform
Brandmeyer et al. Learning expressive percussion performance under different visual feedback conditions
Bresin et al. Sonification of the self vs. sonification of the other: Differences in the sonification of performed vs. observed simple hand movements
Krout Engaging iPad applications with young people with autism spectrum disorders
CN111695777A (en) Teaching method, teaching device, electronic device and storage medium
JP5346114B1 (en) Educational device and method for music expression and music performance evaluation device
CN113870067B (en) Intelligent music learning method
CN113870067A (en) Intelligent music learning method
Lin et al. Implementation and evaluation of real-time interactive user interface design in self-learning singing pitch training apps
Fonteles et al. User experience in a kinect-based conducting system for visualization of musical structure
Tez et al. Exploring the effect of interface constraints on live collaborative music improvisation.
Yamamoto et al. Livo: Sing a song with a vowel keyboard
Larkin et al. Sonification of bowing features for string instrument training
Jaime et al. A new multiformat rhythm game for music tutoring
Jylhä et al. Design and evaluation of human-computer rhythmic interaction in a tutoring system
CN107146181A (en) A kind of online teaching method and device
CN1530892A (en) Hearing sense recovering method and system for deaf children

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant