CN110136677B - Musical tone control method and related product - Google Patents


Info

Publication number: CN110136677B
Application number: CN201910242677.9A
Authority: CN (China)
Prior art keywords: target, note, feature set, tone data, musical
Legal status: Active (the legal status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN110136677A
Inventors: 刘纯阳, 方家文, 段志尧, 李博琛
Current Assignee: Shenzhen Mango Future Technology Co ltd
Original Assignee: Shenzhen Mango Future Technology Co ltd
Application filed by Shenzhen Mango Future Technology Co ltd

Classifications

    • G06F16/63 — Physics; Computing; Electric digital data processing; Information retrieval of audio data; Querying
    • G06F16/68 — Physics; Computing; Electric digital data processing; Information retrieval of audio data; Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G10H1/0008 — Physics; Musical instruments; Electrophonic musical instruments; Details of electrophonic musical instruments; Associated control or indicating means
    • G10H7/002 — Physics; Musical instruments; Instruments in which the tones are synthesised from a data store, e.g. computer organs, using a common processing for different operations or calculations, and a set of microinstructions (programme) to control the sequence thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Auxiliary Devices For Music (AREA)

Abstract

An embodiment of the present application discloses a musical tone control method and a related product. An electronic device acquires first musical tone data and performs feature extraction on it to obtain a first target note feature set. The device searches a preset database for the first target note feature set to obtain a first reference note feature set that is successfully matched against it, determines the target control instruction corresponding to the first reference note feature set according to a preset correspondence between reference note feature sets and control instructions, and executes the operation corresponding to the target control instruction. In this way, the electronic device can determine the control instruction corresponding to musical tone data by analyzing that data, so a user can control the operation of the device through music, which improves the diversity and intelligence of the device's control modes and makes the device more personalized.

Description

Musical tone control method and related product
Technical Field
The present application relates to the field of audio processing technologies, and in particular, to a musical tone control method and a related product.
Background
With their widespread use, electronic devices have gained more numerous and more powerful functions and are developing toward diversification and personalization, becoming indispensable products in users' lives. Users can control the operation of an electronic device by interacting with it.
Disclosure of Invention
The embodiments of the present application provide a musical tone control method and a related product, which enable a user to control the operation of an electronic device through musical tones, improving the diversity and intelligence of the device's control modes and making the device more personalized.
In a first aspect, an embodiment of the present application provides a musical tone control method, applied to an electronic device, the method including:
acquiring first tone data;
performing feature extraction on the first musical tone data to obtain a first target note feature set, wherein the first target note feature set comprises a plurality of target note features;
searching a preset database for the first target note feature set to obtain a first reference note feature set that is successfully matched against the first target note feature set;
and determining a target control instruction corresponding to the first reference note feature set according to the corresponding relation between a preset reference note feature set and a control instruction, and executing the operation corresponding to the target control instruction.
In a second aspect, an embodiment of the present application provides a musical tone control apparatus applied to an electronic device, the musical tone control apparatus including:
an acquisition unit configured to acquire first tone data;
an extracting unit, configured to perform feature extraction on the first tone data to obtain a first target note feature set, where the first target note feature set includes a plurality of target note features;
a searching unit, configured to search a preset database for the first target note feature set to obtain a first reference note feature set that is successfully matched against the first target note feature set;
and the processing unit is used for determining a target control instruction corresponding to the first reference note feature set according to the corresponding relation between a preset reference note feature set and the control instruction, and executing the operation corresponding to the target control instruction.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in the first aspect of the embodiment of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, with the musical tone control method and related product described in the embodiments of the present application, the electronic device acquires first musical tone data, extracts features from it to obtain a first target note feature set, and searches a preset database to obtain a first reference note feature set that is successfully matched against the target set. The device then determines the target control instruction corresponding to the first reference note feature set according to the preset correspondence between reference note feature sets and control instructions, and executes the operation corresponding to that instruction. The electronic device can thus determine the control instruction corresponding to musical tone data by analyzing that data, so the user can control the operation of the device through music, which improves the diversity and intelligence of the device's control modes and makes the device more personalized.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a musical tone control method according to an embodiment of the present application;
fig. 2 is a flow chart of another tone control method disclosed in the embodiment of the present application;
fig. 3 is a flow chart of another tone control method disclosed in the embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application;
fig. 5 is a schematic structural view of a musical tone control apparatus disclosed in an embodiment of the present application.
Detailed Description
To make the technical solutions of the present application better understood, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only a part of the embodiments of the present application, not all of them; all other embodiments obtained by a person skilled in the art from the given embodiments without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device according to the embodiments of the present application may include various handheld devices with wireless communication functions, vehicle-mounted devices, wireless headsets, computing devices or other processing devices connected to a wireless modem, as well as various forms of User Equipment (UE), Mobile Stations (MS), and terminal devices — for example, a smart phone, tablet computer, notebook computer, palm computer, or earphone box. For convenience of description, the devices mentioned above are collectively referred to as electronic devices.
The following describes embodiments of the present application in detail.
Referring to fig. 1, fig. 1 is a flow chart of a musical tone control method applied to an electronic device according to an embodiment of the present application, the method including the steps of:
101. first tone data is acquired.
The first musical tone data may be an audio signal acquired by a microphone, a vibration musical tone signal acquired by a vibration sensor disposed on a musical instrument, or a digital musical tone signal acquired through a Musical Instrument Digital Interface (MIDI), and the like, which is not limited in the present application.
In an embodiment of the present application, a user may play a piece of music on a musical instrument, and the electronic device may capture that performance to obtain the first musical tone data. Specific implementations may include: acquiring the audio signal of the performance through a microphone on the electronic device; connecting the electronic device to a vibration sensor on the instrument, which captures the instrument's vibration to produce a vibration musical tone signal (for example, detecting the vibration of a string the user plucks); or connecting the electronic device to the instrument's MIDI interface, so that the corresponding digital musical tone signal is acquired as the user plays.
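The acquisition step above can be sketched in Python. This is an illustrative stand-in, not the patent's implementation: instead of a live microphone or MIDI interface, it reads 16-bit mono PCM samples from a WAV file (and writes a short sine test tone first, so the sketch runs without any hardware).

```python
import math
import struct
import wave

def read_tone_data(path):
    """Read 16-bit mono PCM samples from a WAV file (stand-in for live capture)."""
    with wave.open(path, "rb") as wf:
        raw = wf.readframes(wf.getnframes())
    return list(struct.unpack("<%dh" % (len(raw) // 2), raw))

def write_test_tone(path, freq=440.0, rate=8000, dur=0.1):
    """Write a short sine tone so the sketch is runnable without a microphone."""
    n = int(rate * dur)
    frames = b"".join(
        struct.pack("<h", int(30000 * math.sin(2 * math.pi * freq * t / rate)))
        for t in range(n)
    )
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)   # mono
        wf.setsampwidth(2)   # 16-bit
        wf.setframerate(rate)
        wf.writeframes(frames)

write_test_tone("tone.wav")
samples = read_tone_data("tone.wav")
print(len(samples))  # 800 samples: 0.1 s at 8 kHz
```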
Wherein, the musical instrument played by the user can comprise any one of the following: violins, pianos, cellos, guitars, accordion, zither, etc., without limitation in this application.
102. And performing feature extraction on the first musical tone data to obtain a first target note feature set, wherein the first target note feature set comprises a plurality of target note features.
In an embodiment of the application, the feature extraction method may include a spectrum envelope method, a cepstrum method, an LPC interpolation method, an LPC root-finding method, a Hilbert transform method, and the like, which are not described in detail here. The first target note feature set includes a plurality of target note features corresponding to a plurality of notes in the first musical tone data, and a target note feature may include any one of the following: timbre, pitch, duration.
Alternatively, if the first musical tone data is the first audio frequency obtained by the microphone of the electronic device, the step 102 of performing feature extraction on the first musical tone data to obtain the first target note feature set may include the following steps:
21. amplifying the first audio to obtain a first amplified audio;
22. filtering the first amplified audio to obtain a first target audio;
23. performing analog-to-digital conversion on the first target audio to obtain a first target digital audio signal;
24. and performing feature extraction on the first target digital audio signal according to a preset algorithm to obtain a first target note feature set.
The audio signal is amplified, so that the signal is more convenient to analyze, and the noise in the first amplified audio can be filtered through filtering, so that the accuracy of the first target note feature set can be improved.
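Steps 21 to 24 can be sketched as a minimal digital pipeline. These are illustrative assumptions, not the patent's implementation: a simple gain stands in for amplification, a moving-average FIR filter for noise filtering, and a crude zero-crossing counter for the "preset algorithm" of step 24; the analog-to-digital conversion of step 23 is assumed already done, since the samples are digital.

```python
import math

def amplify(samples, gain=2.0):
    """Step 21: scale the signal so it is easier to analyze."""
    return [s * gain for s in samples]

def moving_average(samples, k=3):
    """Step 22: simple FIR low-pass to suppress high-frequency noise."""
    out = []
    for i in range(len(samples)):
        win = samples[max(0, i - k + 1): i + 1]
        out.append(sum(win) / len(win))
    return out

def estimate_pitch(samples, rate):
    """Step 24 (stand-in): count positive-going zero crossings per second,
    which approximates the fundamental frequency of a clean tone."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings / (len(samples) / rate)

rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]  # 1 s of A4
filtered = moving_average(amplify(tone))
print(estimate_pitch(filtered, rate))  # close to 440 Hz
```

A real system would replace the zero-crossing estimator with one of the methods the patent lists (cepstrum, LPC, Hilbert transform); the pipeline shape stays the same.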
Alternatively, if the first musical tone data is a vibration musical tone signal, the vibration musical tone signal may be converted into an electrical signal to obtain an audio signal corresponding to the vibration musical tone signal, and the audio signal may be subjected to feature extraction to obtain a first target note feature set. If the first musical tone data is a digital musical tone signal, the digital musical tone signal may be amplified to obtain an amplified digital musical tone signal, and a plurality of musical notes corresponding to a plurality of time points in the digital musical tone signal are determined, and then a plurality of target musical note features corresponding to the plurality of musical notes are extracted.
103. Searching a preset database for the first target note feature set to obtain a first reference note feature set that is successfully matched against it.
In the embodiment of the application, a database containing a plurality of reference note features can be stored in the memory of the electronic device, so that the first target note feature set can be looked up in this preset database to obtain the first reference note feature set that matches it.
Optionally, step 103 of searching the preset database for the first target note feature set to obtain the matching first reference note feature set may include the following step:
searching the database for the plurality of target note features in sequence, according to the order in which they are arranged in the first target note feature set, to obtain a first reference note feature set that is successfully matched against the plurality of target note features.
Here, the plurality of target note features in the first target note feature set are searched in the database in sequence: following the arrangement order of the target note features, the earlier notes are searched first, and if a reference note that matches an earlier note is found, the search continues with the later notes. For example, suppose the user plays the four open strings "E, A, D, G" of a violin in the order "E-A-D-G". The first target note feature, corresponding to the E string, can be searched to obtain a corresponding first reference note feature f1; the search then continues with the second target note feature, corresponding to the A string, to obtain a corresponding second reference note feature f2; and a third reference note feature f3 corresponding to the D string and a fourth reference note feature f4 corresponding to the G string can be determined in turn. Specifically, each target note feature is compared with the reference note features in the database to determine whether a corresponding reference note feature exists. In practice, the first musical tone data may contain different tones for different instruments; for example, the four open strings of a viola are "C, G, D, A", and the performer may play those four strings to obtain the corresponding first musical tone data.
Optionally, the step of sequentially searching the database for the target note features may include the steps of:
a1, comparing the target pitches with the reference pitches in the first reference note feature set according to the sequence of the target note features to obtain pitch difference values;
and A2, if the pitch difference values are all in a preset range, determining that the target note characteristics are all compared successfully.
Here, searching the plurality of target note features means comparing the plurality of target pitches with the reference pitches pairwise and in sequence, each target pitch corresponding to one reference pitch. Specifically, for each target pitch, the pitch difference between the target pitch and its corresponding reference pitch is calculated; if the pitch difference is within the preset range, the pitch error is small, the target pitch is considered to match the reference pitch, and the corresponding target note feature is determined to be successfully matched.
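Steps A1 and A2 amount to a pairwise pitch comparison with a tolerance window. A minimal sketch, assuming pitches are expressed as MIDI note numbers and assuming 0.5 semitone as the "preset range" (the patent does not fix a value):

```python
def match_note_sequence(target_pitches, reference_pitches, tolerance=0.5):
    """Compare target and reference pitches pairwise, in playing order.
    The match succeeds only if every pitch difference falls within the
    tolerance (the 'preset range' of step A2; 0.5 semitone is assumed)."""
    if len(target_pitches) != len(reference_pitches):
        return False
    return all(
        abs(t - r) <= tolerance
        for t, r in zip(target_pitches, reference_pitches)
    )

# Violin open strings E-A-D-G as MIDI note numbers (76, 69, 62, 55).
reference = [76, 69, 62, 55]
played = [76.2, 68.9, 62.1, 55.0]   # slightly detuned but within range
print(match_note_sequence(played, reference))  # True
```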
Optionally, in this embodiment of the application, after determining that all of the plurality of target note features are successfully aligned through step a2, the method may further include the following steps:
determining whether the time length from the starting time point to the ending time point of the plurality of target note features is smaller than a preset value; if so, executing the operation in step 104 of determining the target control instruction corresponding to the first reference note feature set according to the preset correspondence between reference note feature sets and control instructions.
In this way, the time the player takes to perform the plurality of notes is limited to a reasonable range, which confirms that the player's purpose in playing them is to trigger the target control instruction and prevents the electronic device from misrecognizing a control instruction.
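The duration check above can be sketched as follows; the 3-second threshold is an assumed value for the "preset value", which the patent leaves unspecified.

```python
def within_trigger_window(note_times, max_span=3.0):
    """Check that the span from the first note onset to the last (in seconds)
    is below a preset value, to reject accidental or ambient playing.
    The 3.0 s threshold is an illustrative assumption."""
    return (note_times[-1] - note_times[0]) < max_span

print(within_trigger_window([0.0, 0.8, 1.5, 2.2]))  # True: deliberate trigger
print(within_trigger_window([0.0, 2.0, 4.0, 6.0]))  # False: too spread out
```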
104. And determining a target control instruction corresponding to the first reference note feature set according to the corresponding relation between a preset reference note feature set and a control instruction, and executing the operation corresponding to the target control instruction.
The memory of the electronic device may pre-store the correspondence between reference note feature sets and control instructions, so that after the first reference note feature set is obtained, the target control instruction corresponding to it can be determined from that correspondence. The target control instruction may be any one of the following: an application-opening instruction for controlling the electronic device to open an application (for example, the device may have an audio evaluation application installed, so that opening it can be triggered by the acquired first musical tone data); a control instruction for starting a hardware device in the electronic apparatus (for example, the microphone may be controlled to acquire audio, or the camera may be controlled to shoot video); or an instruction for the electronic device to execute a specific operation within an application (for example, with an audio evaluation application installed, the device may be controlled to execute the operation corresponding to a musical tone evaluation instruction).
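The stored correspondence can be held in a simple lookup table. A hypothetical sketch — the keys, instruction names, and table layout are all illustrative assumptions, not taken from the patent:

```python
# Hypothetical correspondence table: a matched reference note sequence
# (keyed here by a tuple of MIDI pitches) maps to a control instruction.
COMMAND_TABLE = {
    (76, 69, 62, 55): "open_evaluation_app",  # violin open strings E-A-D-G
    (60, 64, 67): "start_recording",          # C major arpeggio
}

def lookup_command(reference_set):
    """Return the control instruction for a matched reference note feature
    set, or None if no correspondence is stored."""
    return COMMAND_TABLE.get(tuple(reference_set))

print(lookup_command([76, 69, 62, 55]))  # open_evaluation_app
```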
Optionally, in step 104, when the target control instruction is a musical tone evaluation instruction, executing the operation corresponding to the target control instruction may include the following steps:
41. acquiring second tone data performed by a player, the second tone data being tone data performed by the player subsequent to the first tone data;
42. analyzing the second musical tone data to obtain a target musical note track; performing feature extraction on the second musical tone data to obtain a second target musical note feature set;
43. determining a target track corresponding to the second musical tone data;
44. comparing the target note track with a preset reference note track corresponding to the target track to obtain a starting note position and an ending note position played by the player;
45. selecting a second reference note feature set, spanning the starting note position to the ending note position, from a preset target reference note feature set corresponding to the target track;
46. comparing the second target note feature set with the second reference note feature set according to the correspondence between the target note track and the reference note track, to obtain a target comparison result for feature data including intonation, speed, and rhythm in the second musical tone data;
47. and determining the evaluation score corresponding to the second musical tone data according to the target comparison result.
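Step 44's alignment of the played note track against the full reference track can be sketched as a subsequence search that returns the starting and ending note positions. This is a deliberately naive exact match over assumed MIDI note numbers; a real system would presumably tolerate pitch and timing errors.

```python
def locate_segment(reference_track, played_track):
    """Find where the played note sequence occurs inside the full reference
    track (step 44): returns (start, end) note positions, or None."""
    n, m = len(reference_track), len(played_track)
    for start in range(n - m + 1):
        if reference_track[start:start + m] == played_track:
            return start, start + m - 1
    return None

reference = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale, MIDI numbers
played = [64, 65, 67]                          # player starts mid-piece
print(locate_segment(reference, played))  # (2, 4)
```

Once the positions are known, step 45 reduces to slicing the reference feature set: `reference_features[start:end + 1]`.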
In an embodiment of the present application, the user may set the target control instruction to be a musical tone evaluation instruction. After determining that the target control instruction is a musical tone evaluation instruction, the electronic device may start acquiring second musical tone data performed by a player, where the player may be the user or someone else. In a specific implementation, after acquiring the first musical tone data and determining the musical tone evaluation instruction from it, the electronic device may continue to acquire the tone data performed by the player to obtain the second musical tone data; if no tone data is acquired within a preset duration, for example 8 s or 10 s, the acquisition operation is terminated.
The target track may be a piece the user plays on a musical instrument, where the instrument may include any one of the following: violin, piano, cello, guitar, accordion, zither, and the like, which is not limited in the present application.
The target track corresponding to the second musical tone data may be determined by searching a preset database with the target note track and the second target note feature set, and taking the track that is successfully matched against both.
In the embodiment of the present application, the performance in the second musical tone data may be evaluated along the three dimensions of intonation (pitch), speed, and rhythm. Specifically, the second target note feature set, which may include features for all three dimensions, is compared with the second reference note feature set to obtain a first accuracy for the pitch of each note in the second musical tone data, a second accuracy for the performance speed of each bar, and a third accuracy for the performance rhythm of each bar. The first average accuracy over all notes, the second average accuracy over the performance speed of all bars, and the third average accuracy over the performance rhythm of all bars are then determined. Finally, the performance score of the second musical tone data is determined from the preset first weight for pitch, second weight for speed, and third weight for rhythm, together with the first, second, and third average accuracies.
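The weighted scoring described above can be sketched as follows; the three weight values are illustrative assumptions, since the patent only states that preset weights exist.

```python
def performance_score(intonation_accs, speed_accs, rhythm_accs,
                      w_intonation=0.5, w_speed=0.2, w_rhythm=0.3):
    """Average the per-note / per-bar accuracies in each dimension, then
    combine them with preset weights (the weight values are assumed)."""
    avg = lambda xs: sum(xs) / len(xs)
    return (w_intonation * avg(intonation_accs)
            + w_speed * avg(speed_accs)
            + w_rhythm * avg(rhythm_accs))

score = performance_score(
    intonation_accs=[0.9, 0.95, 1.0],  # per-note pitch accuracy
    speed_accs=[0.8, 0.9],             # per-bar performance-speed accuracy
    rhythm_accs=[0.85, 0.95],          # per-bar performance-rhythm accuracy
)
print(round(score, 3))  # 0.915 for these inputs
```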
It can be seen that, with the musical tone control method described in the embodiments of the present application, the electronic device acquires first musical tone data, extracts features from it to obtain a first target note feature set, and searches a preset database to obtain a first reference note feature set that is successfully matched against the target set. The device then determines the target control instruction corresponding to the first reference note feature set according to the preset correspondence between reference note feature sets and control instructions, and executes the operation corresponding to that instruction. The electronic device can thus determine the control instruction corresponding to musical tone data by analyzing that data, so the user can control the operation of the device through music, which improves the diversity and intelligence of the device's control modes and makes the device more personalized.
In accordance with the foregoing, fig. 2 is a schematic flow chart of another musical tone control method disclosed in an embodiment of the present application. The method is applied to an electronic device and includes the following steps:
201. first tone data is acquired.
202. And performing feature extraction on the first musical tone data to obtain a first target note feature set, wherein the first target note feature set comprises a plurality of target note features.
203. And searching the target note features in the database in sequence according to the sequence of the target note features in the first target note feature set to obtain a first reference note feature set which is successfully compared with the target note features.
204. And determining a target control instruction corresponding to the first reference note feature set according to the corresponding relation between a preset reference note feature set and a control instruction, and executing the operation corresponding to the target control instruction.
The detailed descriptions of the steps 201 to 204 can refer to the corresponding descriptions of the musical tone control method described in fig. 1, and are not repeated herein.
It can be seen that, in the musical tone control method described in this embodiment, the electronic device acquires first musical tone data, performs feature extraction on it to obtain a first target note feature set, and searches the database for the plurality of target note features in sequence, according to their arrangement order in the first target note feature set, to obtain a first reference note feature set that matches them. The device then determines the target control instruction corresponding to the first reference note feature set according to the preset correspondence between reference note feature sets and control instructions, and executes the corresponding operation. The electronic device can thus determine the control instruction corresponding to musical tone data by analyzing that data, enabling the user to control the device through music, improving the diversity and intelligence of the device's control modes, and making the device more personalized.
In accordance with the foregoing, fig. 3 is a flow chart illustrating a musical tone control method disclosed in an embodiment of the present application. The method is applied to an electronic device and includes the following steps:
301. Acquire first musical tone data.
302. Perform feature extraction on the first musical tone data to obtain a first target note feature set, where the first target note feature set includes a plurality of target note features.
303. Search the database for the plurality of target note features in sequence, according to their order of arrangement in the first target note feature set, to obtain a first reference note feature set successfully compared with the plurality of target note features.
304. Determine a target control instruction corresponding to the first reference note feature set according to the preset correspondence between reference note feature sets and control instructions.
305. If the target control instruction is a musical tone evaluation instruction, acquire second musical tone data performed by a player, the second musical tone data being musical tone data performed by the player after the first musical tone data.
306. Analyze the second musical tone data to obtain a target note track, and perform feature extraction on the second musical tone data to obtain a second target note feature set.
307. Determine a target track corresponding to the second musical tone data.
308. Compare the target note track with a preset reference note track corresponding to the target track to obtain the starting note position and the ending note position played by the player.
309. Select a second reference note feature set, spanning the starting note position to the ending note position, from a preset target reference note feature set corresponding to the target track.
310. Compare the second target note feature set with the second reference note feature set according to the correspondence between the target note track and the reference note track, to obtain a target comparison result for feature data including intonation, speed, and rhythm in the second musical tone data.
311. Determine the evaluation score corresponding to the second musical tone data according to the target comparison result.
For detailed descriptions of steps 301 to 311, refer to the corresponding descriptions of the musical tone control method in fig. 1; they are not repeated here.
It can be seen that, in the musical tone control method described in this embodiment of the present application, the electronic device acquires first musical tone data, performs feature extraction on it to obtain a first target note feature set, searches the database for the plurality of target note features in sequence to obtain a first reference note feature set successfully compared with them, and determines that the target control instruction corresponding to the first reference note feature set is a musical tone evaluation instruction. According to that instruction, the device acquires second musical tone data performed by a player, analyzes it to obtain a target note track, and performs feature extraction on it to obtain a second target note feature set. It then determines the target track corresponding to the second musical tone data, compares the target note track with the reference note track to obtain the starting and ending note positions, selects a second reference note feature set spanning those positions from the target reference note feature set, compares the second target note feature set with the second reference note feature set to obtain a target comparison result for feature data including intonation, speed, and rhythm, and determines the evaluation score corresponding to the second musical tone data from that result. In this way, a user can control the electronic device through music to evaluate a played target track and determine the playing level, which improves the diversity and intelligence of the device's control modes and makes the electronic device more personalized.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application, and as shown in the drawing, the electronic device includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for performing the following steps:
acquiring first tone data;
performing feature extraction on the first musical tone data to obtain a first target note feature set, wherein the first target note feature set comprises a plurality of target note features;
searching the first target note characteristic set in a preset database to obtain a first reference note characteristic set which is successfully compared with the first target note characteristic set;
and determining a target control instruction corresponding to the first reference note feature set according to the corresponding relation between a preset reference note feature set and a control instruction, and executing the operation corresponding to the target control instruction.
In one possible example, the first musical tone data is a first audio captured through a microphone of the electronic device, and in terms of performing feature extraction on the first musical tone data to obtain a first target note feature set, the program includes instructions for performing the following steps:
amplifying the first audio to obtain a first amplified audio;
filtering the first amplified audio to obtain a first target audio;
performing analog-to-digital conversion on the first target audio to obtain a first target digital audio signal;
and performing feature extraction on the first target digital audio signal according to a preset algorithm to obtain a first target note feature set.
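The amplify → filter → A/D conversion → feature-extraction chain above can be sketched as follows. This is a minimal illustration only: the patent does not specify the filter or the "preset algorithm", so this sketch assumes a simple moving-average filter, 16-bit quantization, and an autocorrelation pitch estimator; all function names, parameter values, and the demo signal are illustrative.

```python
import numpy as np

def extract_note_features(audio, sample_rate, gain=2.0, frame_len=2048):
    """Amplify -> filter -> A/D conversion -> per-frame pitch features."""
    # 1. Amplify the captured audio (clipped to the valid range).
    amplified = np.clip(audio * gain, -1.0, 1.0)
    # 2. Smooth with a short moving-average filter to suppress noise.
    kernel = np.ones(8) / 8.0
    filtered = np.convolve(amplified, kernel, mode="same")
    # 3. Simulate 16-bit analog-to-digital conversion.
    digital = np.round(filtered * 32767).astype(np.int16)
    # 4. Estimate one pitch per frame by autocorrelation, standing in
    #    for the unspecified "preset algorithm".
    pitches = []
    for start in range(0, len(digital) - frame_len, frame_len):
        frame = digital[start:start + frame_len].astype(np.float64)
        frame -= frame.mean()
        ac = np.correlate(frame, frame, mode="full")[frame_len - 1:]
        # Strongest periodicity between 50 Hz and 2000 Hz.
        lo, hi = sample_rate // 2000, sample_rate // 50
        lag = lo + int(np.argmax(ac[lo:hi]))
        pitches.append(sample_rate / lag)
    return pitches

# Demo: a quarter second of A4 (440 Hz) sampled at 44.1 kHz.
sr = 44100
t = np.arange(sr // 4) / sr
tone = 0.3 * np.sin(2 * np.pi * 440.0 * t)
pitches = extract_note_features(tone, sr)
```

The resulting pitch sequence plays the role of the "first target note feature set"; a real implementation would likely use a more robust pitch tracker and frame overlap.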
In one possible example, in terms of searching the first target note feature set in a preset database to obtain a first reference note feature set successfully compared with the first target note feature set, the program includes instructions for performing the following step:
and searching the target note features in the database in sequence according to the sequence of the target note features in the first target note feature set to obtain a first reference note feature set which is successfully compared with the target note features.
In one possible example, the plurality of target note features are a plurality of target pitches, the plurality of target note features correspond one-to-one to the plurality of target pitches, and in terms of sequentially searching the database for the plurality of target note features, the program includes instructions for performing the following steps:
comparing the target pitches with the reference pitches in the first reference note feature set in pairs according to the sequence of the target note features to obtain pitch difference values;
and if the pitch difference values are within the preset range, determining that the target note characteristics are compared successfully.
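The pairwise pitch comparison just described can be sketched as a sequential lookup. This sketch assumes the database maps instruction names to reference pitch sequences and that a match requires every pairwise difference to fall within the preset range; the names, pitch values, and 3 Hz tolerance are illustrative assumptions, not taken from the patent.

```python
def match_reference(target_pitches, database, tolerance=3.0):
    """Return the first reference note feature set whose pitches all lie
    within `tolerance` of the target pitches, compared pair by pair in
    the order of the target note features."""
    for name, reference_pitches in database.items():
        if len(reference_pitches) != len(target_pitches):
            continue  # cannot compare pair by pair
        diffs = [abs(t - r) for t, r in zip(target_pitches, reference_pitches)]
        if all(d <= tolerance for d in diffs):
            return name  # every pitch difference is within the preset range
    return None

# Demo with two hypothetical reference sets (pitches in Hz).
database = {
    "unlock": [261.6, 329.6, 392.0],    # C4-E4-G4
    "evaluate": [440.0, 493.9, 523.3],  # A4-B4-C5
}
matched = match_reference([441.0, 494.5, 522.8], database)
```

Here `matched` names the reference set that compared successfully; the control instruction is then looked up from the preset correspondence.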
In one possible example, the target control instruction is a musical tone evaluation instruction, and in terms of performing the operation corresponding to the target control instruction, the program further includes instructions for performing the following steps:
acquiring second tone data performed by a player, the second tone data being tone data performed by the player subsequent to the first tone data;
analyzing the second musical tone data to obtain a target musical note track; performing feature extraction on the second musical tone data to obtain a second target musical note feature set;
determining a target track corresponding to the second musical tone data;
comparing the target note track with a preset reference note track corresponding to the target track to obtain a starting note position and an ending note position played by the player;
selecting a second reference note feature set from the starting note position to the ending note position from a preset target reference note feature set corresponding to the target track;
comparing the second target note feature set with the second reference note feature set according to the correspondence between the target note track and the reference note track, to obtain a target comparison result for feature data including intonation, speed, and rhythm in the second musical tone data;
and determining the evaluation score corresponding to the second musical tone data according to the target comparison result.
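The track comparison and feature-set selection in the steps above can be sketched as follows. The patent does not specify the alignment method, so this sketch assumes a naive sliding-window search over pitch values to locate the starting and ending note positions; the track contents, tolerance, and function names are illustrative.

```python
def locate_played_span(target_track, reference_track, tolerance=2.0):
    """Find where the played note track begins and ends inside the reference
    note track by scanning for the best-matching contiguous window (a naive
    stand-in for the patent's unspecified comparison)."""
    n, m = len(target_track), len(reference_track)
    best_start, best_matches = 0, -1
    for start in range(m - n + 1):
        window = reference_track[start:start + n]
        matches = sum(1 for t, r in zip(target_track, window)
                      if abs(t - r) <= tolerance)
        if matches > best_matches:
            best_start, best_matches = start, matches
    return best_start, best_start + n - 1  # starting and ending note positions

def select_reference_features(reference_features, start, end):
    """Slice the target reference note feature set between the located positions."""
    return reference_features[start:end + 1]

# Demo: the player performed three notes from the middle of the reference track.
reference_track = [261.6, 293.7, 329.6, 349.2, 392.0, 440.0, 493.9, 523.3]
reference_features = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"]
target_track = [330.0, 349.0, 391.5]  # played E4-F4-G4, slightly out of tune
span = locate_played_span(target_track, reference_track)
selected = select_reference_features(reference_features, *span)
```

The `selected` slice is the "second reference note feature set" against which the played features would then be scored for intonation, speed, and rhythm.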
The foregoing has described the solutions of the embodiments of the present application mainly from the perspective of the method-side implementation. It is understood that, to realize the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented in hardware, or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments of the present application, the electronic device may be divided into functional units according to the above method examples; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one processing unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present application is schematic and is only a division of logical functions; other division manners are possible in actual implementation.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a tone control apparatus disclosed in an embodiment of the present application, applied to an electronic device, the tone control apparatus including an acquisition unit 501, an extraction unit 502, a search unit 503, and a processing unit 504, wherein,
the acquiring unit 501 is configured to acquire first tone data;
the extracting unit 502 is configured to perform feature extraction on the first musical tone data to obtain a first target note feature set, where the first target note feature set includes a plurality of target note features;
the searching unit 503 is configured to search the first target note feature set in a preset database to obtain a first reference note feature set successfully compared with the first target note feature set;
the processing unit 504 is configured to determine a target control instruction corresponding to the first reference note feature set according to a correspondence between a preset reference note feature set and a control instruction, and execute an operation corresponding to the target control instruction.
Optionally, the first musical tone data is a first audio obtained through a microphone of the electronic device, and in terms of performing feature extraction on the first musical tone data to obtain a first target note feature set, the extraction unit 502 is specifically configured to:
amplifying the first audio to obtain a first amplified audio;
filtering the first amplified audio to obtain a first target audio;
performing analog-to-digital conversion on the first target audio to obtain a first target digital audio signal;
and performing feature extraction on the first target digital audio signal according to a preset algorithm to obtain a first target note feature set.
Optionally, the search unit 503 is specifically configured to:
and searching the target note features in the database in sequence according to the sequence of the target note features in the first target note feature set to obtain a first reference note feature set which is successfully compared with the target note features.
Optionally, the target note features are target pitches, the target note features correspond to the target pitches one by one, and in the aspect of sequentially searching the target note features in the database, the searching unit 503 is specifically configured to:
comparing the target pitches with the reference pitches in the first reference note feature set in pairs according to the sequence of the target note features to obtain pitch difference values;
and if the pitch difference values are within the preset range, determining that the target note characteristics are compared successfully.
Optionally, the target control instruction is a musical sound evaluation instruction, and in terms of performing an operation corresponding to the target control instruction, the processing unit 504 is specifically configured to:
acquiring second tone data performed by a player, the second tone data being tone data performed by the player subsequent to the first tone data;
analyzing the second musical tone data to obtain a target musical note track; performing feature extraction on the second musical tone data to obtain a second target musical note feature set;
determining a target track corresponding to the second musical tone data;
comparing the target note track with a preset reference note track corresponding to the target track to obtain a starting note position and an ending note position played by the player;
selecting a second reference note feature set from the starting note position to the ending note position from a preset target reference note feature set corresponding to the target track;
comparing the second target note feature set with the second reference note feature set according to the correspondence between the target note track and the reference note track, to obtain a target comparison result for feature data including intonation, speed, and rhythm in the second musical tone data;
and determining the evaluation score corresponding to the second musical tone data according to the target comparison result.
It can be seen that the musical tone control apparatus described in the embodiments of the present application acquires first musical tone data, performs feature extraction on the first musical tone data to obtain a first target note feature set, searches a preset database for the first target note feature set to obtain a first reference note feature set successfully compared with it, determines a target control instruction corresponding to the first reference note feature set according to the preset correspondence between reference note feature sets and control instructions, and executes the operation corresponding to the target control instruction. The electronic device can thus determine the control instruction corresponding to musical tone data by analyzing that data, allowing the user to control the operation of the electronic device through music. This improves the diversity and intelligence of the device's control modes and makes the electronic device more personalized.
It should be noted that the electronic device described in the embodiments of the present application is presented in the form of functional units. The term "unit" as used herein is to be understood in the broadest possible sense; the objects used to implement the functions described by each "unit" may be, for example, an application-specific integrated circuit (ASIC), a single circuit, a processor (shared, dedicated, or group) and memory executing one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
The acquisition unit 501, the extraction unit 502, the search unit 503, and the processing unit 504 may each be a control circuit or a processor.
Embodiments of the present application also provide a computer storage medium storing a computer program for electronic data exchange, the computer program causing a computer to execute a part or all of the steps of any one of the tone control methods as set forth in the above-described method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the tone control methods as set forth in the above-described method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; for instance, the division of the units is only a division of logical functions, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
If the integrated units are implemented in the form of software program modules and sold or used as stand-alone products, they may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, or a magnetic or optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash disk, ROM, RAM, magnetic or optical disk, and the like.
The foregoing detailed description of the embodiments of the present application illustrates the principles and implementations of the present application; the above description of the embodiments is only provided to help understand the method and core concept of the present application. Meanwhile, for a person skilled in the art, there may be variations in the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (9)

1. A tone control method, applied to an electronic apparatus, the method comprising:
acquiring first tone data;
performing feature extraction on the first musical tone data to obtain a first target note feature set, wherein the first target note feature set comprises a plurality of target note features;
searching the first target note characteristic set in a preset database to obtain a first reference note characteristic set which is successfully compared with the first target note characteristic set;
determining a target control instruction corresponding to the first reference note feature set according to a corresponding relation between a preset reference note feature set and a control instruction, wherein the target control instruction is a musical tone evaluation instruction;
executing an operation corresponding to the target control instruction, including: acquiring second tone data performed by a player, the second tone data being tone data performed by the player subsequent to the first tone data;
analyzing the second musical tone data to obtain a target musical note track; performing feature extraction on the second musical tone data to obtain a second target musical note feature set; the second target note feature set is a feature set comprising intonation, speed and rhythm;
determining a target track corresponding to the second musical tone data;
comparing the target note track with a preset reference note track corresponding to the target track to obtain a starting note position and an ending note position played by the player;
selecting a second reference note feature set from the starting note position to the ending note position from a preset target reference note feature set corresponding to the target track;
comparing the second target note feature set with the second reference note feature set according to the corresponding relationship between the target note track and the reference note track to obtain a target comparison result of feature data including intonation, speed and rhythm in the second musical tone data, and determining a corresponding evaluation score of the second musical tone data according to the target comparison result, wherein the determining comprises: obtaining a first accuracy corresponding to the pitch of each note in the second musical tone data, a second accuracy corresponding to the playing speed of each measure, and a third accuracy corresponding to the playing rhythm of each measure, determining a first average accuracy corresponding to all notes in the second musical tone data, a second average accuracy corresponding to the playing speed of all measures, and a third average accuracy corresponding to the playing rhythm of all measures, and determining the evaluation score corresponding to the second musical tone data according to a preset first weight corresponding to pitch, a second weight corresponding to speed, a third weight corresponding to rhythm, the first average accuracy, the second average accuracy and the third average accuracy.
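The weighted scoring described above can be sketched as follows. The claim fixes only the structure (three per-item accuracies, three average accuracies, three preset weights); the weight values, accuracy values, and 0-100 scale in this sketch are illustrative assumptions.

```python
def evaluation_score(pitch_accuracies, speed_accuracies, rhythm_accuracies,
                     w_pitch=0.5, w_speed=0.25, w_rhythm=0.25):
    """Weighted combination of the three average accuracies. The weight
    values here are illustrative; the claim only requires that preset
    weights exist for pitch, speed, and rhythm."""
    avg_pitch = sum(pitch_accuracies) / len(pitch_accuracies)     # first average accuracy
    avg_speed = sum(speed_accuracies) / len(speed_accuracies)     # second average accuracy
    avg_rhythm = sum(rhythm_accuracies) / len(rhythm_accuracies)  # third average accuracy
    # Scale to a 0-100 score (an assumed presentation choice).
    return 100 * (w_pitch * avg_pitch + w_speed * avg_speed + w_rhythm * avg_rhythm)

# Demo: per-note pitch accuracies, and per-measure speed/rhythm accuracies.
score = evaluation_score([0.9, 1.0, 0.8], [0.8, 1.0], [1.0, 0.9])
```

With the assumed weights, the demo yields 0.5*0.9 + 0.25*0.9 + 0.25*0.95 = 0.9125, i.e. a score of 91.25.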
2. The method of claim 1, wherein the first musical tone data is a first audio captured through a microphone of the electronic device, and wherein the performing feature extraction on the first musical tone data to obtain a first target note feature set comprises:
amplifying the first audio to obtain a first amplified audio;
filtering the first amplified audio to obtain a first target audio;
performing analog-to-digital conversion on the first target audio to obtain a first target digital audio signal;
and performing feature extraction on the first target digital audio signal according to a preset algorithm to obtain a first target note feature set.
3. The method according to claim 1 or 2, wherein searching the first target note feature set in a preset database to obtain a first reference note feature set successfully compared with the first target note feature set comprises:
and searching the target note features in the database in sequence according to the sequence of the target note features in the first target note feature set to obtain a first reference note feature set which is successfully compared with the target note features.
4. The method of claim 3, wherein the plurality of target note features are a plurality of target pitches, the plurality of target note features correspond one-to-one to the plurality of target pitches, and the sequentially searching the plurality of target note features in the database comprises:
comparing the target pitches with the reference pitches in the first reference note feature set in pairs according to the sequence of the target note features to obtain pitch difference values;
and if the pitch difference values are within the preset range, determining that the target note characteristics are compared successfully.
5. A tone control apparatus applied to an electronic device, the tone control apparatus comprising:
an acquisition unit configured to acquire first tone data;
an extracting unit, configured to perform feature extraction on the first tone data to obtain a first target note feature set, where the first target note feature set includes a plurality of target note features;
the searching unit is used for searching the first target note characteristic set in a preset database to obtain a first reference note characteristic set which is successfully compared with the first target note characteristic set;
the processing unit is used for determining a target control instruction corresponding to the first reference note feature set according to the corresponding relation between a preset reference note feature set and a control instruction, and the target control instruction is a musical tone evaluation instruction; executing an operation corresponding to the target control instruction, including: acquiring second tone data performed by a player, the second tone data being tone data performed by the player subsequent to the first tone data;
analyzing the second musical tone data to obtain a target musical note track; performing feature extraction on the second musical tone data to obtain a second target musical note feature set; the second target note feature set is a feature set comprising intonation, speed and rhythm;
determining a target track corresponding to the second musical tone data;
comparing the target note track with a preset reference note track corresponding to the target track to obtain a starting note position and an ending note position played by the player;
selecting a second reference note feature set from the starting note position to the ending note position from a preset target reference note feature set corresponding to the target track;
comparing the second target note feature set with the second reference note feature set according to the corresponding relationship between the target note track and the reference note track to obtain a target comparison result of feature data including intonation, speed and rhythm in the second musical tone data, and determining a corresponding evaluation score of the second musical tone data according to the target comparison result, wherein the determining comprises: obtaining a first accuracy corresponding to the pitch of each note in the second musical tone data, a second accuracy corresponding to the playing speed of each measure, and a third accuracy corresponding to the playing rhythm of each measure, determining a first average accuracy corresponding to all notes in the second musical tone data, a second average accuracy corresponding to the playing speed of all measures, and a third average accuracy corresponding to the playing rhythm of all measures, and determining the evaluation score corresponding to the second musical tone data according to a preset first weight corresponding to pitch, a second weight corresponding to speed, a third weight corresponding to rhythm, the first average accuracy, the second average accuracy and the third average accuracy.
6. The apparatus according to claim 5, wherein the first musical tone data is a first audio captured through a microphone of the electronic device, and in terms of performing feature extraction on the first musical tone data to obtain a first target note feature set, the extracting unit is specifically configured to:
amplifying the first audio to obtain a first amplified audio;
filtering the first amplified audio to obtain a first target audio;
performing analog-to-digital conversion on the first target audio to obtain a first target digital audio signal;
and performing feature extraction on the first target digital audio signal according to a preset algorithm to obtain a first target note feature set.
7. The apparatus according to claim 5 or 6, wherein the search unit is specifically configured to:
and searching the target note features in the database in sequence according to the sequence of the target note features in the first target note feature set to obtain a first reference note feature set which is successfully compared with the target note features.
8. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-4.
9. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-4.
CN201910242677.9A 2019-03-28 2019-03-28 Musical tone control method and related product Active CN110136677B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910242677.9A CN110136677B (en) 2019-03-28 2019-03-28 Musical tone control method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910242677.9A CN110136677B (en) 2019-03-28 2019-03-28 Musical tone control method and related product

Publications (2)

Publication Number Publication Date
CN110136677A CN110136677A (en) 2019-08-16
CN110136677B true CN110136677B (en) 2022-03-15

Family

ID=67568689

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910242677.9A Active CN110136677B (en) 2019-03-28 2019-03-28 Musical tone control method and related product

Country Status (1)

Country Link
CN (1) CN110136677B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113490315B (en) * 2021-01-20 2023-08-04 深圳市智岩科技有限公司 Lighting device control method, lighting device control device and storage medium

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2596436Y (en) * 2002-01-15 2003-12-31 许光清 Lamp light controller
CN1739127A (en) * 2003-01-17 2006-02-22 摩托罗拉公司 Audio file format with mapped lighting effects and method for controlling lighting effects using an audio file format
CN1897104A (en) * 2002-12-24 2007-01-17 卡西欧计算机株式会社 Device and program for musical performance evaluation
CN101430876A (en) * 2007-11-08 2009-05-13 中国科学院声学研究所 Singing marking system and method
CN201525958U (en) * 2009-07-10 2010-07-14 声宝股份有限公司 Washing machine program setting system with voice prompting function
CN103531189A (en) * 2013-09-25 2014-01-22 熊世林 Performance evaluator for intelligent electric piano
CN103594075A (en) * 2012-08-14 2014-02-19 雅马哈株式会社 Music information display control method and music information display control apparatus
CN103714805A (en) * 2012-09-29 2014-04-09 联想(北京)有限公司 Electronic musical instrument control device and method thereof
CN104464701A (en) * 2013-09-20 2015-03-25 卡西欧计算机株式会社 Playing practice device and method
CN105204357A (en) * 2015-09-18 2015-12-30 小米科技有限责任公司 Contextual model regulating method and device for intelligent household equipment
TW201616490A (en) * 2014-10-27 2016-05-01 明新科技大學 Music playing control device
CN106504491A (en) * 2016-11-29 2017-03-15 芜湖美智空调设备有限公司 A kind of the method and system of household electrical appliances, household electrical appliance, remote control are controlled by music
CN108053815A (en) * 2017-12-12 2018-05-18 广州德科投资咨询有限公司 The performance control method and robot of a kind of robot
CN108231063A (en) * 2016-12-13 2018-06-29 中国移动通信有限公司研究院 A kind of recognition methods of phonetic control command and device
CN109448754A (en) * 2018-09-07 2019-03-08 南京光辉互动网络科技股份有限公司 A kind of various dimensions singing marking system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9842577B2 (en) * 2015-05-19 2017-12-12 Harmonix Music Systems, Inc. Improvised guitar simulation

Also Published As

Publication number Publication date
CN110136677A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
US9928835B1 (en) Systems and methods for determining content preferences based on vocal utterances and/or movement by a user
JP6017687B2 (en) Audio signal analysis
US8892565B2 (en) Method and apparatus for accessing an audio file from a collection of audio files using tonal matching
US9852721B2 (en) Musical analysis platform
KR100895009B1 (en) System and method for recommending music
JP5642296B2 (en) Input interface for generating control signals by acoustic gestures
CN110111761B (en) Method for real-time following musical performance and related product
CN110070847B (en) Musical tone evaluation method and related products
US9804818B2 (en) Musical analysis platform
US10504498B2 (en) Real-time jamming assistance for groups of musicians
US20140129235A1 (en) Audio tracker apparatus
CN112382257B (en) Audio processing method, device, equipment and medium
CN110136677B (en) Musical tone control method and related product
CN111399745A (en) Music playing method, music playing interface generation method and related products
CN113763913A (en) Music score generation method, electronic device and readable storage medium
Dittmar et al. Real-time guitar string detection for music education software
KR101813704B1 (en) Analyzing Device and Method for User's Voice Tone
JP2006195384A (en) Musical piece tonality calculating device and music selecting device
CN109686376B (en) Song singing evaluation method and system
KR101429138B1 (en) Speech recognition method at an apparatus for a plurality of users
JP5843074B2 (en) Stringed instrument performance evaluation apparatus and stringed instrument performance evaluation program
JP6728572B2 (en) Plucked instrument performance evaluation device, music performance device, and plucked instrument performance evaluation program
CN113593504A (en) Pitch recognition model establishing method, pitch recognition method and pitch recognition device
KR102117685B1 (en) Apparatus and method for guide to playing a stringed instrument, and computer readable medium having computer program recorded thereof
CN115171729B (en) Audio quality determination method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000 No. a821, 8 / F, block a, Mingyou procurement center, labor community, Xixiang street, Bao'an District, Shenzhen, Guangdong

Applicant after: Shenzhen mango Future Technology Co.,Ltd.

Address before: 518000 No. A541, 5 / F, block a, Mingyou procurement center, labor community, Xixiang street, Bao'an District, Shenzhen, Guangdong

Applicant before: Shenzhen Mango Future Education Technology Co.,Ltd.

GR01 Patent grant