CN108806732B - Background music processing method based on artificial intelligence and electronic equipment - Google Patents


Info

Publication number
CN108806732B
CN108806732B · Application CN201810737667.8A
Authority
CN
China
Prior art keywords
music
external
background music
editing interface
background
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810737667.8A
Other languages
Chinese (zh)
Other versions
CN108806732A (en)
Inventor
李天驰
孙悦
张永聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan programming cat Technology Co.,Ltd.
Original Assignee
Shenzhen Dianmao Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dianmao Technology Co Ltd filed Critical Shenzhen Dianmao Technology Co Ltd
Priority to CN201810737667.8A
Publication of CN108806732A
Application granted
Publication of CN108806732B
Legal status: Active

Classifications

    • G — PHYSICS
    • G11 — INFORMATION STORAGE
    • G11B — INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 — Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02 — Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031 — Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/10 — Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G10 — MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H — ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2220/00 — Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/091 — Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H 2220/101 — Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, for graphical creation, edition or control of musical data or parameters
    • G10H 2240/00 — Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/011 — Files or data streams containing coded musical information, e.g. for transmission
    • G10H 2240/016 — File editing, i.e. modifying musical data files or streams as such
    • G10H 2240/046 — File format, i.e. specific or non-standard musical file format used in or adapted for electrophonic musical instruments, e.g. in wavetables
    • G10H 2240/061 — MP3, i.e. MPEG-1 or MPEG-2 Audio Layer III, lossy audio compression

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

The invention discloses a background music processing method based on artificial intelligence and an electronic device. The method comprises the following steps: adding external music to a background music editing interface in advance; and selecting one or more musical instruments in the background music editing interface to edit the external music and obtain background music. Through the background music editing interface provided by the invention, a user can first import external music and then select one or more musical instruments to edit it, thereby obtaining background music. In this way, instrument sounds can be added on top of the external music, making the song richer and meeting the personalized requirements of users. Moreover, the method is simple, easy to operate and efficient.

Description

Background music processing method based on artificial intelligence and electronic equipment
Technical Field
The invention relates to the field of music processing, in particular to a background music processing method based on artificial intelligence and electronic equipment.
Background
Background music (BGM), also called incidental music or score, generally refers to music used to set the atmosphere in dramas, films, animations, video games and web pages. Background music inserted into dialogue can enhance the expression of emotion and give the audience the feeling of being on the scene, and excellent background music can greatly increase the appeal of a work.
In the prior art, a user can directly select a song, or a part of a song, as background music. However, a song fragment used directly as background music often does not match the atmosphere of a game or web page, and, more importantly, users often wish to add the sound of musical instruments on top of the original song. The prior art does not satisfy these needs.
Accordingly, the prior art is still in need of improvement and development.
Disclosure of Invention
In view of the above disadvantages of the prior art, an object of the present invention is to provide a background music processing method based on artificial intelligence and an electronic device, so as to solve the problem in the prior art that background music can be edited in only a single way and cannot meet the personalized requirements of users.
The technical scheme of the invention is as follows:
An artificial-intelligence-based background music processing method comprises the following steps:
A. adding external music to a background music editing interface in advance;
B. and selecting one or more musical instruments in the background music editing interface to edit the external music to obtain background music.
Preferably, after step A and before step B, the method comprises:
and converting the format of the external music into an MP3 format through the background music editing interface.
Preferably, the step B specifically includes:
predefining a time interval, and dividing the external music into one or more external music pieces according to the time interval;
and selecting one or more corresponding musical instruments to edit the external music segments to obtain background music.
Preferably, the step of selecting one or more corresponding musical instruments to edit the external music piece to obtain background music specifically includes:
selecting two or more external music pieces to be combined to obtain combined music pieces;
and selecting one or more corresponding musical instruments to edit the combined music fragment to obtain background music.
Preferably, the method further comprises the following steps after the step A and before the step B:
and defining a starting time and an ending time in advance, and cutting the external music according to the starting time and the ending time to obtain cut music fragments.
Preferably, the method further comprises the following steps after the step A and before the step B:
calling a recording device through the background music editing interface;
recording the recording music clips through the recording equipment.
Preferably, step B specifically comprises:
selecting one or more recording fragments and one or more cutting music fragments to be mixed to obtain mixed music fragments;
and selecting one or more musical instruments in the background music editing interface to edit the mixed music fragment to obtain background music.
The present invention also provides an electronic device, comprising:
a processor adapted to implement the instructions, an
A storage device adapted to store a plurality of instructions, the instructions adapted to be loaded and executed by a processor to:
adding external music to a background music editing interface;
and selecting one or more musical instruments in the background music editing interface to edit the external music to obtain background music.
Preferably, after the step of adding the external music to a background music editing interface and before the step of selecting one or more musical instruments in the background music editing interface to edit the external music, the method further comprises:
and converting the format of the external music into an MP3 format through the background music editing interface.
Preferably, the step of editing the external music through one or more musical instruments selected from the background music editing interface to obtain the background music specifically includes:
predefining a time interval, and dividing the external music into one or more external music pieces according to the time interval;
and selecting one or more corresponding musical instruments to edit the external music segments to obtain background music.
Advantageous effects: through the background music editing interface provided by the invention, a user can first import external music and then select one or more musical instruments to edit it, thereby obtaining background music. In this way, instrument music can be added on top of the external music, making the song richer and meeting the personalized requirements of users. Moreover, the method is simple, easy to operate and efficient.
Drawings
FIG. 1 is a flowchart illustrating a background music processing method based on artificial intelligence according to a preferred embodiment of the present invention.
Fig. 2 is a block diagram of an electronic device according to a preferred embodiment of the invention.
Detailed Description
The present invention provides a background music processing method based on artificial intelligence and an electronic device. To make the purpose, technical scheme and effects of the present invention clearer, the invention is described in further detail below. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
Referring to fig. 1, a background music processing method based on artificial intelligence includes the steps of:
s1, adding external music to a background music editing interface in advance;
and S2, selecting one or more musical instruments in the background music editing interface to edit the external music to obtain background music.
With the method provided by the invention, a user can edit background music quickly and in a personalized way. Moreover, the method is simple and easy to operate, so it is suitable for users of all ages; even children can use it to edit beautiful and vivid background music.
In step S1, the external music is a music file. The format of the music file is not limited and may, for example, be MP3, WAV or WMA. External music is imported through an import key (a virtual key) in the background music editing interface; specifically, a music file on the computer may be selected, or a music file on a USB flash drive or portable hard disk connected to the computer.
Preferably, between step S1 and step S2 the method includes:
and converting the format of the external music into an MP3 format through the background music editing interface.
Since files in the MP3 format are easy to edit and process, converting the external music into MP3 makes subsequent editing easier. For example, if the imported external music is in WAV format, it is converted into MP3 through the "convert" key in the background music editing interface; if it is in WMA format, it is likewise converted into MP3 through the "convert" key. Of course, if the external music is already in MP3 format, no conversion is needed.
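As a rough illustration of this conversion step (not part of the patent text), the following Python sketch assumes the pydub library, which relies on ffmpeg; the file names and the helper name convert_to_mp3 are made up for the example:

    from pydub import AudioSegment

    def convert_to_mp3(path: str) -> str:
        """Convert an imported music file (e.g. WAV or WMA) to MP3.
        Files that are already MP3 are returned unchanged, mirroring the
        behaviour described above."""
        if path.lower().endswith(".mp3"):
            return path                               # no conversion needed
        audio = AudioSegment.from_file(path)          # decode WAV/WMA/...
        mp3_path = path.rsplit(".", 1)[0] + ".mp3"
        audio.export(mp3_path, format="mp3")          # re-encode as MP3
        return mp3_path

    # e.g. convert_to_mp3("imported_song.wav") -> "imported_song.mp3"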
Preferably, the method further comprises, after the step S1 and before the step S2, the steps of:
and defining a starting time and an ending time in advance, and cutting the external music according to the starting time and the ending time to obtain cut music fragments.
The start time specifically represents the start-cut time and the end time the end-cut time. For example, if the start time is defined as 1 min and the end time as 3 min, and the external music lasts 5 min, then cutting starts at the 1 min mark of the external music and ends at the 3 min mark, producing a cut music piece 2 min long. In this way, the start time and the end time can be freely chosen, so the most suitable clip is obtained and better background music can be edited.
Note that when the end time exceeds the duration of the external music, the invention uses the play end time of the external music as the end-cut time. For example, if the start time is defined as 1 min and the end time as 5 min, and the external music lasts only 3 min, then cutting starts at the 1 min mark and ends at the 3 min mark, again producing a cut music piece 2 min long. The cut piece may be a part of the external music or the whole of it; when the cut piece is the complete external music, the start time is the play start time of the external music and the end time is its play end time.
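The cutting rule above, including the clamping of the end time to the track length, can be sketched as follows; this again assumes pydub, and cut_clip is a hypothetical helper name:

    from pydub import AudioSegment

    def cut_clip(music: AudioSegment, start_min: float, end_min: float) -> AudioSegment:
        """Cut the span [start, end] (in minutes) out of the external music.
        If the end time exceeds the track duration, the end of the track is
        used as the end-cut point, as in the 1 min / 5 min example above."""
        start_ms = int(start_min * 60_000)
        end_ms = min(int(end_min * 60_000), len(music))   # len() is in milliseconds
        return music[start_ms:end_ms]                     # slicing yields the cut piece

    song = AudioSegment.from_file("external.mp3")
    clip = cut_clip(song, start_min=1, end_min=5)         # 2 min clip if the song is 3 min long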
Preferably, the method further comprises, after the step S1 and before the step S2, the steps of:
calling a recording device through the background music editing interface;
recording the recording music clips through the recording equipment.
The recording equipment (sound-pickup equipment) is specifically one or both of a recorder and a microphone. Taking the microphone as an example, when a user wants to record his or her own voice or the sound of other objects, the recording device is called through the background music editing interface, the microphone is selected, and the voice or sound is then recorded through the microphone and stored as a recorded music piece. In this way, the user can blend his or her own and/or external sounds into the background music, giving it more personalized characteristics.
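A minimal sketch of the recording step, assuming the sounddevice and soundfile libraries for microphone capture; the fixed duration and file name are illustrative only:

    import sounddevice as sd
    import soundfile as sf

    def record_clip(path: str, seconds: float, samplerate: int = 44_100) -> str:
        """Record a mono clip from the default microphone and save it as a
        recorded music piece."""
        frames = int(seconds * samplerate)
        audio = sd.rec(frames, samplerate=samplerate, channels=1)   # start recording
        sd.wait()                                                   # block until finished
        sf.write(path, audio, samplerate)
        return path

    # e.g. record_clip("recording_e.wav", seconds=10)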
In step S2, one or more musical instruments in the background music editing interface are selected to edit the cut music pieces and the recorded music pieces. The specific editing steps are described in detail later. In this way, instrument music can be added on the basis of the external music, and the background music required by the user can be edited.
Preferably, the method further comprises, after the step S1 and before the step S2, the steps of:
and selecting one or more recording fragments to be mixed with one or more cutting music fragments to obtain mixed music fragments.
The user can edit the cut music pieces directly through the background music editing interface, and can likewise edit the recorded music pieces through the background music editing interface.
Another preferred scheme of the invention is to mix the two to obtain a mixed music piece, and then edit the mixed music piece through the background music editing interface. For example, suppose the cut pieces are cut piece a, cut piece b, cut piece c and cut piece d, and the recorded pieces are recorded piece e, recorded piece f and recorded piece g. The user may mix cut piece a with recorded piece e to obtain mixed piece h; or mix cut piece a, cut piece b and recorded piece e to obtain mixed piece i; or, of course, mix cut pieces a, b, c and d with recorded pieces e, f and g to obtain mixed piece j. In this way, the user's own sound can be combined with the music file for editing, giving the user many ways to edit background music.
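A sketch of such mixing, again assuming pydub; overlay() lays one piece over another (anything extending past the base piece is dropped), and mix_pieces is a hypothetical helper:

    from pydub import AudioSegment

    def mix_pieces(pieces: list[AudioSegment]) -> AudioSegment:
        """Mix several cut / recorded pieces by overlaying them on the first one,
        e.g. mix_pieces([cut_a, recording_e]) -> mixed piece h."""
        base = pieces[0]
        for extra in pieces[1:]:
            base = base.overlay(extra)      # overlay starting at position 0 ms
        return base

    mixed_h = mix_pieces([AudioSegment.from_file("cut_a.mp3"),
                          AudioSegment.from_file("recording_e.wav")])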
Preferably, the step S2 specifically includes:
s21, predefining a time interval, and dividing the external music into one or more external music pieces according to the time interval;
and S22, selecting one or more corresponding musical instruments to edit the external music pieces to obtain background music.
In step S21, the duration interval may be 1 min, 3 min, 5 min, or the like. For example, with a 1 min interval, external music lasting 5 min is divided into 5 external music pieces, each 1 min long; with a 3 min interval, external music lasting 5 min is divided into 2 external music pieces of 3 min and 2 min respectively; with a 5 min interval, external music lasting 5 min gives 1 external music piece of 5 min. In other words, the duration interval may simply equal the duration of the external music, in which case the external music piece is the whole piece of external music.
Of course, the external music pieces may be obtained by dividing the clip music pieces, the recording music pieces, or the mixing music pieces according to the time length intervals.
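The segmentation by duration interval can be sketched as a simple slicing loop, under the same pydub assumption; split_by_interval is a hypothetical name:

    from pydub import AudioSegment

    def split_by_interval(music: AudioSegment, interval_min: float) -> list[AudioSegment]:
        """Split music into pieces of at most interval_min minutes each;
        a 5 min track with a 3 min interval yields pieces of 3 min and 2 min."""
        step = int(interval_min * 60_000)
        return [music[i:i + step] for i in range(0, len(music), step)]

    pieces = split_by_interval(AudioSegment.from_file("external.mp3"), interval_min=3)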
In step S22, one or more musical instruments are selected through the background music editing interface, then musical instrument music pieces are edited through the selected musical instruments, and then the musical instrument music pieces are fused with the external music pieces to obtain background music. For example, selecting a guitar and a piano to edit to obtain a musical instrument music piece a, and then fusing the musical instrument music piece a with an external music piece b to obtain background music A; for another example, selecting guitar, piano, violin, music box, drum, bamboo flute and saxophone to edit to obtain musical instrument music piece c, and then fusing the musical instrument music piece c with external music piece B to obtain background music B. Thus, the external music piece can be edited, and the background music can be obtained.
Preferably, the step S22 specifically includes:
s221, selecting two or more external music pieces to be combined to obtain combined music pieces;
s222, selecting one or more corresponding musical instruments to edit the combined music piece to obtain background music.
In step S221, the user has segmented the external music into a plurality of external music pieces. The user can select one or more of these pieces to be combined as needed, obtaining a combined music piece. For example, after the user divides the external music into external music piece a, external music piece b and external music piece c, external music piece a and external music piece c can be combined to obtain combined music piece s; or external music piece b and external music piece c can be combined to obtain combined music piece y.
In step S222, the user may edit the combined music piece to obtain background music. For example, selecting a piano and a violin to edit to obtain an instrument music piece a, and then fusing the instrument music piece a and the combined music piece s to obtain background music C; for another example, selecting guitar, piano, violin, music box, drum, bamboo flute and saxophone to edit to obtain musical instrument music piece b, and then fusing musical instrument music piece b with combined music piece y to obtain background music D.
The invention can not only combine external music pieces, but can also directly combine several pieces of external music into combined music, and then select one or more corresponding musical instruments to edit the combined music to obtain background music. The specific editing method has already been described in detail in the steps above, and is therefore not repeated here.
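Combining pieces amounts to plain concatenation, which the following pydub-based sketch illustrates (combine_pieces is a hypothetical helper):

    from pydub import AudioSegment

    def combine_pieces(pieces: list[AudioSegment]) -> AudioSegment:
        """Concatenate selected external music pieces into one combined piece,
        e.g. combine_pieces([piece_a, piece_c]) -> combined piece s."""
        combined = pieces[0]
        for piece in pieces[1:]:
            combined = combined + piece     # '+' appends one segment after another
        return combined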
Preferably, the user may also define a fusion start time and a fusion end time. For example, if the duration of the musical instrument music piece a is 1min, the duration of the external music b is 5min, the fusion start time is 3min, and the fusion end time is 4min, then the musical instrument music piece a will start to fuse in the third minute of the external music b and end to fuse in the fourth minute of the external music b.
In the above mode, the musical instrument music pieces and the external music can be fused according to the requirements of the user, so that various background music can be obtained.
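A sketch of fusing an instrument piece into external music over a fusion window, assuming pydub; the 3 min to 4 min example above corresponds to fuse_start_min=3, fuse_end_min=4:

    from pydub import AudioSegment

    def fuse(external: AudioSegment, instrument: AudioSegment,
             fuse_start_min: float, fuse_end_min: float) -> AudioSegment:
        """Overlay an instrument piece onto the external music between the
        fusion start time and fusion end time (both in minutes)."""
        start_ms = int(fuse_start_min * 60_000)
        end_ms = int(fuse_end_min * 60_000)
        window = instrument[: end_ms - start_ms]          # keep only what fits the window
        return external.overlay(window, position=start_ms)

    background_A = fuse(AudioSegment.from_file("external_b.mp3"),
                        AudioSegment.from_file("instrument_a.mp3"),
                        fuse_start_min=3, fuse_end_min=4)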
Preferably, the step S2 is followed by the step of:
and S3, uploading the background music to a material library or outputting the background music to a mobile storage device.
In step S3, the user can upload the background music edited in step S2 to the material library. The material library specifically refers to a server that stores background music, from which the user can download it. The mobile storage device is, for example, a USB flash drive, a portable hard disk or an optical disc. In this way, the background music can be saved and recalled.
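As a rough illustration only, uploading to a material library could look like the following sketch using the requests library; the endpoint URL and form field name are entirely hypothetical, since the patent does not specify the library's API:

    import requests

    def upload_background_music(path: str, library_url: str) -> bool:
        """Upload an edited background-music file to the material library server."""
        with open(path, "rb") as f:
            response = requests.post(library_url, files={"file": f})
        return response.ok

    # e.g. upload_background_music("background_A.mp3", "https://example.com/library/upload")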
Preferably, the background music editing interface is further provided with a tutorial module.
The official side (specifically, the developer that provides the background music editing interface) can upload background-music editing tutorials, so as to better help users edit background music; of course, a user may also choose to upload background music he or she has edited to the tutorial module, together with the creation steps. For example, if background music D is obtained by editing through steps sa, sb and sc, then background music D and steps sa, sb and sc are uploaded to the tutorial module together, so that other users can learn more conveniently.
Preferably, when the user edits the musical instrument music piece, the sounds corresponding to the notes are played synchronously.
In the prior art, when a user edits an instrument music piece, the user cannot hear the corresponding sound directly; only after editing is finished and the play button is clicked can the edited piece be heard, and if a single note is wrong, correcting it is very troublesome. The present invention plays the instrument sounds immediately, which solves these problems well.
Specifically, when a user edits an instrument music piece with one of the instruments, the notes input by the user are played synchronously by a playback device. For example, if the user inputs note A on the background music editing interface, the sound corresponding to A is played at the moment A is entered, or immediately afterwards. In this way, the user can judge whether the note just entered is wrong and, if so, correct it in time.
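Instant note playback can be sketched by synthesising a short tone for each note as it is entered; this assumes numpy and sounddevice, and the small frequency table is illustrative only:

    import numpy as np
    import sounddevice as sd

    NOTE_FREQS = {"A": 440.0, "B": 493.88, "C": 261.63}   # illustrative subset

    def play_note(note: str, duration: float = 0.3, samplerate: int = 44_100) -> None:
        """Play the tone for a note the instant the user enters it, so a wrong
        note can be heard and corrected immediately."""
        t = np.linspace(0, duration, int(samplerate * duration), endpoint=False)
        tone = 0.3 * np.sin(2 * np.pi * NOTE_FREQS[note] * t)
        sd.play(tone, samplerate)
        sd.wait()

    play_note("A")   # the user enters note A and hears it at once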
Preferably, the background music editing interface automatically acquires the current time and sends a corresponding instruction to the tutorial module.
Specifically, the background music editing interface acquires the current time and then determines the holiday corresponding to it. For example, 25 December corresponds to Christmas; a corresponding instruction is then sent to the tutorial module, which can push a tutorial for editing Christmas songs according to the instruction. In this way, users are reminded of holidays and an editing tutorial that suits the occasion is pushed to them.
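The holiday check reduces to mapping the current date to a festival and emitting an instruction for the tutorial module; the table and instruction format below are assumptions made only for illustration:

    import datetime

    HOLIDAYS = {(12, 25): "christmas", (1, 1): "new_year"}   # illustrative table

    def holiday_instruction(today: datetime.date | None = None) -> str | None:
        """Return the instruction the editing interface would send to the
        tutorial module for the current date, or None on ordinary days."""
        today = today or datetime.date.today()
        holiday = HOLIDAYS.get((today.month, today.day))
        return None if holiday is None else f"push_tutorial:{holiday}"

    # On 25 December this yields "push_tutorial:christmas", prompting a
    # Christmas-song editing tutorial to be pushed.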
It should be noted that the present invention is specifically applied to the music module of the source code editor. The source code editor is a code editor and comprises a drawing board module, a music module and the like.
Referring to fig. 2, the present invention also provides an electronic device 10, which includes:
a processor 110 adapted to implement instructions, an
A storage device 120 adapted to store a plurality of instructions adapted to be loaded and executed by a processor:
adding external music to a background music editing interface;
and selecting one or more musical instruments in the background music editing interface to edit the external music to obtain background music.
The processor 110 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a single-chip microcomputer, an ARM (Acorn RISC Machine) processor or another programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components. The processor may also be any conventional processor, microprocessor or state machine, and may be implemented as a combination of computing devices, e.g. a combination of a DSP and a microprocessor, several microprocessors, one or more microprocessors together with a DSP core, or any other such configuration.
The storage device 120, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions corresponding to the background music processing method based on artificial intelligence in the embodiments of the present invention. The processor executes various functional applications and data processing of the background music processing method based on artificial intelligence, that is, implements the background music processing method based on artificial intelligence in the above-described method embodiments, by executing the nonvolatile software program, instructions, and units stored in the storage device.
Preferably, after the step of adding the external music to a background music editing interface and before the step of selecting one or more musical instruments in the background music editing interface to edit the external music, the method further comprises:
and converting the format of the external music into an MP3 format through the background music editing interface.
Preferably, the step of editing the external music through one or more musical instruments selected from the background music editing interface to obtain the background music specifically includes:
predefining a time interval, and dividing the external music into one or more external music pieces according to the time interval;
and selecting one or more corresponding musical instruments to edit the external music segments to obtain background music.
Details of the electronic device 10 have been described in the above steps, and thus are not described in detail.
It is to be understood that the invention is not limited to the examples described above, but that modifications and variations may be effected thereto by those of ordinary skill in the art in light of the foregoing description, and that all such modifications and variations are intended to be within the scope of the invention as defined by the appended claims.

Claims (5)

1. A background music processing method based on artificial intelligence is characterized by comprising the following steps:
A. adding external music to a background music editing interface in advance;
B. selecting one or more musical instruments in the background music editing interface to edit the external music to obtain background music;
the user may define a fusion start time and a fusion end time;
the background music editing interface is also provided with a tutorial module;
the background music editing interface automatically acquires the current time and sends a corresponding instruction to the tutorial module;
when a user edits musical instrument music clips, synchronously playing the sound corresponding to the musical notes;
the step of selecting one or more corresponding musical instruments to edit the external music piece to obtain background music specifically comprises:
selecting two or more external music pieces to be combined to obtain combined music pieces;
selecting one or more corresponding musical instruments to edit the combined music segments to obtain background music;
the step B specifically comprises the following steps:
predefining a time interval, and dividing the external music into one or more external music pieces according to the time interval;
selecting one or more corresponding musical instruments to edit the external music segments to obtain background music;
after the step A and before the step B, the method also comprises the following steps:
predefining a starting time and an ending time, and cutting the external music according to the starting time and the ending time to obtain cut music fragments;
the step B specifically comprises the following steps:
selecting one or more recording fragments and one or more cutting music fragments to be mixed to obtain mixed music fragments;
and selecting one or more musical instruments in the background music editing interface to edit the mixed music fragment to obtain background music.
2. The artificial intelligence based background music processing method according to claim 1, wherein after step A and before step B, the method comprises:
and converting the format of the external music into an MP3 format through the background music editing interface.
3. The artificial intelligence based background music processing method according to claim 1, wherein after step A and before step B, the method further comprises the steps of:
calling a recording device through the background music editing interface;
recording the recording music clips through the recording equipment.
4. An electronic device, comprising:
a processor adapted to implement the instructions, an
A storage device adapted to store a plurality of instructions, the instructions adapted to be loaded and executed by a processor to:
adding external music to a background music editing interface;
selecting one or more musical instruments in the background music editing interface to edit the external music to obtain background music;
the user may define a fusion start time and a fusion end time;
the background music editing interface is also provided with a tutorial module;
the background music editing interface automatically acquires the current time and sends a corresponding instruction to the tutorial module;
when a user edits musical instrument music clips, synchronously playing the sound corresponding to the musical notes;
the step of selecting one or more corresponding musical instruments to edit the external music piece to obtain background music specifically comprises:
selecting two or more external music pieces to be combined to obtain combined music pieces;
selecting one or more corresponding musical instruments to edit the combined music segments to obtain background music;
the step of editing the external music through one or more musical instruments selected from the background music editing interface to obtain background music specifically comprises:
predefining a time interval, and dividing the external music into one or more external music pieces according to the time interval;
selecting one or more corresponding musical instruments to edit the external music segments to obtain background music;
after the external music is added to a background music editing interface in advance and before one or more musical instruments in the background music editing interface are selected to edit the external music to obtain background music, the method further comprises the following steps:
predefining a starting time and an ending time, and cutting the external music according to the starting time and the ending time to obtain cut music fragments;
the selecting one or more musical instruments in the background music editing interface to edit the external music to obtain the background music specifically comprises:
selecting one or more recording fragments and one or more cutting music fragments to be mixed to obtain mixed music fragments;
and selecting one or more musical instruments in the background music editing interface to edit the mixed music fragment to obtain background music.
5. The electronic device of claim 4, wherein after the step of adding the external music to a background music editing interface and before the step of selecting one or more musical instruments in the background music editing interface to edit the external music to obtain the background music, the method further comprises:
and converting the format of the external music into an MP3 format through the background music editing interface.
CN201810737667.8A 2018-07-06 2018-07-06 Background music processing method based on artificial intelligence and electronic equipment Active CN108806732B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810737667.8A CN108806732B (en) 2018-07-06 2018-07-06 Background music processing method based on artificial intelligence and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810737667.8A CN108806732B (en) 2018-07-06 2018-07-06 Background music processing method based on artificial intelligence and electronic equipment

Publications (2)

Publication Number Publication Date
CN108806732A CN108806732A (en) 2018-11-13
CN108806732B (en) 2021-04-30

Family

ID=64075636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810737667.8A Active CN108806732B (en) 2018-07-06 2018-07-06 Background music processing method based on artificial intelligence and electronic equipment

Country Status (1)

Country Link
CN (1) CN108806732B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110430326B (en) * 2019-09-10 2021-02-02 Oppo广东移动通信有限公司 Ring editing method and device, mobile terminal and storage medium
CN111276115A (en) * 2020-01-14 2020-06-12 孙志鹏 Cloud beat

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104778216A (en) * 2015-03-20 2015-07-15 广东欧珀移动通信有限公司 Method and device for processing songs with preset styles
CN106780673A (en) * 2017-02-13 2017-05-31 杨金强 A kind of animation method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2539875B (en) * 2015-06-22 2017-09-20 Time Machine Capital Ltd Music Context System, Audio Track Structure and method of Real-Time Synchronization of Musical Content

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104778216A (en) * 2015-03-20 2015-07-15 广东欧珀移动通信有限公司 Method and device for processing songs with preset styles
CN106780673A (en) * 2017-02-13 2017-05-31 杨金强 A kind of animation method and system

Also Published As

Publication number Publication date
CN108806732A (en) 2018-11-13

Similar Documents

Publication Publication Date Title
US11277215B2 (en) System and method for generating an audio file
JP4413144B2 (en) System and method for portable speech synthesis
CN108806732B (en) Background music processing method based on artificial intelligence and electronic equipment
WO2009010009A1 (en) Prompting message forming method and device for mobile terminal
Huber The Midi manual: A practical guide to Midi within Modern Music production
KR100731232B1 (en) Musical data editing and reproduction apparatus, and portable information terminal therefor
JPWO2020166094A1 (en) Information processing equipment, information processing methods and information processing programs
US20240194173A1 (en) Method, system and computer program for generating an audio output file
KR100620973B1 (en) A system for outputing sound data
WO2024120810A1 (en) Method, system and computer program for generating an audio output file
CN115440177A (en) Method, device, system and medium for controlling tone color change of electric piano
Wenttola The use of modelling technology and its effects on the music industry
Jansch Towards the open outcome record: a portfolio of works exploring strategies of freeing the record from fixity
JP2016156957A (en) Musical instrument and musical instrument system
IE20130120U1 (en) A system and method for generating an audio file
JP2008300912A (en) Program production system and producing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211229

Address after: 430000 r05-r10, 16 / F, tower a, Guanggu new development international center, Guanggu Avenue, Donghu New Technology Development Zone, Wuhan, Hubei Province

Patentee after: Wuhan programming cat Technology Co.,Ltd.

Address before: 518000 25-26 / F, China Construction steel structure building, 3331 Central Road, Yuehai street, Shenzhen, Guangdong Province

Patentee before: SHENZHEN DIANMAO TECHNOLOGY Co.,Ltd.