CN113035157A - Graphical music editing method, system and storage medium - Google Patents


Info

Publication number: CN113035157A
Application number: CN202110120091.2A
Authority: CN (China)
Prior art keywords: playing, notes, music, graphical, note
Legal status: Granted; currently Active (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Other languages: Chinese (zh)
Other versions: CN113035157B (en)
Inventors: 孙悦, 李天驰, 蔡欣嘉
Current Assignee: Shenzhen Dianmao Technology Co Ltd (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original Assignee: Shenzhen Dianmao Technology Co Ltd
Application filed by Shenzhen Dianmao Technology Co Ltd
Priority to CN202110120091.2A; granted and published as CN113035157B


Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/0008 Associated control or indicating means
    • G10H1/0025 Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/101 Music Composition or musical creation; Tools or processes therefor
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/126 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters for graphical editing of individual notes, parts or phrases represented as variable length segments on a 2D or 3D representation, e.g. graphical edition of musical collage, remix files or pianoroll representations of MIDI-like files


Abstract

The invention discloses a graphical music editing method, system and storage medium. The method comprises the following steps: detecting a mouse event input by a user on a music canvas, and editing a corresponding graphical note on the music canvas according to the mouse event; converting the graphical note into a corresponding data structure according to preset rules, the data structure comprising the position information, start time information and end time information of the graphical note; and, when a playing instruction is received, playing the corresponding note at each playing time point according to the data structure. In the embodiment of the invention, graphical notes are edited directly on the music canvas by detecting mouse events, and the edited notes are converted into a corresponding data structure before playback. Because each graphical note carries its own start and end times, notes of different durations can be edited and played back.

Description

Graphical music editing method, system and storage medium
Technical Field
The invention relates to the technical field of graphical programming, in particular to a graphical music editing method, a graphical music editing system and a storage medium.
Background
In a conventional visual music editor, notes are generally drawn with bitmap-mode canvas and similar development technologies. In such note drawing, however, each beat of each note is an independent graphic, so it is impossible to determine whether adjacent notes are continuous or independent. As a result, all notes can only be played with a fixed duration, and sustained or long sounds can be neither drawn nor presented, which limits the editing output of a graphical music editor.
Accordingly, the prior art is yet to be improved and developed.
Disclosure of Invention
In view of the foregoing disadvantages of the prior art, an object of the present invention is to provide a graphical music editing method, system and storage medium that solve the problem that prior-art graphical music editing cannot edit notes of different durations.
To achieve the above object, the invention adopts the following technical scheme:
a graphical music editing method comprises the following steps:
detecting a mouse event input by a user in a music canvas, and editing a corresponding graphic note on the music canvas according to the mouse event;
converting the graphic notes into corresponding data structures according to preset rules, wherein the data structures comprise position information, starting time information and ending time information of the graphic notes;
and when a playing instruction is received, playing corresponding musical notes at each playing time point according to the data structure.
In the graphical music editing method, before the step of detecting a mouse event input by a user in a music canvas and editing a corresponding graphical note on the music canvas according to the mouse event, the graphical music editing method further includes:
and building a music canvas for editing the musical notes based on a vector graphic library, wherein the editing area of the music canvas is of a grid structure.
In the graphical music editing method, the step of detecting a mouse event input by a user in a music canvas and editing a corresponding graphical note on the music canvas according to the mouse event comprises the following steps:
detecting a mouse pressing event and a mouse moving event input by a user;
respectively triggering a drawing function or a frame selection function according to the direction information in the mouse moving event;
and detecting a mouse release event input by a user, and drawing a graphic note with a corresponding length or selecting a graphic note in a corresponding area according to the mouse pressing position and the mouse release position.
In the graphical music editing method, the step of triggering a drawing function or a framing function according to the direction information in the mouse movement event includes:
when the direction information in the mouse moving event is transverse, triggering a drawing function;
and triggering a frame selection function when the direction information in the mouse moving event is longitudinal.
In the graphical music editing method, the step of converting the graphical musical notes into corresponding data structures according to preset rules, wherein the data structures include position information, start time information and end time information of the graphical musical notes, includes:
acquiring position information, starting time information and ending time information of the graphic notes;
storing the position information, the start time information and the end time information of the graphic notes into an array of playing data;
and storing the array of the playing data and the playing time point into the array of the time data, and forming the data structure by the array of the time data according to a time increasing rule.
In the graphical music editing method, the step of playing the corresponding note at each playing time point according to the data structure when the playing instruction is received includes:
when a playing instruction is received, acquiring playing start points, duration values and pitches of all the graphic notes according to the data structure;
and playing corresponding notes at each playing time point according to the playing starting points, the durations and the pitches of all the graphic notes.
In the method for editing graphical music, before the step of playing corresponding notes at each playing time point according to the playing start point, duration and pitch of all graphical notes, the method further comprises:
detecting a positioning time point of a current positioning line;
comparing the positioning time point with the playing start points of all the graphic notes, and judging whether the playing start point is earlier than the positioning time point;
if yes, starting playing from the positioning time point; otherwise, the playing is started from the earliest playing starting point in all the playing starting points.
In the graphical music editing method, after the step of playing the corresponding note at each playing time point according to the data structure when the playing instruction is received, the method further includes:
and switching the graphic notes in the playing state into a highlight state.
Another embodiment of the present invention also provides a graphical music editing system, including: a processor, a memory, and a communication bus;
the memory has stored thereon a computer readable program executable by the processor;
the communication bus realizes connection communication between the processor and the memory;
the processor, when executing the computer readable program, implements the steps in the graphical music editing method as described above.
Another embodiment of the present invention also provides a computer-readable storage medium, wherein the computer-readable storage medium stores one or more programs, which are executable by one or more processors to implement the steps in the graphical music editing method as described above.
Compared with the prior art, in the graphical music editing method, system and storage medium provided by the invention, corresponding graphical notes are edited directly on the music canvas by detecting mouse events, and the edited notes are converted into a corresponding data structure before playback. Because each graphical note carries its own start and end times, notes of different durations can be edited and played back.
Drawings
FIG. 1 is a flowchart illustrating a graphical music editing method according to a preferred embodiment of the present invention;
FIG. 2 is a flowchart illustrating the step S10 according to the preferred embodiment of the present invention;
FIG. 3 is a flowchart of detecting mouse events to edit graphical notes in an embodiment of the graphical music editing method;
FIG. 4 is a flowchart illustrating step S20 according to a preferred embodiment of the present invention;
FIG. 5 is a flowchart illustrating step S30 according to a preferred embodiment of the present invention;
FIG. 6 is a flowchart illustrating steps S33, S34 and S35 of the graphical music editing method according to the present invention;
FIG. 7 is an interface diagram of a portion of a music canvas and graphical notes in an embodiment of the graphical music editing method application provided by the present invention;
FIG. 8 is a diagram illustrating a hardware configuration of a preferred embodiment of a graphical music editing system according to the present invention;
FIG. 9 is a functional block diagram of a system for installing a graphical music editing program according to a preferred embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and effects of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein merely illustrate the invention and are not intended to limit it.
Referring to fig. 1, the method for editing graphical music provided by the present invention includes the following steps:
s10, detecting a mouse event input by a user in the music canvas, and editing corresponding graphic musical notes on the music canvas according to the mouse event.
In this embodiment, the corresponding graphical notes are edited by detecting the mouse events the user inputs on the music canvas. Editing operations, including drawing, deleting and copying, are all driven by mouse events, which makes note editing intuitive and flexible. Different mouse inputs yield notes as corresponding vector graphics, so graphical note editing is no longer limited to fixed graphics, and continuous and independent notes are effectively distinguished.
Specifically, the user may edit graphical notes on an existing music canvas to re-edit existing music, or edit notes on a newly created blank music canvas. In an optional embodiment, therefore, step S10 is preceded by a step of building a music canvas for editing notes, specifically building the music canvas based on the vector graphics library paper.js.
Specifically, referring to fig. 2, which is a flowchart of step S10 in the graphical music editing method provided by the present invention, as shown in fig. 2, the step S10 includes:
s11, detecting a mouse pressing event and a mouse moving event input by a user;
s12, respectively triggering a drawing function or a frame selection function according to the direction information in the mouse moving event;
and S13, detecting a mouse release event input by a user, and drawing a graphic note with a corresponding length or selecting a graphic note in a corresponding area according to the mouse pressing position and the mouse release position.
In this embodiment, the user performs note editing operations on the music canvas with the mouse. The music canvas consists, from bottom layer to top layer, of a grid layer, a drawing layer and a play-line layer. The bottom grid layer divides the editing and drawing area into rectangular grid cells; specifically, the editing area on the music canvas is divided into cells by closed, evenly spaced lines drawn with paper.js. The second layer is the drawing layer holding the rectangular graphical notes: a rectangle drawn with paper.js serves as a note, and the drawn rectangle's edges are matched to the grid by position checks to obtain the corresponding graphical note, where a run of continuous rectangles represents a sustained note and a single rectangle represents a single-beat note. The third layer is the play-line layer, used to position the moving line while notes are played.
Thus the user draws graphical notes with the mouse, the drawing taking place on the drawing layer. First a mouse-down event input by the user is detected, then the mouse-move event that follows it, and the drawing function or the box-selection function is triggered according to the current direction of movement: the drawing function when the direction in the mouse-move event is horizontal, the box-selection function when it is vertical. Specifically, the two directions can be distinguished by comparing the vertical coordinate at mouse-down with the vertical coordinate after the move: when the difference between them is smaller than a preset threshold, the movement is judged horizontal; otherwise it is judged vertical.
After one of the functions is triggered, the mouse-release event input by the user is detected, and a graphical note of the corresponding length is drawn, or the graphical notes in the corresponding area are box-selected, by comparing the mouse-down position with the mouse-release position. By controlling where the mouse is pressed and released and in which direction it moves, the user can draw varied graphical notes or select the notes in different areas. When drawing, for example, a single-beat note is drawn by clicking one grid cell, while a sustained note is drawn by clicking a cell, dragging horizontally across several cells, and then releasing.
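The click-versus-drag behaviour just described can be sketched as pure grid geometry: the press and release coordinates are snapped to whole grid cells, a click yielding a single-cell note and a horizontal drag a multi-cell sustained note. The function and parameter names below, and the pixel cell sizes, are illustrative assumptions, not the patent's actual implementation.

```typescript
// Sketch of snapping a drawn note to the grid. `cellW`/`cellH` are the
// grid cell size in pixels; all identifiers are illustrative assumptions.
interface GridNote {
  row: number;        // pitch row (0-based, from the top)
  startCol: number;   // first beat column covered by the note
  lengthCols: number; // duration in beat columns (>= 1)
}

function snapNoteToGrid(
  pressX: number, pressY: number, releaseX: number,
  cellW: number, cellH: number
): GridNote {
  const row = Math.floor(pressY / cellH);
  const c0 = Math.floor(Math.min(pressX, releaseX) / cellW);
  const c1 = Math.floor(Math.max(pressX, releaseX) / cellW);
  // A click (press == release) covers one cell: a single-beat note.
  // A horizontal drag across several cells yields a sustained note.
  return { row, startCol: c0, lengthCols: c1 - c0 + 1 };
}
```

With 30 by 20 pixel cells, for instance, a click at (35, 45) yields a one-cell note in row 2, column 1, while dragging from x = 5 to x = 95 in the top row yields a four-cell sustained note.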
In an optional application embodiment, fig. 3 shows the flow of editing a graphical note by detecting mouse events. In this embodiment, the mouse-event-driven drawing and box-selection scheme comprises the following steps:
s101, pressing down a mouse;
s102, moving a mouse;
s103, judging the moving direction of the mouse, if the moving direction is horizontal, executing a step S104, and if the moving direction is vertical, executing a step S107;
s104, dynamically generating a note rectangle from a mouse pressing coordinate to a current coordinate;
s105, releasing the mouse;
s106, attaching the current note to the grid;
s107, dynamically generating a dotted rectangle from the mouse pressing coordinate to the current coordinate;
s108, releasing the mouse;
and S109, framing all the notes in the dotted line area.
Specifically, in this optional embodiment, the mouse-down and mouse-move events are detected first, and the direction of movement is then determined. Horizontal movement enters drawing mode, in which a note rectangle from the press coordinate to the current coordinate is generated dynamically, i.e. the rectangle grows with the mouse until it is released, at which point the current rectangle is snapped to the grid to obtain the final graphical note. In particular, when the mouse is released in drawing mode, the canvas is searched for existing notes that overlap the newly drawn note; any overlapping notes are deleted and only the newly drawn note is kept, which guarantees correct drawing and avoids data errors such as overlapping notes. Vertical movement enters box-selection mode, in which a dashed selection rectangle from the press coordinate to the current coordinate is generated dynamically until the mouse is released, whereupon all notes inside the dashed area are selected, i.e. every note falling within the dashed rectangle is marked as selected. The selected notes can then be copied, pasted, deleted, dragged, moved, scaled and so on according to subsequent mouse and keyboard events, implementing the editing and modification of the drawn graphical notes.
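The direction test at step S103 compares the vertical coordinate at mouse-down with the current vertical coordinate against a preset threshold, as described above. A minimal sketch; the default threshold value and the names are illustrative assumptions.

```typescript
// Sketch of the direction test: the drag counts as horizontal (drawing)
// while the vertical displacement stays under a threshold, otherwise as
// vertical (box selection). Threshold and names are assumptions.
type DragMode = "draw" | "select";

function classifyDrag(pressY: number, currentY: number, threshold = 10): DragMode {
  return Math.abs(currentY - pressY) < threshold ? "draw" : "select";
}
```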
And S20, converting the graphic notes into corresponding data structures according to preset rules, wherein the data structures comprise the position information, the starting time information and the ending time information of the graphic notes.
In this embodiment, after the corresponding graphical notes are drawn, they are converted into a corresponding data structure according to preset rules in order to achieve accurate and continuous playback. The conversion yields precise playback information for each graphical note, so the note data to be processed at each time point is known in advance, which greatly reduces the stutter caused by detection and calculation during playback and produces accurate, smooth note playback.
Specifically, please refer to fig. 4, which is a flowchart of step S20 in the graphical music editing method according to the present invention. As shown in fig. 4, the step S20 includes:
s21, acquiring the position information, the starting time information and the ending time information of the graphic notes;
s22, storing the position information, the start time information and the end time information of the graphic notes into an array of playing data;
s23, storing the array of the playing data and the playing time point into the array of the time data, and forming the data structure by the array of the time data according to the rule of increasing time.
In this embodiment, the position information, start time information and end time information of each graphical note are obtained during conversion. Specifically, the horizontal axis of the editing area on the music canvas is a time axis and the vertical axis is a pitch axis: each row corresponds to a pitch and each column to a beat, so the position, start time and end time of a note the user has drawn follow from the note's position and length. This information is stored in an array of play data, recording that a given note (the nth note of a given row) starts or stops playing at a certain time point. The array of play data, together with its playing time point, is then stored in an array of time data, recording which notes start or stop within that time. Ordering the time-data array by increasing time yields the data structure, enabling precise playback control of the graphical notes the user has drawn.
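The conversion in steps S21 to S23 can be sketched using the PlayData and TimeData shapes that the application embodiment below defines; the Note shape and the toTimeData function are illustrative assumptions, not the patent's code.

```typescript
// Sketch of steps S21-S23: turn drawn notes into a time-ordered array of
// start/stop events. PlayData/TimeData follow the interfaces given in the
// description; Note and toTimeData are illustrative assumptions.
interface PlayData { type: "start" | "end"; row: number; index: number; }
interface TimeData { time: number; data: PlayData[]; }
interface Note { row: number; index: number; start: number; end: number; }

function toTimeData(notes: Note[]): TimeData[] {
  const byTime = new Map<number, PlayData[]>();
  const push = (time: number, d: PlayData) => {
    if (!byTime.has(time)) byTime.set(time, []);
    byTime.get(time)!.push(d);
  };
  for (const n of notes) {
    push(n.start, { type: "start", row: n.row, index: n.index });
    push(n.end,   { type: "end",   row: n.row, index: n.index });
  }
  // Order entries by increasing time, as the description requires.
  return [...byTime.entries()]
    .sort(([a], [b]) => a - b)
    .map(([time, data]) => ({ time, data }));
}
```

Applied to the three notes of the fig. 7 example (rows 1-3, starting at times 1, 4 and 2), this produces entries at times 1, 2, 4 and 6, with the entry at time 4 holding three events: two note ends and one note start.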
And S30, when a playing instruction is received, playing corresponding musical notes at each playing time point according to the data structure.
In this embodiment, after the conversion produces the corresponding data structure, the corresponding note is played at each playing time point according to that structure when the user inputs a play instruction. Because the data structure contains the exact position, start time and end time of each graphical note, the playing state of a note no longer has to be determined in the conventional way, by testing during playback whether the positioning line intersects the note, which improves playback accuracy. Playing from the data structure also means that, once a sustained note has been drawn, its start and end times are recognised and the sustained sound is reproduced on playback, improving the output of graphical music editing.
Specifically, please refer to fig. 5, which is a flowchart of step S30 in the graphical music editing method according to the present invention. As shown in fig. 5, the step S30 includes:
s31, when a playing instruction is received, acquiring playing start points, duration values and pitches of all the graphic notes according to the data structure;
and S32, playing corresponding notes at each playing time point according to the playing start points, the durations and the pitches of all the graphic notes.
In this embodiment, the playing start point, duration and pitch of every graphical note are obtained from the data structure at playback time. Specifically, the pitch is given by the row in which the note lies, the playing start point by the time point of the note's leftmost edge, and the duration by the number of grid cells the note occupies (one cell corresponding to one beat). Once playback starts, notes of the corresponding pitch and duration are played according to each note's start-or-stop state at each playing time point, providing a preview of the graphically edited music.
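A minimal sketch of deriving the playback parameters as described: row gives the pitch, the leftmost cell gives the start beat, and the cell count gives the duration in beats. The row-to-pitch table and all names below are illustrative assumptions.

```typescript
// Sketch: map a graphical note's grid geometry to playback parameters.
// The pitch assigned to each row is an illustrative assumption.
const ROW_PITCH = ["high do", "high si", "high la", "high sol"];

function playbackParams(n: { row: number; startCol: number; lengthCols: number }) {
  return {
    pitch: ROW_PITCH[n.row],      // row determines pitch
    startBeat: n.startCol,        // leftmost cell determines the start point
    durationBeats: n.lengthCols,  // one occupied cell = one beat
  };
}
```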
Further, referring to fig. 6, before the step S32, the method further includes:
s33, detecting the positioning time point of the current positioning line;
s34, comparing the positioning time point with the playing start points of all the graphic notes, and judging whether the playing start point is earlier than the positioning time point;
s35, if yes, starting playing from the positioning time point; otherwise, the playing is started from the earliest playing starting point in all the playing starting points.
In this embodiment, a positioning line that marks the playing time point is placed on the play-line layer of the music canvas, and it moves to the current playing time point as notes are played. Before playback, therefore, the position of the current positioning line is detected and the starting point of playback is adjusted accordingly. If some playing start point is earlier than the positioning time point, the positioning line does not precede all graphical notes, for example because playback was paused partway through, or because the user moved the line to a particular note for preview; playback then starts from the positioning time point, improving preview efficiency. If no playing start point is earlier than the positioning time point, the graphical notes are in the default play-from-the-beginning state, and playback starts from the earliest of all the playing start points, satisfying the different preview requirements.
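The start-point rule of steps S33 to S35 reduces to a single comparison: if any note begins before the positioning line's time point, playback starts from the line; otherwise it starts from the earliest note. The function and parameter names are illustrative assumptions.

```typescript
// Sketch of steps S33-S35: choose where playback begins, given the
// positioning line's time point and the start points of all notes.
function resolvePlayStart(locatorTime: number, noteStarts: number[]): number {
  const earliest = Math.min(...noteStarts);
  // Some note starts before the locator: resume from the locator.
  // Otherwise: default state, start from the earliest note.
  return earliest < locatorTime ? locatorTime : earliest;
}
```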
Further, the step S30 is followed by:
and switching the graphic notes in the playing state into a highlight state.
In this embodiment, the note currently being played is switched to a highlighted display to show the user explicitly which note is playing. Specifically, the positioning line moves along with the music during playback; the notes intersecting the positioning line are the ones being played, and those notes are switched to the highlighted state, dynamically displaying the correspondence between the positioning line and the playing notes.
To better understand the data conversion and playing process in the graphical music editing method provided by the present invention, the data structure conversion and note playing process are described in detail below with reference to fig. 7 as an application embodiment.
FIG. 7 is an interface diagram of a portion of the music canvas and graphic notes in an application embodiment. As shown in FIG. 7, three overlapping (polyphonic) notes are drawn. The first note is the first in the first row, with a start time point of 1, an end time point of 4, and a pitch of treble do. The second note is the first in the second row, with a start time point of 4, an end time point of 6, and a pitch of treble si. The third note is the first in the third row, with a start time point of 2, an end time point of 4, and a pitch of treble la. In a specific implementation, different graphic notes may also be given different colors, for example a preset color per pitch, so that notes of different pitches are distinguished more intuitively when drawn. Before the music is played, the Paper.js rectangle vectors are transformed into a time-driven data structure according to the following preset rules:
interface PlayData {
  type: 'start' | 'end';
  row: number;
  index: number;
}
interface TimeData {
  time: number;
  data: PlayData[];
}
timeDataArr: TimeData[] = []
Here, timeDataArr is an array ordered by increasing time. During conversion, the notes drawn by the user are converted into time data entries (TimeData): each entry represents one time point at which several notes may start or stop playing, and those events are stored in an array of play data (PlayData), which records the row and index position of each note at that time point. According to the above rules, the graphic notes in fig. 7 are converted into the following data structure:
[
  {time: 1, data: [{type: 'start', row: 1, index: 1}]},
  {time: 2, data: [{type: 'start', row: 3, index: 1}]},
  {time: 4, data: [
    {type: 'end', row: 1, index: 1},
    {type: 'end', row: 3, index: 1},
    {type: 'start', row: 2, index: 1}
  ]},
  {time: 6, data: [{type: 'end', row: 2, index: 1}]}
]
The above data structure represents the following playback sequence. At time point 1, the 1st note in row 1 enters the start state: the treble do note is played and switched to the highlight state.
At time point 2, the 1st note in row 3 enters the start state: the treble la note is played and highlighted.
At time point 3, no note needs to change at the current time point, so the playing state is unchanged.
At time point 4, the 1st note in row 1 and the 1st note in row 3 enter the stop state, meaning they stop after the current beat finishes, while the 1st note in row 2 enters the start state; the treble do, treble la, and treble si therefore sound simultaneously during this beat, with the treble si note in the highlight state.
At time point 5, the 1st note in row 1 and the 1st note in row 3 have stopped playing, and their highlight states are cancelled.
At time point 6, the 1st note in row 2 enters the stop state, so the treble si stops after its beat finishes and is reset to the default state; all highlights are cancelled, the positioning line returns to the default starting point, and playback ends.
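The conversion rule walked through above can be sketched as a small function. DrawnNote is an assumed input shape for a rectangle drawn on the canvas; this sketch merely reproduces the fig. 7 listing under that assumption, and is not the disclosed implementation:

```typescript
// Sketch of the preset conversion rule: each drawn note contributes a
// 'start' event at its start time and an 'end' event at its end time;
// events are grouped by time point and the entries sorted ascending.
interface PlayData { type: 'start' | 'end'; row: number; index: number; }
interface TimeData { time: number; data: PlayData[]; }
interface DrawnNote { row: number; index: number; start: number; end: number; }

function toTimeDataArr(notes: DrawnNote[]): TimeData[] {
  const byTime = new Map<number, PlayData[]>();
  const push = (time: number, event: PlayData) => {
    if (!byTime.has(time)) byTime.set(time, []);
    byTime.get(time)!.push(event);
  };
  for (const n of notes) {
    push(n.start, { type: 'start', row: n.row, index: n.index });
    push(n.end, { type: 'end', row: n.row, index: n.index });
  }
  // timeDataArr increments according to time.
  return [...byTime.entries()]
    .sort(([a], [b]) => a - b)
    .map(([time, data]) => ({ time, data }));
}
```

Applied to the three fig. 7 notes (rows 1, 2, 3 with start/end times 1-4, 4-6, and 2-4), this yields entries at time points 1, 2, 4, and 6, with three events grouped at time point 4, matching the listing above.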
It should be noted that a fixed order does not necessarily exist between the above steps. Those skilled in the art will understand, from the description of the embodiments of the present invention, that in different embodiments the above steps may have different execution orders; that is, they may also be executed in parallel, executed interchangeably, and the like.
As shown in fig. 8, based on the graphical music editing method, the present invention further provides a graphical music editing system, where the graphical music editing system may be a computing device such as a mobile terminal, a desktop computer, a notebook, a palm computer, and a server, and includes a processor 10, a memory 20, and a display 30. FIG. 8 shows only some of the components of the graphical music editing system, but it is to be understood that not all of the shown components are required and that more or fewer components may be implemented instead.
The memory 20 may in some embodiments be an internal storage unit of the graphical music editing system, such as a hard disk or memory of the system. In other embodiments the memory 20 may be an external storage device of the graphical music editing system, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash memory card (Flash Card) provided on the graphical music editing system. Further, the memory 20 may include both an internal storage unit and an external storage device of the graphical music editing system. The memory 20 is used for storing the application software installed in the graphical music editing system and various kinds of data, such as the program code of the graphical music editing system. In one embodiment, the memory 20 stores a graphical music editing program 40, and the graphical music editing program 40 can be executed by the processor 10 so as to implement the graphical music editing method of the embodiments of the present application.
The processor 10 may be a Central Processing Unit (CPU), a microprocessor or other data Processing chip in some embodiments, and is used for executing the program codes stored in the memory 20 or Processing data, such as executing the graphical music editing method.
The display 30 may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch panel, or the like in some embodiments. The display 30 is used to display information in the graphical music editing system and to display a visual user interface. The components 10-30 of the graphical music editing system communicate with each other via a system bus. In one embodiment, the steps of the graphical music editing method described above are implemented when the processor 10 executes the graphical music editing program 40 in the memory 20.
Please refer to fig. 9, which is a functional block diagram of a system for installing a graphical music editing program according to a preferred embodiment of the present invention. In this embodiment, the system for installing the graphical music editing program may be divided into one or more modules, and the one or more modules are stored in the memory 20 and executed by one or more processors (in this embodiment, the processor 10) to complete the present invention. For example, in fig. 9, the system for installing the graphical music editing program may be divided into a mouse detection module 21, a data conversion module 22, and a note playing module 23, and the mouse detection module 21, the data conversion module 22, and the note playing module 23 are connected in this order.
The mouse detection module 21 is configured to detect a mouse event input by a user in a music canvas, and edit a corresponding graphic note on the music canvas according to the mouse event;
the data conversion module 22 is configured to convert the graphic note into a corresponding data structure according to a preset rule, where the data structure includes position information, start time information, and end time information of the graphic note;
the note playing module 23 is configured to play a corresponding note at each playing time point according to the data structure when a playing instruction is received.
A module as referred to in the present invention is a series of computer program instruction segments capable of performing a specific function; modules are better suited than whole programs to describing the execution process of the graphical music editing program in the graphical music editing system. For the specific functions of the modules 21 to 23, reference is made to the embodiments corresponding to the above-mentioned methods.
In summary, in the graphical music editing method, system, and storage medium provided by the present invention, the graphical music editing method includes: building a music canvas for editing musical notes; detecting a mouse event input by a user in the music canvas, and editing a corresponding graphic note on the music canvas according to the mouse event; converting the graphic notes into a corresponding data structure according to preset rules, wherein the data structure includes position information, start time information, and end time information of the graphic notes; and, when a playing instruction is received, playing the corresponding notes at each playing time point according to the data structure. In the embodiments of the invention, corresponding graphic notes are edited directly on the music canvas by detecting mouse events, and the edited graphic notes are converted into the corresponding data structure and then played; because each graphic note carries its own start and end time, notes of different pitches can be edited and played.
The above-described embodiments are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that the embodiments may be implemented by software plus a general hardware platform, or by hardware alone. Based on this understanding, the above technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disk, or optical disk, and which includes instructions for causing a computer electronic device (which may be a personal computer, a server, or a network electronic device, etc.) to execute the methods of the various embodiments or some parts of the embodiments.
Conditional language such as "can," "might," or "may" is, unless specifically stated otherwise or otherwise understood within the context as used, generally intended to convey that particular embodiments can include (while other embodiments do not include) particular features, elements, and/or operations. Thus, such conditional language is not generally intended to imply that features, elements, and/or operations are in any way required for one or more embodiments, or that one or more embodiments must include logic for deciding, with or without input or prompting, whether such features, elements, and/or operations are included or are to be performed in any particular embodiment.
What has been described herein in the specification and drawings includes examples that can provide a graphical music editing method, system, and storage medium. It is of course not possible to describe every conceivable combination of components and/or methodologies for purposes of describing the various features of the disclosure, but it can be appreciated that many further combinations and permutations of the disclosed features are possible. It is therefore evident that various modifications can be made to the disclosure without departing from its scope or spirit. In addition, or in the alternative, other embodiments of the disclosure may be apparent from consideration of the specification and drawings and from practice of the disclosure as presented herein. The examples set forth in this specification and the drawings are intended to be considered in all respects as illustrative and not restrictive. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims (10)

1. A graphical music editing method is characterized by comprising the following steps:
detecting a mouse event input by a user in a music canvas, and editing a corresponding graphic note on the music canvas according to the mouse event;
converting the graphic notes into corresponding data structures according to preset rules, wherein the data structures comprise position information, starting time information and ending time information of the graphic notes;
and when a playing instruction is received, playing corresponding musical notes at each playing time point according to the data structure.
2. The graphical music editing method according to claim 1, wherein the step of detecting a mouse event input by a user in a music canvas, and editing a corresponding graphical note on the music canvas according to the mouse event further comprises:
and building a music canvas for editing the musical notes based on a vector graphic library, wherein the editing area of the music canvas is of a grid structure.
3. The graphical music editing method according to claim 1, wherein the step of detecting a mouse event input by a user in a music canvas, and editing a corresponding graphical note on the music canvas according to the mouse event comprises:
detecting a mouse pressing event and a mouse moving event input by a user;
respectively triggering a drawing function or a frame selection function according to the direction information in the mouse moving event;
and detecting a mouse release event input by a user, and drawing a graphic note with a corresponding length or selecting a graphic note in a corresponding area according to the mouse pressing position and the mouse release position.
4. The graphical music editing method according to claim 3, wherein the step of triggering a drawing function or a frame selection function according to the direction information in the mouse movement event includes:
when the direction information in the mouse moving event is transverse, triggering a drawing function;
and triggering a frame selection function when the direction information in the mouse moving event is longitudinal.
5. The graphical music editing method according to claim 1, wherein the step of converting the graphical note into a corresponding data structure according to a preset rule, wherein the data structure includes position information, start time information and end time information of the graphical note, comprises:
acquiring position information, starting time information and ending time information of the graphic notes;
storing the position information, the start time information and the end time information of the graphic notes into an array of playing data;
and storing the array of the playing data and the playing time point into the array of the time data, and forming the data structure by the array of the time data according to a time increasing rule.
6. The graphical music editing method according to claim 1, wherein the step of playing a corresponding note at each playing time point according to the data structure when a playing instruction is received comprises:
when a playing instruction is received, acquiring playing start points, duration values and pitches of all the graphic notes according to the data structure;
and playing corresponding notes at each playing time point according to the playing starting points, the durations and the pitches of all the graphic notes.
7. The graphical music editing method of claim 6, wherein the step of playing the corresponding note at each playing time point according to the playing start point, duration and pitch of all the graphical notes is preceded by the step of:
detecting a positioning time point of a current positioning line;
comparing the positioning time point with the playing start points of all the graphic notes, and judging whether the playing start point is earlier than the positioning time point;
if yes, starting playing from the positioning time point; otherwise, the playing is started from the earliest playing starting point in all the playing starting points.
8. The graphical music editing method according to claim 1, wherein the step of playing the corresponding musical note at each playing time point according to the data structure when receiving the playing instruction further comprises:
and switching the graphic notes in the playing state into a highlight state.
9. A graphical music editing system, comprising: a processor, a memory, and a communication bus;
the memory has stored thereon a computer readable program executable by the processor;
the communication bus realizes connection communication between the processor and the memory;
the processor, when executing the computer readable program, implements the steps in the graphical music editing method of any of claims 1-8.
10. A computer readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to perform the steps in the graphical music editing method of any one of claims 1-8.
CN202110120091.2A 2021-01-28 2021-01-28 Graphical music editing method, system and storage medium Active CN113035157B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110120091.2A CN113035157B (en) 2021-01-28 2021-01-28 Graphical music editing method, system and storage medium


Publications (2)

Publication Number Publication Date
CN113035157A true CN113035157A (en) 2021-06-25
CN113035157B CN113035157B (en) 2024-04-16

Family

ID=76459388

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110120091.2A Active CN113035157B (en) 2021-01-28 2021-01-28 Graphical music editing method, system and storage medium

Country Status (1)

Country Link
CN (1) CN113035157B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003271164A (en) * 2002-03-19 2003-09-25 Yamaha Music Foundation Musical sound generating method, musical sound generating program, storage medium, and musical sound generating device
JP2010091744A (en) * 2008-10-07 2010-04-22 Kawai Musical Instr Mfg Co Ltd Musical symbol input device and musical symbol input program
JP2012083564A (en) * 2010-10-12 2012-04-26 Yamaha Corp Music editing device and program
US9443501B1 (en) * 2015-05-13 2016-09-13 Apple Inc. Method and system of note selection and manipulation
US20190103082A1 (en) * 2017-09-29 2019-04-04 Yamaha Corporation Singing voice edit assistant method and singing voice edit assistant device


Also Published As

Publication number Publication date
CN113035157B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
US8319086B1 (en) Video editing matched to musical beats
US10198421B2 (en) Method for inserting or deleting cells, rows or columns in spreadsheet and a device therefor
RU2530342C2 (en) Interaction with multimedia timeline
US7714864B2 (en) Visual resource profiler for graphical applications
US9208138B2 (en) Range adjustment for text editing
US20030067497A1 (en) Method and device for modifying a pre-existing graphical user interface
US20130257770A1 (en) Controlling and editing media files with touch gestures over a media viewing area using a touch sensitive device
US20060247925A1 (en) Virtual push-to-talk
JP2008275687A (en) Display control device and method
CN107844331A (en) Generate the method, apparatus and equipment of boot configuration file
JP5149552B2 (en) Display control apparatus and display control method
WO2022001579A1 (en) Audio processing method and apparatus, device, and storage medium
CN113035158A (en) Online MIDI music editing method, system and storage medium
CN115357177A (en) Device control method, device, storage medium and electronic device
US20240098328A1 (en) Video processing method and apparatus, and device and storage medium
CN113035157A (en) Graphical music editing method, system and storage medium
WO2020001178A1 (en) Mode switching method, device and computer-readable storage medium
JP2008165408A (en) Information processor, its control method, and program
US20160300554A1 (en) Musical score displaying and performing program, and musical score displaying and performing device
US7721072B2 (en) Information processing method and apparatus, recording medium, and program
KR20190081911A (en) Method for panning image
KR20190115401A (en) Method, apparatus and program for linked view
JP3953738B2 (en) Graphic editing device and recording medium storing program for functioning as graphic editing device
CN108491139A (en) Object fixing method and device, terminal equipment and storage medium
JP2008071280A (en) Information processor and its control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant