CN111185006A - Information processing method and device in game, storage medium and electronic equipment


Info

Publication number
CN111185006A
Authority
CN
China
Prior art keywords
game
time
interface
audio information
audio
Prior art date
Legal status
Pending
Application number
CN201911342934.2A
Other languages
Chinese (zh)
Inventor
钟川
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201911342934.2A priority Critical patent/CN111185006A/en
Publication of CN111185006A publication Critical patent/CN111185006A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/54 - Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall

Abstract

The disclosure provides an information processing method and device in a game, an electronic device and a computer-readable storage medium, and relates to the technical field of software communication. The in-game information processing method is applied to a terminal whose display screen provides a graphical user interface, and comprises the following steps: acquiring a first interface file corresponding to first target music; extracting audio information from the first interface file and obtaining time sequence data corresponding to the first target music according to the audio information, wherein the audio information comprises the tempo of the first target music and the time number corresponding to a quarter note; and displaying a corresponding operation prompt message on the graphical user interface according to the time sequence data. By parsing a MIDI (Musical Instrument Digital Interface) file, the method can extract the relevant information of the corresponding music, and solves the problems of low efficiency and high error probability that arise when time sequence data are exported separately.

Description

Information processing method and device in game, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of software communication technologies, and in particular, to an in-game information processing method, an in-game information processing apparatus, an electronic device, and a computer-readable storage medium.
Background
Music games are increasingly popular in daily life as a hobby and a form of entertainment, and even some non-music games integrate music-related gameplay to improve playability and give game users a fresher experience.
In the prior art, music games are typically implemented by exporting a time sequence from an audio file and, according to that time sequence, prompting the game user to play the corresponding note at the appropriate time. However, this method is error-prone, inefficient, and makes complex musical effects difficult to realize.
Therefore, it is desirable to provide an in-game information processing method that realizes music gameplay which is efficient, has a low error rate, and can present complex effects.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the embodiments of the present disclosure is to provide an in-game information processing method, an in-game information processing apparatus, an electronic device, and a computer-readable storage medium, which can realize a music game that is efficient, has a low error rate, and can present complex effects.
According to a first aspect of the present disclosure, there is provided an information processing method in a game, applied to a terminal, for providing a graphical user interface on a display screen of the terminal, including:
acquiring a first interface file corresponding to first target music;
extracting audio information in the first interface file, and obtaining time sequence data corresponding to the first target music according to the audio information, wherein the audio information comprises the tempo of the first target music and the time number corresponding to a quarter note;
and displaying a corresponding operation prompt message on the graphical user interface according to the time sequence data.
In an exemplary embodiment of the disclosure, the obtaining time-series data corresponding to the first target music according to the audio information includes:
acquiring an event in the time sequence data and a note corresponding to the event according to the audio information;
calculating the triggering time point of each event on the time sequence by using the tempo of the first target music and the time number corresponding to the quarter note;
the event is an operation performed on each note corresponding to the event.
In an exemplary embodiment of the present disclosure, the first target music takes an n-th note as one beat, where n = 2^m and m is a positive integer greater than or equal to 2;
the calculating the triggering time point of each event on the time sequence by using the tempo of the first target music and the time number corresponding to the quarter note comprises:
calculating the time number corresponding to the n-th note according to the time number corresponding to the quarter note;
calculating the time length corresponding to the n-th note according to the tempo of the first target music;
calculating the time length of each event from the first target music starting point by using the calculated time number corresponding to the n-th note and the calculated time length corresponding to the n-th note;
and obtaining the triggering time point of each event on the time sequence by using the time length.
In an exemplary embodiment of the present disclosure, before displaying a corresponding operation prompt message on a graphical user interface according to the time-series data, the method further includes:
providing at least one selectable game mode on the graphical user interface, wherein the game mode is associated with the audio information parameter;
and responding to the selected game mode, and performing corresponding processing on the first interface file.
In an exemplary embodiment of the present disclosure, the audio information parameters include a tone, a tempo, a key signature, and a musical instrument timbre;
the gaming mode is associated with the audio information parameter, including:
the game mode is associated with one or more of a tone, tempo, key signature, and instrument timbre.
In an exemplary embodiment of the disclosure, said responding to the selected game mode to process the first interface file accordingly includes:
and in response to the selected game mode associated with the beat speed or the key signature, performing speed change or tone change processing on the first interface file.
According to a second aspect of the present disclosure, there is provided an information processing method in a game, applied to a terminal, for providing a graphical user interface on a display screen of the terminal, including:
providing a setting interface of audio information parameters on the graphical user interface;
responding to the first operation of the setting interface, and acquiring the set audio information parameters;
and responding to a second operation on the graphical user interface to record audio according to the set audio information parameter, and generating a second interface file corresponding to the audio.
In an exemplary embodiment of the present disclosure, the recording an audio in response to a second operation on the graphical user interface and generating a second interface file corresponding to the audio includes:
responding to a second operation on the graphical user interface, recording a note corresponding to each playing event and triggering a time point of the note, wherein the second operation comprises at least one playing event;
calculating the time interval between the adjacent events according to the recorded time points of triggering the notes;
and recording each event, the corresponding note of each event and the time interval to generate a second interface file corresponding to the audio.
In an exemplary embodiment of the present disclosure, the method further comprises:
recording the track information of the audio in the corresponding second interface file;
the track information comprises user information, track names, track introduction and score conditions.
According to a third aspect of the present disclosure, there is provided an information processing apparatus in a game, applied to a terminal, for providing a graphical user interface on a display screen of the terminal, including:
the acquisition module is used for acquiring a first interface file corresponding to the first target music;
the analysis module is used for extracting audio information in the first interface file and obtaining time sequence data corresponding to the first target music according to the audio information, wherein the audio information comprises the tempo of the first target music and the time number corresponding to the quarter note;
and the prompting module is used for displaying corresponding operation prompting messages on the graphical user interface according to the time sequence data.
According to a fourth aspect of the present disclosure, there is provided an information processing apparatus in a game, applied to a terminal, for providing a graphical user interface on a display screen of the terminal, including:
the interactive module is used for providing a setting interface of audio information parameters on the graphical user interface;
the setting module is used for responding to the first operation of the setting interface and acquiring the set audio information parameters;
and the recording module is used for responding to a second operation on the graphical user interface to record audio according to the set audio information parameters and generating a second interface file corresponding to the audio.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
According to a sixth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
in the in-game information processing method provided by this example embodiment of the disclosure, a first interface file corresponding to first target music is first acquired and audio information is extracted from it, the audio information including the tempo of the first target music and the time number corresponding to a quarter note; time sequence data corresponding to the first target music are then obtained according to the audio information, and a corresponding operation prompt message is displayed on the graphical user interface according to the obtained time sequence data. On the one hand, in the in-game information processing method of this exemplary embodiment, the data of the first interface file directly include the audio information related to the first target music, and the time sequence data of the first target music can be obtained by parsing the first interface file. Therefore, a time sequence file of the first target music does not need to be exported, which avoids the high error probability and the loss of audio information caused by exporting and improves efficiency. On the other hand, besides the time sequence data, the first interface file also contains information such as the beat, instrument timbre and key signature related to the first target music, so more original data of the first target music can be acquired through the first interface file, giving the scheme strong operational expandability. Meanwhile, in the method provided by this exemplary embodiment of the disclosure, more musical gameplay can be realized by adjusting the audio information, improving the playability and interest of the game.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a schematic diagram illustrating an exemplary system architecture of an in-game information processing method and apparatus to which an embodiment of the present disclosure may be applied;
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use with the electronic device used to implement embodiments of the present disclosure;
FIG. 3 schematically shows a diagram of an information processing method in a game according to one embodiment of the present disclosure;
FIG. 4 is a schematic diagram showing the structure of a header data block of a midi file in the information processing method in a game of one embodiment of the present disclosure;
FIG. 5 schematically shows a flow chart of an in-game information processing method according to one embodiment of the present disclosure;
FIG. 6 schematically shows a flow diagram of an in-game information processing method according to one embodiment of the present disclosure;
FIG. 7 schematically shows a block diagram of an information processing apparatus in a game according to one embodiment of the present disclosure;
fig. 8 schematically shows a block diagram of an information processing apparatus in a game according to one embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an in-game information processing method and apparatus according to an embodiment of the present disclosure can be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The information processing method in the game provided by the embodiment of the present disclosure may be executed by the terminal devices 101, 102, and 103, and accordingly, the information processing apparatus in the game may also be provided in the terminal devices 101, 102, and 103. The in-game information processing method provided by the embodiment of the present disclosure may also be executed by the terminal devices 101, 102, and 103 and the server 105, and accordingly, the in-game information processing apparatus may be provided in the terminal devices 101, 102, and 103 and the server 105. In addition, the information processing method in the game provided by the embodiment of the present disclosure may also be executed by the server 105, and accordingly, the information processing apparatus in the game may be disposed in the server 105, which is not particularly limited in the exemplary embodiment.
For example, in this exemplary embodiment, a game client may be provided in the terminal device 101, 102, or 103, and a graphical user interface is provided on the display screen of the terminal device 101, 102, or 103. In response to an operation of a game user, the game client acquires a first interface file corresponding to the first target music from the server 105 or locally, extracts the relevant audio information from it, obtains time sequence data corresponding to the first target music according to the extracted audio information, and displays an operation prompt message on the graphical user interface according to the time sequence data to prompt the game user to perform a corresponding operation, thereby implementing a music game.
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 200 of the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 2, the computer system 200 includes a Central Processing Unit (CPU) 201 that can perform various appropriate actions and processes in accordance with a program stored in a Read-Only Memory (ROM) 202 or a program loaded from a storage section 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for system operation are also stored. The CPU 201, ROM 202, and RAM 203 are connected to each other via a bus 204. An input/output (I/O) interface 205 is also connected to the bus 204.
The following components are connected to the I/O interface 205: an input portion 206 including a keyboard, a mouse, and the like; an output section 207 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 208 including a hard disk and the like; and a communication section 209 including a network interface card such as a LAN card, a modem, or the like. The communication section 209 performs communication processing via a network such as the internet. A drive 210 is also connected to the I/O interface 205 as needed. A removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 210 as necessary, so that a computer program read out therefrom is mounted into the storage section 208 as necessary.
As noted above, music games are increasingly popular in daily life as a hobby and a form of entertainment, and even some non-music games integrate music-related gameplay to improve playability and give game users a fresher experience.
In order to realize musical gameplay in a game, the inventor initially devised a scheme that mainly comprises the following steps: first, a music score file corresponding to the target music is made, and a time sequence corresponding to the music score file is exported; then, the time sequence is read in the game, and the game user is prompted to play the corresponding notes at the proper times according to the read time sequence, so that the game user can play a complete song.
Although the above method can realize musical gameplay in a game, it has the following problems. The music score file corresponding to the target music is usually made by a sound designer, while the process of sending operation prompt messages according to the time sequence in the game is usually implemented by a program developer, so the sound designer has to export time sequence data in a specified format and hand it to the program developer, which is inefficient. Moreover, once an error occurs during export, debugging and repair require the cooperation of both roles. If the program developer is left to complete the data export, important information is easily lost because the developer is not familiar with the score file. In addition, exporting a time sequence makes complex musical effects difficult to realize and causes loss of the original sound data, and the sound effects become inconsistent when a score is varied in the game.
In addition, in order to realize the function of recording the music played by game users, the inventor also initially considered the following two recording methods:
the first is to record an audio file directly in a game and compress the recorded audio file. The problem with this method is that the audio file directly recorded is large in size, slow in transmission speed and consumes a lot of traffic, and the compression process for the audio file also consumes performance. In addition, the method also has difficulty in implementing a function of adding additional information to the audio file.
The second is a time-sequence-based recording method: the notes played by the game user are recorded at the corresponding time points and saved as a text file or a binary file. Other users in the game can download the file, read the corresponding time sequence data, and replay the recorded notes according to the time sequence. Although this method alleviates the first method's problem of the audio file being too large to transmit, it has the following disadvantages: it is difficult to store other related data such as note durations, instrument type and author information, and extra mechanisms have to be introduced to record them. Meanwhile, this method loses related information such as instruments, timbres, beats and speeds, and the text file can usually be parsed only by a specific player implemented in the game, so the method cannot be widely adopted.
In order to solve the problems in the above method, in the present exemplary embodiment, the inventor further proposes a new technical solution to implement music in a game, and the following describes the technical solution of the embodiment of the present disclosure in detail:
the present exemplary embodiment first provides an information processing method in a game. The information processing method in the game may be applied to one or more of the terminal apparatuses 101, 102, and 103, the server 105, and the terminal apparatuses 101, 102, and 103 and the server 105. Referring to fig. 3, the information processing method in the game specifically includes the following steps:
step S310: acquiring a first interface file corresponding to first target music;
step S320: extracting audio information in the first interface file, and obtaining time sequence data corresponding to the first target music according to the audio information, wherein the audio information comprises the tempo of the first target music and the time number corresponding to a quarter note;
step S330: and displaying a corresponding operation prompt message on the graphical user interface according to the time sequence data.
In the in-game information processing method provided in this exemplary embodiment, on the one hand, the data of the first interface file directly include the audio information related to the first target music, and the time sequence data of the first target music can be obtained by parsing the first interface file. Therefore, a time sequence file of the first target music does not need to be exported, which avoids the high error probability and the loss of audio information caused by exporting and improves efficiency. On the other hand, besides the time sequence data, the first interface file also contains information such as the beat, instrument timbre and key signature related to the first target music, so more original data of the first target music can be acquired through the first interface file, giving the scheme strong operational expandability. Meanwhile, in the method provided by this exemplary embodiment of the disclosure, more musical gameplay can be realized by adjusting the audio information, improving the playability and interest of the game.
The above steps are described in more detail below. The in-game information processing method provided by this exemplary embodiment is applied to a terminal, and a graphical user interface is provided on the display screen of the terminal.
In step S310, a first interface file corresponding to the first target music is acquired.
The in-game information processing method provided by this exemplary embodiment can realize musical gameplay in a game, that is, the game user plays the first target music within the game. In response to a user's request to play the first target music, the game client in the terminal acquires a first interface file corresponding to the first target music.
The first target music may be music stored locally at the game client, may also be music imported by the user, and may also be acquired from a network, which is not particularly limited in this exemplary embodiment.
The first interface file may be stored locally at the game client, may be obtained from a server, or may be obtained by other technical means. The first interface file is a MIDI (Musical Instrument Digital Interface) file corresponding to the first target music. A midi file records music as digital control signals for notes; MIDI is a communication standard that enables information and control signals to be exchanged between electronic musical devices. The structure of a midi file comprises two parts: a file header block and track block data. For example, the hexadecimal representation of the header block can be as shown in fig. 4, where the first four bytes "4d 54 68 64" are the identifier string, denoted as the characters "MThd", which identify the header block type; the next four bytes "00 00 00 06" indicate that the following file header data area is 6 bytes long; the last six bytes "ff ff nn nn dd dd" are the file header data area which, as shown in table 1, contains information such as the format, number of tracks, and basic time of the midi file.
TABLE 1 (reproduced as an image in the original)
ff ff: format of the midi file
nn nn: number of track blocks in the file
dd dd: basic time, i.e. the number of ticks corresponding to a quarter note
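As an illustration only, the header block described above could be parsed as in the following minimal Python sketch; the function and field names are illustrative and do not come from the disclosed embodiments:

```python
import struct

def parse_midi_header(data: bytes):
    """Parse the 14-byte MThd chunk described above.

    Layout: 4-byte identifier "MThd", a 4-byte length (expected to be 6),
    then three big-endian 16-bit fields: format, number of tracks,
    and the basic time (ticks per quarter note).
    """
    if data[0:4] != b"MThd":
        raise ValueError("not a midi file: missing MThd identifier")
    header_len = struct.unpack(">I", data[4:8])[0]          # expected to be 6
    fmt, ntrks, division = struct.unpack(">HHH", data[8:8 + header_len])
    return {"format": fmt, "tracks": ntrks, "division": division}

# Example header: format 1, 2 tracks, 480 ticks per quarter note
sample = bytes.fromhex("4d546864000000060001000201e0")
print(parse_midi_header(sample))
```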
The hexadecimal representation of track block data includes three parts: an identifier string, a track block data length, and the track block data itself. The hexadecimal representation of the track block identifier string is "4d 54 72 6b", corresponding to the characters "MTrk"; the track block length field is a fixed 4 bytes and specifies the length of the following track data; and the track data section is made up of a number of midi events, each consisting of a delta time (time difference) and a midi message. The time difference is the time from the previous event to the current event, measured in ticks. A midi message is composed of a status byte and several data bytes, as shown in table 2; the status byte can be understood as a method and the data bytes as the parameters of that method.
TABLE 2 (reproduced as an image in the original): the correspondence between the status byte of a midi message, its function description (e.g. note on, note off), and the associated data bytes (e.g. note number and velocity).
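Similarly, the delta times and note events in the track block could be walked as in the sketch below. This is a simplified illustration that assumes every event carries an explicit status byte (no running status) and that no system-exclusive messages are present; the function names are illustrative:

```python
def read_varlen(data: bytes, pos: int):
    """Read a MIDI variable-length quantity starting at pos; return (value, new_pos)."""
    value = 0
    while True:
        byte = data[pos]
        pos += 1
        value = (value << 7) | (byte & 0x7F)
        if not byte & 0x80:                      # high bit clear marks the last byte
            return value, pos

def iter_note_events(track_data: bytes):
    """Yield (delta_ticks, is_note_on, note, velocity) for note events in a track block."""
    pos = 0
    while pos < len(track_data):
        delta, pos = read_varlen(track_data, pos)
        status = track_data[pos]
        pos += 1
        kind = status & 0xF0
        if kind in (0x80, 0x90):                 # note off / note on
            note, velocity = track_data[pos], track_data[pos + 1]
            pos += 2
            yield delta, kind == 0x90 and velocity > 0, note, velocity
        elif status == 0xFF:                     # meta event: FF <type> <length> <data>
            pos += 1                             # skip the type byte
            length, pos = read_varlen(track_data, pos)
            pos += length
        else:                                    # other channel messages
            pos += 1 if kind in (0xC0, 0xD0) else 2
```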
There is a special status byte, FF, in a MIDI message, indicating a non-MIDI event, also called a meta-event, which can be used to record additional information about the first target music. The syntax of this meta-event is shown in table 3:
TABLE 3 (reproduced as an image in the original): syntax of a meta-event, i.e. the status byte FF followed by an event type byte, a variable-length data length, and the data bytes themselves.
In the byte positions corresponding to table 3, additional information about the first target music, such as the song name, authorship information and the copyright of the song, can be added, which improves the flexibility and interest of the music game.
In step S320, audio information in the first interface file is extracted, and time sequence data corresponding to the first target music is obtained according to the audio information, where the audio information includes a tempo of the first target music and a time number corresponding to a quarter note.
In this exemplary embodiment, after acquiring the first interface file, the game client extracts the audio information from the first interface file and obtains the time sequence data of the first target music according to the acquired audio information. The audio information is the information about the first target music contained in the first interface file, as introduced above and listed in tables 1 to 3. The time sequence data may be represented as t_data = [(t_1, m_1, e_1), (t_2, m_2, e_2), ..., (t_n, m_n, e_n)], where t_i (i = 1 to n) represents the time at which midi event i occurs, m_i (i = 1 to n) represents the note corresponding to the i-th event, and e_i (i = 1 to n) represents the operation performed on note m_i. The time sequence data may be acquired as follows: the events in the time sequence data and the notes corresponding to the events are obtained according to the audio information, and the trigger time point of each midi event on the time sequence corresponding to the time sequence data is obtained by calculation, where an event is an operation performed on the note corresponding to that event.
In a specific implementation, the events may be obtained from the midi messages; as shown in table 2, the operation performed on each corresponding note can be obtained from the correspondence between the status byte and its function description, that is, e_i in the time sequence data can be obtained. For first target music that takes an n-th note as one beat, where n = 2^m and m is a positive integer greater than or equal to 2, the calculation of the trigger time points may be realized as follows: the time number corresponding to the n-th note is calculated from the time number corresponding to the quarter note; the time length corresponding to the n-th note is calculated from the tempo of the first target music; the time length of each event from the starting point of the first target music is calculated using the time number and time length obtained for the n-th note; and the trigger time point of each event on the time sequence is obtained from that time length.
Taking the quarter note as an example (i.e. the first target music takes a quarter note as one beat), the trigger times t_i may be calculated as follows. First, the tick number t_res specified as the basic time is read from the "dd dd" field of the midi file shown in table 1, i.e. the tick number corresponding to one quarter note, a tick being the minimum time unit of the midi file. Then, the tempo is read from the corresponding field of the midi file as shown in table 3, and the time length corresponding to one quarter note is calculated as 60.0/bpm, where bpm is the number of notes played per minute, here the number of quarter notes per minute. After the tick number t_res and the time length corresponding to one quarter note have been obtained, the delta time (time difference) data are read from the corresponding positions of the track block data area of the midi file, the time difference being the tick number from the previous event to the current event. The number of t_res units (quarter notes) elapsed from the initial note up to each event is then calculated as
Σ_{i=1..k} (tick_i / t_res),
and finally, using the time length corresponding to one quarter note, the trigger time point of each event on the time sequence is obtained as
t_k = (60.0 / bpm) × Σ_{i=1..k} (tick_i / t_res),
where t_k represents the time at which the k-th event occurs, i indexes the events, and tick_i is the tick number between the i-th event and the (i-1)-th event. Through this calculation, the time sequence data t_data = [(t_1, m_1, e_1), (t_2, m_2, e_2), ..., (t_n, m_n, e_n)] are finally obtained.
It should be noted that the above scenario is only an exemplary illustration and is not limiting. For example, when the first target music takes an eighth note as one beat, the tick number corresponding to an eighth note may be calculated from the tick number of the quarter note obtained from the "dd dd" field of the midi file, the time length corresponding to one eighth note may be calculated from the tempo obtained from the corresponding field of the midi file as shown in table 3, and finally the trigger time point corresponding to each event on the time sequence may be calculated from the tick number and time length obtained for one eighth note. In addition, the first target music may take other notes, such as sixteenth notes, as one beat, which also falls within the scope of the present exemplary embodiment.
Compared with exporting the time sequence from the first target music, obtaining the time sequence data by parsing the midi file is more efficient, has a lower error rate, is less likely to lose audio information, can acquire more original data, and has strong operational expandability.
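The trigger-time calculation above, t_k = (60.0 / bpm) × Σ_{i=1..k} (tick_i / t_res), can be illustrated with a short Python sketch; the function name and argument names are illustrative and not part of the disclosure:

```python
def trigger_times(delta_ticks, bpm, ticks_per_quarter):
    """Convert per-event delta times (in ticks) into absolute trigger times in seconds.

    Implements t_k = (60.0 / bpm) * sum_{i<=k} (tick_i / ticks_per_quarter):
    the accumulated quarter-note count scaled by the quarter-note duration.
    """
    seconds_per_quarter = 60.0 / bpm
    times, elapsed_quarters = [], 0.0
    for tick in delta_ticks:
        elapsed_quarters += tick / ticks_per_quarter
        times.append(elapsed_quarters * seconds_per_quarter)
    return times

# 120 bpm, 480 ticks per quarter note: events one and one quarter notes apart
print(trigger_times([480, 480, 240], bpm=120, ticks_per_quarter=480))
# -> [0.5, 1.0, 1.25]
```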
In step S330, a corresponding operation prompt message is displayed on the graphical user interface according to the time-series data.
In the present exemplary embodiment, after the time sequence data are calculated according to steps S310 to S320, a corresponding prompt message is sent to the game user on the graphical user interface of the terminal according to the time sequence data t_data = [(t_1, m_1, e_1), (t_2, m_2, e_2), ..., (t_n, m_n, e_n)].
For example, the above process may be as follows: note m_i corresponds to event e_i, which occurs at time point t_i. If e_i indicates an operation of pressing a note and t_i corresponds to the time point of 6 minutes 6 seconds, then when the time sequence of the first target music reaches 6 minutes 6 seconds, note m_i is displayed in a highlighted, distinctive manner to prompt the user to perform the operation of pressing that note. By performing this process for each note on the time sequence, the game user can play the complete first target music, realizing musical gameplay in the game. It should be noted that the above scenario is only an exemplary illustration and does not limit the protection scope of this exemplary embodiment.
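As a purely illustrative sketch of how such prompts could be scheduled from the time sequence data; the function highlight_note stands in for whatever the game UI actually does, and all names here are assumptions:

```python
import time

def run_prompts(timeline, highlight_note):
    """Walk the time sequence data [(t_i, m_i, e_i), ...] and fire each prompt at t_i."""
    start = time.monotonic()
    for t_i, note, event in sorted(timeline):
        delay = t_i - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)          # wait until the trigger time point
        highlight_note(note, event)    # e.g. highlight the key the player should press

timeline = [(0.5, 60, "press"), (1.0, 62, "press"), (1.25, 64, "release")]
run_prompts(timeline, lambda note, event: print(f"{event} note {note}"))
```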
Preferably, in the information processing method in the game provided in the present exemplary embodiment, a plurality of game modes may be provided for the game user to select, and corresponding processing is performed according to the selection of the user, and a specific flow is shown in fig. 5, and includes the following steps:
in step S510, a first interface file corresponding to the first target music is acquired.
In the present exemplary embodiment, the details of this step have already been introduced in step S310, and therefore are not described herein again.
In step S520, the first interface file is parsed.
In this exemplary embodiment, after the game client acquires the first interface file, it extracts the audio information from the first interface file and obtains the time sequence data of the first target music according to the acquired audio information. The audio information is the information about the first target music contained in the first interface file, as introduced above and listed in tables 1 to 3. The time sequence data may be represented as t_data = [(t_1, m_1, e_1), (t_2, m_2, e_2), ..., (t_n, m_n, e_n)], where t_i (i = 1 to n) represents the time at which midi event i occurs, m_i (i = 1 to n) represents the note corresponding to the i-th event, and e_i (i = 1 to n) represents the operation performed on note m_i. The detailed process of calculating the time sequence data has already been described in step S320 and is therefore not repeated here.
In step S530, at least one selectable game mode is provided on the graphical user interface, and whether the audio information of the first interface file needs to change is determined according to the selected game mode.
In this example embodiment, the terminal may provide the game user with at least one selectable game mode in the graphical user interface. The game mode is associated with an audio information parameter in the first interface file, and the difficulty of the game can be adjusted by changing the audio information parameter. For example, game modes may be provided according to the tempo or the pitch; in addition, a game mode may be associated with audio information parameters such as the key signature and the instrument timbre, which is not limited in this exemplary embodiment.
In response to the game user's operation of selecting a game mode, whether the audio information parameter of the first interface file needs to change is judged according to the selected game mode; if so, step S540 is executed, otherwise the flow skips to step S550.
In step S540, if the determination result is yes, corresponding processing is performed on the first interface file.
In this example embodiment, when the audio information parameter corresponding to the game mode selected by the user differs from that in the first interface file, the first interface file may be processed by a post-processing module to obtain time sequence data matching the selected game mode. For example, when the selected game mode changes the speed, speed-change processing may be performed; when the selected game mode changes the key, key-change (transposition) processing may be performed. In addition, other processing may be performed according to different audio information parameters, which is not particularly limited in this exemplary embodiment.
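One possible way to realize such speed-change or key-change processing is sketched below. This is an illustration only: operating on the already-parsed time sequence data rather than rewriting the midi file is an assumption, not something required by the disclosure, and the function name is invented for the example:

```python
def apply_game_mode(timeline, speed_factor=1.0, transpose_semitones=0):
    """Adjust parsed time sequence data for a selected game mode.

    A speed change scales every trigger time; a key change shifts every
    midi note number by a number of semitones.
    """
    return [
        (t / speed_factor, note + transpose_semitones, event)
        for t, note, event in timeline
    ]

timeline = [(0.5, 60, "press"), (1.0, 62, "press"), (1.25, 64, "release")]
# Double-speed mode, transposed up a whole tone (two semitones)
print(apply_game_mode(timeline, speed_factor=2.0, transpose_semitones=2))
```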
Step S550: and displaying a corresponding operation prompt message on a graphical user interface according to the obtained time sequence data.
In the present exemplary embodiment, after the time-series data corresponding to the first target music is obtained in step S520 or S540, a corresponding prompt message is sent to the game user on the graphical user interface of the terminal according to the time-series data. The specific implementation of this process is already described in step S330, and therefore will not be described herein.
When realizing music in a game, the music played by game users can also be recorded, enabling functions such as sharing among multiple game users and scoring the played music.
In order to solve the problems in the two methods of directly recording the audio file and recording the audio file based on the time sequence, correspondingly, the present exemplary embodiment further provides an information processing method in a game, the method is applied to a terminal, and a graphical user interface is provided on a display screen of the terminal, and a specific flow is shown in fig. 6, and the method includes the following steps:
step S610: providing a setting interface of audio information parameters on the graphical user interface;
step S620: responding to the first operation of the setting interface, and acquiring the set audio information parameters;
step S630: and responding to a second operation on the graphical user interface to record audio according to the set audio information parameter, and generating a second interface file corresponding to the audio.
In another embodiment, the above steps are described in more detail below:
in step S610, a setting interface of audio information parameters is provided on the graphical user interface.
In the present exemplary embodiment, when recording audio, a game user may be provided with a setting interface of audio information parameters on a graphical user interface, and the game user may set audio information such as speed, tempo, key, and musical instrument timbre in the setting interface.
In step S620, the set audio information parameter is obtained in response to the first operation of the setting interface.
In the present exemplary embodiment, the audio information parameters set by the game user are acquired in response to the setting operation of the audio information parameters by the user in the setting interface. The first operation may be a click operation or a long-press operation on an audio information parameter to be selected, which has been set in the game, or may be defined by the game user, or may be set by other technical means, which is not limited in this exemplary embodiment.
In step S630, according to the set audio information parameter, responding to a second operation on the graphical user interface to record an audio, and generating a second interface file corresponding to the audio.
In this exemplary embodiment, after the game user completes setting the audio information parameter, the game user performs audio recording in response to a second operation of the game user on the graphical user interface based on the audio information parameter set by the game user, and generates a second interface file corresponding to the audio. The game user can complete playing of a complete audio through the second operation. The second interface file is a midi file corresponding to the audio to be recorded, and the related information about the midi file has been introduced in step S310 in detail, so that it is not repeated herein. The second operation may be a click or lift operation of the game user on the graphical user interface, or may be another playing operation, which is not particularly limited in this exemplary embodiment.
The specific process of recording the audio in response to the second operation of the game user on the graphical user interface and generating the second interface file corresponding to the audio may be as follows: responding to a second operation performed by the game user on the graphical user interface, recording notes corresponding to each playing event and triggering time points of the notes; calculating the time interval between adjacent events according to the recorded time point for triggering each note; and recording each event, the corresponding note of each event and the time interval to generate a second interface file corresponding to the recorded audio.
In a specific implementation, after the time point corresponding to each performance event is obtained by recording, the time interval between the current event and the previous event can be calculated from those time points, and the tick number corresponding to each time interval is obtained according to the formula tick_i = (t_i - t_(i-1)) / (60.0 / bpm), where t_i is the time point at which the i-th event is triggered and bpm is the number of notes played per minute. The tick number corresponding to each time interval, each performance event and the note corresponding to that performance event are then recorded to generate the second interface file.
It should be noted that the above scenario is only an exemplary illustration, and does not limit the protection scope of the exemplary embodiment.
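The interval-to-tick conversion tick_i = (t_i - t_(i-1)) / (60.0 / bpm) can be illustrated with the following sketch; the optional ticks_per_quarter scaling is an added assumption for finer resolution and is not stated in the text above, and the function name is illustrative:

```python
def intervals_to_delta_ticks(event_times, bpm, ticks_per_quarter=1):
    """Turn recorded absolute event times (seconds) into MIDI delta times.

    Each interval is expressed in quarter notes via (t_i - t_(i-1)) / (60.0 / bpm)
    and then scaled by the file's ticks-per-quarter-note resolution.
    """
    deltas, prev = [], event_times[0]
    for t in event_times:
        beats = (t - prev) / (60.0 / bpm)
        deltas.append(round(beats * ticks_per_quarter))
        prev = t
    return deltas

# Player pressed keys at 0.0 s, 0.5 s and 1.25 s while recording at 120 bpm
print(intervals_to_delta_ticks([0.0, 0.5, 1.25], bpm=120, ticks_per_quarter=480))
# -> [0, 480, 720]
```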
In the present exemplary embodiment, the track information related to the recorded audio may also be recorded in the corresponding fields of the second interface file, as shown in table 3. The track information may be one or more of user information, a track name, a track introduction and a score, and may also be other information related to the recorded audio, which is not limited in this exemplary embodiment. By adopting the midi file format, the name of the game user and the recording time can additionally be stored, and the game user can freely name the recorded audio; these data are stored in the midi file and can be displayed to other game users during social sharing. In addition, the data amount generated by the in-game information processing method of this embodiment is small while more information can be integrated, which improves the richness and interest of the game and reduces the storage and transmission of data.
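As an illustration of how such track information could be embedded using the meta-event syntax of table 3 (FF, type, length, data). The particular meta-event types chosen here (0x03 for the track name, 0x01 for free text) follow the general MIDI file convention and are an assumption rather than something specified by the disclosure; the example strings are invented:

```python
def encode_varlen(value: int) -> bytes:
    """Encode an integer as a MIDI variable-length quantity."""
    out = [value & 0x7F]
    value >>= 7
    while value:
        out.append((value & 0x7F) | 0x80)
        value >>= 7
    return bytes(reversed(out))

def meta_event(meta_type: int, payload: bytes) -> bytes:
    """Build one FF <type> <length> <data> meta-event (delta time not included)."""
    return bytes([0xFF, meta_type]) + encode_varlen(len(payload)) + payload

# Track name and a free-text note carrying the recording user and score
track_name  = meta_event(0x03, "Moonlight cover".encode("utf-8"))
author_note = meta_event(0x01, "recorded by player_42, score 98".encode("utf-8"))
```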
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, the present exemplary embodiment provides an information processing apparatus in a game with respect to the information processing method in a game provided in the above-described steps S310 to S330. The information processing apparatus in the game may be applied to a terminal device, a server, or both. Referring to fig. 7, the in-game information processing apparatus 700 may include an obtaining module 710, a parsing module 720, and a prompting module 730. Wherein:
the obtaining module 710 may be configured to obtain a first interface file corresponding to a first target music;
the parsing module 720 may be configured to extract audio information in the first interface file, and obtain time sequence data corresponding to the first target music according to the audio information, where the audio information includes a tempo of the first target music and a time number corresponding to a quarter note;
the prompt module 730 may be configured to display a corresponding operation prompt message on the graphical user interface according to the time-series data.
The specific details of each module or unit in the information processing apparatus in the game have been described in detail in the corresponding information processing method in the game, and therefore are not described herein again.
Correspondingly, for the in-game information processing method provided in the above steps S610 to S630, the present exemplary embodiment also provides an in-game information processing apparatus. The information processing apparatus in the game may be applied to a terminal device, a server, or both. Referring to fig. 8, the in-game information processing apparatus 800 may include an interaction module 810, a setting module 820, and a recording module 830. Wherein:
the interaction module 810 may be configured to provide a setting interface of audio information parameters on the graphical user interface;
the setting module 820 may be configured to respond to a first operation of the setting interface to obtain the set audio information parameter;
the recording module 830 may be configured to record an audio in response to a second operation on the graphical user interface according to the set audio information parameter, and generate a second interface file corresponding to the audio.
The specific details of each module or unit in the information processing apparatus in the game have been described in detail in the corresponding information processing method in the game, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the methods described in the above embodiments. For example, the electronic device may implement the steps shown in fig. 3 to 6.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. An information processing method in a game, which is applied to a terminal and provides a graphical user interface on a display screen of the terminal, comprises the following steps:
acquiring a first interface file corresponding to first target music;
extracting audio information in the first interface file, and obtaining time sequence data corresponding to the first target music according to the audio information, wherein the audio information comprises the tempo of the first target music and the time number corresponding to a quarter note;
and displaying a corresponding operation prompt message on the graphical user interface according to the time sequence data.
2. The method of claim 1, wherein the obtaining time-series data corresponding to the first target music according to the audio information comprises:
acquiring an event in the time sequence data and a note corresponding to the event according to the audio information;
calculating the triggering time point of each event on the time sequence by using the tempo of the first target music and the time number corresponding to the quarter note;
the event is an operation performed on each note corresponding to the event.
3. The in-game information processing method according to claim 2, wherein the first target music takes an n-th note as one beat, where n = 2^m and m is a positive integer greater than or equal to 2;
the calculating the triggering time point of each event on the time sequence by using the tempo of the first target music and the time number corresponding to the quarter note comprises:
calculating the time number corresponding to the n-th note according to the time number corresponding to the quarter note;
calculating the time length corresponding to the n-th note according to the tempo of the first target music;
calculating the time length of each event from the first target music starting point by using the calculated time number corresponding to the n-th note and the calculated time length corresponding to the n-th note;
and obtaining the triggering time point of each event on the time sequence by using the time length.
4. The in-game information processing method according to claim 1, wherein before said displaying, on a graphical user interface, a corresponding operation prompt message according to the time-series data, the method further comprises:
providing at least one selectable game mode on the graphical user interface, wherein the game mode is associated with the audio information parameter;
and responding to the selected game mode, and performing corresponding processing on the first interface file.
5. The in-game information processing method according to claim 4, wherein the audio information parameters include a tone, a tempo, a key signature, and a musical instrument timbre;
the gaming mode is associated with the audio information parameter, including:
the game mode is associated with one or more of a tone, tempo, key signature, and instrument timbre.
6. The method of claim 4, wherein the performing corresponding processing on the first interface file in response to the selected game mode comprises:
in response to the selected game mode being associated with the tempo or the key signature, performing speed change or tone change processing on the first interface file.
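The speed change and tone change processing of claim 6 could, under the same MIDI-file assumption as above, be sketched as follows; the mido library, the function name, and the parameters are assumptions of this example.

    # Possible sketch: faster/slower playback by rescaling tempo meta events,
    # and key change by shifting every note number; illustrative only.
    import mido

    def adjust_interface_file(path, out_path, speed_factor=1.0, semitones=0):
        midi = mido.MidiFile(path)
        for track in midi.tracks:
            for msg in track:
                if msg.type == 'set_tempo' and speed_factor != 1.0:
                    # a smaller tempo value (microseconds per quarter note) means faster playback
                    msg.tempo = int(msg.tempo / speed_factor)
                elif msg.type in ('note_on', 'note_off') and semitones != 0:
                    # shifting every note number transposes the piece to a different key
                    msg.note = max(0, min(127, msg.note + semitones))
        midi.save(out_path)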
7. An information processing method in a game, which is applied to a terminal and provides a graphical user interface on a display screen of the terminal, comprises the following steps:
providing a setting interface of audio information parameters on the graphical user interface;
in response to a first operation on the setting interface, acquiring the set audio information parameters;
and in response to a second operation on the graphical user interface, recording audio according to the set audio information parameters and generating a second interface file corresponding to the audio.
8. The in-game information processing method according to claim 7, wherein the recording audio in response to a second operation on the graphical user interface and generating a second interface file corresponding to the audio comprises:
in response to the second operation on the graphical user interface, recording the note corresponding to each playing event and the time point at which the note is triggered, wherein the second operation comprises at least one playing event;
calculating the time interval between adjacent events according to the recorded time points at which the notes are triggered;
and recording each event, the note corresponding to each event, and the time interval to generate a second interface file corresponding to the audio.
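By way of illustration only, the recording flow of claim 8 could be sketched as below, assuming the second interface file is written as a standard MIDI file via the mido library; the list of play events, the fixed note length, and the function name are assumptions of this sketch.

    # Minimal sketch: each play event is captured with its note and trigger time,
    # the interval to the previous event becomes a MIDI delta time, and the result
    # is saved as a new MIDI file; illustrative only.
    import mido

    def generate_second_interface_file(play_events, out_path, ticks_per_quarter=480, tempo=500000):
        # play_events: list of (note, trigger_time_in_seconds), ordered by trigger time
        midi = mido.MidiFile(ticks_per_beat=ticks_per_quarter)
        track = mido.MidiTrack()
        midi.tracks.append(track)
        note_length = ticks_per_quarter // 2    # fixed note length, chosen for illustration
        cursor = 0                              # tick position of the last written message
        for note, trigger_time in play_events:
            start = int(mido.second2tick(trigger_time, ticks_per_quarter, tempo))
            delta = max(0, start - cursor)      # interval to the previous event, clamped for simplicity
            track.append(mido.Message('note_on', note=note, velocity=64, time=delta))
            track.append(mido.Message('note_off', note=note, velocity=64, time=note_length))
            cursor = start + note_length
        midi.save(out_path)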
9. The in-game information processing method according to claim 7, characterized by further comprising:
recording the track information of the audio in the corresponding second interface file;
wherein the track information comprises user information, a track name, a track introduction, and score conditions.
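As a purely illustrative note on claim 9, such track information could be embedded in a MIDI interface file as meta events; the track-name and text meta types below are standard MIDI meta messages, while mapping user information, introduction, and score conditions onto free-form text events is an assumption of this sketch.

    # Illustrative only: writing track information into the second interface file as meta events.
    import mido

    def add_track_information(track, user, name, introduction, score):
        track.insert(0, mido.MetaMessage('track_name', name=name, time=0))
        # There is no dedicated MIDI meta type for user or score data, so generic
        # text events are used here as an assumption.
        track.insert(1, mido.MetaMessage('text', text='user: ' + user, time=0))
        track.insert(2, mido.MetaMessage('text', text='introduction: ' + introduction, time=0))
        track.insert(3, mido.MetaMessage('text', text='score: ' + str(score), time=0))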
10. An information processing apparatus in a game, applied to a terminal, for providing a graphical user interface on a display screen of the terminal, comprising:
the acquisition module is used for acquiring a first interface file corresponding to the first target music;
the analysis module is used for extracting audio information in the first interface file and obtaining time sequence data corresponding to the first target music according to the audio information, wherein the audio information comprises the tempo of the first target music and the time number corresponding to the quarter note;
and the prompting module is used for displaying corresponding operation prompting messages on the graphical user interface according to the time sequence data.
11. An information processing apparatus in a game, applied to a terminal, for providing a graphical user interface on a display screen of the terminal, comprising:
the interactive module is used for providing a setting interface of audio information parameters on the graphical user interface;
the setting module is used for responding to the first operation of the setting interface and acquiring the set audio information parameters;
and the recording module is used for responding to a second operation on the graphical user interface to record audio according to the set audio information parameters and generating a second interface file corresponding to the audio.
12. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method of any one of claims 1-9.
13. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1-9 via execution of the executable instructions.
CN201911342934.2A 2019-12-23 2019-12-23 Information processing method and device in game, storage medium and electronic equipment Pending CN111185006A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911342934.2A CN111185006A (en) 2019-12-23 2019-12-23 Information processing method and device in game, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911342934.2A CN111185006A (en) 2019-12-23 2019-12-23 Information processing method and device in game, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN111185006A true CN111185006A (en) 2020-05-22

Family

ID=70684716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911342934.2A Pending CN111185006A (en) 2019-12-23 2019-12-23 Information processing method and device in game, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111185006A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1571985A (en) * 2001-10-20 2005-01-26 哈尔·C·索尔特 Interactive game providing instruction in musical notation and in learning an instrument
CN109994017A (en) * 2018-12-01 2019-07-09 李昱婷 A kind of bonusing method for supplementary music practice
CN110559657A (en) * 2019-08-22 2019-12-13 腾讯科技(深圳)有限公司 network game control method, device and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113648651A (en) * 2021-07-02 2021-11-16 北京金三惠科技有限公司 Positioning method and system for music teaching foundation improvement game
CN113648651B (en) * 2021-07-02 2023-11-17 北京金三惠科技有限公司 Positioning method and system for music teaching foundation promotion game
CN113797541A (en) * 2021-09-06 2021-12-17 武汉指娱互动信息技术有限公司 Music game level generating method, device, equipment and storage medium
CN113797541B (en) * 2021-09-06 2024-04-09 武汉指娱互动信息技术有限公司 Music game level generation method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109543064B (en) Lyric display processing method and device, electronic equipment and computer storage medium
WO2017054588A1 (en) Music instrument score file playing method, conversion method and apparatus, and electronic device
CN104978973B (en) A kind of audio-frequency processing method and device
CN100585663C (en) Language studying system
CN110025958B (en) Voice sending method, device, medium and electronic equipment
CN110311858A (en) A kind of method and apparatus sending conversation message
CN111185006A (en) Information processing method and device in game, storage medium and electronic equipment
CN114073854A (en) Game method and system based on multimedia file
CN106205571A (en) A kind for the treatment of method and apparatus of singing voice
WO2023051246A1 (en) Video recording method and apparatus, device, and storage medium
CN109410972B (en) Method, device and storage medium for generating sound effect parameters
CN111399745A (en) Music playing method, music playing interface generation method and related products
CN113140230B (en) Method, device, equipment and storage medium for determining note pitch value
US9324377B2 (en) Systems and methods for facilitating rendering visualizations related to audio data
CN111276113B (en) Method and device for generating key time data based on audio
Hajdu et al. On the evolution of music notation in network music environments
CN110910917B (en) Audio clip splicing method and device
CN112632401A (en) Recommendation device, information providing system, recommendation method, and storage medium
CN112071287A (en) Method, apparatus, electronic device and computer readable medium for generating song score
WO2023030536A1 (en) Harmony processing method and apparatus, device, and medium
CN112237740B (en) Beat data extraction method and device, electronic equipment and computer readable medium
JP2014066740A (en) Karaoke device
CN113438547B (en) Music generation method and device, electronic equipment and storage medium
KR20020011570A (en) Teaching system for musical instruments using an internet and operating method thereof
CN116740868A (en) Voice prompt method, device, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200522