CN113082709A - Information prompting method and device in game, storage medium and computer equipment - Google Patents
Information prompting method and device in game, storage medium and computer equipment
- Publication number
- CN113082709A (application CN202110425667.6A)
- Authority
- CN
- China
- Prior art keywords
- virtual character
- sound effect
- game
- enemy
- outputting
- Legal status
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/54—Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8076—Shooting
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiment of the application discloses an in-game information prompting method and device, a storage medium and computer equipment. The method comprises the following steps: when a first virtual character moves to a second virtual character controlled by a player, outputting, according to the identity and the state of the first virtual character, a first sound effect corresponding to that identity and state to a terminal operated by the player; and if the identity of the first virtual character is an enemy virtual character, after the second virtual character controlled by the player attacks the enemy virtual character, outputting a second sound effect corresponding to the attack result to the terminal operated by the player according to the attack result. Compared with the prior art, when the player is visually impaired (for example, blind) and cannot easily observe information in the game scene, the technical solution provided by the embodiment of the application can greatly improve the accuracy with which the player controls his or her virtual character, thereby improving the player's game experience.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for prompting information in a game, a storage medium, and a computer device.
Background
Like most shooting games, an FPS game includes a game hero, a plurality of selectable firearm weapons, and an indefinite number of enemies.
Because an FPS game involves a large number of enemies and friends, prompting of various kinds of information in the FPS game is necessary; for example, footstep sounds are played when an enemy approaches the player's character, and corresponding sound effect information is given when an enemy is hit, and the like.
However, although existing FPS games provide such prompt information, all footstep sounds are set to be the same; moreover, after an enemy is hit, no more detailed result is fed back, that is, the same sound effect is played as long as the enemy is hit. This is inconvenient for players with visual impairment.
Disclosure of Invention
The embodiment of the application provides an in-game information prompting method, an in-game information prompting device, a storage medium and computer equipment, which can provide different prompting information for a player according to different game states.
The embodiment of the application provides an in-game information prompting method, which comprises the following steps:
when a first virtual character moves to a second virtual character controlled by a player, outputting a first sound effect corresponding to the identity and the state of the first virtual character to a terminal controlled by the player according to the identity and the state of the first virtual character;
and if the identity of the first virtual character is an enemy virtual character, outputting a second sound effect corresponding to the attack result to the terminal controlled by the player according to the attack result after the second virtual character attacks the enemy virtual character.
Optionally, the outputting, to the terminal operated by the player, a first sound effect corresponding to the identity and the state of the first virtual character according to the identity and the state of the first virtual character includes: if the identity of the first virtual character is an enemy virtual character, outputting a third sound effect to a terminal controlled by the player; and outputting a fourth sound effect to the terminal operated by the player according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character.
Optionally, the outputting, to the terminal operated by the player, a fourth sound effect according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character includes: determining the current position of the enemy virtual character in a game scene according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character; acquiring a sound effect type and a sound source distance corresponding to the current position of the enemy virtual character in the game scene; acquiring the sound source azimuth at the current position according to the sound source distance; and outputting a target sound effect corresponding to the current position according to the sound effect type, the sound source distance and the sound source azimuth.
Optionally, the outputting, to the terminal operated by the player, a first sound effect corresponding to the identity and the state of the first virtual character according to the identity and the state of the first virtual character includes: if the identity of the first virtual character is an enemy virtual character, outputting a third sound effect to a terminal controlled by the player; and outputting a fifth sound effect to the terminal operated by the player according to the game scene of the enemy virtual character when moving to the second virtual character.
Optionally, the outputting a fifth sound effect to the terminal operated by the player according to the game scene in which the enemy virtual character moves to the second virtual character includes: obtaining the type of a first game scene, the type of a second game scene and a scene experience duration when the enemy virtual character moves, wherein the scene experience duration is the time consumed when the enemy virtual character moves from the first game scene to the second game scene; determining a first sound effect parameter corresponding to the first game scene type and a second sound effect parameter corresponding to the second game scene type; determining an intermediate sound effect parameter according to the first sound effect parameter and the second sound effect parameter; and outputting the fifth sound effect to the terminal operated by the player according to the intermediate sound effect parameter and the second sound effect parameter.
Optionally, the outputting, after the second virtual character attacks the enemy virtual character, a second sound effect corresponding to the attack result to the terminal operated by the player according to the attack result includes: calculating the damage value of the enemy virtual character after the enemy virtual character is attacked; and outputting a sound effect corresponding to the damage value to the terminal operated by the player according to the damage value.
Optionally, the method further includes: in response to a turning control operation for the second virtual character, controlling the second virtual character to turn, and outputting prompt information of the turning angle through a terminal corresponding to the second virtual character.
Optionally, the outputting, by the terminal corresponding to the second virtual character, the prompt information of the turning angle includes: when the second virtual character turns around, determining, in response to a triggering operation on an angle recording control presented in the current game scene, a viewing area centered on the second virtual character; and when at least one enemy virtual character exists in the viewing area, outputting azimuth prompt information of the at least one enemy virtual character relative to the second virtual character.
Optionally, the determining, in response to a triggering operation on an angle recording control presented in the current game scene, a viewing area centered on the second virtual character includes: in response to the triggering operation on the angle recording control, acquiring the position of the second virtual character in the current game scene and a target viewing distance corresponding to the second virtual character; and determining the viewing area centered on the second virtual character by taking the position of the second virtual character as the center and the target viewing distance as the radius. The outputting of the azimuth prompt information of the at least one enemy virtual character relative to the second virtual character includes: determining the visual angle orientation of the second virtual character in the current game scene; and outputting, with the visual angle orientation of the second virtual character as a reference, azimuth prompt information of the at least one enemy virtual character relative to the second virtual character, wherein the azimuth prompt information is used for indicating the direction of the at least one enemy virtual character relative to the second virtual character.
An embodiment of the present application further provides an in-game information prompting device, including:
the first sound effect output module is used for outputting a first sound effect corresponding to the identity and the state of a first virtual character to a terminal controlled by a player according to the identity and the state of the first virtual character when the first virtual character moves to a second virtual character controlled by the player;
and the second sound effect output module is used for outputting a second sound effect corresponding to the attack result to the terminal controlled by the player according to the attack result after the second virtual character attacks the enemy virtual character if the identity of the first virtual character is the enemy virtual character.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, where the computer program is suitable for being loaded by a processor to perform the steps in the method for prompting information in a game according to any of the above embodiments.
The embodiment of the present application further provides a computer device, where the computer device includes a memory and a processor, where the memory stores a computer program, and the processor executes the steps in the method for prompting information in a game according to any of the above embodiments by calling the computer program stored in the memory.
According to the technical scheme provided by the embodiment of the application, the first sound effect and the second sound effect can be respectively output to the terminal operated by the player according to the identity of the first virtual character, the state of the first virtual character when it moves to the second virtual character controlled by the player, and the attack result of the second virtual character against the enemy virtual character, so that the player can judge, from the sound effects alone, the identity of the virtual character and the feedback produced when the enemy virtual character is attacked. When the player is visually impaired (for example, blind) and cannot easily observe information in the game scene, the accuracy with which the player controls his or her virtual character can therefore be greatly improved, and the player's game experience can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a system schematic diagram of an in-game information prompt apparatus according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a method for prompting information in a game according to an embodiment of the present application.
Fig. 3 is a schematic flow chart illustrating that a fourth sound effect is output to a terminal manipulated by a player according to a moving speed of an enemy virtual character and/or distance information between the enemy virtual character and a second virtual character according to the embodiment of the present application.
Fig. 4 is a schematic flow chart illustrating a fifth sound effect output to a player control terminal according to a game scene in which an enemy virtual character moves to a second virtual character according to the embodiment of the present application.
Fig. 5 is a schematic structural diagram of an in-game information prompt device according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides an in-game information prompting method and device, a storage medium and computer equipment. Specifically, the in-game information prompting method according to the embodiment of the present application may be executed by a computer device, where the computer device may be a terminal or a server. The terminal may be a terminal device such as a smart phone, a tablet Computer, a notebook Computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like, and may further include a client, which may be a game application client, a browser client carrying a game program, or an instant messaging client, and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, and a big data and artificial intelligence platform.
For example, when the information prompting method in the game is operated on the terminal, the terminal device stores a game application program and is used for presenting a virtual scene in a game picture. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the information prompting method in the game is run on a server, the game can be a cloud game. Cloud gaming refers to a gaming regime based on cloud computing. In the running mode of the cloud game, the running main body of the game application program and the game picture presenting main body are separated, and the storage and the running of the information prompting method in the game are finished on the cloud game server. The game screen presentation is performed at a cloud game client, which is mainly used for receiving and sending game data and presenting the game screen, for example, the cloud game client may be a display device with a data transmission function near a user side, such as a mobile terminal, a television, a computer, a palm computer, a personal digital assistant, and the like, but a terminal device for performing game data processing is a cloud game server at the cloud end. When a game is played, a user operates the cloud game client to send an operation instruction to the cloud game server, the cloud game server runs the game according to the operation instruction, data such as game pictures and the like are encoded and compressed, the data are returned to the cloud game client through a network, and finally the data are decoded through the cloud game client and the game pictures are output.
Referring to fig. 1, fig. 1 is a system schematic diagram of an in-game information prompting device according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, and so on. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play the multiplayer game online.
The embodiment of the application provides an information prompting method in a game, which can be executed by a terminal or a server. The embodiment of the present application is described by taking an example in which the in-game information presentation method is executed by a terminal. The terminal comprises a touch display screen and a processor, wherein the touch display screen is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the touch display screen, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operation instruction generated by the user acting on the graphical user interface comprises an instruction for starting a game application, and the processor is configured to start the game application after receiving the instruction provided by the user for starting the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed at a plurality of points on the screen at the same time. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role-playing game, a strategy game, a sports game, a game of chance, and the like. Wherein the game may include a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Additionally, one or more obstacles, such as railings, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual objects, e.g., to limit movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. For example, the virtual objects possess various skills or capabilities that the game player uses to achieve the goal. For example, the virtual object possesses one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. 
Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
Referring to fig. 2, which is a schematic flow chart of a method for prompting information in a game according to an embodiment of the present application, the method mainly includes step S201 and step S202, which are described in detail as follows:
step S201, when the first virtual character moves to the second virtual character controlled by the player, according to the identity and the state of the first virtual character, outputting a first sound effect corresponding to the identity and the state of the first virtual character to the terminal controlled by the player.
In the embodiment of the present application, the first virtual character may be an enemy virtual character, that is, a virtual character in the camp hostile to the player, or may be a friend virtual character, that is, a virtual character in the player's own or allied camp. Because these two kinds of virtual characters affect the second virtual character controlled by the player in very different ways, when the first virtual character moves to the second virtual character controlled by the player, a first sound effect corresponding to the identity and the state of the first virtual character needs to be output to the player according to that identity and state. Specifically, as an embodiment of the present application, outputting the first sound effect corresponding to the identity and the state of the first virtual character to the player according to the identity and the state of the first virtual character may be: if the identity of the first virtual character is an enemy virtual character, outputting a third sound effect to the player, and outputting a fourth sound effect to the terminal operated by the player according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character. Here, the third sound effect output to the terminal operated by the player is different from the sound effect output when the identity of the first virtual character is the friend virtual character; for example, if the identity of the first virtual character is an enemy virtual character, the third sound effect output to the terminal operated by the player is louder and/or sharper in tone so as to alert the player controlling the second virtual character, whereas if the identity of the first virtual character is the friend virtual character, the sound effect output to the player may be quieter and/or relatively flat. The third sound effect, or the sound effect output when the identity of the first virtual character is the friend virtual character, can be prepared in a configuration file in advance, and the corresponding sound effect is then read from the configuration file according to the identity and the state of the first virtual character and output to the terminal operated by the player.
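As an illustrative sketch that is not part of the patent's disclosure, the identity- and state-dependent configuration lookup described above might look as follows; the configuration format, keys and asset names are assumptions.

```python
# Minimal sketch of reading pre-configured sound effects by identity and
# state; the JSON layout and asset names are hypothetical.
import json

def load_sound_config(path):
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

def select_first_sound_effect(config, identity, state):
    # identity: "enemy" or "friend"; state: e.g. "walking", "sprinting".
    # Enemy cues are configured louder/sharper, friend cues quieter/flatter.
    table = config.get(identity, {})
    return table.get(state, table.get("default"))

# Usage with inline data standing in for the configuration file:
config = {
    "enemy":  {"walking": "enemy_footsteps_loud.wav",
               "default": "enemy_alert_sharp.wav"},
    "friend": {"default": "friend_footsteps_soft.wav"},
}
print(select_first_sound_effect(config, "enemy", "walking"))
```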
In an embodiment of the present application, outputting the fourth sound effect to the terminal operated by the player according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character can be implemented by steps S301 to S304 as illustrated in fig. 3, which are described as follows:
step S301: and determining the current position of the enemy virtual character in the game scene according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character.
In the embodiment of the present application, a game scene refers to a scene composed of an environment, a building, a machine, a prop, and the like in a 3D (or 4D, 5D) game. The game scene can be a closed scene or an open scene. The current position of the enemy virtual character in the game scene can be determined by using two reference points, wherein one reference point is the initial position of the enemy virtual character in the game scene, and after the initial position of the enemy virtual character in the game scene is recorded, the current position of the enemy virtual character in the game scene can be determined according to the moving speed of the enemy virtual character; and the other reference point is the position of the second virtual character in the game scene, and after the position of the second virtual character in the game scene is known, the current position of the enemy virtual character in the game scene can be determined according to the moving speed of the enemy virtual character and the distance information of the second virtual character.
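For illustration only, the two reference-point approaches can be sketched as follows; the 2D vector representation, timing input and bearing convention are assumptions rather than the patent's concrete formulas.

```python
# Sketch of locating the enemy virtual character from either reference point.
from dataclasses import dataclass
import math

@dataclass
class Vec2:
    x: float
    y: float

def position_from_start(start: Vec2, direction: Vec2, speed: float, elapsed_s: float) -> Vec2:
    # Reference point 1: recorded initial position plus movement at the given speed.
    norm = math.hypot(direction.x, direction.y) or 1.0
    return Vec2(start.x + direction.x / norm * speed * elapsed_s,
                start.y + direction.y / norm * speed * elapsed_s)

def position_from_player(player_pos: Vec2, bearing_rad: float, distance: float) -> Vec2:
    # Reference point 2: the second virtual character's position plus the known
    # distance along the bearing toward the enemy.
    return Vec2(player_pos.x + distance * math.cos(bearing_rad),
                player_pos.y + distance * math.sin(bearing_rad))

print(position_from_start(Vec2(0, 0), Vec2(1, 0), speed=3.0, elapsed_s=2.0))
```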
Step S302: and acquiring the sound effect type and the sound source distance corresponding to the current position of the enemy virtual character in the game scene.
In the embodiment of the present application, sound effect types include action-type sound effects and environment-type sound effects. Action-type sound effects may include, for example, skill-hit sounds, firearm firing sounds, physical hit sounds, weapon-drop sounds, material fragmentation sounds, explosion sounds, the sound of a character falling, kill-shout sound effects, footstep sound effects, and the like; environment-type sound effects may include wind, water, forest, horror, thunderstorm, wind-and-snow and white-noise ambient sound effects, and the like. These sound effects can be configured in the configuration file in correspondence with positions of virtual characters in the game scene, so that, once the current position of the enemy virtual character in the game scene is obtained, the sound effect type corresponding to that current position can be acquired.
As for the sound source distance corresponding to the current position of the enemy virtual character in the game scene, it can be obtained from sound field data established in advance for the game scene (i.e., information such as sound type, sound source distance, direction and surround condition). Specifically: in the sound field data established in advance for the game scene, the distance from the current position of the enemy virtual character in the game scene to the sound source position is acquired, and this distance is subtracted from the preset maximum propagation distance of the sound source to obtain the sound source distance corresponding to the current position of the enemy virtual character in the game scene.
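A minimal sketch of this distance rule follows, assuming 2D coordinates and an assumed maximum propagation distance.

```python
# The "sound source distance" stored for a position is taken here as the
# preset maximum propagation distance minus the geometric distance from the
# enemy character's current position to the sound source; values are assumed.
import math

def sound_source_distance(current_pos, source_pos, max_propagation=100.0):
    geometric = math.dist(current_pos, source_pos)
    return max(0.0, max_propagation - geometric)  # clamp so it never goes negative

# A source 30 units away with a 100-unit propagation cap yields 70.0.
print(sound_source_distance((0.0, 0.0), (18.0, 24.0)))
```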
Step S303: And acquiring the sound source azimuth of the enemy virtual character at the current position in the game scene according to the sound source distance.
For convenience of description, in this embodiment the current position of the enemy virtual character in the game scene is simply referred to as the current position of the enemy virtual character. The specific implementation process of step S303 may be as follows: a target azimuth point is searched for on the boundary of the viewing range centered on the current position of the enemy virtual character, according to the distance from each point on that boundary to the sound source position, the distance from the current position of the enemy virtual character to the sound source position, and the distance from each point on the boundary to the current position; the sound source azimuth of the sound source relative to the current position of the enemy virtual character is then determined according to the found target azimuth point and the current position of the enemy virtual character. The search may specifically be: obtaining a normalized distance difference value for each point on the boundary of the viewing range, and determining the boundary point corresponding to the maximum normalized distance difference value as the target azimuth point. Determining the sound source azimuth of the sound source relative to the current position of the enemy virtual character according to the found target azimuth point and the current position of the enemy virtual character may be: when the viewing range centered on the current position of the enemy virtual character is a rectangle, obtaining the included angle between a connecting line and the horizontal axis of the rectangle as the sound source azimuth of the sound source relative to the current position of the enemy virtual character, wherein the connecting line is the straight line connecting the target azimuth point and the current position of the enemy virtual character.
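The azimuth search above can be sketched as follows; the rectangular boundary sampling and the exact normalization are illustrative assumptions.

```python
# Sketch: pick the boundary point of the viewing range with the largest
# normalized distance difference, then return the angle between the line
# (current position -> target azimuth point) and the horizontal axis.
import math

def sound_source_azimuth(current_pos, source_pos, boundary_points):
    d_current_to_source = math.dist(current_pos, source_pos)
    best_point, best_score = None, -math.inf
    for p in boundary_points:
        d_point_to_source = math.dist(p, source_pos)
        d_point_to_current = math.dist(p, current_pos) or 1.0
        # Boundary points "facing" the source shorten the point-to-source
        # distance, so they score higher after normalization.
        score = (d_current_to_source - d_point_to_source) / d_point_to_current
        if score > best_score:
            best_point, best_score = p, score
    return math.degrees(math.atan2(best_point[1] - current_pos[1],
                                   best_point[0] - current_pos[0]))

# Four corners of a rectangular viewing range around the origin:
corners = [(-10, -10), (10, -10), (10, 10), (-10, 10)]
print(sound_source_azimuth((0, 0), (30, 30), corners))  # roughly 45 degrees
```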
Step S304: and outputting a target sound effect corresponding to the current position of the enemy virtual character in the game scene according to the sound effect type corresponding to the current position of the enemy virtual character in the game scene, the sound source distance corresponding to the current position of the enemy virtual character in the game scene and the sound source direction of the enemy virtual character at the current position in the game scene.
As described above, the sound field data includes information such as the surround condition of sound. The surround parameter of the sound emitted by the sound source can be determined according to the distances from the points on the boundary of the viewing range centered on the current position of the enemy virtual character to the sound source position and the total number of points on that boundary. The target sound effect corresponding to the current position of the enemy virtual character in the game scene can then be output according to the sound effect type corresponding to that current position, the sound source distance corresponding to that current position, the sound source azimuth of the enemy virtual character at that current position, and the surround parameter of the sound emitted by the sound source.
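A sketch, under assumed weighting, of deriving the surround parameter from the boundary points and handing everything to the audio layer:

```python
# The surround parameter is derived here as the mean boundary-point distance
# to the source; the weighting and the play() interface are assumptions, not
# a specific engine API.
import math

def surround_parameter(boundary_points, source_pos):
    if not boundary_points:
        return 0.0
    return sum(math.dist(p, source_pos) for p in boundary_points) / len(boundary_points)

def output_target_sound_effect(play, effect_type, source_distance, azimuth_deg, surround):
    # source_distance is the clamped "max propagation minus geometric distance"
    # value, so a larger value means the source is closer (and thus louder).
    play(effect_type, volume=source_distance, pan=azimuth_deg, spread=surround)

output_target_sound_effect(
    lambda *a, **kw: print(a, kw),  # stand-in for the engine's audio call
    "footsteps", source_distance=70.0, azimuth_deg=45.0,
    surround=surround_parameter([(-10, -10), (10, 10)], (30, 30)))
```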
In the embodiment shown in fig. 3, the target sound effect corresponding to the current position of the enemy virtual character in the game scene is output according to the sound effect type, the sound source distance and the sound source azimuth, so that sound field data corresponding to the position of a virtual character can be acquired at different positions, and the technical problem of inefficient sound field data configuration can be solved.
As another embodiment of the present application, outputting the first sound effect corresponding to the identity and the state of the first virtual character to the terminal operated by the player according to the identity and the state of the first virtual character may be: and if the identity of the first virtual character is the enemy virtual character, outputting a third sound effect to the terminal controlled by the player, and outputting a fifth sound effect to the terminal controlled by the player according to the game scene of the enemy virtual character when moving to the second virtual character. As in the previous embodiment, the third sound effect output to the terminal operated by the player is different from the sound effect output when the identity of the first virtual character is the friend virtual character, and the third sound effect or the sound effect output when the identity of the first virtual character is the friend virtual character can be previously made into a configuration file, and then, according to the identity and the state of the first virtual character, the corresponding sound effect is read from the configuration file and output to the terminal operated by the player.
In an embodiment of the present application, the above-mentioned outputting of the fifth sound effect to the terminal manipulated by the player according to the game scene of the enemy virtual character moving to the second virtual character can be implemented by steps S401 to S404 illustrated in fig. 4, and the following description is provided:
step S401: the method comprises the steps of obtaining the type of a first game scene, the type of a second game scene and scene experience duration when an enemy virtual character moves, wherein the scene experience duration is the time consumed when the enemy virtual character moves from the first game scene to the second game scene.
In the embodiment of the application, each game scene type can be prepared as a configuration file by a game planner in advance, and when each virtual character is in one game scene, the type of the game scene is read from the configuration file; the configuration file also configures the path of the virtual character moving from one game scene to another game scene. Therefore, the type of the first game scene and the type of the second game scene in which the enemy virtual character moves can be acquired from the configuration file. As for the scene elapsed time, the time spent by the virtual character moving from the first game scene to the second game scene may be calculated according to the speed and path of the virtual character moving from the first game scene to the second game scene.
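For example, with a piecewise-linear move path (an assumed representation), the scene experience duration reduces to path length divided by speed:

```python
# Sketch: scene experience duration = length of the move path / movement speed.
import math

def scene_elapsed_time(path_points, speed):
    length = sum(math.dist(a, b) for a, b in zip(path_points, path_points[1:]))
    return length / speed if speed > 0 else float("inf")

print(scene_elapsed_time([(0, 0), (3, 4), (3, 10)], speed=2.0))  # (5 + 6) / 2 = 5.5
```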
Step S402: and determining a first sound effect parameter corresponding to the first game scene type and a second sound effect parameter corresponding to the second game scene type.
Specifically, determining the first sound effect parameter corresponding to the first game scene type or the second sound effect parameter corresponding to the second game scene type may be as follows: each scene time within the scene experience duration is differenced with the scene time corresponding to the sound effect parameter within the scene experience duration, so as to obtain a difference value corresponding to each scene time; for each scene time, the difference value corresponding to that scene time is divided by the scene experience duration to obtain the time proportion between that scene time and the scene experience duration; and for each of at least two audio elements, the audio element values taken at the corresponding time proportion at each scene time of the scene experience duration constitute the sound effect parameter, where an audio element may be volume and/or intensity, and the like.
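One way to read this step (an interpretation, not the patent's exact formulas) is to convert each sampled scene time into a proportion of the scene experience duration and evaluate each audio element at that proportion:

```python
# Sketch: build a sound effect parameter by sampling hypothetical volume and
# intensity curves at the time proportion of each scene time.
def sound_effect_parameter(scene_times, elapsed, element_curves):
    params = []
    for t in scene_times:
        # The "difference value divided by the scene experience duration" of
        # the text is treated here as a simple time proportion in [0, 1].
        proportion = t / elapsed
        params.append({name: curve(proportion) for name, curve in element_curves.items()})
    return params

curves = {"volume": lambda r: 0.2 + 0.8 * r, "intensity": lambda r: r ** 2}
print(sound_effect_parameter([0.0, 2.5, 5.0], elapsed=5.0, element_curves=curves))
```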
Step S403: and determining the intermediate sound effect parameters according to the first sound effect parameters and the second sound effect parameters.
In the embodiment of the application, the intermediate sound effect parameter can be determined by adopting an interpolation algorithm according to the first sound effect parameter and the second sound effect parameter.
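Since the embodiment only names "an interpolation algorithm", a linear blend is one assumed choice:

```python
# Sketch: linear interpolation between the first and second sound effect
# parameters; t in [0, 1] is the progress of the scene transition.
def intermediate_parameter(first, second, t):
    return {key: (1.0 - t) * first[key] + t * second[key] for key in first}

print(intermediate_parameter({"volume": 0.2, "intensity": 0.1},
                             {"volume": 0.9, "intensity": 0.6}, t=0.5))
```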
Step S404: and outputting a fifth sound effect to the terminal operated by the player according to the intermediate sound effect parameter and the second sound effect parameter.
That is, as the enemy virtual character moves from the first game scene to the second game scene, the sound effects corresponding to the first sound effect parameter, the intermediate sound effect parameter and the second sound effect parameter are output to the terminal operated by the player, forming the fifth sound effect. Since the intermediate sound effect parameter is obtained by interpolating between the first sound effect parameter and the second sound effect parameter, the fifth sound effect can include more distinctive effects such as a fade-in sound effect, a fade-out sound effect or a steady sound effect.
In the embodiment of fig. 4 described above, the intermediate sound effect parameter is determined from the first sound effect parameter and the second sound effect parameter, and the fifth sound effect is finally output to the terminal operated by the player according to the intermediate sound effect parameter and the second sound effect parameter, so that the change of the sound effect is smoother and more natural, improving the sound effect output when the enemy virtual character moves to the second virtual character.
Step S202, if the identity of the first virtual character is the enemy virtual character, after the second virtual character controlled by the player attacks the enemy virtual character, a second sound effect corresponding to the attacking result is output to the terminal controlled by the player according to the attacking result.
In an existing FPS game, there is either no feedback when the virtual character controlled by the player hits an enemy virtual character, or the feedback is the same regardless of which part of the enemy virtual character is hit, so a good experience cannot be provided for a visually impaired game player. In the embodiment of the application, after the second virtual character controlled by the player launches an attack (such as shooting, slashing, gunfire, and the like) at the enemy virtual character, the second sound effect corresponding to the attack result is output to the terminal operated by the player according to the attack result. Specifically, after the second virtual character controlled by the player shoots the enemy virtual character, outputting the second sound effect corresponding to the attack result to the terminal operated by the player according to the attack result may be: calculating the damage value of the enemy virtual character after it is attacked, and outputting a sound effect corresponding to the damage value to the player according to the damage value. In an embodiment in which the attack launched by the second virtual character at the enemy virtual character is shooting or gunfire, the damage value of the enemy virtual character can be obtained by comprehensive calculation from the velocity of the bullet when it is fired and the part of the enemy virtual character that is hit. Specifically: the initial velocity of the bullet when the second virtual character shoots is obtained; the final velocity of the bullet after passing through a medium is calculated according to the initial velocity of the bullet and the obstruction parameter of the medium the bullet passes through during flight; and the damage value of the enemy virtual character is calculated according to the initial velocity and the final velocity of the bullet and the part of the enemy virtual character that is hit. Obviously, the larger the initial velocity and/or the final velocity of the bullet, and the more vital the damaged part of the enemy virtual character (for example, the head or the heart is a vital part, whereas the four limbs are not), the larger the damage value of the enemy virtual character. The larger the damage value of the enemy virtual character, the more intense the corresponding sound effect, for example, a larger volume or a scream.
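A hedged sketch of this damage-to-sound mapping follows; the velocity decay model, body-part multipliers, scaling and asset names are all illustrative assumptions rather than the patent's concrete values.

```python
# Sketch: the bullet's final velocity after each obstructing medium, combined
# with whether a vital part was hit, yields a damage value that selects the
# feedback sound.
VITAL_MULTIPLIER = {"head": 2.5, "heart": 2.0, "torso": 1.0, "limb": 0.5}

def final_velocity(initial_velocity, barrier_params):
    v = initial_velocity
    for attenuation in barrier_params:          # one factor per medium penetrated
        v *= max(0.0, 1.0 - attenuation)
    return v

def damage_value(initial_velocity, barrier_params, hit_part):
    v_final = final_velocity(initial_velocity, barrier_params)
    base = 0.5 * (initial_velocity + v_final) * 0.1     # assumed scaling
    return base * VITAL_MULTIPLIER.get(hit_part, 1.0)

def feedback_sound(damage):
    if damage >= 80:
        return "kill_scream.wav"                # hypothetical asset names
    if damage >= 40:
        return "heavy_hit.wav"
    return "light_hit.wav"

dmg = damage_value(900.0, barrier_params=[0.3], hit_part="head")
print(dmg, feedback_sound(dmg))
```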
In the above embodiment of the present application, the player may perceive that there is an enemy virtual character to the left, to the right or behind, and at this time the player needs to control the second virtual character to turn around. Therefore, in the embodiment of the application, the method further includes: in response to a turning control operation for the second virtual character, controlling the second virtual character to turn, and outputting prompt information of the turning angle through the terminal corresponding to the second virtual character. Specifically, outputting the prompt information of the turning angle through the terminal corresponding to the second virtual character in the above embodiment may be: when the second virtual character controlled by the player turns around, determining, in response to a triggering operation on an angle recording control presented in the current game scene, a viewing area centered on the second virtual character; and when at least one enemy virtual character exists in the viewing area, outputting azimuth prompt information of the at least one enemy virtual character relative to the second virtual character. To accommodate visually impaired players as far as possible, when the angle recording control is presented in the current game scene, if the color of the image within a set range of the current game scene and the color of the angle recording control are the same, the color of the angle recording control can be adjusted so that it is displayed in a different color, and the angle recording control can also be enlarged. Determining, in response to the triggering operation on the angle recording control, the viewing area centered on the second virtual character may be: in response to the triggering operation on the angle recording control, acquiring the position of the second virtual character in the current game scene and the target viewing distance corresponding to the second virtual character; and determining the viewing area centered on the second virtual character by taking the position of the second virtual character as the center and the target viewing distance as the radius. Outputting the azimuth prompt information of the at least one enemy virtual character relative to the second virtual character may be: determining the visual angle orientation of the second virtual character in the current game scene; and outputting, with the visual angle orientation of the second virtual character as a reference, the azimuth prompt information of the at least one enemy virtual character relative to the second virtual character, wherein the azimuth prompt information is used for indicating the direction of the at least one enemy virtual character relative to the second virtual character.
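The turn-around prompt flow can be sketched as below; the angle conventions, the circular viewing area test and the spoken output format are assumptions.

```python
# Sketch: build a circular viewing area around the player's character and
# report each enemy's bearing relative to the character's facing direction.
import math

def enemies_in_view(player_pos, view_distance, enemy_positions):
    return [p for p in enemy_positions if math.dist(player_pos, p) <= view_distance]

def relative_bearing(player_pos, facing_deg, enemy_pos):
    absolute = math.degrees(math.atan2(enemy_pos[1] - player_pos[1],
                                       enemy_pos[0] - player_pos[0]))
    return (absolute - facing_deg + 180.0) % 360.0 - 180.0   # normalized to [-180, 180)

def announce(player_pos, facing_deg, view_distance, enemy_positions):
    for enemy in enemies_in_view(player_pos, view_distance, enemy_positions):
        bearing = relative_bearing(player_pos, facing_deg, enemy)
        side = "right" if bearing < 0 else "left"
        print(f"Enemy {abs(bearing):.0f} degrees to your {side}")

announce((0, 0), facing_deg=90.0, view_distance=50.0,
         enemy_positions=[(10, 10), (100, 0), (-5, 20)])
```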
According to the in-game information prompting method provided by the embodiment of the application, the first sound effect and the second sound effect can be respectively output to the terminal operated by the player according to the identity of the first virtual character, the state of the first virtual character when it moves to the second virtual character controlled by the player, and the attack result of the second virtual character against the enemy virtual character, so that the player can judge, from the sound effects alone, the identity of the virtual character and the feedback produced when the enemy virtual character is attacked. When the player is visually impaired (for example, blind) and cannot easily observe information in the game scene, the accuracy with which the player controls his or her virtual character can therefore be greatly improved, and the player's game experience can be improved.
In order to better implement the in-game information prompting method of the embodiment of the application, the embodiment of the application further provides an in-game information prompting device. Please refer to fig. 5, which is a schematic structural diagram of an in-game information prompting device according to an embodiment of the present application. The in-game information prompting device may include a first sound effect output module 501 and a second sound effect output module 502, wherein:
a first sound effect output module 501, configured to output a first sound effect corresponding to the identity and the state of a first virtual character to a terminal operated by a player according to the identity and the state of the first virtual character when the first virtual character moves to a second virtual character controlled by the player;
the second sound effect output module 502 is configured to, if the identity of the first virtual character is an enemy virtual character, output a second sound effect corresponding to an attack result to a terminal operated by the player according to the attack result after the second virtual character controlled by the player attacks the enemy virtual character.
Alternatively, the first sound effect output module 501 illustrated in fig. 5 may include a third sound effect output unit and a fourth sound effect output unit, wherein:
the third sound effect output unit is used for outputting a third sound effect to the terminal controlled by the player if the identity of the first virtual character is an enemy virtual character;
and the fourth sound effect output unit is used for outputting a fourth sound effect to the terminal operated by the player according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character.
Optionally, the fourth sound effect output unit may include a first determining unit, a first obtaining unit, a second obtaining unit, and a first output unit, where:
the first determining unit is used for determining the current position of the enemy virtual character in the game scene according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character;
the first acquisition unit is used for acquiring a sound effect type and a sound source distance corresponding to the current position of the enemy virtual character in a game scene;
the second acquisition unit is used for acquiring the sound source position of the virtual character of the enemy at the current position in the game scene according to the sound source distance;
and the first output unit is used for outputting a target sound effect corresponding to the virtual character of the enemy at the current position according to the sound effect type, the sound source distance and the sound source direction.
Alternatively, the first sound effect output module 501 illustrated in fig. 5 may include a third sound effect output unit and a fifth sound effect output unit, wherein:
the third sound effect output unit is used for outputting a third sound effect to the terminal controlled by the player if the identity of the first virtual character is an enemy virtual character;
and the fifth sound effect output unit is used for outputting a fifth sound effect to the terminal operated by the player according to the game scene of the enemy virtual character when moving to the second virtual character.
Optionally, the fifth sound effect output unit may include a third obtaining unit, a second determining unit, a third determining unit, and a second output unit, where:
the third acquisition unit is used for acquiring the type of a first game scene, the type of a second game scene and scene experience duration when the enemy virtual character moves, wherein the scene experience duration is the time consumed when the enemy virtual character moves from the first game scene to the second game scene;
the second determining unit is used for determining a first sound effect parameter corresponding to the first game scene type and a second sound effect parameter corresponding to the second game scene type;
the third determining unit is used for determining the intermediate sound effect parameter according to the first sound effect parameter and the second sound effect parameter;
and the second output unit is used for outputting a fifth sound effect to the terminal controlled by the player according to the intermediate sound effect parameter and the second sound effect parameter.
Alternatively, the second sound effect output module 502 illustrated in fig. 5 may include a damage value calculation unit and a third output unit, wherein:
the damage value calculating unit is configured to calculate the damage value of the enemy virtual character after the enemy virtual character is attacked;
and the third output unit is configured to output a sound effect corresponding to the damage value to the terminal operated by the player according to the damage value of the enemy virtual character.
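One straightforward, purely illustrative way to map the computed damage value to a feedback sound effect is a threshold table; the thresholds and effect names below are invented for the example and are not taken from the application.

```python
# Sketch: choose a feedback sound effect from the damage value so the player
# can judge the hit by ear. Thresholds and names are assumptions.

DAMAGE_EFFECTS = [
    (0,   "hit_miss"),        # no damage dealt
    (1,   "hit_light"),
    (50,  "hit_heavy"),
    (100, "hit_eliminated"),
]

def effect_for_damage(damage):
    chosen = DAMAGE_EFFECTS[0][1]
    for threshold, effect in DAMAGE_EFFECTS:
        if damage >= threshold:
            chosen = effect
    return chosen

print(effect_for_damage(72))  # hit_heavy
```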
Optionally, the apparatus illustrated in fig. 5 may further include a control module, configured to control the second virtual character to turn in response to a turning control operation for the second virtual character, and to output prompt information of the turning angle through the terminal corresponding to the second virtual character.
Optionally, the control module may include a fourth determining unit and an output unit, wherein:
the fourth determining unit is configured to determine a viewing area centered on the second virtual character in response to a trigger operation for an angle recording control presented in the current game scene when the second virtual character controlled by the player turns;
and the output unit is configured to output, when at least one enemy virtual character is in the viewing area, azimuth prompt information of the at least one enemy virtual character relative to the second virtual character.
Optionally, the fourth determining unit of the above example may include an acquiring unit and a viewing area determining unit, and the output unit of the above example may include a visual angle orientation determining unit and an azimuth prompt information output unit, wherein:
the acquiring unit is configured to acquire, in response to the trigger operation for the angle recording control, the position of the second virtual character in the current game scene and the target viewing distance corresponding to the second virtual character;
the viewing area determining unit is configured to determine a viewing area centered on the second virtual character, with the position of the second virtual character as the center and the target viewing distance as the radius;
the visual angle orientation determining unit is configured to determine the visual angle orientation of the second virtual character in the current game scene;
and the azimuth prompt information output unit is configured to output, with the visual angle orientation of the second virtual character as a reference, azimuth prompt information of the at least one enemy virtual character relative to the second virtual character, wherein the azimuth prompt information is used for indicating the direction of the at least one enemy virtual character relative to the second virtual character.
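To make the viewing area and the azimuth prompt concrete, the sketch below checks whether an enemy lies within the circular viewing area and then reports its bearing relative to the second virtual character's visual angle orientation, assuming 2D coordinates; the helper names and sign convention are assumptions for illustration only.

```python
# Sketch: circular viewing area test and bearing relative to the facing direction.
import math

def in_viewing_area(center, enemy_pos, viewing_distance):
    return math.dist(center, enemy_pos) <= viewing_distance

def relative_bearing(center, facing_deg, enemy_pos):
    """Signed angle from the facing direction to the enemy, counter-clockwise
    positive (so a negative value means the enemy is to the character's right)."""
    absolute = math.degrees(math.atan2(enemy_pos[1] - center[1],
                                       enemy_pos[0] - center[0]))
    return (absolute - facing_deg + 180.0) % 360.0 - 180.0

center, facing = (0.0, 0.0), 90.0          # character at the origin, facing +y
enemy = (3.0, 4.0)
if in_viewing_area(center, enemy, viewing_distance=10.0):
    print(f"enemy at {relative_bearing(center, facing, enemy):+.0f} degrees")  # -37 degrees
```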
All the above technical solutions can be combined arbitrarily to form the optional embodiments of the present application, and are not described herein again.
The in-game information prompting device provided by the embodiments of the present application can output the first sound effect and the second sound effect to the terminal operated by the player according to, respectively, the identity and state of the first virtual character when it moves toward the second virtual character controlled by the player, and the result of the second virtual character's attack on the enemy virtual character. The player can therefore judge the identity of a virtual character, and the feedback when an enemy virtual character is hit, by sound alone. When the player is visually impaired (for example, blind) and cannot easily observe information in the game scene, this greatly improves the accuracy with which the player controls the virtual character and improves the player's game experience.
Correspondingly, an embodiment of the present application further provides a computer device. The computer device may be a terminal or a server, and the terminal may be a device such as a smartphone, a tablet computer, a notebook computer, a touch screen device, a game machine, a Personal Computer (PC) or a Personal Digital Assistant (PDA). As shown in fig. 6, fig. 6 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 600 includes a processor 601 having one or more processing cores, a memory 602 having one or more computer-readable storage media, and a computer program stored on the memory 602 and executable on the processor. The processor 601 is electrically connected to the memory 602. Those skilled in the art will appreciate that the computer device configuration illustrated in the figure does not limit the computer device, which may include more or fewer components than those illustrated, combine certain components, or use a different arrangement of components.
The processor 601 is the control center of the computer device 600. It connects the various parts of the computer device 600 using various interfaces and lines, and performs the functions of the computer device 600 and processes its data by running or loading the software programs and/or modules stored in the memory 602 and by calling the data stored in the memory 602, thereby monitoring the computer device 600 as a whole.
In the embodiment of the present application, the processor 601 in the computer device 600 loads instructions corresponding to processes of one or more applications into the memory 602, and the processor 601 executes the applications stored in the memory 602 according to the following steps, so as to implement various functions:
when the first virtual character moves to a second virtual character controlled by a player, outputting a first sound effect corresponding to the identity and the state of the first virtual character to a terminal controlled by the player according to the identity and the state of the first virtual character; and if the identity of the first virtual character is the enemy virtual character, outputting a second sound effect corresponding to the attack result to the terminal controlled by the player according to the attack result after the second virtual character controlled by the player attacks the enemy virtual character.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 6, the computer device 600 further includes: a touch display screen 603, a radio frequency circuit 604, an audio circuit 605, an input unit 606, and a power supply 607. The processor 601 is electrically connected to the touch display screen 603, the radio frequency circuit 604, the audio circuit 605, the input unit 606, and the power supply 607. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 6 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The touch display screen 603 can be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 603 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations performed by the user on or near it (for example, operations performed with a finger, a stylus or any other suitable object or accessory) and to generate corresponding operation instructions that trigger the corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates and sends them to the processor 601, and can also receive and execute commands sent by the processor 601. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it passes the operation to the processor 601 to determine the type of the touch event, and the processor 601 then provides the corresponding visual output on the display panel according to that type. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 603 to implement input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions respectively. That is, the touch display screen 603 can also serve as a part of the input unit 606 to implement an input function.
In the embodiment of the present application, a game application is executed by the processor 601 to generate a graphical user interface on the touch display screen 603, where a virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display screen 603 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 604 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or another computer device, and to exchange signals with that network device or computer device.
The audio circuit 605 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 605 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into an audible sound output; conversely, the microphone converts collected sound into an electrical signal, which the audio circuit 605 receives and converts into audio data. The audio data is then processed by the processor 601 and, for example, transmitted to another computer device via the radio frequency circuit 604, or output to the memory 602 for further processing. The audio circuit 605 may also include an earbud jack to allow a peripheral headset to communicate with the computer device.
The input unit 606 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 607 is used to power the various components of the computer device 600. Optionally, the power supply 607 may be logically connected to the processor 601 through a power management system, so that charging, discharging and power consumption are managed through the power management system. The power supply 607 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
Although not shown in fig. 6, the computer device 600 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment can output the first sound effect and the second sound effect to the terminal operated by the player according to the identity and state of the first virtual character when it moves toward the second virtual character controlled by the player, and according to the result of the second virtual character's attack on the enemy virtual character. The player can therefore determine the identity of a virtual character, and the feedback when an enemy virtual character is hit, from the sound effects alone. When the player is visually impaired (for example, blind) and cannot easily observe information in the game scene, this greatly improves the accuracy with which the player controls the virtual character and improves the player's game experience.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium, in which a plurality of computer programs are stored, where the computer programs can be loaded by a processor to execute steps in any method for prompting information in an FPS game provided in embodiments of the present application. For example, the computer program may perform the steps of:
when the first virtual character moves to a second virtual character controlled by a player, outputting a first sound effect corresponding to the identity and the state of the first virtual character to a terminal controlled by the player according to the identity and the state of the first virtual character; and if the identity of the first virtual character is the enemy virtual character, outputting a second sound effect corresponding to the attack result to the terminal controlled by the player according to the attack result after the second virtual character controlled by the player attacks the enemy virtual character.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer program stored in the storage medium can execute the steps of any information prompting method in an FPS game provided in the embodiments of the present application, it can achieve the beneficial effects of any such method; for details, refer to the foregoing embodiments, which are not repeated here.
The information prompting method, device, storage medium and computer device in an FPS game provided by the embodiments of the present application have been described in detail above. Specific examples have been used to explain the principles and implementation of the present application, and the description of the above embodiments is only intended to help readers understand the method and its core idea. At the same time, those skilled in the art may, following the idea of the present application, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present application.
Claims (12)
1. A method for prompting information in a game is characterized by comprising the following steps:
when a first virtual character moves to a second virtual character controlled by a player, outputting a first sound effect corresponding to the identity and the state of the first virtual character to a terminal controlled by the player according to the identity and the state of the first virtual character;
and if the identity of the first virtual character is an enemy virtual character, outputting a second sound effect corresponding to the attack result to the terminal controlled by the player according to the attack result after the second virtual character attacks the enemy virtual character.
2. A method for prompting information in a game as set forth in claim 1, wherein outputting a first sound effect corresponding to the identity and status of the first virtual character to the terminal operated by the player according to the identity and status of the first virtual character comprises:
if the identity of the first virtual character is an enemy virtual character, outputting a third sound effect to a terminal controlled by the player;
and outputting a fourth sound effect to the terminal operated by the player according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character.
3. A method for prompting information in a game according to claim 2, wherein the outputting of a fourth sound effect to the player-operated terminal according to the moving speed of the enemy virtual character and/or the distance information from the second virtual character includes:
determining the current position of the enemy virtual character in a game scene according to the moving speed of the enemy virtual character and/or the distance information between the enemy virtual character and the second virtual character;
acquiring a sound effect type and a sound source distance corresponding to the current position of the enemy virtual character in a game scene;
acquiring the sound source azimuth at the current position according to the sound source distance;
and outputting a target sound effect corresponding to the current position according to the sound effect type, the sound source distance and the sound source azimuth.
4. A method for prompting information in a game as set forth in claim 1, wherein outputting a first sound effect corresponding to the identity and status of the first virtual character to the terminal operated by the player according to the identity and status of the first virtual character comprises:
if the identity of the first virtual character is an enemy virtual character, outputting a third sound effect to a terminal controlled by the player;
and outputting a fifth sound effect to the terminal operated by the player according to the game scene of the enemy virtual character when moving to the second virtual character.
5. A method for prompting information in a game as defined in claim 4, wherein outputting a fifth sound effect to the player-operated terminal according to a game scene in which the enemy virtual character moves to the second virtual character comprises:
the method comprises the steps of obtaining the type of a first game scene, the type of a second game scene and scene experience duration when the enemy virtual character moves, wherein the scene experience duration is the time consumed when the enemy virtual character moves from the first game scene to the second game scene;
determining a first sound effect parameter corresponding to the first game scene type and a second sound effect parameter corresponding to the second game scene type;
determining an intermediate sound effect parameter according to the first sound effect parameter and the second sound effect parameter;
and outputting the fifth sound effect to the terminal controlled by the player according to the middle sound effect parameter and the second sound effect parameter.
6. A method for prompting information in a game according to claim 1, wherein outputting a second sound effect corresponding to a result of an attack to the player-controlled terminal according to the result of the attack after the attack is given to the enemy virtual character by the second virtual character comprises:
calculating the damage value of the enemy virtual character after the enemy virtual character is attacked;
and outputting a sound effect corresponding to the damage value to a terminal operated and controlled by the player according to the damage value.
7. A method of prompting information in a game as in claim 1, the method further comprising:
and responding to the turning control operation aiming at the second virtual character, controlling the second virtual character to turn, and outputting prompt information of a turning angle through a terminal corresponding to the second virtual character.
8. The method for prompting information in a game as claimed in claim 7, wherein the outputting the prompting information of the turning angle through the terminal corresponding to the second virtual character comprises:
when the second virtual character turns around, responding to the triggering operation of an angle recording control presented in the current game scene, and determining a viewing area taking the second virtual character as the center;
when at least one enemy virtual character exists in the viewing area, outputting the azimuth prompt information of the at least one enemy virtual character relative to the second virtual character.
9. A method for prompting information in a game as in claim 8, wherein said determining a viewing area centered on said second virtual character in response to a triggering action for an angle recording control presented in a current game scene comprises: responding to the triggering operation of the angle recording control, and acquiring the position of the second virtual character in the current game scene and the target viewing distance corresponding to the second virtual character; determining a viewing area taking the second virtual character as the center by taking the position of the second virtual character as the center and taking the target viewing distance as the radius;
the outputting of the azimuth prompt information of the at least one enemy virtual character relative to the second virtual character comprises: determining a visual angle orientation of the second virtual character in the current game scene; and outputting azimuth prompt information of the at least one enemy virtual character relative to the second virtual character with the visual angle orientation of the second virtual character as a reference, wherein the azimuth prompt information is used for indicating the direction of the at least one enemy virtual character relative to the second virtual character.
10. An information presentation apparatus in a game, comprising:
the first sound effect output module is used for outputting a first sound effect corresponding to the identity and the state of a first virtual character to a terminal controlled by a player according to the identity and the state of the first virtual character when the first virtual character moves to a second virtual character controlled by the player;
and the second sound effect output module is used for outputting a second sound effect corresponding to the attack result to the terminal controlled by the player according to the attack result after the second virtual character attacks the enemy virtual character if the identity of the first virtual character is the enemy virtual character.
11. A computer-readable storage medium, having stored thereon a computer program adapted to be loaded by a processor for performing the steps of the in-game information presentation method according to any one of claims 1 to 9.
12. A computer device, characterized in that the computer device comprises a memory in which a computer program is stored and a processor that executes the steps in the in-game information presentation method according to any one of claims 1 to 9 by calling the computer program stored in the memory.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110425667.6A CN113082709A (en) | 2021-04-20 | 2021-04-20 | Information prompting method and device in game, storage medium and computer equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110425667.6A CN113082709A (en) | 2021-04-20 | 2021-04-20 | Information prompting method and device in game, storage medium and computer equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113082709A true CN113082709A (en) | 2021-07-09 |
Family
ID=76679095
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110425667.6A Pending CN113082709A (en) | 2021-04-20 | 2021-04-20 | Information prompting method and device in game, storage medium and computer equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113082709A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002085831A (en) * | 2000-09-20 | 2002-03-26 | Namco Ltd | Game machine and information storage medium |
JP2004130003A (en) * | 2002-10-15 | 2004-04-30 | Namco Ltd | Game system, program, and information storage medium |
WO2017215649A1 (en) * | 2016-06-16 | 2017-12-21 | 广东欧珀移动通信有限公司 | Sound effect adjustment method and user terminal |
CN111773657A (en) * | 2020-08-11 | 2020-10-16 | 网易(杭州)网络有限公司 | Method and device for switching visual angles in game, electronic equipment and storage medium |
CN112316427A (en) * | 2020-11-05 | 2021-02-05 | 腾讯科技(深圳)有限公司 | Voice playing method and device, computer equipment and storage medium |
CN112642152A (en) * | 2020-12-28 | 2021-04-13 | 网易(杭州)网络有限公司 | Method and device for controlling target virtual character in game |
Non-Patent Citations (2)
Title |
---|
NGA-VONKCYMOS: "Overwatch Details You May Not Know: Differences in the Ultimate-Ability Sound Effects of Friendly and Enemy Sides" (《守望先锋你不知道的细节 敌我双方大招的音效区别》), Retrieved from the Internet <URL:https://games.sina.com.cn/o/z/overwatch/2016-06-07/fxsvenv6819529.shtml> *
高宁婧: "On Game Design and Implementation Based on 3D Virtual Imaging Technology" (论基于3D虚拟成像技术的游戏设计与实现), 美术教育研究 (Art Education Research), no. 019 *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113398582A (en) * | 2021-07-15 | 2021-09-17 | 网易(杭州)网络有限公司 | Game fighting picture display method and device, computer equipment and storage medium |
CN113398582B (en) * | 2021-07-15 | 2024-08-30 | 网易(杭州)网络有限公司 | Game combat picture display method, game combat picture display device, computer equipment and storage medium |
CN113546404A (en) * | 2021-07-30 | 2021-10-26 | 网易(杭州)网络有限公司 | Control method and device of virtual props in game and electronic terminal |
CN114917585A (en) * | 2022-06-24 | 2022-08-19 | 四川省商投信息技术有限责任公司 | Sound effect generation method and system |
CN115193044A (en) * | 2022-07-12 | 2022-10-18 | 网易(杭州)网络有限公司 | Method and device for playing threat degree sound effect, storage medium and electronic equipment |
CN116328309A (en) * | 2023-03-27 | 2023-06-27 | 广州美术学院 | High-order demand game interaction method aiming at visual impairment crowd Wen Nong travel virtual scene |
CN116328309B (en) * | 2023-03-27 | 2023-10-13 | 广州美术学院 | High-order demand game interaction method aiming at visual impairment crowd Wen Nong travel virtual scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |