CN113559504B - Information processing method, information processing device, storage medium and electronic equipment


Info

Publication number
CN113559504B
Authority
CN
China
Prior art keywords
sound source
source object
waveform
azimuth
sound
Prior art date
Legal status
Active
Application number
CN202110853316.5A
Other languages
Chinese (zh)
Other versions
CN113559504A (en)
Inventor
陶毅阳
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202410391501.0A (published as CN118179018A)
Publication of CN113559504A
Application granted
Publication of CN113559504B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/54: Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308: Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Acoustics & Sound (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an information processing method and device, a storage medium, and an electronic device. The method includes: providing a graphical user interface through a terminal device, where the content displayed in the graphical user interface at least partially includes a game scene of the game and an azimuth indicator; in response to a sound source object in the game scene emitting a specified sound, acquiring first position information, where the first position information characterizes the position of the sound source object in the game scene; determining an azimuth value corresponding to the azimuth indicator based on the first position information; and displaying a waveform diagram corresponding to the sound source object according to the azimuth value. The waveform diagram replaces the existing simple graphic representation, and richer waveform diagrams can express more complex and effective sound source information.

Description

Information processing method, information processing device, storage medium and electronic equipment
Technical Field
The present invention relates to the field of computers, and in particular, to an information processing method, an information processing device, a computer-readable storage medium, and an electronic device.
Background
With the wave of the internet, the continuous development of hardware and software technology has driven the emergence of smart devices and software. At the same time, a large number of mobile games on different themes have appeared to meet users' demands.
However, existing games still have a number of drawbacks that affect the user's gaming experience. For example, visualized sound effects typically take a simple form of presentation, so some of the more complex but useful information is missing.
Disclosure of Invention
The embodiments of the present application provide an information processing method, an information processing device, a computer-readable storage medium, and an electronic device, which can replace the existing simple graphic representation with a waveform diagram, expressing more complex and effective sound source information through rich waveform diagrams.
To solve the above technical problems, the embodiments of the present application provide the following technical solutions:
An information processing method, in which a graphical user interface is provided through a terminal device, the content displayed in the graphical user interface at least partially containing a game scene of the game and an azimuth indicator, the method comprising:
in response to a sound source object in the game scene emitting a specified sound, acquiring first position information, wherein the first position information characterizes the position of the sound source object in the game scene;
determining an azimuth value corresponding to the azimuth indicator based on the first position information;
and displaying a waveform diagram corresponding to the sound source object according to the azimuth value.
An information processing apparatus, the apparatus comprising:
a graphical user interface, the content displayed in the graphical user interface at least partially comprising a game scene of the game and an azimuth indicator, wherein the azimuth indicator is used for indicating azimuth information in at least one game scene;
a first response module, configured to acquire first position information in response to a sound source object in the game scene emitting a specified sound, wherein the first position information characterizes the position of the sound source object in the game scene;
a first determining module, configured to determine an azimuth value corresponding to the azimuth indicator based on the first position information;
and a first display module, configured to display the waveform diagram corresponding to the sound source object according to the azimuth value.
An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the above-described information processing method steps via execution of the executable instructions.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the above-described information processing method.
In the embodiments of the present application, a graphical user interface is provided through a terminal device, the content displayed in the graphical user interface at least partially including a game scene of the game and an azimuth indicator; in response to a sound source object in the game scene emitting a specified sound, first position information is acquired, where the first position information characterizes the position of the sound source object in the game scene; an azimuth value corresponding to the azimuth indicator is determined based on the first position information; and a waveform diagram corresponding to the sound source object is displayed according to the azimuth value. The waveform diagram replaces the existing simple graphic representation, and richer waveform diagrams can express more complex and effective sound source information.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1a is a schematic system diagram of an information processing system according to an embodiment of the present application.
Fig. 1b is a schematic flow chart of an information processing method according to an embodiment of the present application.
Fig. 1c is a first schematic view of an azimuth indicator provided in an embodiment of the present application.
Fig. 1d is a second schematic view of an azimuth indicator provided in an embodiment of the present application.
Fig. 1e is a third schematic view of an azimuth indicator provided in an embodiment of the present application.
Fig. 1f is a fourth schematic view of an azimuth indicator provided in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
In order to make the present application solution better understood by those skilled in the art, the following description will be made in detail and with reference to the accompanying drawings in the embodiments of the present application, it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe the embodiments of the present application described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be further noted that various triggering events disclosed in the present specification may be preset, and different triggering events may trigger different functions to be executed.
The embodiments of the present application provide an information processing method, an information processing device, a storage medium, and an electronic device. Specifically, the information processing method of the embodiments of the present application may be performed by an electronic device, where the electronic device may be a terminal device, a server, or the like. The terminal device may be a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), a personal digital assistant (PDA), or the like, and may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN), big data, and artificial intelligence platforms.
For example, when the information processing method is run on a terminal device, the terminal device may be a local terminal device. Taking a game as an example, the terminal device stores a game application program and presents a part of game scenes in the game through a display component. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
When the information processing method runs on a server, it can be implemented and executed based on a cloud interaction system, where the cloud interaction system includes the server and a client device. Various cloud applications can run under the cloud interaction system, for example, cloud games. A cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the execution body of the game application program and the game picture presentation body are separated: the storage and running of the information processing method are completed on the cloud game server, while the game picture presentation is completed at a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game pictures; for example, it may be a display device with a data transmission function near the user side, such as a mobile terminal device, a television, a computer, a palmtop computer, or a personal digital assistant, but the terminal device executing the game information processing method is the cloud game server in the cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, returns the data to the cloud game client through the network, and finally the cloud game client decodes the data and outputs the game pictures.
A game scene (also referred to as a virtual scene) is a virtual scene that an application program displays (or provides) when running on a terminal device or server. Optionally, the virtual scene is a simulation of the real world, a semi-simulated semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene is either a two-dimensional or a three-dimensional virtual scene, and the virtual environment can be sky, land, ocean, and the like, where the land includes environmental elements such as deserts and cities. The virtual scene is the scene in which the complete game logic runs with virtual objects under user control. For example, in a sandbox 3D shooting game, the virtual scene is a 3D game world in which the player controls a virtual object to fight; an exemplary virtual scene may include at least one element selected from mountains, flat lands, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles. In a 2D card game, the virtual scene is a scene for showing released cards or the virtual objects corresponding to the released cards; an exemplary virtual scene may include an arena, a battle field, or other "field" elements that can display the status of card play. For a 2D or 3D multiplayer online tactical game, the virtual scene is a 2D or 3D terrain scene in which virtual objects fight; an exemplary virtual scene may include mountains, rivers, classrooms, tables and chairs, podiums, and other elements.
The game interface is an interface corresponding to the application program provided or displayed through the graphical user interface, and the interface comprises a UI interface and a game picture for the player to interact. In alternative embodiments, game controls (e.g., skill controls, movement controls, functionality controls, etc.), indication identifiers (e.g., direction indication identifiers, character indication identifiers, etc.), information presentation areas (e.g., number of clicks, time of play, etc.), or game setting controls (e.g., system settings, stores, gold coins, etc.) may be included in the UI interface. In an alternative embodiment, the game screen is a display screen corresponding to the virtual scene displayed by the terminal device, and the game screen may include virtual objects such as game characters, NPC characters, AI characters, and the like for executing game logic in the virtual scene.
Game objects (or virtual objects, game characters) refer to dynamic objects that can be controlled in a virtual scene. Optionally, the dynamic object may be a virtual character, a virtual animal, a cartoon character, or the like. The virtual object is a character that a player controls through an input device, an artificial intelligence (AI) configured through training to fight in the virtual environment, or a non-player character (NPC) set in the virtual environment. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in the battle is preset or dynamically determined according to the number of clients joining the battle, which is not limited in the embodiments of the present application. In one possible implementation, the user can control a virtual object to move in the virtual scene, e.g., to run, jump, or crawl, and can also control the virtual object to fight other virtual objects using the skills, virtual props, and the like provided by the application. (Game) props refer to props that virtual objects can use in the virtual environment, including but not limited to guns, cold weapons, torches, shields, springboards, and puppets. Props can be used by virtual objects to boost their own attributes, assist in combat, or inflict damage on other virtual objects; they can also be replenishment props such as bullets, or accessories assembled on a designated virtual weapon, such as extended magazines, sights, flash suppressors, and gun stocks. The virtual camera is an essential component of the game scene picture and is used for presenting the game scene picture. One game scene corresponds to at least one virtual camera, and according to actual needs, two or more virtual cameras can be used as game rendering windows to capture and present the picture content of the game world for the player. The viewing angle from which the player views the game world, such as a first-person or third-person viewing angle, can be adjusted by setting parameters of the virtual camera.
Referring to fig. 1a, fig. 1a is a schematic system diagram of an information processing system according to an embodiment of the present application. The system may include at least one electronic device 1000, at least one server 2000, at least one database 3000, and a network 4000. The electronic device 1000 held by the user may be connected to servers of different games through the network 4000. The electronic device 1000 is any device having computing hardware capable of supporting and executing software products corresponding to games. In addition, the electronic device 1000 has one or more multi-touch-sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points of the one or more touch-sensitive display screens. In addition, when the system includes a plurality of electronic devices 1000, a plurality of servers 2000, and a plurality of networks 4000, different electronic devices 1000 may be connected to each other through different networks 4000 and different servers 2000. The network 4000 may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, or a 5G network. In addition, different electronic devices 1000 may be connected to other terminal devices or to a server using their own Bluetooth network or hotspot network. For example, multiple users may be online through different electronic devices 1000, connected through an appropriate network, and synchronized with each other to support multiplayer gaming. In addition, the system may include a plurality of databases 3000 coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 while different users play the multiplayer game online.
The embodiments of the present application provide an information processing method which can be executed by a terminal device or a server. The embodiments of the present application are described with the example of the information processing method being executed by a terminal device. The terminal device includes a display component and a processor, where the display component is used for presenting a graphical user interface and receiving operation instructions generated by the user acting on the display component. When the user operates the graphical user interface through the display component, the graphical user interface can control the local content of the terminal device in response to the received operation instructions, and can also control the content of the peer server in response to the received operation instructions. For example, the operation instructions generated by the user for the graphical user interface include an instruction for launching the game application, and the processor is configured to launch the game application after receiving that instruction. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. The touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously at multiple points on the screen. The user performs touch operations on the graphical user interface with a finger, and when the graphical user interface detects a touch operation, it controls the corresponding virtual objects in the graphical user interface of the game to perform the actions corresponding to that operation. For example, the game may be any one of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, a first-person shooter (FPS) game, and the like. The game may include a virtual scene drawn on the graphical user interface. Further, the virtual scene of the game may include one or more virtual objects, such as virtual characters, controlled by the user (or player). In addition, the virtual scene of the game may also include one or more obstacles, such as rails, ravines, and walls, to limit the movement of virtual objects, e.g., to limit the movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, scores, character health status, and energy, to provide assistance to the player, provide virtual services, increase scores related to the player's performance, and so on. In addition, the graphical user interface may also present one or more indicators to provide indication information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, the one or more other virtual objects are controlled by other players of the game. Alternatively, the one or more other virtual objects may be computer-controlled, such as robots using artificial intelligence (AI) algorithms, implementing a human-machine battle mode. For example, virtual objects possess various skills or capabilities that the game player uses to achieve goals, and the virtual object may possess one or more weapons, props, or tools that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by the player using one of a plurality of preset touch operations on the touch display screen of the terminal device. The processor may be configured to present the corresponding game screen in response to operation instructions generated by the user's touch operations.
It should be noted that, the system schematic diagram of the information processing system shown in fig. 1a is only an example, and the information processing system and the scenario described in the embodiments of the present application are for more clearly describing the technical solutions of the embodiments of the present application, and do not constitute a limitation on the technical solutions provided in the embodiments of the present application, and those skilled in the art can know that, with the evolution of the information processing system and the appearance of a new service scenario, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
In this embodiment, description is given from the viewpoint of an information processing apparatus, which can be integrated in an electronic device that has a storage unit, is equipped with a microprocessor, and has computing capability.
The embodiment of the application provides an information processing method, and a graphical user interface is provided through terminal equipment, wherein the terminal equipment can be the terminal equipment mentioned above or can be client equipment in a cloud interaction system mentioned above.
In-game spatial sound effects often carry important information, but the player cannot obtain it without turning on the sound. Existing games typically employ simple graphical forms to visualize sound-related information on screen. Such visualized graphical forms typically associate certain properties of the graphic with information of a predetermined dimension of the sound, e.g., graphic size corresponds to sound distance, and different graphic colors correspond to different types of sound. However, in such an association of predetermined dimensions, information of some more complex dimensions is often lost, such as the three-dimensional spatial orientation of the sound source or the specific type of the sound source (e.g., the firearm model). In addition, this visual graphical form is relatively simple and somewhat monotonous.
Based on this, the embodiments of the present application first aim to provide an information processing method, which can replace the existing simple graphic representation with a waveform diagram form, and express more complex and effective sound source information through rich waveform diagrams.
A graphical user interface is provided through the terminal device, where the content displayed in the graphical user interface at least partially includes a game scene of the game and an azimuth indicator, and the azimuth indicator is used for indicating azimuth information in at least one game scene. Fig. 1b is a flowchart of an information processing method provided in an embodiment of the present application; as shown in fig. 1b, the method includes the following steps:
101, in response to a sound source object in the game scene emitting a specified sound, acquiring first position information, wherein the first position information characterizes the position of the sound source object in the game scene;
102, determining an azimuth value corresponding to the azimuth indicator based on the first position information;
and 103, displaying a waveform diagram corresponding to the sound source object according to the azimuth value.
The information processing method in this exemplary embodiment replaces the existing simple graphic representation with a waveform diagram, expressing more complex and effective sound source information through rich waveform diagrams.
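For concreteness only, the following minimal Python sketch wires steps 101-103 together. The function and parameter names, and the two callables standing in for engine-specific indicator code, are illustrative assumptions and not part of the claimed method:

```python
from typing import Callable

# (x, y, z) position in the game scene's world coordinate system.
Position = tuple[float, float, float]

def handle_specified_sound(
    source_position: Position,                 # step 101: first position information
    azimuth_for: Callable[[Position], float],  # step 102: position -> azimuth value
    show_waveform: Callable[[float], None],    # step 103: display at that azimuth
) -> None:
    """Run steps 101-103 when a sound source object emits a specified sound."""
    azimuth_value = azimuth_for(source_position)
    show_waveform(azimuth_value)

# Usage with placeholder callables; see the azimuth sketch further below.
handle_specified_sound(
    (3.0, 4.0, 5.0),
    azimuth_for=lambda pos: 45.0,  # placeholder mapping
    show_waveform=lambda az: print(f"waveform at {az:.0f} degrees"),
)
```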
Next, each step of the information processing method in the present exemplary embodiment will be further described.
In this exemplary embodiment, a graphical user interface is provided through the terminal, where the content displayed in the graphical user interface at least partially includes a game scene of the game, where the game scene includes at least one player character.
The graphical user interface is obtained by executing a software application on a processor of a mobile terminal or other terminal and rendering it on a display, and may be the display screen interface of the terminal device. The graphical user interface may present all of the game scene or only a portion of it. The game scene includes a number of static virtual objects, specifically ground, mountains, stones, vegetation, buildings, and the like. When the game scene is large, only local content of the game scene is displayed on the graphical user interface of the terminal device during the game. Optionally, the game scene includes a game character, where the game character may be a character operated by the player or an NPC (non-player character), which is not limited in this exemplary embodiment.
In an alternative embodiment, the azimuth indicator is used for indicating azimuth information in at least one game scene, and the azimuth indicator can represent the azimuth information in any visual graphic form. For example, the azimuth indicator may represent the azimuth information in the form of a compass, or in the form of a one-dimensional or two-dimensional cursor. The azimuth information in the game scene may be azimuth coordinates in the virtual world coordinate system of the game scene, or relative azimuth coordinates with respect to an arbitrary target; for example, the azimuth information may be the facing direction of a game character in the game scene.
In an alternative embodiment, the azimuth indicator may occupy a relatively large or a relatively small area on the graphical user interface. The azimuth indicator may be square, rectangular, frame-shaped, or of other shapes (e.g., circular). The content presented by the graphical user interface may include all or part of the azimuth indicator; for example, when the azimuth indicator is displayed enlarged in the graphical user interface, only part of it is shown on the graphical user interface of the terminal device. The azimuth indicator may be displayed in the upper left, upper right, or another location of the graphical user interface, which is not limited in this exemplary embodiment.
Specifically, in step 101, the sound source object may be a virtual firearm, such as a 98K sniper rifle, a submachine gun, or a pistol, or may be a virtual grenade, a vehicle sound, a footstep sound, or the like. The sound source object emits a specified sound; for example, a submachine gun emits a continuous "pop-pop" sound, a virtual grenade produces a "boom" when it explodes, and a virtual car emits a "vroom".
In an alternative embodiment, the first position information characterizes the position of the sound source object in the game scene, where the position may be a real-time position or a position delayed by a preset threshold time, and the position may be a two-dimensional or three-dimensional coordinate in the world coordinate system of the game scene, such as the three-dimensional coordinate value (3, 4, 5) (relative to the coordinate system origin (0, 0, 0)).
Specifically, in step 102, the azimuth indicator includes azimuth identifiers for indicating azimuths, which may include azimuth scales and/or azimuth descriptions. The azimuth scales may be expressed by numbers, such as 105, 120, 150, and so on, or by words, such as east, southeast, south, southwest, and so on. The azimuth description preferably takes the form of a text description, but may alternatively take the form of a graphic description; for example, "left rear" displayed at the leftmost side and "right rear" displayed at the rightmost side. On the one hand, this reduces the difficulty for the player of reading the azimuth scale; on the other hand, because screen space is limited, it may be difficult to display the entire 0-360 degree range of the azimuth scale on the screen equally, so only part of the scale area is shown in detail on the screen, and the remaining scale area can be represented roughly by the azimuth description.
In an alternative embodiment, the azimuth indicator may take the form of a ruler as shown in figs. 1c-1d, or other forms capable of indicating an azimuth, such as a simulated compass or an arc-shaped dial. Optionally, the scale values of the azimuth indicator may have a preset correspondence with azimuth angles; for example, one scale unit of the azimuth indicator corresponds to a 15° azimuth angle. Referring to the schematic illustrations of the azimuth indicator shown in figs. 1c-1d, the azimuth value of the azimuth indicator in fig. 1c corresponds to west (W) 285°, and the azimuth value of the azimuth indicator in fig. 1d corresponds to north (N) 44°.
In an alternative embodiment, the azimuth value of the azimuth indicator may represent the azimuth of the sound source object in the game scene. The coordinate system of the game scene may be a world coordinate system established with the position of the game character controlled by the terminal device as the origin of coordinates. Optionally, the north direction in the game scene may take the current facing of the game character as the reference direction, the moving direction of the game character as the reference direction, or any other set direction as the reference direction. The azimuth of the sound source object in the game scene is the direction of its position relative to the reference direction; for example, the sound source object is located at north (N) 44° of the game character. The azimuth indicator can provide azimuth reference information for the player while indicating direction, and the player can roughly judge the azimuth of the sound source without reading specific scales.
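As a sketch of how such an azimuth value could be computed, the Python below measures the compass angle of the sound source relative to the game character. It assumes a top-down world with the character at the origin and +Y as the reference (north) direction; the 15° per scale unit figure is the example given above, and all function names are illustrative:

```python
import math

def azimuth_degrees(player_pos: tuple[float, float],
                    source_pos: tuple[float, float]) -> float:
    """Azimuth of the sound source relative to the game character, in [0, 360)."""
    dx = source_pos[0] - player_pos[0]
    dy = source_pos[1] - player_pos[1]
    # atan2(dx, dy) yields 0 degrees due north (+Y), increasing clockwise.
    return math.degrees(math.atan2(dx, dy)) % 360.0

def scale_value(azimuth: float, degrees_per_unit: float = 15.0) -> float:
    """Map the azimuth onto indicator scale units (1 unit = 15 degrees here)."""
    return azimuth / degrees_per_unit

# A source due north-east of the character reads 45 degrees, i.e. 3 scale units.
assert abs(azimuth_degrees((0.0, 0.0), (1.0, 1.0)) - 45.0) < 1e-9
assert abs(scale_value(45.0) - 3.0) < 1e-9
```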
In some embodiments, the coordinate system of the game scene may also be a world coordinate system established with a point in the game scene as an origin of coordinates.
Specifically, in step 103, the waveform diagrams may be of a plurality of types, such as a sine wave, a trapezoidal wave, a square wave, a triangular wave, or a step wave. Optionally, the display size and display range of the waveform diagram may be set arbitrarily. "The waveform diagram corresponding to the sound source object" means that different types of sound source objects correspond to different types of waveform diagrams, one-to-one, one-to-many, or many-to-one; for example, the 98K sniper rifle corresponds to a sine wave diagram, the submachine gun corresponds to a trapezoidal wave diagram, and the virtual car corresponds to a triangular wave diagram. That is, the user can learn the types of different sound source objects by identifying different kinds of waveform diagrams. In this way, the graphic identification forms of sound source objects are enriched.
In some embodiments, the waveforms in the waveform diagram may be formed by arranging a plurality of columns of different lengths, as shown in fig. 1d.
It will be appreciated that the above is merely an example of waveforms in the waveform diagrams and is not intended to limit the present application.
In an alternative embodiment, the waveform diagram may change dynamically in real time according to the first position information, which itself changes in real time. For example, at time 1 s the first position information is southeast 230° and the displayed waveform diagram is a trapezoidal wave diagram; at time 2 s the first position information is northwest 330° and the displayed waveform diagram is a sine wave diagram.
In an alternative embodiment, the display position of the waveform diagram may be at a preset position of the azimuth indicator, where the display position corresponds to the azimuth value; for example, the display position of the waveform diagram is at the scale value corresponding to the azimuth value of the azimuth indicator, or at a preset distance from that scale value, which is not limited in the present application.
In an alternative embodiment, the waveform includes a plurality of wavelengths of different amplitudes, where the amplitude may be a lateral amplitude or a longitudinal amplitude. Optionally, the different amplitudes of the waveform diagram correspond to the heights at which the sound source object is located in the game scene, and the correspondence may be one-to-one, many-to-one, or one-to-many; for example, amplitude 3 corresponds to a height value in the range 0-3 in the game scene, and amplitude 4 corresponds to a height value equal to 4. In this way, the player can learn the relative position of the sound source object simply by identifying the wavelength amplitude in the waveform diagram, and in particular can judge whether the sound source object occupies a high point, which enriches the ways in which the position information of the sound source object is expressed.
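A minimal sketch of such an amplitude encoding follows. The bands for amplitudes 3 and 4 come from the example above; the behaviour outside those ranges is an illustrative assumption:

```python
def amplitude_for_height(height: float) -> int:
    """Waveform amplitude encoding the height of the sound source in the scene."""
    if height <= 3.0:
        return 3          # height value in the range 0-3
    if height <= 4.0:
        return 4          # height value around 4
    # Higher sources get proportionally larger amplitudes (assumption).
    return 4 + int(height - 4.0)

# A source on a rooftop renders with a larger amplitude than one at ground
# level, letting the player judge whether the source occupies a high point.
assert amplitude_for_height(1.0) == 3
assert amplitude_for_height(9.0) == 9
```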
In some embodiments, displaying the waveform diagram corresponding to the sound source object according to the azimuth value may include:
acquiring parameters corresponding to the sound source object, wherein the parameters corresponding to the sound source object comprise at least one of the type of the sound source object, the state of the specified sound, the relative distance between the sound source object and a virtual object, and the relative position between the sound source object and the virtual object;
and displaying the waveform diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object.
When the sound source objects include virtual firearms such as the 98K sniper rifle, submachine guns and pistols, virtual grenades, vehicles such as ships and cars, and footsteps, the types of sound source object may include the four types of virtual firearm, virtual grenade, vehicle, and footstep. In some embodiments, the types of sound source object may instead be finer-grained, including sniper rifles, submachine guns, pistols, virtual grenades, ships, cars, and footsteps.
The states of the specified sound include a first state, a second state, and a third state. When the specified sound is the sound the sound source object has just begun emitting, the state of the specified sound is determined to be the first state; when the specified sound is the last sound emitted by the sound source object, the state is determined to be the third state; and when the specified sound is any sound other than the first and the last among the sounds emitted by the sound source object, the state is determined to be the second state. For example, assuming that the sound source object is a submachine gun G1 emitting a series of consecutive "pop" sounds, the state of the first "pop" is the first state, the state of the last "pop" is the third state, and the states of the other "pops" are the second state.
In some embodiments, whether the sound source object emits the specified sound may be detected once every preset time period. When a sound source object is detected emitting the specified sound for the first time, the state of the detected sound is determined to be the first state; while the sound source object continues to be detected emitting the specified sound, the state of each subsequently detected sound is determined to be the second state; and when the sound source object is detected to have stopped emitting the specified sound, the state of the last specified sound detected before it stopped is determined to be the third state. The preset time period can be set by the electronic device according to a certain rule.
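The polling rule above can be sketched as follows, with one boolean sample per preset time period; the retroactive third-state assignment reflects the fact that the "last" sound is only known once silence is detected. All names are illustrative:

```python
from enum import Enum, auto
from typing import List, Optional

class SoundState(Enum):
    FIRST = auto()   # the sound the source just started emitting
    SECOND = auto()  # a continuing sound between the first and the last
    THIRD = auto()   # the last sound before the source fell silent

def classify_polls(emitting: List[bool]) -> List[Optional[SoundState]]:
    """Assign a state to each poll; None marks polls with no specified sound."""
    states: List[Optional[SoundState]] = [None] * len(emitting)
    previous = False
    for i, now in enumerate(emitting):
        if now and not previous:
            states[i] = SoundState.FIRST        # first detection
        elif now:
            states[i] = SoundState.SECOND       # still emitting
        elif previous:
            states[i - 1] = SoundState.THIRD    # retroactively mark the last one
        previous = now
    return states

# A submachine gun burst of four "pop" sounds, then silence:
assert classify_polls([True, True, True, True, False]) == [
    SoundState.FIRST, SoundState.SECOND, SoundState.SECOND, SoundState.THIRD, None,
]
```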
The virtual object may be a game character controlled by the terminal device. The relative distance and relative position of the sound source object to the virtual object may be determined based on the first position information and the second position information of the virtual object. The second position information characterizes the position of the virtual object, i.e. the game character controlled by the terminal device, in the game scene.
For example, a world coordinate system corresponding to a game scene may be established, wherein coordinate values on an X-axis and a Y-axis of the world coordinate system correspond to a ground width in the game scene, coordinate values on a Z-axis correspond to a ground height in the game scene, and the first position information and the second position information may be three-dimensional coordinates in the world coordinate system.
Wherein, based on the three-dimensional coordinate value corresponding to the sound source object and the three-dimensional coordinate value corresponding to the virtual object, the relative distance between the sound source object and the virtual object can be determined.
Specifically, assuming that the three-dimensional coordinate value corresponding to the sound source object includes a first abscissa value, a first ordinate value, and a first vertical coordinate value, and the three-dimensional coordinate value corresponding to the virtual object includes a second abscissa value, a second ordinate value, and a second vertical coordinate value, the square of the difference between the first abscissa value and the second abscissa value can be calculated to obtain a first square value; the square of the difference between the first ordinate value and the second ordinate value is calculated to obtain a second square value; the square of the difference between the first vertical coordinate value and the second vertical coordinate value is calculated to obtain a third square value; the sum of the first square value, the second square value, and the third square value is calculated to obtain a first target square value; and the arithmetic square root of the first target square value is taken as the relative distance between the sound source object and the virtual object.
In some embodiments, the relative distance of the sound source object from the virtual object may be determined based on the abscissa and ordinate values in the three-dimensional coordinate values corresponding to the sound source object and the abscissa and ordinate values in the three-dimensional coordinate values corresponding to the virtual object.
Specifically, assuming that the three-dimensional coordinate value corresponding to the sound source object includes a first abscissa value and a first ordinate value, and the three-dimensional coordinate value corresponding to the virtual object includes a second abscissa value and a second ordinate value, the square of the difference between the first abscissa value and the second abscissa value can be calculated to obtain a first square value; the square of the difference between the first ordinate value and the second ordinate value is calculated to obtain a second square value; the sum of the first square value and the second square value is calculated to obtain a second target square value; and the arithmetic square root of the second target square value is taken as the relative distance between the sound source object and the virtual object.
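Both distance variants reduce to the standard Euclidean formula over coordinate differences; a sketch (names are illustrative):

```python
import math

Position3 = tuple[float, float, float]

def relative_distance_3d(source: Position3, obj: Position3) -> float:
    """Full 3D distance from the x, y and z coordinate differences."""
    dx, dy, dz = source[0] - obj[0], source[1] - obj[1], source[2] - obj[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz)

def relative_distance_2d(source: Position3, obj: Position3) -> float:
    """Horizontal-only variant using just the abscissa and ordinate values."""
    return math.hypot(source[0] - obj[0], source[1] - obj[1])

# A source at (3, 4, 5) with the virtual object at the origin:
assert relative_distance_2d((3.0, 4.0, 5.0), (0.0, 0.0, 0.0)) == 5.0
assert abs(relative_distance_3d((3.0, 4.0, 5.0), (0.0, 0.0, 0.0))
           - math.sqrt(50.0)) < 1e-12
```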
Wherein, based on the vertical coordinate value in the three-dimensional coordinate value corresponding to the sound source object and the vertical coordinate value in the three-dimensional coordinate value corresponding to the virtual object, the relative position of the sound source object and the virtual object can be determined, for example, whether the sound source object is located above, below or at the same layer as the virtual object is determined.
Specifically, assuming that the three-dimensional coordinate value corresponding to the sound source object includes a first vertical coordinate value and the three-dimensional coordinate value corresponding to the virtual object includes a second vertical coordinate value, the second vertical coordinate value may be subtracted from the first vertical coordinate value to obtain a first height value, and the relative position of the sound source object and the virtual object is determined based on the first height value. For example, a first preset height threshold and a second preset height threshold may be set in advance, where the first preset height threshold is smaller than the second preset height threshold. When the first height value is larger than the second preset height threshold, the sound source object is determined to be above the virtual object; when the first height value is larger than or equal to the first preset height threshold and smaller than or equal to the second preset height threshold, the sound source object is determined to be on the same layer as the virtual object; and when the first height value is smaller than the first preset height threshold, the sound source object is determined to be below the virtual object. The first and second preset height thresholds can be set by the electronic device according to a certain rule.
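A sketch of the vertical classification; the numeric threshold defaults are illustrative assumptions, since the text leaves the two preset height thresholds to the device:

```python
def vertical_relation(source_z: float, object_z: float,
                      low_threshold: float = -1.5,
                      high_threshold: float = 1.5) -> str:
    """Classify the sound source as above, below, or on the same layer
    as the virtual object, from the first height value."""
    first_height_value = source_z - object_z
    if first_height_value > high_threshold:
        return "above"
    if first_height_value < low_threshold:
        return "below"
    return "same layer"

assert vertical_relation(7.0, 1.0) == "above"
assert vertical_relation(1.0, 1.5) == "same layer"
```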
In some embodiments, the relative position of the virtual object and the sound source object may also be determined based on the abscissa value in the three-dimensional coordinate values corresponding to the virtual object and the abscissa value in the three-dimensional coordinate values corresponding to the sound source object, for example, determining whether the sound source object is to the left or right of the virtual object, and so on.
Specifically, assuming that the three-dimensional coordinate value corresponding to the sound source object includes a first abscissa value and the three-dimensional coordinate value corresponding to the virtual object includes a second abscissa value, the second abscissa value may be subtracted from the first abscissa value to obtain a first width value, and the relative position of the virtual object and the sound source object is determined based on the first width value. For example, a first preset width threshold and a second preset width threshold may be set in advance, where the first preset width threshold is smaller than the second preset width threshold. When the first width value is larger than the second preset width threshold, the sound source object is determined to be on the left side of the virtual object; and when the first width value is smaller than the first preset width threshold, the sound source object is determined to be on the right side of the virtual object. The first and second preset width thresholds can be set by the electronic device according to a certain rule.
In other embodiments, the relative position of the sound source object and the virtual object may also be determined based on the ordinate value in the three-dimensional coordinate values corresponding to the sound source object and the ordinate value in the three-dimensional coordinate values corresponding to the virtual object, for example, whether the sound source object is in front of or behind the virtual object, or the like.
Specifically, assuming that the three-dimensional coordinate value corresponding to the sound source object includes a first ordinate value and the three-dimensional coordinate value corresponding to the virtual object includes a second ordinate value, the second ordinate value may be subtracted from the first ordinate value to obtain a second width value, and the relative position of the sound source object and the virtual object is determined based on the second width value. For example, a third preset width threshold and a fourth preset width threshold may be set in advance, where the third preset width threshold is smaller than the fourth preset width threshold. When the second width value is larger than the fourth preset width threshold, the sound source object is determined to be in front of the virtual object; and when the second width value is smaller than the third preset width threshold, the sound source object is determined to be behind the virtual object. The third and fourth preset width thresholds can be set by the electronic device according to a certain rule.
In some embodiments, when the first width value is larger than the second preset width threshold and the second width value is larger than the fourth preset width threshold, the sound source object is determined to be at the left front of the virtual object; when the first width value is larger than the second preset width threshold and the second width value is smaller than the third preset width threshold, at the left rear; when the first width value is larger than the second preset width threshold and the second width value is between the third and fourth preset width thresholds (inclusive), directly to the left; when the first width value is smaller than the first preset width threshold and the second width value is larger than the fourth preset width threshold, at the right front; when the first width value is smaller than the first preset width threshold and the second width value is smaller than the third preset width threshold, at the right rear; when the first width value is smaller than the first preset width threshold and the second width value is between the third and fourth preset width thresholds (inclusive), directly to the right; when the first width value is between the first and second preset width thresholds (inclusive) and the second width value is larger than the fourth preset width threshold, directly in front; and when the first width value is between the first and second preset width thresholds (inclusive) and the second width value is smaller than the third preset width threshold, directly behind the virtual object.
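Combining the two width values gives the eight-direction classification above. The sketch keeps the text's sign convention (positive x difference means "left", positive y difference means "front") and uses placeholder threshold values:

```python
def horizontal_relation(source: tuple, obj: tuple,
                        w_lo: float = -1.0, w_hi: float = 1.0,  # 1st/2nd preset width thresholds
                        d_lo: float = -1.0, d_hi: float = 1.0   # 3rd/4th preset width thresholds
                        ) -> str:
    """Eight-direction classification of the sound source around the
    virtual object. Threshold values are illustrative assumptions."""
    first_width = source[0] - obj[0]    # x difference: positive means "left"
    second_width = source[1] - obj[1]   # y difference: positive means "front"

    side = "left" if first_width > w_hi else "right" if first_width < w_lo else ""
    depth = "front" if second_width > d_hi else "rear" if second_width < d_lo else ""

    if side and depth:
        return f"{side} {depth}"                      # e.g. "left front"
    return f"directly {side or depth}" if (side or depth) else "at the object"

assert horizontal_relation((5.0, 5.0, 0.0), (0.0, 0.0, 0.0)) == "left front"
assert horizontal_relation((-5.0, 0.0, 0.0), (0.0, 0.0, 0.0)) == "directly right"
```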
Displaying the waveform diagram corresponding to the sound source object according to the azimuth value and the parameters corresponding to the sound source object may include: determining the display size, dynamic effect, and the like of the waveform diagram according to the parameters corresponding to the sound source object; and displaying the waveform diagram according to the azimuth value with the determined display size and/or dynamic effect. Different parameters correspond to different display sizes and different dynamic effects.
For example, if the dynamic effect of the waveform diagram corresponding to the sound source object is determined to be the dynamic effect F1 according to the parameter corresponding to the sound source object, the waveform diagram corresponding to the sound source object may be displayed with the dynamic effect F1 at the position indicated by the azimuth value.
It should be noted that the parameters corresponding to the sound source object may also include other parameters related to the sound source object, such as the volume of the designated sound emitted by the sound source object, and so on.
In some embodiments, displaying the waveform diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object may include:
determining waveform expression parameters according to parameters corresponding to the sound source object, wherein the waveform expression parameters comprise at least one of the shape, the color, the amplitude, the vibration frequency and the vibration direction of a waveform in a waveform diagram corresponding to the sound source object;
And displaying the waveform diagram corresponding to the sound source object according to the waveform expression parameters according to the azimuth value.
For example, when the parameter corresponding to the sound source object includes the type of the sound source object, at least one of the shape, color, amplitude, vibration frequency, and vibration direction of the waveform in the waveform chart may be determined according to the type of the sound source object, so that the player can determine the type of the sound source object through at least one of the shape, color, amplitude, vibration frequency, and vibration direction of the waveform in the waveform chart displayed by the user interface. Wherein the types of different sound source objects correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
For example, when the parameter corresponding to the sound source object includes the type of the sound source object, the color of the waveform in the waveform diagram may be determined according to the type of the sound source object, so that the waveform diagram corresponding to the sound source object may be displayed in that color at the position indicated by the azimuth value. Different types of sound source objects correspond to different waveform colors. For example, the virtual firearm may correspond to red and the footsteps may correspond to orange-yellow. When the type of the sound source object is the virtual firearm, a waveform diagram with a red waveform may be displayed; when the type of the sound source object is footsteps, a waveform diagram with an orange-yellow waveform may be displayed. Then, when the player sees that the waveform in the waveform diagram displayed by the graphical user interface is red, the type of the sound source object may be determined to be a virtual firearm, and when the waveform is orange-yellow, the type of the sound source object may be determined to be footsteps.
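A minimal sketch of this type-to-color mapping, with the two example values from above; the dictionary keys and the fallback color are assumptions.

```python
# Hypothetical type-to-color table following the example above.
WAVEFORM_COLOR_BY_TYPE = {
    "virtual_firearm": "red",
    "footsteps": "orange_yellow",
}

def waveform_color(source_type: str, default: str = "white") -> str:
    # Types not listed in the table fall back to a default color
    # (an assumption; the text only specifies firearms and footsteps).
    return WAVEFORM_COLOR_BY_TYPE.get(source_type, default)
```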
In some embodiments, when the parameter corresponding to the sound source object includes a state of the designated sound, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform diagram may be determined according to the state of the designated sound, so that the player may determine the state of the designated sound, such as the designated sound being the first state, the second state, or the third state, through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform diagram displayed by the user interface. Wherein the different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes a relative distance of the sound source object from the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform map may be determined according to the relative distance, so that the player may determine the relative distance of the sound source object from the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform map displayed by the user interface. Wherein different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies, and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes a relative position of the sound source object and the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform chart may be determined according to the relative position, so that the player may determine the relative position of the sound source object and the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform chart displayed by the user interface. Wherein different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes the type of the sound source object and the state of the designated sound, at least one of the shape, color, amplitude, vibration frequency, and vibration direction of the waveform in the waveform chart may be determined according to the type of the sound source object and the state of the designated sound, so that the player may determine the type of the sound source object and the state of the designated sound through at least one of the shape, color, amplitude, vibration frequency, and vibration direction of the waveform in the waveform chart displayed by the user interface. Wherein, different types correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
For example, assuming that the color of the waveform in the waveform diagram is determined according to the type of the sound source object and the state of the designated sound, the color of a part of the waveform in the waveform diagram may be determined according to the type of the sound source object, and the color of another part of the waveform in the waveform diagram may be determined according to the state of the designated sound, so that the player may determine the type of the sound source object through the color of a part of the waveform in the waveform diagram displayed by the graphic user interface, and the state of the designated sound through the color of another part of the waveform in the waveform diagram displayed by the graphic user interface.
For another example, assuming that the color and the amplitude of the waveform in the waveform chart can be determined according to the type of the sound source object and the state of the designated sound, the color of the waveform in the waveform chart can be determined according to the type of the sound source object, and the amplitude of the waveform in the waveform chart can be determined according to the state of the designated sound, so that the player can determine the type of the sound source object by the color of the waveform in the waveform chart displayed by the graphic user interface, and the state of the designated sound by the amplitude of the waveform in the waveform chart displayed by the graphic user interface.
For another example, it is assumed that the shape, color, and vibration direction of the waveforms in the waveform diagram may be determined according to the type of the sound source object and the state of the designated sound, the color and shape of the waveforms in the waveform diagram may be determined according to the type of the sound source object, and the vibration direction of the waveforms in the waveform diagram may be determined according to the state of the designated sound, so that the player may determine the type of the sound source object through the color and shape of the waveforms in the waveform diagram displayed by the graphic user interface, and the state of the designated sound through the vibration direction of the waveforms in the waveform diagram displayed by the graphic user interface.
For another example, it is assumed that the shape, amplitude, vibration direction, and vibration frequency of the waveforms in the waveform diagram may be determined according to the type of the sound source object and the state of the designated sound. The amplitude and vibration direction of the waveforms may be determined according to the type of the sound source object, and the shape and vibration frequency of the waveforms may be determined according to the state of the designated sound, so that the player may determine the type of the sound source object through the amplitude and vibration direction of the waveforms in the waveform diagram displayed by the graphical user interface, and the state of the designated sound through the shape and vibration frequency of the waveforms in the waveform diagram displayed by the graphical user interface.
For another example, it is assumed that the shape, color, amplitude, vibration direction, and vibration frequency of the waveforms in the waveform diagram may be determined according to the type of the sound source object and the state of the designated sound, the color, vibration direction, and vibration frequency of the waveforms in the waveform diagram may be determined according to the type of the sound source object, and the shape and amplitude of the waveforms in the waveform diagram may be determined according to the state of the designated sound, so that the player may determine the type of the sound source object through the color, vibration direction, and vibration frequency of the waveforms in the waveform diagram displayed by the graphic user interface, and the state of the designated sound through the shape and amplitude of the waveforms in the waveform diagram displayed by the graphic user interface.
In some embodiments, when the parameter corresponding to the sound source object includes a type of the sound source object and a relative position of the sound source object and the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform map may be determined according to the type of the sound source object and the relative position, so that the player may determine the type of the sound source object and the relative position of the sound source object and the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform map displayed by the user interface. Wherein, different types correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameters corresponding to the sound source object include a type of the sound source object and a relative distance of the sound source object from the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform map may be determined according to the type of the sound source object and the relative distance, so that the player may determine the type of the sound source object and the relative distance of the sound source object from the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform map displayed by the user interface. Wherein different types correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes a state of the designated sound and a relative distance of the sound source object from the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform chart may be determined according to the state of the designated sound and the relative distance, so that the player may determine the state of the designated sound and the relative distance of the sound source object from the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform chart displayed by the user interface. Wherein, different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes a state of the designated sound and a relative position of the sound source object and the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform chart may be determined according to the state of the designated sound and the relative position, so that the player may determine the state of the designated sound and the relative position of the sound source object and the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform chart displayed by the user interface. Wherein, different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes a relative distance of the sound source object from the virtual object and a relative position of the sound source object from the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform map may be determined according to the relative distance and the relative position, so that the player may determine the relative distance of the sound source object from the virtual object and the relative position of the sound source object from the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform map displayed by the user interface. Wherein, different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes a type of the sound source object, a state of the designated sound, and a relative distance of the sound source object from the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform map may be determined according to the type of the sound source object, the state of the designated sound, and the relative distance, so that the player may determine the type of the sound source object, the state of the designated sound, and the relative distance of the sound source object from the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform map displayed by the user interface. Wherein, different types correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
For example, assuming that the shape of the waveform in the waveform diagram is determined according to the type of the sound source object, the state of the designated sound, and the relative distance of the sound source object from the virtual object, the waveform may be divided into a first portion, a second portion, and a third portion. The shape of the first portion of the waveform may be determined according to the type of the sound source object, the shape of the second portion according to the state of the designated sound, and the shape of the third portion according to the relative distance, so that the player may determine the type of the sound source object through the shape of the first portion of the waveform in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the shape of the second portion, and the relative distance of the sound source object from the virtual object through the shape of the third portion.
For another example, assuming that the vibration direction and the vibration frequency of the waveforms in the waveform diagram can be determined according to the type of the sound source object, the state of the designated sound, and the relative distance between the sound source object and the virtual object, the vibration direction of a part of the waveforms in the waveform diagram can be determined according to the type of the sound source object, the vibration direction of another part of the waveforms in the waveform diagram can be determined according to the state of the designated sound, and the vibration frequency of the waveforms in the waveform diagram can be determined according to the relative distance, so that the player can determine the type of the sound source object through the vibration direction of a part of the waveforms in the waveform diagram displayed by the graphical user interface, the state of the designated sound is determined through the vibration direction of another part of the waveforms in the waveform diagram displayed by the graphical user interface, and the relative distance between the sound source object and the virtual object is determined through the vibration frequency of the waveforms in the waveform diagram displayed by the graphical user interface.
For another example, it is assumed that the shape, color, and vibration direction of the waveforms in the waveform diagram may be determined according to the type of the sound source object, the state of the designated sound, and the relative distance between the sound source object and the virtual object, the color of the waveforms in the waveform diagram may be determined according to the type of the sound source object, the vibration direction of the waveforms in the waveform diagram may be determined according to the state of the designated sound, the shape of the waveforms in the waveform diagram may be determined according to the relative distance, so that the player may determine the type of the sound source object through the color of the waveforms in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the vibration direction of the waveforms in the waveform diagram displayed by the graphical user interface, and the relative distance between the sound source object and the virtual object may be determined through the shape of the waveforms in the waveform diagram displayed by the graphical user interface.
For another example, it is assumed that the shape, color, amplitude, and vibration frequency of the waveforms in the waveform diagram may be determined according to the type of the sound source object, the state of the designated sound, and the relative distance between the sound source object and the virtual object. The shape of the waveforms may be determined according to the type of the sound source object, the color and vibration frequency according to the state of the designated sound, and the amplitude according to the relative distance, so that the player may determine the type of the sound source object through the shape of the waveforms in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the color and vibration frequency of the waveforms, and the relative distance between the sound source object and the virtual object through the amplitude of the waveforms.
For another example, it is assumed that the shape, color, amplitude, vibration direction, and vibration frequency of the waveforms in the waveform diagram may be determined according to the type of the sound source object, the state of the designated sound, and the relative distance between the sound source object and the virtual object. The vibration direction and vibration frequency of the waveforms may be determined according to the type of the sound source object, the shape according to the state of the designated sound, and the color and amplitude according to the relative distance, so that the player may determine the type of the sound source object through the vibration direction and vibration frequency of the waveforms in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the shape of the waveforms, and the relative distance between the sound source object and the virtual object through the color and amplitude of the waveforms.
In some embodiments, when the parameter corresponding to the sound source object includes a type of the sound source object, a state of the designated sound, and a relative position of the sound source object and the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform map may be determined according to the type of the sound source object, the state of the designated sound, and the relative position, so that the player may determine the type of the sound source object, the state of the designated sound, and the relative position of the sound source object and the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform map displayed by the user interface. Wherein, different types correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameters corresponding to the sound source object include a type of the sound source object, a relative distance of the sound source object from the virtual object, and a relative position of the sound source object and the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform diagram may be determined according to the type, the relative distance, and the relative position, so that the player may determine the type of the sound source object, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object and the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform diagram displayed by the user interface. Wherein, different types correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameter corresponding to the sound source object includes a state of the designated sound, a relative distance of the sound source object from the virtual object, and a relative position of the sound source object from the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform chart may be determined according to the state of the designated sound, the relative distance, and the relative position, so that the player may determine the state of the designated sound, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object from the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform chart displayed by the user interface. Wherein, different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
In some embodiments, when the parameters corresponding to the sound source object include a type of the sound source object, a state of the designated sound, a relative distance of the sound source object from the virtual object, and a relative position of the sound source object and the virtual object, at least one of a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in the waveform diagram may be determined according to the type, the state, the relative distance, and the relative position, so that the player may determine the type of the sound source object, the state of the designated sound, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object and the virtual object through at least one of the shape, the color, the amplitude, the vibration frequency, and the vibration direction of the waveform in the waveform diagram displayed by the user interface. Wherein, different types correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, different states correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, different relative distances correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions, and different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and vibration directions.
For example, assuming that the vibration frequency of the waveform in the waveform diagram is determined according to the type of the sound source object, the state of the designated sound, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object and the virtual object, the waveform may be divided into a first portion, a second portion, a third portion, and a fourth portion. The vibration frequency of the first portion may be determined according to the type of the sound source object, the vibration frequency of the second portion according to the state of the designated sound, the vibration frequency of the third portion according to the relative distance, and the vibration frequency of the fourth portion according to the relative position. The player may then determine the type of the sound source object through the vibration frequency of the first portion of the waveform in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the vibration frequency of the second portion, the relative distance of the sound source object from the virtual object through the vibration frequency of the third portion, and the relative position of the sound source object and the virtual object through the vibration frequency of the fourth portion.
For another example, assuming that the amplitude and vibration direction of the waveform in the waveform diagram may be determined according to the type of the sound source object, the state of the designated sound, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object and the virtual object, the waveform in the waveform diagram may be divided into a first portion and a second portion. The amplitude of the first portion may be determined according to the type of the sound source object, the amplitude of the second portion according to the state of the designated sound, the vibration direction of the first portion according to the relative distance, and the vibration direction of the second portion according to the relative position. The player may thus determine the type of the sound source object through the amplitude of the first portion of the waveform in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the amplitude of the second portion, the relative distance of the sound source object from the virtual object through the vibration direction of the first portion, and the relative position of the sound source object and the virtual object through the vibration direction of the second portion.
For another example, it is assumed that the color, amplitude, and vibration direction of the waveforms in the waveform diagram may be determined according to the type of the sound source object, the state of the designated sound, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object and the virtual object. The color of the waveforms may be determined according to the type of the sound source object, the amplitude according to the state of the designated sound, the vibration direction of a portion of the waveforms according to the relative distance, and the vibration direction of another portion of the waveforms according to the relative position, so that the player may determine the type of the sound source object through the color of the waveforms in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the amplitude of the waveforms, the relative distance of the sound source object from the virtual object through the vibration direction of a portion of the waveforms, and the relative position of the sound source object and the virtual object through the vibration direction of another portion of the waveforms.
For another example, it is assumed that the shape, amplitude, vibration direction, and vibration frequency of the waveform in the waveform diagram can be determined according to the type of the sound source object, the state of the designated sound, the relative distance between the sound source object and the virtual object, and the relative position between the sound source object and the virtual object. The shape of the waveform may be determined according to the type of the sound source object, the amplitude according to the state of the designated sound, the vibration frequency according to the relative distance, and the vibration direction according to the relative position. The player can thus determine the type of the sound source object through the shape of the waveform in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the amplitude of the waveform, the relative distance between the sound source object and the virtual object through the vibration frequency of the waveform, and the relative position of the sound source object and the virtual object through the vibration direction of the waveform.
For another example, it is assumed that the shape, color, amplitude, vibration direction, and vibration frequency of the waveform in the waveform diagram can be determined according to the type of the sound source object, the state of the designated sound, the relative distance between the sound source object and the virtual object, and the relative position between the sound source object and the virtual object. The shape and color of the waveform may be determined according to the type of the sound source object, the amplitude according to the state of the designated sound, the vibration frequency according to the relative distance, and the vibration direction according to the relative position. The player can thus determine the type of the sound source object through the shape and color of the waveform in the waveform diagram displayed by the graphical user interface, the state of the designated sound through the amplitude of the waveform, the relative distance between the sound source object and the virtual object through the vibration frequency of the waveform, and the relative position of the sound source object and the virtual object through the vibration direction of the waveform.
It should be noted that the waveform expression parameters may also include other parameters related to the waveform, such as a peak value of the waveform, and so on.
In some embodiments, the parameters corresponding to the sound source object include a type of the sound source object, a state of a designated sound, a relative distance, and a relative position, the waveform expression parameters include a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in a waveform map corresponding to the sound source object, and determining the waveform expression parameters according to the parameters corresponding to the sound source object includes:
determining the shape and the color of waveforms in a waveform diagram corresponding to the sound source object according to the type of the sound source object;
determining the amplitude of a waveform in a waveform diagram corresponding to the sound source object according to the state and the relative distance of the designated sound;
determining the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object according to the state of the designated sound;
determining the vibration direction of the waveform in the waveform diagram corresponding to the sound source object according to the relative position;
displaying a waveform diagram corresponding to the sound source object according to the azimuth value and the waveform expression parameter, wherein the waveform diagram comprises:
and displaying a waveform diagram corresponding to the sound source object according to the azimuth value, the shape, the color, the amplitude, the vibration frequency and the vibration direction.
For example, assume that the type of the sound source object includes a virtual firearm and footsteps, where the virtual firearm corresponds to a triangle waveform and the color red, and the footsteps correspond to a trapezoid waveform and the color orange-yellow. Assume further that the state of the designated sound includes a first state, a second state, and a third state; the first state may be set to correspond to a first amplitude and a first frequency, the second state to a second amplitude and a second frequency, and the third state to a third amplitude and a third frequency, where the first amplitude is greater than the second amplitude, the second amplitude is greater than the third amplitude, the first frequency is greater than the second frequency, and the second frequency is greater than the third frequency. A mapping relationship between relative distance and amplitude may also be set, where the relative distance is inversely related to the amplitude; for example, a relative distance D1 corresponds to an amplitude F1, a relative distance D2 corresponds to an amplitude F2, a relative distance D3 corresponds to an amplitude F3, and so on, where the relative distance D1 is greater than the relative distance D2, the relative distance D2 is greater than the relative distance D3, the amplitude F1 is less than the amplitude F2, and the amplitude F2 is less than the amplitude F3. Assume also that the relative position includes the sound source object being above the virtual object, the sound source object being below the virtual object, and the sound source object and the virtual object being on the same layer; the waveform corresponding to a sound source object above the virtual object may be set to vibrate upwards, the waveform corresponding to a sound source object below the virtual object to vibrate downwards, and the waveform corresponding to a sound source object on the same layer as the virtual object to vibrate in both directions simultaneously. Then, when the type of the sound source object is a virtual firearm, the state of the designated sound is the second state, the relative distance between the sound source object and the virtual object is D2, and the sound source object is below the virtual object, a waveform diagram may be displayed on the graphical user interface at the position indicated by the azimuth value, in which the waveform shape is a triangle, the waveform color is red, the amplitude of a part of the waveform is the second amplitude, the amplitude of another part of the waveform is the amplitude F2, the vibration frequency of the waveform is the second frequency, and the vibration direction of the waveform is downward.
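The mapping steps of this embodiment can be sketched as follows. All concrete table values (amplitudes, frequencies, color and key names) are illustrative assumptions consistent with the worked example above.

```python
from dataclasses import dataclass

@dataclass
class WaveformStyle:
    shape: str
    color: str
    state_amplitude: float     # amplitude of one part of the waveform (from the sound state)
    distance_amplitude: float  # amplitude of another part (from the relative distance)
    frequency: float
    direction: str

# All concrete values below are illustrative assumptions.
SHAPE_COLOR_BY_TYPE = {
    "virtual_firearm": ("triangle", "red"),
    "footsteps": ("trapezoid", "orange_yellow"),
}
AMPLITUDE_BY_STATE = {"first": 1.0, "second": 0.6, "third": 0.3}   # first > second > third
FREQUENCY_BY_STATE = {"first": 8.0, "second": 4.0, "third": 2.0}   # first > second > third
AMPLITUDE_BY_DISTANCE = {"D1": 0.2, "D2": 0.5, "D3": 0.9}          # inverse: D1 > D2 > D3
DIRECTION_BY_POSITION = {"above": "up", "below": "down", "same_layer": "up_and_down"}

def waveform_style(source_type, sound_state, relative_distance, relative_position):
    shape, color = SHAPE_COLOR_BY_TYPE[source_type]
    return WaveformStyle(
        shape=shape,
        color=color,
        state_amplitude=AMPLITUDE_BY_STATE[sound_state],
        distance_amplitude=AMPLITUDE_BY_DISTANCE[relative_distance],
        frequency=FREQUENCY_BY_STATE[sound_state],
        direction=DIRECTION_BY_POSITION[relative_position],
    )

# The worked example above: a virtual firearm whose designated sound is in the
# second state, at relative distance D2, below the virtual object.
style = waveform_style("virtual_firearm", "second", "D2", "below")
# -> triangle, red, second-state amplitude, amplitude F2, second frequency, downward
```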
In some embodiments, determining the amplitude of the waveform in the waveform map corresponding to the sound source object according to the state and the relative distance of the designated sound may include:
determining an amplitude coefficient corresponding to the relative distance, wherein the relative distance is inversely related to the amplitude coefficient;
when the state of the designated sound is a first state, determining the amplitude of the waveform in the waveform diagram corresponding to the sound source object according to the amplitude and the amplitude coefficient corresponding to the first state;
when the state of the designated sound is the second state, determining the amplitude of the waveform in the waveform diagram corresponding to the sound source object according to the amplitude and the amplitude coefficient corresponding to the second state, wherein the amplitude corresponding to the first state is larger than the amplitude corresponding to the second state.
For example, a mapping relationship between a relative distance and an amplitude coefficient may be set, for example, a relative distance D1 corresponds to the amplitude coefficient N1, a relative distance D2 corresponds to the amplitude coefficient N2, and a relative distance D3 corresponds to the amplitude coefficient N3, where the relative distance D1 is greater than the relative distance D2, the relative distance D2 is greater than the relative distance D3, the amplitude coefficient N1 is smaller than the amplitude coefficient N2, and the amplitude coefficient N2 is smaller than the amplitude coefficient N3, and when the state of the designated sound is the first state and the relative distance is D3, the product of the amplitude corresponding to the first state and the amplitude coefficient N3 may be regarded as the amplitude of the waveform in the waveform chart. In some embodiments, the sum of the amplitude corresponding to the first state and the amplitude coefficient N3 may also be taken as the amplitude of the waveform in the waveform map. When the state of the designated sound is the second state and the relative distance is D1, the product of the amplitude corresponding to the second state and the amplitude coefficient N1 may be taken as the amplitude of the waveform in the waveform diagram. In some embodiments, the sum of the amplitude corresponding to the second state and the amplitude coefficient N1 may also be taken as the amplitude of the waveform in the waveform map.
In some embodiments, when the state of the designated sound is the third state, the amplitude of the waveform in the waveform diagram corresponding to the sound source object may be determined according to the amplitude and the amplitude coefficient corresponding to the third state, and the amplitude corresponding to the second state is greater than the amplitude corresponding to the third state.
For example, when the state of the designated sound is the third state and the relative distance is D1, the product of the amplitude corresponding to the third state and the amplitude coefficient N1 may be taken as the amplitude of the waveform in the waveform chart. In some embodiments, the sum of the amplitude corresponding to the third state and the amplitude coefficient N1 may also be taken as the amplitude of the waveform in the waveform diagram.
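A short sketch of the amplitude computation just described; the coefficient values and the function name are assumptions, and the embodiment allows either the product or the sum of the state amplitude and the distance coefficient.

```python
# Hypothetical distance-to-coefficient table: D1 > D2 > D3, so N1 < N2 < N3.
AMPLITUDE_COEFFICIENT = {"D1": 0.5, "D2": 0.8, "D3": 1.2}

def combined_amplitude(state_amplitude: float, relative_distance: str,
                       combine: str = "product") -> float:
    """Waveform amplitude from the state amplitude and the distance coefficient."""
    coefficient = AMPLITUDE_COEFFICIENT[relative_distance]
    if combine == "product":
        return state_amplitude * coefficient
    return state_amplitude + coefficient
```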
In some embodiments, determining the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object according to the state of the designated sound includes:
when the state of the designated sound is the first state, determining the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object as the first frequency;
when the state of the designated sound is the second state, the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object is determined to be the second frequency, and the second frequency is smaller than the first frequency.
For example, when the state of the acquired designated sound is the first state, it may be determined that the vibration frequency of the waveform in the waveform chart corresponding to the sound source object is the first frequency. When the state of the acquired designated sound is the second state, it is determined that the vibration frequency of the waveform in the waveform chart corresponding to the sound source object is the second frequency.
In some embodiments, when the state of the designated sound is the third state, it is determined that the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object is the third frequency, the third frequency being smaller than the second frequency.
For example, when the state of the acquired designated sound is the third state, it may be determined that the vibration frequency of the waveform in the waveform chart corresponding to the sound source object is the third frequency.
It will be appreciated that, during the process of making the designated sound by the sound source object, the state of the designated sound will be changed from the first state to the second state to the third state, so that the vibration frequency of the waveform chart displayed on the graphical user interface will also be reduced from the first frequency to the second frequency and then from the second frequency to the third frequency.
In some embodiments, determining the vibration direction of the waveform in the waveform diagram corresponding to the sound source object according to the relative position may include:
when the relative position is the first relative position, determining the vibration direction of the waveform in the waveform diagram corresponding to the sound source object as a first direction;
when the relative position is the second relative position, it is determined that the vibration direction of the waveform in the waveform diagram corresponding to the sound source object is the second direction.
For example, assuming that the first relative position is that the sound source object is above the virtual object, the first direction is upward, the second relative position is that the sound source object is at the same level as the virtual object, and the second direction is upward and downward, a waveform chart of waveform upward vibration may be displayed when the sound source object is above the virtual object. When the sound source object is at the same layer as the virtual object, a waveform chart of simultaneous upward and downward vibrations can be displayed.
In some embodiments, when the relative position is the third relative position, the vibration direction of the waveform in the waveform chart corresponding to the sound source object is determined to be the third direction.
For example, assuming that the third relative position is that the sound source object is under the virtual object and the third direction is downward, a waveform chart of waveform downward vibration may be displayed when the sound source object is under the virtual object.
When the waveform diagram to be displayed is formed by arranging a plurality of columns of different lengths as shown in fig. 1d, the number of columns forming the waveform diagram and the length of the longest column may be determined according to the determined amplitude, where both the number and the length of the columns are positively correlated with the amplitude. The lengths of the other columns are shortened in turn according to their distance from the longest column; that is, a column closer to the longest column is longer than a column farther from it. The columns are then arranged with the longest in the middle and the shortest at the edges, with their bottom ends kept flush, and the arrangement is mirrored, finally obtaining the waveform diagram shown in fig. 1d.
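A sketch of this column construction; the scaling constants are assumptions, and only the mirrored, longest-in-the-middle arrangement comes from the text.

```python
def column_lengths(amplitude: float) -> list:
    """Column lengths for a bar-style waveform, longest in the middle.

    Both the column count and the longest column length grow with the
    amplitude (positive correlation); the other columns shorten with
    their distance from the longest one, and the pattern is mirrored.
    """
    n = max(2, round(amplitude * 6))    # columns per side (assumed scale)
    longest = amplitude * 10.0          # middle column length (assumed scale)
    ascending = [longest * i / n for i in range(1, n)]
    return ascending + [longest] + ascending[::-1]
```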
In some embodiments, displaying a waveform diagram corresponding to the sound source object according to the azimuth value includes:
determining a first display position of a waveform diagram corresponding to the sound source object in the graphical user interface according to the azimuth value;
and displaying the waveform diagram corresponding to the sound source object at the first display position.
For example, the first display position may be below the display position of the azimuth value. For example, as shown in fig. 1e, when the azimuth value is 150, the waveform diagram corresponding to the azimuth value is M1; when the azimuth value is 210, the waveform diagram corresponding to the azimuth value is M2.
In some embodiments, displaying a waveform diagram corresponding to the sound source object according to the azimuth value includes:
when the azimuth value is displayed on the graphical user interface, displaying a waveform diagram corresponding to the sound source object according to the azimuth value;
and when the azimuth value is not displayed on the graphical user interface, displaying an arrow diagram corresponding to the sound source object according to the azimuth value.
For example, a first azimuth value range corresponding to the current field of view of the virtual object, that is, the range of azimuth values displayed on the current graphical user interface, and a second azimuth value range corresponding to the outside of the current field of view of the virtual object, that is, the range of azimuth values not displayed on the current graphical user interface, may be determined, and then it may be determined whether the azimuth value is within the first azimuth value range or the second azimuth value range. If the azimuth value is within the first azimuth value range, it is determined that the azimuth value is displayed on the graphical user interface; if the azimuth value is within the second azimuth value range, it is determined that the azimuth value is not displayed on the graphical user interface. It should be noted that, in the embodiment of the present application, as long as a certain azimuth value is within the first azimuth value range, it is an azimuth value displayed on the graphical user interface, and as long as a certain azimuth value is within the second azimuth value range, it is an azimuth value not displayed on the graphical user interface.
It will be appreciated that the content presented by the graphical user interface may contain only a part of the azimuth indicator, that is, only a part of the azimuth values of the azimuth indicator. For example, as shown in fig. 1e, the graphical user interface contains only azimuth values in the range of 120 to 240; that is, the first azimuth value range is 120-240, and the second azimuth value range is 0-120 (excluding 0 and 120) and 240-360 (excluding 240). Then, when the azimuth value corresponding to the sound source object is 90, it may be determined that the azimuth value is not displayed on the graphical user interface, and an arrow diagram corresponding to the sound source object may be displayed according to the azimuth value; when the azimuth value corresponding to the sound source object is 210, it may be determined that the azimuth value is displayed on the graphical user interface, and the waveform diagram corresponding to the sound source object may be displayed according to the azimuth value.
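Under the fig. 1e example values, the visible-range check that selects between the waveform diagram and the arrow diagram might look like this; the function name and range representation are assumptions.

```python
def choose_indicator(azimuth: float, first_range=(120, 240)) -> str:
    """Waveform diagram when the azimuth value lies in the displayed
    (first) azimuth value range, arrow diagram otherwise."""
    low, high = first_range
    return "waveform" if low <= azimuth <= high else "arrow"

assert choose_indicator(210) == "waveform"   # 210 is on the visible compass segment
assert choose_indicator(90) == "arrow"       # 90 falls in the second range
```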
In some embodiments, displaying an arrow diagram corresponding to the sound source object according to the azimuth value includes:
acquiring parameters corresponding to the sound source object, wherein the parameters corresponding to the sound source object comprise at least one of the type of the sound source object, the state of the designated sound, the relative distance between the sound source object and the virtual object, and the relative position between the sound source object and the virtual object;
And displaying an arrow diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object.
For example, displaying the arrow diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object may include: determining the display size and/or dynamic effect of the arrow diagram corresponding to the sound source object according to the parameter corresponding to the sound source object; and displaying the arrow diagram corresponding to the sound source object according to the azimuth value and the determined display size and/or dynamic effect. Different parameters correspond to different display sizes and different dynamic effects.
In some embodiments, displaying an arrow diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object includes:
determining an arrow expression parameter according to parameters corresponding to the sound source object, wherein the arrow expression parameter comprises at least one of the color, the shape, the size and the direction of an arrow in an arrow diagram corresponding to the sound source object;
and displaying an arrow diagram corresponding to the sound source object according to the arrow expression parameter according to the azimuth value.
For example, when the parameter corresponding to the sound source object includes the type of the sound source object, at least one of the color, shape, size, and direction of the arrow in the arrow diagram may be determined according to the type of the sound source object, so that the player may determine the type of the sound source object through at least one of the color, shape, size, and direction of the arrow in the arrow diagram displayed by the user interface. Wherein different sound source object types correspond to different colors, shapes, sizes and orientations.
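As with the waveform colors earlier, the type-to-arrow-style mapping could be sketched as a lookup table; every concrete value here is an assumption, since the text only requires that different types map to different appearances.

```python
# Hypothetical arrow styles per sound source type.
ARROW_STYLE_BY_TYPE = {
    "virtual_firearm": {"color": "red", "shape": "solid", "size": 24},
    "footsteps": {"color": "orange_yellow", "shape": "outline", "size": 18},
}

def arrow_style(source_type: str) -> dict:
    # The arrow's on-screen direction would additionally follow the azimuth
    # value; only the per-type appearance is modelled here.
    return ARROW_STYLE_BY_TYPE[source_type]
```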
In some embodiments, when the parameter corresponding to the sound source object includes a state of the designated sound, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the state of the designated sound, so that the player may determine the state of the designated sound, such as the designated sound being the first state, the second state, or the third state, through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different states correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a relative distance of the sound source object from the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow map may be determined according to the relative distance, so that the player may determine the relative distance of the sound source object from the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow map displayed by the user interface. Wherein different relative distances correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a relative position of the sound source object and the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the relative position, so that the player may determine the relative position of the sound source object and the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different relative positions correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes the type of the sound source object and the state of the designated sound, at least one of the color, shape, size, and direction of the arrow in the arrow diagram may be determined according to the type of the sound source object and the state of the designated sound, so that the player may determine the type of the sound source object and the state of the designated sound through at least one of the color, shape, size, and direction of the arrow in the arrow diagram displayed by the user interface. Wherein different types correspond to different colors, shapes, sizes and orientations and different states correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameters corresponding to the sound source object include a type of the sound source object and a relative position of the sound source object and the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the type of the sound source object and the relative position, so that the player may determine the type of the sound source object and the relative position of the sound source object and the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different types correspond to different colors, shapes, sizes and orientations and different relative positions correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameters corresponding to the sound source object include a type of the sound source object and a relative distance of the sound source object from the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow map may be determined according to the type of the sound source object and the relative distance, so that the player may determine the type of the sound source object and the relative distance of the sound source object from the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow map displayed by the user interface. Wherein different types correspond to different colors, shapes, sizes and orientations and different relative distances correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a state of the specified sound and a relative distance of the sound source object from the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the state of the specified sound and the relative distance, so that the player may determine the state of the specified sound and the relative distance of the sound source object from the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different states correspond to different colors, shapes, sizes and orientations and different relative distances correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a state of the designated sound and a relative position of the sound source object and the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the state of the designated sound and the relative position, so that the player may determine the state of the designated sound and the relative position of the sound source object and the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different states correspond to different colors, shapes, sizes and orientations and different relative positions correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a relative distance of the sound source object from the virtual object and a relative position of the sound source object from the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the relative distance and the relative position, so that the player may determine the relative distance of the sound source object from the virtual object and the relative position of the sound source object from the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different relative distances correspond to different colors, shapes, sizes and orientations and different relative positions correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a type of the sound source object, a state of the designated sound, and a relative distance of the sound source object from the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow map may be determined according to the type of the sound source object, the state of the designated sound, and the relative distance, so that the player may determine the type of the sound source object, the state of the designated sound, and the relative distance of the sound source object from the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow map displayed by the user interface. Wherein different types correspond to different colors, shapes, sizes and orientations, different states correspond to different colors, shapes, sizes and orientations, and different relative distances correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a type of the sound source object, a state of the designated sound, and a relative position of the sound source object and the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow map may be determined according to the type of the sound source object, the state of the designated sound, and the relative position, so that the player may determine the type of the sound source object, the state of the designated sound, and the relative position of the sound source object and the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow map displayed by the user interface. Wherein different types correspond to different colors, shapes, sizes and orientations, different states correspond to different colors, shapes, sizes and orientations, and different relative positions correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameters corresponding to the sound source object include a type of the sound source object, a relative distance of the sound source object from the virtual object, and a relative position of the sound source object from the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the type of the sound source object, the relative distance, and the relative position of the sound source object, so that the player may determine the type of the sound source object, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object from the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different types correspond to different colors, shapes, sizes and orientations, different relative distances correspond to different colors, shapes, sizes and orientations, and different relative positions correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a state of the designated sound, a relative distance of the sound source object from the virtual object, and a relative position of the sound source object and the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the state of the designated sound, the relative distance, and the relative position, so that the player may determine the state of the designated sound, the relative distance of the sound source object from the virtual object, and the relative position of the sound source object and the virtual object through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different states correspond to different colors, shapes, sizes and orientations, different relative distances correspond to different colors, shapes, sizes and orientations, and different relative positions correspond to different colors, shapes, sizes and orientations.
In some embodiments, when the parameter corresponding to the sound source object includes a type of the sound source object, a state of the designated sound, a relative distance of the sound source object from the virtual object, and a relative position of the sound source object and the virtual object, at least one of a color, a shape, a size, and a direction of an arrow in the arrow diagram may be determined according to the type of the sound source object, the state of the designated sound, the relative distance, and the relative position, so that the player may determine the type of the sound source object, the state of the designated sound, the relative distance, and the relative position through at least one of the color, the shape, the size, and the direction of the arrow in the arrow diagram displayed by the user interface. Wherein different types correspond to different colors, shapes, sizes and orientations, different states correspond to different colors, shapes, sizes and orientations, different relative distances correspond to different colors, shapes, sizes and orientations, and different relative positions correspond to different colors, shapes, sizes and orientations.
It should be noted that the arrow expression parameters may also include other parameters related to the arrow, such as the movement frequency of the arrow, and so on.
In some embodiments, displaying an arrow diagram corresponding to the sound source object according to the azimuth value includes:
determining a second display position of the arrow diagram on the graphical user interface according to the azimuth value;
and displaying an arrow diagram corresponding to the sound source object at the second display position.
For example, the second azimuth value range may be divided into a first sub-azimuth value range and a second sub-azimuth value range, where the first sub-azimuth value range spans the same extent as the second sub-azimuth value range or differs from it by a preset range. When the azimuth value is within the first sub-azimuth value range, the second display position may be the upper left side of the graphical user interface; when the azimuth value is within the second sub-azimuth value range, the second display position may be the upper right side of the graphical user interface. The preset range may be set according to the practical situation and is not particularly limited herein.
For example, as shown in fig. 1e, the second azimuth value range is 0 to 120 (excluding 0 and 120) together with 240 to 360 (excluding 240), the first sub-azimuth value range is 0 to 120 (excluding 0 and 120), and the second sub-azimuth value range is 240 to 360 (excluding 240). Assuming that the azimuth value of a certain sound source object is 60, the arrow diagram N1 corresponding to that sound source object may be displayed at the upper left of the graphical user interface; assuming that the azimuth value of the sound source object is 249, the arrow diagram N2 corresponding to the sound source object may be displayed at the upper right of the graphical user interface.
For another example, the second display position may be the upper left side of the graphical user interface when the azimuth value is less than the smallest azimuth value displayed by the graphical user interface, and the upper right side of the graphical user interface when the azimuth value is greater than the largest azimuth value displayed by the graphical user interface.
For example, as shown in fig. 1e, when the azimuth value corresponding to the sound source object is less than 120, it may be determined that the arrow map corresponding to the sound source object may be displayed at the upper left of the graphical user interface, e.g., the arrow map corresponding to the sound source object may be arrow map N1; when the azimuth value corresponding to the sound source object is greater than 240, it may be determined that the arrow map corresponding to the sound source object may be displayed at the upper right of the graphical user interface, e.g., the arrow map corresponding to the sound source object may be arrow map N2.
In some embodiments, the azimuth values displayed by the graphical user interface change as the virtual object moves, so the azimuth value corresponding to a certain sound source object may change from not being displayed on the graphical user interface to being displayed on it. When this happens, the arrow diagram may be dismissed and the waveform diagram corresponding to the sound source object displayed below the azimuth value. Similarly, the azimuth value corresponding to a certain sound source object may change from being displayed on the graphical user interface to not being displayed on it; when this happens, the waveform diagram may be dismissed and the arrow diagram corresponding to the sound source object displayed instead.
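A minimal sketch of this swap logic, assuming the displayed azimuth range is recomputed each frame (the function and parameter names are hypothetical):

```python
def indicator_for(azimuth: float, view_min: float, view_max: float) -> str:
    """Choose the indicator for a sound source: a waveform diagram while its
    azimuth value is displayed on the interface, an arrow diagram once it
    moves out of the displayed range (the other indicator is dismissed)."""
    return "waveform" if view_min <= azimuth <= view_max else "arrow"

# As the virtual object turns, the displayed range changes, e.g. from
# 120-240 to 150-270, and the indicator for azimuth value 249 flips:
assert indicator_for(249, 120, 240) == "arrow"
assert indicator_for(249, 150, 270) == "waveform"
```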
In some embodiments, before displaying the arrow diagram corresponding to the sound source object at the second display position, the method further includes:
determining the pointing direction of an arrow in an arrow diagram corresponding to the sound source object according to the azimuth value;
displaying the arrow diagram at a second display position, comprising:
and displaying the arrow diagram corresponding to the sound source object at the second display position according to the pointing direction.
For example, when the azimuth value is within the first range of sub-azimuth values, the arrow points horizontally to the left; when the azimuth value is within the second range of sub-azimuth values, the arrow points horizontally to the right.
For example, as shown in fig. 1e, the first sub-azimuth value range is 0-120 (excluding 0 and 120) and the second sub-azimuth value range is 240-360 (excluding 240). Assuming that the azimuth value of a certain sound source object is 60, it may be determined that the arrow diagram corresponding to the sound source object may be displayed at the upper left of the graphical user interface with the arrow pointing horizontally to the left, e.g., arrow diagram N1; when the azimuth value corresponding to the sound source object is 249, it may be determined that the arrow diagram corresponding to the sound source object may be displayed at the upper right of the graphical user interface with the arrow pointing horizontally to the right, e.g., arrow diagram N2.
For another example, when the azimuth value is less than the smallest azimuth value displayed by the graphical user interface, the arrow points horizontally to the left; when the azimuth value is greater than the largest azimuth value displayed by the graphical user interface, the arrow points horizontally to the right.
For example, as shown in fig. 1e, when the azimuth value corresponding to the sound source object is smaller than 120, it may be determined that the arrow diagram corresponding to the sound source object may be displayed at the upper left of the graphical user interface, and the pointing direction of the arrow is horizontal to the left, e.g., the arrow diagram corresponding to the sound source object may be arrow diagram N1; when the azimuth value corresponding to the sound source object is greater than 240, it may be determined that the arrow map corresponding to the sound source object may be displayed at the upper right of the graphical user interface, and the arrow is directed horizontally to the right, e.g., the arrow map corresponding to the sound source object may be arrow map N2.
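Since the display position and the pointing of the arrow follow the same split of the second azimuth value range, both can be derived together, as in the following hypothetical Python sketch (wrap-around of the 0-360 scale is omitted for brevity):

```python
def place_arrow(azimuth: float, view_min: float = 120.0,
                view_max: float = 240.0) -> tuple:
    """Return (display position, arrow pointing) for an azimuth value that
    is not displayed on the graphical user interface."""
    if azimuth < view_min:      # first sub-azimuth value range, e.g. 0-120
        return ("upper left", "left")
    if azimuth > view_max:      # second sub-azimuth value range, e.g. 240-360
        return ("upper right", "right")
    raise ValueError("azimuth is displayed; show a waveform diagram instead")

assert place_arrow(60) == ("upper left", "left")     # arrow diagram N1
assert place_arrow(249) == ("upper right", "right")  # arrow diagram N2
```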
In some embodiments, acquiring the first location information in response to a sound source object in the game scene emitting a specified sound includes:
in response to a plurality of sound source objects in the game scene emitting specified sounds, acquiring a plurality of pieces of first position information, wherein each piece of first position information represents the position of one sound source object in the game scene;
Determining an azimuth value for the corresponding azimuth indicator based on the first location information, comprising:
based on each first position, determining an azimuth value of an azimuth indicator corresponding to each first position to obtain a plurality of azimuth values;
according to the azimuth value, displaying a waveform diagram corresponding to the sound source object, including:
determining a first azimuth value displayed on the graphical user interface from the plurality of azimuth values, and determining the remaining azimuth values as second azimuth values;
displaying a waveform diagram corresponding to the sound source object corresponding to the first azimuth value according to the first azimuth value;
and displaying an arrow diagram corresponding to the sound source object corresponding to the second azimuth value according to the second azimuth value.
When determining the first azimuth values displayed on the graphical user interface from the plurality of azimuth values and determining the remaining azimuth values as second azimuth values, a first azimuth value range corresponding to the current field of view of the virtual object, that is, the azimuth value range displayed on the current graphical user interface, may be determined first. It is then judged whether each of the plurality of azimuth values falls within the first azimuth value range: the azimuth values within the first azimuth value range are determined as first azimuth values, that is, azimuth values displayed on the graphical user interface, and the remaining azimuth values are determined as second azimuth values, that is, azimuth values not displayed on the graphical user interface. It should be noted that, in the embodiments of the present application, any azimuth value within the first azimuth value range displayed on the current graphical user interface is an azimuth value displayed on the graphical user interface, and any azimuth value outside that range is an azimuth value not displayed on the graphical user interface.
For example, assuming that the azimuth indicator displayed on the graphical user interface is as shown in fig. 1e, the range of azimuth values displayed on the graphical user interface is 120 to 240: azimuth values within 120 to 240 are first azimuth values, and azimuth values outside 120 to 240 are second azimuth values. Assuming that the obtained plurality of azimuth values are 90, 120, 127, 150, 172, 210, 240, 279, and 290, it is determined that 120, 127, 150, 172, 210, and 240 are first azimuth values, and 90, 279, and 290 are second azimuth values. For each first azimuth value, the waveform diagram corresponding to the sound source object corresponding to that first azimuth value may be displayed in the manner of displaying a waveform diagram according to an azimuth value exemplified in the above embodiments, for example in the form shown in fig. 1e. For each second azimuth value, the arrow diagram corresponding to the sound source object corresponding to that second azimuth value may be displayed in the manner of displaying an arrow diagram according to an azimuth value exemplified in the above embodiments.
It will be appreciated that the waveform diagram shown in fig. 1e is merely an example and is not intended to limit the present application.
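As an illustration of the classification step, a minimal Python sketch reproducing the worked example above (the function name is hypothetical, and the displayed range is treated as a closed interval, consistent with 120 and 240 being first azimuth values):

```python
def split_azimuth_values(values, view_min, view_max):
    """Split the azimuth values of all sounding sources into first azimuth
    values (within the displayed range, rendered as waveform diagrams) and
    second azimuth values (outside it, rendered as arrow diagrams)."""
    first = [v for v in values if view_min <= v <= view_max]
    second = [v for v in values if not (view_min <= v <= view_max)]
    return first, second

# Worked example from the description, displayed range 120 to 240:
first, second = split_azimuth_values(
    [90, 120, 127, 150, 172, 210, 240, 279, 290], 120, 240)
assert first == [120, 127, 150, 172, 210, 240]
assert second == [90, 279, 290]
```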
In some embodiments, when displaying the arrow diagram corresponding to the sound source object corresponding to the second azimuth value according to the second azimuth value, a second azimuth value range corresponding to the area outside the current field of view of the virtual object, that is, the azimuth value range not displayed on the current graphical user interface, may be determined first. The second azimuth value range is then divided into a first sub-azimuth value range and a second sub-azimuth value range, where the first sub-azimuth value range spans the same extent as the second sub-azimuth value range or differs from it by a preset range. The arrow diagram corresponding to any second azimuth value within the first sub-azimuth value range is displayed at a third display position of the graphical user interface, and the arrow diagram corresponding to any second azimuth value within the second sub-azimuth value range is displayed at a fourth display position of the graphical user interface. The preset range may be set according to the practical situation and is not particularly limited herein.
For example, as shown in fig. 1e, the second azimuth value range is 0-120 (excluding 0 and 120), and 240-360 (excluding 240), the first sub-azimuth value range is 0-120 (excluding 0 and 120), the second sub-azimuth value range is 240-360 (excluding 240), and if the second azimuth value includes 60, 90, 249, 279, 290, the arrow diagram N1 corresponding to the second azimuth value 60 or 90 may be displayed at the upper left side (third display position) of the graphical user interface, and the arrow diagram N2 corresponding to the second azimuth value 249, 279, 290 may be displayed at the upper right side (fourth display position) of the graphical user interface.
In some embodiments, a first number of second azimuth values within the first sub-azimuth value range may also be determined, and a second number of second azimuth values within the second sub-azimuth value range may be determined, the first number being displayed at a fifth display position of the graphical user interface and the second number being displayed at a sixth display position of the graphical user interface.
For example, as shown in fig. 1f, the first sub-azimuth value range is 0 to 120 (excluding 0 and 120) and the second sub-azimuth value range is 240 to 360 (excluding 240). Assuming that the second azimuth values include 60, 90, 249, 279, and 290, the first number is determined to be 2 and the second number to be 3, so "2" may be displayed below the arrow diagram N1 (fifth display position) and "3" may be displayed below the arrow diagram N2 (sixth display position) on the graphical user interface.
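A hypothetical sketch of this counting step, reproducing the example above:

```python
def offscreen_counts(second_values, view_min=120.0, view_max=240.0):
    """Count second azimuth values per sub-azimuth value range; the first
    number is shown under arrow diagram N1, the second under N2."""
    first_count = sum(1 for v in second_values if v < view_min)
    second_count = sum(1 for v in second_values if v > view_max)
    return first_count, second_count

assert offscreen_counts([60, 90, 249, 279, 290]) == (2, 3)
```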
In some embodiments, as shown in fig. 1e, for a plurality of second azimuth values greater than 240 (240 being the largest azimuth value in the range of azimuth values displayed on the graphical user interface), such as 279 and 290, only one arrow diagram may be displayed, for example any one of the arrow diagrams corresponding to an azimuth value greater than 240. It will be appreciated that, for a plurality of azimuth values less than 120 (120 being the smallest azimuth value in the range of azimuth values displayed on the graphical user interface), only one arrow diagram may likewise be displayed, for example any one of the arrow diagrams corresponding to an azimuth value less than 120.
In some embodiments, displaying, according to the first azimuth value, the waveform diagram corresponding to the sound source object corresponding to the first azimuth value may include:
when a plurality of first azimuth values exist, determining the relative distance between the sound source object corresponding to each first azimuth value and the virtual object to obtain a plurality of relative distances;
determining a minimum relative distance from the plurality of relative distances;
determining the sound source object corresponding to the minimum relative distance as a target sound source object;
and displaying the waveform diagrams corresponding to the sound source objects corresponding to each first azimuth value according to each first azimuth value, wherein the display size of the waveform diagrams corresponding to the target sound source object is larger than the display size of the waveform diagrams corresponding to other sound source objects in the sound source objects corresponding to the plurality of first azimuth values.
For example, the relative distance between the sound source object corresponding to each first azimuth value and the virtual object may be determined in the same manner as the relative distance between a sound source object corresponding to an azimuth value and the virtual object is determined in the above embodiments, so as to obtain a plurality of relative distances. The smallest relative distance may then be determined from the plurality of relative distances, and the sound source object corresponding to the smallest relative distance determined as the target sound source object. For example, assume that there are first azimuth values 127, 150, 172, and 210, and that the relative distance between the sound source object Z1 corresponding to the first azimuth value 210 and the virtual object is the smallest; the target sound source object is then Z1.
After the waveform diagram corresponding to the sound source object corresponding to each first azimuth value is determined, it is displayed according to that first azimuth value, with the display size of the waveform diagram corresponding to the target sound source object larger than the display sizes of the waveform diagrams corresponding to the other sound source objects among the sound source objects corresponding to the plurality of first azimuth values. For example, assume that there are first azimuth values 127, 150, 172, and 210, that the waveform diagrams corresponding to their sound source objects have red, green, purple, and orange waveforms respectively, and that the relative distance between the sound source object corresponding to the first azimuth value 210 and the virtual object is the smallest. When the waveform diagrams are displayed, the waveform diagram with the red waveform may be displayed below the display position of the first azimuth value 127, the one with the green waveform below the display position of the first azimuth value 150, the one with the purple waveform below the display position of the first azimuth value 172, and the one with the orange waveform below the display position of the first azimuth value 210, with the waveform diagram with the orange waveform (that of the target sound source object) displayed at a larger size than the waveform diagrams corresponding to the other sound source objects.
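A minimal sketch of this target-sound-source emphasis, under the assumption that display size is expressed as a scale factor (the function name and the specific factors are hypothetical):

```python
def waveform_display_sizes(distances, base=1.0, emphasized=1.5):
    """Given a mapping of first azimuth value -> relative distance to the
    virtual object, return a display size per azimuth value; the sound
    source with the smallest relative distance (the target sound source
    object) gets the larger size."""
    target = min(distances, key=distances.get)
    return {a: (emphasized if a == target else base) for a in distances}

# Example from the description: the source at azimuth value 210 is nearest.
sizes = waveform_display_sizes({127: 80.0, 150: 55.0, 172: 120.0, 210: 30.0})
assert sizes[210] == 1.5 and sizes[127] == sizes[150] == sizes[172] == 1.0
```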
In order to facilitate better implementation of the information processing method provided by the embodiment of the application, the embodiment of the application also provides a device based on the information processing method. Where the meaning of a noun is the same as in the information processing method described above, specific implementation details may be referred to the description in the method embodiment.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application. The information processing apparatus 200 may provide a graphical user interface, where the content displayed in the graphical user interface at least partially includes a game scene of the game and an azimuth indicator, the azimuth indicator being used to indicate at least one piece of azimuth information in the game scene, and the apparatus may include a first response module 201, a first determination module 202, a first display module 203, and so on.
A first response module 201, configured to acquire first position information in response to a sound source object in the game scene emitting a specified sound, where the first position information characterizes the position of the sound source object in the game scene;
a first determining module 202, configured to determine an azimuth value corresponding to the azimuth indicator based on the first location information;
And the first display module 203 is configured to display a waveform chart corresponding to the sound source object according to the azimuth value.
In some embodiments, the first display module 203 may be configured to: acquiring parameters corresponding to the sound source object, wherein the parameters corresponding to the sound source object comprise at least one of the type of the sound source object, the state of the appointed sound, the relative distance between the sound source object and the virtual object and the relative position between the sound source object and the virtual object; and displaying a waveform diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object.
In some embodiments, the first display module 203 may be configured to: determining waveform expression parameters according to parameters corresponding to the sound source object, wherein the waveform expression parameters comprise at least one of the shape, the color, the amplitude, the vibration frequency and the vibration direction of a waveform in a waveform chart corresponding to the sound source object; and displaying the waveform diagram corresponding to the sound source object according to the waveform expression parameters according to the azimuth value.
In some embodiments, the parameters corresponding to the sound source object include a type of the sound source object, a state of the designated sound, the relative distance, and the relative position, and the waveform expression parameters include a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in a waveform diagram corresponding to the sound source object, and the first display module 203 may be configured to: determining the shape and the color of waveforms in a waveform diagram corresponding to the sound source object according to the type of the sound source object; determining the amplitude of a waveform in a waveform diagram corresponding to the sound source object according to the state of the designated sound and the relative distance; determining the vibration frequency of a waveform in a waveform diagram corresponding to the sound source object according to the state of the designated sound; determining the vibration direction of waveforms in a waveform diagram corresponding to the sound source object according to the relative position; and displaying a waveform diagram corresponding to the sound source object according to the shape, the color, the amplitude, the vibration frequency and the vibration direction according to the azimuth value.
In some embodiments, the first display module 203 may be configured to: determining an amplitude coefficient corresponding to the relative distance, wherein the relative distance is inversely related to the amplitude coefficient; when the state of the appointed sound is a first state, determining the amplitude of a waveform in a waveform chart corresponding to the sound source object according to the amplitude corresponding to the first state and the amplitude coefficient; when the state of the appointed sound is a second state, determining the amplitude of the waveform in the waveform diagram corresponding to the sound source object according to the amplitude corresponding to the second state and the amplitude coefficient, wherein the amplitude corresponding to the first state is larger than the amplitude corresponding to the second state; and when the state of the appointed sound is a third state, determining the amplitude of the waveform in the waveform diagram corresponding to the sound source object according to the amplitude corresponding to the third state and the amplitude coefficient, wherein the amplitude corresponding to the second state is larger than the amplitude corresponding to the third state.
In some embodiments, the first display module 203 may be configured to: when the state of the designated sound is the second state, determining the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object as the first frequency; when the state of the appointed sound is a second state, determining that the vibration frequency of a waveform in a waveform diagram corresponding to the sound source object is a second frequency, wherein the second frequency is smaller than the first frequency; and when the state of the designated sound is a third state, determining that the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object is a third frequency, wherein the third frequency is smaller than the second frequency.
In some embodiments, the first display module 203 may be configured to: when the relative position is a first relative position, determining the vibration direction of a waveform in a waveform chart corresponding to the sound source object as a first direction; when the relative position is a second relative position, determining the vibration direction of the waveform in the waveform diagram corresponding to the sound source object as a second direction; and when the relative position is a third relative position, determining the vibration direction of the waveform in the waveform chart corresponding to the sound source object as a third direction.
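Taken together, the amplitude, vibration frequency, and vibration direction determinations described above can be sketched as follows. The base tables and numeric values are hypothetical; only the orderings required by the embodiments (first-state amplitude greater than second greater than third, first frequency greater than second greater than third, amplitude coefficient inversely related to relative distance) are preserved.

```python
# Hypothetical base tables preserving the required orderings.
STATE_AMPLITUDE = {"first": 1.0, "second": 0.6, "third": 0.3}
STATE_FREQUENCY = {"first": 8.0, "second": 4.0, "third": 2.0}  # oscillations/s
POSITION_DIRECTION = {"above": "up", "same layer": "horizontal", "below": "down"}

def waveform_expression(state: str, relative_distance: float,
                        relative_position: str) -> dict:
    """Derive the amplitude, vibration frequency and vibration direction of
    the waveform from the designated sound's state, the relative distance
    and the relative position."""
    # The amplitude coefficient is inversely related to the relative distance.
    coefficient = 1.0 / (1.0 + relative_distance / 50.0)
    return {
        "amplitude": STATE_AMPLITUDE[state] * coefficient,
        "frequency": STATE_FREQUENCY[state],
        "direction": POSITION_DIRECTION[relative_position],
    }

print(waveform_expression("first", 25.0, "above"))
```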
In some embodiments, the first display module 203 may be configured to: determining a first display position of a waveform diagram corresponding to the sound source object in the graphical user interface according to the azimuth value; and displaying the waveform diagram corresponding to the sound source object at the first display position.
In some embodiments, the first display module 203 may be configured to: when the azimuth value is displayed on the graphical user interface, displaying a waveform diagram corresponding to the sound source object according to the azimuth value; and displaying an arrow diagram corresponding to the sound source object according to the azimuth value when the azimuth value is not displayed on the graphical user interface.
In some embodiments, the first display module 203 may be configured to: acquiring parameters corresponding to the sound source object, wherein the parameters corresponding to the sound source object comprise at least one of the type of the sound source object, the state of the appointed sound, the relative distance between the sound source object and the virtual object and the relative position between the sound source object and the virtual object; and displaying an arrow diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object.
In some embodiments, the first display module 203 may be configured to: determining an arrow expression parameter according to the parameter corresponding to the sound source object, wherein the arrow expression parameter comprises at least one of the color, the shape, the size and the direction of an arrow in an arrow diagram corresponding to the sound source object; and displaying an arrow diagram corresponding to the sound source object according to the arrow expression parameter according to the azimuth value.
In some embodiments, the first display module 203 may be configured to: determining a second display position of the arrow diagram in the graphical user interface according to the azimuth value; and displaying an arrow diagram corresponding to the sound source object at the second display position.
In some embodiments, the first display module 203 may be configured to: determining the pointing direction of an arrow in an arrow diagram corresponding to the sound source object according to the azimuth value; and displaying an arrow diagram corresponding to the sound source object at the second display position according to the pointing direction.
In some embodiments, the first response module 201 may be configured to: in response to a plurality of sound source objects in the game scene emitting specified sounds, acquire a plurality of pieces of first position information, wherein each piece of first position information represents the position of one sound source object in the game scene;
the first determining module 202 may be configured to: based on each first position, determining an azimuth value of an azimuth indicator corresponding to each first position to obtain a plurality of azimuth values;
the first display module 203 may be configured to: determining a first azimuth value displayed on the graphical user interface from a plurality of azimuth values, and determining the remaining azimuth values as second azimuth values; displaying a waveform diagram corresponding to a sound source object corresponding to the first azimuth value according to the first azimuth value; and displaying an arrow diagram corresponding to the sound source object corresponding to the second azimuth value according to the second azimuth value.
In some embodiments, the first display module 203 may be configured to: when a plurality of first azimuth values exist, determining the relative distance between the sound source object corresponding to each first azimuth value and the virtual object to obtain a plurality of relative distances; determining a minimum relative distance from a plurality of the relative distances; determining a sound source object corresponding to the minimum relative distance as a target sound source object; and displaying the waveform diagrams corresponding to the sound source objects corresponding to each first azimuth value according to each first azimuth value, wherein the display size of the waveform diagrams corresponding to the target sound source object is larger than the display size of the waveform diagrams corresponding to other sound source objects in the sound source objects corresponding to the plurality of first azimuth values.
As can be seen from the foregoing, the embodiments of the present application provide an information processing apparatus. The information processing apparatus provides a graphical user interface, where the content displayed in the graphical user interface at least partially includes a game scene of the game and an azimuth indicator, the azimuth indicator being used to indicate at least one piece of azimuth information in the game scene, and the apparatus includes a first response module 201, a first determination module 202, and a first display module 203. The first response module 201 is configured to acquire first position information in response to a sound source object in the game scene emitting a specified sound, where the first position information characterizes the position of the sound source object in the game scene; the first determination module 202 is configured to determine an azimuth value corresponding to the azimuth indicator based on the first position information; and the first display module 203 is configured to display a waveform diagram corresponding to the sound source object according to the azimuth value. The waveform diagram form thus replaces the existing simple graphic representation, so that richer waveform diagrams can express more complex and effective sound source information.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Fig. 3 is a schematic structural diagram of a computer readable storage medium according to an embodiment of the present application. As shown in fig. 3, a program product 1100 according to an embodiment of the present application has a computer program stored thereon; in an alternative embodiment, the computer program, when executed by a processor, implements the information processing method described above. The specific information processing method has been described in detail previously and is therefore not repeated here.
The computer readable storage medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
The program code embodied in a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio frequency, etc., or any suitable combination of the foregoing.
Correspondingly, the embodiment of the application also provides an electronic device, which may be a terminal device or a server, where the terminal device may be a smart phone, a tablet computer, a notebook computer, a touch screen device, a game machine, a personal computer (PC), a personal digital assistant (PDA), or the like.
An electronic device 1000 provided in an embodiment of the present application is described below with reference to fig. 4. The electronic device 1000 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present application in any way.
Referring to fig. 4, the electronic device 1000 is embodied in the form of a general purpose computing device. Components of electronic device 1000 may include, but are not limited to: at least one processor 1010, at least one memory 1020, a bus 1030 connecting the different system components (including the processor 1010 and the memory 1020), and a display unit 1040.
The memory 1020 includes random access memory (RAM) 1021, cache memory 1022, read-only memory (ROM) 1023, a program/utility 1024 having a set (at least one) of program modules 1025, and the like. The memory 1020 stores program code that can be executed by the processor 1010, so that the processor 1010 performs the above information processing method; the specific information processing method has been described in detail before, so a detailed description thereof is omitted here.
The electronic device 1000 may further include: a power supply assembly configured to perform power management for the electronic device; a wired or wireless network interface, such as the network adapter 1060, configured to connect the electronic device to a network; and an input-output (I/O) interface 1050. The electronic device 1000 may be connected to an external device 1100 through the I/O interface, and the electronic device 1000 may operate based on an operating system stored in the memory, such as Android, iOS, Windows, Mac OS X, Unix, Linux, FreeBSD, or the like.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present invention may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, an electronic device, or a network device, etc.) to perform the method according to the embodiments of the present invention.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
Also, it should be understood that the same or similar concepts expressed in various embodiments of the application may, where they do not conflict with one another, be regarded as extending to one another.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (19)

1. An information processing method, wherein a graphical user interface is provided through a terminal device, the content displayed in the graphical user interface at least partially containing a game scene of a game and an azimuth indicator, the method comprising:
in response to a sound source object in the game scene emitting a specified sound, acquiring first position information, wherein the first position information characterizes the position of the sound source object in the game scene;
determining an azimuth value corresponding to the azimuth indicator based on the first position information;
acquiring parameters corresponding to the sound source object, wherein the parameters corresponding to the sound source object comprise the relative position of the sound source object and a virtual object, and the relative position comprises the sound source object being above, below, or on the same layer as the virtual object;
Displaying a waveform diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object, wherein the waveform diagram at least indicates the relative position;
when the parameter corresponding to the sound source object comprises the relative position of the sound source object and the virtual object, determining at least one of the shape, the color, the amplitude, the vibration frequency and the vibration direction of the waveform in the waveform diagram corresponding to the sound source object according to the relative position, wherein different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and/or vibration directions.
2. The information processing method according to claim 1, wherein the parameter corresponding to the sound source object further includes at least one of a type of the sound source object, a state of the specified sound, and a relative distance of the sound source object from a virtual object.
3. The information processing method according to claim 2, wherein the displaying the waveform diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object includes:
determining waveform expression parameters according to parameters corresponding to the sound source object, wherein the waveform expression parameters comprise at least one of the shape, the color, the amplitude, the vibration frequency and the vibration direction of a waveform in a waveform chart corresponding to the sound source object;
And displaying the waveform diagram corresponding to the sound source object according to the waveform expression parameters according to the azimuth value.
4. The information processing method according to claim 3, wherein the parameters corresponding to the sound source object include a type of the sound source object, a state of the specified sound, the relative distance, and the relative position, the waveform expression parameters include a shape, a color, an amplitude, a vibration frequency, and a vibration direction of a waveform in a waveform chart corresponding to the sound source object,
in the case of determining the vibration direction of the waveform in the waveform diagram corresponding to the sound source object according to the relative position, determining the waveform expression parameter according to the parameter corresponding to the sound source object includes:
determining the shape and the color of waveforms in a waveform diagram corresponding to the sound source object according to the type of the sound source object;
determining the amplitude of a waveform in a waveform diagram corresponding to the sound source object according to the state of the designated sound and the relative distance;
determining the vibration frequency of a waveform in a waveform diagram corresponding to the sound source object according to the state of the designated sound;
and the displaying the waveform diagram corresponding to the sound source object according to the azimuth value and the waveform expression parameters includes:
And displaying a waveform diagram corresponding to the sound source object according to the shape, the color, the amplitude, the vibration frequency and the vibration direction according to the azimuth value.
5. The information processing method according to claim 4, wherein the determining the amplitude of the waveform in the waveform diagram corresponding to the sound source object based on the state of the specified sound and the relative distance includes:
determining an amplitude coefficient corresponding to the relative distance, wherein the relative distance is inversely related to the amplitude coefficient;
when the state of the appointed sound is a first state, determining the amplitude of a waveform in a waveform chart corresponding to the sound source object according to the amplitude corresponding to the first state and the amplitude coefficient;
and when the state of the appointed sound is a second state, determining the amplitude of the waveform in the waveform diagram corresponding to the sound source object according to the amplitude corresponding to the second state and the amplitude coefficient, wherein the amplitude corresponding to the first state is larger than the amplitude corresponding to the second state.
6. The information processing method according to claim 4, wherein the determining the vibration frequency of the waveform in the waveform chart corresponding to the sound source object according to the state of the specified sound includes:
When the state of the designated sound is a first state, determining the vibration frequency of a waveform in a waveform diagram corresponding to the sound source object as a first frequency;
and when the state of the designated sound is a second state, determining that the vibration frequency of the waveform in the waveform diagram corresponding to the sound source object is a second frequency, wherein the second frequency is smaller than the first frequency.
7. The information processing method according to claim 4, wherein the determining the vibration direction of the waveform in the waveform diagram corresponding to the sound source object based on the relative position includes:
when the relative position is a first relative position, determining the vibration direction of a waveform in a waveform chart corresponding to the sound source object as a first direction;
and when the relative position is the second relative position, determining the vibration direction of the waveform in the waveform chart corresponding to the sound source object as the second direction.
8. The information processing method according to claim 3, wherein displaying the waveform diagram corresponding to the sound source object in accordance with the waveform expression parameter based on the azimuth value, comprises:
determining a first display position of a waveform diagram corresponding to the sound source object in the graphical user interface according to the azimuth value;
And displaying the waveform diagram corresponding to the sound source object according to the waveform expression parameter at the first display position.
9. The information processing method according to claim 1, wherein the displaying the waveform diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object includes:
when the azimuth value is displayed on the graphical user interface, displaying a waveform diagram corresponding to the sound source object according to the azimuth value and parameters corresponding to the sound source object;
and displaying an arrow diagram corresponding to the sound source object according to the azimuth value when the azimuth value is not displayed on the graphical user interface.
10. The information processing method according to claim 9, wherein the acquired parameters corresponding to the sound source object further include: at least one of a type of the sound source object, a state of the specified sound, and a relative distance of the sound source object to a virtual object;
and the displaying an arrow diagram corresponding to the sound source object according to the azimuth value includes:
and displaying an arrow diagram corresponding to the sound source object according to the azimuth value and the parameter corresponding to the sound source object.
11. The information processing method according to claim 10, wherein the displaying the arrow diagram corresponding to the sound source object based on the azimuth value and the parameters corresponding to the sound source object comprises:
determining an arrow expression parameter according to the parameters corresponding to the sound source object, wherein the arrow expression parameter comprises at least one of the color, the shape, the size and the direction of an arrow in the arrow diagram corresponding to the sound source object;
and displaying, based on the azimuth value, the arrow diagram corresponding to the sound source object according to the arrow expression parameter.
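A sketch of deriving arrow expression parameters from the source parameters of claims 10 and 11. Every concrete mapping below (colors per source type, shape per sound state, the distance-to-size formula) is a hypothetical choice; the claims only require that the parameters drive the arrow's color, shape, size, and/or direction.

```python
def arrow_expression_parameters(source_type: str,
                                sound_state: str,
                                relative_distance: float) -> dict:
    """Map a sound source's type, sound state, and relative distance to
    illustrative arrow attributes."""
    return {
        "color": {"enemy": "red", "teammate": "blue"}.get(source_type, "white"),
        "shape": "double" if sound_state == "first" else "single",
        # Nearer sources get a larger arrow; clamp to a readable range.
        "size": max(12.0, min(48.0, 48.0 - relative_distance * 0.5)),
    }
```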
12. The information processing method according to claim 9, wherein the displaying the arrow diagram corresponding to the sound source object according to the azimuth value comprises:
determining a second display position of the arrow diagram in the graphical user interface according to the azimuth value;
and displaying the arrow diagram corresponding to the sound source object at the second display position.
13. The information processing method according to claim 12, wherein before the displaying the arrow diagram corresponding to the sound source object at the second display position, the method further comprises:
determining, according to the azimuth value, the pointing direction of the arrow in the arrow diagram corresponding to the sound source object;
and the displaying the arrow diagram at the second display position comprises:
displaying, at the second display position, the arrow diagram corresponding to the sound source object according to the pointing direction.
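A sketch of the second display position and pointing direction of claims 12 and 13, assuming screen coordinates with +y pointing down, 0 degrees meaning "straight ahead", and arrows laid out on a circle around the screen center; all of these conventions are assumptions for illustration.

```python
import math

def arrow_pointing(azimuth_value_deg: float) -> tuple[float, float]:
    """Unit vector for the arrow's pointing direction (claim 13)."""
    rad = math.radians(azimuth_value_deg)
    return (math.sin(rad), -math.cos(rad))  # 0 deg -> pointing up/ahead

def second_display_position(azimuth_value_deg: float,
                            screen_center: tuple[float, float],
                            radius: float) -> tuple[float, float]:
    """Place the arrow on a circle around the screen center, in the
    direction of the sound source (claim 12)."""
    dx, dy = arrow_pointing(azimuth_value_deg)
    return (screen_center[0] + dx * radius, screen_center[1] + dy * radius)
```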
14. The information processing method according to claim 1, wherein the acquiring the first position information in response to the sound source object in the game scene making a specified sound comprises:
in response to a plurality of sound source objects in the game scene making specified sounds, acquiring a plurality of pieces of first position information, wherein each piece of first position information represents the position of a respective sound source object in the game scene;
the determining, based on the first position information, the azimuth value corresponding to the azimuth indicator comprises:
determining, based on each piece of first position information, the azimuth value of the azimuth indicator corresponding to that piece of first position information, to obtain a plurality of azimuth values;
and the displaying the waveform diagram corresponding to the sound source object according to the azimuth value comprises:
determining, from the plurality of azimuth values, a first azimuth value displayed on the graphical user interface, and determining the remaining azimuth values as second azimuth values;
displaying, according to the first azimuth value, the waveform diagram corresponding to the sound source object corresponding to the first azimuth value;
and displaying, according to the second azimuth value, the arrow diagram corresponding to the sound source object corresponding to the second azimuth value.
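A sketch of the multi-source partition in claim 14: azimuth values currently shown on the graphical user interface become first azimuth values (rendered as waveforms), the rest become second azimuth values (rendered as arrows). The visibility predicate is passed in; for instance, `AzimuthIndicator.shows` from the earlier sketch would serve.

```python
from typing import Callable, Iterable

def partition_azimuth_values(azimuth_values: Iterable[float],
                             is_displayed: Callable[[float], bool]
                             ) -> tuple[list[float], list[float]]:
    """Split azimuth values into (first azimuth values, second azimuth values)."""
    first = [v for v in azimuth_values if is_displayed(v)]
    second = [v for v in azimuth_values if not is_displayed(v)]
    return first, second
```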
15. The information processing method according to claim 14, wherein the displaying, according to the first azimuth value, the waveform diagram corresponding to the sound source object corresponding to the first azimuth value comprises:
when a plurality of first azimuth values exist, determining the relative distance between the sound source object corresponding to each first azimuth value and the virtual object, to obtain a plurality of relative distances;
determining a minimum relative distance from the plurality of relative distances;
determining the sound source object corresponding to the minimum relative distance as a target sound source object;
and displaying, according to each first azimuth value, the waveform diagram corresponding to the sound source object corresponding to that first azimuth value, wherein the display size of the waveform diagram corresponding to the target sound source object is larger than the display sizes of the waveform diagrams corresponding to the other sound source objects among the sound source objects corresponding to the plurality of first azimuth values.
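A sketch of the target-selection rule in claim 15: among sources whose azimuth values are displayed, the one at the minimum relative distance is the target sound source object and its waveform is drawn larger. The scale factors are illustrative.

```python
def waveform_display_sizes(distances: dict[str, float],
                           base_size: float = 1.0,
                           emphasis: float = 1.5) -> dict[str, float]:
    """Map each displayed source id to a waveform scale factor; the nearest
    source (the target sound source object) is emphasized."""
    target = min(distances, key=distances.get)
    return {sid: base_size * emphasis if sid == target else base_size
            for sid in distances}
```

For example, `waveform_display_sizes({"enemy_1": 12.0, "enemy_2": 30.0})` marks `enemy_1` as the target and returns `{"enemy_1": 1.5, "enemy_2": 1.0}`.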
16. The information processing method according to claim 1, wherein the waveform in the waveform diagram comprises at least two portions, and the waveform expression parameter of each portion is determined based on the parameters corresponding to the sound source object.
17. An information processing apparatus, characterized in that the apparatus comprises:
a graphical user interface, the content displayed in the graphical user interface at least partially comprising a game scene of a game and an azimuth indicator, wherein the azimuth indicator is used for indicating azimuth information in at least part of the game scene;
a first response module, configured to acquire first position information in response to a sound source object in the game scene making a specified sound, wherein the first position information represents the position of the sound source object in the game scene;
a first determining module, configured to determine an azimuth value corresponding to the azimuth indicator based on the first position information;
and a first display module, configured to acquire parameters corresponding to the sound source object, wherein the parameters corresponding to the sound source object comprise the relative position of the sound source object and a virtual object, and to display a waveform diagram corresponding to the sound source object according to the azimuth value and the parameters corresponding to the sound source object, wherein the waveform diagram at least indicates the relative position, and the relative position comprises the sound source object being above, below, or on the same layer as the virtual object;
wherein the first display module is specifically configured to, when the parameters corresponding to the sound source object comprise the relative position of the sound source object and the virtual object, determine at least one of the shape, color, amplitude, vibration frequency and vibration direction of the waveform in the waveform diagram corresponding to the sound source object according to the relative position, wherein different relative positions correspond to different shapes, colors, amplitudes, vibration frequencies and/or vibration directions.
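A structural sketch of the module decomposition in claim 17. The method bodies are illustrative placeholders, not the patented implementation; the azimuth convention (0 degrees = straight ahead along +y) is assumed.

```python
import math

class InformationProcessingApparatus:
    """Response, determining, and display modules from claim 17."""

    def first_response_module(self, sound_source: dict) -> tuple[float, float]:
        # Acquire first position information when a specified sound is made.
        return sound_source["position"]

    def first_determining_module(self, first_position: tuple[float, float],
                                 player_position: tuple[float, float]) -> float:
        # Azimuth of the sound source relative to the player, in degrees.
        dx = first_position[0] - player_position[0]
        dy = first_position[1] - player_position[1]
        return math.degrees(math.atan2(dx, dy))

    def first_display_module(self, azimuth_value: float, params: dict) -> None:
        # Placeholder: derive waveform expression parameters from the
        # relative position and draw the waveform at the azimuth value.
        ...
```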
18. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the information processing method of any one of claims 1 to 16 via execution of the executable instructions.
19. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the information processing method of any one of claims 1 to 16.
CN202110853316.5A 2021-04-28 2021-07-27 Information processing method, information processing device, storage medium and electronic equipment Active CN113559504B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410391501.0A CN118179018A (en) 2021-04-28 2021-07-27 Information processing method, information processing device, storage medium and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110470536 2021-04-28
CN202110470536X 2021-04-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410391501.0A Division CN118179018A (en) 2021-04-28 2021-07-27 Information processing method, information processing device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113559504A (en) 2021-10-29
CN113559504B (en) 2024-04-16

Family

ID=78121063

Family Applications (6)

Application Number Title Priority Date Filing Date
CN202110853308.0A Active CN113521759B (en) 2021-04-28 2021-07-27 Information processing method, device, terminal and storage medium
CN202110853314.6A Active CN113559507B (en) 2021-04-28 2021-07-27 Information processing method, information processing device, storage medium and electronic equipment
CN202110852102.6A Pending CN113546417A (en) 2021-04-28 2021-07-27 Information processing method and device, electronic equipment and storage medium
CN202410391501.0A Pending CN118179018A (en) 2021-04-28 2021-07-27 Information processing method, information processing device, storage medium and electronic equipment
CN202110853341.3A Pending CN113521731A (en) 2021-04-28 2021-07-27 Information processing method and device, electronic equipment and storage medium
CN202110853316.5A Active CN113559504B (en) 2021-04-28 2021-07-27 Information processing method, information processing device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (6) CN113521759B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115703011A (en) * 2021-08-05 2023-02-17 腾讯科技(深圳)有限公司 Sound prompting method, device, equipment and storage medium in virtual world
CN114415922B (en) * 2022-01-19 2024-03-15 网易(杭州)网络有限公司 Operation control adjustment method and device, electronic equipment and readable medium
CN114489897B (en) * 2022-01-21 2023-08-08 北京字跳网络技术有限公司 Object processing method, device, terminal equipment and medium
CN115193048A (en) * 2022-07-08 2022-10-18 网易(杭州)网络有限公司 Virtual item processing method and device, storage medium and electronic equipment
CN118059487A (en) * 2022-11-22 2024-05-24 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for rapidly discarding virtual prop

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0938832B1 (en) * 1996-11-07 2005-12-21 Deutsche Thomson-Brandt Gmbh Method and device for projecting sound sources onto loudspeakers
CN101213589A (en) * 2006-01-12 2008-07-02 松下电器产业株式会社 Object sound analysis device, object sound analysis method, and object sound analysis program
CN107890673A (en) * 2017-09-30 2018-04-10 网易(杭州)网络有限公司 Visual display method and device, storage medium, the equipment of compensating sound information
CN111914115A (en) * 2019-05-08 2020-11-10 阿里巴巴集团控股有限公司 Sound information processing method and device and electronic equipment
CN112044069A (en) * 2020-09-10 2020-12-08 腾讯科技(深圳)有限公司 Object prompting method, device, equipment and storage medium in virtual scene

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8712791B2 (en) * 2000-11-22 2014-04-29 Catalis, Inc. Systems and methods for documenting medical findings of a physical examination
US9180365B2 (en) * 2010-05-10 2015-11-10 Sony Computer Entertainment America Llc Polymorphic firearm controller
JP5497079B2 (en) * 2011-12-27 2014-05-21 株式会社スクウェア・エニックス Game system
US8954890B2 (en) * 2012-04-12 2015-02-10 Supercell Oy System, method and graphical user interface for controlling a game
US20130331162A1 (en) * 2012-06-12 2013-12-12 Alexander Higgins Krivicich Match three gaming system and method
JP6009477B2 (en) * 2014-02-25 2016-10-19 株式会社カプコン Game system and game program
JP5712444B1 (en) * 2014-04-10 2015-05-07 株式会社gloops Game server, game control method, game program, game program recording medium, and game system
US20160348992A1 (en) * 2015-05-29 2016-12-01 Richard J. Tisone Ammunition magazine configured for automatic ejection
JP2017064251A (en) * 2015-10-01 2017-04-06 株式会社コロプラ Game program
US10183222B2 (en) * 2016-04-01 2019-01-22 Glu Mobile Inc. Systems and methods for triggering action character cover in a video game
CN107376357A (en) * 2016-05-17 2017-11-24 蔡小华 A kind of good friend's interaction class internet game method
CN106288941B (en) * 2016-07-23 2018-02-27 赵子成 Double magazines, move magazine rifle, tommy gun certainly
KR20180020702A (en) * 2016-08-19 2018-02-28 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN107741819B (en) * 2017-09-01 2018-11-23 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107837531B (en) * 2017-09-28 2018-11-23 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107899241B (en) * 2017-11-22 2020-05-22 网易(杭州)网络有限公司 Information processing method and device, storage medium and electronic equipment
CN108459811B (en) * 2018-01-09 2021-03-16 网易(杭州)网络有限公司 Method and device for processing virtual prop, electronic equipment and storage medium
CN108465238B (en) * 2018-02-12 2021-11-12 网易(杭州)网络有限公司 Information processing method in game, electronic device and storage medium
CN108579084A (en) * 2018-04-27 2018-09-28 腾讯科技(深圳)有限公司 Method for information display, device, equipment in virtual environment and storage medium
CN108905211A (en) * 2018-06-27 2018-11-30 深圳开黑科技有限公司 It is a kind of to accompany the user matching method and equipment for playing platform based on game
CN109045705A (en) * 2018-06-27 2018-12-21 深圳开黑科技有限公司 It is a kind of to accompany dynamic publishing method, equipment and the system for playing platform based on game
CN109005235A (en) * 2018-08-14 2018-12-14 深圳开黑科技有限公司 It is a kind of to accompany the information-pushing method and terminal for playing platform based on game
CN110354506B (en) * 2019-08-20 2023-11-21 网易(杭州)网络有限公司 Game operation method and device
CN110507993B (en) * 2019-08-23 2020-12-11 腾讯科技(深圳)有限公司 Method, apparatus, device and medium for controlling virtual object
CN110841284A (en) * 2019-11-08 2020-02-28 网易(杭州)网络有限公司 Signal sending method and device in game and terminal equipment
CN111125127B (en) * 2019-12-06 2023-01-31 腾讯科技(深圳)有限公司 Data synchronization method and device, storage medium and electronic device
CN111744175B (en) * 2020-06-29 2024-03-22 网易(杭州)网络有限公司 Method and device for controlling operation of in-game and out-of-game props
CN111760268B (en) * 2020-07-06 2021-06-08 网易(杭州)网络有限公司 Path finding control method and device in game
CN111905364A (en) * 2020-08-31 2020-11-10 网易(杭州)网络有限公司 Recommendation and purchase method of game virtual prop and electronic equipment
CN112023392A (en) * 2020-09-17 2020-12-04 网易(杭州)网络有限公司 Virtual article processing method and device
CN112206529B (en) * 2020-10-19 2023-01-24 珠海金山数字网络科技有限公司 Data processing method and device
CN112316428A (en) * 2020-10-27 2021-02-05 腾讯科技(深圳)有限公司 Method and device for processing virtual prop and computer readable storage medium
CN112516583B (en) * 2020-12-11 2024-05-14 网易(杭州)网络有限公司 Data processing method and device in game and electronic terminal

Also Published As

Publication number Publication date
CN113559504A (en) 2021-10-29
CN113546417A (en) 2021-10-26
CN113521731A (en) 2021-10-22
CN113521759B (en) 2024-02-13
CN113559507A (en) 2021-10-29
CN113559507B (en) 2024-02-13
CN118179018A (en) 2024-06-14
CN113521759A (en) 2021-10-22

Similar Documents

Publication Publication Date Title
CN113559504B (en) Information processing method, information processing device, storage medium and electronic equipment
CN112090069B (en) Information prompting method and device in virtual scene, electronic equipment and storage medium
CN113181650B (en) Control method, device, equipment and storage medium for calling object in virtual scene
CN113398601B (en) Information transmission method, information transmission device, computer-readable medium, and apparatus
CN112121414B (en) Tracking method and device in virtual scene, electronic equipment and storage medium
CN111275797A (en) Animation display method, device, equipment and storage medium
CN113244603A (en) Information processing method and device and terminal equipment
CN112076473A (en) Control method and device of virtual prop, electronic equipment and storage medium
US20230072503A1 (en) Display method and apparatus for virtual vehicle, device, and storage medium
KR20230007392A (en) Method and apparatus, device, and storage medium for displaying a virtual environment picture
US20230033530A1 (en) Method and apparatus for acquiring position in virtual scene, device, medium and program product
CN112138385B (en) Virtual shooting prop aiming method and device, electronic equipment and storage medium
CN113181649A (en) Control method, device, equipment and storage medium for calling object in virtual scene
CN112295230A (en) Method, device, equipment and storage medium for activating virtual props in virtual scene
CN113633964A (en) Virtual skill control method, device, equipment and computer readable storage medium
CN110585706A (en) Interactive property control method, device, terminal and storage medium
JP2022552752A (en) Screen display method and device for virtual environment, and computer device and program
CN112156472B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN115634449A (en) Method, device, equipment and product for controlling virtual object in virtual scene
CN115645923A (en) Game interaction method and device, terminal equipment and computer-readable storage medium
CN113398576A (en) Virtual environment-based picture control method and device, storage medium and equipment
WO2023011063A1 (en) Sound prompting method and apparatus in virtual world, and device and storage medium
CN113663329B (en) Shooting control method and device for virtual character, electronic equipment and storage medium
JP2024523984A (en) Method, apparatus, computer device and computer program for voice prompts in a virtual world
Nabeel et al. Joystick Mapping in Virtual Reality Shooting Game

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant