CN110882540B - Sound source positioning method and device, storage medium and electronic device


Publication number
CN110882540B
CN110882540B (application CN201911176825.8A)
Authority
CN
China
Prior art keywords
sound source
event
display
trigger event
event list
Prior art date
Legal status
Active
Application number
CN201911176825.8A
Other languages
Chinese (zh)
Other versions
CN110882540A (en)
Inventor
林凌云
王扬
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201911176825.8A
Publication of CN110882540A
Application granted
Publication of CN110882540B
Status: Active


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/54: Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Stereophonic System (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a sound source positioning method and device, a storage medium and an electronic device. The method includes: when a game application client updates the game screen in a currently displayed game task, obtaining at least one sound source trigger event list generated while the game task runs, where the sound source trigger events in each list are sound-generating events triggered by one type of sound source virtual object participating in the game task; comparing the at least one sound source trigger event list with a display event list matched with the game task; updating the display event list according to the comparison result; and marking, in a map matched with the game task, the positions of the sound source virtual objects in the updated display event list. The invention solves the technical problem of poor sound source positioning accuracy in approaches provided by the related art.

Description

Sound source positioning method and device, storage medium and electronic device
Technical Field
The invention relates to the field of computers, in particular to a sound source positioning method and device, a storage medium and an electronic device.
Background
In some game applications, sounds emitted by virtual objects in a virtual scene are simulated and played back, so that players participating in the scene can hear them and sense the game atmosphere in real time.
However, constrained by the inconvenience of external audio devices, most players do not wear headphones while playing; moreover, many different sounds are often produced simultaneously in a game scene, so a player cannot clearly distinguish the source position of each sound by listening alone. That is, the related art provides no method for accurately locating the positions of the individual sound sources in a virtual scene.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a sound source positioning method and device, a storage medium and an electronic device, which at least solve the technical problem of poor sound source positioning accuracy in approaches provided by the related art.
According to an aspect of the embodiments of the present invention, there is provided a sound source positioning method, including: when a game application client updates a game screen in a currently displayed game task, acquiring at least one sound source trigger event list generated in the running process of the game task, wherein the sound source trigger event in each sound source trigger event list is an event for generating sound triggered by a sound source virtual object of a type participating in the game task; comparing the at least one sound source trigger event list with the display event list matched with the game task; updating the display event list according to the comparison result; and displaying a mark of the position of the sound source virtual object in the updated display event list on the map matching the game task.
According to another aspect of the embodiments of the present invention, there is also provided a sound source localization apparatus, including: a first obtaining unit, configured to obtain, when a game application client updates the game screen in a currently displayed game task, at least one sound source trigger event list generated while the game task runs, where the sound source trigger events in each list are sound-generating events triggered by one type of sound source virtual object participating in the game task; a comparison unit, configured to compare the at least one sound source trigger event list with a display event list matched with the game task; an updating unit, configured to update the display event list according to the comparison result; and a display unit, configured to mark, in a map matched with the game task, the positions of the sound source virtual objects in the updated display event list.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium, in which a computer program is stored, where the computer program is configured to execute the sound source localization method when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the sound source localization method through the computer program.
In the embodiments of the invention, while a game task runs in the game application client, each time the currently displayed game screen is updated, at least one sound source trigger event list generated during the run is acquired and compared with the display event list matched with the game task, and the display event list is updated according to the comparison result. The positions of the sound source virtual objects in the updated display event list are then marked in a map matched with the game task. That is, the display event list controls a visual display of the positions of the sound-generating virtual objects in the game task, instead of requiring the player to distinguish the position of a target sound source virtual object by listening. The display event list is dynamically updated according to the different types of sound source trigger events triggered while the game task runs. The positions of the sound source virtual objects are thereby visually located dynamically and accurately, the sound source positioning accuracy is improved, and the poor accuracy of the listening-based approach in the related art is overcome.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a schematic diagram of a hardware environment of an alternative sound source localization method according to an embodiment of the present invention;
fig. 2 is a flowchart of an alternative sound source localization method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a display interface of an alternative sound source localization method according to an embodiment of the present invention;
fig. 4 is a flowchart of an alternative sound source localization method according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an alternative sound source localization method according to an embodiment of the present invention;
fig. 6 is a flowchart of an alternative sound source localization method according to an embodiment of the present invention;
fig. 7 is a flowchart of an alternative sound source localization method according to an embodiment of the present invention;
fig. 8 is a flowchart of an alternative sound source localization method according to an embodiment of the present invention;
fig. 9 is a flowchart of an alternative sound source localization method according to an embodiment of the present invention;
fig. 10 is a schematic diagram of a display interface of an alternative sound source localization method according to an embodiment of the present invention;
fig. 11 is a schematic diagram of an alternative sound source localization method according to an embodiment of the present invention;
fig. 12 is a flowchart of an alternative sound source localization method according to an embodiment of the present invention;
fig. 13 is a schematic diagram of an alternative sound source localization method according to an embodiment of the present invention;
fig. 14 is a schematic structural diagram of an alternative sound source localization apparatus according to an embodiment of the present invention;
fig. 15 is a schematic structural diagram of an alternative electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The sound source localization method may be applied, but is not limited, to the hardware environment shown in fig. 1, specifically to a sound source localization system that may include, but is not limited to, the terminal device 102, the network 104, and the server 106. A game application client logged in with a target user account runs in the terminal device 102. The terminal device 102 includes a human-machine interaction screen 1022, a processor 1024, and a memory 1026. The human-machine interaction screen 1022 is configured to present the virtual scene of a game task executed by the game application client, provide a human-machine interaction interface that receives interaction operations performed on it, and mark the positions of sound source virtual objects in the virtual scene. The processor 1024 is configured to obtain at least one sound source trigger event list generated while the game task runs, compare the at least one sound source trigger event list with the display event list matched with the game task, and update the display event list according to the comparison result, so that the positions of the sound source virtual objects in the updated display event list can be marked in a map matched with the game task. The memory 1026 is used to store attribute information of the virtual character controlled by the game application client, the at least one sound source trigger event list, and the display event list.
In addition, the server 106 includes a database 1062 and a processing engine 1064, where the database 1062 is used for storing at least one sound source trigger event list and a display event list. The processing engine 1064 is configured to search at least one sound source trigger event list and a display event list generated in the current game task; and is further configured to compare the at least one sound source trigger event list with the display event list to update the display event list, and then return the comparison result to the terminal device 102.
The specific process includes the following steps. As shown in steps S102-S104, a game task is triggered to run on the human-machine interaction screen 1022 in the terminal device 102, and during the run the terminal requests, from the server 106 through the network 104, the display event list matched with the game task. The server 106 then performs steps S106-S110: the processing engine 1064 in the server 106 calls the database 1062 to obtain at least one sound source trigger event list generated while the game task runs, and compares the at least one sound source trigger event list with the display event list currently matched with the game task, where a sound source trigger event is an event, triggered by a sound source virtual object participating in the game task, that is about to generate sound. The display event list is then updated according to the comparison result. Next, in step S112, the server 106 sends the updated display event list to the terminal device 102.
In step S114, after receiving the updated display event list, the terminal device 102 displays a mark of the position of the sound source virtual object in the updated display event list in a map matched with the game task (e.g., an area within a dotted frame on the game application client interface shown in fig. 1), where a bullet icon is used to mark the position of the sound source virtual object (e.g., a gun) that is shooting in fig. 1, and a footprint icon is used to mark the position of the sound source virtual object (e.g., a virtual character) that is moving.
It should be noted that, in this embodiment, while a game task runs in the game application client, each time the currently displayed game screen is updated, at least one sound source trigger event list generated during the run is acquired and compared with the display event list matched with the game task, and the display event list is updated according to the comparison result. The positions of the sound source virtual objects in the updated display event list are then marked in a map matched with the game task. That is, the display event list controls a visual display of the positions of the sound-generating virtual objects in the game task, instead of requiring the player to distinguish the position of a target sound source virtual object by listening. The display event list is dynamically updated according to the different types of sound source trigger events triggered while the game task runs. The positions of the sound source virtual objects are thereby visually located dynamically and accurately, the sound source positioning accuracy is improved, and the poor accuracy of the listening-based approach in the related art is overcome.
As another alternative, the hardware environment of the sound source positioning method may also be, but is not limited to, a standalone terminal device. That is to say, under the condition that the hardware technology can make the storage space of the memory in the terminal device greater than the first threshold and the processing speed of the processor greater than the second threshold, the sound source positioning method can be integrated and applied to an independent terminal device (not shown in the figure), for example, the operations required to be executed by the server 106 are integrated into the terminal device 102, and no additional interactive operation is needed, so as to simplify the sound source positioning operation and achieve the effect of improving the sound source positioning efficiency.
Optionally, in this embodiment, the sound source positioning method may be applied, but is not limited, to a terminal device that supports running an application client, such as a mobile phone, a tablet computer, a notebook computer, or a personal computer (PC). The server and the terminal device may exchange data through a network, which may include, but is not limited to, a wireless network or a wired network. The wireless network includes Bluetooth, Wi-Fi, and other networks enabling wireless communication; the wired network may include, but is not limited to, wide area networks, metropolitan area networks, and local area networks. The above is merely an example, and this embodiment is not limited thereto.
Optionally, as an optional implementation manner, as shown in fig. 2, the sound source localization method includes:
s202, when a game application client updates a game screen in a currently displayed game task, at least one sound source trigger event list generated in the running process of the game task is obtained, wherein the sound source trigger event in each sound source trigger event list is an event for generating sound triggered by a sound source virtual object of a type participating in the game task;
s204, comparing at least one sound source trigger event list with a display event list matched with the game task;
s206, updating the display event list according to the comparison result;
and S208, marking and displaying the position of the sound source virtual object in the updated display event list in the map matched with the game task.
Optionally, in this embodiment, the sound source positioning method may be applied, but is not limited, to game applications, for example to visually locate the position of each sound source virtual object participating in a game task, so that a player can intuitively and accurately identify those positions in the game scene. The game application may be a Multiplayer Online Battle Arena (MOBA) application or a Single-Player Game (SPG) application. The type of game application may include, but is not limited to, at least one of: Two-dimensional (2D), Three-dimensional (3D), Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) game applications. The above is merely an example, and the present embodiment is not limited to this.
Further, the above game application may be, but is not limited to, a shooting game application, which may be a Third-Person Shooter (TPS) application, played from the viewpoint of a third-party character object other than the virtual character controlled by the current player, or a First-Person Shooter (FPS) application, played from the viewpoint of the virtual character controlled by the current player. Correspondingly, the sound source virtual objects that generate sound while the game task runs may include, but are not limited to: the virtual character (also referred to as a Player Character) controlled by each game application client, Non-Player Characters (NPC), prop objects (such as a gun) controlled by a virtual character, and carrier objects (such as a vehicle) controlled by a virtual character. The above is merely an example, and this embodiment is not limited thereto.
Optionally, in this embodiment, the event information of each sound source trigger event recorded in a sound source trigger event list may include, but is not limited to: the object identification of the sound source virtual object, the event identification and event type of the sound source trigger event, and the trigger time and end time. The event information of each display event recorded in the display event list may likewise include, but is not limited to, the same fields. The above is an example, and the event information is not limited to this list; a sound source trigger event and a display event may also carry other information, which is not described again here.
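The event-information fields listed above can be captured in a small record type. The following is an illustrative sketch only; the class and field names (`SourceEvent`, `object_id`, and so on) are assumptions for this example, not names defined by the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SourceEvent:
    """One entry in a sound source trigger event list or the display event list."""
    object_id: str       # object identification of the sound source virtual object
    event_id: str        # event identification of the sound source trigger event
    event_type: str      # e.g. "shooting", "carrier_movement", "object_movement"
    trigger_time: float  # time at which the event was triggered
    end_time: float      # time after which the event no longer produces sound
```

Using the same record for both the trigger-event lists and the display event list keeps the comparison between them a simple membership test.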
In addition, in this embodiment, a plurality of sound source trigger event lists may be generated while the game task runs, where each list records one type of sound source trigger event; that is, the event types of the sound source trigger events recorded in the same list are the same. Optionally, the event types may include, but are not limited to: shooting events, carrier movement events, and object movement events. Correspondingly, the sound source trigger event lists may include, but are not limited to: a gunshot list, a carrier sound list, and a footstep sound list. The above is an example, and this embodiment does not limit the event types or the sound source trigger event lists corresponding to them.
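Filing each triggered event into the list that matches its event type might look like the following sketch (the type keys are hypothetical labels mirroring the gunshot, carrier sound, and footstep sound lists named above):

```python
from collections import defaultdict

def route_events(events):
    """File each triggered event into the trigger-event list for its type:
    "shooting" -> gunshot list, "carrier_movement" -> carrier sound list,
    "object_movement" -> footstep sound list."""
    lists = defaultdict(list)
    for event in events:
        lists[event["event_type"]].append(event)
    return dict(lists)
```

Because all events in one list share a type, the later comparison against the display event list can treat each list uniformly.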
Optionally, in this embodiment, dynamically updating the display event list according to the comparison result may include, but is not limited to, at least one of the following:
1) when a first target sound source trigger event that is not in the display event list is found in a sound source trigger event list, adding the event information of the first target sound source trigger event to the display event list, so as to update the display event list;
2) obtaining the end time of each sound source trigger event found in the display event list, and removing from the display event list the event information of any second target sound source trigger event that has reached its end time, so as to update the display event list.
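The two update rules above can be sketched together in one function. This is an illustration only, assuming events are represented as dicts keyed by a hypothetical `event_id` field with an `end_time`:

```python
def update_display_list(trigger_lists, display_list, now):
    """Apply rule 2 (drop display events whose end time has been reached),
    then rule 1 (add trigger events not yet present in the display list)."""
    known = {e["event_id"] for e in display_list}
    updated = [e for e in display_list if e["end_time"] > now]   # rule 2
    for event_list in trigger_lists:                             # rule 1
        for event in event_list:
            if event["event_id"] not in known:
                updated.append(event)
                known.add(event["event_id"])
    return updated
```

Events already shown and still within their end time pass through unchanged, which matches the "retain" branch described later in the fig. 4 flow.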
In addition, in this embodiment, obtaining the sound source trigger event lists may include, but is not limited to: detecting the running state of each sound source virtual object while the game task runs, and determining that a sound source trigger event is triggered when the detected running state indicates a motion state; then obtaining the event type of the sound source trigger event and storing it in the corresponding sound source trigger event list according to that type. Further, after determining that a sound source trigger event is triggered, the sound source position of the event may be obtained, the distance between the sound source position and the position of the virtual character controlled by the game client may be computed, and the position of the sound source trigger event is marked in the map only when the distance is smaller than a first threshold. The first threshold may be, but is not limited to, an effective distance for visually showing the position of a sound source virtual object, and its specific value may be configured differently according to different actual scenes, which is not limited here.
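The effective-distance check described above reduces to a single comparison. A minimal sketch, assuming 2D map coordinates (the function name and tuple representation are illustrative, not from the patent):

```python
import math

def should_mark(source_pos, player_pos, effective_distance):
    """Mark the event in the map only when the sound source lies within the
    effective distance of the virtual character controlled by the game client."""
    return math.dist(source_pos, player_pos) < effective_distance
```

The strict `<` follows the text ("the distance is smaller than the first threshold"); sources exactly at the threshold are not marked.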
Optionally, in this embodiment, different object icons may be used, but are not limited to being used, to mark the positions of sound source virtual objects of different object types. For example, when the sound source virtual object is a virtual character, a character icon may be displayed; when it is a carrier, a car icon may be displayed; and when it is a prop, a bullet icon may be displayed. When a sound source virtual object stops making sound, the corresponding object icon disappears from the map. The icons listed here are only examples, used to illustrate that different types of target sound source virtual objects are displayed with different icons, and are not limiting.
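The type-to-icon selection above amounts to a lookup table. The table entries below are hypothetical labels matching the examples in the text, not identifiers from the patent:

```python
# Hypothetical icon table: one icon per sound source object type,
# as in the examples above.
ICON_BY_OBJECT_TYPE = {
    "virtual_character": "character_icon",
    "carrier": "car_icon",
    "prop": "bullet_icon",
}

def icon_for(object_type):
    """Pick the marker icon for a sound source virtual object; None means no
    icon is registered for this type, so nothing is drawn on the map."""
    return ICON_BY_OBJECT_TYPE.get(object_type)
```

Removing an event from the display event list (when its sound ends) is what makes the icon disappear; the table itself never changes.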
Optionally, in this embodiment, when the position of a sound source virtual object is marked in the map, the icon corresponding to the object may be, but is not limited to being, marked within the display area currently shown in the map: when the position of the sound source virtual object lies inside the display area, the icon is displayed directly at that position; when the position lies outside the display area, the icon is marked on the corresponding boundary of the display area.
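The inside-or-boundary placement described above is a clamp of the source position to the displayed rectangle. A sketch under the assumption of an axis-aligned display area given by its min and max corners (names are illustrative):

```python
def marker_position(source_pos, area_min, area_max):
    """Place the icon at the source position when it lies inside the currently
    displayed map area; otherwise clamp it onto the area's boundary."""
    x = min(max(source_pos[0], area_min[0]), area_max[0])
    y = min(max(source_pos[1], area_min[1]), area_max[1])
    return (x, y)
```

A position inside the area is returned unchanged; an outside position lands on the nearest point of the boundary, which is where the icon is drawn.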
The description is made with specific reference to the examples shown in fig. 3-4:
Taking a shooting game application as an example, while a shooting game task runs in the game application client, the running state of each sound source virtual object participating in the task is obtained. Assume the sound source virtual objects whose running states indicate a motion state include the sound source virtual object 304 and the sound source virtual object 306, which are engaged in an exchange of fire. The sound source virtual object 306 is the virtual character currently controlled by the game application client, and the two virtual characters belong to different camps.
Then, at least one sound source trigger event list generated while the game task runs is obtained and compared with the display event list. Suppose that after the comparison: 1) the event information of the sound source trigger event triggered by the sound source virtual object 304 (movement triggering footstep sound) does not exist in the display event list, so it is added to the display event list; 2) the event information of the sound source trigger event triggered by the sound source virtual object 306 (shooting triggering gunshot sound) exists in the display event list but has not yet reached its end time, so it is retained in the display event list. The updated display event list is obtained from these operations. Further, the positions of the sound source virtual object 304 and the sound source virtual object 306 in the virtual scene provided by the shooting game task are obtained: for example, the sound source virtual object 304 is at position A and the sound source virtual object 306 is at position B.
As shown in fig. 3, the positions of the sound source virtual object 304 and the sound source virtual object 306 can be marked with different icons in the map 302 in the above manner. For example, the footprint icon at position A in the map 302 marks where the sound source virtual object 304 moves and generates footstep sound, and the bullet icon at position B marks where the sound source virtual object 306 shoots and generates gunshot sound, so that the position of each virtual object triggering a sound source trigger event is displayed visually, intuitively, and accurately. The orientation of the bullet icon can also indicate the firing direction of the sound source virtual object 306.
In addition, the execution logic of the sound source localization method is described with reference to the case shown in fig. 3, and the execution process may be, but is not limited to, as shown in fig. 4:
While the shooting game task runs, the game screen presented by the game application client is refreshed in real time, and the screen includes a map corresponding to the virtual scene it presents (the display interface is shown in fig. 3). The following steps are then performed:
in step S402, when the game application client updates the game screen in the currently displayed one game task, a sound source trigger event (here, the first detected sound source trigger event is assumed) is detected, and then steps S404 to S408 are performed: determine whether the position of the sound source virtual object in the sound source trigger event is within an effective distance (the effective distance is a distance for controlling the visual display)? And if the sound source trigger event is determined to be located within the effective distance, determining the event type of the sound source trigger event, and adding the event information of the sound source trigger event into a sound source trigger event list corresponding to the event type.
Further, taking one sound source trigger event list as an example: in step S410, it is determined whether a sound source trigger event in the sound source trigger event list already exists in the display event list. If the sound source trigger event does not exist in the display event list, step S412-1 is executed to add its event information to the display event list. If the sound source trigger event does exist in the display event list, step S412-2 is executed to determine whether the sound source trigger event has reached its end time. If it is determined that the end time has not been reached, step S414-1 retains its event information in the display event list; if the end time has been reached, step S414-2 removes its event information from the display event list. The display event list matched with the game task is updated through steps S412-1, S414-1, and S414-2.
Then, in step S416, the position of each sound source virtual object in the updated display event list is marked and displayed in the map matched with the game task.
The processes shown in fig. 3 to 4 are only examples, and this embodiment is not limited to this.
In this embodiment, the icons marking the positions of the sound source virtual objects may be, but are not limited to being, refreshed and re-detected every predetermined time period rather than displayed persistently, thereby ensuring the fairness and competitiveness of the game.
Through the embodiment provided by the application, the display event list is used to control the visual display of the positions of the sound source virtual objects that generate sound in the game task, instead of requiring the player to distinguish the position of a target sound source virtual object by listening. The display event list is dynamically updated according to the different types of sound source trigger events triggered while the game task runs. Dynamic and accurate visual positioning of the sound source virtual objects is thereby achieved, improving sound source positioning accuracy and solving the problem of poor positioning accuracy caused by the listening-based position discrimination provided by the related art.
As an alternative, updating the display event list according to the comparison result includes at least one of:
1) acquiring, from the at least one sound source trigger event list, a first target sound source trigger event that is not found in the display event list, and adding the event information of the first target sound source trigger event to the display event list so as to update the display event list;
2) when a second target sound source trigger event that is found in the display event list is acquired and it is determined that the second target sound source trigger event has reached its end time, removing the event information corresponding to the second target sound source trigger event from the display event list so as to update the display event list.
Optionally, in this embodiment, the event information of a sound source trigger event may include, but is not limited to: the object identifier of the sound source virtual object, the event identifier and event type of the sound source trigger event, and the trigger time and end time. Here, the object identifier may be, but is not limited to, the login account used by a player to log in to the game application client, a nickname registered in the game application, or the like. The event identifier may be, but is not limited to, an event ID, and the corresponding event types may include, but are not limited to: shooting events, vehicle movement events, and object movement events. Correspondingly, the sound source trigger event lists may include, but are not limited to: a gunshot list, a vehicle sound list, and a footstep sound list. The above is merely an example; this embodiment does not limit the event types or the sound source trigger event lists corresponding to them.
It should be noted that each sound source trigger event may, but is not limited to, correspond to a sound that plays continuously for a certain period of time. Therefore, in this embodiment, the event information of a sound source trigger event also records the trigger time at which the event was triggered and the end time at which the sound stops.
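The event information described above can be modeled as a small record. The following is an illustrative Python sketch, not part of the original disclosure; all field names and values are assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SoundEvent:
    object_id: str       # identifies the sound source virtual object (e.g. player account)
    event_id: str        # unique identifier of this sound source trigger event
    event_type: str      # e.g. "gunshot", "vehicle", or "footstep"
    trigger_time: float  # moment the event was triggered
    end_time: float      # moment the sound stops playing

# A footstep sound triggered at t=10.0 that plays for 2 seconds:
step = SoundEvent("player_42", "evt_001", "footstep", 10.0, 12.0)
```

A list such as the footstep sound list would then simply hold `SoundEvent` records of the same `event_type`.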
The description is made with reference to the example shown in fig. 5:
Suppose the sound source trigger event lists generated while the game task runs include: a sound source trigger event list 502 (e.g., a gunshot list), a sound source trigger event list 504 (e.g., a vehicle sound list), and a sound source trigger event list 506 (e.g., a footstep sound list). The sound source trigger event list 502 includes sound source trigger events A1 and A2, the sound source trigger event list 504 includes sound source trigger events B1 and B2, and the sound source trigger event list 506 includes sound source trigger events C1, C2, and C3. The display event list 508 includes: sound source trigger events A1, B1, C1, A2, and C2.
After the sound source trigger event lists are compared with the display event list, the following can be determined:
1) the sound source trigger event B2 and the sound source trigger event C3 are not found in the display event list 508, so the event information of B2 and C3 is stored in the display event list;
2) the sound source trigger event A1 and the sound source trigger event C1 are found in the display event list 508 and have not reached their end times, so the event information of A1 and C1 continues to be retained in the display event list;
3) the sound source trigger events A2, B1, and C2 are found in the display event list 508 and have all reached their end times, so the event information of A2, B1, and C2 is removed from the display event list.
The event information recorded in the display event list is updated in the above manner to obtain an updated display event list 510, which includes: the sound source trigger events A1, C1, B2, and C3.
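The update walked through above can be sketched in Python. This is an illustrative sketch only (event names follow fig. 5; end-time expiry is represented as a precomputed set, which is an assumption):

```python
def update_display_list(trigger_lists, display_list, expired):
    """Update the display list: drop expired events, add newly triggered ones."""
    # 1) remove events that have reached their end time
    updated = [e for e in display_list if e not in expired]
    # 2) add events from the trigger lists that are not yet displayed
    for trigger_list in trigger_lists:
        for event in trigger_list:
            if event not in updated and event not in expired:
                updated.append(event)
    return updated

gunshot_list  = ["A1", "A2"]        # list 502
vehicle_list  = ["B1", "B2"]        # list 504
footstep_list = ["C1", "C2", "C3"]  # list 506
display_508   = ["A1", "B1", "C1", "A2", "C2"]
expired       = {"A2", "B1", "C2"}  # events that reached their end time

display_510 = update_display_list(
    [gunshot_list, vehicle_list, footstep_list], display_508, expired)
# display_510 == ["A1", "C1", "B2", "C3"]
```

The result matches the updated display event list 510 of the example: A1 and C1 retained, B2 and C3 added, A2/B1/C2 removed.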
According to the embodiment provided by the application, the event information of the sound source trigger events in each sound source trigger event list is compared with the event information of the display events in the display event list, so that the display event list is dynamically updated according to the comparison result; sound source positioning accuracy is thus ensured when the display event list is used to control the visual display of the positions of the sound source virtual objects.
As an optional solution, comparing at least one sound source trigger event list with the display event list matched with the game task includes:
S1, traversing each sound source trigger event list in the at least one sound source trigger event list, and performing the following operations:
S11, acquiring the current sound source trigger event;
S12, searching the display event list for the object identifier of the current sound source virtual object in the current sound source trigger event;
S13, in the case that the object identifier of the current sound source virtual object is not found in the display event list, determining the current sound source virtual object as a sound source virtual object to be added to the display event list;
S14, in the case that the object identifier of the current sound source virtual object is found in the display event list, acquiring the end time of the current sound source virtual object; in the case that the end time has been reached, determining the current sound source virtual object as a sound source virtual object to be removed from the display event list; in the case that the end time has not been reached, retaining the event information corresponding to the current sound source trigger event in the display event list;
S15, acquiring the next sound source trigger event as the current sound source trigger event.
Optionally, in this embodiment, after the current sound source virtual object is determined as a sound source virtual object to be added to the display event list, it may be added to the display event list directly, or the adding operations may be performed together after all sound source trigger events in the sound source trigger event list have been traversed. Likewise, after the current sound source virtual object is determined as a sound source virtual object to be removed from the display event list, it may be removed directly, or the removal operations may be performed together after the traversal completes. This embodiment does not limit the processing timing of the sound source virtual objects to be added or removed.
It should be noted that, when the at least one sound source trigger event list is compared with the display event list, a sound source trigger event list may be selected randomly for comparison, or the comparison order of the lists may be determined according to preset comparison priorities.
Optionally, in this embodiment, before the current sound source trigger event is acquired, the method further includes: determining the comparison priority of each sound source trigger event list in the at least one sound source trigger event list; and traversing each sound source trigger event list according to the comparison priorities, acquiring the current sound source trigger event in turn.
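The priority-ordered traversal can be sketched as follows. This is an illustrative Python sketch; the list contents, times, and the reduction of event records to `(event_id, end_time)` pairs are assumptions, not from the disclosure:

```python
def compare(trigger_lists_by_priority, display_events, now):
    """Traverse trigger lists from highest to lowest comparison priority,
    collecting events to add to and remove from the display event list."""
    to_add, to_remove = [], []
    for events in trigger_lists_by_priority:
        for event_id, end_time in events:
            if event_id not in display_events:
                to_add.append(event_id)        # not displayed yet: add it
            elif now >= end_time:
                to_remove.append(event_id)     # displayed but expired: remove it
            # displayed and still playing: retained (no action needed)
    return to_add, to_remove

lists = [
    [("shot1", 5.0)],                  # gunshot list (highest priority)
    [("car1", 3.0)],                   # vehicle sound list
    [("step1", 9.0), ("step2", 8.0)],  # footstep list (lowest priority)
]
to_add, to_remove = compare(lists, display_events={"car1", "step1"}, now=4.0)
# to_add == ["shot1", "step2"]; to_remove == ["car1"]
```

Because the gunshot list is traversed first, its events are compared (and thus queued for display) before vehicle and footstep events.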
The description is made with reference to the example shown in fig. 6. It is assumed that, when the game application client updates the game screen in the currently displayed game task, the currently generated sound source trigger event lists (a gunshot list, a vehicle sound list, and a footstep sound list) are acquired, where the comparison priority of the gunshot list is higher than that of the vehicle sound list, and the comparison priority of the vehicle sound list is higher than that of the footstep sound list.
In step S602, the event information of expired sound source trigger events (those that have reached their end times) is removed from the sound source trigger event lists. The comparison is then carried out according to the comparison priorities. In steps S604-S606, the gunshot list is traversed and each shooting event is read in turn to determine whether all shooting events are in the display event list. If a target shooting event is not in the display event list, step S607 is executed to add its event information to the display event list. Then, in steps S608-S610, the vehicle sound list is traversed and each vehicle movement event is read in turn to determine whether all vehicle movement events are in the display event list. If a target vehicle movement event is not in the display event list, step S611 is executed to add its event information to the display event list. Then, in steps S612-S614, the footstep sound list is traversed and each footstep movement event is read in turn to determine whether all footstep movement events are in the display event list. If a target footstep movement event is not in the display event list, step S615 is executed to add its event information to the display event list. After the display event list has been updated according to the comparison results, step S616 is executed to mark and display the positions of the sound source virtual objects in the updated display event list.
After step S606, if it is determined that all shooting events are in the display event list, step S608 is executed; after step S610, if it is determined that all vehicle sound events are in the display event list, step S612 is executed; after step S614, if it is determined that all footstep sound events are in the display event list, step S616 is executed.
When a sound source trigger event is compared with a display event, the comparison may include, but is not limited to: 1) comparing the object identifiers of the sound source virtual objects that triggered the events; and 2) comparing the event identifiers of the sound source trigger events. That is, whether a sound source trigger event is in the display event list is determined by directly comparing identifiers, which shortens the comparison time, improves comparison efficiency, and thereby improves the efficiency of the visual display of sound source positioning.
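The identifier-based comparison can be sketched as keying the display event list by identifier pairs, so that a membership test is a constant-time lookup rather than a field-by-field comparison of whole event records. An illustrative Python sketch (identifiers are invented):

```python
# Display list keyed by (object_id, event_id) pairs for O(1) membership tests.
display_keys = {("player_1", "evt_a"), ("player_2", "evt_b")}

def in_display_list(object_id, event_id):
    """Direct identifier comparison: is this event already displayed?"""
    return (object_id, event_id) in display_keys

hit  = in_display_list("player_1", "evt_a")   # already in the display list
miss = in_display_list("player_3", "evt_c")   # not yet displayed
```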
According to the embodiment provided by the application, the sound source trigger events in each sound source trigger event list are traversed to be compared with the display event list, so that the completeness and comprehensiveness of comparison are guaranteed, and the positions of all sound source virtual objects which are allowed to be displayed in a visualized mode are accurately displayed in a map.
As an alternative, the step of obtaining at least one sound source trigger event list generated during the running process of the game task comprises:
s1, detecting a sound source trigger event in the running process of the game task;
s2, acquiring the event type of the sound source trigger event under the condition that the sound source trigger event is detected;
and S3, adding the sound source trigger event into the corresponding sound source trigger event list according to the event type.
Optionally, in this embodiment, before obtaining the event type of the sound source trigger event, the method further includes:
s1-1, acquiring the sound source position where the sound source trigger event is triggered;
s1-2, determining the distance between the sound source position and the target position of the virtual character controlled by the game application client;
S1-3, determining the event type of the sound source trigger event when the distance is smaller than a first threshold, in which case the event information of the sound source trigger event is to be recorded in the sound source trigger event list, the event information including: the object identifier of the sound source virtual object in the sound source trigger event, the event type, the trigger time of the sound source trigger event, and the end time of the sound source trigger event.
It should be noted that, in this embodiment, each sound source virtual object is configured with an effective distance (for example, the first threshold) for visual display, ensuring that the position of a sound source virtual object within the effective distance can be marked and displayed in the map; sound source trigger events triggered at short range are thus distinguished from those triggered at long range. The first threshold may be, but is not limited to being, set to different values for different scenarios, which this embodiment does not limit.
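Steps S1-1 to S1-3 amount to a Euclidean distance check against the first threshold. An illustrative Python sketch (the 50-unit threshold and the coordinates are assumptions, not values from the disclosure):

```python
import math

def within_effective_distance(source_pos, player_pos, threshold=50.0):
    """Record the event only if the sound source is within the first
    threshold of the virtual character controlled by the client."""
    distance = math.dist(source_pos, player_pos)  # Euclidean distance
    return distance < threshold

near = within_effective_distance((10.0, 0.0, 10.0), (0.0, 0.0, 0.0))    # ~14.1
far  = within_effective_distance((100.0, 0.0, 100.0), (0.0, 0.0, 0.0))  # ~141.4
```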
Optionally, in this embodiment, adding the sound source trigger event to the corresponding sound source trigger event list according to the event type includes:
1) adding the sound source trigger event to a shooting trigger event list in the case that the sound source trigger event is a shooting event;
2) adding the sound source trigger event to a vehicle trigger event list in the case that the sound source trigger event is a vehicle movement event;
3) adding the sound source trigger event to an object trigger event list in the case that the sound source trigger event is an object movement event.
It should be noted that, for object movement events and vehicle movement events, the terminal device plays a single sound clip through the speaker, so the end time of the sound source trigger event can be determined at the moment of triggering, for example, n seconds after the trigger. For a shooting event, the terminal device plays looping audio, and the shooting event is considered ended after a stop-firing command is detected.
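This distinction between one-shot and looping sounds can be sketched as follows. An illustrative Python sketch; the clip durations are invented, and representing "end time unknown until the stop command" as `None` is an assumption:

```python
# One-shot clip lengths (illustrative): the end time is known at trigger time.
ONE_SHOT_DURATION = {"footstep": 2.0, "vehicle": 4.0}

def end_time_at_trigger(event_type, trigger_time):
    """Return the end time if it is knowable at trigger time, else None."""
    if event_type in ONE_SHOT_DURATION:
        return trigger_time + ONE_SHOT_DURATION[event_type]
    return None  # looping audio (e.g. gunshot): ends when stop-firing is detected

step_end = end_time_at_trigger("footstep", 10.0)  # known immediately
shot_end = end_time_at_trigger("gunshot", 10.0)   # unknown until the stop command
```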
The description is made in detail with reference to the examples shown in fig. 7 to 9. It is assumed that, when the game application client updates the game screen in the currently displayed game task, the currently generated sound source trigger event lists are acquired, which may include a shooting trigger event list (e.g., a gunshot list), a vehicle trigger event list (e.g., a vehicle sound list), and an object trigger event list (e.g., a footstep sound list). When different sound source trigger events are triggered, they are broadcast by means of an Event Trigger mechanism, and callbacks for the events are registered in a static class for processing.
As shown in steps S702-S712 of fig. 7, after the event callback is registered, a sound source trigger event is detected whose event type is a shooting event, i.e., it is determined to be a shooting event. It is then determined whether the sound source virtual object that triggered the shooting event is within the effective distance. If it is, the event information of the shooting event is added to the gunshot list. Further, it is detected whether the shooting event has stopped firing; if so, the event information of the shooting event is removed from the gunshot list.
As shown in steps S802-S812 of fig. 8, after the event callback is registered, a sound source trigger event is detected whose event type is a vehicle movement event, i.e., it is determined to be a vehicle driving event. It is then determined whether the sound source virtual object that triggered the vehicle driving event is within the effective distance. If it is, the event information of the vehicle driving event is added to the vehicle sound list. Further, it is determined whether the vehicle driving event has reached its end time (e.g., M seconds after triggering); if so, the event information of the vehicle driving event is removed from the vehicle sound list.
As shown in steps S902-S912 of fig. 9, after the event callback is registered, a sound source trigger event is detected whose event type is an object movement event, i.e., it is determined to be an object walking event. It is then determined whether the sound source virtual object that triggered the object walking event is within the effective distance. If it is, the event information of the object walking event is added to the footstep sound list. Further, it is determined whether the object walking event has reached its end time (e.g., N seconds after triggering); if so, the event information of the object walking event is removed from the footstep sound list.
According to the embodiment provided by the application, after a sound source trigger event is detected, it is determined whether the position of the sound source virtual object in the sound source trigger event is within the effective distance, and if it is, the sound source trigger event is added to the corresponding sound source trigger event list according to its event type. Different types of sound source trigger events can thus be distinguished and managed separately.
As an alternative, in the map matched with the game task, the marking and displaying the position where the sound source virtual object in the updated display event list is located includes:
s1, determining the display priority of each sound source trigger event recorded in the updated display event list;
S2, displaying the position of the sound source virtual object in each sound source trigger event according to the display priorities, where the display priority of shooting events is higher than that of vehicle movement events, and the display priority of vehicle movement events is higher than that of object movement events.
Since the display area of the map on the game screen is limited, only a limited number of icons (no more than a certain threshold) can be displayed on the game screen at the same time. In this embodiment, different display priorities may be configured for different types of sound source trigger events, so that icons with higher display priorities are displayed clearly according to the display priorities. For example, the display priority of shooting events is higher than that of vehicle movement events, and the display priority of vehicle movement events is higher than that of object movement events.
According to the embodiment provided by the application, when the number of sound source trigger events exceeds a certain threshold, or when multiple sound source trigger events are detected at the same position, the positions of the sound source virtual objects in some of the sound source trigger events can be displayed selectively by referring to the display priorities, ensuring the effectiveness and clarity of the display.
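The priority-based selection can be sketched as a sort-and-truncate. An illustrative Python sketch; the cap of three icons and the event tuples are assumptions:

```python
# Lower rank = shown first: gunshot > vehicle > footstep, per the embodiment.
PRIORITY = {"gunshot": 0, "vehicle": 1, "footstep": 2}

def select_for_display(events, max_icons=3):
    """Keep only the highest-priority events when the map cannot show all."""
    ranked = sorted(events, key=lambda e: PRIORITY[e[1]])  # stable sort
    return ranked[:max_icons]

live = [("e1", "footstep"), ("e2", "gunshot"), ("e3", "vehicle"),
        ("e4", "footstep"), ("e5", "gunshot")]
shown = select_for_display(live)
# shown == [("e2", "gunshot"), ("e5", "gunshot"), ("e3", "vehicle")]
```

Because Python's sort is stable, events of equal priority keep their detection order.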
As an alternative, in the map matched with the game task, the marking and displaying the position where the sound source virtual object in the updated display event list is located includes:
s1, acquiring the object type of the sound source virtual object;
and S2, displaying an object icon matched with the object type at the position of the sound source virtual object in the map.
Optionally, in this embodiment, displaying an object icon matching the object type at the position of the sound source virtual object in the map includes at least one of:
1) displaying a first object icon in the case that the object type indicates a virtual character type;
2) displaying a second object icon in the case that the object type indicates a vehicle type;
3) and in the case that the object type indicates the prop type, displaying a third object icon.
The description is made with reference to the example shown in fig. 10:
Assuming that the object type of the sound source virtual object is determined to be the virtual character type, a corresponding first object icon, such as the footprint icon 1002 shown in fig. 10, may be displayed to visually present the footstep sound generated by the virtual character in the virtual scene. If the object type of the sound source virtual object is determined to be the vehicle type, a corresponding second object icon, such as the vehicle icon 1004 shown in fig. 10, may be displayed to visually present the sound generated by the vehicle driving through the virtual scene. Assuming that the object type of the sound source virtual object is determined to be the prop type, a corresponding third object icon, such as the bullet icon 1006 shown in fig. 10, may be displayed to visually present the sound generated when the prop is used in the virtual scene, for example the firing sound of a gun represented by the bullet icon 1006.
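The object-type-to-icon selection above is a simple lookup. An illustrative Python sketch; the icon names are invented placeholders for the icons of fig. 10:

```python
# Object type -> icon, per the three cases above (names are illustrative).
ICON_BY_OBJECT_TYPE = {
    "character": "footprint_icon",  # first object icon (e.g. icon 1002)
    "vehicle":   "vehicle_icon",    # second object icon (e.g. icon 1004)
    "prop":      "bullet_icon",     # third object icon (e.g. icon 1006)
}

def icon_for(object_type):
    """Pick the object icon matching the sound source's object type."""
    return ICON_BY_OBJECT_TYPE.get(object_type, "default_icon")

icon = icon_for("prop")  # a gun firing is marked with the bullet icon
```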
Through the embodiment provided by the application, different object icons are configured for the sound source virtual objects of different object types, so that the types of the sound source virtual objects can be conveniently and visually seen from a map, and sound source positioning information is enriched.
As an optional scheme, before the marking and displaying the position of the sound source virtual object in the updated display event list in the map matched with the game task, the method further includes:
s1, acquiring the three-dimensional coordinates of the position of the sound source virtual object in the world coordinate system;
s2, converting the three-dimensional coordinates into mapping coordinates in a mapping coordinate system of the map;
s3, converting the mapping coordinates into display coordinates under a display interface coordinate system, wherein the display interface coordinate system is a coordinate system corresponding to a part of a display area displayed by the map in the game picture;
s4, the position where the sound source virtual object is displayed is marked in the map according to the display coordinates.
It should be noted that, when data is transferred, it carries the world coordinates of the position of the sound source virtual object that triggered the sound source trigger event, while the world coordinates of the virtual character controlled by the game application client (the local player) are a global variable. After the world coordinates are obtained, they need to be converted into display coordinates (User Interface (UI) coordinates) that can be shown in the map; since the two coordinate systems cannot be converted directly, this embodiment uses the mapping coordinates as an intermediate conversion between world coordinates and UI coordinates. The mapping coordinates are coordinates that take the map base image as the reference object. As shown in fig. 11, the world-coordinate-to-mapping-coordinate correspondence is determined by means of GPS calibration points ("GPS dotting"), and the mapping-coordinate-to-UI-coordinate conversion is determined by the ratio of the image pixels to the size of the UI control.
Specifically, referring to steps S1201 to S1210 shown in fig. 12: three GPS points are set in the game task, each containing two pieces of information: world coordinates and the corresponding mapping coordinate position. Therefore, given the world coordinates of any point, its mapping coordinates can be calculated from the three GPS points through linear operations, and the mapping coordinates to be displayed are then obtained by a weighted-average method.
After the mapping coordinate position to be displayed is obtained, the UI coordinates at which the icon for the sound source virtual object's position should be displayed can be derived through the conversion between mapping coordinates and UI coordinates. Since the map is always centered on the local player, and the local player's UI coordinate point is the center point of the map Prefab, the local player's mapping coordinates can also be calculated by the same GPS calibration method. Because the map rendered in the UI is a uniform scaling of the original artwork (no distortion is introduced), relative positions in mapping coordinates correspond directly to relative positions in UI coordinates.
From the sound source's world coordinates and the local player's world coordinates, the distance between the position of the sound source virtual object and the target position of the virtual character controlled by the game application client (the local player) can be obtained. To prevent the icon from overlapping the character's icon when displayed, if the distance is small (for example, within 50 m), the relative position on the map is adjusted to 50 m. Different icons to be displayed can also be set according to different distances.
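One way to realize the "linear operation plus weighted average" over three GPS calibration points is barycentric interpolation: the query point is expressed as a weighted average of the three anchors in world space, and the same weights are applied to their mapping coordinates. This is an illustrative Python sketch under invented anchor values, not the patent's exact computation:

```python
def world_to_map(p, anchors):
    """Convert world coordinates to mapping coordinates via three GPS points.
    Each anchor is (world_x, world_y, map_u, map_v)."""
    (x1, y1, u1, v1), (x2, y2, u2, v2), (x3, y3, u3, v3) = anchors
    # Barycentric weights of p with respect to the three world positions.
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (p[0] - x3) + (x3 - x2) * (p[1] - y3)) / det
    w2 = ((y3 - y1) * (p[0] - x3) + (x1 - x3) * (p[1] - y3)) / det
    w3 = 1.0 - w1 - w2
    # Weighted average of the anchors' mapping coordinates.
    return (w1 * u1 + w2 * u2 + w3 * u3, w1 * v1 + w2 * v2 + w3 * v3)

# Illustrative anchors: the map image is the world scaled down by 10.
anchors = [(0, 0, 0, 0), (1000, 0, 100, 0), (0, 1000, 0, 100)]
uv = world_to_map((500, 250), anchors)  # -> (50.0, 25.0)
```

The subsequent mapping-to-UI step would then be a fixed pixel-to-control-size ratio applied to these coordinates.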
Through the embodiment provided by the application, the position of the sound source virtual object can be accurately presented in the map through coordinate conversion, so that the visual processing of the sound source virtual object is achieved, and the problem of poor positioning accuracy caused by sound listening position distinguishing is avoided.
As an alternative, marking and displaying the position of the sound source virtual object in the map according to the display coordinates includes:
1) when the display coordinates are located within the currently displayed display area of the map, marking and displaying the position of the sound source virtual object in the display area of the map according to the display coordinates;
2) when the display coordinates are located outside the currently displayed display area of the map, determining the boundary of the currently displayed display area closest to the display coordinates, and marking the direction of the position of the sound source virtual object on that boundary.
The description will be made with reference to fig. 13. If the display coordinates of the sound source virtual object at position D are determined to be within the currently displayed display area of the map 1302, the position can be marked and displayed directly. If, on the other hand, the display coordinates of another sound source virtual object are located to the north, outside the display area currently displayed by the map 1302, the icon of that sound source virtual object may be displayed on the northern boundary of the map 1302, such as the icon at position E shown in fig. 13.
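Pinning an off-screen icon to the nearest boundary can be sketched as clamping the display coordinates to the display area's bounds. An illustrative Python sketch; the bounds and the test coordinates are invented:

```python
def clamp_to_display_area(x, y, x_min=0, y_min=0, x_max=200, y_max=200):
    """Return whether (x, y) is inside the display area, and the coordinates
    at which to draw the icon (unchanged if inside, pinned to the nearest
    edge if outside, which indicates the direction of the sound source)."""
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    cx = min(max(x, x_min), x_max)
    cy = min(max(y, y_min), y_max)
    return inside, (cx, cy)

on_map  = clamp_to_display_area(120, 80)   # inside: marked in place (position D)
off_map = clamp_to_display_area(120, 260)  # north of the area: pinned to the
                                           # northern edge (position E)
```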
Through the embodiment provided by the application, the icon of the sound source virtual object positioned in the display area is marked and displayed on the map, and the icon of the sound source virtual object positioned outside the display area can also be marked and displayed on the map, so that the marked and displayed range of the sound source virtual object is expanded.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
According to another aspect of the embodiments of the present invention, there is also provided an audio source localization apparatus for implementing the audio source localization method. As shown in fig. 14, the apparatus includes:
1) a first obtaining unit 1402, configured to obtain at least one sound source trigger event list generated during a running process of a game task when a game application client updates a game screen in a currently displayed game task, where a sound source trigger event in each sound source trigger event list is an event for generating sound triggered by a type of sound source virtual object participating in the game task;
2) a comparing unit 1404, configured to compare at least one sound source trigger event list with a display event list matched with the game task;
3) an updating unit 1406, configured to update the display event list according to the comparison result;
4) a display unit 1408, configured to mark and display, on the map matching the game task, the position of the sound source virtual object in the updated display event list.
Optionally, in this embodiment, the sound source positioning method may be applied, but is not limited, to a game application, for example, to visually locate the position of each sound source virtual object participating in a game task, so that a player can intuitively and accurately identify the position of the sound source virtual object in the game scene. The game application may be a Multiplayer Online Battle Arena (MOBA) application or a Single-Player Game (SPG) application. The types of game applications may include, but are not limited to, at least one of: Two-Dimensional (2D) game applications, Three-Dimensional (3D) game applications, Virtual Reality (VR) game applications, Augmented Reality (AR) game applications, and Mixed Reality (MR) game applications. The above is merely an example, and this embodiment is not limited thereto.
Further, the above game application may be, but is not limited to, a shooting game application. The shooting game application may be a Third-Person Shooting (TPS) game application, which is executed from the viewpoint of a third-party character object other than the virtual character controlled by the current player, or a First-Person Shooting (FPS) game application, which is executed from the viewpoint of the virtual character controlled by the current player. Correspondingly, the sound source virtual object that generates sound during the running of the game task may be, but is not limited to: a virtual character (also referred to as a Player Character) controlled by a player through a game application client, a Non-Player Character (NPC), a prop object (such as a gun) controlled by a virtual character, or a carrier object (such as a vehicle) controlled by a virtual character. The above is merely an example, and this embodiment is not limited thereto.
Optionally, in this embodiment, the event information of each sound source trigger event recorded in the sound source trigger event list may include, but is not limited to: the object identifier of the sound source virtual object, the event identifier and event type of the sound source trigger event, the trigger time, and the end time. The event information of each display event recorded in the display event list may likewise include, but is not limited to: the object identifier of the sound source virtual object, the event identifier and event type of the sound source trigger event, the trigger time, and the end time. The above is an example, and the event information is not limited to this list. In addition, the sound source trigger event and the display event may also include other different information, which is not described here again.
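As a minimal illustration of the event information described above, the structure can be sketched in Python as follows; all class and field names (SoundSourceEvent and so on) are illustrative assumptions rather than part of the embodiments.

```python
from dataclasses import dataclass

# Illustrative sketch of the event information described above; all names are assumptions.
@dataclass
class SoundSourceEvent:
    object_id: str       # object identifier of the sound source virtual object
    event_id: str        # event identifier of the sound source trigger event
    event_type: str      # e.g. "shooting", "vehicle_move", "object_move"
    trigger_time: float  # time at which the event was triggered
    end_time: float      # time at which the event ends

# A sound source trigger event list groups events of one type; the display
# event list holds the events currently marked on the map.
shooting_list = [SoundSourceEvent("player_7", "evt_1", "shooting", 10.0, 10.5)]
display_list = list(shooting_list)
```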
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
Through the embodiments provided in this application, the display event list is used to control the visual display of the positions of the sound source virtual objects that generate sound in the game task, rather than distinguishing the position of a target sound source virtual object by listening. The display event list is dynamically updated according to the different types of sound source trigger events triggered during the running of the game task. Dynamic and accurate visual positioning of the positions of sound source virtual objects is thereby achieved, improving sound source positioning accuracy and solving the problem of poor positioning accuracy in the listening-based position discrimination provided by the related art.
As an alternative, the updating unit 1406 includes at least one of the following:
1) the first updating module is used for acquiring a first target sound source trigger event which is not found in the display event list from at least one sound source trigger event list; adding the event information of the first target sound source trigger event into a display event list so as to update the display event list;
2) and the second updating module is used for removing the event information corresponding to the second target sound source trigger event from the display event list under the condition that the second target sound source trigger event searched in the display event list is obtained and the second target sound source trigger event is determined to reach the end time so as to update the display event list.
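The two updating modules above can be sketched together as follows; the dict-based event representation is an assumption for illustration only.

```python
def update_display_list(display_list, source_lists, now):
    """Update the display event list from the trigger event lists: add events
    not yet displayed, then drop displayed events whose end time has passed.
    Events are modelled as plain dicts with "event_id" and "end_time" keys
    (an assumed representation, not the patent's data format)."""
    shown = {e["event_id"] for e in display_list}
    for trigger_list in source_lists:
        for event in trigger_list:
            if event["event_id"] not in shown:   # first updating module
                display_list.append(event)
                shown.add(event["event_id"])
    # second updating module: remove events that have reached their end time
    display_list[:] = [e for e in display_list if e["end_time"] > now]
    return display_list
```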
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the comparing unit 1404 includes:
1) the processing module is used for traversing each sound source trigger event list in at least one sound source trigger event list and executing the following operations:
s1, acquiring a current sound source trigger event;
s2, searching the object identification of the current sound source virtual object in the current sound source trigger event in the display event list;
s3, determining the current sound source virtual object as the sound source virtual object to be added in the display event list under the condition that the object identification of the current sound source virtual object is not found in the display event list;
s4, under the condition that the object identification of the current sound source virtual object is found in the display event list, the end time of the current sound source virtual object is obtained; under the condition that the end time of the current sound source virtual object is reached, determining the current sound source virtual object as a sound source virtual object to be removed in the display event list; under the condition that the end time of the current sound source virtual object is not reached, reserving event information corresponding to the current sound source trigger event in a display event list;
and S5, acquiring the next sound source trigger event as the current sound source trigger event.
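The traversal in steps S1 to S5 can be sketched as follows; the display event list is modelled as an index from object identifier to its displayed event, which is an assumed representation.

```python
def compare_lists(source_lists, display_index, now):
    """Traverse each sound source trigger event list (steps S1-S5) and classify
    objects as to-be-added, to-be-removed, or retained. display_index maps an
    object identifier to its displayed event (an illustrative structure)."""
    to_add, to_remove = [], []
    for trigger_list in source_lists:
        for event in trigger_list:              # S1/S5: take each event in turn
            oid = event["object_id"]
            shown = display_index.get(oid)      # S2: look up the object identifier
            if shown is None:
                to_add.append(event)            # S3: not found -> to be added
            elif now >= shown["end_time"]:
                to_remove.append(oid)           # S4: end time reached -> to be removed
            # S4: end time not reached -> retain the existing entry as-is
    return to_add, to_remove
```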
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an optional solution, the processing module is further configured to:
s1, before the current sound source trigger event is obtained, the comparison priority of each sound source trigger event list in at least one sound source trigger event list is determined;
and S2, traversing each sound source trigger event list in at least one sound source trigger event list according to the comparison priority, and sequentially acquiring the current sound source trigger event.
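The priority-ordered traversal can be sketched as follows; the priority values and the dict-based list representation are assumptions for illustration.

```python
# Assumed comparison priorities: smaller value is compared first (illustrative).
COMPARE_PRIORITY = {"shooting": 0, "vehicle_move": 1, "object_move": 2}

def ordered_lists(source_lists):
    """Sort the sound source trigger event lists by comparison priority, so
    traversal visits higher-priority lists first. Each list is modelled as a
    dict carrying a "type" key (an assumed representation)."""
    return sorted(source_lists, key=lambda lst: COMPARE_PRIORITY[lst["type"]])
```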
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the first obtaining unit 1402 includes:
1) the detection module is used for detecting a sound source trigger event in the running process of the game task;
2) the first acquisition module is used for acquiring the event type of the sound source trigger event under the condition that the sound source trigger event is detected;
3) and the adding module is used for adding the sound source trigger event into the corresponding sound source trigger event list according to the event type.
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an optional solution, the apparatus further includes:
1) the second acquisition module is used for acquiring the sound source position where the sound source trigger event is triggered before acquiring the event type of the sound source trigger event;
2) the first determining module is used for determining the distance between the position of the sound source and the target position of the virtual character controlled by the game application client;
3) a second determining module, configured to determine the event type of the sound source trigger event when the distance is smaller than a first threshold, where the event information of the sound source trigger event is to be recorded in the sound source trigger event list and includes: the object identifier of the sound source virtual object in the sound source trigger event, the event type, the trigger time of the sound source trigger event, and the end time of the sound source trigger event.
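The distance check described above can be sketched as follows; the threshold value is game-specific and assumed here.

```python
import math

def should_record(sound_pos, player_pos, first_threshold):
    """Record a detected sound source trigger event only when the sound source
    position is within the first threshold of the position of the virtual
    character controlled by this client. Positions are (x, y, z) tuples; the
    actual threshold value is an assumption left to the game's configuration."""
    return math.dist(sound_pos, player_pos) < first_threshold
```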
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the adding module includes:
1) the first adding submodule is used for adding the sound source triggering event into the shooting triggering event list under the condition that the sound source triggering event is a shooting event;
2) the second adding submodule is used for adding the sound source trigger event into the carrier trigger event list under the condition that the sound source trigger event is the carrier moving event;
3) and the third adding submodule is used for adding the sound source trigger event into the object trigger event list under the condition that the sound source trigger event is the object movement type event.
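The routing performed by the three adding sub-modules can be sketched as follows; the event-type strings are illustrative assumptions.

```python
def add_to_trigger_list(event, lists):
    """Route a detected sound source trigger event to the trigger event list
    matching its event type. `lists` maps an event type to its list; both the
    keys and the event representation are assumptions for illustration."""
    if event["event_type"] == "shooting":
        lists["shooting"].append(event)        # first adding sub-module
    elif event["event_type"] == "vehicle_move":
        lists["vehicle_move"].append(event)    # second adding sub-module
    elif event["event_type"] == "object_move":
        lists["object_move"].append(event)     # third adding sub-module
    return lists
```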
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the display unit 1408 includes:
1) the third determining module is used for determining the display priority of each sound source triggering event recorded in the updated display event list;
2) the first display module is used for displaying the position of the sound source virtual object in each sound source triggering event according to the display priority, wherein the display priority of the shooting event is higher than that of the carrier moving event, and the display priority of the carrier moving event is higher than that of the object moving event.
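The priority-based display can be sketched as follows; the numeric priority values are assumptions that merely encode the ordering stated above (shooting event > carrier moving event > object moving event).

```python
# Assumed numeric encoding of the display priorities described above.
DISPLAY_PRIORITY = {"shooting": 3, "vehicle_move": 2, "object_move": 1}

def display_order(display_list):
    """Return the display events sorted so that higher-priority events are
    drawn last, i.e. on top of lower-priority markers on the map."""
    return sorted(display_list, key=lambda e: DISPLAY_PRIORITY[e["event_type"]])
```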
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the display unit 1408 includes:
1) the third acquisition module is used for acquiring the object type of the sound source virtual object;
2) and the second display module is used for displaying the object icon matched with the object type at the position of the sound source virtual object in the map.
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the second display module includes at least one of:
1) a first display sub-module, configured to display a first object icon when the object type indicates a virtual character type; display a second object icon when the object type indicates a carrier type; and display a third object icon when the object type indicates a prop type.
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the display unit 1408 further includes:
1) a fourth obtaining module, configured to obtain, before the position of the sound source virtual object in the updated display event list is marked and displayed in the map matched with the game task, the three-dimensional coordinates of that position in the world coordinate system;
2) a first conversion module, configured to convert the three-dimensional coordinates into map coordinates under the map coordinate system of the map;
3) a second conversion module, configured to convert the map coordinates into display coordinates under a display interface coordinate system, where the display interface coordinate system is the coordinate system corresponding to the partial display area in which the map is displayed in the game picture;
4) a third display module, configured to mark and display the position of the sound source virtual object in the map according to the display coordinates.
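The two coordinate conversions can be sketched as follows; the transform parameters (map_origin, map_scale, view_offset, view_scale) are assumptions, since a real engine supplies its own transforms.

```python
def world_to_display(world_xyz, map_origin, map_scale, view_offset, view_scale):
    """Sketch of the two-step conversion: world (3D) -> map coordinates (2D)
    -> display interface coordinates. All parameter names are illustrative
    assumptions about how a particular engine might express its transforms."""
    x, y, _z = world_xyz                       # height is ignored on a 2D minimap
    map_x = (x - map_origin[0]) * map_scale    # first conversion module
    map_y = (y - map_origin[1]) * map_scale
    disp_x = map_x * view_scale + view_offset[0]  # second conversion module
    disp_y = map_y * view_scale + view_offset[1]
    return disp_x, disp_y
```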
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
As an alternative, the third display module includes:
1) a second display sub-module, configured to: when the display coordinates are located within the currently displayed area of the map, mark and display the position of the sound source virtual object within that area on the map according to the display coordinates; and when the display coordinates are located outside the currently displayed area of the map, determine the boundary of the currently displayed area closest to the display coordinates, and mark on that boundary the direction of the position of the sound source virtual object.
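The boundary handling of the second display sub-module can be sketched as a clamp; the rectangle representation of the display area is an assumption.

```python
def clamp_to_view(disp_xy, view_rect):
    """If the display coordinates fall outside the currently displayed map
    area, clamp the icon to the nearest boundary of that area so that only the
    direction of the sound source is shown. view_rect = (left, top, right,
    bottom) is an assumed representation of the displayed area."""
    left, top, right, bottom = view_rect
    x, y = disp_xy
    inside = left <= x <= right and top <= y <= bottom
    if inside:
        return (x, y), True   # mark the actual position within the area
    # otherwise mark the direction on the closest boundary
    return (min(max(x, left), right), min(max(y, top), bottom)), False
```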
For specific examples, reference may be made to the above method embodiments; this embodiment is not limited thereto.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device for implementing the sound source localization method, as shown in fig. 15, the electronic device includes a memory 1502 and a processor 1504, the memory 1502 stores a computer program, and the processor 1504 is configured to execute the steps in any one of the above method embodiments through the computer program.
Optionally, in this embodiment, the electronic apparatus may be located in at least one network device of a plurality of network devices of a computer network.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, when the game application client updates the game screen in the current game task, at least one sound source trigger event list generated in the running process of the game task is obtained, wherein the sound source trigger event in each sound source trigger event list is an event for generating sound triggered by a sound source virtual object of the same type participating in the game task;
s2, comparing at least one sound source trigger event list with the display event list matched with the game task;
s3, updating the display event list according to the comparison result;
s4, the position of the sound source virtual object in the updated display event list is marked and displayed on the map matched with the game task.
Alternatively, as can be understood by those skilled in the art, the structure shown in fig. 15 is only illustrative. The electronic device may also be a terminal device such as a smartphone (e.g., an Android phone or an iOS phone), a tablet computer, a palmtop computer, or a Mobile Internet Device (MID). Fig. 15 does not limit the structure of the electronic device; for example, the electronic device may include more or fewer components (such as network interfaces) than shown in fig. 15, or have a different configuration from that shown in fig. 15.
The memory 1502 may be used to store software programs and modules, such as the program instructions/modules corresponding to the sound source positioning method and apparatus in the embodiments of the present invention. The processor 1504 executes various functional applications and data processing by running the software programs and modules stored in the memory 1502, that is, implements the sound source positioning method described above. The memory 1502 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some instances, the memory 1502 may further include memory located remotely from the processor 1504, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. The memory 1502 may be used to store information such as the sound source trigger event list and the display event list, but is not limited thereto. As an example, as shown in fig. 15, the memory 1502 may include, but is not limited to, the first obtaining unit 1402, the comparing unit 1404, the updating unit 1406, and the display unit 1408 of the sound source positioning apparatus. The memory may further include, but is not limited to, other module units of the sound source positioning apparatus, which are not described again in this example.
Optionally, the transmission device 1506 is used to receive or transmit data via a network. Examples of the network may include wired networks and wireless networks. In one example, the transmission device 1506 includes a Network Interface Controller (NIC), which can be connected to a router and other network devices via a network cable so as to communicate with the internet or a local area network. In another example, the transmission device 1506 is a Radio Frequency (RF) module, which is used to communicate with the internet wirelessly.
In addition, the electronic device further includes: a display 1508, configured to display an icon corresponding to a position of the sound source virtual object; and a connection bus 1510 for connecting the respective module parts in the above-described electronic apparatus.
According to a further aspect of an embodiment of the present invention, there is also provided a computer-readable storage medium having a computer program stored thereon, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the above-mentioned computer-readable storage medium may be configured to store a computer program for executing the steps of:
s1, when the game application client updates the game screen in the current game task, at least one sound source trigger event list generated in the running process of the game task is obtained, wherein the sound source trigger event in each sound source trigger event list is an event for generating sound triggered by a sound source virtual object of the same type participating in the game task;
s2, comparing at least one sound source trigger event list with the display event list matched with the game task;
s3, updating the display event list according to the comparison result;
s4, the position of the sound source virtual object in the updated display event list is marked and displayed on the map matched with the game task.
Alternatively, in this embodiment, a person skilled in the art may understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: flash disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also fall within the protection scope of the present invention.

Claims (15)

1. A method for locating a sound source, comprising:
when a game application client updates a game screen in a currently displayed game task, acquiring at least one sound source trigger event list generated in the running process of the game task, wherein the sound source trigger event in each sound source trigger event list is an event for generating sound triggered by a sound source virtual object of a type participating in the game task;
comparing the at least one sound source trigger event list with a display event list matched with the game task;
updating the display event list according to the comparison result;
and marking and displaying the position of the sound source virtual object in the updated display event list in a map matched with the game task.
2. The method of claim 1, wherein the updating the display event list according to the comparison comprises at least one of:
acquiring a first target sound source trigger event which is not found in the display event list from the at least one sound source trigger event list; adding event information of the first target sound source trigger event into the display event list to update the display event list;
and under the condition that a second target sound source trigger event searched in the display event list is obtained and the second target sound source trigger event is determined to reach the end time, removing event information corresponding to the second target sound source trigger event from the display event list so as to update the display event list.
3. The method of claim 1, wherein the comparing the at least one audio source triggering event list with the display event list matching the game task comprises:
traversing each of the audio source trigger event lists in the at least one audio source trigger event list, and executing the following operations:
acquiring a current sound source trigger event;
searching the object identification of the current sound source virtual object in the current sound source triggering event in the display event list;
determining the current sound source virtual object as a sound source virtual object to be added in the display event list under the condition that the object identifier of the current sound source virtual object is not found in the display event list;
under the condition that the object identification of the current sound source virtual object is found in the display event list, acquiring the end time of the current sound source virtual object; determining the current sound source virtual object as a sound source virtual object to be removed in the display event list under the condition that the end time of the current sound source virtual object is reached; under the condition that the end time of the current sound source virtual object is not reached, reserving event information corresponding to the current sound source trigger event in the display event list;
and acquiring the next sound source trigger event as the current sound source trigger event.
4. The method of claim 3, further comprising, prior to said obtaining a current audio source triggering event:
determining the comparison priority of each sound source trigger event list in the at least one sound source trigger event list;
and traversing each sound source trigger event list in the at least one sound source trigger event list according to the comparison priority, and sequentially acquiring the current sound source trigger event.
5. The method of claim 1, wherein obtaining at least one audio source triggering event list that has been generated during the execution of the game task comprises:
detecting the sound source trigger event in the running process of the game task;
under the condition that the sound source trigger event is detected, acquiring the event type of the sound source trigger event;
and adding the sound source trigger event into the corresponding sound source trigger event list according to the event type.
6. The method according to claim 5, wherein before the obtaining the event type of the audio source triggering event, further comprising:
acquiring a sound source position where the sound source trigger event is triggered;
determining the distance between the sound source position and the target position of the virtual character controlled by the game application client;
determining an event type of the sound source trigger event when the distance is smaller than a first threshold, wherein event information of the sound source trigger event is to be recorded in the sound source trigger event list, and the event information of the sound source trigger event includes: an object identifier of the sound source virtual object in the sound source trigger event, the event type, a trigger time of the sound source trigger event, and an end time of the sound source trigger event.
7. The method according to claim 5, wherein the adding the audio source trigger event to the corresponding audio source trigger event list according to the event type comprises:
adding the sound source trigger event into a shooting trigger event list under the condition that the sound source trigger event is a shooting event;
adding the sound source trigger event into a carrier trigger event list under the condition that the sound source trigger event is a carrier movement event;
and adding the sound source trigger event into an object trigger event list under the condition that the sound source trigger event is an object movement type event.
8. The method according to claim 7, wherein the marking and displaying the position of the sound source virtual object in the updated display event list in the map matched with the game task comprises:
determining the display priority of each sound source triggering event recorded in the updated display event list;
and displaying the position of the sound source virtual object in each sound source triggering event according to the display priority, wherein the display priority of the shooting event is higher than that of the vehicle moving event, and the display priority of the vehicle moving event is higher than that of the object moving event.
9. The method according to any one of claims 1 to 8, wherein the marking and displaying the position of the sound source virtual object in the updated display event list in the map matched with the game task comprises:
acquiring the object type of the sound source virtual object;
and displaying an object icon matched with the object type at the position of the sound source virtual object in the map.
10. The method of claim 9, wherein displaying an object icon matching the object type at the position of the audio source virtual object in the map comprises at least one of:
displaying a first object icon in the case that the object type indicates a virtual character type;
displaying a second object icon if the object type indicates a carrier type;
and displaying a third object icon when the object type indicates a prop type.
11. The method according to any one of claims 1 to 8, wherein the marking and displaying, in the map matched with the game task, of the position of the sound source virtual object in the updated display event list further comprises:
acquiring a three-dimensional coordinate of the position of the sound source virtual object in a world coordinate system;
converting the three-dimensional coordinates into map coordinates under a map coordinate system of the map;
converting the map coordinates into display coordinates under a display interface coordinate system, wherein the display interface coordinate system is a coordinate system corresponding to a partial display area in which the map is displayed in the game picture;
and marking and displaying the position of the sound source virtual object in the map according to the display coordinate.
12. The method of claim 11, wherein said marking a position in the map where the audio source virtual object is displayed according to the display coordinates comprises:
when the display coordinates are located in the currently displayed display area of the map, marking and displaying the position of the sound source virtual object in the display area on the map according to the display coordinates;
when the display coordinates are located outside the currently displayed display area of the map, determining the boundary of the currently displayed display area closest to the display coordinates; and marking and displaying the direction of the position of the sound source virtual object on the boundary.
13. A sound source localization apparatus, comprising:
a first obtaining unit, configured to obtain at least one sound source trigger event list generated during the running of a game task when a game application client updates a game picture in a currently displayed game task, wherein the sound source trigger event in each sound source trigger event list is an event for generating sound triggered by one type of sound source virtual object participating in the game task;
the comparison unit is used for comparing the at least one sound source trigger event list with the display event list matched with the game task;
the updating unit is used for updating the display event list according to the comparison result;
and the display unit is used for marking and displaying the position of the sound source virtual object in the updated display event list in a map matched with the game task.
14. A computer-readable storage medium comprising a stored program, wherein the program, when executed, performs the method of any one of claims 1 to 12.
15. An electronic device comprising a memory and a processor, wherein the memory stores a computer program and the processor is arranged to execute the method of any one of claims 1 to 12 by means of the computer program.
CN201911176825.8A 2019-11-26 2019-11-26 Sound source positioning method and device, storage medium and electronic device Active CN110882540B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911176825.8A CN110882540B (en) 2019-11-26 2019-11-26 Sound source positioning method and device, storage medium and electronic device


Publications (2)

Publication Number Publication Date
CN110882540A CN110882540A (en) 2020-03-17
CN110882540B true CN110882540B (en) 2021-04-09

Family

ID=69748860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911176825.8A Active CN110882540B (en) 2019-11-26 2019-11-26 Sound source positioning method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN110882540B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111475268B (en) * 2020-04-01 2023-05-05 腾讯科技(深圳)有限公司 Task item distribution method, device, equipment and readable storage medium
WO2023075706A2 (en) * 2021-11-01 2023-05-04 Garena Online Private Limited Method of using scriptable objects to insert audio features into a program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109358853A (en) * 2018-09-30 2019-02-19 武汉斗鱼网络科技有限公司 A kind of control exposure method, apparatus and readable storage medium storing program for executing

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005107903A1 (en) * 2004-05-10 2005-11-17 Sega Corporation Electronic game machine, data processing method in electronic game machine, program and storage medium for the same
CN102646152B * 2011-02-22 2015-10-21 Tencent Technology (Shenzhen) Co., Ltd. Map changing method and system
JP5901828B1 * 2015-08-20 2016-04-13 Cygames, Inc. Information processing system, program, and server
JP6427079B2 * 2015-09-16 2018-11-21 Capcom Co., Ltd. Game program and game system
CN106454438B * 2016-11-16 2019-12-10 Tencent Technology (Shenzhen) Co., Ltd. Data processing method, related device, and system
CN107890673A * 2017-09-30 2018-04-10 NetEase (Hangzhou) Network Co., Ltd. Visual display method and apparatus, storage medium, and device for compensating sound information
CN107992281A * 2017-10-27 2018-05-04 NetEase (Hangzhou) Network Co., Ltd. Visual display method and apparatus, storage medium, and device for compensating sound information
CN108014495A * 2017-11-23 2018-05-11 NetEase (Hangzhou) Network Co., Ltd. Method, storage medium, and electronic device for visually compensating sound information
CN108854069B * 2018-05-29 2020-02-07 Tencent Technology (Shenzhen) Co., Ltd. Sound source determination method and device, storage medium and electronic device


Also Published As

Publication number Publication date
CN110882540A (en) 2020-03-17

Similar Documents

Publication Publication Date Title
CN108597530B (en) Sound reproducing method and apparatus, storage medium and electronic apparatus
CN108744516B (en) Method and device for acquiring positioning information, storage medium and electronic device
US9836889B2 (en) Executable virtual objects associated with real objects
CN108854069B (en) Sound source determination method and device, storage medium and electronic device
CN109876444B (en) Data display method and device, storage medium and electronic device
CN111640171B (en) Historical scene explanation method and device, electronic equipment and storage medium
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
CN110882540B (en) Sound source positioning method and device, storage medium and electronic device
CN111744202A (en) Method and device for loading virtual game, storage medium and electronic device
CN112774203B (en) Pose control method and device of virtual object and computer storage medium
CN110898430B (en) Sound source positioning method and device, storage medium and electronic device
CN111558221B (en) Virtual scene display method and device, storage medium and electronic equipment
JP6859436B2 (en) Augmented reality software application support
CN113041611B (en) Virtual prop display method and device, electronic equipment and readable storage medium
US8497902B2 (en) System for locating a display device using a camera on a portable device and a sensor on a gaming console and method thereof
JP2014217566A (en) Hunting game distribution system
CN109395387B (en) Three-dimensional model display method and device, storage medium and electronic device
CN110801629B (en) Method, device, terminal and medium for displaying virtual object life value prompt graph
WO2018177113A1 (en) Method and device for displaying account information in client, and storage medium
CN111265861A (en) Display method and device of virtual prop, storage medium and electronic device
WO2021093703A1 (en) Interaction method and system based on optical communication apparatus
CN111277866B (en) Method and related device for controlling VR video playing
KR102587645B1 (en) System and method for precise positioning using touchscreen gestures
CN112927293A (en) AR scene display method and device, electronic equipment and storage medium
CN112957732A (en) Searching method, searching device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40022626

Country of ref document: HK

GR01 Patent grant