CN112245917B - Virtual object control method, device, equipment and storage medium - Google Patents

Virtual object control method, device, equipment and storage medium

Info

Publication number: CN112245917B
Application number: CN202011266337.9A
Authority: CN (China)
Other versions: CN112245917A (Chinese, zh)
Inventor: 刘智洪
Applicant / assignee: Tencent Technology (Shenzhen) Co., Ltd.
Legal status: Active (granted)
Prior art keywords: virtual, target, prop, terminal, audio

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/54 Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets

Abstract

The application discloses a virtual object control method, apparatus, device, and storage medium, belonging to the field of computer technology. With the technical solution provided by the embodiments of this application, a user can throw, during a game, a target virtual prop that simulates the sound of virtual firearm fire. After the target virtual prop contacts a target object, the terminal plays the audio of virtual firearm fire in the virtual scene. The user can use the target virtual prop to confuse enemies and lure them into the area where a trap is located. While confusing enemies, the user does not need to manually control the controlled virtual object to shoot and run, which improves the efficiency of human-computer interaction.

Description

Virtual object control method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for controlling a virtual object.
Background
With the development of multimedia technology and the diversification of terminal functions, more and more games can be played on terminals. Shooting games are among the most popular; during a game, a user can control a virtual object to compete in a virtual scene against virtual objects controlled by other users.
In the related art, a user can control a virtual object to attack virtual objects of other teams using various virtual firearms. Some users can judge the position of an enemy and the type of virtual firearm from the sound the firearm makes when fired. During the game, a user can also use the sound of virtual firearm fire to confuse enemies and lure them into a specific area.
However, confusing enemies with the sound of virtual firearm fire requires the user to control the virtual object to shoot and run around in the virtual scene, which results in low efficiency of human-computer interaction.
Disclosure of Invention
The embodiments of this application provide a virtual object control method, apparatus, device, and storage medium, which can improve the efficiency of human-computer interaction. The technical solution is as follows:
in one aspect, a virtual object control method is provided, and the method includes:
displaying a view screen of a controlled virtual object, wherein the controlled virtual object is equipped with a target virtual prop, and the target virtual prop is used for simulating the sound of at least one virtual firearm firing in a virtual scene;
controlling the controlled virtual object to throw the target virtual prop in the virtual scene in response to a throwing instruction for the target virtual prop;
in response to the target virtual prop contacting a target object, playing the audio of the at least one virtual firearm firing in a target area in the virtual scene, wherein the target area is the area where the target virtual prop contacts the target object.
In one aspect, there is provided a virtual object control apparatus, the apparatus including:
the apparatus comprises a display module, a control module, and a playing module, wherein the display module is used for displaying a view screen of a controlled virtual object, the controlled virtual object is equipped with a target virtual prop, and the target virtual prop is used for simulating the sound of at least one virtual firearm firing in a virtual scene;
the control module is used for controlling, in response to a throwing instruction for the target virtual prop, the controlled virtual object to throw the target virtual prop in the virtual scene;
and the playing module is used for playing, in response to the target virtual prop contacting a target object, the audio of the at least one virtual firearm firing in a target area in the virtual scene, wherein the target area is the area where the target virtual prop contacts the target object.
In a possible implementation, a prop throwing control is displayed on the view screen, and the control module is configured to, in response to detecting a touch operation on the throwing control, display a preview throwing trajectory of the target virtual prop in the virtual scene on the view screen; and, in response to no longer detecting the touch operation on the throwing control, control the controlled virtual object to throw the target virtual prop along the preview throwing trajectory.
In a possible implementation, the target object is the virtual ground in the virtual scene, and the playing module is configured to control the target virtual prop to enter a triggered state in response to the target virtual prop contacting the virtual ground in the virtual scene; and play the audio in the target area in response to the target virtual prop being in the triggered state.
In a possible implementation, the target virtual prop can emit a detection ray of a target length after being thrown, and the playing module is configured to, in response to detecting via the detection ray the material corresponding to any virtual ground in the virtual scene, determine that the target virtual prop contacts the virtual ground and control the target virtual prop to enter the triggered state.
In a possible implementation, the control module is further configured to control the target virtual prop to rebound in response to the target virtual prop contacting a virtual wall in the virtual scene.
In a possible implementation, the playing module is configured to determine at least one location point in the target area in response to the target virtual prop contacting the target object; and play the audio at the at least one location point.
In a possible implementation, the playing module is configured to obtain a first coordinate of the center point of the target area in the virtual scene and the radius of the target area; obtain a coordinate offset value, wherein the coordinate offset value is smaller than the radius; and obtain a second coordinate of the at least one location point in the virtual scene based on the coordinate offset value and the first coordinate.
In a possible implementation, the playing module is configured to play the audio at any target location point randomly determined from the at least one location point.
In a possible implementation, the playing module is configured to simultaneously play, in the target area, the audio of at least two virtual firearms firing in response to the target virtual prop contacting the target object.
In a possible implementation, the playing module is configured to, in response to the target virtual prop contacting the target object, play the audio of a target virtual firearm firing, where the target virtual firearm is a virtual firearm equipped by a virtual object in the team of the controlled virtual object.
In a possible implementation, the display module is further configured to display virtual smoke in the target area.
In a possible implementation, the playing module is further configured to stop playing the audio in response to the playing duration of the audio being greater than or equal to a target duration.
In one aspect, a computer device is provided. The computer device comprises one or more processors and one or more memories, the one or more memories storing at least one computer program that is loaded and executed by the one or more processors to implement the virtual object control method described above.
In one aspect, a computer-readable storage medium is provided, storing at least one computer program that is loaded and executed by a processor to implement the virtual object control method described above.
In one aspect, a computer program product or a computer program is provided, comprising program code stored in a computer-readable storage medium. A processor of a computer device reads the program code from the computer-readable storage medium and executes it, causing the computer device to perform the virtual object control method described above.
With the technical solution provided by the embodiments of this application, a user can throw, during a game, a target virtual prop that simulates the sound of virtual firearm fire. After the target virtual prop contacts a target object, the terminal plays the audio of virtual firearm fire in the virtual scene. The user can use the target virtual prop to confuse enemies and lure them into the area where a trap is located. While confusing enemies, the user does not need to manually control the controlled virtual object to shoot and run, which improves the efficiency of human-computer interaction.
Drawings
To illustrate the technical solutions in the embodiments of this application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of this application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a virtual object control method provided in an embodiment of the present application;
fig. 2 is a schematic view of a view screen according to an embodiment of the present application;
FIG. 3 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 4 is a schematic view of an interface provided by an embodiment of the present application;
fig. 5 is a flowchart of a virtual object control method according to an embodiment of the present application;
fig. 6 is a flowchart of a virtual object control method according to an embodiment of the present application;
FIG. 7 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 8 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 9 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 10 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 11 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 12 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 13 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 14 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 15 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 16 is a logic block diagram of a method for controlling a virtual object according to an embodiment of the present application;
fig. 17 is a schematic structural diagram of a virtual object control apparatus according to an embodiment of the present application;
fig. 18 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 19 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms "first," "second," and the like in this application are used for distinguishing between similar items and items that have substantially the same function or similar functionality, and it should be understood that "first," "second," and "nth" do not have any logical or temporal dependency or limitation on the number or order of execution.
The term "at least one" in this application refers to one or more, and the meaning of "a plurality" refers to two or more.
Virtual scene: the scene that an application displays (or provides) when running on a terminal. The virtual scene may be a simulation of the real world, a semi-simulated, semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene; the dimension of the virtual scene is not limited in the embodiments of this application. For example, the virtual scene may include sky, land, and ocean, the land may include environmental elements such as deserts and cities, and the user can control a virtual object to move in the virtual scene.
Virtual object: a movable object in a virtual scene. The movable object can be a virtual character, a virtual animal, an anime character, or the like, such as a character, animal, plant, oil drum, wall, or rock displayed in the virtual scene. The virtual object may be a virtual avatar that represents the user in the virtual scene. A virtual scene may include multiple virtual objects, each having its own shape and volume and occupying part of the space in the virtual scene.
Optionally, the virtual object is a user character controlled by operations on the client, an Artificial Intelligence (AI) configured for the virtual scene battle through training, or a Non-Player Character (NPC) placed in the virtual scene. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene is preset, or dynamically determined according to the number of clients joining the interaction.
Taking a shooting game as an example, the user can control a virtual object to free-fall, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or move in a crouch on land; or to swim, float, or dive in the ocean. The user can also control a virtual object to ride a virtual vehicle, such as a virtual car, a virtual aircraft, or a virtual yacht, to move in the virtual scene; the above scenes are merely examples and are not limiting here. The user can also control the virtual object to interact with other virtual objects through interactive props, for example in combat. The interactive props may be throwable props such as grenades, cluster mines, and sticky grenades, or shooting props such as machine guns, pistols, and rifles; the type of interactive prop is not specifically limited in this application.
Fig. 1 is a schematic diagram of the implementation environment of the virtual object control method provided in an embodiment of this application. Referring to fig. 1, the implementation environment includes: a first terminal 120, a second terminal 140, and a server 160.
The first terminal 120 has installed and runs an application that supports displaying a virtual scene. Optionally, the application is any one of a first-person shooter (FPS), a third-person shooter, a virtual reality application, a three-dimensional map program, or a multiplayer gunfight survival game. The first terminal 120 is a terminal used by a first user, who uses it to operate a controlled virtual object in the virtual scene to perform activities including but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the controlled virtual object is a first virtual character, such as a simulated person or an anime character.
The first terminal 120 and the second terminal 140 are connected to the server 160 through a wireless network or a wired network.
The second terminal 140 has installed and runs an application that supports displaying a virtual scene. Optionally, the application is any one of an FPS, a third-person shooter, a virtual reality application, a three-dimensional map program, or a multiplayer gunfight survival game. The second terminal 140 is a terminal used by a second user, who uses it to operate another virtual object in the virtual scene to perform activities including but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the virtual object controlled by the second terminal 140 is a second virtual character, such as a simulated person or an anime character.
Optionally, the virtual object controlled by the first terminal 120 and the virtual object controlled by the second terminal 140 are in the same virtual scene, where they can interact with each other. In some embodiments, the two virtual objects are in an adversarial relationship, for example they belong to different teams and organizations; virtual objects in an adversarial relationship can interact with each other by shooting at one another in the virtual scene.
Optionally, the applications installed on the first terminal 120 and the second terminal 140 are the same, or are the same type of application on different operating system platforms. The first terminal 120 and the second terminal 140 each refer generically to one of multiple terminals; this embodiment only takes the first terminal 120 and the second terminal 140 as examples. The device types of the first terminal 120 and the second terminal 140 are the same or different and include at least one of a smartphone, a tablet, a laptop computer, and a desktop computer. For example, the first terminal 120 and the second terminal 140 may be, but are not limited to, smartphones or other handheld portable gaming devices. The technical solution provided in the embodiments of this application can be applied to both the first terminal 120 and the second terminal 140, which is not limited here. For clarity and brevity, in the following description "terminal" refers to either the first terminal or the second terminal.
Optionally, the server 160 is an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communication, middleware services, domain name services, security services, content delivery network (CDN), big data, and artificial intelligence platforms. The number and device types of servers are not limited in the embodiments of this application.
To describe the technical solution provided by the embodiments of this application more clearly, the view screen of the controlled virtual object is introduced first. Referring to fig. 2, to make a shooting game more realistic, game designers model the way the controlled virtual object observes the virtual scene on the way humans observe the real world. The controlled virtual object 201 can observe the virtual scene within the area 202, and the picture obtained by observing the area 202 from the perspective of the controlled virtual object 201 is the view screen of the controlled virtual object. The user can change which part of the virtual scene the controlled virtual object 201 observes by adjusting the orientation of the controlled virtual object 201; for the user, the way the controlled virtual object 201 observes the virtual scene is also the way the user observes it. The terminal projects the virtual scene within the area 202 onto the screen, so that the user sees what the controlled virtual object 201 sees in the virtual scene.
Taking a smartphone terminal as an example, controls for making the controlled virtual object perform different actions are also displayed on its view screen. Referring to fig. 3, a control 302, a control 303, a control 304, and a control 305 are displayed on the view screen 301 of the controlled virtual object. The control 302 is used to control the moving direction of the controlled virtual object. The control 303 is used to adjust the posture of the controlled virtual object, for example controlling it to squat or crawl. The control 304 is used to make the interactive prop held by the controlled virtual object fire virtual ammunition. The control 305 is used to switch to the target virtual prop; in the embodiments of this application, the user can then control the controlled virtual object to throw the target virtual prop through the control 304. The view screen also displays a mini-map 306, through which the user can view the locations of teammates and enemies in the virtual scene.
In a possible implementation, the user can choose the virtual props the controlled virtual object is equipped with before the game starts, and during the game can control the controlled virtual object to fight other virtual objects with the selected props.
For example, before a game starts, the game application can provide the prop selection interface 401 shown in fig. 4, on which icons 402 of various virtual props are provided. After the user clicks an icon 402, the terminal displays, on the prop selection interface 401, an introduction interface 403 for the corresponding virtual prop; the introduction interface 403 shows a text introduction 4031 of the virtual prop and an equip control 4032. Clicking the control 4032 equips the controlled virtual object with the virtual prop.
In a possible implementation, virtual props are divided into combat equipment and tactical equipment: combat equipment refers to virtual props that can damage enemies, while tactical equipment refers to virtual props that can interfere with enemies. The target virtual prop provided in the embodiments of this application is tactical equipment.
It should be noted that the technical solutions provided in the embodiments of this application can be executed by a terminal alone, i.e., the terminal both processes the background data and displays the results; or cooperatively by a terminal and a server, i.e., the terminal receives the user's operation and sends the corresponding instruction to the server, the server processes the background data and returns it, and the terminal displays the result. The following description takes a terminal as the execution subject.
Fig. 5 is a flowchart of a virtual object control method provided in an embodiment of the present application, and referring to fig. 5, the method includes:
501. The terminal displays a view screen of the controlled virtual object, the controlled virtual object is equipped with a target virtual prop, and the target virtual prop is used for simulating the sound of at least one virtual firearm firing in a virtual scene.
For a description of the view screen of the controlled virtual object, see the description of fig. 3 above, which is not repeated here. In some embodiments, the target virtual prop is a throwable prop, and the user can control the controlled virtual object through the terminal to throw it in the virtual scene.
502. In response to a throwing instruction for the target virtual prop, the terminal controls the controlled virtual object to throw the target virtual prop in the virtual scene.
Optionally, the throwing instruction for the target virtual prop is triggered when the terminal detects the user's touch or click operation on a specific key.
503. In response to the target virtual prop contacting a target object, the terminal plays the audio of the at least one virtual firearm firing in a target area in the virtual scene, the target area being the area where the target virtual prop contacts the target object.
When the terminal plays the audio in the target area, other virtual objects in the virtual scene can hear it as they approach the area.
With the technical solution provided by the embodiments of this application, a user can throw, during a game, a target virtual prop that simulates the sound of virtual firearm fire. When the target virtual prop contacts a target object, the terminal plays the audio of virtual firearm fire in the virtual scene. The user can use the target virtual prop to confuse enemies and lure them into the area where a trap is located. While confusing enemies, the user does not need to manually control the controlled virtual object to shoot and run, which improves the efficiency of human-computer interaction.
Fig. 6 is a flowchart of a virtual object control method provided in an embodiment of the present application, and referring to fig. 6, the method includes:
601. The terminal displays a view screen of the controlled virtual object, the controlled virtual object is equipped with a target virtual prop, and the target virtual prop is used for simulating the sound of at least one virtual firearm firing in a virtual scene.
The controlled virtual object is the virtual object controlled by the terminal, and the user can control it through the terminal to perform different actions in the virtual scene. The target virtual prop is a throwable prop; the user controls the controlled virtual object through the terminal to throw it in the virtual scene, and the throwing direction is likewise controlled by the user through the terminal.
Optionally, the target virtual prop is exchanged by the user for in-game virtual currency, issued by the server as a reward when the user's account levels up, or provided based on the current game stage or game mode; the source of the target virtual prop is not limited in the embodiments of this application. If the user is qualified to equip the controlled virtual object with the target virtual prop, the user can select it on the interface shown in fig. 4.
In a possible implementation, the terminal displays the view screen of the controlled virtual object and, in response to a switching instruction for the target virtual prop, switches the virtual prop held by the controlled virtual object to the target virtual prop. This is described below through several examples.
Example 1: if the terminal is a smartphone, it can display the view screen 301 of the controlled virtual object shown in fig. 3, in which the prop held by the controlled virtual object is a virtual firearm. In response to detecting a touch operation on the control 305, the terminal triggers a switching instruction for the target virtual prop. In response to the switching instruction, the terminal switches the prop held by the controlled virtual object in the view screen 301 to the target virtual prop; that is, referring to fig. 7, in the view screen 701 the prop held by the controlled virtual object is switched from the virtual firearm of fig. 3 to the target virtual prop 702.
Example 2: if the terminal is a smartphone, the user can assign a target operation to one of the terminal's physical keys in the game, the target operation being used to switch the controlled virtual object to the target virtual prop; in some embodiments, the target operation is double-pressing the volume-down key or double-pressing the volume-up key. During the game, in response to detecting the target operation on the physical key, the terminal triggers a switching instruction for the target virtual prop and, in response to it, switches the prop held by the controlled virtual object to the target virtual prop.
Example 3: if the terminal is a smartphone and the user plays through an external gamepad, the user can map a gamepad button to switching to the target prop. During the game, in response to detecting a click on that button, the terminal triggers a switching instruction for the target virtual prop and, in response to it, switches the prop held by the controlled virtual object to the target virtual prop.
Example 4: if the terminal is a desktop computer, it can display the view screen 801 of the controlled virtual object shown in fig. 8, on which a controlled virtual object 802 is displayed. In response to detecting that one or more keyboard keys are pressed, the terminal triggers a switching instruction for the target virtual prop and, in response to it, switches the prop held by the controlled virtual object to the target virtual prop. For example, if the key bound to the switching instruction is the number key 2, the user presses it to switch the held prop to the target virtual prop: in response to detecting that the number key 2 is pressed, the terminal triggers the switching instruction and performs the switch.
It should be noted that, depending on the terminal type, the user can control the controlled virtual object to switch the held prop to the target virtual prop using the scheme in any of the above examples; this is not limited in the embodiments of this application.
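The following is a minimal sketch of how the per-platform inputs of examples 1-4 might be funneled into a single switching instruction. Every name here (PropSwitcher, the event strings, held_prop) is an illustrative assumption, not the actual implementation of this application.

```python
# Hypothetical input dispatch for the switching instruction (examples 1-4).
class PropSwitcher:
    """Funnels platform-specific inputs into one switching instruction."""

    # Inputs from examples 1-4: on-screen control, volume-key double press,
    # gamepad button, and keyboard "number key 2".
    SWITCH_EVENTS = {"touch_control_305", "volume_down_double_press",
                     "gamepad_switch_button", "key_2"}

    def __init__(self, controlled_object):
        self.controlled_object = controlled_object

    def on_input(self, event: str) -> None:
        if event in self.SWITCH_EVENTS:
            # Trigger the switching instruction for the target virtual prop.
            self.controlled_object.held_prop = "target_virtual_prop"


class ControlledObject:
    held_prop = "virtual_firearm"

obj = ControlledObject()
PropSwitcher(obj).on_input("key_2")
print(obj.held_prop)  # -> target_virtual_prop
```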
602. In response to a throwing instruction for the target virtual prop, the terminal controls the controlled virtual object to throw the target virtual prop in the virtual scene.
In a possible implementation, a prop throwing control is displayed on the view screen. In response to detecting a touch operation on the throwing control, the terminal displays, on the view screen, a preview throwing trajectory of the target virtual prop in the virtual scene. In response to no longer detecting the touch operation on the throwing control, the terminal controls the controlled virtual object to throw the target virtual prop along the preview throwing trajectory.
The preview throwing trajectory is the flight path the target virtual prop will follow in the virtual scene after being thrown; that is, it indicates where the target virtual prop can land given the controlled virtual object's current position and orientation. Because the user controls the movement of the controlled virtual object through the terminal, the user also controls the flight path and end point of the thrown prop. Providing a preview of the throwing trajectory before the throw improves the accuracy with which the user can control the controlled virtual object to throw the target virtual prop.
The above embodiments are described below by way of two examples:
Example 1: if the terminal is a smartphone, referring to fig. 9, a prop throwing control 902 is displayed on the view screen 901. When the user wants to throw the target virtual prop, the user touches the prop throwing control 902 and keeps the finger pressed. In response to detecting the touch operation on the prop throwing control 902, the terminal displays a preview throwing trajectory 903 on the view screen 901. When the user lifts the finger off the prop throwing control 902, the terminal no longer detects the touch operation; in response, it triggers a throwing instruction for the target virtual prop and controls the controlled virtual object to throw the target virtual prop along the preview throwing trajectory 903.
Example 2: if the terminal is a desktop computer, the user can press the keyboard key bound to the throwing instruction for the target virtual prop, for example the key G. During the game, in response to detecting that the key G is pressed, the terminal displays a preview throwing trajectory on the view screen of the controlled virtual object; in response to detecting that the key G is released, the terminal triggers the throwing instruction and controls the controlled virtual object to throw the target virtual prop along the preview throwing trajectory.
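A minimal sketch of the press-to-preview, release-to-throw flow in the two examples above; the ballistic sampling, the World stub, and all names are assumptions for illustration, not the actual implementation.

```python
# Hypothetical press-to-preview, release-to-throw flow; z is the vertical axis.
GRAVITY = 9.8

def sample_trajectory(origin, velocity, steps=30, dt=0.1):
    """Sample points along the ballistic arc the prop would follow."""
    (x, y, z), (vx, vy, vz) = origin, velocity
    points = []
    for _ in range(steps):
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        vz -= GRAVITY * dt
        points.append((x, y, z))
    return points

class World:
    """Stub standing in for the game engine's preview/spawn calls."""
    def show_preview(self, pts): print("preview:", pts[:2], "...")
    def hide_preview(self): print("preview hidden")
    def throw_prop(self, origin, velocity): print("thrown from", origin)

def on_throw_control(world, pressed, origin, velocity):
    if pressed:        # finger (or key G) held: show the preview trajectory
        world.show_preview(sample_trajectory(origin, velocity))
    else:              # finger lifted / key released: throw along that trajectory
        world.hide_preview()
        world.throw_prop(origin, velocity)

w = World()
on_throw_control(w, True, (0.0, 0.0, 1.5), (5.0, 0.0, 4.0))
on_throw_control(w, False, (0.0, 0.0, 1.5), (5.0, 0.0, 4.0))
```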
603. In response to the target virtual prop contacting a target object, the terminal plays the audio of the at least one virtual firearm firing in a target area in the virtual scene, the target area being the area where the target virtual prop contacts the target object.
In a possible implementation, the target object is the virtual ground. In response to the target virtual prop contacting the virtual ground in the virtual scene, the terminal controls the target virtual prop to enter a triggered state; in response to the target virtual prop being in the triggered state, the terminal plays the audio of at least one virtual firearm firing in the target area.
Optionally, the target virtual prop entering the triggered state means that it explodes in the virtual scene. The virtual firearm is any virtual firearm in the game, for example a virtual machine gun, a virtual pistol, or a virtual sniper rifle; this is not limited in the embodiments of this application.
For example, referring to fig. 10, the target virtual prop 1001 can emit a detection ray 1002 of a target length after being thrown; the detection ray is invisible, and the target length is set by the game designer according to the actual situation, which is not limited here. In response to detecting, via the detection ray, the material corresponding to any virtual ground in the virtual scene, the terminal determines that the target virtual prop contacts the virtual ground and controls it to enter the triggered state, that is, controls it to explode. Here, a material is a texture-based lighting rendering effect used to simulate a real-world object. The terminal then plays the audio of at least one virtual firearm firing in the area where the target virtual prop exploded.
Building on the above example, the way a game designer assigns materials to the virtual ground in the virtual scene is described further.
For example, a game designer can assign materials to the virtual ground through a game development application as shown in fig. 11: the designer selects a virtual ground surface in the position selection area 1101 and selects the material to add in the material selection area 1102. In some embodiments, the designer sets the ground material in the virtual scene to "stone"; the terminal then controls the target virtual prop to explode in response to the detection ray emitted by the target virtual prop 1001 in fig. 10 detecting the "stone" material, and plays the audio of at least one virtual firearm firing in the area where the target virtual prop exploded.
In addition, in a possible implementation, in response to the target virtual prop contacting the target object, the terminal can display virtual smoke in the target area, the extent of the virtual smoke being the target area. In this way the terminal marks, through the form of the virtual smoke, the position where the target virtual prop contacted the target object; when the firearm audio is subsequently played, its playback range is the area covered by the virtual smoke.
For example, referring to fig. 12, the terminal can display virtual smoke 1202 on the view screen 1201 of the controlled virtual object.
In a possible implementation, after step 602, if the target virtual prop has not contacted the virtual ground in the virtual scene, the terminal can also perform the following steps.
In response to the target virtual prop contacting a virtual wall in the virtual scene, the terminal controls the target virtual prop to rebound. In response to the target virtual prop then contacting the virtual ground in the virtual scene, the terminal controls it to enter the triggered state.
For example, with continued reference to fig. 10, after the target virtual prop 1001 is thrown it can emit a detection ray 1002 of the target length along its flight direction. In response to detecting, via the detection ray, the material corresponding to any virtual wall in the virtual scene, the terminal determines that the target virtual prop contacts the virtual wall and controls it to rebound in the virtual scene, that is, to fly in the opposite direction. After the rebound, the terminal controls the target virtual prop to explode in response to detecting the material corresponding to any virtual ground in the virtual scene.
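A minimal sketch of the detection-ray logic just described: each frame a short ray is cast along the flight direction; hitting ground material triggers the prop, hitting wall material reverses its direction. The raycast signature, the material names, and the Prop fields are assumptions for illustration.

```python
# Hypothetical per-frame detection-ray check.
from dataclasses import dataclass

@dataclass
class Prop:
    position: tuple
    direction: tuple       # unit flight direction
    ray_length: float
    triggered: bool = False

def update_thrown_prop(prop, raycast):
    """raycast(position, direction, length) -> material name or None (assumed API)."""
    hit = raycast(prop.position, prop.direction, prop.ray_length)
    if hit == "stone":          # material assigned to the virtual ground
        prop.triggered = True   # enter the triggered state: explode, play audio
    elif hit == "wall":
        # Rebound: reverse the flight direction and keep flying.
        prop.direction = tuple(-d for d in prop.direction)

# Toy raycast: pretend the ray hits a wall first, then nothing, then the ground.
hits = iter(["wall", None, "stone"])
prop = Prop(position=(0, 0, 1), direction=(1, 0, -0.2), ray_length=0.5)
for _ in range(3):
    update_thrown_prop(prop, lambda p, d, l: next(hits))
print(prop.triggered, prop.direction)  # True, direction reversed once
```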
The above description takes the virtual ground as the target object; in other possible embodiments, the target object may also be another object in the virtual scene, which is not limited in the embodiments of this application.
Having explained how the terminal determines that the target virtual prop contacts the target object, the method by which the terminal plays the audio of at least one virtual firearm firing in the target area is described below.
In a possible implementation, in response to the target virtual prop contacting the target object, the terminal determines at least one location point in the target area and plays the audio of at least one virtual firearm firing at the at least one location point.
To describe this more clearly, it is split into two parts: the first part describes how the terminal determines the at least one location point, and the second part describes how the terminal plays the firing audio at those points.
First part: the terminal obtains a first coordinate of the center point of the target area in the virtual scene and the radius of the target area, obtains a coordinate offset value smaller than the radius, and derives a second coordinate of at least one location point in the virtual scene from the coordinate offset value and the first coordinate.
Taking the number of location points being one as an example, referring to fig. 13, 1301 is the target area and 1302 is its center point. The terminal obtains the first coordinate of the center point, denoted (X, Y, Z), and the radius of the area 1301, denoted R. The terminal randomly obtains a coordinate offset value, denoted a, with 0 < a < R. The terminal combines the first coordinate (X, Y, Z) with the offset value a to obtain the second coordinate (X + m, Y + m, Z + m) of a location point in the virtual scene, where each m is either 0 or a; the position indicated by the second coordinate is the position of the location point 1303.
In other possible embodiments, the terminal can instead obtain three coordinate offset values a, b, and c, all positive numbers smaller than R. In that case the second coordinate obtained from the first coordinate (X, Y, Z) is (X + a, Y + b, Z + c).
Of course, besides obtaining one location point in the target area, the terminal can obtain multiple location points at once; the implementation is the same as for one point and is not repeated here.
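A minimal sketch of the location-point sampling described above, using the three-offset variant; the function name and the uniform sampling are assumptions for illustration.

```python
# Hypothetical sampling of location points around the target area's center.
import random

def sample_location_points(center, radius, count=1):
    """Offset the center (X, Y, Z) by per-axis values a, b, c, each below the radius."""
    x, y, z = center
    points = []
    for _ in range(count):
        a, b, c = (random.uniform(0.0, radius) for _ in range(3))
        points.append((x + a, y + b, z + c))
    return points

# Example: four points derived from an area of radius 5 centered at (10, 0, 20).
print(sample_location_points((10.0, 0.0, 20.0), 5.0, count=4))
```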
Second part: the terminal plays the audio of at least one virtual firearm firing at the position indicated by the second coordinate.
In a possible implementation, the terminal plays the audio of at least one virtual firearm firing at a target location point randomly determined from the at least one location point.
This is described below for the case of one location point and the case of at least two location points.
When there is one location point, the terminal can play the firing audio of at least one virtual firearm at that point. Taking three firearm audio clips as an example, the terminal can play them in sequence at the location point according to an arranged order. Optionally, the playback order is determined by the terminal after the target virtual prop is thrown: for example, the terminal numbers the three clips 1, 2, and 3 and randomly combines the numbers without replacement, e.g., 123, 132, 213, 231, 312, or 321. If the combination obtained is 132, then after the target virtual prop contacts the target object the terminal plays the first firearm's audio, then the third firearm's, then the second firearm's. In this way the terminal plays each of the three clips exactly once, in order.
Besides combining the numbers without replacement, the terminal may also combine them with replacement, e.g., 112, 113, 122, 223, 311, or 323. If the combination obtained is 311, then after the target virtual prop contacts the target object the terminal first plays the third firearm's audio, then plays the first firearm's audio twice in a row. This makes the playback pattern more random and more likely to mislead enemies.
In a possible implementation, the terminal can also play the three firearm clips at the location point based on random numbers drawn from a target interval; that is, the terminal draws a random number in the target interval and determines which clip to play from the correspondence between sub-intervals and clips.
This, too, makes the playback pattern more random and more likely to mislead enemies.
For example, let the target interval be [1, 4], where [1, 2) corresponds to the first firearm's audio, [2, 3) to the second's, and [3, 4] to the third's. The terminal draws a random number in [1, 4], say 1.5; since 1.5 falls in [1, 2), the terminal plays the first firearm's audio at the location point. While that clip plays, the terminal can draw another random number in [1, 4], say 2.5; since 2.5 falls in [2, 3), the terminal plays the second firearm's audio next, and so on.
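A minimal sketch of the three ordering strategies described above (without replacement, with replacement, and interval-based random draw); the clip names and helper functions are assumptions for illustration.

```python
# Hypothetical play-order strategies for the three firearm clips.
import random

AUDIO = {1: "rifle_fire", 2: "pistol_fire", 3: "sniper_fire"}

def order_without_replacement():
    codes = [1, 2, 3]
    random.shuffle(codes)            # e.g. [1, 3, 2]: each clip plays once
    return [AUDIO[c] for c in codes]

def order_with_replacement(length=3):
    # e.g. [3, 1, 1]: a clip may repeat, making the pattern harder to read
    return [AUDIO[random.choice((1, 2, 3))] for _ in range(length)]

def next_clip_by_interval():
    # Draw from the target interval [1, 4]; [1,2) -> clip 1, [2,3) -> clip 2, [3,4] -> clip 3.
    r = random.uniform(1.0, 4.0)
    return AUDIO[min(int(r), 3)]

print(order_without_replacement())
print(order_with_replacement())
print(next_clip_by_interval())
```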
When there are at least two location points, the terminal can randomly determine a target location point among them and play the firing audio of at least one virtual firearm there, or play firing audio at two or more location points simultaneously; neither is limited in the embodiments of this application. Playing at a single randomly chosen target point works the same way as the one-point case above and is not repeated here. The simultaneous multi-point case is described below.
Taking four location points A, B, C, and D as an example, referring to fig. 14: the terminal can play the first firearm's audio N times in a row at point A, and while it plays there, also play the first or another firearm's audio at any one or two of points B, C, and D, or play the same or different firearm audio at A, B, C, and D simultaneously; this is not limited in the embodiments of this application. In this way, after the target virtual prop contacts the target object, both the positions and the content of the played audio are highly random, and the confusion inflicted on enemies is stronger.
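A minimal sketch of such a multi-point schedule; the point labels, clip names, and scheduling structure are assumptions for illustration.

```python
# Hypothetical multi-point schedule: each location point gets its own
# sequence of clips, so different points can sound at the same time.
import random

CLIPS = ["rifle_fire", "pistol_fire", "sniper_fire"]

def schedule_multi_point(points, repeats=3):
    """Return {point: [clips...]}; sequences at different points overlap in time."""
    return {p: [random.choice(CLIPS) for _ in range(repeats)] for p in points}

print(schedule_multi_point(["A", "B", "C", "D"]))
```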
Having described how the terminal plays the firing audio of at least one virtual firearm in the target area, the above embodiment is further described in connection with the type of virtual firearm.
In a possible implementation, in response to the target virtual prop contacting the target object, the terminal plays the audio of a target virtual firearm firing, where the target virtual firearm is a virtual firearm equipped by a virtual object in the team of the controlled virtual object.
Some experienced users can identify the type of a virtual firearm from its audio. If the terminal played the audio of a firearm that no one on the user's team has equipped, an experienced opposing user could quickly realize that a target virtual prop is in use. In this implementation, the audio the terminal plays is that of a virtual firearm actually equipped by a virtual object in the controlled virtual object's team, so it is sufficiently convincing even to experienced users.
In addition, the embodiments of this application also provide a way to tell whether audio is being simulated by the target virtual prop.
In a possible implementation, referring to fig. 15, when the controlled virtual object approaches a position where virtual firearm audio is playing: if the audio comes from another player's virtual object actually firing, the terminal displays a location point 1502 on the mini-map 1501 of the controlled virtual object's view screen, indicating the position of the firing virtual object. If the audio comes from a target virtual prop thrown by another player's virtual object, the terminal displays no such location point on the mini-map 1501. Through this display difference, the user can tell whether the audio heard comes from an actual firing virtual object, which gives users a counter to the target virtual prop and improves the fairness of the game.
Optionally, after step 603, the terminal can also perform step 604 below.
604. In response to the playing duration of the audio being greater than or equal to a target duration, stop playing the audio.
The target duration can be determined randomly by the terminal, for example as 4, 5, or 6 seconds, or set by the game designer according to the actual situation, for example fixed at 5 or 6 seconds; this is not limited in the embodiments of this application. In addition, when there are several firearm audio clips, the playing duration of each clip can also be set by the game designer according to the actual situation, for example 0.3, 0.5, 0.8, or 1 second; this is likewise not limited in the embodiments of this application.
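A minimal sketch of the stop condition: the target duration is drawn once, and playback stops once the elapsed playing time reaches it. The helper name and the candidate durations are assumptions for illustration.

```python
# Hypothetical stop condition for the firing audio.
import random

def make_stop_check():
    target = random.choice([4.0, 5.0, 6.0])   # target duration in seconds
    def should_stop(elapsed_seconds: float) -> bool:
        return elapsed_seconds >= target
    return should_stop

should_stop = make_stop_check()
print(should_stop(3.0), should_stop(6.0))  # False True (target is at most 6.0)
```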
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
The technical solutions provided in the embodiments of this application are further described below with reference to fig. 16 and possible implementations of steps 601-604 above.
Referring to fig. 16, the user can select the icon 402 corresponding to the target virtual prop on the prop selection interface 401 shown in fig. 4 and click the control 4032 to equip the controlled virtual object with the target virtual prop; in some embodiments the target virtual prop is called a "jamming bomb". With the controlled virtual object so equipped, referring to fig. 9, in response to detecting the user's touch operation on the prop throwing control 902 (in some embodiments also called the "fire key"), the terminal displays the preview throwing trajectory 903 on the view screen 901. In response to no longer detecting the touch operation on the prop throwing control 902, i.e., the user releases the finger, the terminal triggers a throwing instruction for the target virtual prop and controls the controlled virtual object to throw the jamming bomb. In response to detecting that the jamming bomb contacts the virtual ground, the terminal controls it to enter the triggered state, which in some embodiments means that it explodes in the virtual scene. The terminal then randomly plays firearm firing audio several times in the area where the jamming bomb exploded. When the playing duration exceeds the target duration, the terminal stops playing the audio and the flow ends.
Through the technical solution provided by the embodiments of the present application, the user can, during the game, throw a target virtual prop that simulates the sound of virtual firearm shooting. After the target virtual prop contacts the target object, the terminal plays the audio of virtual firearm shooting in the virtual scene. The user can use the target virtual prop to confuse enemies and lure them into the area where a trap is located. While misleading the enemy, the user does not need to manually control the controlled virtual object to shoot and run, which improves the efficiency of human-computer interaction.
Fig. 17 is a schematic structural diagram of a virtual object control apparatus provided in an embodiment of the present application, and referring to fig. 17, the apparatus includes: a display module 1701, a control module 1702, and a playback module 1703.
The display module 1701 is configured to display a visual field screen of the controlled virtual object, where the controlled virtual object is equipped with a target virtual prop, and the target virtual prop is used to simulate the sound of at least one virtual firearm shooting in a virtual scene.
A control module 1702, configured to control the controlled virtual object to throw the target virtual prop in the virtual scene in response to a throwing instruction for the target virtual prop.
A playing module 1703, configured to, in response to the target virtual prop contacting the target object, play the audio of the at least one virtual firearm shooting in a target area in the virtual scene, where the target area is the area where the target virtual prop contacts the target object.
In one possible implementation manner, a prop throwing control is displayed on the visual field screen, and the control module is configured to, in response to detecting a touch operation on the throwing control, display a preview throwing trajectory of the target virtual prop in the virtual scene on the visual field screen, and, in response to the touch operation on the throwing control no longer being detected, control the controlled virtual object to throw the target virtual prop according to the preview throwing trajectory.
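This press-and-release interaction can be sketched as a per-frame input handler (the `ThrowControl` class and the terminal hooks below are hypothetical, not the patent's API):

```python
class ThrowControl:
    """Press-and-hold shows the preview trajectory; release throws the prop."""

    def __init__(self):
        self.was_touching = False

    def update(self, touching: bool, terminal) -> None:
        if touching:
            terminal.show_preview_trajectory()   # held: keep the preview on screen
        elif self.was_touching:
            terminal.throw_prop_along_preview()  # just released: trigger the throw
        self.was_touching = touching
```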
In one possible implementation manner, the target object is the virtual ground in the virtual scene, and the playing module is configured to control the target virtual prop to enter a triggered state in response to the target virtual prop contacting the virtual ground in the virtual scene, and to play the audio in the target area in response to the target virtual prop being in the triggered state.
In one possible implementation manner, the target virtual prop can emit a detection ray of a target length after being thrown, and the playing module is configured to, in response to detecting, through the detection ray, the material corresponding to any virtual ground in the virtual scene, determine that the target virtual prop is in contact with the virtual ground and control the target virtual prop to enter the triggered state.
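As an illustrative sketch of the detection-ray check (the `raycast` signature, the material tags, and the ray length are assumptions, since the patent does not fix them):

```python
GROUND_MATERIALS = {"dirt", "stone", "grass"}  # example virtual-ground material tags
RAY_LENGTH = 0.5                               # example "target length" in scene units


def check_ground_contact(prop, scene) -> bool:
    """Cast a short ray downward from the prop; a ground material hit within
    the target length means the prop has contacted the virtual ground."""
    hit = scene.raycast(origin=prop.position, direction=(0, -1, 0),
                        max_distance=RAY_LENGTH)
    if hit is not None and hit.material in GROUND_MATERIALS:
        prop.triggered = True  # enter the triggered state described above
        return True
    return False
```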
In one possible embodiment, the control module is further configured to control the target virtual prop to rebound in response to the target virtual prop contacting a virtual wall in the virtual scene.
In one possible embodiment, the playing module is configured to determine at least one location point in the target area in response to the target virtual prop contacting the target object, and to play the audio at the at least one location point.
In one possible implementation manner, the playing module is configured to obtain a first coordinate of the center point of the target area in the virtual scene and the radius of the target area, obtain a coordinate offset value smaller than the radius, and obtain a second coordinate of the at least one location point in the virtual scene based on the coordinate offset value and the first coordinate.
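As an illustrative sketch of this point generation (the uniform sampling distribution and the 2D simplification below are assumptions not fixed by the patent):

```python
import math
import random


def sample_play_points(center, radius, count=3):
    """Return `count` second coordinates inside a circle of `radius` around
    `center`, each derived from a coordinate offset smaller than the radius."""
    cx, cy = center
    points = []
    for _ in range(count):
        angle = random.uniform(0.0, 2.0 * math.pi)
        offset = random.uniform(0.0, radius)  # coordinate offset value < radius
        points.append((cx + offset * math.cos(angle),
                       cy + offset * math.sin(angle)))
    return points


# Example: three candidate playback points in a 5-unit explosion area at (10, 20).
print(sample_play_points((10.0, 20.0), 5.0))
```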
In a possible embodiment, the playing module is configured to play the audio at any target location point randomly determined from the at least one location point.
In one possible embodiment, the playing module is configured to, in response to the target virtual prop contacting the target object, simultaneously play the audio of at least two virtual firearms shooting in the target area.
In one possible implementation, the playing module is configured to, in response to the target virtual prop contacting the target object, play the audio of a target virtual firearm shooting, where the target virtual firearm is a virtual firearm equipped by a virtual object in the team of the controlled virtual object.
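As a sketch of this audio selection (the teammate and firearm attribute names are assumptions used only for illustration):

```python
import random


def pick_team_gunfire_audio(team_members):
    """Pick the shooting audio of a virtual firearm equipped by someone on the
    controlled virtual object's team, so the decoy matches the team's loadout."""
    firearms = [m.equipped_firearm for m in team_members
                if m.equipped_firearm is not None]
    if not firearms:
        return None  # no teammate has a firearm equipped; fall back to a default clip
    return random.choice(firearms).shot_audio
```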
In one possible embodiment, the display module is further configured to display virtual smoke within the target area.
In one possible implementation manner, the playing module is further configured to stop playing the audio in response to the playing duration of the audio being greater than or equal to the target duration.
Through the technical solution provided by the embodiments of the present application, the user can, during the game, throw a target virtual prop that simulates the sound of virtual firearm shooting. After the target virtual prop contacts the target object, the terminal plays the audio of virtual firearm shooting in the virtual scene. The user can use the target virtual prop to confuse enemies and lure them into the area where a trap is located. While misleading the enemy, the user does not need to manually control the controlled virtual object to shoot and run, which improves the efficiency of human-computer interaction.
An embodiment of the present application provides a computer device configured to perform the foregoing method. The computer device may be implemented as a terminal or a server; the structure of the terminal is introduced first:
fig. 18 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 1800 may be: a smartphone, a tablet, a laptop, or a desktop computer. The terminal 1800 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
Generally, the terminal 1800 includes: one or more processors 1801 and one or more memories 1802.
The processor 1801 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1801 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1801 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1801 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1801 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1802 may include one or more computer-readable storage media, which may be non-transitory. The memory 1802 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1802 is used to store at least one computer program, the computer program being executed by the processor 1801 to implement the virtual object control method provided by the method embodiments of the present application.
In some embodiments, the terminal 1800 may further optionally include: a peripheral interface 1803 and at least one peripheral. The processor 1801, memory 1802, and peripheral interface 1803 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral device interface 1803 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1804, display 1805, camera assembly 1806, audio circuitry 1807, and power supply 1809.
The peripheral interface 1803 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1801 and the memory 1802. In some embodiments, the processor 1801, memory 1802, and peripheral interface 1803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1801, the memory 1802, and the peripheral device interface 1803 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 1804 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1804 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuitry 1804 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals. Optionally, the radio frequency circuitry 1804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth.
The display 1805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1805 is a touch display screen, the display screen 1805 also has the ability to capture touch signals on or over the surface of the display screen 1805. The touch signal may be input to the processor 1801 as a control signal for processing. At this point, the display 1805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard.
The camera assembly 1806 is used to capture images or video. Optionally, the camera assembly 1806 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of a terminal, and a rear camera is disposed at a rear surface of the terminal.
The audio circuitry 1807 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1801 for processing or inputting the electric signals to the radio frequency circuit 1804 for realizing voice communication.
The power supply 1809 is used to supply power to the various components in the terminal 1800. The power supply 1809 may be alternating current, direct current, a disposable battery, or a rechargeable battery.
In some embodiments, the terminal 1800 also includes one or more sensors 1810. The one or more sensors 1810 include, but are not limited to: acceleration sensor 1811, gyro sensor 1812, pressure sensor 1813, optical sensor 1815, and proximity sensor 1816.
The acceleration sensor 1811 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal 1800.
The gyro sensor 1812 may be used to detect the body direction and rotation angle of the terminal 1800, and the gyro sensor 1812 may cooperate with the acceleration sensor 1811 to collect the 3D movement of the user on the terminal 1800.
The pressure sensors 1813 may be disposed on the side frame of the terminal 1800 and/or on the lower layer of the display 1805. When the pressure sensor 1813 is disposed on a side frame of the terminal 1800, a user's grip signal on the terminal 1800 can be detected, and the processor 1801 performs left-right hand recognition or shortcut operation according to the grip signal collected by the pressure sensor 1813. When the pressure sensor 1813 is disposed at the lower layer of the display screen 1805, the processor 1801 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1805.
The optical sensor 1815 is used to collect ambient light intensity. In one embodiment, the processor 1801 may control the display brightness of the display screen 1805 based on the ambient light intensity collected by the optical sensor 1815.
The proximity sensor 1816 is used to collect the distance between the user and the front surface of the terminal 1800.
Those skilled in the art will appreciate that the configuration shown in fig. 18 is not intended to be limiting of terminal 1800 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The computer device may also be implemented as a server, and the following describes a structure of the server:
fig. 19 is a schematic structural diagram of a server according to an embodiment of the present application. The server 1900 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 1901 and one or more memories 1902, where the one or more memories 1902 store at least one computer program that is loaded and executed by the one or more processors 1901 to implement the methods provided by the foregoing method embodiments. Of course, the server 1900 may further have components such as a wired or wireless network interface, a keyboard, and an input/output interface to facilitate input and output, and the server 1900 may further include other components for implementing device functions, which are not described herein again.
In an exemplary embodiment, there is also provided a computer readable storage medium, such as a memory including a computer program executable by a processor to perform the virtual object control method in the above embodiments. For example, the computer readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, there is also provided a computer program product or a computer program including program code stored in a computer-readable storage medium, which is read by a processor of a computer apparatus from the computer-readable storage medium, and which is executed by the processor so that the computer apparatus executes the above-described virtual object control method.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. A virtual object control method, the method comprising:
displaying a visual field picture of a controlled virtual object, wherein the controlled virtual object is equipped with a target virtual prop, the target virtual prop is used for simulating the sound of at least two virtual firearms shooting in a virtual scene, the at least two virtual firearms are virtual firearms equipped by virtual objects in a team where the controlled virtual object is located, and a prop throwing control is displayed on the visual field picture;
in response to detecting a touch operation on the throwing control, displaying, on the visual field picture, a preview throwing trajectory of the target virtual prop in the virtual scene, wherein the preview throwing trajectory is a flight trajectory of the target virtual prop in the virtual scene after being thrown; and in response to the touch operation on the throwing control no longer being detected, controlling the controlled virtual object to throw the target virtual prop according to the preview throwing trajectory;
in response to the target virtual prop contacting a target object, determining at least two position points in a target area and displaying virtual smoke within the target area; and randomly playing the audio of the at least two virtual firearms shooting at the at least two position points, wherein the random playing comprises at least one of randomly determining a playing position of the audio and randomly determining a playing order of the audio, and the target area is an area where the target virtual prop contacts the target object;
the method includes the steps that a target object is a virtual ground in a virtual scene, a detection ray with a target length can be emitted after the target virtual prop is thrown, and at least two position points are determined in a target area in response to the target virtual prop contacting the target object, and the method includes the following steps: responding to the material corresponding to any virtual ground in the virtual scene through the detection ray, and controlling the target virtual prop to enter a triggered state; responsive to the target virtual prop being in the triggered state, determining at least two location points in the target area, wherein said determining at least two location points in the target area comprises: acquiring a first coordinate of a central point of the target area in the virtual scene and a radius of the target area; obtaining at least two coordinate deviation values, wherein the at least two coordinate deviation values are smaller than the radius; and obtaining second coordinates of the at least two position points in the virtual scene based on the at least two coordinate deviation values and the first coordinates.
2. The method of claim 1, wherein after the controlling the controlled virtual object to throw the target virtual prop in the virtual scene in response to a throwing instruction for the target virtual prop, the method further comprises:
controlling the target virtual prop to rebound in response to the target virtual prop contacting a virtual wall in the virtual scene.
3. The method of claim 1, wherein the randomly playing the audio of the at least two virtual firearms shooting at the at least two position points comprises:
randomly playing the audio of the at least two virtual firearms shooting at any two target position points randomly determined from the at least two position points.
4. The method of claim 1, wherein the randomly playing the audio of the at least two virtual firearms shooting at the at least two position points comprises:
and simultaneously playing the audio of the at least two virtual firearms when shooting at the at least two position points.
5. The method of claim 1, wherein after the randomly playing the audio of the at least two virtual firearms shooting at the at least two position points, the method further comprises:
and stopping playing the audio in response to the playing time length of the audio being greater than or equal to the target time length.
6. A virtual object control apparatus, characterized in that the apparatus comprises:
the system comprises a display module and a control module, wherein the display module is used for displaying a visual field picture of a controlled virtual object, the controlled virtual object is provided with a target virtual prop, the target virtual prop is used for simulating the sound of at least two virtual guns when shooting in a virtual scene, the at least two virtual guns are virtual guns equipped with a virtual object in a team where the controlled virtual object is located, and a prop throwing control is displayed on the visual field picture;
a control module, configured to, in response to detecting a touch operation on the throwing control, display, on the visual field picture, a preview throwing trajectory of the target virtual prop in the virtual scene, wherein the preview throwing trajectory is a flight trajectory of the target virtual prop in the virtual scene after being thrown, and, in response to the touch operation on the throwing control no longer being detected, control the controlled virtual object to throw the target virtual prop according to the preview throwing trajectory; and
a playing module, configured to, in response to the target virtual prop contacting a target object, determine at least two position points in a target area and display virtual smoke within the target area, and randomly play the audio of the at least two virtual firearms shooting at the at least two position points, wherein the target area is an area where the target virtual prop contacts the target object; wherein the target object is a virtual ground in the virtual scene, the target virtual prop is capable of emitting a detection ray of a target length after being thrown, and the determining at least two position points in the target area in response to the target virtual prop contacting the target object comprises: in response to detecting, through the detection ray, a material corresponding to any virtual ground in the virtual scene, controlling the target virtual prop to enter a triggered state; and in response to the target virtual prop being in the triggered state, determining the at least two position points in the target area, wherein the determining the at least two position points in the target area comprises: acquiring a first coordinate of a center point of the target area in the virtual scene and a radius of the target area; acquiring at least two coordinate offset values, wherein the at least two coordinate offset values are smaller than the radius; and obtaining second coordinates of the at least two position points in the virtual scene based on the at least two coordinate offset values and the first coordinate.
7. The apparatus of claim 6, wherein the control module is further configured to control the target virtual prop to rebound in response to the target virtual prop contacting a virtual wall in the virtual scene.
8. The apparatus of claim 6, wherein the playing module is configured to randomly play the audio of the at least two virtual firearms shooting at any two target position points randomly determined from the at least two position points.
9. The apparatus of claim 6, wherein the playing module is configured to simultaneously play the audio of the at least two virtual firearms shooting at the at least two position points.
10. The apparatus of claim 6, wherein the playing module is further configured to stop playing the audio in response to a playing duration of the audio being greater than or equal to a target duration.
11. A computer device, comprising one or more processors and one or more memories having at least one computer program stored therein, the computer program being loaded and executed by the one or more processors to implement the virtual object control method of any one of claims 1 to 5.
12. A computer-readable storage medium, in which at least one computer program is stored, the computer program being loaded and executed by a processor to implement the virtual object control method according to any one of claims 1 to 5.
CN202011266337.9A 2020-11-13 2020-11-13 Virtual object control method, device, equipment and storage medium Active CN112245917B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011266337.9A CN112245917B (en) 2020-11-13 2020-11-13 Virtual object control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011266337.9A CN112245917B (en) 2020-11-13 2020-11-13 Virtual object control method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112245917A CN112245917A (en) 2021-01-22
CN112245917B true CN112245917B (en) 2022-11-25

Family

ID=74266749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011266337.9A Active CN112245917B (en) 2020-11-13 2020-11-13 Virtual object control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112245917B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7803048B2 (en) * 2006-03-15 2010-09-28 Microsoft Corporation Radar manipulation in a video game
CN111135566A (en) * 2019-12-06 2020-05-12 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic device
CN111589145B (en) * 2020-04-22 2023-03-24 腾讯科技(深圳)有限公司 Virtual article display method, device, terminal and storage medium

Also Published As

Publication number Publication date
CN112245917A (en) 2021-01-22

Similar Documents

Publication Publication Date Title
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
EP4000704A1 (en) Virtual object control method, device, terminal, and storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN110721468B (en) Interactive property control method, device, terminal and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN111744186B (en) Virtual object control method, device, equipment and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN112221141B (en) Method and device for controlling virtual object to use virtual prop
CN110917623B (en) Interactive information display method, device, terminal and storage medium
CN111359206B (en) Virtual object control method, device, terminal and storage medium
CN113289331B (en) Display method and device of virtual prop, electronic equipment and storage medium
CN113144597B (en) Virtual vehicle display method, device, equipment and storage medium
CN111228809A (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN110755844B (en) Skill activation method and device, electronic equipment and storage medium
CN110585706B (en) Interactive property control method, device, terminal and storage medium
CN111298441A (en) Using method, device, equipment and storage medium of virtual prop
US20220379209A1 (en) Virtual resource display method and related apparatus
CN111249726B (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN112221135B (en) Picture display method, device, equipment and storage medium
CN114225406A (en) Virtual prop control method and device, computer equipment and storage medium
CN112316430B (en) Prop using method, device, equipment and medium based on virtual environment
CN112402966B (en) Virtual object control method, device, terminal and storage medium
CN112402964B (en) Using method, device, equipment and storage medium of virtual prop
CN113680060A (en) Virtual picture display method, device, equipment, medium and computer program product
CN111589102B (en) Auxiliary tool detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40037339

Country of ref document: HK

GR01 Patent grant