CN113274724B - Virtual object control method, device, equipment and computer readable storage medium - Google Patents


Info

Publication number: CN113274724B
Application number: CN202110604843.2A
Authority: CN (China)
Legal status: Active
Inventors: 顾列宾, 蒋国太
Assignee (current and original): Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Other versions: CN113274724A (Chinese)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533: Controlling the output signals based on the game progress involving additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
    • A63F13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378: Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, for displaying an additional top view, e.g. radar screens or maps
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/577: Simulating properties, behaviour or motion of objects in the game world using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars


Abstract

The application provides a virtual object control method, apparatus, device, and computer-readable storage medium. The method includes: presenting a picture of a virtual scene corresponding to an interactive game, the picture showing a target virtual object and a target virtual vehicle with which at least two virtual objects can interact; when the target virtual vehicle is in a waiting state, controlling the target virtual object, in response to an interaction instruction for it, to perform an interactive operation on the target virtual vehicle so as to obtain ownership of the vehicle; and when the target virtual object has obtained ownership of the target virtual vehicle and the state of the vehicle is switched from the waiting state to an evacuation state, controlling the target virtual object to board the vehicle and evacuate, to indicate that the target virtual object has won the interactive game. The method and apparatus enhance interaction between virtual objects in the virtual scene, so that the outcome of the game session more accurately reflects the interaction capability of the virtual objects.

Description

Virtual object control method, device, equipment and computer readable storage medium
Technical Field
The present application relates to the field of virtualization and human-computer interaction technologies, and in particular, to a method, an apparatus, a device, and a computer-readable storage medium for controlling a virtual object.
Background
With the development of computer technology, electronic devices can present increasingly rich and vivid virtual scenes. A virtual scene is a digital scene constructed by a computer using digital communication technologies; in it, a user can obtain a fully virtualized experience (for example, virtual reality) or a partially virtualized experience (for example, augmented reality) in vision, hearing, and other senses, and can interact with various objects in the scene to obtain feedback.
In the related art, the match result of an interactive game is determined as follows: a virtual vehicle such as an airship or airplane is presented, and when a user controls a virtual object to enter the virtual vehicle, that virtual object is determined to have won the game. With this method, however, when no opposing teams are near the virtual vehicle, a virtual object can quickly enter the vehicle and win. This weakens the interaction between virtual objects, so the game result cannot truly reflect a virtual object's interaction capability.
Disclosure of Invention
Embodiments of the present application provide a method, an apparatus, a device, and a computer-readable storage medium for controlling virtual objects, which can enhance interaction between virtual objects in a virtual scene, so that an interaction result of an interaction session can more accurately reflect an interaction capability of a virtual object.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a control method of a virtual object, which comprises the following steps:
presenting a picture of a virtual scene corresponding to the interactive game, and displaying, in the picture, a target virtual object and a target virtual vehicle with which at least two virtual objects can interact;
when the target virtual vehicle is in a waiting state, responding to an interactive instruction aiming at the target virtual object, and controlling the target virtual object to execute interactive operation aiming at the target virtual vehicle so as to acquire the ownership of the target virtual vehicle;
and when the target virtual object has acquired ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to an evacuation state, controlling the target virtual object to board the target virtual vehicle and evacuate, to indicate that the target virtual object has won the interactive game.
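Taken together, the three steps describe a simple state machine: ownership of the vehicle can be claimed only while it waits, and a win is declared only when ownership and the switch to the evacuation state coincide. The following is a minimal illustrative sketch of that logic; all class, method, and team names are hypothetical and not taken from the patent:

```python
from typing import Optional

class VehicleMatch:
    """Toy model of the win condition (all names are hypothetical)."""

    def __init__(self) -> None:
        self.vehicle_state = "waiting"    # waiting -> evacuating
        self.owner: Optional[str] = None  # team holding ownership, if any

    def claim_ownership(self, team: str) -> None:
        # The interactive operation can succeed only while the vehicle waits.
        if self.vehicle_state == "waiting":
            self.owner = team

    def switch_to_evacuation(self) -> Optional[str]:
        # The owning team (if any) boards the vehicle and wins the match.
        self.vehicle_state = "evacuating"
        return self.owner

m = VehicleMatch()
m.claim_ownership("team_a")
winner = m.switch_to_evacuation()
print(winner)  # -> team_a
```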
An embodiment of the present application provides a control apparatus for a virtual object, including:
the display module is used for presenting a picture of a virtual scene corresponding to the interactive game, and displaying, in the picture, a target virtual object and a target virtual vehicle with which at least two virtual objects can interact;
the interaction module is used for responding to an interaction instruction aiming at the target virtual object when the target virtual vehicle is in a waiting state, and controlling the target virtual object to execute an interaction operation aiming at the target virtual vehicle so as to obtain the ownership of the target virtual vehicle;
and the evacuation module is used for controlling the target virtual object to board the target virtual vehicle and evacuate when the target virtual object has obtained ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to an evacuation state, to indicate that the target virtual object has won the interactive game.
In the above scheme, the interaction module is further configured to control the target virtual vehicle to move in the virtual scene; display, while the target virtual vehicle is moving, a countdown of the time until the target virtual vehicle reaches the target position; and, when the countdown reaches zero, control the target virtual vehicle to remain stationary at the target position, so that the target virtual vehicle is in the waiting state.
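The arrival logic above (move, count down, then stand still in the waiting state) can be sketched as a per-tick update; the countdown value and names below are illustrative assumptions only:

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    state: str = "moving"
    countdown_s: float = 10.0  # assumed time until the target position

    def tick(self, dt: float) -> None:
        """Advance the countdown; on reaching zero the vehicle stops and waits."""
        if self.state == "moving":
            self.countdown_s = max(0.0, self.countdown_s - dt)
            if self.countdown_s == 0.0:
                self.state = "waiting"  # stationary at the target position

v = Vehicle()
for _ in range(10):
    v.tick(1.0)
print(v.state)  # -> waiting
```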
In the above scheme, the target virtual vehicle is a virtual flying vehicle, and the interaction module is further configured to control the virtual flying vehicle to sweep over a virtual object team to which the target virtual object belongs when at least two virtual object teams exist in the virtual scene;
and after the virtual flying vehicle skims over each virtual object team, controlling the virtual flying vehicle to hover in an air area corresponding to the target position.
In the above solution, the interaction module is further configured to display a map thumbnail of the virtual scene;
and displaying the position information of the target position in the map thumbnail.
In the above solution, the interaction module is further configured to display the relative position of the target virtual object and the target virtual vehicle in real time;
receiving a movement instruction for the target virtual object based on the relative position;
controlling the target virtual object to move towards the target virtual vehicle in response to the movement instruction for the target virtual object.
In the above scheme, the interaction module is further configured to show a duration that the target virtual vehicle is in the waiting state when the target virtual vehicle is in the waiting state;
and when the duration of the target virtual vehicle in the waiting state reaches a duration threshold, switching the state of the target virtual vehicle from the waiting state to an evacuation state.
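The waiting-to-evacuation transition described above is a simple threshold check. A sketch under the assumption of a 60-second threshold (the patent does not fix a value):

```python
WAIT_THRESHOLD_S = 60.0  # assumed duration threshold (not fixed by the patent)

def next_vehicle_state(state: str, waited_s: float) -> str:
    """Switch from the waiting state to the evacuation state once the
    vehicle has waited for at least the threshold duration."""
    if state == "waiting" and waited_s >= WAIT_THRESHOLD_S:
        return "evacuating"
    return state

print(next_vehicle_state("waiting", 59.0))  # -> waiting
print(next_vehicle_state("waiting", 60.0))  # -> evacuating
```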
In the above scheme, the evacuation module is further configured to, when the state of the target virtual vehicle is switched from the waiting state to an evacuation state and no virtual object has obtained ownership of the target virtual vehicle, control the target virtual vehicle to evacuate and display a result prompt for the interactive game, indicating that no virtual object has won the interactive game.
In the above scheme, the interaction module is further configured to respond to an interaction instruction for the target virtual object, and acquire object information of a virtual object in an associated area of the target virtual vehicle;
and controlling the target virtual object to execute the interactive operation aiming at the target virtual carrier when the target virtual object is determined to be in the associated area and the virtual objects in the associated area belong to the same virtual object team based on the object information.
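The eligibility check above (the target object must be inside the associated area, and every object in that area must belong to the same team) can be sketched as follows; the `Obj` type and all names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    team: str

def can_interact(target: Obj, objects_in_area: list) -> bool:
    """Interaction is allowed only if the target is inside the associated
    area and every object in that area belongs to the target's team."""
    if target not in objects_in_area:
        return False
    return all(o.team == target.team for o in objects_in_area)

a1, a2 = Obj("a1", "A"), Obj("a2", "A")
enemy = Obj("b1", "B")
print(can_interact(a1, [a1, a2]))         # -> True
print(can_interact(a1, [a1, a2, enemy]))  # -> False
```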
In the above solution, the interaction module is further configured to, in a process that the target virtual object performs an interaction operation for the target virtual vehicle, control the interaction operation to be in an interrupted state when a virtual object belonging to a different virtual object team from the target virtual object enters the association area;
controlling the target virtual object to interact with the virtual objects belonging to the different virtual object teams in response to a control instruction for the target virtual object.
In the above scheme, the interaction module is further configured to control the target virtual vehicle to periodically scan the virtual object located in the associated area, and mark the scanned virtual object.
In the above scheme, the interaction module is further configured to display an interaction score obtained by interaction between the target virtual object and another virtual object;
and responding to an interaction instruction aiming at the target virtual object, and controlling the target virtual object to execute the interaction operation aiming at the target virtual carrier when the interaction achievement reaches an interaction achievement threshold value.
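The score gate above reduces to a single comparison; the threshold value is an assumption, not taken from the patent:

```python
SCORE_THRESHOLD = 100  # assumed interaction-score threshold

def may_start_interaction(score: int) -> bool:
    """The interactive operation is allowed only once the target virtual
    object's accumulated interaction score reaches the threshold."""
    return score >= SCORE_THRESHOLD

print(may_start_interaction(120))  # -> True
print(may_start_interaction(80))   # -> False
```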
In the above scheme, the interaction module is further configured to, while the target virtual object performs the interactive operation on the target virtual vehicle, display an interaction progress corresponding to the interactive operation and control the interaction progress to change as the interactive operation proceeds;
and when the interaction progress indicates that the interactive operation is completed, determine that the target virtual object has obtained ownership of the target virtual vehicle.
In the above solution, the interaction module is further configured to, when at least two virtual object teams exist in the virtual scene, obtain the number of virtual objects performing the interaction operation in a virtual object team to which the target virtual object belongs;
and controlling the interaction progress to change at a speed matched to that number.
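Scaling the fill speed of the progress bar with the number of interacting teammates can be sketched as follows; the base rate is an illustrative assumption:

```python
BASE_RATE = 0.02  # assumed progress per tick contributed by one teammate

def advance_progress(progress: float, n_interacting: int) -> float:
    """More teammates performing the interactive operation make the
    interaction progress fill proportionally faster (capped at 1.0)."""
    return min(1.0, progress + BASE_RATE * n_interacting)

p = 0.0
for _ in range(10):
    p = advance_progress(p, 3)  # three teammates interacting
print(round(p, 2))  # -> 0.6
```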
In the above solution, the interaction module is further configured to, when at least two virtual object teams exist in the virtual scene, obtain a position of each virtual object in a virtual object team to which the target virtual object belongs;
and resetting the interaction progress when the virtual objects in the virtual object team to which the target virtual object belongs are determined to be out of the associated area of the target virtual vehicle based on the positions of the virtual objects.
In the above solution, the interaction module is further configured to, when there is another virtual object executing an interaction operation for the target virtual vehicle, present object information of the other virtual object and an interaction progress corresponding to the interaction operation.
In the above solution, the evacuation module is further configured to, when at least two virtual object teams exist in the virtual scene, obtain a life value of each virtual object in a virtual object team to which the target virtual object belongs;
when the target virtual object has obtained ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to an evacuation state, if the life value of at least one virtual object in the virtual object team to which the target virtual object belongs is higher than a life value threshold, controlling the virtual object team to which the target virtual object belongs to board the target virtual vehicle and evacuate.
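The team-evacuation condition above (at least one member's life value above the threshold when the evacuation state begins) can be sketched as follows; the threshold and all names are assumptions:

```python
LIFE_THRESHOLD = 0  # assumed: a member must have a positive life value

def team_can_evacuate(owns_vehicle: bool, evacuating: bool,
                      team_life_values: list) -> bool:
    """The owning team boards and evacuates only if, when the evacuation
    state begins, at least one member's life value exceeds the threshold."""
    return (owns_vehicle and evacuating
            and any(hp > LIFE_THRESHOLD for hp in team_life_values))

print(team_can_evacuate(True, True, [0, 35, 0]))  # -> True
print(team_can_evacuate(True, True, [0, 0, 0]))   # -> False
```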
An embodiment of the present application provides a computer device, including:
a memory for storing executable instructions;
and a processor, configured to implement the virtual object control method provided in the embodiments of the present application when executing the executable instructions stored in the memory.
The embodiment of the present application provides a computer-readable storage medium, which stores executable instructions for causing a processor to implement the control method for a virtual object provided in the embodiment of the present application when the processor executes the executable instructions.
The embodiment of the application has the following beneficial effects:
by applying the embodiment, when the target virtual vehicle is in a waiting state, the target virtual vehicle is controlled to execute the interactive operation aiming at the target virtual vehicle in response to the interactive instruction aiming at the target virtual vehicle, so as to acquire the ownership of the target virtual vehicle; when the target virtual object obtains the ownership of the target virtual carrier and the state of the target virtual carrier is switched from a waiting state to an evacuation state, controlling the target virtual object to evacuate by taking the target virtual carrier so as to indicate the target virtual object to obtain the victory of the interactive game; therefore, the target virtual object can be determined to obtain the victory of the interactive opponent only when the target virtual object is required to obtain the ownership of the target virtual carrier and the state of waiting for the target virtual carrier is switched from the waiting state to the evacuation state, so that the virtual object can compete for the ownership of the target virtual carrier through interaction in the process that the target virtual carrier is in the waiting state, the interaction among the virtual objects in the virtual scene is enhanced, and the interaction result of the interactive opponent can more accurately reflect the interaction capability of the virtual object.
Drawings
Fig. 1 is a schematic diagram of an optional implementation scenario of an information presentation method in a virtual scenario provided in an embodiment of the present application;
FIG. 2 is an alternative structural diagram of a computer device 500 according to an embodiment of the present application;
fig. 3 is an alternative flow chart of a control method for a virtual object according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating presentation of interactive prompt information provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of prompt information of a landing zone provided by an embodiment of the present application;
FIG. 9 is a schematic illustration of a position cue for an airship according to an embodiment of the application;
FIG. 10 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 11 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
FIG. 12 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 13 is a schematic view of a virtual scene provided in an embodiment of the present application;
FIG. 14 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
FIG. 15 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 16 is a flowchart illustrating a control method for a virtual object according to an embodiment of the present application;
FIG. 17 is a schematic illustration of a process flow for an airship according to an embodiment of the application;
FIGS. 18A-18B are schematic diagrams illustrating the display of wait countdown provided by embodiments of the present application;
FIGS. 19A-19B are schematic diagrams of a building complex provided by an embodiment of the present application;
FIG. 20 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
FIG. 21 is a schematic diagram illustrating an implementation flow of the hovering phase provided by an embodiment of the present application;
FIG. 22 is a flow chart illustrating the performance of the interaction for an airship according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments", which describe a subset of all possible embodiments. It should be understood that "some embodiments" may refer to the same subset or to different subsets of all possible embodiments, and that they may be combined with each other where no conflict arises.
In the following description, the terms "first\second\third" are used only to distinguish similar objects and do not denote a particular ordering; where permitted, a specific order or sequence may be interchanged, so that the embodiments of the application described herein can be practiced in an order other than that shown or described.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) Client: an application program running in the terminal for providing various services, such as an instant messaging client or a video playing client.
2) "In response to": indicates the condition or state on which a performed operation depends. When the dependent condition or state is satisfied, one or more of the performed operations may occur in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which the operations are performed.
3) Virtual scene: the scene displayed (or provided) when an application program runs on the terminal. The virtual scene may be a simulated environment of the real world, a virtual environment combining virtuality and reality, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene; the dimension of the virtual scene is not limited in the embodiments of the present application. For example, the virtual scene may include sky, land, and ocean; the land may include environmental elements such as deserts and cities; and the user may control a virtual object to move in the virtual scene.
4) Virtual objects: the images of the various people and things that can interact in the virtual scene, or movable objects in the virtual scene. A movable object may be a virtual character, a virtual animal, an animated character, and so on, such as the characters, animals, plants, oil drums, walls, and stones displayed in the virtual scene. A virtual object may be an avatar that virtually represents the user in the scene. A virtual scene may include multiple virtual objects, each having its own shape and volume and occupying part of the space in the scene.
Alternatively, the virtual object may be a user character controlled through operations on the client, an Artificial Intelligence (AI) trained and set in the virtual-scene battle, or a Non-Player Character (NPC) set in the virtual-scene interaction. Alternatively, the virtual object may be a virtual character engaging in adversarial interaction in the virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control the virtual object to fall freely, glide, or open a parachute in the sky of the virtual scene, and to run, jump, or crawl on land, or may control the virtual object to swim, float, or dive in the ocean. The user may also control the virtual object to move in the virtual scene by riding a virtual vehicle; for example, the virtual vehicle may be a virtual car, a virtual aircraft, or a virtual yacht. The user may further control the virtual object to interact antagonistically with other virtual objects through attack-type virtual items; for example, the virtual item may be a virtual mech, a virtual tank, or a virtual fighter. The above scenarios are merely examples, and the embodiments of the present application are not specifically limited thereto.
5) Scene data: represents the various characteristics that objects in the virtual scene exhibit during interaction and may include, for example, the positions of the objects in the virtual scene. Different types of features may be included depending on the type of virtual scene. For example, in a game virtual scene, the scene data may include the waiting time required for various functions provided in the scene (depending on how many times the same function can be used within a certain period), and may also include attribute values representing various states of a game character, for example a life value (also called the red bar), a magic value (also called the blue bar), a guard value, a power value, and the like.
Referring to fig. 1, fig. 1 is a schematic diagram of an optional implementation scenario of the information presentation method in a virtual scene provided in this embodiment. To support an exemplary application, terminals (terminals 400-1 and 400-2 are shown as examples) are connected to a server 200 through a network 300, where the network 300 may be a wide area network, a local area network, or a combination of the two, and data transmission is implemented using wireless links.
In some embodiments, the server 200 may be an independent physical server, may also be a server cluster or a distributed system formed by a plurality of physical servers, and may also be a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like. The terminal may be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal and the server may be directly or indirectly connected through wired or wireless communication, and the embodiment of the present application is not limited.
In actual implementation, the terminal installs and runs an application program supporting a virtual scene. The application program may be any one of a First-Person Shooter game (FPS), a third-person shooter game, a Multiplayer Online Battle Arena game (MOBA), a virtual reality application program, a three-dimensional map program, or a multiplayer gunfight survival game. The user uses the terminal to control a virtual object located in the virtual scene to perform activities, which include but are not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the virtual object is a virtual character, such as a simulated character or an animated character.
In an exemplary scenario, the target virtual object controlled by the terminal 400-1 and the virtual object controlled by the terminal 400-2 are in the same virtual scene, where the target virtual object can interact with the virtual objects controlled by the other terminals 400-2. In some embodiments, the target virtual object controlled by the terminal 400-1 and the virtual objects controlled by the other terminals 400-2 may be in an allied relationship, for example belonging to the same team, or they may be in a hostile relationship, in which case they can interact antagonistically, for example by shooting at each other on land.
In an exemplary scene, a picture of the virtual scene corresponding to the interactive game is presented on the terminal 400-1, and the target virtual object controlled by the terminal 400-1 and a target virtual vehicle with which at least two virtual objects can interact are displayed in the picture. When the target virtual vehicle is in a waiting state, the target virtual object is controlled, in response to an interaction instruction for it, to perform an interactive operation on the target virtual vehicle so as to obtain ownership of the vehicle; and when the target virtual object has obtained ownership of the target virtual vehicle and the state of the vehicle is switched from the waiting state to the evacuation state, the target virtual object is controlled to board the target virtual vehicle and evacuate, to indicate that it has won the interactive game.
In actual implementation, the server 200 calculates scene data in a virtual scene and sends the scene data to the terminal 400; the terminal 400 relies on graphics computing hardware to complete the loading, parsing and rendering of the display data, and relies on graphics output hardware to output the virtual scene to form a visual perception, for example, a two-dimensional video frame can be presented on the display screen of a smart phone, or a video frame realizing a three-dimensional display effect can be projected on the lenses of augmented reality/virtual reality glasses; for the other forms in which the virtual scene can be perceived, it can be understood that corresponding hardware outputs of the terminal may be used, for example, an auditory perception formed using a speaker output, a haptic perception formed using a vibrator output, and the like.
The terminal 400-1 runs a client (e.g., a network-based game application) and performs game interaction with other users by connecting to the server 200. The terminal 400-1 outputs a picture of a virtual scene, and the picture includes a target virtual object, where the target virtual object is a game character controlled by a user; that is, the target virtual object is controlled by a real user and will move in the virtual scene in response to the real user's operation of a controller (including a touch screen, a voice-activated switch, a keyboard, a mouse, a joystick, and the like). For example, when the real user moves the joystick to the left, the target virtual object will move to the left in the virtual scene; the target virtual object can also remain stationary in place, jump, and use various functions (such as skills and props).
For example, the user controls the target virtual object through the client running on the terminal 400-1 to perform an interactive operation for the target virtual vehicle, so as to obtain the ownership of the target virtual vehicle; and when the target virtual object has acquired the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to the evacuation state, the target virtual object is controlled to board the target virtual vehicle for evacuation, so as to indicate that the target virtual object wins the interactive game.
In an exemplary scene, in a virtual shooting application, virtual scene technology is adopted to enable a trainee to experience a battlefield environment visually and auditorily in a realistic way, to become familiar with the environmental characteristics of the area to be fought in, and to interact with objects in the virtual environment through the necessary equipment. The virtual battlefield environment can be realized through background generation and image synthesis based on a corresponding three-dimensional battlefield environment graphic image library, including the combat background, battlefield scenes, various weaponry, fighters and the like, so as to create a danger-filled, near-real three-dimensional battlefield environment.
In actual implementation, the terminal 400-1 controls a target virtual object (e.g., a simulated fighter) to conduct an exercise with other users by connecting to the server 200, and the terminal 400-1 outputs a picture of a virtual scene (e.g., a city A) including the target virtual object and a target virtual vehicle for interaction between at least two virtual objects, where the target virtual object is the simulated fighter controlled by the user. For example, the user controls the target virtual object through the client running on the terminal 400-1 to perform an interactive operation for the target virtual vehicle, so as to obtain the ownership of the target virtual vehicle; and when the target virtual object has acquired the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to the evacuation state, the target virtual object is controlled to board the target virtual vehicle for evacuation, so as to indicate that the target virtual object wins the interactive game.
Referring to fig. 2, fig. 2 is an optional schematic structural diagram of a computer device 500 provided in an embodiment of the present application. In practical application, the computer device 500 may be the terminal or the server 200 in fig. 1; a computer device implementing the method for controlling a virtual object in the embodiments of the present application is described below by taking the computer device as the terminal shown in fig. 1 as an example. The computer device 500 shown in fig. 2 includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in the computer device 500 are coupled together by a bus system 540. It can be understood that the bus system 540 is used to enable connection and communication among these components. In addition to a data bus, the bus system 540 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 540 in fig. 2.
The processor 510 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 530 includes one or more output devices 531 enabling presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 530 also includes one or more input devices 532, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 550 optionally includes one or more storage devices physically located remote from processor 510.
The memory 550 may comprise volatile memory or nonvolatile memory, and may also comprise both volatile and nonvolatile memory. The nonvolatile memory may be a Read Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 may be capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a network communication module 552 for communicating to other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: bluetooth, wireless compatibility authentication (WiFi), and Universal Serial Bus (USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
an input processing module 554 to detect one or more user inputs or interactions from one of the one or more input devices 532 and to translate the detected inputs or interactions.
In some embodiments, the control device of the virtual object provided in the embodiments of the present application may be implemented in software. Fig. 2 illustrates a control device 555 of the virtual object stored in the memory 550, which may be software in the form of programs and plug-ins, and includes the following software modules: a display module 5551, an interaction module 5552 and an evacuation module 5553. These modules are logical, and thus may be arbitrarily combined or further divided depending on the functions to be implemented.
The functions of the respective modules will be explained below.
In other embodiments, the control device of the virtual object provided in this embodiment may be implemented in hardware. As an example, the control device of the virtual object provided in this embodiment may be a processor in the form of a hardware decoding processor, which is programmed to execute the control method of the virtual object provided in this embodiment; for example, the processor in the form of a hardware decoding processor may be one or more Application-Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate Arrays (FPGAs), or other electronic components.
Next, the method for controlling a virtual object provided in the embodiments of the present application will be described. In actual implementation, the method for controlling a virtual object provided in the embodiments of the present application may be implemented by a server or a terminal alone, or implemented by a server and a terminal in cooperation.
Referring to fig. 3, fig. 3 is an alternative flowchart of a control method for a virtual object according to an embodiment of the present application, and will be described with reference to the steps shown in fig. 3.
Step 301: the terminal presents a picture of a virtual scene corresponding to the interactive game, and displays a target virtual object and a target virtual carrier for interaction of at least two virtual objects in the picture.
In practical application, an application program supporting the virtual scene is installed on the terminal. The application program can be any one of a first-person shooting game, a third-person shooting game, a multiplayer online battle arena game, a virtual reality application program, a three-dimensional map program or a multiplayer gunfight survival game. The user can use the terminal to operate the target virtual object located in the virtual scene to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, and throwing. Illustratively, the target virtual object is a virtual character, such as a character or an animation character modeled on a real character.
When a user opens an application program on a terminal, and the terminal runs the application program, the terminal presents a picture of a virtual scene, wherein the picture of the virtual scene is obtained by observing the virtual scene at a first person object view angle or a third person object view angle, and the picture of the virtual scene comprises an interactive object and an object interactive environment, such as a target virtual object controlled by the current user.
In practical implementation, the target virtual vehicle may be a virtual automobile, a virtual aircraft, a virtual yacht, or the like; the type of the target virtual vehicle is not specifically limited herein. The target virtual vehicle may be located in the virtual scene from the beginning, or may appear in the virtual scene after the interactive game has started for a period of time, for example, ten minutes after the interactive game starts; or the target virtual vehicle may appear in the virtual scene when the interactive game has progressed to some extent (for example, only a preset number of virtual objects remain alive). The generation time of the target virtual vehicle is not limited. Here, when the target virtual vehicle is within the visual field of the target virtual object, the target virtual vehicle is presented in the picture of the virtual scene; or, when the target virtual vehicle is not within the visual field of the target virtual object, the user may control the target virtual object to move, so that the target virtual vehicle is presented in the picture of the virtual scene once it is within the visual field of the target virtual object.
Step 302: and when the target virtual carrier is in a waiting state, responding to the interactive instruction aiming at the target virtual object, and controlling the target virtual object to execute interactive operation aiming at the target virtual carrier so as to acquire the ownership of the target virtual carrier.
In actual implementation, when the target virtual vehicle is in a waiting state, the virtual object may perform an interactive operation for the target virtual vehicle, and when the target virtual object completes the interactive operation, the ownership of the target virtual vehicle may be obtained.
In practical applications, the interactive instruction for the target virtual object may be triggered by a virtual key, a physical key, voice, or the like, and the triggering manner of the interactive instruction for the target virtual object is not limited herein. Here, the terminal may present interaction prompt information in a screen of the virtual scene to prompt the user to trigger an interaction instruction for the target virtual vehicle. For example, fig. 4 is a schematic diagram of presenting an interaction prompt message provided by an embodiment of the present application, referring to fig. 4, where the target virtual vehicle is an airship, and the interaction prompt message 401 is presented in a screen of a virtual scene to prompt a user to trigger an interaction instruction for the airship, such as by using a key E.
In some embodiments, the terminal may further control the target virtual vehicle to move in the virtual scene before controlling the target virtual object to perform the interactive operation for the target virtual vehicle; counting down the time when the target virtual carrier reaches the target position in the moving process of the target virtual carrier; and when the countdown time is zero, controlling the target virtual vehicle to be stationary at the target position so as to enable the target virtual vehicle to be in a waiting state.
In practical implementation, different movement modes can be adopted for different types of target virtual vehicles: for example, when the target virtual vehicle is a virtual automobile, the target virtual vehicle can be controlled to move on land; when the target virtual vehicle is a virtual flying vehicle, the target virtual vehicle can be controlled to fly in the air; and when the target virtual vehicle is a virtual yacht, the target virtual vehicle can be controlled to sail on the sea. During the movement of the target virtual vehicle, the time until the target virtual vehicle reaches the target position is counted down, and the countdown is presented in the picture of the virtual scene; when the countdown reaches zero, this indicates that the target virtual vehicle has reached the target position, and the target virtual vehicle is controlled to remain stationary at the target position.
As an example, fig. 5 is a schematic view of a screen of a virtual scene provided in an embodiment of the present application, and referring to fig. 5, during a process of moving a target virtual vehicle, a countdown 501 is displayed in the screen of the virtual scene to indicate a time when the target virtual vehicle reaches a target position.
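The countdown-then-wait behavior described above can be sketched as a small state update. The names `Vehicle` and `VehicleState` and the timings are illustrative assumptions, not taken from the embodiment itself:

```python
from dataclasses import dataclass
from enum import Enum

class VehicleState(Enum):
    MOVING = "moving"          # en route to the target position
    WAITING = "waiting"        # stationary at the target position
    EVACUATING = "evacuating"  # carrying objects out of the scene

@dataclass
class Vehicle:
    state: VehicleState
    countdown: float  # seconds remaining until the target position is reached

    def tick(self, dt: float) -> None:
        # While moving, count down the arrival time; when the countdown
        # reaches zero, hold the vehicle still in the waiting state.
        if self.state is VehicleState.MOVING:
            self.countdown = max(0.0, self.countdown - dt)
            if self.countdown == 0.0:
                self.state = VehicleState.WAITING

vehicle = Vehicle(VehicleState.MOVING, countdown=3.0)
for _ in range(3):
    vehicle.tick(1.0)
```

The displayed countdown 501 of fig. 5 would simply render `vehicle.countdown` each frame while the state is `MOVING`.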
In some embodiments, the target virtual vehicle is a virtual flying vehicle, and the target virtual vehicle may be controlled to move in the virtual scene by: when at least two virtual object teams exist in the virtual scene, controlling the virtual flying vehicle to pass over the virtual object team to which the target virtual object belongs; and after the virtual flying vehicle skims over each virtual object team, controlling the virtual flying vehicle to hover in an air area corresponding to the target position.
In practical implementation, when the target virtual vehicle is a virtual flying vehicle, such as an airplane or an airship, after appearing in the virtual scene it first skims over the virtual object team to which the target virtual object belongs, where a virtual object team may include one or more virtual objects. The flyover may pass over each virtual object in the virtual object team, or only over a key virtual object in the team; for example, the key virtual object may be the team leader of the virtual object team. After the virtual flying vehicle skims over each virtual object team, the virtual flying vehicle is controlled to hover in an aerial area corresponding to the target position, where the aerial area may be the airspace directly above an area range containing the target position.
In some embodiments, controlling the virtual flying vehicle to fly over the virtual object team to which the target virtual object belongs may be implemented as follows: first, the position of the team leader in the virtual object team is obtained; then a collision cuboid is generated at a preset height (for example, 10 cm) above the leader's head, oriented along the leader's current facing direction, where the cuboid serves as the path data for playing the airship animation; whether the cuboid currently collides with the scene is determined, and if a collision occurs, the cuboid is rotated by a certain angle and the determination is repeated until no collision occurs; the animation of the airship flying along the corresponding path is then played based on the cuboid.
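The rotate-until-clear path selection described above might be sketched as follows. The `collides` scene-query callback, the 15-degree step, and all names here are hypothetical assumptions for illustration only:

```python
def find_clear_heading(leader_heading_deg, collides, step_deg=15.0):
    """Rotate the candidate flyover path until its collision box is clear."""
    heading = leader_heading_deg % 360.0
    for _ in range(int(360.0 / step_deg)):
        if not collides(heading):   # hypothetical scene query for the cuboid
            return heading          # clear path: play the flyover along it
        heading = (heading + step_deg) % 360.0
    return None                     # no clear heading at any rotation

# Example scene in which every heading below 90 degrees is blocked.
blocked_below_90 = lambda heading: heading < 90.0
clear = find_clear_heading(0.0, blocked_below_90)
```

The returned heading would then orient the collision cuboid along which the airship animation is played.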
According to the method and the device, the virtual flying vehicle is controlled to pass over the virtual object team to which the target virtual object belongs, so that each virtual object team can know that the target virtual vehicle has appeared, thereby establishing a clear victory target for the user.
In some embodiments, the terminal may also display a prompt message to inform the user that the virtual flying vehicle has entered the virtual scene when the virtual flying vehicle skims over the virtual object team. As an example, fig. 6 is a schematic view of a virtual scene provided by an embodiment of the present application; referring to fig. 6, the virtual flying vehicle is an airship, and a target virtual object 601 and an airship 602 are shown in the picture of the virtual scene, with the airship skimming over the virtual object team to which the target virtual object belongs; at the same time, a prompt 603 is presented to inform the user that the airship is entering the evacuation zone.
In some embodiments, before controlling the target virtual vehicle to rest at the target location, a map thumbnail of the virtual scene is also shown; and displaying the position information of the target position in the map thumbnail.
In practical implementation, accurate position information of the target position, such as a precise coordinate point, can be displayed in the map thumbnail; alternatively, an area range containing the target position can be shown. As an example, taking the case of showing an area range containing the target position, fig. 7 is a schematic view of a picture of a virtual scene provided in the embodiment of the present application; referring to fig. 7, a map thumbnail is shown in the picture of the virtual scene, and a circular area 701 is shown in the map thumbnail to indicate that the airship will land within the range indicated by the circular area.
In practical application, a small map thumbnail can be displayed first, and when a user needs to view the position information, an enlargement operation for the map thumbnail can be triggered to present the enlarged map thumbnail.
In some embodiments, when an area range is shown in the map thumbnail, the area range may change; that is, the area range may gradually shrink during the movement of the target virtual vehicle, so as to indicate the target position with increasing precision. For example, the moving time of the target virtual vehicle may be divided into three equal parts, and the landing area is reduced once at the end of each part: three circles from large to small correspond to the area range, with the circle shrinking once per interval.
In practical application, when the target virtual vehicle is a virtual flight vehicle, the hovering position of the virtual flight vehicle corresponds to the area range in the map thumbnail, that is, the virtual flight vehicle can seek a path and hover according to the area range.
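The three-stage shrinking of the landing area can be illustrated with a minimal sketch; the radii and the total flight time below are assumed values, not taken from the embodiment:

```python
def landing_radius(elapsed, total_time, radii=(300.0, 200.0, 100.0)):
    """Return the landing-circle radius for the current third of the flight.

    The flight time is split into three equal parts; the circle steps down
    from the largest to the smallest radius once per part.
    """
    stage = min(int(3 * elapsed / total_time), 2)  # stage 0, 1 or 2
    return radii[stage]
```

For a 90-second flight, the thumbnail would show the 300-unit circle during the first 30 seconds, then 200, then 100, matching the "3 circles from large to small" example above.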
In some embodiments, during the moving of the target virtual vehicle, the terminal may further present a prompt message in a screen of the virtual scene according to a relative position between an area range including the target position and the target virtual object, so that a user may trigger a moving instruction for the target virtual object according to the prompt message to control the target virtual object to move to the target position.
As an example, fig. 8 is a schematic diagram of prompt information of a landing area provided in the embodiment of the present application, and referring to fig. 8, the landing area is an area range including a target position, the prompt information 801 of the landing area is presented in a screen of a virtual scene, and a direction of the landing area relative to a target virtual object can be known according to the prompt information, so that a user triggers a movement instruction according to the direction to control the target virtual object to move.
In some implementations, the terminal may also show the relative positions of the target virtual object and the target virtual vehicle in real time before controlling the target virtual object to perform the interactive operation for the target virtual vehicle; receiving a movement instruction for the target virtual object based on the relative position; and controlling the target virtual object to move towards the target virtual carrier in response to the movement instruction for the target virtual object.
In practical implementation, the relative position includes a direction and a distance of the target virtual vehicle relative to the target virtual object, so that a user can trigger a movement operation for the target virtual object according to the displayed relative position to control the target virtual object to move towards the target virtual vehicle.
In some embodiments, the relative position may be presented in text form to specifically indicate the direction of the target virtual vehicle relative to the target virtual object and the distance between them. For example, fig. 9 is a schematic diagram of position prompt information of the airship provided in the embodiment of the present application; referring to fig. 9, when the target virtual vehicle is an airship, the relative position 901 of the target virtual object and the airship is shown in the picture of the virtual scene, so that the direction and the distance of the airship relative to the target virtual object are known.
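A possible way to compute the direction and distance shown in such a prompt is sketched below, assuming a two-dimensional top-down coordinate system with north along +y; the compass-bearing convention is an assumption, not part of the embodiment:

```python
import math

def relative_position(object_xy, vehicle_xy):
    """Bearing (degrees, 0 = north, clockwise) and distance to the vehicle."""
    dx = vehicle_xy[0] - object_xy[0]
    dy = vehicle_xy[1] - object_xy[1]
    distance = math.hypot(dx, dy)
    # atan2(dx, dy) rather than atan2(dy, dx) yields a compass bearing.
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    return bearing, distance

# A vehicle 100 units east and 100 units north of the object.
bearing, distance = relative_position((0.0, 0.0), (100.0, 100.0))
```

The prompt of fig. 9 would then format these two numbers as text, e.g. "airship: 45°, 141 m".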
In some implementations, the terminal may control the target virtual object to perform the interaction with the target virtual vehicle by: responding to an interactive instruction aiming at the target virtual object, and acquiring object information of the virtual object in the association area of the target virtual carrier; and controlling the target virtual object to execute interactive operation aiming at the target virtual carrier when the target virtual object is determined to be in the associated area and the virtual objects in the associated area belong to the same virtual object team based on the object information.
In practical applications, the virtual objects in the associated area can perform the interactive operation on the target virtual vehicle only when they all belong to the same virtual object team. Based on this, when an interaction instruction for the target virtual object is received, the object information of the virtual objects in the associated area is acquired to determine whether the virtual objects in the associated area belong to the same virtual object team; if so, the target virtual object is controlled to execute the interactive operation for the target virtual vehicle; otherwise, the target virtual object is not controlled to execute the interactive operation for the target virtual vehicle. Here, when the virtual objects in the associated area do not belong to the same virtual object team, a prompt message may also be presented to inform the user that the interactive operation cannot be performed.
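The same-team gating check described above might look like the following sketch; the id-to-team mapping is an illustrative assumption:

```python
def can_interact(target_id, objects_in_area):
    """objects_in_area maps each object id in the associated area to its team id.

    The interaction may proceed only when the target object is itself in the
    area and every object in the area belongs to a single team.
    """
    if target_id not in objects_in_area:
        return False                # the target must itself be in the area
    return len(set(objects_in_area.values())) == 1  # exactly one team present
```

If `can_interact` returns `False` because a rival team is present, the terminal would present the "cannot interact" prompt mentioned above instead of starting the operation.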
In some implementations, the terminal may further control the interactive operation to be in an interrupted state when there is a virtual object that belongs to a different virtual object team with the target virtual object entering the association area during the target virtual object performing the interactive operation for the target virtual vehicle; in response to a control instruction for a target virtual object, the control target virtual object interacts with virtual objects belonging to different teams of virtual objects.
In actual implementation, since the interactive operation for the target virtual vehicle can be executed only when all the virtual objects in the associated area belong to one virtual object team, if a virtual object of another virtual object team enters the associated area during the execution of the interactive operation, the interactive operation will be interrupted. Here, the user may trigger a control instruction for the target virtual object, which then interacts with the virtual objects belonging to different virtual object teams so as to drive the virtual objects of the other virtual object teams out of the associated area.
In the process that the interactive operation is in the interrupted state, whether the virtual objects in the associated area belong to the same virtual object team is monitored in real time, if yes, whether the virtual object team is the virtual object team to which the target virtual object belongs is further judged, and if yes, the target virtual object can be continuously controlled to execute the interactive operation aiming at the target virtual carrier; if the target virtual object does not belong to the virtual object team, the interaction progress of the interaction operation executed by the target virtual object is cleared.
In practical application, when the virtual object enters the association area and stays in the association area for a preset time (such as 1 second), the virtual object is determined to enter the association area, so that the user can be prevented from interfering with the execution of the interactive operation by controlling the way that the virtual object rapidly enters and exits the association area.
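The interrupt-resolution rule above, namely resume when only the target's own team remains in the area and clear the progress when a rival team holds it, can be sketched as follows; all names are illustrative:

```python
def resolve_interrupt(target_team, teams_in_area, progress):
    """Re-evaluate an interrupted interaction.

    Returns (new_progress, resumed): progress is kept and the operation
    resumes if only the target's own team occupies the area; progress is
    cleared if a rival team alone holds the area; otherwise the operation
    stays interrupted with its progress intact.
    """
    if len(set(teams_in_area)) != 1:
        return progress, False   # area still contested: stay interrupted
    if teams_in_area[0] == target_team:
        return progress, True    # only the target's own team: resume
    return 0.0, False            # a rival team holds the area: clear progress
```

The one-second dwell requirement would be applied before an object is added to `teams_in_area`, so that rapidly entering and leaving the area does not trigger this resolution.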
In some implementations, the terminal may further control the target virtual vehicle to periodically scan the virtual object within the associated area and mark the scanned virtual object.
Here, when the target virtual object is in the associated area, the virtual objects in the associated area are periodically scanned and marked, so that the user can control the target virtual object to interact with the other virtual objects in the associated area according to the marks, and drive the virtual objects of the other virtual object teams out of the associated area. The specific marking method is not limited here, as long as the virtual objects in the associated area can be distinguished from other virtual objects.
As an example, fig. 10 is a schematic view of a virtual scene provided in an embodiment of the present application, and referring to fig. 10, a virtual object in an associated area is marked in the virtual scene, such as marking virtual object 1001.
In some embodiments, the terminal may further display an interaction score obtained by the interaction between the target virtual object and another virtual object; correspondingly, the terminal can control the target virtual object to execute the interactive operation aiming at the target virtual carrier in the following way: and responding to the interaction instruction aiming at the target virtual object, and controlling the target virtual object to execute the interaction operation aiming at the target virtual carrier when the interaction score reaches an interaction score threshold value.
In practical implementation, the target virtual object can perform the interactive operation on the target virtual vehicle only when the interaction score of the target virtual object reaches the interaction score threshold. Here, the interaction score obtained by the target virtual object interacting with other virtual objects is displayed and updated in real time, so that when the interaction score reaches the interaction score threshold, the target virtual object can be controlled in real time to perform the interactive operation on the target virtual vehicle.
For example, the interaction score may be a number of kills: if the number of virtual objects killed by the target virtual object reaches a number threshold, the target virtual object is controlled to perform the interactive operation for the target virtual vehicle.
In some embodiments, the terminal may further display an interaction progress of the corresponding interaction operation in a process in which the virtual object performs the interaction operation for the target virtual vehicle, and control the interaction progress to change along with the progress of the interaction operation; and when the interactive progress represents that the interactive operation is completed, determining that the target virtual object obtains the ownership of the target virtual carrier.
In actual implementation, the interaction progress corresponding to the interaction operation can be displayed in the process that the virtual object performs the interaction operation aiming at the target virtual carrier, so that a user can know the interaction progress of the interaction operation in real time, and when the progress value in the interaction progress reaches the maximum value, the interaction operation is determined to be completed, and at the moment, the target virtual object obtains the ownership of the target virtual carrier.
As an example, fig. 11 is a schematic screen view of a virtual scene provided in an embodiment of the present application, and referring to fig. 11, in a process that a virtual object performs an interactive operation on a target virtual vehicle, an interaction progress 1101 of the corresponding interactive operation is shown.
In some embodiments, the terminal may control the interaction progress to change as the interaction operation progresses by: when at least two virtual object teams exist in a virtual scene, acquiring the number of virtual objects for executing interactive operation in the virtual object team to which the target virtual object belongs; and controlling the interactive progress, and changing at a speed matched with the quantity.
In actual implementation, the execution speed of the interactive operation is associated with the number of virtual objects performing it; that is, the greater the number of virtual objects performing the interactive operation, the faster the interactive operation completes. For example, one player needs 3.3 seconds to complete the interactive operation, two players need 2.5 seconds, and three players need 1.6 seconds; correspondingly, the faster the completion, the faster the rate of change of the interaction progress. Based on this, the correspondence between the rate of change of the interaction progress and the number of virtual objects can be preset; in actual implementation, the rate matched to the number is determined according to this correspondence, and the interaction progress is controlled to change at that rate.
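Using the example timings from the passage (1 player: 3.3 s, 2 players: 2.5 s, 3 players: 1.6 s), the number-matched rate of change might be looked up as follows; the table structure and the clamping to 1–3 players are illustrative assumptions:

```python
# Preset correspondence: number of interacting objects -> seconds to complete.
COMPLETION_TIME = {1: 3.3, 2: 2.5, 3: 1.6}

def progress_rate(num_objects):
    """Fraction of the interaction progress filled per second."""
    clamped = min(max(num_objects, 1), 3)  # assume 1-3 participants
    return 1.0 / COMPLETION_TIME[clamped]
```

Each frame, the progress bar would advance by `progress_rate(n) * dt`, so two participants fill it in exactly 2.5 seconds.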
In some embodiments, during the process in which the target virtual object performs the interactive operation on the target virtual vehicle, the terminal may also present the number of virtual objects; and when the interactive operation is completed, present identification information to indicate that the team to which the target virtual object belongs obtains the ownership of the target virtual vehicle.
For example, fig. 12 is a schematic view of a virtual scene provided in an embodiment of the present application. Referring to fig. 12, where the target virtual vehicle is an airship, the interaction progress 1201 and the number 1202 of virtual objects participating in the interactive operation are shown in the view of the virtual scene; when the interaction progress indicates that the interactive operation is completed, the identification information 1203 is presented to indicate that the virtual object team to which the target virtual object belongs obtains the ownership of the airship.
In some embodiments, before it is determined that the virtual object obtains the ownership of the target virtual vehicle, when at least two virtual object teams exist in the virtual scene, the position of each virtual object in the virtual object team to which the target virtual object belongs is obtained; and the interaction progress is reset when, based on these positions, all virtual objects in the virtual object team to which the target virtual object belongs are determined to be outside the associated area of the target virtual vehicle.
Here, if a virtual object team leaves the associated area in the course of performing the interactive operation on the target virtual vehicle, that is, all virtual objects in the virtual object team leave the associated area, the team is considered to have given up competing for the ownership of the target virtual vehicle, and the interaction progress is then reset, that is, returned to its initial state.
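The reset rule above can be sketched as follows. This is an illustrative sketch under the assumption of a circular associated area; the function names and the tuple-based positions are not from the patent.

```python
import math

def in_associated_area(pos, center, radius):
    """True if a 2-D position lies inside the circular associated area."""
    return math.dist(pos, center) <= radius

def maybe_reset_progress(progress, team_positions, center, radius):
    """Reset the interaction progress to 0.0 once no teammate remains inside."""
    if any(in_associated_area(p, center, radius) for p in team_positions):
        return progress  # at least one member is still inside: keep progress
    return 0.0           # whole team left: the team gives up the competition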
In some embodiments, when another virtual object performs the interactive operation on the target virtual vehicle, the terminal may also present object information of that virtual object and the interaction progress corresponding to its interactive operation.
In actual implementation, when other virtual objects perform the interactive operation on the target virtual vehicle, the user can, according to the presented object information and the corresponding interaction progress, control the target virtual object to find the virtual objects performing the interactive operation before it is completed, so as to compete for the ownership of the target virtual vehicle.
As an example, fig. 13 is a schematic view of a screen of a virtual scene provided in an embodiment of the present application, and referring to fig. 13, object information 1301 and an interaction progress 1302 of other virtual objects (enemies) are shown in the screen of the virtual scene.
In some embodiments, when another virtual object or another virtual object team completes the interactive operation, the terminal may present an identification message to indicate that the other virtual object has obtained the ownership of the target virtual vehicle.
Step 303: when the target virtual object obtains the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to the evacuation state, control the target virtual object to take the target virtual vehicle for evacuation, so as to indicate that the target virtual object wins the interactive match.
Here, evacuation refers to leaving the interaction area in which the target virtual object interacts with other virtual objects. For example, fig. 14 is a schematic view of a virtual scene provided by an embodiment of the present application. Referring to fig. 14, the target virtual vehicle is an airship, and the target virtual object is controlled to enter the airship and evacuate by airship.
In some embodiments, the terminal may further show a duration of the target virtual vehicle in the waiting state when the target virtual vehicle is in the waiting state; and when the time length of the target virtual vehicle in the waiting state reaches the time length threshold value, switching the state of the target virtual vehicle from the waiting state to the evacuation state.
In actual implementation, when the target virtual vehicle switches to the waiting state, the duration for which it has been in the waiting state is timed and displayed. Here, the displayed duration changes in real time.
In practical application, the duration of the target virtual vehicle in the waiting state can be displayed in a numerical value form, and can also be displayed in a progress bar form. Here, when the duration in which the target virtual vehicle is in the waiting state is displayed in a numerical form, a process of counting up or counting down the duration may be presented.
As an example, fig. 15 is a schematic view of a screen of a virtual scene provided in an embodiment of the present application, and referring to fig. 15, a time length of the target virtual vehicle in the waiting state is counted down, and when the count-down reaches zero, a state of the target virtual vehicle is switched from the waiting state to the evacuation state.
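The waiting-state countdown and the state switch described above can be sketched as follows. This is a minimal illustration; the class, enum, and method names are assumptions for the sketch, not taken from the patent.

```python
from enum import Enum, auto

class VehicleState(Enum):
    WAITING = auto()
    EVACUATING = auto()

class TargetVehicle:
    def __init__(self, wait_duration: float):
        self.state = VehicleState.WAITING
        self.remaining = wait_duration  # countdown shown to the user

    def tick(self, dt: float) -> None:
        """Advance the countdown by dt seconds of game time."""
        if self.state is VehicleState.WAITING:
            self.remaining = max(0.0, self.remaining - dt)
            if self.remaining == 0.0:
                # Countdown reached zero: switch waiting -> evacuation.
                self.state = VehicleState.EVACUATING
```

Each simulation tick decrements the displayed countdown; once it hits zero, the vehicle's state flips exactly once, mirroring the waiting-to-evacuation transition described above.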
In some embodiments, when the state of the target virtual vehicle is switched from the waiting state to the evacuation state and no virtual object has obtained the ownership of the target virtual vehicle, the terminal may control the target virtual vehicle to evacuate by itself and display result prompt information of the interactive match, prompting that no virtual object has won the interactive match.
In actual implementation, if the state of the target virtual vehicle is switched from the waiting state to the evacuation state and no virtual object has obtained its ownership, the current interactive match is a drawn match, that is, no virtual object can win it; the target virtual vehicle evacuates by itself, and the result prompt information of the interactive match is displayed.
In some embodiments, when the target virtual object obtains the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to the evacuation state, the terminal may control the target virtual object to evacuate with the target virtual vehicle as follows: when at least two virtual object teams exist in the virtual scene, the life value of each virtual object in the virtual object team to which the target virtual object belongs is obtained; when the target virtual object obtains the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to the evacuation state, if the life value of at least one virtual object in that team is higher than the life value threshold, the virtual object team to which the target virtual object belongs is controlled to take the target virtual vehicle for evacuation.
Here, a life value higher than the life value threshold indicates that the virtual object is not dead and has not lost combat power; the life value threshold may be zero or a non-zero value (for example, when the total life value is 100, the life value threshold may be set to 10).
In practical application, when the state of the target virtual vehicle is switched from the waiting state to the evacuation state, the target virtual vehicle carries the owning team away only if the life value of at least one virtual object in the virtual object team that acquired its ownership is higher than the life value threshold; otherwise, the interactive match is determined to be a drawn match, that is, no virtual object wins, and the target virtual vehicle evacuates by itself.
In actual implementation, when the target virtual object acquires the ownership of the target virtual vehicle, it is determined whether at least one virtual object in the team to which the target virtual object belongs has a life value higher than the life value threshold; only in that case is the team controlled to take the target virtual vehicle for evacuation.
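The win-or-draw decision above reduces to a single check over the owning team's life values. The following is a hedged sketch; the function name, the string results, and the default threshold of zero are illustrative assumptions consistent with the examples in the text.

```python
def evacuation_result(team_life_values, life_threshold=0):
    """Return "win" if the owning team can evacuate with the vehicle,
    or "draw" if every member is at or below the life value threshold."""
    if any(hp > life_threshold for hp in team_life_values):
        return "win"   # at least one member alive: team boards and evacuates
    return "draw"      # no member alive: vehicle evacuates by itself
```

For instance, with a total life value of 100 and a non-zero threshold of 10 as in the example above, a team whose members all sit at 5 to 8 life points would still produce a draw.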
By applying the above embodiment, when the target virtual vehicle is in the waiting state, the target virtual object is controlled, in response to an interaction instruction for the target virtual vehicle, to perform the interactive operation on the target virtual vehicle so as to acquire its ownership; when the target virtual object obtains the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to the evacuation state, the target virtual object is controlled to take the target virtual vehicle for evacuation, indicating that the target virtual object wins the interactive match. In this way, the target virtual object wins only if it both obtains the ownership of the target virtual vehicle and the vehicle's state switches from waiting to evacuation, so that while the target virtual vehicle is in the waiting state, the virtual objects can compete for its ownership through interaction; this strengthens the interaction among virtual objects in the virtual scene, and the result of the interactive match more accurately reflects the interaction capability of the virtual objects.
Continuing to describe the method for controlling a virtual object provided in the embodiment of the present application, fig. 16 is a schematic flowchart of the method for controlling a virtual object provided in the embodiment of the present application, and referring to fig. 16, the method for controlling a virtual object provided in the embodiment of the present application includes:
step 1601: the terminal presents a start game button.
Step 1602: in response to a click operation on the start game button, the terminal sends a request to the server for scene data of the virtual scene.
Step 1603: and the server sends the scene data to the terminal.
Step 1604: the terminal renders based on the received scene data, presents a picture for displaying the virtual scene, and displays the target virtual object in the picture.
Step 1605: when the elapsed time since the start of the game reaches a first duration threshold, control the virtual flying vehicle to fly over the virtual object team to which the target virtual object belongs.
Step 1606: and after the virtual flying vehicle skims over each virtual object team, controlling the virtual flying vehicle to hover in an air area corresponding to the target position, and displaying the position information of the target position in a map thumbnail.
Step 1607: and when the landing time of the virtual flying carrier is reached, controlling the virtual flying carrier to land to the target position so as to enable the virtual flying carrier to be in a waiting state.
Step 1608: and displaying the duration of the target virtual vehicle in a waiting state.
Step 1609: in response to the movement instruction for the target virtual object, the target virtual object is moved into the associated area of the virtual flying vehicle.
Step 1610: when virtual objects in other teams of virtual objects are included in the associated area, the virtual objects in the other teams of virtual objects in the associated area are tagged.
Step 1611: when no virtual objects of other virtual object teams are contained in the associated area, control the target virtual object, in response to the interaction instruction for the target virtual vehicle, to perform the interactive operation on the target virtual vehicle.
Step 1612: and when the target virtual object finishes the interactive operation, determining that the target virtual object obtains the ownership of the virtual flying carrier.
Step 1613: and when the time length of the virtual flying vehicle in the waiting state reaches a second time length threshold value, controlling the target virtual object to take the target virtual vehicle for evacuation.
By applying this embodiment, the target virtual object is determined to win only when it obtains the ownership of the target virtual vehicle and the duration for which the virtual flying vehicle has been in the waiting state reaches the duration threshold. Thus, while the target virtual vehicle is in the waiting state, virtual objects can compete for its ownership through interaction, which strengthens the interaction among virtual objects in the virtual scene, and the result of the interactive match more accurately reflects the interaction capability of the virtual objects.
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described, taking the case where the virtual vehicle is an airship as an example. In actual implementation, after a period of time from the start of the match (for example, 10 minutes), the airship randomly lands at a certain position in the virtual scene. After landing, the airship is in a waiting state, and users can control their virtual objects to perform the interactive operation on the airship; a team that interacts with the airship successfully temporarily obtains its ownership, and the ownership is continuously transferred as new teams interact successfully. When the evacuation time arrives, the airship begins to evacuate, and the team holding the ownership of the airship at that moment wins the competitive match.
FIG. 17 is a schematic diagram of a processing flow for the airship according to an embodiment of the present application; referring to FIG. 17, the processing flow for the airship includes:
step 1701: and entering a prompting phase.
In the prompting stage, the terminal can show, in the picture of the virtual scene, the process in which the airship flies over the virtual object team to which the target virtual object belongs, the airship flying over each virtual object team in the virtual scene; meanwhile, prompt information is displayed to establish a clear victory target for the user.
As an example, referring to fig. 6, the virtual flying vehicle is an airship, and a target virtual object 601 and the airship 602 are shown in the picture of a virtual scene, the airship skims over a virtual object team to which the target virtual object belongs; at the same time, a prompt 603 is presented to inform the user that the airship is entering the evacuation zone.
Step 1702: enter the hover phase.
During the hover phase, the airship hovers in an air area corresponding to the landing area where it will land, indicating the range of the landing area. Here, when the airship is in the field of view of the target virtual object, the terminal displays an animation of the airship hovering in the air area in the view of the virtual scene. Whether or not the airship is in the field of view of the target virtual object, the terminal also displays a map thumbnail and shows the prompt information of the landing area in it. While the airship hovers, the terminal also displays a countdown to the airship's landing time; when the countdown reaches zero, the hover phase ends.
As an example, referring to fig. 7, a map thumbnail is shown in the screen of the virtual scene, and a circular area 701 is shown in the map thumbnail to indicate that the airship will land within the range indicated by the circular area.
Step 1703: judge whether the hover phase is finished; if so, execute step 1704; otherwise, execute step 1705.
Step 1704: and entering a waiting evacuation stage.
Here, when the countdown corresponding to the landing time reaches zero, the airship randomly selects one spot (the target position) within the landing area and lands there, so that the airship is in the waiting state. While the airship is in the waiting state, the terminal displays a waiting countdown in the picture of the virtual scene to indicate the time at which the airship will evacuate.
In practical application, in the evacuation waiting phase, a user can trigger a control instruction for a target virtual object through a terminal to control the target virtual object to execute a corresponding operation so as to compete for the ownership right for a target virtual vehicle.
Step 1705: and periodically updating the prompt information of the landing area.
Here, in the hover phase, the landing area is periodically updated, that is, gradually reduced, and the prompt information of the landing area is correspondingly updated in the map thumbnail; for example, where the landing area is indicated by a circular area, the circular area in the map thumbnail gradually shrinks. After the landing area shrinks, the airship re-plans its hover path according to the reduced landing area, so that its hover position continues to correspond to the landing area.
As an example, the waiting time may be divided into three parts, and the landing area shrinks once every third of the waiting time; for example, 3 circles from large to small are used, the circle shrinking to the next size after each third of the waiting time, and the airship starts to land when the waiting time ends.
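The three-stage shrinking schedule above can be sketched as a simple lookup over elapsed waiting time. This is an illustrative sketch; the three radii are hypothetical values, and the function name is an assumption, not from the patent.

```python
RADII = [300.0, 200.0, 100.0]  # large -> small; hypothetical radii in meters

def landing_radius(elapsed: float, wait_total: float) -> float:
    """Radius of the landing area after `elapsed` seconds of the waiting time.

    The waiting time is split into three equal parts; the circle shrinks to
    the next size at each third, and stays at the smallest size once the
    waiting time is over (when the airship starts to land)."""
    if elapsed >= wait_total:
        return RADII[-1]
    stage = int(3 * elapsed / wait_total)  # 0, 1 or 2
    return RADII[stage]
```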
Step 1706: judging whether the evacuation time is reached, if yes, executing a step 1712; otherwise, step 1707 is performed.
Here, whether the evacuation time is reached is judged by the waiting countdown; when the waiting countdown ends, it is determined that the evacuation time is reached; otherwise, it is determined that the evacuation time has not arrived.
Step 1707: judge whether any virtual object team has obtained the ownership of the target virtual vehicle; if so, execute step 1708; otherwise, execute step 1706.
Step 1708: enter the waiting-for-launch phase.
Step 1709: judge whether the launch time is reached; if so, execute step 1710; otherwise, execute step 1708.
Here, whether the launch time is reached is judged by the waiting countdown; when the waiting countdown ends, it is determined that the launch time is reached; otherwise, it is determined that the launch time has not arrived.
In practical implementation, both the evacuation time and the launch time are judged by the waiting countdown; the difference is that when a virtual object team obtains the ownership of the target virtual vehicle, the display style of the waiting countdown is updated to indicate that the waiting-for-launch phase has been entered.
Figs. 18A-18B are schematic diagrams of the waiting countdown provided by the embodiment of the present application. Referring to figs. 18A-18B, the display styles of the waiting countdown differ: the waiting countdown shown in fig. 18A corresponds to the waiting-for-evacuation phase, and the waiting countdown shown in fig. 18B corresponds to the waiting-for-launch phase.
Step 1710: judge whether any virtual object in the team that obtained the ownership of the airship is still alive; if so, execute step 1711; otherwise, execute step 1712.
Step 1711: the virtual object team that determines to obtain the right of ownership of the airship wins the interactive game.
Here, the airship evacuates with the virtual object team.
Step 1712: determine that the match is a drawn match.
Here, a drawn match means that no team wins the interactive match.
The manner of acquiring the ownership will be described below. In actual implementation, when the airship lands, an associated area corresponding to the airship is generated. When only virtual objects of a single virtual object team are in the associated area, the virtual objects of that team can perform the interactive operation on the airship; here, the terminal can display the interaction progress of the team, which changes as the interactive operation proceeds. The change speed of the interaction progress is related to the number of virtual objects in the team performing the interactive operation: the more virtual objects, the faster the progress changes; for example, 1 person needs 3.3 seconds to complete the interactive operation, 2 persons need 2.5 seconds, and 3 persons need 1.6 seconds. In the process of performing the interactive operation, if all virtual objects of the team leave the associated area, the interaction progress is cleared; if virtual objects of another team appear in the associated area, the interactive operation is interrupted and the occupation progress is suspended. When a team completes the interactive operation, the team successfully acquires occupation; at that point, even if all members of the team leave the associated area, the ownership of the airship does not change.
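The occupation rules above (progress suspended while contested, cleared when the team fully leaves, ownership locked on completion) can be consolidated into one per-tick update. This is a hypothetical sketch; the function signature and flag names are assumptions for illustration only.

```python
def update_occupation(progress, owner_locked, team_in_area, enemy_in_area, rate, dt):
    """One simulation tick of the occupation rules.

    Returns (new_progress, owner_locked). `rate` is progress per second,
    determined elsewhere from the number of participating team members."""
    if owner_locked:
        return progress, True   # ownership acquired: leaving no longer matters
    if not team_in_area:
        return 0.0, False       # whole team left the area: clear the progress
    if enemy_in_area:
        return progress, False  # contested: operation interrupted, progress held
    progress = min(1.0, progress + rate * dt)
    return progress, progress >= 1.0  # lock ownership once progress completes
```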
It should be noted that the airship does not land on the ground, but hovers in mid-air; on the one hand, this forms a random attack-and-defense environment with the building groups in the virtual scene, and on the other hand, it reserves enough three-dimensional space for users. For example, figs. 19A-19B are schematic diagrams of building groups provided by an embodiment of the present application; referring to fig. 19A, there are 4 attack-and-defense routes in one building group; referring to fig. 19B, there are 3 attack-and-defense routes in another building group.
When the target virtual object controlled by the user is in the associated area, the virtual objects in the associated area are periodically scanned and marked, so that the user can, according to the marks, control the target virtual object to interact with the other virtual objects in the associated area and drive other virtual object teams out of it. Here, a virtual object is determined to have entered the associated area only after it has entered and stayed there for a preset time (e.g., 1 second); this prevents a user from interfering with the game by controlling a virtual object to rapidly enter and exit the associated area.
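The dwell rule above can be sketched as a small per-object tracker. This is an illustrative sketch; the class name and the update interface are assumptions, while the 1-second dwell time comes from the example in the text.

```python
DWELL_TIME = 1.0  # seconds an object must stay inside before it counts

class AreaMembership:
    """Tracks how long a virtual object has stayed inside the associated area.

    The object only counts as "in the area" after it has remained inside
    continuously for DWELL_TIME, so rapidly crossing the boundary has no
    effect on the occupation logic."""

    def __init__(self):
        self.inside_for = 0.0

    def update(self, inside_now: bool, dt: float) -> bool:
        """Advance by dt seconds; return True once the dwell time is met."""
        self.inside_for = self.inside_for + dt if inside_now else 0.0
        return self.inside_for >= DWELL_TIME
```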
In practical applications, the terminal may display, in a screen of the virtual scene, prompt information of a landing area of the airship, where the prompt information is determined according to a relative position of the target virtual object and the landing area, so that a user may trigger a movement instruction for the target virtual object according to the prompt information to control the target virtual object to move towards the airship. For example, referring to fig. 8, the landing area is an area range including the target position, the prompt information 801 of the landing area is presented in the screen of the virtual scene, and the direction of the landing area relative to the target virtual object can be known according to the prompt information, so that the user triggers a movement instruction according to the direction to control the target virtual object to move.
Alternatively, the terminal may show the relative position of the airship and the target virtual object in the screen of the virtual scene, so that the user may trigger a movement instruction for the target virtual object according to the relative position to control the target virtual object to move towards the airship. For example, referring to fig. 9, when the target virtual vehicle is an airship, the relative position 901 of the target virtual object and the airship is shown in the picture of the virtual scene, so that the direction and distance of the airship with respect to the target virtual object are known.
When the target virtual object moves into the associated area of the airship and only virtual objects belonging to the same virtual object team as the target virtual object are in the associated area, interaction prompt information is presented to prompt the user to control the target virtual object to perform the interactive operation on the airship; for example, referring to fig. 4, the interaction prompt information is presented in the screen of the virtual scene to prompt the user to trigger the interaction instruction for the airship, for example through the button E. While the target virtual object performs the interactive operation on the airship, the terminal can display information such as the interaction progress corresponding to the interactive operation and the number of virtual objects in the team participating in it; when the interaction is completed, an identification message is presented indicating that the ownership of the airship has been acquired. For example, referring to fig. 12, the interaction progress 1201 and the number 1202 of virtual objects participating in the interactive operation are shown in the screen of the virtual scene; when the interaction progress indicates that the interactive operation is completed, the identification information 1203 is presented to indicate that the virtual object team to which the target virtual object belongs obtains the ownership of the airship.
Correspondingly, if virtual objects of another virtual object team (enemies) perform the interactive operation on the airship, the terminal presents the interaction progress corresponding to that team's interactive operation, and presents identification information when the operation is completed to indicate that the other team has acquired the ownership of the airship. For example, fig. 20 is a schematic view of a screen of a virtual scene provided in an embodiment of the present application; referring to fig. 20, the interaction progress 2001 of the enemy is shown in the screen of the virtual scene; when the interaction progress indicates that the interactive operation is complete, identification information 2002 is presented identifying that the enemy has obtained the ownership of the airship.
The following describes the flow of the hover phase. Fig. 21 is a schematic diagram of an implementation flow of the hover phase provided in the embodiment of the present application; referring to fig. 21, the implementation flow of the hover phase includes:
Step 2101: synchronize the airship's spawn information.
Step 2102: obtain the position of the team leader in the virtual object team.
Step 2103: select the angle of the collision cuboid.
The position of the team leader is obtained first; then a collision cuboid is generated at a preset height (for example, 10 cm) above the team leader's head, in the team leader's current facing direction. The cuboid is the path data for the animation played when the airship flies over. Whether the current cuboid collides with anything is then judged; if a collision occurs, the cuboid is rotated by a certain angle and the judgment is repeated. If all attempts fail, the animation is not played, and step 2106 is executed directly.
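The rotate-and-retry loop above can be sketched as follows. This is an illustrative sketch under assumptions: `collides` is a stand-in for the engine's collision test against the cuboid at a given angle, and the 30-degree step is a hypothetical rotation increment, not a value from the patent.

```python
def choose_flyover_angle(facing_deg, collides, step_deg=30):
    """Try the leader's facing direction first, then rotate the collision
    cuboid by step_deg and retry; return a collision-free angle in degrees,
    or None if every candidate collides (the animation is then skipped)."""
    for i in range(360 // step_deg):
        angle = (facing_deg + i * step_deg) % 360
        if not collides(angle):
            return angle
    return None  # all attempts failed: do not play the fly-over animation
```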
Step 2104: judging whether collision occurs, if so, executing a step 2103; otherwise, step 2105 is performed.
Step 2105: and playing the animation.
Step 2106: synchronizing the position and orientation of the airship.
Here, from the beginning of the ring-shrinking phase, the server synchronizes the airship's position and direction every 2 seconds; after playing the animation, the client starts chasing the latest synchronized position and direction, the server always running ahead of the client.
Step 2107: generate 3 circles of different sizes.
Step 2108: circle clockwise.
Step 2109: judging whether the ring shrinkage time is reached, if so, executing step 2110; otherwise, step 2108 is performed.
Step 2110: shrink the ring.
Here, 3 circles from large to small are generated, and the airship circles along the route of the corresponding circle; the circle shrinks to the next size after each third of the waiting time, and the airship starts to land when the waiting time ends.
Step 2111: judging whether the landing time is reached, if yes, executing step 2112; otherwise, step 2113 is performed.
Step 2112: select a landing point within the area of the circle.
Step 2113: control the airship to descend along the curve.
The execution flow of the interaction for the airship is described next. Figure 22 is a schematic diagram of the execution flow of the interaction for the airship according to an embodiment of the present application; referring to figure 22, the flow includes:
step 2201: the virtual object is controlled to enter an associated zone of the airship.
Step 2202: judge whether only virtual objects of a single virtual object team are in the associated area.
Step 2203: control the target virtual object to perform the interactive operation on the airship.
Step 2204: and controlling the interaction progress to change according to the number of virtual objects for executing the interaction operation in the virtual object team.
Here, the correspondence between the number of virtual objects and the speed of change in the interaction progress is set in advance to determine the change in the interaction progress based on the correspondence.
Step 2205: judge whether the interactive operation is executed to completion; if so, execute step 2211; otherwise, execute step 2206.
Step 2206: and judging whether virtual objects of other virtual object teams enter the associated area, if so, executing a step 2207.
Step 2207: the interactive operation is interrupted.
Step 2208: judge whether only virtual objects of a single virtual object team remain in the associated area; if so, execute step 2209.
Step 2209: judge whether the remaining team is the virtual object team to which the target virtual object belongs; if so, execute step 2203; otherwise, execute step 2210.
Step 2210: and clearing the interaction progress.
Step 2211: and determining that the virtual object team to which the target virtual object belongs obtains the ownership of the airship.
By applying this embodiment, the following beneficial effects are achieved:
1) The establishment of a victory target is strengthened;
2) The influence of luck on the way victory is achieved is reduced, and competition among users is enhanced;
3) The competitive experience when multiple teams are present is optimized, and the victory mode is more adaptable.
Continuing with the exemplary structure of the control device 555 of the virtual object provided by the embodiment of the present application implemented as a software module, in some embodiments, as shown in fig. 2, the software module stored in the control device 555 of the virtual object in the memory 550 may include:
the display module 5551 is configured to present a picture of a virtual scene corresponding to an interactive match, and to display, in the picture, a target virtual object and a target virtual vehicle with which at least two virtual objects can interact;
an interaction module 5552, configured to, when the target virtual vehicle is in a waiting state, control, in response to an interaction instruction for the target virtual object, the target virtual object to perform an interactive operation for the target virtual vehicle, so as to obtain the ownership of the target virtual vehicle;
an evacuation module 5553, configured to, when the target virtual object obtains the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to an evacuation state, control the target virtual object to take the target virtual vehicle to evacuate, so as to indicate that the target virtual object obtains the victory of the interactive match.
In some embodiments, the interaction module 5552 is further configured to control the target virtual vehicle to move in the virtual scene; count down the time until the target virtual vehicle reaches a target position while the target virtual vehicle is moving; and, when the countdown reaches zero, control the target virtual vehicle to remain stationary at the target position so that the target virtual vehicle is in the waiting state.
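The countdown-to-waiting transition described above can be sketched as a small state function. This is an illustrative sketch only; the function name and the tuple return shape are assumptions, not part of the embodiment.

```python
# Minimal sketch of the arrival countdown: the vehicle is "moving"
# until the countdown reaches zero, then rests at the target position
# in the "waiting" state. Names are illustrative assumptions.
import math

def vehicle_state(elapsed, arrival_time):
    """Return (state, remaining_seconds) for the target virtual vehicle.

    arrival_time is the moment the vehicle is due at the target position;
    once the countdown hits zero the vehicle stays stationary, waiting.
    """
    remaining = max(0, math.ceil(arrival_time - elapsed))
    state = "waiting" if remaining == 0 else "moving"
    return state, remaining
```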
In some embodiments, the target virtual vehicle is a virtual flying vehicle, and the interaction module 5552 is further configured to, when at least two virtual object teams exist in the virtual scene, control the virtual flying vehicle to fly over the virtual object team to which the target virtual object belongs;
and, after the virtual flying vehicle has flown over each virtual object team, control the virtual flying vehicle to hover in the air area corresponding to the target position.
In some embodiments, the interaction module 5552 is further configured to present a map thumbnail of the virtual scene;
and displaying the position information of the target position in the map thumbnail.
In some embodiments, the interaction module 5552 is further configured to show the relative position of the target virtual object and the target virtual vehicle in real time;
receiving a movement instruction for the target virtual object based on the relative position;
controlling the target virtual object to move towards the target virtual vehicle in response to the movement instruction for the target virtual object.
In some embodiments, the interaction module 5552 is further configured to show, while the target virtual vehicle is in the waiting state, the duration for which the target virtual vehicle has been in the waiting state;
and, when the duration for which the target virtual vehicle has been in the waiting state reaches a duration threshold, switch the state of the target virtual vehicle from the waiting state to an evacuation state.
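The waiting-to-evacuation switch above reduces to a single threshold comparison. A minimal sketch, assuming hypothetical names for the function and states:

```python
# Illustrative sketch: the vehicle leaves the waiting state once the
# waiting duration reaches the duration threshold. Names are assumptions.

def update_vehicle_phase(waiting_elapsed, waiting_threshold):
    """Return the vehicle phase given how long it has waited."""
    return "evacuating" if waiting_elapsed >= waiting_threshold else "waiting"
```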
In some embodiments, the evacuation module 5553 is further configured to, when the state of the target virtual vehicle is switched from the waiting state to an evacuation state and no virtual object has obtained the ownership of the target virtual vehicle, control the target virtual vehicle to evacuate and display a result prompt message of the interactive match, so as to prompt that no virtual object obtains the victory of the interactive match.
In some embodiments, the interaction module 5552 is further configured to, in response to the interaction instruction for the target virtual object, obtain object information of a virtual object in an associated area of the target virtual vehicle;
and control the target virtual object to perform the interactive operation for the target virtual vehicle when it is determined, based on the object information, that the target virtual object is in the associated area and the virtual objects in the associated area belong to the same virtual object team.
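The two conditions above — the target object being inside the associated area, and every object inside that area belonging to one team — can be sketched as follows. This is a hypothetical illustration: the circular area, the tuple layout `(team, x, y)`, and all names are assumptions, not part of the embodiment.

```python
# Illustrative sketch of the associated-area check. A circular area
# around the vehicle is assumed; the embodiment does not fix a shape.
import math

def can_start_interaction(target, objects_in_scene, vehicle_pos, radius):
    """Return True when the target object may begin the interaction:
    it is inside the associated area, and all objects inside the area
    belong to the same virtual object team."""
    def in_area(obj):
        _, x, y = obj
        return math.hypot(x - vehicle_pos[0], y - vehicle_pos[1]) <= radius

    inside = [o for o in objects_in_scene if in_area(o)]
    if target not in inside:
        return False
    return len({team for team, _, _ in inside}) == 1
```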
In some embodiments, the interaction module 5552 is further configured to, during the course of the target virtual object performing the interaction operation on the target virtual vehicle, control the interaction operation to be in an interrupted state when there is a virtual object that belongs to a different team of virtual objects than the target virtual object entering the association area;
controlling the target virtual object to interact with the virtual objects belonging to the different virtual object teams in response to a control instruction for the target virtual object.
In some embodiments, the interaction module 5552 is further configured to control the target virtual vehicle to periodically scan the virtual object in the associated area, and mark the scanned virtual object.
In some embodiments, the interaction module 5552 is further configured to display an interaction score obtained by the target virtual object interacting with other virtual objects;
and, in response to an interaction instruction for the target virtual object, control the target virtual object to perform the interactive operation for the target virtual vehicle when the interaction score reaches a score threshold.
In some embodiments, the interaction module 5552 is further configured to, while the target virtual object performs the interactive operation for the target virtual vehicle, show an interaction progress corresponding to the interactive operation and control the interaction progress to change as the interactive operation proceeds;
and, when the interaction progress indicates that the interactive operation is completed, determine that the target virtual object obtains the ownership of the target virtual vehicle.
In some embodiments, the interaction module 5552 is further configured to, when at least two teams of virtual objects exist in the virtual scene, obtain the number of virtual objects performing the interaction operation in the team of virtual objects to which the target virtual object belongs;
and control the interaction progress to change at a speed matched with the number.
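The pre-set correspondence between teammate count and progress speed can be sketched as below. The linear mapping, the base speed, and the cap are illustrative assumptions; the embodiment only requires that some correspondence be set in advance.

```python
# Illustrative sketch of the count -> speed correspondence.
# The linear form, base_speed, and cap are assumptions.

def progress_speed(num_interacting, base_speed=0.05, cap=4):
    """Map the number of teammates performing the interactive operation
    to a speed of change for the interaction progress."""
    return base_speed * min(max(num_interacting, 0), cap)
```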
In some embodiments, the interaction module 5552 is further configured to, when at least two teams of virtual objects exist in the virtual scene, obtain the location of each virtual object in the team of virtual objects to which the target virtual object belongs;
and reset the interaction progress when it is determined, based on the positions of the virtual objects, that all virtual objects in the virtual object team to which the target virtual object belongs are outside the associated area of the target virtual vehicle.
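The reset condition above — every team member outside the associated area — can be sketched as follows; the circular area and the function name are assumptions for illustration only.

```python
# Illustrative sketch: progress is cleared only when no member of the
# team remains inside the vehicle's associated area (circle assumed).
import math

def maybe_reset_progress(progress, team_positions, vehicle_pos, radius):
    """Return the interaction progress, reset to 0.0 if every member
    of the team is outside the associated area."""
    any_inside = any(
        math.hypot(x - vehicle_pos[0], y - vehicle_pos[1]) <= radius
        for x, y in team_positions
    )
    return progress if any_inside else 0.0
```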
In some embodiments, the interaction module 5552 is further configured to, when there is another virtual object to perform an interaction operation on the target virtual vehicle, present object information of the other virtual object and an interaction progress corresponding to the interaction operation.
In some embodiments, the evacuation module 5553 is further configured to, when there are at least two teams of virtual objects in the virtual scene, obtain a life value of each virtual object in the team of virtual objects to which the target virtual object belongs;
when the target virtual object obtains the ownership of the target virtual vehicle and the state of the target virtual vehicle is switched from the waiting state to an evacuation state, if the life value of at least one virtual object in the virtual object team to which the target virtual object belongs is higher than a life value threshold, control the virtual object team to which the target virtual object belongs to take the target virtual vehicle to evacuate.
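The life-value condition above is a simple existence check over the team. A minimal sketch, assuming a hypothetical function name and a default threshold of zero (the embodiment does not fix the threshold):

```python
# Illustrative sketch: the team may board the vehicle and evacuate if
# at least one member's life value exceeds the threshold.

def can_evacuate(team_life_values, life_threshold=0):
    """Return True when at least one team member's life value is
    above the life value threshold."""
    return any(v > life_threshold for v in team_life_values)
```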
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the control method of the virtual object according to the embodiment of the present application.
Embodiments of the present application provide a computer-readable storage medium having stored therein executable instructions, which when executed by a processor, will cause the processor to perform a method provided by embodiments of the present application, for example, the method as illustrated in fig. 3.
In some embodiments, the computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, an optical disc, or CD-ROM; or may be any device including one of, or any combination of, the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (19)

1. A method for controlling a virtual object, comprising:
presenting a picture of a virtual scene corresponding to the interactive game, and displaying a target virtual object and a target virtual carrier for at least two virtual objects to interact in the picture;
when the target virtual vehicle is in a waiting state, displaying an associated area of the target virtual vehicle, and when the target virtual object is in the associated area and the virtual objects in the associated area belong to the same virtual object team, responding to an interaction instruction aiming at the target virtual object, and controlling the target virtual object to execute an interaction operation aiming at the target virtual vehicle so as to acquire the ownership of the target virtual vehicle;
when the target virtual object obtains the ownership of the target virtual vehicle and the duration of the target virtual vehicle in the waiting state reaches a duration threshold, switching the state of the target virtual vehicle from the waiting state to an evacuation state; and when the life value of at least one virtual object in the virtual object team to which the target virtual object belongs is higher than a life value threshold, controlling the target virtual object to take the target virtual vehicle to evacuate, indicating that the target virtual object obtains the victory of the interactive game.
2. The method of claim 1, wherein prior to said controlling said target virtual object to perform an interactive operation with said target virtual vehicle, further comprising:
controlling the target virtual vehicle to move in the virtual scene;
counting down the time when the target virtual vehicle reaches the target position in the moving process of the target virtual vehicle;
and when the countdown time is zero, controlling the target virtual vehicle to be stationary at the target position so as to enable the target virtual vehicle to be in a waiting state.
3. The method of claim 2, wherein said target virtual vehicle is a virtual flying vehicle, said controlling said target virtual vehicle to move in said virtual scene comprising:
when at least two virtual object teams exist in the virtual scene, controlling the virtual flying vehicle to fly over the virtual object team to which the target virtual object belongs;
and after the virtual flying vehicle skims over each virtual object team, controlling the virtual flying vehicle to hover in an air area corresponding to the target position.
4. The method of claim 2, wherein before said controlling said target virtual vehicle to be stationary at said target position, the method further comprises:
displaying a map thumbnail of the virtual scene;
and displaying the position information of the target position in the map thumbnail.
5. The method of claim 1, wherein prior to the controlling the target virtual object to perform the interactive operation with the target virtual vehicle, further comprising:
displaying the relative position of the target virtual object and the target virtual vehicle in real time;
receiving a movement instruction for the target virtual object based on the relative position;
controlling the target virtual object to move towards the target virtual vehicle in response to the movement instruction for the target virtual object.
6. The method of claim 1, wherein the method further comprises:
and when the target virtual vehicle is in a waiting state, displaying the duration of the target virtual vehicle in the waiting state.
7. The method of claim 1, wherein the method further comprises:
and when the state of the target virtual vehicle is switched from the waiting state to an evacuation state and no virtual object exists to acquire the ownership of the target virtual vehicle, controlling the target virtual vehicle to evacuate, and displaying result prompt information of the interactive game so as to prompt that no virtual object exists to acquire the victory of the interactive game.
8. The method of claim 1, wherein said controlling the target virtual object to perform an interactive operation for the target virtual vehicle in response to the interaction instruction for the target virtual object comprises:
responding to an interaction instruction aiming at the target virtual object, acquiring object information of the virtual objects in the associated area of the target virtual vehicle;
and controlling the target virtual object to execute the interactive operation aiming at the target virtual vehicle when it is determined, based on the object information, that the target virtual object is in the associated area and the virtual objects in the associated area belong to the same virtual object team.
9. The method of claim 8, wherein the method further comprises:
in the process that the target virtual object performs interactive operation aiming at the target virtual vehicle, when a virtual object belonging to a different virtual object team with the target virtual object enters the association area, controlling the interactive operation to be in an interruption state;
controlling the target virtual object to interact with the virtual objects belonging to the different virtual object teams in response to a control instruction for the target virtual object.
10. The method of claim 8, wherein the method further comprises:
and controlling the target virtual vehicle to periodically scan the virtual objects in the associated area, and marking the scanned virtual objects.
11. The method of claim 1, wherein the method further comprises:
displaying an interaction score obtained by the interaction of the target virtual object and other virtual objects;
the controlling the target virtual object to perform an interactive operation for the target virtual vehicle in response to the interactive instruction for the target virtual object includes:
and responding to an interaction instruction aiming at the target virtual object, controlling the target virtual object to execute the interactive operation aiming at the target virtual vehicle when the interaction score reaches an interaction score threshold.
12. The method of claim 1, wherein the method further comprises:
displaying an interaction progress corresponding to the interactive operation in the process that the target virtual object executes the interactive operation aiming at the target virtual vehicle, and controlling the interaction progress to change as the interactive operation proceeds;
and when the interaction progress represents that the interactive operation is completed, determining that the target virtual object obtains the ownership of the target virtual vehicle.
13. The method of claim 12, wherein the controlling the interaction progress as the interaction proceeds comprises:
when at least two virtual object teams exist in the virtual scene, acquiring the number of virtual objects for executing the interactive operation in the virtual object team to which the target virtual object belongs;
and controlling the interaction progress to change at a speed matched with the number.
14. The method of claim 12, wherein prior to the determining that the target virtual object obtains ownership of the target virtual vehicle, further comprising:
when at least two virtual object teams exist in the virtual scene, acquiring the position of each virtual object in a virtual object team to which the target virtual object belongs;
and resetting the interaction progress when it is determined, based on the positions of the virtual objects, that all virtual objects in the virtual object team to which the target virtual object belongs are outside the associated area of the target virtual vehicle.
15. The method of claim 1, wherein the method further comprises:
and when another virtual object executes the interactive operation aiming at the target virtual vehicle, presenting the object information of the other virtual object and the interaction progress corresponding to the interactive operation.
16. The method of claim 1, wherein before controlling the target virtual object to evacuate with the target virtual vehicle when a life value of at least one virtual object in a team of virtual objects to which the target virtual object belongs is above a life value threshold, the method further comprises:
and when at least two virtual object teams exist in the virtual scene, acquiring the life value of each virtual object in the virtual object team to which the target virtual object belongs.
17. An apparatus for controlling a virtual object, comprising:
the display module is used for presenting a picture of a virtual scene corresponding to the interactive game and displaying, in the picture, a target virtual object and a target virtual vehicle for at least two virtual objects to interact with;
the interaction module is used for displaying an associated area of the target virtual vehicle when the target virtual vehicle is in a waiting state, and controlling, in response to an interaction instruction aiming at the target virtual object, the target virtual object to execute an interactive operation aiming at the target virtual vehicle when the target virtual object is in the associated area and the virtual objects in the associated area belong to the same virtual object team, so as to acquire the ownership of the target virtual vehicle;
the evacuation module is used for switching the state of the target virtual vehicle from the waiting state to an evacuation state when the target virtual object obtains the ownership of the target virtual vehicle and the duration of the target virtual vehicle in the waiting state reaches a duration threshold; and controlling the target virtual object to take the target virtual vehicle to evacuate when the life value of at least one virtual object in the virtual object team to which the target virtual object belongs is higher than a life value threshold, indicating that the target virtual object obtains the victory of the interactive game.
18. A computer device, comprising:
a memory for storing executable instructions;
a processor for implementing the method of controlling a virtual object of any one of claims 1 to 16 when executing executable instructions stored in the memory.
19. A computer-readable storage medium storing executable instructions for implementing the method of controlling a virtual object according to any one of claims 1 to 16 when executed by a processor.
CN202110604843.2A 2021-05-31 2021-05-31 Virtual object control method, device, equipment and computer readable storage medium Active CN113274724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110604843.2A CN113274724B (en) 2021-05-31 2021-05-31 Virtual object control method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113274724A (en) 2021-08-20
CN113274724B (en) 2023-04-07

Family

ID=77282863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110604843.2A Active CN113274724B (en) 2021-05-31 2021-05-31 Virtual object control method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113274724B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113730913A (en) * 2021-09-03 2021-12-03 网易(杭州)网络有限公司 Data processing method and device in game and electronic terminal



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40050643

Country of ref document: HK

GR01 Patent grant