CN110876849A - Virtual vehicle control method, device, equipment and storage medium

Info

Publication number
CN110876849A
Authority
CN
China
Prior art keywords
virtual
control
character
vehicle
user interface
Legal status
Granted
Application number
CN201911113863.9A
Other languages
Chinese (zh)
Other versions
CN110876849B (en)
Inventor
刘智洪 (Liu Zhihong)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201911113863.9A
Publication of CN110876849A
Application granted
Publication of CN110876849B
Status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces


Abstract

The application discloses a control method, device, equipment and storage medium for a virtual vehicle, applied to the field of computers. The method comprises the following steps: displaying a first user interface, wherein the first user interface comprises a first virtual world picture and a position change control, the first virtual world picture displays a master virtual character located at the driving position of a virtual vehicle, and the virtual vehicle further comprises a passenger position used for controlling an onboard virtual prop; when a position change operation on the position change control is received, controlling the master virtual character to move from the driving position to the passenger position; displaying a second user interface, wherein the second user interface comprises a second virtual world picture and a firing control, and the second virtual world picture displays the master virtual character located at the passenger position; and after a firing operation on the firing control is received, controlling the master virtual character to use the onboard virtual prop to launch a virtual component. The method reduces the difficulty of using a virtual vehicle to lower the life values of other virtual characters.

Description

Virtual vehicle control method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computers, and in particular, to a method, an apparatus, a device and a storage medium for controlling a virtual vehicle.
Background
In an application program based on a three-dimensional virtual world, such as a first-person shooter game, a user can operate a first virtual character in the virtual world to drive a virtual vehicle, for example a car, a motorcycle, or a boat.
In the related art, the first virtual character may drive the virtual vehicle into other virtual characters to reduce their life values or bring them to zero. For example, when a first virtual character drives a car into a second virtual character, the terminal obtains the travel speed and the collision position at the moment of impact, and computes the life value the second virtual character should lose from that speed and position.
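To make this related-art damage model concrete, here is a minimal sketch; the patent does not specify the formula, so the location-weight table and per-speed constant below are invented for illustration only.

```python
# Illustrative only: one plausible shape for the related-art damage rule;
# the weights and constant are assumptions, not values from the patent.
HIT_LOCATION_WEIGHT = {"head": 2.0, "torso": 1.0, "limbs": 0.6}
BASE_DAMAGE_PER_SPEED = 1.5

def impact_damage(travel_speed: float, hit_location: str) -> float:
    """Life value the struck character loses, from travel speed and collision position."""
    return travel_speed * BASE_DAMAGE_PER_SPEED * HIT_LOCATION_WEIGHT.get(hit_location, 1.0)

# A car hitting a character's torso at speed 20 costs 30 life points here.
assert impact_damage(20.0, "torso") == 30.0
```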
A virtual vehicle in the related art can reduce other virtual characters' life values only by ramming them, and steering a large vehicle so that it accurately hits a much smaller virtual character is hard, which makes the actual operation very difficult for the user.
Disclosure of Invention
The embodiments of the application provide a control method, device, equipment and storage medium for a virtual vehicle, solving the problem in the related art that a virtual vehicle can reduce other virtual characters' life values only by ramming, so that accurately hitting a smaller virtual character with a larger vehicle is difficult and the actual operation is hard for the user. The technical scheme is as follows:
according to an aspect of the present application, there is provided a method for controlling a virtual vehicle, the method including:
displaying a first user interface, wherein the first user interface comprises a first virtual world picture and a position change control, the first virtual world picture displays a master virtual character located at the driving position of a virtual vehicle, and the virtual vehicle further comprises a passenger position for controlling an onboard virtual prop;
when a position change operation on the position change control is received, controlling the master virtual character to move from the driving position to the passenger position;
displaying a second user interface, wherein the second user interface comprises a second virtual world picture and a firing control, and the second virtual world picture displays the master virtual character located at the passenger position;
and after a firing operation on the firing control is received, controlling the master virtual character to use the onboard virtual prop on the virtual vehicle to launch a virtual component.
According to another aspect of the present application, there is provided a control apparatus of a virtual vehicle, the apparatus including:
the apparatus comprises a display module, an interaction module, and a control module, wherein the display module is configured to display a first user interface, the first user interface comprising a first virtual world picture and a position change control, the first virtual world picture displaying a master virtual character located at the driving position of the virtual vehicle, and the virtual vehicle further comprising a passenger position for controlling an onboard virtual prop;
the interaction module is configured to receive a position change operation on the position change control;
the control module is configured to control the master virtual character to move from the driving position to the passenger position after the position change operation on the position change control is received;
the display module is further configured to display a second user interface, the second user interface comprising a second virtual world picture and a firing control, the second virtual world picture displaying the master virtual character located at the passenger position;
the interaction module is further configured to receive a firing operation on the firing control;
the control module is further configured to control the master virtual character to use the onboard virtual prop on the virtual vehicle to launch a virtual component after the firing operation on the firing control is received.
According to another aspect of the present application, there is provided a computer device comprising: a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by the processor to implement the method of controlling a virtual vehicle as described above.
According to another aspect of the present application, there is provided a computer-readable storage medium having at least one instruction, at least one program, a set of codes, or a set of instructions stored therein, which is loaded and executed by a processor to implement the method of controlling a virtual vehicle as described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
By adding an onboard virtual prop at the passenger position of the virtual vehicle, a virtual character at the passenger position can control the onboard virtual prop to launch virtual components that damage other virtual characters, while a virtual character at the driving position can control the virtual vehicle to move in the virtual world. Combining the virtual vehicle with the onboard virtual prop gives the vehicle both movement and attack functions, so that a virtual character can lower other virtual characters' life values with the onboard virtual prop, which is simple to use and convenient to operate.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described here are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic user interface diagram of a control method for a virtual vehicle according to an exemplary embodiment of the present application;
fig. 2 is a schematic user interface diagram of a control method for a virtual vehicle according to an exemplary embodiment of the present application;
fig. 3 is a schematic user interface diagram of a control method of a virtual vehicle according to an exemplary embodiment of the present application;
fig. 4 is a schematic user interface diagram of a control method of a virtual vehicle according to an exemplary embodiment of the present application;
FIG. 5 is a block diagram of an implementation environment provided by an exemplary embodiment of the present application;
fig. 6 is a flowchart of a method for controlling a virtual vehicle according to an exemplary embodiment of the present application;
fig. 7 is a flowchart of a method for controlling a virtual vehicle according to another exemplary embodiment of the present application;
fig. 8 is a schematic virtual world diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 9 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 10 is a virtual world schematic diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 11 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
FIG. 12 is a schematic view of a camera model corresponding to a perspective provided by another exemplary embodiment of the present application;
fig. 13 is a virtual world schematic diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 14 is a virtual world schematic diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 15 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 16 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 17 is a schematic user interface diagram of a control method for a virtual vehicle according to another exemplary embodiment of the present application;
fig. 18 is a flowchart of a method for controlling a virtual vehicle according to an exemplary embodiment of the present application;
fig. 19 is a flowchart of a method for controlling a virtual vehicle according to an exemplary embodiment of the present application;
fig. 20 is a block diagram of a control apparatus of a virtual vehicle according to an exemplary embodiment of the present application;
fig. 21 is a schematic structural diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are described:
Virtual world: a virtual world that is displayed (or provided) when an application program runs on a terminal. The virtual world may be a simulation of the real world, a semi-simulated semi-fictional environment, or a purely fictional environment. It may be any of a two-dimensional virtual world, a 2.5-dimensional virtual world, and a three-dimensional virtual world, which is not limited in the embodiments of this application. The following embodiments take the case where the virtual world is a three-dimensional virtual world as an example.
Virtual character: a movable object in the virtual world. The movable object may be a virtual person, a virtual animal, an anime character, and so on, such as the characters, animals, plants, oil drums, walls, and stones displayed in the three-dimensional virtual world. Optionally, the virtual character is a three-dimensional model created based on skeletal animation technology. Each virtual character has its own shape and volume in the three-dimensional virtual world and occupies part of its space. Illustratively, a virtual character has a life value; when the life value reaches zero, the virtual character can no longer be active in the virtual world. The life value is the criterion for judging whether the virtual character can remain active in the virtual world, and may also be called a signal value, a red bar, and the like.
Onboard virtual prop: a virtual prop fixed on a virtual vehicle. For example, a virtual character can use the onboard virtual prop only while on the virtual vehicle. The onboard virtual prop comprises at least one of a virtual weapon, a functional prop, and virtual equipment. Illustratively, in this application the onboard virtual prop refers to an onboard virtual weapon that is fixed on a virtual vehicle, and the virtual character can use the weapon only on that vehicle. Virtual weapons include machine guns, pistols, rifles, sniper rifles, crossbows, bows, and other common weapons.
Virtual component: a consumable part used by an onboard virtual prop. The number of virtual components decreases as the onboard virtual prop is used. Schematically, in this application a virtual component refers to virtual ammunition, which includes bullets, shells, missiles, crossbow bolts, electrical energy, fuel, and the like. Illustratively, when a virtual component hits a virtual character, that character's life value is reduced.
First-person shooter (FPS) game: a shooting game that the user plays from a first-person perspective; the virtual world picture in the game is a picture of the virtual world observed from the perspective of the master virtual character. In the game, at least two virtual characters fight a single round in the virtual world. A virtual character survives by avoiding attacks launched by other virtual characters and dangers existing in the virtual world (such as the poison circle or swamps); when a virtual character's life value reaches zero, its life in the virtual world ends, and the virtual character that survives to the end is the winner. Optionally, each client may control one or more virtual characters in the virtual world, with the moment the first client joins the battle as the start time and the moment the last client exits as the end time. Optionally, the competitive mode of the battle may be a solo mode, a duo mode, or a multi-player squad mode, which is not limited in the embodiments of this application.
UI (User Interface) control: any visual control or element that can be seen on the user interface of an application, such as a picture, input box, text box, button, or tab. Some UI controls respond to user operations; for example, the user triggers the firing control corresponding to the onboard virtual prop to control it to launch a virtual component. Illustratively, UI controls also include controls that are not visible on the user interface but can respond to user operations; for example, there may be a position on the user interface that, when clicked, controls the onboard virtual prop to launch a virtual component. The UI controls referred to in the embodiments of this application include but are not limited to: the firing control, driving control, position change control, and remaining-ammunition prompt control.
The method provided in the present application can be applied to an application program supporting a virtual world. Illustratively, an application that supports a virtual world is one in which a user can control the movement of a virtual character within the virtual world. By way of example, the method provided herein may be applied to any one of a virtual reality application, an augmented reality (AR) program, a three-dimensional map program, a military simulation program, a virtual reality game, an augmented reality game, a first-person shooter (FPS) game, a third-person shooter (TPS) game, and a multiplayer online battle arena (MOBA) game. The following embodiments take an application in a game as an example.
A game based on a virtual world often consists of one or more maps of the game world. The virtual world in the game simulates scenes of the real world, and a user can control a virtual character in the game to walk, run, jump, shoot, fight, drive, use a virtual weapon to attack other virtual characters, and so on in the virtual world. The interactivity is strong, and multiple users can form teams online for competitive play.
In some embodiments, the application may be a shooting game, a racing game, a role-playing game, an adventure game, a sandbox game, a tactical competition game, a military simulation program, or the like. The client can support at least one of the Windows operating system, an Apple operating system, the Android operating system, iOS, and Linux, and clients on different operating systems can interconnect and communicate. In some embodiments, the client is a program adapted to a mobile terminal with a touch screen.
In some embodiments, the client is an application developed based on a three-dimensional engine, for example the Unity engine.
Fig. 1 shows a schematic diagram of a user interface in the related art, with a master virtual character 201 and a virtual vehicle 202 on the user interface 200. When the master virtual character approaches the virtual vehicle 202, it can enter the virtual vehicle 202. After entering the virtual vehicle, the master virtual character can shoot using a virtual prop it carries. As shown in fig. 2, after the master virtual character 201 enters the virtual vehicle 202, it can lean out of the virtual vehicle 202 to shoot.
As can be seen from fig. 2, when the master virtual character 201 shoots from inside the virtual vehicle 202 in the related art, the virtual vehicle 202 blocks most of the character's view. Affected by the vehicle, the view can only rotate 180° outside the window, so the field of view is greatly limited.
The application provides a control method for a virtual vehicle, and fig. 3 shows a schematic diagram of its user interface. On the user interface 300 there are a master virtual character 201 and a virtual vehicle 202, on which an onboard virtual prop 203 is fixed. After the master virtual character 201 enters the virtual vehicle 202, it can shoot using the onboard virtual prop 203, as shown in fig. 4.
As can be seen from fig. 4, when the master virtual character 201 shoots using the onboard virtual prop 203 in the virtual vehicle 202, the vehicle is below the character, so the character's view is not blocked; the character can shoot through 360° without blind angles, which greatly improves the shooting capability of the master virtual character 201 in the virtual vehicle 202.
Fig. 5 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 has installed and running an application that supports the virtual world. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first terminal 120 is the terminal used by the first user, who uses it to control a master virtual character located in the virtual world to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, shooting, throwing, and attacking other virtual characters with virtual weapons. Illustratively, the master virtual character is a first virtual character, such as a simulated character or an anime character.
The first terminal 120 is connected to the server 140 through a wireless network or a wired network.
The server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Illustratively, the server 140 includes a processor 144 and a memory 142, the memory 142 in turn including a display module 1421, a control module 1422, and a receiving module 1423. The server 140 is used to provide background services for applications that support virtual worlds. Alternatively, the server 140 undertakes primary computational work and the first and second terminals 120, 160 undertake secondary computational work; alternatively, the server 140 undertakes the secondary computing work and the first terminal 120 and the second terminal 160 undertakes the primary computing work; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 has installed and running an application that supports the virtual world. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second terminal 160 is the terminal used by the second user, who uses it to control a second virtual character located in the virtual world to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, shooting, throwing, and attacking other virtual characters with virtual weapons. Illustratively, the second virtual character is, for example, a simulated character or an anime character.
Optionally, the first virtual character and the second virtual character are in the same virtual world. Alternatively, the first avatar and the second avatar may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
Optionally, the applications installed on the first terminal 120 and the second terminal 160 are the same, or are the same type of application on different operating system platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to another of the plurality of terminals; this embodiment is illustrated with only the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different and include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer. The following embodiments are illustrated with a terminal that is a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 6 is a flowchart illustrating a method for controlling a virtual vehicle according to an exemplary embodiment of the present application, where the method may be applied to the first terminal 120 or the second terminal 160 in the computer system shown in fig. 5 or other terminals in the computer system. The method comprises the following steps:
and 104, displaying a first user interface, wherein the first user interface comprises a first virtual world picture and a position replacement control, the first virtual world picture displays a main control virtual character positioned at the driving position of the virtual carrier, and the virtual carrier further comprises a passenger position for controlling the airborne virtual prop.
The terminal displays a first user interface.
The user interface is an interface of an application displayed on the terminal. Illustratively, the user interface includes a virtual world screen and a UI control located on the virtual world screen. The virtual world pictures are pictures acquired by observing the virtual world from the perspective of the virtual character.
Illustratively, the first user interface is an application interface displayed on the terminal when the master virtual character is located at the driving position of the virtual vehicle. Illustratively, the first user interface includes a first virtual world screen and a position change control positioned on the first virtual world screen.
The first virtual world screen is a screen of a virtual world that is displayed when the master virtual character is located at the driving position.
Illustratively, there is a master virtual character and a virtual vehicle in the virtual world.
The master virtual character is a virtual character controlled by the terminal. Illustratively, the master virtual character may be active within the virtual world under the control of the terminal. Illustratively, when the virtual world picture is a picture obtained by viewing the virtual world from a first-person perspective, the master virtual character is a virtual character located in the middle of the virtual world picture.
The virtual vehicle is a means of transport for virtual characters in the virtual world. For example, a virtual character can control a virtual vehicle to move in the virtual world. Illustratively, a virtual character controlling a virtual vehicle moves faster in the virtual world than a virtual character on foot. Illustratively, the virtual vehicle includes at least one of a car, a boat, a warship, a helicopter, and an airplane. In one example, the virtual vehicle is a vehicle with a cockpit, and the cockpit provides seating positions for virtual characters.
Illustratively, the seating positions in the virtual vehicle include a driving position and a passenger position.
The driving position is a riding position used to control the movement of the virtual vehicle in the virtual world. For example, when the virtual character is located at the driving position, a UI control used for driving the virtual carrier is displayed on the user interface, and a UI control used for attacking is not displayed. That is, when the virtual character is located at the driving position of the virtual vehicle, the virtual character cannot perform the shooting action. For example, there is only one driving position in one virtual vehicle. Illustratively, the driving position of the virtual vehicle is provided with a model of an operation tool for controlling the virtual vehicle, and the operation tool comprises at least one of a steering wheel, an accelerator, a brake, an automatic driving control panel and a rudder.
The passenger position is a seating position used for controlling the onboard virtual prop fixed on the virtual vehicle. Illustratively, when the virtual character is at the passenger position, UI controls for controlling the onboard virtual prop are displayed on the user interface, and no UI controls for driving the virtual vehicle are displayed. That is, when the virtual character is at the passenger position of the virtual vehicle, it cannot drive the virtual vehicle through the virtual world. Illustratively, a virtual vehicle includes at least one passenger position. Illustratively, a model of the onboard virtual prop is also provided in front of the passenger position of the virtual vehicle.
The position replacement control is a UI control used to replace the seating position of the master virtual character in the virtual vehicle. For example, the position change control may be responsive to a user action. For example, when the user triggers the location change control, the terminal controls the master avatar to change the seating location. Illustratively, the location change control is a UI control that is displayed on the user interface after the master virtual character enters the virtual vehicle.
The onboard virtual prop is a virtual prop fixed on the virtual vehicle. Illustratively, the onboard virtual prop is not detachable from the virtual vehicle (though the possibility of detachment is not excluded). For example, only a virtual character that has entered the virtual vehicle and is at the passenger position can use the onboard virtual prop. Illustratively, the onboard virtual prop is a virtual weapon, and may include a submachine gun, a special gun, a machine gun, artillery, a guided missile, a laser weapon, a microwave weapon, and the like. The onboard virtual prop may also originate as a virtual prop carried by the virtual character, such as one held in hand or stored in the character's backpack. For example, when the virtual character mounts a carried virtual prop on the virtual vehicle so that it becomes an onboard virtual prop, the prop's attack attributes are enhanced; for example, its range and killing power increase.
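The seat layout this step describes can be pictured as a small data model. The sketch below is illustrative only: the Seat, OnboardProp, and VirtualVehicle names and fields, and the single machine-gun passenger seat, are assumptions rather than structures defined by the application.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class OnboardProp:
    name: str            # e.g. "machine gun" (assumed example)
    max_ammo: int        # maximum number of virtual components loaded

@dataclass
class Seat:
    kind: str                          # "driving" or "passenger"
    prop: Optional[OnboardProp] = None # a passenger seat may control a prop
    occupant: Optional[str] = None     # character currently seated, if any

@dataclass
class VirtualVehicle:
    seats: list[Seat] = field(default_factory=list)

# One driving position plus one passenger position with a fixed weapon,
# matching the configuration described in step 104.
vehicle = VirtualVehicle(seats=[
    Seat(kind="driving"),
    Seat(kind="passenger", prop=OnboardProp("machine gun", max_ammo=100)),
])
```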
Step 105: after a position change operation on the position change control is received, control the master virtual character to move from the driving position to the passenger position.
After receiving the position change operation on the position change control, the terminal controls the master virtual character to move from the driving position to the passenger position.
The position change operation is a trigger operation performed by the user on the position change control. Illustratively, the trigger operation includes at least one of a click, a double click, a slide, a press, and a long press. The trigger operation may also be a sensing operation on a terminal equipped with at least one of a gravity sensor, a gyroscope sensor, and an acceleration sensor, for example tilting, flipping, rotating, or shaking the terminal.
Step 107: display a second user interface, wherein the second user interface comprises a second virtual world picture and a firing control, and the second virtual world picture displays the master virtual character located at the passenger position.
The terminal displays a second user interface.
Illustratively, the second user interface is the application interface displayed on the terminal when the master virtual character is located at the passenger position of the virtual vehicle. Illustratively, the second user interface includes a second virtual world picture and a firing control located on the second virtual world picture.
The second virtual world picture is a picture of the virtual world collected using the perspective of the master virtual character when the master virtual character is located at the passenger position.
The firing control is a UI control used to control the onboard virtual prop. Illustratively, the firing control is used to control the onboard virtual prop to launch virtual components. Illustratively, the second user interface includes at least one firing control.
Step 108: after a firing operation on the firing control is received, control the master virtual character to use the onboard virtual prop on the virtual vehicle to launch a virtual component.
After receiving the firing operation on the firing control, the terminal controls the master virtual character to use the onboard virtual prop on the virtual vehicle to launch a virtual component.
The firing operation is a triggering operation made by a user on the firing control.
Illustratively, after receiving the firing operation on the firing control, the terminal controls the master virtual character to use the onboard virtual prop on the virtual vehicle to launch the virtual component along the current aiming direction of the onboard virtual prop.
Illustratively, the onboard virtual prop is fixed on the virtual vehicle. For example, it may be located at the front, top, or rear of a vehicle; at the nose, belly, tail, or wings of an airplane; or at the upper part, lower part, left side, or right side of a warship.
For example, when the virtual component hits another virtual character, that character's life value is lowered. When the virtual component does not hit a virtual character, it can collide with other virtual models in the virtual world. For example, when a virtual component hits another virtual model, an impact mark, such as a bullet hole, is left on the model's surface.
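A hedged sketch of this firing flow: consume one virtual component, trace it along the aiming direction, and lower the life value of a hit character (the impact-mark case for misses is omitted). The ray test here is a deliberately crude stand-in that treats characters as spheres of an invented radius; every name and number below is an assumption, not the application's implementation.

```python
from dataclasses import dataclass

@dataclass
class Character:
    position: tuple[float, float, float]
    life: int = 100

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def raycast_character(origin, direction, characters, hit_radius=1.0):
    """Crude stand-in for an engine raycast: return the first character whose
    center passes within hit_radius of the (unit-direction) ray, else None."""
    for ch in characters:
        offset = tuple(p - o for p, o in zip(ch.position, origin))
        t = _dot(offset, direction)                 # distance along the ray
        if t < 0:
            continue                                # target is behind the muzzle
        closest = tuple(o + t * d for o, d in zip(origin, direction))
        if sum((p - c) ** 2 for p, c in zip(ch.position, closest)) <= hit_radius ** 2:
            return ch
    return None

def fire(prop, origin, aim_direction, characters):
    """One firing operation: consume one virtual component and damage the hit
    character (a miss would instead leave an impact mark, omitted here)."""
    if prop["ammo"] <= 0:
        return
    prop["ammo"] -= 1
    target = raycast_character(origin, aim_direction, characters)
    if target is not None:
        target.life -= prop["damage"]               # lower the hit character's life value

enemy = Character(position=(0.0, 0.0, 10.0))
gun = {"ammo": 100, "damage": 25}                   # invented numbers
fire(gun, origin=(0.0, 0.0, 0.0), aim_direction=(0.0, 0.0, 1.0), characters=[enemy])
assert enemy.life == 75 and gun["ammo"] == 99
```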
In summary, in the method provided by this embodiment, an onboard virtual prop is added at the passenger position of the virtual vehicle; when the virtual character is at the passenger position, it can control the onboard virtual prop to launch virtual components that damage other virtual characters, and when it is at the driving position, it can control the virtual vehicle to move in the virtual world. Combining the virtual vehicle with the onboard virtual prop gives the vehicle both movement and attack functions, so that a virtual character can lower other virtual characters' life values with the onboard virtual prop, which is simple to use and convenient to operate.
The application further provides an exemplary embodiment of a control method of the virtual vehicle. Fig. 7 is a flowchart illustrating a method for controlling a virtual vehicle according to another exemplary embodiment of the present application. The method may be applied in the first terminal 120 or the second terminal 160 in a computer system as shown in fig. 5 or in other terminals in the computer system. The method comprises the following steps:
Step 101: when collision information related to the master virtual character is acquired on the crash box, determine that the master virtual character is close to the virtual vehicle.
When collision information related to the master virtual character is acquired on the crash box, the terminal determines that the master virtual character is close to the virtual vehicle.
Illustratively, the terminal controls the master virtual character to move in the virtual world; when the master virtual character comes close to the virtual vehicle, the character's model collides with a crash box on the virtual vehicle, and the crash box acquires collision information. The collision information includes at least one of the collision point, the model that collided with the crash box, the plane where the collision point lies, and the material of that plane. When the terminal acquires collision information from a collision between the master virtual character and the crash box, it determines that the master virtual character is very close to the virtual vehicle.
A crash box is one means of collision detection. The crash box may be a regular or irregular model, and it has a volume. Collision information is detected at the surface of the crash box: when an object in the virtual world contacts the crash box, the crash box senses the collision and acquires the collision information.
Illustratively, the virtual vehicle includes at least one crash box. Illustratively, the crash box fits the model of the virtual vehicle as closely as possible, so that collision information between the master virtual character and the virtual vehicle is acquired more realistically and accurately.
For example, when the virtual vehicle has a plurality of crash boxes, taking the virtual vehicle as a vehicle as an example, the crash boxes may be arranged in the following manner: the vehicle body is provided with a collision box close to the vehicle body, and the four wheels are respectively provided with four collision boxes close to the wheels.
Illustratively, as shown in fig. 8, a crash box 301 covers the outside of the virtual vehicle 202. When the model of the master virtual character 201 collides with the crash box 301, the terminal acquires the collision information generated between the master virtual character 201 and the crash box 301 and determines that the master virtual character is close to the virtual vehicle 202.
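Step 101's proximity test can be sketched as follows, simplifying the character's model to a single point tested against an axis-aligned box. The box extents and the point-containment test are assumptions for illustration; a real engine would test model overlap and also report the collision point, plane, and material.

```python
from dataclasses import dataclass

@dataclass
class CrashBox:
    """Axis-aligned crash box; the application also allows irregular shapes."""
    min_corner: tuple[float, float, float]
    max_corner: tuple[float, float, float]

    def contains(self, point) -> bool:
        return all(lo <= p <= hi
                   for lo, p, hi in zip(self.min_corner, point, self.max_corner))

# Invented extents roughly wrapping a car body.
vehicle_box = CrashBox(min_corner=(-2.5, 0.0, -5.0), max_corner=(2.5, 2.0, 5.0))

def is_near_vehicle(character_position) -> bool:
    """When the (simplified, single-point) character model touches the crash
    box, collision information is produced and the character counts as close."""
    return vehicle_box.contains(character_position)

assert is_near_vehicle((1.0, 1.0, 0.0)) and not is_near_vehicle((10.0, 0.0, 0.0))
```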
Step 102: display a position selection control when the master virtual character is close to the virtual vehicle.
When the master virtual character is close to the virtual vehicle, the terminal displays a position selection control.
The position selection control is a UI control for receiving a position selection operation from the user. Illustratively, the position selection operation includes a passenger position selection operation corresponding to the passenger position and a driving position selection operation corresponding to the driving position; for example, the passenger position selection operation is a double click on the position selection control and the driving position selection operation is a single click on it. The terminal controls the master virtual character to enter the passenger position or the driving position of the virtual vehicle according to the position selection operation. Illustratively, the position selection control may also comprise a passenger position control corresponding to the passenger position and a driving position control corresponding to the driving position: when the passenger position control receives a trigger operation, the terminal controls the master virtual character to enter the passenger position, and when the driving position control receives a trigger operation, the terminal controls the master virtual character to enter the driving position.
For example, as shown in fig. 9, after the master virtual character 201 approaches the virtual vehicle 202, a position selection control 302 is displayed on the user interface; illustratively, it includes a passenger position control 303 and a driving position control 304.
Step 103: when a position selection operation on the position selection control is received, control the master virtual character to enter the virtual vehicle according to the position selection operation.
When receiving the position selection operation on the position selection control, the terminal controls the master virtual character to enter the virtual vehicle according to the position selection operation.
Illustratively, the terminal controls the master virtual character to enter different riding positions in the virtual vehicle according to the position selection operation.
For example, as shown in fig. 9, when the terminal receives a position selection operation on the passenger position control 303, the terminal controls the master virtual character to enter the passenger position of the virtual vehicle; when the terminal receives the position selection operation on the driving position control 304, the terminal controls the master virtual character to enter the driving position of the virtual vehicle. As shown in fig. 10, the terminal controls the virtual character 201 to enter the driving position of the virtual vehicle 202.
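Reusing the Seat structure sketched earlier, a minimal dispatch for step 103 might look like this; the mapping of a single tap to the driving position and a double tap to the passenger position simply mirrors the example above and is not mandated by the application.

```python
def on_position_selected(vehicle, character_name, operation):
    """Route a position selection operation to a free seat of the chosen kind."""
    wanted = "driving" if operation == "single_tap" else "passenger"
    for seat in vehicle.seats:
        if seat.kind == wanted and seat.occupant is None:
            seat.occupant = character_name
            return seat
    return None  # no free seat of the requested kind

# The double tap lands the character on the passenger seat with the weapon.
seat = on_position_selected(vehicle, "master_character", "double_tap")
assert seat is not None and seat.prop is not None
```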
Step 104: display a first user interface, wherein the first user interface comprises a first virtual world picture and a position change control, the first virtual world picture displays the master virtual character located at the driving position of the virtual vehicle, and the virtual vehicle further comprises a passenger position for controlling the onboard virtual prop.
Illustratively, the virtual vehicle includes at least one passenger position. Illustratively, every passenger position in the virtual vehicle corresponds to an onboard virtual prop; alternatively, only one passenger position corresponds to the onboard virtual prop, or only some of the passenger positions correspond to onboard virtual props.
For example, when the master virtual character is located within the virtual vehicle, a position change control is displayed on the user interface, i.e., the position change control is displayed on the user interface regardless of whether the master virtual character is located at the passenger position or the driving position.
For example, the position change control may be a single UI control; the user triggers it to switch the master virtual character's seating position to the next seating position in a preset order. For example, if the switching order is driving position, first passenger position, second passenger position, and back to the driving position, then when the master virtual character is at the first passenger position, triggering the position change control switches it to the second passenger position, and when it is at the second passenger position, triggering the control switches it back to the driving position. As shown in fig. 4, there is a position change control 305 at the lower left corner of the user interface; when the user triggers the position change control, the terminal automatically switches the master virtual character to the next seating position according to the character's current seating position and the preset switching order.
Alternatively, the position change control may comprise a plurality of UI controls, for example one UI control for each seating position on the virtual vehicle; when the user triggers a particular UI control, the terminal switches the master virtual character to the seating position corresponding to that control. For example, as shown in fig. 11, another position change control 305 is displayed at the lower left corner of the first user interface 400; it includes a first UI control 307 corresponding to the passenger position and a second UI control 306 corresponding to the driving position, and when the user triggers the first UI control 307, the terminal switches the master virtual character's seating position to the passenger position.
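For the single-control variant, the cyclic switching order described above reduces to a modular index step. The seat labels below are invented for illustration.

```python
SEAT_ORDER = ["driving", "passenger_1", "passenger_2"]   # assumed labels

def next_seat(current: str) -> str:
    """Seat reached when the single position change control is triggered."""
    return SEAT_ORDER[(SEAT_ORDER.index(current) + 1) % len(SEAT_ORDER)]

assert next_seat("passenger_1") == "passenger_2"
assert next_seat("passenger_2") == "driving"   # the cycle wraps back
```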
For example, the first virtual world picture is a picture obtained by observing the virtual world from the perspective of the virtual vehicle.
The perspective is an observation angle when the virtual world is observed from a first person perspective or a third person perspective of the virtual character or the virtual vehicle. Optionally, in an embodiment of the present application, the viewing angle is an angle when the virtual character or the virtual vehicle is observed through the camera model in the virtual world.
The following describes the viewing angle by taking a virtual character as an example. Optionally, the camera model automatically follows the virtual character in the virtual world, that is, when the position of the virtual character in the virtual world changes, the camera model changes while following the position of the virtual character in the virtual world, and the camera model is always within the preset distance range of the virtual character in the virtual world. Optionally, the relative positions of the camera model and the virtual character do not change during the automatic following process.
The camera model is a three-dimensional model located around the virtual character in the virtual world. When the first-person perspective is adopted, the camera model is located near or at the head of the virtual character. When the third-person perspective is adopted, the camera model may be located behind the virtual character and bound to it, or at any position a preset distance from the virtual character; through the camera model, the virtual character in the virtual world can be observed from different angles. Optionally, when the third-person perspective is a first-person over-the-shoulder perspective, the camera model is located behind the virtual character (for example, at the head and shoulders). Optionally, besides the first-person and third-person perspectives, other perspectives such as a top view may be used; when a top view is adopted, the camera model may be located above the virtual character's head, observing the virtual world from above. Optionally, the camera model is not actually displayed in the virtual world shown on the user interface.
To illustrate an example where the camera model is located at any position away from the virtual character by a preset distance, optionally, one virtual character corresponds to one camera model, and the camera model may rotate with the virtual character as a rotation center, for example: the camera model is rotated with any point of the virtual character as a rotation center, the camera model rotates not only angularly but also shifts in displacement during the rotation, and the distance between the camera model and the rotation center is kept constant during the rotation, that is, the camera model rotates on the surface of a sphere with the rotation center as the sphere center, wherein any point of the virtual character can be the head, the trunk or any point around the virtual character, which is not limited in the embodiment of the present application. Optionally, when the virtual character is observed by the camera model, the center of the view angle of the camera model points to the direction in which the point of the spherical surface on which the camera model is located points to the center of the sphere.
Optionally, the camera model may also observe the virtual character at a preset angle in different directions of the virtual character.
Referring to fig. 12, schematically, a point on the virtual character 11 is chosen as the rotation center 12, and the camera model rotates around that rotation center. Optionally, the camera model is configured with an initial position above and behind the virtual character (for example, behind the head). Illustratively, as shown in fig. 12, the initial position is position 13; when the camera model rotates to position 14 or position 15, the direction of its viewing angle changes accordingly.
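The camera model's rotation on a sphere around the rotation center can be written in spherical coordinates. This is a generic formulation, not code from the application, and the yaw/pitch convention (forward along +z) is an assumption.

```python
import math

def camera_position(center, radius, yaw, pitch):
    """Point on the sphere of the given radius around the rotation center;
    varying yaw/pitch rotates the camera while its distance stays constant."""
    x = center[0] + radius * math.cos(pitch) * math.sin(yaw)
    y = center[1] + radius * math.sin(pitch)
    z = center[2] + radius * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

def view_direction(center, position):
    """The camera looks from its point on the sphere toward the sphere center."""
    d = [c - p for c, p in zip(center, position)]
    n = math.sqrt(sum(v * v for v in d))
    return tuple(v / n for v in d)

center = (0.0, 1.7, 0.0)                 # e.g. a point at the character's head
pos = camera_position(center, radius=3.0, yaw=math.pi, pitch=0.4)
assert pos[1] > center[1]                # initial position: above and behind
```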
For example, as shown in fig. 11, a first user interface 400 is displayed. The first user interface 400 includes a first virtual world picture captured from the perspective of the virtual vehicle 202, and a position change control 305 for changing the master virtual character's seating position is provided in its lower left corner. The virtual vehicle 202 includes a driving position inside the vehicle and a passenger position 308 on top of it; the passenger position 308 corresponds to the onboard virtual prop 203 fixed on the virtual vehicle 202.
Step 1051: after the position change operation on the position change control is received, control the master virtual character's seating position to change from the driving position to the passenger position.
After receiving the position change operation on the position change control, the terminal changes the master virtual character's seating position from the driving position to the passenger position.
For example, as shown in fig. 11, after receiving a trigger operation on the first UI control 307 corresponding to the passenger position in the position change control 305, the terminal changes the master virtual character from the driving position to the passenger position 308. As shown in fig. 13, the master virtual character 201 moves from the driving position 309 to the passenger position 308 of the virtual vehicle 202.
Step 1052: after the position change operation on the position change control is received, control the master virtual character's perspective to change from the driving perspective corresponding to the driving position to the combat perspective corresponding to the passenger position.
After receiving the position change operation on the position change control, the terminal switches the master virtual character's perspective from the driving perspective corresponding to the driving position to the combat perspective corresponding to the passenger position.
The driving perspective is a perspective that facilitates the user's control of the virtual vehicle; the combat perspective is a perspective that facilitates the user's control of the onboard virtual prop. Illustratively, the driving perspective takes the virtual vehicle as the observation focus, while the combat perspective takes the master virtual character as the observation focus: on a virtual world picture captured from the driving perspective, the virtual vehicle is in the middle of the picture, and on one captured from the combat perspective, the master virtual character is in the middle. Illustratively, the driving perspective has a smaller field of view than the combat perspective.
Illustratively, the field of view of the combat perspective is larger than that of the driving perspective: the combat perspective can rotate through a larger angular range and rotates more flexibly, and/or its field of view is wider.
For example, as shown in fig. 14, in the virtual world, there is a first viewing angle 310 (a first camera model) corresponding to the virtual vehicle, and the first viewing angle 310 takes the virtual vehicle 202 as a center of sphere and a certain distance as a radius, and observes the virtual world around the surface of the sphere. In the virtual world, there is also a second viewing angle 311 (a second camera model) corresponding to the main control virtual character 201, and the second viewing angle 311 takes the main control virtual character 201 as a sphere center and takes a certain distance as a radius to observe the virtual world around the surface of the sphere.
For example, as shown in fig. 14, the virtual world picture captured from the second viewing angle 311 sees farther ahead and has a wider field of view than the picture captured from the first viewing angle 310.
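Step 1052's perspective switch can be sketched as swapping between two pre-configured camera settings. The field-of-view and rotation-range numbers below are invented; the description only requires the combat perspective to be wider and more flexible than the driving perspective.

```python
# Assumed settings: focus target, field of view, and allowed yaw range.
DRIVING_VIEW = {"focus": "vehicle", "fov_degrees": 60, "max_yaw_degrees": 180}
COMBAT_VIEW = {"focus": "character", "fov_degrees": 90, "max_yaw_degrees": 360}

def on_seat_changed(camera: dict, new_seat_kind: str) -> None:
    """Apply the driving view at the driving position, else the combat view."""
    view = DRIVING_VIEW if new_seat_kind == "driving" else COMBAT_VIEW
    camera.update(view)

camera = {}
on_seat_changed(camera, "passenger")
assert camera["focus"] == "character" and camera["max_yaw_degrees"] == 360
```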
And 106, after receiving the position replacement operation on the position replacement control, controlling the virtual carrier to be switched from the moving state to the stopping state.
When receiving the position replacement operation on the position replacement control, the terminal controls the virtual vehicle to switch from the moving state to the stopped state.
For example, when the master virtual character is changed from the driving position of the virtual vehicle to the passenger position and no virtual character is left to control the movement of the virtual vehicle, the virtual vehicle stops moving immediately, or coasts a certain distance under inertia and then stops. That is, the terminal controls the virtual vehicle to switch from the moving state to the stopped state.
The stopped state is a stationary state of the virtual vehicle.
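Illustratively, the switch from the moving state to the stopped state can be modelled as a per-frame speed update. The Python sketch below shows the inertia variant in which the vehicle coasts to a stop once no virtual character occupies the driving position; the deceleration constant is an illustrative assumption.

```python
def coast(speed: float, has_driver: bool, dt: float, deceleration: float = 8.0) -> float:
    """Per-frame speed update for the moving-to-stopped transition: once no
    virtual character occupies the driving position, the vehicle coasts under
    inertia at a constant (assumed) deceleration until it stops."""
    if has_driver:
        return speed  # normal driving logic applies elsewhere
    return max(0.0, speed - deceleration * dt)  # 0.0 means the stopped state
```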
Illustratively, the virtual vehicle may be used by multiple virtual characters simultaneously: for example, a first virtual character located at the driving position of the virtual vehicle controls the movement of the virtual vehicle, while a second virtual character located at the passenger position of the virtual vehicle controls the use of the onboard virtual item.
Illustratively, step 1051, step 1052, and step 106 are three steps that may be performed simultaneously, or sequentially in any order.
And 107, displaying a second user interface, wherein the second user interface comprises a second virtual world picture and a firing control, and the second virtual world picture displays a main control virtual role positioned on the position of the passenger.
For example, as shown in fig. 15, a second user interface 500 is displayed, and the second user interface 500 includes a second virtual world screen in which the master virtual character 201 is located at the passenger position 308 of the virtual vehicle 202, and firing controls 312 located on the left and right sides of the second user interface 500.
And 108, after receiving the firing operation on the firing control, controlling the main control virtual character to use the airborne virtual prop to launch a virtual component.
For example, as shown in fig. 15, a sight 313 corresponding to the onboard virtual item is displayed on the second user interface 500; after the terminal receives a firing operation on the firing control, it controls the master virtual character to use the onboard virtual item to launch the virtual component in the direction aimed at by the sight 313.
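Illustratively, the firing operation can be handled by casting a ray through the sight and launching the virtual component along it. The following Python sketch assumes hypothetical `prop` and `camera` objects with the interfaces described in the comments; none of these names come from the embodiment.

```python
def on_fire(prop, camera):
    """Launch a virtual component in the direction aimed at by the sight 313.

    Assumed interfaces: camera.ray_through_crosshair() returns the origin and
    normalized direction of a ray through the on-screen sight, and
    prop.spawn_projectile(origin, direction) creates the virtual component.
    """
    if prop.ammo == 0:
        return  # no components left; the cannot-fire prompt of step 113 applies
    origin, direction = camera.ray_through_crosshair()
    prop.spawn_projectile(origin, direction)
    prop.ammo -= 1
```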
Step 109, when the number of virtual components in the onboard virtual item is less than the maximum number, the onboard virtual item automatically supplements the virtual components at the target rate until the number of virtual components in the onboard virtual item reaches the maximum number.
Illustratively, the onboard virtual item is loaded with at most a maximum number of virtual components; the maximum number is the largest number of virtual components that can be loaded in the onboard virtual item.
Illustratively, the maximum number may also be positive infinity, i.e., there is an unlimited number of virtual components in the onboard virtual item.
When the number of virtual components in the onboard virtual item is less than the maximum number, the onboard virtual item automatically replenishes the virtual components at the target rate until the number of virtual components in the onboard virtual item reaches the maximum number.
Illustratively, the virtual components in the airborne virtual items are used indefinitely. That is, the virtual character can use the onboard virtual prop an unlimited number of times without any loading, charging, and maintenance operations.
For example, the maximum loading capacity of virtual components in the onboard virtual item is fixed; when the virtual character launches all of the virtual components in a short time, the onboard virtual item can no longer launch virtual components, and virtual components need to be added to it. Illustratively, the onboard virtual item is an item to which virtual components can be added automatically: when the number of virtual components in the onboard virtual item is less than the maximum number, the onboard virtual item automatically replenishes them. Illustratively, the replenishment may be performed by instantly restoring the number of virtual components to the maximum number, or by gradually replenishing the virtual components at a certain rate until the maximum number is reached.
For example, the onboard virtual item is a virtual firearm, the virtual component is a virtual bullet corresponding to the virtual firearm, and the virtual firearm can be loaded with up to 100 virtual bullets. After the virtual firearm fires, the number of virtual bullets in it is less than 100, for example 90; the virtual firearm then either immediately replenishes the virtual bullets back to 100, or gradually replenishes them at a rate of 1 bullet per second, so that the number of virtual bullets in the virtual firearm reaches 100 after 10 seconds and replenishment stops.
The target rate is the rate at which the onboard virtual item replenishes the virtual components. Illustratively, the target rate is the number of virtual components replenished by the onboard virtual item over a certain time, divided by that time; for example, the target rate may be 10 per second, 1 per minute, and so on. Illustratively, the target rate is less than the launch rate of the onboard virtual item in the continuous launch state, the continuous launch state being the state in which the onboard virtual item launches virtual components at its fastest rate. For example, when the user presses the firing control for a long time, the onboard virtual item enters the continuous launch state and launches virtual components at a rate of 10 per second; the target rate is then some rate lower than 10 per second.
Automatic replenishment means that the replenishment is completed by the onboard virtual item itself, without the master virtual character performing any replenishment operation.
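Illustratively, automatic replenishment at the target rate can be implemented as a per-frame update on the onboard virtual item. The Python sketch below uses the assumed capacity of 100 components and target rate of 1 per second from the example above; the class and field names are illustrative.

```python
class OnboardProp:
    """Automatic replenishment at the target rate (a sketch; capacity and
    rate follow the 100-component / 1-per-second example above)."""
    def __init__(self, capacity: int = 100, target_rate: float = 1.0):
        self.capacity = capacity        # maximum number of virtual components
        self.target_rate = target_rate  # components replenished per second
        self.ammo = capacity
        self._progress = 0.0            # fraction of the next component

    def tick(self, dt: float) -> None:
        """Called every frame; no action by the master virtual character is needed."""
        if self.ammo >= self.capacity:
            self._progress = 0.0
            return
        self._progress += self.target_rate * dt
        whole = int(self._progress)
        if whole:
            self.ammo = min(self.capacity, self.ammo + whole)
            self._progress -= whole
```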
Illustratively, the second user interface further includes a margin hint control for the virtual components. For example, when the maximum number is a fixed positive value, the number of virtual components remaining in the onboard virtual item may be displayed on the second user interface.
The margin hint control is a UI control corresponding to the number of virtual components in the onboard virtual item. For example, the number of virtual components in the onboard virtual item is displayed on the margin hint control.
And step 110, acquiring the ejection number and ejection time of the virtual components ejected by the onboard virtual item, and acquiring the maximum number and target rate of the onboard virtual item.
The terminal acquires the ejection number and ejection time of the virtual components ejected by the onboard virtual item, and acquires the maximum number and target rate of the onboard virtual item.
The ejection number is the number of virtual components ejected from the moment the onboard virtual item, starting in the full-load state, ejects its first virtual component, until the onboard virtual item next reaches the full-load state. The full-load state is the state in which the number of virtual components in the onboard virtual item reaches the maximum number.
The ejection time is the time at which the onboard virtual item ejects each virtual component.
And step 111, calculating the current number of virtual components in the onboard virtual item according to the ejection number and ejection time of the virtual components and the maximum number and target rate of the onboard virtual item.
The terminal calculates the current number of virtual components in the onboard virtual item according to the ejection number and ejection time of the virtual components and the maximum number and target rate of the onboard virtual item.
The current number is the number of virtual parts in the airborne virtual item at the current time.
For example, the onboard virtual item ejects 10 virtual components at 1 minute 30 seconds and 5 virtual components at 1 minute 35 seconds, the maximum number of the onboard virtual item is 100, the target rate is 1 per second, and the current time is 1 minute 40 seconds. The current number of virtual components in the onboard virtual item at 1 minute 40 seconds is then 100 - 10 - 5 + 10 = 95.
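Illustratively, step 111 amounts to replaying the ejection log while accruing replenishment (capped at the maximum number) between events. A minimal Python sketch follows, under the assumption that times are given in seconds; the function name and argument layout are illustrative.

```python
def current_count(shots, max_count, target_rate, now):
    """Reconstruct the current number of virtual components from the shot log.

    shots: list of (time, number_ejected) pairs since the last full-load state;
    replenishment runs at target_rate between events, capped at max_count.
    """
    count, last_t = max_count, shots[0][0] if shots else now
    for t, fired in shots:
        count = min(max_count, count + target_rate * (t - last_t))  # replenish
        count -= fired                                              # then eject
        last_t = t
    count = min(max_count, count + target_rate * (now - last_t))
    return int(count)

# Matching the example above: 10 components at 90 s, 5 at 95 s, queried at 100 s.
assert current_count([(90, 10), (95, 5)], 100, 1.0, 100) == 95
```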
And step 112, displaying margin hint information of the virtual components on the margin hint control according to the current number.
The terminal displays the margin hint information of the virtual components on the margin hint control according to the current number.
Illustratively, the terminal displays the margin hint information of the virtual components on the margin hint control according to at least one of the current number and the maximum number.
The margin hint information is information about the number of virtual components currently in the onboard virtual item. Illustratively, the margin hint information may be displayed in the form of at least one of a number, a percentage and a progress bar. Illustratively, the margin hint information includes at least one of the maximum number of virtual components that can be loaded in the onboard virtual item and the current number of virtual components currently loaded in the onboard virtual item.
For example, as shown in FIG. 15, a margin hint control 314 is displayed on the second user interface 500. The margin hint control displays the current number against the maximum number as a percentage progress bar: the larger the current number, the larger the unshaded part of the margin hint control 314, and conversely the larger the shaded part; when there is no virtual component in the onboard virtual item, the margin hint control 314 is completely shaded.
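Illustratively, the progress-bar form of the margin hint reduces to computing the filled fraction from the current number and the maximum number. The Python sketch below renders it as a textual bar purely for illustration; the function name and bar width are assumptions.

```python
def margin_hint(current: int, maximum: int, width: int = 10) -> str:
    """Textual stand-in for the shaded/unshaded bar of control 314
    (maximum is assumed to be a fixed positive value)."""
    filled = width * current // maximum
    return f"[{'#' * filled}{'.' * (width - filled)}] {current}/{maximum}"

print(margin_hint(95, 100))  # [#########.] 95/100
print(margin_hint(0, 100))   # [..........] 0/100, i.e. completely shaded
```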
And step 113, when the number of the virtual components in the onboard virtual item is zero, prompting that the onboard virtual item cannot be fired.
When the number of the virtual components in the airborne virtual prop is zero, the terminal prompts that the airborne virtual prop cannot be fired.
For example, when the number of virtual components in the onboard virtual item is zero, the user clicks the firing control again, and the terminal prompts that the onboard virtual item cannot fire.
For example, when there is no virtual component in the onboard virtual item, the onboard virtual item cannot shoot, and at this time the user is prompted that the onboard virtual item cannot be used.
For example, the terminal prompts that the onboard virtual item cannot fire by at least one of display, voice and vibration: in the display mode, prompt information that firing is impossible is shown on the user interface; in the voice mode, a voice prompt that firing is impossible is played; in the vibration mode, when the onboard virtual item has no virtual components and the user clicks the firing control again, the terminal vibrates to prompt the user that the onboard virtual item currently cannot eject virtual components.
For example, as shown in fig. 16, when the number of virtual components in the onboard virtual item is zero, the margin hint control 314 is completely shaded; when the user clicks the firing control 312 again, the terminal displays the cannot-fire prompt 315 on the second user interface 500 to prompt the user that the ammunition in the onboard virtual item is insufficient and the item cannot be used.
And step 114, after receiving the position replacement operation on the position replacement control again, controlling the main control virtual character to be replaced from the passenger position to the driving position.
And after the position replacing operation on the position replacing control is received again, the terminal controls the main control virtual character to be replaced from the passenger position to the driving position.
Illustratively, a position change control is also included on the second user interface.
When the master virtual character is located at the passenger position and the position replacement control receives a position replacement operation, the terminal controls the master virtual character to change from the passenger position to the driving position.
Illustratively, after receiving the position replacement operation on the position replacement control again, the terminal controls the riding position of the master virtual character to change from the passenger position to the driving position, and controls the perspective of the master virtual character to change from the combat perspective corresponding to the passenger position to the driving perspective corresponding to the driving position.
For example, as shown in fig. 16, when the master virtual character 201 is located at the passenger position 308 and the second UI control 306 in the position change control 305 receives a position change operation, the terminal controls the master virtual character to change from the passenger position 308 to the driving position, and controls the perspective of the master virtual character to change from the fighting perspective shown in fig. 16 to the driving perspective shown in fig. 17.
Step 115, displaying a third user interface, the third user interface including a driving control.
The terminal displays a third user interface, which includes a driving control.
The third user interface is the interface displayed when the master virtual character is in the driving position. Illustratively, the third user interface includes a third virtual world picture acquired from the driving perspective, a position replacement control and a driving control. Illustratively, the third user interface and the first user interface are the same interface displayed by the terminal at different times; that is, the UI controls on the third user interface and on the first user interface are the same. Illustratively, the third virtual world picture and the first virtual world picture are different virtual world pictures acquired from the driving perspective at different times. For example, the third virtual world picture and the first virtual world picture may also be the same virtual world picture.
The driving control is a UI control used to control the movement of the virtual vehicle in the virtual world. Illustratively, the driving controls include at least one of a forward control, a reverse control, a throttle control, a brake control, a left turn control, a right turn control, a direction control, an elevation control, and a descent control. Illustratively, the terminal moves the position of the virtual vehicle in the virtual world according to the triggering operation on the driving control. For example, when a triggering operation of a user on the driving control is not received, the virtual vehicle is in a static state or a free-wheeling state.
For example, as shown in FIG. 17, there is a driving control 316 on the third user interface 600. Illustratively, the steering controls 316 include a throttle control 3161, a brake control 3162, and a direction control 3163.
And step 116, when the driving operation on the driving control is received, controlling the virtual vehicle to enter a moving state according to the driving operation.
And when the driving operation on the driving control is received, the terminal controls the virtual vehicle to enter a moving state according to the driving operation.
The driving operation is a trigger operation performed by the user on the driving control. Illustratively, the terminal controls the moving mode of the virtual vehicle in the virtual world according to the driving operation. The moving mode comprises the following steps: at least one of forward, backward, acceleration, deceleration, steering, ascending and descending.
The moving state is a state in which the position of the virtual vehicle in the virtual world changes constantly with time.
For example, as shown in fig. 17, when the terminal receives a driving operation on the throttle control 3161, the terminal controls the virtual vehicle to accelerate in the virtual world; when the terminal receives a driving operation on the brake control 3162, the terminal controls the virtual vehicle to decelerate in the virtual world; and when the terminal receives a driving operation on the direction control 3163, the terminal controls the virtual vehicle to move forward, backward, left or right in the virtual world according to the driving operation.
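Illustratively, the mapping from driving operations on controls 3161 to 3163 to the vehicle's moving state can be expressed as a small per-frame dispatch. The Python sketch below uses illustrative physics constants and an assumed minimal vehicle representation.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    speed: float = 0.0        # units per second
    heading_deg: float = 0.0  # facing direction

def apply_driving_op(vehicle: Vehicle, op: str, dt: float) -> None:
    """Map a driving operation to a change in the vehicle's moving state.
    The constants (acceleration, braking, turn rate) are illustrative."""
    if op == "throttle":      # throttle control 3161: accelerate
        vehicle.speed += 5.0 * dt
    elif op == "brake":       # brake control 3162: decelerate
        vehicle.speed = max(0.0, vehicle.speed - 10.0 * dt)
    elif op == "left":        # direction control 3163: steer
        vehicle.heading_deg -= 45.0 * dt
    elif op == "right":
        vehicle.heading_deg += 45.0 * dt
    # with no driving operation the vehicle remains static or coasts freely
```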
in summary, in the method provided in this embodiment, the airborne virtual item is added to the passenger position of the virtual vehicle, and when the virtual character is located at the passenger position, the virtual character can control the airborne virtual item to eject the virtual component to cause damage to other virtual characters. When the virtual character is located at the driving position, the virtual character can control the virtual carrier to move in the virtual world. The virtual carrier is combined with the airborne virtual prop, so that the virtual carrier has the functions of movement and attack, the virtual role can reduce the life value of other virtual roles by using the airborne virtual prop, and the airborne virtual prop is simple in use and convenient for user operation.
By observing the master virtual character from different perspectives at different riding positions on the virtual vehicle, when the master virtual character is in the driving position, a driving perspective with a smaller field of view is provided, making it convenient for the user to control the heavy virtual vehicle to move in the virtual world; when the master virtual character is in the passenger position, a combat perspective with a larger field of view is provided, making it convenient for the user to control the onboard virtual item to shoot in the virtual world. This widens the field of view of the master virtual character when shooting from the virtual vehicle and facilitates user operation.
Position replacement controls are arranged on the first user interface and the second user interface to control the master virtual character to change riding positions in the virtual vehicle, so that the master virtual character can freely switch between controlling the virtual vehicle to move in the virtual world and controlling the onboard virtual item to eject virtual components.
By adding the margin hint control, the user is prompted with the remaining number of virtual components in the onboard virtual item; when the number of virtual components in the onboard virtual item is low, the user can be reminded to switch the master virtual character from the passenger position to the driving position and move away as soon as possible, so as not to face the enemy without fire cover.
By configuring the onboard virtual item on the virtual vehicle to automatically replenish its virtual components, the user does not need to pick up virtual components in the virtual world and manually replenish the virtual vehicle with them, which reduces the difficulty of using the onboard virtual item and improves its sustained availability.
By adding a collision box to the virtual vehicle, the approach of the master virtual character can be accurately detected, which improves the terminal's sensitivity in detecting collisions between the master virtual character and the virtual vehicle.
The application also provides an exemplary embodiment for controlling the virtual character to enter the virtual carrier.
Fig. 18 is a flowchart illustrating a method for controlling a virtual vehicle according to another exemplary embodiment of the present application, where the method may be applied to the first terminal 120 or the second terminal 160 in the computer system shown in fig. 5 or other terminals in the computer system. Unlike the control method of the virtual vehicle shown in fig. 6, steps 102, 1031, and 1032 are added before step 104:
And step 102, displaying a position selection control when the master virtual character approaches the virtual vehicle.
Step 1031, when a position selection operation on the position selection control is received, acquiring a root node on the character model of the master virtual character, where the character model is a three-dimensional model corresponding to the master virtual character in the virtual world, the root node is a point on the character model, and the character model moves along with the movement of the root node.
When receiving a position selection operation on the position selection control, the terminal acquires a root node on the role model of the main control virtual role.
The root node is a point on the character model. For example, the position of the root node in the virtual world may represent the position of the master virtual character in the virtual world, that is, when the position of the root node is determined, the positions of other nodes on the character model in the virtual world are determined according to the position of the root node.
For example, after the master virtual character enters the virtual vehicle, the character model of the master virtual character moves along with the movement of the virtual vehicle, and only a root node on the character model needs to be connected with a point on the virtual vehicle model, and the root node moves along with the movement of the virtual vehicle, so that the character model of the master virtual character moves along with the movement of the virtual vehicle.
For example, there is a root node (0, 0) on the role model of the master virtual role, and there is a second node (1, 0) one unit distance from the root node. Then, when the root node moves to the (3, 3) position, the second node moves with the movement of the root node, i.e., to the (4, 3) position.
Step 1032, associating the root node with a first node on the virtual vehicle model, where the virtual vehicle model is the virtual model corresponding to the virtual vehicle in the virtual world, the first node is a point on the virtual vehicle model, and the association means controlling the position of one node to move following the movement of the other node.
The terminal associates the root node with a first node on the virtual vehicle model.
Illustratively, the terminal mounts the root node under the first node on the virtual vehicle model. I.e., the root node moves as the first node on the virtual vehicle model moves. When the position of the first node of the virtual vehicle model in the virtual world is determined, the position of the root node in the virtual world is determined according to the position of the first node.
For example, there is a first node at (0, 0) on the virtual vehicle, and the root node is mounted under the first node, initially located at (0, 1). When the first node moves to the (1, 1) position, the root node follows the movement to the (1, 2) position.
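Illustratively, steps 1031 and 1032 correspond to ordinary transform parenting in a scene graph: a child node's world position is derived from its parent's, so mounting the character model's root node under a node on the vehicle model makes the whole character model follow the vehicle. A minimal Python sketch with assumed 2-D coordinates, matching the examples above:

```python
class Node:
    """A scene-graph node whose world position is its parent's world
    position plus its own local offset (2-D, as in the examples above)."""
    def __init__(self, local_x: float, local_y: float, parent=None):
        self.local = (local_x, local_y)
        self.parent = parent

    def world(self) -> tuple:
        if self.parent is None:
            return self.local
        px, py = self.parent.world()
        return (px + self.local[0], py + self.local[1])

first_node = Node(0, 0)                    # a point on the virtual vehicle model
root_node = Node(0, 1, parent=first_node)  # character root mounted under it
first_node.local = (1, 1)                  # the vehicle (and its node) moves...
assert root_node.world() == (1, 2)         # ...and the root node follows
```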
In summary, in the method provided in this embodiment, the root node on the master virtual character model is associated with the first node on the virtual vehicle model, and the master virtual character model is controlled to move in the virtual world along with the virtual vehicle, so that the master virtual character enters the virtual vehicle, and the virtual vehicle can move in the virtual world while carrying the master virtual character.
For example, the methods provided in the above embodiments can be split and then freely combined into new embodiments.
The application also provides an exemplary embodiment of a control method for using the virtual vehicle in a tactical competitive game. A tactical competitive game is a game in which a plurality of virtual characters participate together and compete for survival in a virtual world.
Fig. 19 is a flowchart illustrating a method for controlling a virtual vehicle according to another exemplary embodiment of the present application, where the method may be applied to the first terminal 120 or the second terminal 160 in the computer system shown in fig. 5 or other terminals in the computer system. The method comprises the following steps:
step 701, the terminal controls the master control virtual character to search for a combat carrier.
Illustratively, the master virtual character first looks for a combat vehicle (virtual vehicle) in the virtual world.
In step 702, the terminal determines whether the master control avatar has found a combat vehicle.
When the master control virtual character finds the battle carrier, the step 703 is performed; otherwise, the procedure returns to step 701.
Illustratively, the master virtual character finding the combat vehicle means that the combat vehicle appears in the virtual world picture.
In step 703, the terminal controls the master virtual character to walk beside the combat vehicle.
For example, after the master virtual character finds the combat vehicle, the terminal controls the master virtual character to approach it; when the character model of the master virtual character collides with the collision box on the combat vehicle, the collision box acquires the collision information, and the terminal displays the position selection control on the user interface.
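Illustratively, the approach test in step 703 reduces to checking whether the character's position falls inside the collision box attached to the combat vehicle. The Python sketch below uses an axis-aligned box for simplicity; the function name and the box representation are assumptions.

```python
def character_near_vehicle(char_pos, box_min, box_max) -> bool:
    """Return True when the character's position lies inside the axis-aligned
    collision box attached to the combat vehicle; all arguments are (x, y, z)
    tuples, and the AABB simplification is an assumption."""
    return all(lo <= p <= hi for p, lo, hi in zip(char_pos, box_min, box_max))

# e.g. a 2 x 2 x 4 box centered on the vehicle's origin:
assert character_near_vehicle((0.5, 0.0, 1.0), (-1, -1, -2), (1, 1, 2))
```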
In step 704, the terminal determines whether the user clicks on the passenger location.
Illustratively, the position selection control is provided with a passenger position selection control corresponding to the passenger position and a driving position selection control corresponding to the driving position. When the user clicks the passenger position selection control corresponding to the passenger position, step 705 is performed; otherwise, the process returns to step 703.
Step 705, the terminal controls the master virtual character to enter the combat vehicle.
Illustratively, the terminal controls the master virtual character to enter the passenger position of the combat vehicle.
In step 706, the terminal determines whether the user clicks to fire.
Illustratively, when the user clicks the firing control on the user interface, step 707 is performed, otherwise step 705 is returned.
In step 707, the terminal controls the master virtual character to fire a bullet using a weapon.
Illustratively, the terminal controls the master virtual character to fire bullets (virtual components) using the weapon (onboard virtual item) fixed to the combat vehicle.
In step 708, the terminal determines whether the ammunition of the weapon is used up.
Illustratively, when all of the bullets in the weapon are used up, step 709 is performed; otherwise, the process returns to step 707.
Step 709, the terminal determines that the weapon cannot be fired and prompts the user that the weapon cannot be used currently.
Illustratively, when the current number of rounds in the weapon is zero, the weapon cannot fire any more rounds, at which point the terminal determines that the weapon cannot be used and prompts the user on the user interface that the weapon is currently unavailable.
In step 710, the terminal determines whether the bullets of the weapon have been replenished.
Illustratively, the weapon will automatically replenish the bullet. When the weapon is replenished with bullets, proceed to step 711; otherwise, return to step 709.
At step 711, the terminal determines that the weapon can be fired again.
For example, when the current number of rounds in the weapon is not zero, the terminal determines that the weapon can be fired again.
In step 712, the terminal determines whether the master virtual character is switched to the driving position.
Illustratively, the terminal determines whether a location change control on the user interface has received a location change operation. If the position replacement operation is received, go to step 713; otherwise, return to step 711.
In step 713, the terminal controls the master control virtual character to drive the vehicle to move.
Illustratively, the terminal controls the riding position of the main control virtual character to be switched from the passenger position to the driving position, and controls the combat vehicle to move in the virtual world according to the driving operation of the user.
In summary, this exemplary embodiment provides an exemplary method for controlling a virtual vehicle. After the master virtual character enters a virtual vehicle with an onboard virtual item fixed to it, the character can take the passenger position through the position replacement control, thereby controlling the onboard virtual item at the passenger position to eject virtual components, which adds a way for the virtual vehicle to damage other virtual characters. Compared with controlling the virtual vehicle to ram other virtual characters, using the onboard virtual item to shoot them is a simpler control mode and is more convenient for the user.
The following are embodiments of the apparatus of the present application, and for details that are not described in detail in the embodiments of the apparatus, reference may be made to corresponding descriptions in the above method embodiments, and details are not described herein again.
Fig. 20 is a schematic structural diagram illustrating a control apparatus of a virtual vehicle according to an exemplary embodiment of the present application. The apparatus can be implemented as all or a part of a terminal by software, hardware or a combination of both, and includes:
a display module 801, configured to display a first user interface, where the first user interface includes a first virtual world picture and a position change control, the first virtual world picture displays a master control virtual character located at a driving position of a virtual vehicle, and the virtual vehicle further includes a passenger position for controlling an airborne virtual item;
an interaction module 802, configured to receive a position change operation on the position change control;
a control module 803, configured to control the master virtual character to change from the driving position to the passenger position after receiving a position change operation on the position change control;
the display module 801 is further configured to display a second user interface, where the second user interface includes a second virtual world screen and a firing control, and the second virtual world screen displays the main control virtual character located at the passenger position;
the interaction module 802 is further configured to receive a firing operation on the firing control;
the control module 803 is further configured to control the main control virtual character to use the airborne virtual item to launch a virtual component after receiving the firing operation on the firing control.
In an optional embodiment, the control module 803 is further configured to control the riding position of the master virtual character to be changed from the driving position to the passenger position;
the control module 803 is further configured to control the perspective of the main control virtual character to be changed from the driving perspective corresponding to the driving position to the combat perspective corresponding to the passenger position.
In an alternative embodiment, the range of view of the combat perspective is greater than the range of view of the driving perspective.
In an optional embodiment, the interaction module 802 is further configured to receive a position change operation on the position change control;
the control module 803 is further configured to control the virtual vehicle to switch from a moving state to a stopped state after receiving the position replacement operation on the position replacement control.
In an alternative embodiment, the second user interface further comprises the position change control;
the interaction module 802 is further configured to receive the position change operation on the position change control;
the control module 803 is further configured to control the master control virtual character to change from the passenger position to the driving position after receiving the position change operation on the position change control again;
the display module 801 is further configured to display a third user interface, where the third user interface includes a driving control;
the interaction module 802 is further configured to receive a driving operation on the driving control;
the control module 803 is further configured to, when a driving operation on the driving control is received, control the virtual vehicle to enter a moving state according to the driving operation.
In an optional embodiment, the display module 801 is further configured to display a position selection control when the master virtual character approaches the virtual vehicle;
the interaction module 802 is further configured to receive a location selection operation on the location selection control;
the control module 803 is further configured to, when receiving a position selection operation on the position selection control, control the master virtual character to enter the virtual vehicle according to the position selection operation.
In an optional embodiment, the virtual vehicle includes at least one collision box thereon, and the apparatus further includes: an obtaining module 806 and a determining module 804;
the obtaining module 806 is configured to obtain collision information about the master virtual character on the collision box;
the determining module 804 is configured to determine that the master virtual character is close to the virtual vehicle when the collision information about the master virtual character on the collision box is acquired.
In an optional embodiment, the apparatus further comprises: an association module 808;
the obtaining module 806 is further configured to obtain a root node on a master virtual character model, where the master virtual character model is a virtual model corresponding to the master virtual character in the virtual world, and the root node is a point on the master virtual character model, and the master virtual character model moves along with movement of the root node;
the association module 808 is configured to associate the root node with a first node on a virtual vehicle model, where the virtual vehicle model is a virtual model of the virtual vehicle corresponding to the virtual world, the first node is a point on the virtual vehicle model, and the association controls a position of one node to move along with a movement of another node.
In an alternative embodiment, said airborne virtual item is loaded with a maximum number of said virtual components at most, said apparatus further comprising: a loading module 807;
the loading module 807 is configured to, when the number of virtual components in the onboard virtual item is less than the maximum number, automatically replenish the virtual components with the onboard virtual item at a target rate until the number of virtual components in the onboard virtual item reaches the maximum number.
In an optional embodiment, the second user interface further comprises a margin hint control for the virtual part.
In an alternative embodiment, the apparatus further comprises: an obtaining module 806 and a calculating module 805;
the obtaining module 806 is configured to obtain the number of virtual components ejected by the onboard virtual item and the ejection time, and obtain the maximum number and the target rate of the onboard virtual item;
the calculating module 805 is configured to calculate the current number of the virtual components in the onboard virtual item according to the ejection number and ejection time of the virtual components, the maximum number of the onboard virtual item, and the target rate;
the display module 801 is further configured to display the margin hint information of the virtual components on the margin hint control according to the current number.
In an optional embodiment, the apparatus further comprises: a prompt module 809;
the prompting module 809 is configured to prompt that the onboard virtual item cannot fire when the number of the virtual components in the onboard virtual item is zero.
Referring to fig. 21, a block diagram of a computer device 1300 according to an exemplary embodiment of the present application is shown. The computer device 1300 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player. The computer device 1300 may also be referred to by other names such as user equipment, portable terminal, etc.
Generally, computer device 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1302 may include one or more computer-readable storage media, which may be tangible and non-transitory. The memory 1302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1302 is used to store at least one instruction for execution by the processor 1301 to implement the control method of the virtual vehicle provided in the present application.
In some embodiments, the electronic device 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, touch display 1305, camera 1306, audio circuitry 1307, positioning component 1308, and power supply 1309.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display 1305 also has the capability to collect touch signals on or over the surface of the touch display 1305. The touch signal may be input to the processor 1301 as a control signal for processing. The touch display 1305 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the touch display 1305 may be one, providing the front panel of the electronic device 1300; in other embodiments, the touch display 1305 may be at least two, respectively disposed on different surfaces of the electronic device 1300 or in a folded design; in still other embodiments, the touch display 1305 may be a flexible display disposed on a curved surface or on a folded surface of the electronic device 1300. Even more, the touch screen 1305 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The touch Display 1305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 1306 is used to capture images or video. Optionally, camera assembly 1306 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a VR (Virtual Reality) shooting function. In some embodiments, camera assembly 1306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1307 is used to provide an audio interface between the user and the electronic device 1300. The audio circuit 1307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1301 for processing, or inputting the electric signals to the radio frequency circuit 1304 for realizing voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the electronic device 1300. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuitry 1304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 1307 may also include a headphone jack.
The positioning component 1308 is used to locate the current geographic location of the electronic device 1300 for navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 1309 is used to provide power to various components within the electronic device 1300. The power source 1309 may be alternating current, direct current, disposable or rechargeable. When the power source 1309 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the electronic device 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyro sensor 1312, pressure sensor 1313, fingerprint sensor 1314, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 may detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the electronic apparatus 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1312 may detect the body direction and the rotation angle of the electronic device 1300, and the gyro sensor 1312 may cooperate with the acceleration sensor 1311 to acquire a 3D motion of the user on the electronic device 1300. Processor 1301, based on the data collected by gyroscope sensor 1312, may perform the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensors 1313 may be disposed on a side bezel of the electronic device 1300 and/or underlying the touch display 1305. When the pressure sensor 1313 is provided in the side frame of the electronic apparatus 1300, a user's grip signal for the electronic apparatus 1300 can be detected, and left-right hand recognition or shortcut operation can be performed based on the grip signal. When the pressure sensor 1313 is disposed on the lower layer of the touch display 1305, it is possible to control an operability control on the UI interface according to a pressure operation of the user on the touch display 1305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1314 is used for collecting the fingerprint of the user to identify the identity of the user according to the collected fingerprint. When the identity of the user is identified as a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1314 may be disposed on the front, back, or side of the electronic device 1300. When a physical button or vendor Logo is provided on the electronic device 1300, the fingerprint sensor 1314 may be integrated with the physical button or vendor Logo.
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 can control the display brightness of the touch display screen 1305 according to the intensity of the ambient light collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is turned down. In another embodiment, the processor 1301 can also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
A proximity sensor 1316, also known as a distance sensor, is typically disposed on the front side of the electronic device 1300. The proximity sensor 1316 is used to capture the distance between the user and the front face of the electronic device 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front face of the electronic device 1300 gradually decreases, the processor 1301 controls the touch display 1305 to switch from the screen-on state to the screen-off state; when the proximity sensor 1316 detects that the distance between the user and the front face of the electronic device 1300 gradually increases, the processor 1301 controls the touch display 1305 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 21 does not constitute a limitation of the electronic device 1300, and may include more or fewer components than those shown, or combine certain components, or employ a different arrangement of components.
The present application further provides a terminal, including: the virtual vehicle control system comprises a processor and a memory, wherein at least one instruction, at least one program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to realize the control method of the virtual vehicle provided by the method embodiments.
The present application further provides a computer device, comprising: the virtual vehicle control system comprises a processor and a memory, wherein at least one instruction, at least one program, a code set or an instruction set is stored in the storage medium, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to realize the control method of the virtual vehicle provided by the method embodiments.
The present application further provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the control method for a virtual vehicle provided by the above method embodiments.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (14)

1. A method for controlling a virtual vehicle, the method comprising:
displaying a first user interface, wherein the first user interface comprises a first virtual world picture and a position changing control, the first virtual world picture displays a main control virtual character positioned at a driving position of a virtual carrier, and the virtual carrier further comprises a passenger position for controlling an airborne virtual prop;
when a position changing operation on the position changing control is received, controlling the main control virtual character to be changed from the driving position to the passenger position;
displaying a second user interface, wherein the second user interface comprises a second virtual world picture and a firing control, and the second virtual world picture displays the main control virtual role positioned on the passenger position;
and after receiving the firing operation on the firing control, controlling the main control virtual role to use the airborne virtual prop on the virtual carrier to launch a virtual component.
2. The method of claim 1, wherein the controlling the master virtual character to change from the driving location to the passenger location comprises:
controlling the ride position of the master virtual character to change from the drive position to the passenger position;
and controlling the perspective of the main control virtual character to be changed from the driving perspective corresponding to the driving position to the combat perspective corresponding to the passenger position.
3. The method of claim 2,
the field of view of the combat perspective is greater than the field of view of the driving perspective.
4. The method of any of claims 1 to 3, further comprising:
and when the position changing operation on the position changing control is received, controlling the virtual carrier to be switched from a moving state to a stopped state.
5. The method of any of claims 1-4, wherein the second user interface further comprises the position changing control, the method further comprising:
when the position changing operation on the position changing control is received again, controlling the main control virtual character to be changed from the passenger position to the driving position;
displaying a third user interface, the third user interface including a driving control;
and when the driving operation on the driving control is received, controlling the virtual vehicle to enter a moving state according to the driving operation.
6. The method of any of claims 1 to 5, further comprising:
displaying a position selection control when the master control virtual character approaches the virtual vehicle;
and when a position selection operation on the position selection control is received, controlling the main control virtual role to enter the virtual carrier according to the position selection operation.
7. The method of claim 6, wherein the virtual vehicle includes at least one collision box thereon, the method further comprising:
and when the collision information related to the main control virtual character on the collision box is acquired, determining that the main control virtual character is close to the virtual carrier.
8. The method of claim 6, wherein the controlling the master virtual character into the virtual vehicle according to the location selection operation comprises:
acquiring a root node on a role model of the master virtual role, wherein the role model is a three-dimensional model corresponding to the master virtual role in the virtual world, the root node is a point on the master virtual role model, and the master virtual role model moves along with the movement of the root node;
associating the root node with a first node on a virtual vehicle model, the virtual vehicle model being a corresponding virtual model of the virtual vehicle in the virtual world, the first node being a point on the virtual vehicle model, the associating being controlling the position of one node to move following the movement of another node.
9. The method of any one of claims 1 to 8, wherein the onboard virtual prop holds at most a maximum number of the virtual components, the method further comprising:
when the number of the virtual components in the onboard virtual prop is less than the maximum number, automatically replenishing the virtual components in the onboard virtual prop at a target rate until the number of the virtual components reaches the maximum number.
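A sketch of the automatic replenishment in claim 9, driven by a per-frame tick. The maximum number and target rate are illustrative constants, not values from the patent.

    MAX_COMPONENTS = 40  # maximum number (illustrative)
    TARGET_RATE = 2.0    # components replenished per second (illustrative)

    class OnboardProp:
        def __init__(self):
            self.count = float(MAX_COMPONENTS)

        def eject(self):
            if self.count >= 1.0:
                self.count -= 1.0
                return True
            return False  # claim 11: firing is refused when the count is zero

        def tick(self, dt_seconds):
            # Claim 9: below the maximum, refill at the target rate, capped
            # at the maximum number.
            if self.count < MAX_COMPONENTS:
                self.count = min(float(MAX_COMPONENTS),
                                 self.count + TARGET_RATE * dt_seconds)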
10. The method of claim 9, wherein the second user interface further comprises a margin hint control for the virtual components, the method further comprising:
acquiring the ejection number and the ejection time of the virtual components ejected by the onboard virtual prop, and acquiring the maximum number and the target rate of the onboard virtual prop;
calculating the current number of the virtual components in the onboard virtual prop according to the ejection number, the ejection time, the maximum number, and the target rate;
and displaying margin hint information for the virtual components on the margin hint control according to the current number.
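Claim 10's margin calculation admits a simple closed form if one assumes replenishment runs continuously whenever the prop is below its maximum: with maximum number m, ejection number n, elapsed time t, and target rate r, the current number is min(m, m - n + r*t), clamped at zero. The sketch below encodes that assumption.

    def current_components(max_count, ejected, elapsed_seconds, rate):
        # Claim 10: derive the current number from the ejection number and
        # ejection time, the maximum number, and the target rate.
        return max(0.0, min(float(max_count),
                            max_count - ejected + rate * elapsed_seconds))

    # Worked example: m = 40, n = 25 ejected, r = 2/s, t = 5 s
    # -> 40 - 25 + 10 = 25 components shown on the margin hint control.
    assert current_components(40, 25, 5.0, 2.0) == 25.0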
11. The method of claim 9, further comprising:
when the number of the virtual components in the onboard virtual prop is zero, prompting that the onboard virtual prop cannot fire.
12. An apparatus for controlling a virtual vehicle, the apparatus comprising:
a display module, configured to display a first user interface, wherein the first user interface comprises a first virtual world picture and a position replacement control, the first virtual world picture displays a master virtual character located at a driving position of the virtual vehicle, and the virtual vehicle further comprises a passenger position used for controlling an onboard virtual prop;
an interaction module, configured to receive a position replacement operation on the position replacement control;
a control module, configured to control the master virtual character to change from the driving position to the passenger position after the position replacement operation on the position replacement control is received;
the display module is further configured to display a second user interface, wherein the second user interface comprises a second virtual world picture and a firing control, and the second virtual world picture displays the master virtual character located at the passenger position;
the interaction module is further configured to receive a firing operation on the firing control;
and the control module is further configured to control the master virtual character to use the onboard virtual prop on the virtual vehicle to launch a virtual component after the firing operation on the firing control is received.
13. A computer device, comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for controlling a virtual vehicle according to any one of claims 1 to 11.
14. A computer-readable storage medium, wherein the storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method for controlling a virtual vehicle according to any one of claims 1 to 11.
CN201911113863.9A 2019-11-14 2019-11-14 Virtual vehicle control method, device, equipment and storage medium Active CN110876849B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911113863.9A CN110876849B (en) 2019-11-14 2019-11-14 Virtual vehicle control method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110876849A (en) 2020-03-13
CN110876849B CN110876849B (en) 2022-09-20

Family

ID=69730578

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911113863.9A Active CN110876849B (en) 2019-11-14 2019-11-14 Virtual vehicle control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110876849B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014195591A (en) * 2013-03-29 2014-10-16 株式会社コナミデジタルエンタテインメント Game machine, and control method and computer program used for the same
CN109718545A (en) * 2017-10-31 2019-05-07 腾讯科技(上海)有限公司 Object control device and method
CN108245888A (en) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 Virtual object control method, device and computer equipment

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
佚名 (Anonymous): "How to shoot while driving in Game for Peace (和平精英)", 21 May 2019 *
小贝的游戏食堂: "Battle-royale airdrop vehicle mode: players across the map summon armored vehicles and ram each other in the final circle!", 17 July 2019 *
绿茶说游: "Driving a tank for the chicken dinner: the most novel battle-royale game I have seen", 29 October 2018 *
高哥哥解说: "PUBG Mobile (刺激战场): surrounded by a jeep convoy, watch me blow up the armored-vehicle squad for the win", 7 February 2019 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111603766A (en) * 2020-06-29 2020-09-01 上海完美时空软件有限公司 Control method and device of virtual carrier, storage medium and electronic device
CN111603766B (en) * 2020-06-29 2024-01-09 上海完美时空软件有限公司 Virtual carrier control method and device, storage medium and electronic device
CN114011073A (en) * 2021-11-05 2022-02-08 腾讯科技(深圳)有限公司 Method, device and equipment for controlling vehicle and computer readable storage medium
CN114011073B (en) * 2021-11-05 2023-07-14 腾讯科技(深圳)有限公司 Method, apparatus, device and computer readable storage medium for controlling carrier
WO2023185259A1 (en) * 2022-04-01 2023-10-05 腾讯科技(深圳)有限公司 Virtual object control method and related apparatus
WO2024016769A1 (en) * 2022-07-21 2024-01-25 腾讯科技(深圳)有限公司 Information processing method and apparatus, and storage medium and electronic device
WO2024037208A1 (en) * 2022-08-18 2024-02-22 腾讯科技(深圳)有限公司 Vehicle interaction method and apparatus in virtual scene, and device and computer program product

Also Published As

Publication number Publication date
CN110876849B (en) 2022-09-20

Similar Documents

Publication Publication Date Title
CN110448891B (en) Method, device and storage medium for controlling virtual object to operate remote virtual prop
CN110876849B (en) Virtual vehicle control method, device, equipment and storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
WO2019179294A1 (en) Equipment display method, apparatus, device and storage medium in virtual environment battle
CN110585710B (en) Interactive property control method, device, terminal and storage medium
WO2021143260A1 (en) Method and apparatus for using virtual props, computer device and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN110465098B (en) Method, device, equipment and medium for controlling virtual object to use virtual prop
CN110507990B (en) Interaction method, device, terminal and storage medium based on virtual aircraft
CN112316421B (en) Equipment method, device, terminal and storage medium of virtual item
CN111001159B (en) Virtual item control method, device, equipment and storage medium in virtual scene
CN112057857B (en) Interactive property processing method, device, terminal and storage medium
CN111330274B (en) Virtual object control method, device, equipment and storage medium
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
CN113713382B (en) Virtual prop control method and device, computer equipment and storage medium
CN112717410B (en) Virtual object control method and device, computer equipment and storage medium
US11786817B2 (en) Method and apparatus for operating virtual prop in virtual environment, device and readable medium
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN111659116A (en) Virtual vehicle control method, device, equipment and medium
CN113041622A (en) Virtual throwing object throwing method in virtual environment, terminal and storage medium
CN111921190B (en) Prop equipment method, device, terminal and storage medium for virtual object
CN113713383A (en) Throwing prop control method and device, computer equipment and storage medium
CN112044073A (en) Using method, device, equipment and medium of virtual prop
CN111589137B (en) Control method, device, equipment and medium of virtual role
CN110960849B (en) Interactive property control method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code: HK; legal event code: DE; document number: 40022519

GR01 Patent grant