CN113440850A - Virtual object control method and device, storage medium and electronic device

Virtual object control method and device, storage medium and electronic device

Info

Publication number: CN113440850A
Application number: CN202110580068.1A
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 王亮亮
Original and current assignee: Perfect World Beijing Software Technology Development Co Ltd
Legal status: Pending
Prior art keywords: control, virtual object, virtual, state, controlling

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30: Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308: Details of the user interface
    • A63F 2300/60: Methods for processing data by generating or executing the game program
    • A63F 2300/6045: Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands

Landscapes

  • Engineering & Computer Science
  • Multimedia
  • Human Computer Interaction
  • User Interface Of Digital Computer
  • Processing Or Creating Images

Abstract

The invention provides a virtual object control method and device, a storage medium, and an electronic device. The method includes: when a virtual object in a virtual scene is in a first motion state, presenting a first movement control in a control interface of the virtual object, where the first movement control is used for controlling the virtual object to move in the horizontal direction under the world coordinate system of the virtual scene, and the first movement control includes a first portion and a second portion surrounding the first portion; and controlling the virtual object to switch from the first motion state to a second motion state in response to a touch operation at the first portion and/or the second portion. The method and device solve the technical problem in the related art that the motion state of a virtual object cannot be actively switched, and improve human-computer interaction efficiency.

Description

Virtual object control method and device, storage medium and electronic device
Technical Field
The invention relates to the technical field of computers, in particular to a method and a device for controlling a virtual object, a storage medium and an electronic device.
Background
In the related art, display technologies based on graphics processing hardware have expanded the environments that can be perceived and the channels through which information is obtained. In particular, display technologies for virtual scenes can realize diversified interaction between virtual objects controlled by users or by artificial intelligence according to actual application requirements, and have various typical application scenarios; for example, a realistic combat process between virtual objects can be simulated in virtual scenes such as military exercise simulations and games.
In the related art, a user can control the movement of a virtual object, such as a player character, in a virtual scene such as a game through a terminal; for example, when the virtual object needs to be moved, it can be controlled through a control interface to move in four directions: forward, backward, left, and right. However, when a character moves in a three-dimensional or more complex scene, the four directions of the direction wheel cannot cover all motion states, resulting in low human-computer interaction efficiency.
In view of the above problems in the related art, no effective solution has been found at present.
Disclosure of Invention
The embodiment of the invention provides a control method and device of a virtual object, a storage medium and an electronic device.
According to an embodiment of the present invention, there is provided a method of controlling a virtual object, including: when a virtual object in a virtual scene is in a first motion state, presenting a first movement control in a control interface of the virtual object, wherein the first movement control is used for controlling the virtual object to move in the horizontal direction under a world coordinate system of the virtual scene, and the first movement control comprises a first part and a second part surrounding the first part; controlling the virtual object to switch from the first motion state to a second motion state in response to a touch operation at the first portion and/or the second portion.
Optionally, the method further includes: presenting a second moving control and a switching control in the virtual scene, wherein the second moving control is used for controlling the virtual object to move in the vertical direction under the world coordinate system, the switching control is arranged on the first part, and the switching control is used for controlling the carrier state of the virtual object; and controlling the virtual object to be switched from the third motion state to the fourth motion state in response to a state switching operation triggered based on the switching control.
Optionally, presenting the second movement control in the virtual scene includes one of: presenting a second movement control at a first location of the virtual scene, wherein the first location overlaps with a region of the control interface; presenting a second movement control at a second location of the virtual scene, wherein the second location does not overlap with a region of the control interface.
Optionally, after presenting the second moving control at the first position of the virtual scene, the method further includes: detecting control actions for the first mobile control and the second mobile control at a first designated position and a second designated position of the control interface respectively; generating a control instruction for the virtual object based on the action type of the control action; if the control command is a first control command, controlling the virtual object to move towards a first direction; if the control command is a second control command, controlling the virtual object to move towards a second direction; and if the control command is a third control command, controlling the virtual object to be maintained at the current position.
Optionally, after presenting the second moving control at the first position of the virtual scene, the method further includes: detecting a first movement instruction on the first movement control at a first time, wherein the first movement control is in a selectable state at the first time, and the second movement control is in a non-selectable state or a hidden state at the first time; and detecting a second movement instruction on the second movement control at a second time, wherein the second movement control is in a selectable state at the second time, and the first movement control is in a non-selectable state or a hidden state at the second time.
Optionally, after presenting the second moving control in the virtual scene, the method further includes: and in response to the movement operation triggered based on the second movement control, controlling the virtual object to move in the vertical direction in an airspace above the ground level of the virtual scene, and/or controlling the virtual object to move in the vertical direction in a water area below the ground level of the virtual scene.
Optionally, after presenting the second moving control in the virtual scene, the method further includes: detecting a control instruction on the control interface, wherein the control interface comprises the first moving control, the switching control and the second moving control; controlling the virtual object to move in a horizontal direction and a vertical direction simultaneously based on the detected control instruction.
Optionally, after presenting the second moving control in the virtual scene, the method further includes:
and controlling the second mobile control to move from the starting position to the target position in the virtual scene in response to the dragging operation of the second mobile control.
Optionally, presenting a second movement control in the virtual scene includes: presenting the second movement control in a virtual scene when a virtual object in the virtual scene is in a specified virtual state.
Optionally, controlling the virtual object to switch from the third motion state to the fourth motion state includes one of: controlling the virtual object to be switched from a non-carrier state to a carrier state; controlling the virtual object to be switched from a carrier-loaded state to a carrier-free state; and controlling the virtual object to be switched from a first carrier state to a second carrier state, wherein the first carrier state is used for indicating that the virtual object is carried on a first virtual carrier, and the second carrier state is used for indicating that the virtual object is carried on a second virtual carrier.
Optionally, presenting the first moving control and the switching control in the control interface of the virtual object includes: and presenting the first moving control in a first area of the directional wheel of the virtual object, and presenting the switching control in a second area of the directional wheel of the virtual object, wherein the control interface comprises the first area and the second area.
Optionally, when a switching control is presented in the virtual scene, the method further includes: and displaying a control identification on a control interface of the switching control, wherein the control identification is used for representing the virtual carrier to be selected currently or the currently carried virtual carrier of the virtual object.
Optionally, displaying a control identifier on the control interface of the switching control includes: and displaying a switching area in an extension of the skill release area of the virtual scene, wherein the switching area is used for displaying control identifications of a plurality of switching controls, and each control identification is used for representing a virtual carrier.
Optionally, after responding to a state switching operation triggered based on the first moving control, the method further includes: determining an operation direction of the state switching operation; and generating a corresponding operation identifier on the first mobile control based on the operation direction, wherein the operation identifier is used for representing the horizontal movement orientation of the virtual object.
Optionally, after the first movement control is presented in the control interface of the virtual object, the method further includes: controlling the first portion to move from a start position to a target position in the virtual scene in response to a drag operation for the first portion.
Optionally, the first movement control further includes: a third portion fixedly disposed in the center region, wherein the third portion is configured to characterize an operational bulls-eye of the first movement control.
Optionally, controlling the virtual object to switch from the first motion state to the second motion state includes one of: controlling the virtual object to switch from a first force state to a second force state; controlling the virtual object to switch from a first amplitude state to a second amplitude state; controlling the virtual object to switch from a first motion posture to a second motion posture; controlling the virtual object to be switched from a first carrier state to a second carrier state; controlling the virtual object to be switched from a first motion direction to a second motion direction of the same plane; and controlling the virtual object to be switched from the first motion axis to the second motion axis.
Optionally, responding to the touch operation at the first portion and/or the second portion includes at least one of: responding to a tap operation at the first portion and/or the second portion; responding to a drag operation at the first portion and/or the second portion; and responding to a pressure-control operation at the first portion and/or the second portion.
According to another embodiment of the present invention, there is provided a control apparatus of a virtual object, including: the first presentation module is used for presenting a first moving control in a control interface of a virtual object when the virtual object in a virtual scene is in a first motion state, wherein the first moving control is used for controlling the virtual object to move in the horizontal direction under a world coordinate system of the virtual scene, and the first moving control comprises a first part and a second part surrounding the first part; a first switching module, configured to control the virtual object to switch from the first motion state to a second motion state in response to a touch operation at the first portion and/or the second portion.
Optionally, the apparatus further comprises: a second presentation module, configured to present a second moving control and a switching control in the virtual scene, where the second moving control is used to control the virtual object to move in the vertical direction, the switching control is disposed on the first portion, and the switching control is used to control a vehicle state of the virtual object; and the second switching module is used for responding to the state switching operation triggered based on the switching control and controlling the virtual object to be switched from the third motion state to the fourth motion state.
Optionally, the second presenting module includes one of: the first presentation unit is used for presenting a second mobile control at a first position of the virtual scene, wherein the first position is overlapped with the area of the control interface; and the second presenting unit is used for presenting a second mobile control at a second position of the virtual scene, wherein the second position is not overlapped with the area of the control interface.
Optionally, the second presenting module further includes: a first detecting unit, configured to detect, after the first presenting unit presents the second mobile control at the first position of the virtual scene, control actions for the first mobile control and the second mobile control at a first specified position and a second specified position of the control interface, respectively; a generation unit configured to generate a control instruction for the virtual object based on an action type of the control action; the control unit is used for controlling the virtual object to move towards a first direction if the control command is a first control command; if the control command is a second control command, controlling the virtual object to move towards a second direction; and if the control command is a third control command, controlling the virtual object to be maintained at the current position.
Optionally, the apparatus further comprises: a first detection module, configured to detect, at a first time, a first movement instruction on a first movement control after a first presentation unit presents the second movement control at a first position of the virtual scene, where the first movement control is in a selectable state at the first time, and the second movement control is in an unselected state or a hidden state at the first time; and the second detection module is used for detecting a second moving instruction on the second moving control at a second time, wherein the second moving control is in a selectable state at the second time, and the first moving control is in an unselected state or a hidden state at the second time.
Optionally, the apparatus further comprises: and the first control module is used for controlling the virtual object to move in the vertical direction in an airspace above the ground level of the virtual scene and/or controlling the virtual object to move in the vertical direction in a water area below the ground level of the virtual scene in response to a movement operation triggered based on the second movement control after the second movement control is presented in the virtual scene by the second presentation module.
Optionally, the apparatus further comprises: a third detecting module, configured to detect a control instruction on the control interface after the second presenting module presents a second moving control in the virtual scene, where the control interface includes the first moving control, the switching control, and the second moving control; and the second control module is used for controlling the virtual object to move in the horizontal direction and the vertical direction simultaneously based on the detected control instruction.
Optionally, the apparatus further comprises: and the third control module is used for responding to the dragging operation of the second mobile control after the second mobile control is presented in the virtual scene by the second presentation module, and controlling the second mobile control to move from the starting position to the target position in the virtual scene.
Optionally, the second presenting module includes: and the first presentation unit is used for presenting the second mobile control in the virtual scene when the virtual object in the virtual scene is in a specified virtual state.
Optionally, the second switching module includes one of: the first switching unit is used for controlling the virtual object to be switched from a non-carrier state to a carrier state; the second switching unit is used for controlling the virtual object to be switched from a first carrier state to a second carrier state, wherein the first carrier state is used for indicating that the virtual object is carried on a first virtual carrier, and the second carrier state is used for indicating that the virtual object is carried on a second virtual carrier; and the third switching unit is used for controlling the virtual object to be switched from a carrier-carrying state to a carrier-free state.
Optionally, the first presenting module includes: and a second presenting unit, configured to present the first movement control in a first region of the directional wheel of the virtual object, and present the switching control in a second region of the directional wheel of the virtual object, where the control interface includes the first region and the second region.
Optionally, the apparatus further comprises: and the display module is used for displaying a control identifier on a control interface of the switching control when the first presentation module presents the first mobile control and the switching control in the control interface of the virtual object, wherein the control identifier is used for representing the virtual carrier to be selected currently or the virtual carrier carried currently by the virtual object.
Optionally, the display module includes: and the display unit is used for displaying a switching area in an extension of the skill release area of the virtual scene, wherein the switching area is used for displaying control identifications of a plurality of switching controls, and each control identification is used for representing a virtual carrier.
Optionally, the apparatus further comprises: a determining module, configured to determine, after the first switching module responds to a state switching operation triggered based on the first moving control, an operation direction of the state switching operation; and a generating module, configured to generate a corresponding operation identifier on the first mobile control based on the operation direction, where the operation identifier is used to characterize a horizontal movement orientation of the virtual object.
Optionally, the apparatus further comprises: and the fourth control module is used for controlling the first part to move from the starting position to the target position in the virtual scene in response to the dragging operation aiming at the first part after the first moving control is presented in the control interface of the virtual object by the first presenting module.
Optionally, the first movement control further includes: a third portion fixedly disposed in the center region, wherein the third portion is configured to characterize an operational bulls-eye of the first movement control.
Optionally, the first switching module includes one of: a first control unit for controlling the virtual object to switch from a first force state to a second force state; the second control unit is used for controlling the virtual object to be switched from the first amplitude state to the second amplitude state; a third control unit for controlling the virtual object to switch from the first motion posture to the second motion posture; the fourth control unit is used for controlling the virtual object to be switched from the first carrier state to the second carrier state; a fifth control unit, configured to control the virtual object to switch from the first movement direction to the second movement direction in the same plane; and the sixth control unit is used for controlling the virtual object to be switched from the first motion axis to the second motion axis.
Optionally, the first switching module includes at least one of: a first response unit for responding to a tap operation at the first portion and/or the second portion; a second response unit for responding to a drag operation at the first portion and/or the second portion; and a third response unit for responding to a pressure-control operation at the first portion and/or the second portion.
According to a further embodiment of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
According to the method and the device, when the virtual object in the virtual scene is in the first motion state, the first movement control is presented in the control interface of the virtual object, the first movement control is used for controlling the virtual object to move in the horizontal direction, and the virtual object is controlled to switch from the first motion state to the second motion state in response to the state switching operation triggered based on the first movement control. By operating the first movement control in the control interface to switch the motion state of the virtual object, the technical problem in the related art that the motion state of the virtual object cannot be actively switched is solved, and human-computer interaction efficiency is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a block diagram of a hardware configuration of a control computer of a virtual object according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for controlling a virtual object according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of controlling a virtual object on a control interface according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating an embodiment of switching carrier states using a switching control;
FIG. 5 is a schematic diagram of a switching control arranged around the right-hand wheel according to an embodiment of the present invention;
FIG. 6 is a block diagram of a control apparatus for a virtual object according to an embodiment of the present invention;
FIG. 7 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
The method provided by the first embodiment of the present application may be executed in a mobile phone, a tablet, a server, a computer, or a similar electronic terminal. Taking a computer as an example, fig. 1 is a block diagram of a hardware structure of a control computer of a virtual object according to an embodiment of the present invention. As shown in fig. 1, the computer may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally, a transmission device 106 for communication functions and an input-output device 108. It will be appreciated by those of ordinary skill in the art that the configuration shown in FIG. 1 is illustrative only and is not intended to limit the configuration of the computer described above. For example, a computer may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the virtual object control method in the embodiment of the present invention; the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, thereby implementing the method described above. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the computer through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. In the present embodiment, the processor 102 is configured to control the target virtual character to perform a specified operation to complete the game task in response to the human-machine interaction instruction and the game policy. The memory 104 is used for storing program scripts of the electronic game, configuration information, attribute information of the virtual character, and the like.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the computer. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
Optionally, the input/output device 108 further includes a human-computer interaction screen for acquiring a human-computer interaction instruction through a human-computer interaction interface and for presenting a game picture in a virtual scene.
in this embodiment, a method for controlling a virtual object is provided, and fig. 2 is a schematic flowchart of a method for controlling a virtual object according to an embodiment of the present invention, as shown in fig. 2, the flowchart includes the following steps:
step S202, when a virtual object in a virtual scene is in a first motion state, presenting a first mobile control in a control interface of the virtual object, wherein the first mobile control is used for controlling the virtual object to move in the horizontal direction under a world coordinate system of the virtual scene, and the first mobile control comprises a first part and a second part surrounding the first part;
the horizontal direction of the present embodiment may be an absolute horizontal direction, and may be a relative horizontal direction with respect to the world coordinate system, or may be a relative horizontal direction with respect to a reference coordinate system, such as a direction parallel to the virtual character. The horizontal direction may be the xy plane in the world coordinate system, or a plane parallel to the xy plane.
Optionally, the virtual scene of this embodiment may be a virtual game scene, a virtual teaching scene, a virtual demonstration scene, and the like. The virtual scene includes multiple types of virtual characters, which may be controlled by user operations or by system AI; when a virtual character is controlled by a user, it may be a player-controlled character (PCC) in a virtual game, controlled by the player. In the present embodiment, a game scene is taken as an example of a specific scene.
The virtual scene of the present embodiment is a virtual scene displayed (or provided) when an application program runs on a terminal. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene. For example, a virtual scene may include sky, land, ocean, etc.; the land may include environmental elements such as deserts and cities, and a user may control a virtual object to move in the virtual scene.
Step S204, in response to the touch operation at the first part and/or the second part, controlling the virtual object to be switched from the first motion state to the second motion state.
Optionally, the first and second motion states are used to represent the moving speed, moving mode, the surface the virtual object occupies (such as land, a cliff, a tree trunk, a wire mesh, etc.), the vehicle state, and the like of the virtual object. Controlling the virtual object to switch from the first motion state to the second motion state includes, for example, switching from walking to running, or switching from walking on land to climbing or crawling, and so forth.
In one example, the first movement control is a direction wheel. When the center region of the direction wheel (i.e., the first portion of the first movement control) is clicked, the virtual object is in a walking state. When the center of the direction wheel is operated within a region of a first radius around a preset center (e.g., the center of the direction wheel), the virtual object is in the first motion state (e.g., walking in the direction corresponding to the operation); operating the center of the direction wheel within the first radius may mean dragging it within that region, or clicking within that region and pointing in a direction. When the center of the direction wheel is dragged into the annular area between the first radius and a second radius (the second radius being larger than the first radius) around the preset center, i.e., the second portion of the first movement control, the virtual object is in the second motion state (e.g., running in the drag direction). In other words, when the center of the direction wheel is dragged outward on the wheel, the object switches from a first motion state with a first movement speed to a second motion state with a second movement speed (e.g., the virtual object changes from walking to running), where the first and second movement speeds differ and the first movement speed is not 0.
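This radius-based switching can be summarized in a short sketch. The sketch below is illustrative only: the radius values, function name, and walk/run labels are assumptions chosen for the example rather than values defined by this embodiment.

    import math

    # Illustrative thresholds; the embodiment only requires the second radius
    # to be larger than the first radius.
    FIRST_RADIUS = 60.0    # first portion: operating here keeps the walking state
    SECOND_RADIUS = 120.0  # annular second portion: dragging here switches to running

    def resolve_motion_state(wheel_center, touch_point):
        """Map a touch on the direction wheel to a motion state and a heading angle."""
        dx = touch_point[0] - wheel_center[0]
        dy = touch_point[1] - wheel_center[1]
        distance = math.hypot(dx, dy)
        heading_deg = math.degrees(math.atan2(dy, dx))  # horizontal moving direction

        if distance <= FIRST_RADIUS:
            # Operating within the first radius around the preset center: first motion state.
            return "walk", heading_deg
        if distance <= SECOND_RADIUS:
            # Dragging into the ring between the first and second radius: second motion state.
            return "run", heading_deg
        # Outside the wheel: ignore the touch in this sketch.
        return "idle", heading_deg

    # Example: a drag ending 90 px to the right of the wheel center switches walking to running.
    print(resolve_motion_state((0, 0), (90, 0)))  # ('run', 0.0)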
Through the above steps, when the virtual object in the virtual scene is in the first motion state, a first movement control is presented in the control interface of the virtual object, where the first movement control is used for controlling the virtual object to move in the horizontal direction under the world coordinate system of the virtual scene and includes a first portion and a second portion surrounding the first portion; the virtual object is controlled to switch from the first motion state to the second motion state in response to a touch operation at the first portion and/or the second portion. By operating the first movement control in the control interface to switch the motion state of the virtual object, the technical problem in the related art that the motion state of the virtual object cannot be actively switched is solved, and human-computer interaction efficiency is improved.
Optionally, the first portion of the first movement control may move, and after the first movement control is presented in the control interface of the virtual object, the method further includes: the first portion is controlled to move from a start position to a target position in the virtual scene in response to a drag operation for the first portion.
In one example of this embodiment, the first movement control further comprises: a third portion fixedly disposed in the center region, wherein the third portion is for characterizing an operational bulls-eye of the first movement control.
In one scenario based on this example, the first movement control has three concentric circles; the innermost circle acts as a cue, with the first portion serving as the bulls-eye. When the finger operates within the range of the second circle (the second portion), the character moves; when the finger operates in the range from outside the second circle to inside the third circle (the third portion), the character runs.
Optionally, the controlling the virtual object to switch from the first motion state to the second motion state may be, but is not limited to: controlling the virtual object to switch from the first force state to the second force state; controlling the virtual object to switch from a first amplitude state to a second amplitude state; controlling the virtual object to switch from the first motion posture to the second motion posture; controlling the virtual object to be switched from the first carrier state to the second carrier state; controlling the virtual object to be switched from a first motion direction to a second motion direction of the same plane; and controlling the virtual object to be switched from the first motion axis to the second motion axis.
In this embodiment, based on the touch operation of the first mobile control, the motion postures of the virtual object under the conditions of force, amplitude, walking posture, carrier state, horizontal flight, Z-axis flight, and the like can be switched.
In an example of the present embodiment, the response to the touch operation at the first portion and/or the second portion may be, but is not limited to: responding to a tap operation at the first portion and/or the second portion; responding to a drag operation at the first portion and/or the second portion; or responding to a pressure-control operation at the first portion and/or the second portion.
In one scenario of this example, the virtual object may be controlled to perform a corresponding state switching action based on different touch operation types (e.g., clicking, double-clicking, etc.), the virtual object may be controlled to perform a corresponding state switching action based on different dragging directions, and the virtual object may be controlled to perform a corresponding state switching action based on different pressure values.
Optionally, after responding to the state switching operation triggered based on the first moving control, the method further includes: determining an operation direction of the state switching operation; and generating a corresponding operation identifier on the first moving control based on the operation direction, wherein the operation identifier is used for representing the horizontal moving direction of the virtual object.
In one example, determining the operating direction of the state switching operation includes: and positioning a target coordinate position of the state switching operation, calculating an angle between an extension line from a preset origin point to the target coordinate position and a horizontal reference line in a preset horizontal coordinate system, and outputting the direction of the angle as an operation direction of the state switching operation. The state switching operation triggered based on the first mobile control may be to drag a center region of the first mobile control to a target coordinate position, and the preset origin may be an initial position of the center of the first mobile control. Optionally, the operation identifier may be a directional identifier such as an arrow, a finger, a sharp prop, and the like, the direction of the operation identifier is the same as the operation direction, and the direction of the operation identifier is adjusted in real time along with the update of the state switching operation. The operation identifier is further configured to represent a preset origin and an operation path from the preset origin to the target coordinate position, for example, the operation identifier further includes a graphical identifier for representing the preset origin and a path identifier (for example, a graphical identifier such as a straight line from the preset origin to the target coordinate position) for representing the operation path from the preset origin to the target coordinate position.
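The direction calculation described above (the angle between the line from the preset origin to the target coordinate position and the horizontal reference line) can be expressed in a small sketch; the function name and the normalization to the range [0, 360) are assumptions made for illustration.

    import math

    def operation_direction(preset_origin, target_position):
        """Angle, in degrees, between the origin-to-target line and the horizontal axis.

        preset_origin: initial position of the center of the first movement control.
        target_position: coordinate reached by the state switching (drag) operation.
        """
        dx = target_position[0] - preset_origin[0]
        dy = target_position[1] - preset_origin[1]
        return math.degrees(math.atan2(dy, dx)) % 360.0

    # The returned angle can drive the arrow-style operation identifier, which is
    # updated in real time as the drag position changes.
    print(operation_direction((0, 0), (50, 50)))  # 45.0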
In an embodiment of this embodiment, the method further includes: and presenting a second moving control and a switching control in the virtual scene, wherein the second moving control is used for controlling the virtual object to move in the vertical direction under the world coordinate system, the switching control is arranged on the first part, and the switching control is used for controlling the carrier state of the virtual object.
Wherein, the vertical direction may be a z-axis direction in a three-dimensional world coordinate system.
Optionally, the third and fourth motion states are used to represent whether the virtual object carries a virtual vehicle, a type of the carried vehicle, and the like.
The user may use the first movement control and the second movement control to control the virtual object to move in various directions on the ground, in the sky, or in the water of the virtual scene, for example, to run, jump, crawl, or move forward in a crouch on land, or to swim, float, or dive in the sea. Of course, the user may also control the virtual object to move in the virtual scene by riding a virtual vehicle; for example, the virtual vehicle may be a virtual car, a virtual aircraft, a virtual yacht, and the like. The above scenes are merely examples and are not specifically limited.
In an example of a virtual game scene, a first movement control and a switching control are arranged on the control interface of a game character. In a three-dimensional game map, the first movement control is used for controlling the game character to move forward, backward, left, and right (corresponding to the x axis and y axis of the three-dimensional coordinate system), and the switching control switches motion states of the game character, such as the movement state, the body posture state, and the like. A second movement control is also arranged on the control interface of the game scene and is used for controlling the game character to move up and down (corresponding to the z axis of the three-dimensional coordinate system). Six-axis movement of the virtual character in the virtual scene is thus realized, which improves human-computer interaction efficiency and enriches the control scenarios.
In this embodiment, the second movement control may overlap or not overlap with the control interface where the first movement control and the switching control are located, and may be adapted and selected according to the interface layout of the virtual scene and the user habit. Presenting a second movement control in the virtual scene includes: presenting a second movement control at a first location of the virtual scene, wherein the first location overlaps with a region of the control interface; alternatively, a second movement control is presented at a second location of the virtual scene, wherein the second location does not overlap with a region of the control interface.
In one application scenario, the second position does not overlap with the area of the control interface: the control interface where the first movement control and the switching control are located is a first control interface, and the control interface where the second movement control is located is a second control interface (a third control interface for the virtual object to release skills may also be presented). For example, the first control interface is located at the lower left corner of the virtual scene and the second control interface is located in the middle of the virtual scene. Because the two control interfaces do not overlap, they can be operated separately and synchronously to control the movement of the virtual character in the horizontal direction and the vertical direction.
In another application scenario, the first position overlaps with the area of the control interface, and the first movement control, the switching control, and the second movement control are integrated in the same control interface. To allow the first movement control and the second movement control to control the virtual character at the same time, a user may operate the first movement control and the second movement control in different control regions and control time slots, respectively, as described in detail below:
in one implementation scenario, after presenting the second movement control at the first location of the virtual scene, further comprising: detecting control actions aiming at the first mobile control and the second mobile control at a first appointed position and a second appointed position of a control interface respectively; generating a control instruction for the virtual object based on the action type of the control action; if the control command is a first control command, controlling the virtual object to move towards a first direction; if the control command is a second control command, controlling the virtual object to move towards a second direction; and if the command is the third control command, controlling the virtual object to maintain the current position.
In this implementation scenario, the first control instruction may be triggered by the first movement control or the second movement control, and correspondingly the first direction may be forward or upward; the second control instruction may likewise be triggered by the first movement control or the second movement control, and correspondingly the second direction may be backward or downward; the third control instruction may also be triggered by the first movement control or the second movement control.
In one example, the first designated position and the second designated position are different positions on the control interface. At the first designated position, a command for moving forward, backward, left, right, in place, or obliquely in the horizontal plane can be generated according to the control action; at the second designated position, a command for vertical movement, such as ascending, descending, or floating, can be generated according to the control action. In another example, the first designated position and the second designated position are the same position on the control interface, and the control actions for the first movement control and the second movement control are distinguished by detecting different operation actions, each movement control being preset with its own set of operation actions. For instance, the first movement control may respond only to slide operations on the control interface: when the user slides a finger on the wheel, the included angle θ between the line from the wheel center to the finger and the horizontal direction determines the moving direction of the virtual object in the virtual scene. The second movement control may respond only to click operations on the control interface: a double click on the control interface triggers an ascending instruction, a single click triggers a descending instruction, and so on. When there is no operation on the control interface, the virtual object maintains its current position. FIG. 3 is a schematic diagram of controlling the virtual object on the control interface according to an embodiment of the present invention.
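The dispatch rule in this example (slide operations on the wheel produce horizontal movement commands, click operations produce vertical movement commands, and no operation keeps the current position) might be sketched as follows; the command names and the action encoding are illustrative assumptions.

    def generate_control_instruction(action):
        """Translate a detected control action into a control instruction.

        action: None when there is no operation, otherwise a dict with a 'type' key
        ('slide', 'double_click' or 'single_click') and, for slides, an 'angle' key
        (the angle theta between the finger and the horizontal line through the
        wheel center).
        """
        if action is None:
            return {"command": "hold_position"}                     # third control instruction
        if action["type"] == "slide":
            # First movement control: horizontal movement along the slide angle.
            return {"command": "move_horizontal", "angle": action["angle"]}
        if action["type"] == "double_click":
            return {"command": "ascend"}                            # vertical movement upward
        if action["type"] == "single_click":
            return {"command": "descend"}                           # vertical movement downward
        return {"command": "hold_position"}

    print(generate_control_instruction({"type": "slide", "angle": 30.0}))
    # {'command': 'move_horizontal', 'angle': 30.0}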
In another implementation scenario, after presenting the second movement control at the first location of the virtual scene, further comprising: detecting a first moving instruction on a first moving control at a first time, wherein the first moving control is in a selectable state at the first time, and the second moving control is in an unselected state or a hidden state at the first time; and detecting a second movement instruction on the second movement control at a second time, wherein the second movement control is in a selectable state at the second time, and the first movement control is in a non-selectable state or a hidden state at the second time.
Optionally, the first time and the second time may be the duration of an operation cycle, or may be time slots allocated by the system. In a virtual game scene, the system configures whether the first movement control and the second movement control are selectable and whether they are hidden according to the game scenario, the current fighting condition of the game character, and the state of the vehicle currently carried by the game character (e.g., whether a vehicle is carried, and whether it is a land vehicle or a flying vehicle).
In some cases, the first and second movement controls may be merged and integrated into a single control area, such as the lower left area of the interface, with the lower right area used for displaying skill buttons, enabling one-handed control of direction. When the first movement control and the second movement control are integrated on the direction wheel, the identification area in the middle of the direction wheel can be used as the control area of the second movement control: if the identification area is touched, the virtual object moves in the positive direction of the Z axis of the three-dimensional coordinate system, and if it is not touched, the virtual object moves in the negative direction of the Z axis (suitable for vertical movement above the ground plane, e.g., in the sky); alternatively, if the identification area is touched, the virtual object moves in the negative direction of the Z axis, and if it is not touched, it moves in the positive direction of the Z axis (suitable for vertical movement below the ground plane, e.g., underwater). Alternatively, the first movement control and the second movement control may be arranged to overlap; if forward movement on the first movement control corresponds to upward movement on the second movement control, the two controls are placed under time-division control, and only one movement control is allowed to be in the selectable state at any given time. For example, when the virtual character is carrying a flying sword and flying in the sky, the first movement control may be set to the unselectable state and the second movement control to the selectable state; this can be adapted according to the scene.
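The time-division rule described above (only one movement control is selectable at a time, chosen according to the current vehicle state) could be sketched as follows; the vehicle names and the allocation rule are assumptions consistent with the flying-sword example.

    SELECTABLE, UNSELECTABLE, HIDDEN = "selectable", "unselectable", "hidden"

    def allocate_movement_controls(current_vehicle):
        """Decide which movement control is selectable for the current vehicle state."""
        if current_vehicle in ("flying_sword", "aircraft"):
            # Airborne: vertical movement takes priority, the horizontal wheel is disabled.
            return {"first_movement_control": UNSELECTABLE,
                    "second_movement_control": SELECTABLE}
        # On foot or in a ground vehicle: only horizontal movement, hide the vertical control.
        return {"first_movement_control": SELECTABLE,
                "second_movement_control": HIDDEN}

    print(allocate_movement_controls("flying_sword"))
    # {'first_movement_control': 'unselectable', 'second_movement_control': 'selectable'}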
In one implementation scenario, after presenting the second movement control in the virtual scenario, further comprising: detecting a control instruction on a control interface, wherein the control interface comprises a first mobile control, a switching control and a second mobile control; controlling the virtual object to move in the horizontal direction and the vertical direction simultaneously based on the detected control instruction. The control instruction can be triggered by one or more of the first moving control, the switching control and the second moving control.
In this implementation scenario, the first movement control, the switching control, and the second movement control may all be integrated on the direction wheel of the control interface (including, but not limited to, the switching control being located in the central region of the wheel), so that the object can also move forward, backward, left, and right while ascending or descending. For example, after the switching control is operated to switch the virtual object to the flying state, the central region of the direction wheel is switched to the second movement control. In this case, the second movement control in the central region of the wheel is operated by a long press (or when the pressure on the wheel exceeds a preset pressure threshold) to enable the ascending mode; dragging the central region forward, backward, left, or right on the wheel at the same time realizes simultaneous movement in the vertical and horizontal directions (e.g., ascending while moving forward, ascending while moving backward, etc.). Optionally, in this implementation scenario, the relative distance from the current position to the center of the wheel may be calculated from the position reached by the dragged central region, and the moving speed of the virtual object adjusted accordingly: the speed changes as the drag position changes, or the speed is proportional to the drag distance once the drag distance exceeds a preset distance.
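The speed adjustment mentioned at the end of this scenario (the speed changes with the drag position, or becomes proportional to the drag distance once a preset distance is exceeded) can be written as a small sketch; all constants here are illustrative.

    import math

    BASE_SPEED = 2.0        # speed used within the preset distance
    PRESET_DISTANCE = 40.0  # beyond this, speed grows with the drag distance
    SPEED_PER_UNIT = 0.1    # proportionality factor

    def movement_speed(wheel_center, dragged_center):
        """Derive the virtual object's speed from how far the wheel center was dragged."""
        distance = math.dist(wheel_center, dragged_center)
        if distance <= PRESET_DISTANCE:
            return BASE_SPEED
        return BASE_SPEED + SPEED_PER_UNIT * (distance - PRESET_DISTANCE)

    print(movement_speed((0, 0), (100, 0)))  # 2.0 + 0.1 * 60 = 8.0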
In an embodiment of this embodiment, while the second movement control is operated to control the virtual character in the vertical direction, parameters such as the height, speed, and duration of the character's current ascent or descent may be displayed in a virtual map of the virtual scene. In some virtual scenes, such as a virtual game, the scene further includes elements such as non-player characters (NPC) and player-controlled characters (PCC). For example, the target NPC may be a BOSS located in the airspace of the virtual scene; when the virtual character ascends to the same height as the target NPC, a prompt message is output to indicate that the virtual character is at the same height as the target NPC. The prompt message may take the form of a color, text, and the like; for example, the virtual object and the BOSS are displayed in the same color to indicate that they are at the same height. Meanwhile, the height of the target NPC may also be set as the highest or lowest adjustable height of the second movement control, so that the virtual character cannot continue to rise after reaching the height of the target NPC.
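A possible sketch of this altitude feedback (a prompt when the character reaches the target NPC's height, with that height acting as the upper limit of the second movement control) is given below; the function and message names are assumptions.

    def apply_vertical_move(character_height, delta, target_npc_height):
        """Move the character vertically, clamping at the target NPC's height and
        reporting when both are at the same height."""
        new_height = character_height + delta
        prompt = None
        if new_height >= target_npc_height:
            # The NPC's height is the highest adjustable position of the second
            # movement control, so the character cannot rise past it.
            new_height = target_npc_height
            prompt = "same height as target NPC"  # could also be shown by a shared color
        return new_height, prompt

    print(apply_vertical_move(95.0, 10.0, 100.0))  # (100.0, 'same height as target NPC')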
In this embodiment, after presenting the second movement control in the virtual scene, the method further includes: and controlling the virtual object to move in the vertical direction of the virtual scene in response to the movement operation triggered based on the second movement control.
Alternatively, the vertical area includes an area above the ground level and an area below the ground level, and may also be applied to special features such as a cave, a canyon, a cliff, and the like. Controlling the virtual object to move in a vertical direction of the virtual scene includes: controlling the virtual object to move in the vertical direction in an airspace above the ground level of the virtual scene; and/or controlling the virtual object to move in the vertical direction in a water area below the ground level of the virtual scene.
In some implementations of this embodiment, the second movement control is a floating control, and the user or the system may drag it, for example to an area the user is accustomed to operating, or to the control interface where the first movement control is located. After presenting the second movement control in the virtual scene, the method further includes: controlling the second movement control to move from a starting position to a target position in the virtual scene in response to a drag operation on the second movement control.
In the above embodiment, when the second movement control (e.g., an ascending control, a descending control, etc.) is independent of the direction wheel, dragging the second movement control onto the control interface where the first movement control is located may simultaneously trigger the vertical movement mode, and operating the direction wheel then produces movement in the front-back and left-right directions plus the vertical direction. For example, when the second movement control is an ascending control independent of the direction wheel, dragging it onto the control interface where the first movement control is located can simultaneously trigger the ascending mode; dragging the central region of the direction wheel forward then produces combined forward and vertical movement, that is, a forward ascending movement. In some examples, after the second movement control is dragged onto the control interface, it covers the first movement control, the first movement control is set to a non-selectable or hidden state, and the direction wheel can only control the virtual character to move in the vertical direction; after the second movement control is released, the direction wheel can only control the virtual character to move in the horizontal direction. In addition, when the vertical movement mode is triggered, it can be adjusted according to the vehicle currently carried by the virtual object, as shown in the sketch below: for example, if the user has selected a flying vehicle (such as wings), dragging the second movement control onto the direction wheel automatically triggers the ascending mode; if the user has selected an underwater vehicle (such as a submarine), dragging the second movement control onto the direction wheel automatically triggers the dive mode.
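The vehicle-dependent choice between the ascend and dive modes mentioned above might look like the following sketch; VehicleKind and the mode strings are assumptions introduced for illustration.

```python
# Sketch (assumed names) of choosing ascend vs. dive mode when the floating
# second movement control is dropped onto the direction wheel, based on the
# vehicle currently carried by the virtual object.
from enum import Enum

class VehicleKind(Enum):
    FLYING = "flying"          # e.g. wings, flying sword
    UNDERWATER = "underwater"  # e.g. submarine
    GROUND = "ground"          # e.g. horse

def on_second_control_dropped_on_wheel(vehicle: VehicleKind) -> str:
    """Return the vertical movement mode triggered by the drop, if any."""
    if vehicle is VehicleKind.FLYING:
        return "ascend_mode"
    if vehicle is VehicleKind.UNDERWATER:
        return "dive_mode"
    return "none"              # ground vehicle: no vertical mode is started

print(on_second_control_dropped_on_wheel(VehicleKind.FLYING))  # ascend_mode
```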
In some examples, the system may also automatically move the second movement control based on the virtual scene. For example, when the second movement control blocks a key part of the picture in the virtual scene, it is moved to a blank area of the virtual scene; or the second movement control is bound to a prop of the virtual scene, for example, when the second movement control controls a virtual rocket in the virtual scene, it is bound to that virtual rocket, so that it can control the virtual rocket to move up and down and also moves along with the virtual rocket.
In some implementations of this embodiment, presenting the second movement control in the virtual scene includes: when a virtual object in the virtual scene is in a specified virtual state, a second movement control is presented in the virtual scene.
In a pre-configured system, the first movement control is associated with the virtual object: as long as the virtual object appears in the virtual scene, a control interface is generated and the first movement control and the switching control are presented. The second movement control can be associated with specified elements of the virtual scene, such as a virtual scenario, a virtual prop, or a virtual NPC, and when those specified elements appear in the virtual scene, the second movement control is presented in the virtual scene.
In some examples, the second movement control is displayed or hidden according to the real-time state of the game character (such as its motion state, movement state, or vehicle state), or is set to a selectable or non-selectable state. For example, when the game character is moving without a vehicle, the second movement control is hidden; when the game character is in a fighting state, the second movement control is hidden; when there is no vehicle capable of moving up and down within the field of view of the game character, the second movement control is hidden; and when the game character has not loaded or configured a vehicle capable of moving up and down, the second movement control is hidden to prevent accidental touches or invalid instructions. In other states, the second movement control can be presented in the virtual scene.
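A possible form of this show/hide rule is sketched below; the CharacterState fields are invented for illustration and do not come from the patent text.

```python
# Minimal sketch (invented field names) of the show/hide rule for the second
# movement control described above.
from dataclasses import dataclass

@dataclass
class CharacterState:
    in_combat: bool
    has_vertical_vehicle: bool        # a vehicle able to move up/down is equipped
    vertical_vehicle_in_view: bool    # such a vehicle is within the field of view
    moving_without_vehicle: bool

def second_control_visible(state: CharacterState) -> bool:
    """Hide the control whenever vertical movement cannot apply."""
    if state.moving_without_vehicle or state.in_combat:
        return False
    if not state.vertical_vehicle_in_view:
        return False
    if not state.has_vertical_vehicle:
        return False
    return True

print(second_control_visible(CharacterState(False, True, True, False)))  # True
```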
In an implementation manner of this embodiment, controlling the virtual object to switch from the third motion state to the fourth motion state may be, but is not limited to: controlling the virtual object to switch from a non-carrier state to a carrier state; controlling the virtual object to switch from a carrier-carrying state to a carrier-free state; and controlling the virtual object to switch from a first carrier state to a second carrier state, wherein the first carrier state indicates that the virtual object is carried on a first virtual carrier, and the second carrier state indicates that the virtual object is carried on a second virtual carrier.
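The three carrier-state switches listed above could be modeled roughly as follows; the VirtualObject class and the vehicle names are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch (assumed names) of the three carrier-state switches:
# no-carrier -> carrier, carrier -> no-carrier, and first -> second carrier.
from typing import Optional

class VirtualObject:
    def __init__(self):
        self.vehicle: Optional[str] = None   # None means the no-carrier state

    def switch_vehicle_state(self, target_vehicle: Optional[str]) -> None:
        if self.vehicle is None and target_vehicle is not None:
            print(f"mount {target_vehicle}")                    # no-carrier -> carrier
        elif self.vehicle is not None and target_vehicle is None:
            print(f"dismount {self.vehicle}")                   # carrier -> no-carrier
        elif self.vehicle is not None and target_vehicle is not None:
            print(f"swap {self.vehicle} -> {target_vehicle}")   # first -> second carrier
        self.vehicle = target_vehicle

obj = VirtualObject()
obj.switch_vehicle_state("horse")   # mount horse
obj.switch_vehicle_state("wings")   # swap horse -> wings
obj.switch_vehicle_state(None)      # dismount wings
```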
Fig. 4 is a schematic diagram illustrating an embodiment of switching carrier states by using the switching control. In the first carrier state, the virtual character is riding a horse; the user clicks the switching control to trigger a state switching operation and enter the second carrier state, in which the virtual character unfolds its wings and enters a flight state.
Optionally, presenting the first moving control and the switching control in the control interface of the virtual object includes: and presenting a first moving control in a first area of the directional wheel of the virtual object, and presenting a switching control in a second area of the directional wheel of the virtual object, wherein the control interface comprises the first area and the second area. In one example, the control interface is a circular area, the first movement control is in a peripheral annular area of the control interface, and the switching control is in a central area of the control interface.
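One plausible hit test for this circular layout is sketched below, assuming the first area is an outer ring and the second area an inner disc; the radii and function name are illustrative assumptions only.

```python
# Sketch (assumed geometry) of deciding whether a touch on the circular
# control interface lands on the first movement control (outer ring) or the
# switching control (central region).
import math

def hit_test(touch_x: float, touch_y: float, center_x: float, center_y: float,
             inner_radius: float, outer_radius: float) -> str:
    distance = math.hypot(touch_x - center_x, touch_y - center_y)
    if distance <= inner_radius:
        return "switching_control"        # second area: wheel center
    if distance <= outer_radius:
        return "first_movement_control"   # first area: peripheral ring
    return "outside"

print(hit_test(105, 100, 100, 100, inner_radius=20, outer_radius=60))
```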
In some implementations of this embodiment, when the switching control is presented in the virtual scene, the method further includes: and displaying a control identification on a control interface of the switching control, wherein the control identification is used for representing a virtual carrier to be selected currently or a currently carried virtual carrier of the virtual object.
In this embodiment, a control identifier is further disposed on the switching control (for example, in a central area of the switching control) to indicate a vehicle or a state that the game character can currently select. The identifier may be a thumbnail picture, text, or the like of the vehicle, and the vehicle may be one that moves in the horizontal direction, such as a horse, a car, or a pet, or one that moves in the vertical direction, such as a flying sword, a rocket, or a submarine. By clicking the control identifier, the second movement control can be triggered (if the vehicle moves in the vertical direction), for example by switching the position of the switching control to the second movement control, and the user then controls the vehicle corresponding to the control identifier by operating the second movement control. There may be multiple control identifiers, each corresponding to one of the vehicles that the virtual character can carry in the current scene.
Optionally, displaying the control identifier on the control interface of the switching control includes: and displaying a switching area in an extension of a skill release area of the virtual scene, wherein the switching area is used for displaying control identifications of a plurality of switching controls, and each control identification is used for representing a virtual carrier.
In one embodiment, the skill release area is the area of a right wheel: a skill release control is displayed on the right wheel, and switching controls are displayed around the right wheel. Fig. 5 is a schematic diagram of displaying switching controls around the right wheel according to an embodiment of the present invention, showing two switching controls for switching to a horse-riding state and a flight state. In some examples, the switching control further includes several sub-controls used for controlling the form of the virtual vehicle and/or the riding posture of the virtual object on the same virtual vehicle; the sub-controls may be displayed at positions adjacent to the switching control after the user selects the switching control to which they belong. For example, after the user operates switching control A (ride a horse), the virtual object mounts the horse (in a conventional riding posture), and sub-controls are displayed around the right wheel (for example, for making the horse rear its front hooves, raise its hindquarters, for standing on the horse's back, sitting on the horse's neck, riding with one leg, raising the whip, and the like), so that by clicking the sub-controls the user can control the virtual character and the virtual vehicle to cooperate and complete richer and more precise actions based on the current virtual vehicle.
In some examples, when there are multiple vehicles, the vehicle to be selected may be displayed according to a preset policy, and the user may be prompted through the control identifier. The candidate vehicle most relevant to the current game progress and game task can be displayed in the control identifier to prompt the user to select it; for example, after a related object interacting with the virtual object in the virtual scene (such as a teammate or an enemy character in the virtual game) enters the airspace from land, a flying vehicle is prompted. Alternatively, the multiple vehicles owned by the virtual object are displayed around the first movement control or the second movement control (that is, multiple switching controls are displayed). Alternatively, when multiple vehicles are owned, they are displayed after the switching control is clicked (in which case the control identifier is a generic identifier); the user can select one of them to mount, and the switching control then displays the control identifier (such as a horse or wings) corresponding to the vehicle currently selected by the user.
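A rough sketch of such a preset selection policy follows; candidate_vehicle and the vehicle names are assumptions used only for illustration.

```python
# Sketch (assumed rule names) of picking the candidate vehicle to show in the
# control identifier according to a preset policy.
def candidate_vehicle(owned_vehicles: list, related_object_region: str) -> str:
    """Pick the most relevant vehicle; prefer a flying vehicle once a related
    object (teammate or enemy) has entered the airspace."""
    flying = [v for v in owned_vehicles if v in ("wings", "flying_sword")]
    if related_object_region == "airspace" and flying:
        return flying[0]
    return owned_vehicles[0] if owned_vehicles else "none"

print(candidate_vehicle(["horse", "wings"], "airspace"))  # wings
```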
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, a control device for a virtual object is further provided, which is used to implement the foregoing embodiments and preferred implementations; details that have already been described are not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 6 is a block diagram of a control apparatus for a virtual object according to an embodiment of the present invention, as shown in fig. 6, the apparatus including: a first presenting module 60 and a first switching module 62, wherein,
a first presenting module 60, configured to present a first moving control in a control interface of a virtual object in a virtual scene when the virtual object is in a first motion state, where the first moving control is used to control the virtual object to move in a horizontal direction under a world coordinate system of the virtual scene, and the first moving control includes a first portion and a second portion surrounding the first portion;
a first switching module 62, configured to control the virtual object to switch from the first motion state to a second motion state in response to a touch operation at the first portion and/or the second portion.
Optionally, the apparatus further comprises: a second presentation module, configured to present a second moving control and a switching control in the virtual scene, where the second moving control is used to control the virtual object to move in the vertical direction, the switching control is disposed on the first portion, and the switching control is used to control a vehicle state of the virtual object; and the second switching module is used for responding to the state switching operation triggered based on the switching control and controlling the virtual object to be switched from the third motion state to the fourth motion state.
Optionally, the second presenting module includes one of: the first presentation unit is used for presenting a second mobile control at a first position of the virtual scene, wherein the first position is overlapped with the area of the control interface; and the second presenting unit is used for presenting a second mobile control at a second position of the virtual scene, wherein the second position is not overlapped with the area of the control interface.
Optionally, the second presenting module further includes: a first detecting unit, configured to detect, after the first presenting unit presents the second mobile control at the first position of the virtual scene, control actions for the first mobile control and the second mobile control at a first specified position and a second specified position of the control interface, respectively; a generation unit configured to generate a control instruction for the virtual object based on an action type of the control action; the control unit is used for controlling the virtual object to move towards a first direction if the control command is a first control command; if the control command is a second control command, controlling the virtual object to move towards a second direction; and if the control command is a third control command, controlling the virtual object to be maintained at the current position.
Optionally, the apparatus further comprises: a first detection module, configured to detect, at a first time, a first movement instruction on a first movement control after a first presentation unit presents the second movement control at a first position of the virtual scene, where the first movement control is in a selectable state at the first time, and the second movement control is in an unselected state or a hidden state at the first time; and the second detection module is used for detecting a second moving instruction on the second moving control at a second time, wherein the second moving control is in a selectable state at the second time, and the first moving control is in an unselected state or a hidden state at the second time.
Optionally, the apparatus further comprises: and the first control module is used for controlling the virtual object to move in the vertical direction in an airspace above the ground level of the virtual scene and/or controlling the virtual object to move in the vertical direction in a water area below the ground level of the virtual scene in response to a movement operation triggered based on the second movement control after the second movement control is presented in the virtual scene by the second presentation module.
Optionally, the apparatus further comprises: a third detecting module, configured to detect a control instruction on the control interface after the second presenting module presents a second moving control in the virtual scene, where the control interface includes the first moving control, the switching control, and the second moving control; and the second control module is used for controlling the virtual object to move in the horizontal direction and the vertical direction simultaneously based on the detected control instruction.
Optionally, the apparatus further comprises: and the third control module is used for responding to the dragging operation of the second mobile control after the second mobile control is presented in the virtual scene by the second presentation module, and controlling the second mobile control to move from the starting position to the target position in the virtual scene.
Optionally, the second presenting module includes: and the first presentation unit is used for presenting the second mobile control in the virtual scene when the virtual object in the virtual scene is in a specified virtual state.
Optionally, the second switching module includes one of: the first switching unit is used for controlling the virtual object to be switched from a non-carrier state to a carrier state; the second switching unit is used for controlling the virtual object to be switched from a first carrier state to a second carrier state, wherein the first carrier state is used for indicating that the virtual object is carried on a first virtual carrier, and the second carrier state is used for indicating that the virtual object is carried on a second virtual carrier; and the third switching unit is used for controlling the virtual object to be switched from a carrier-carrying state to a carrier-free state.
Optionally, the first presenting module includes: and a second presenting unit, configured to present the first movement control in a first region of the directional wheel of the virtual object, and present the switching control in a second region of the directional wheel of the virtual object, where the control interface includes the first region and the second region.
Optionally, the apparatus further comprises: and the display module is used for displaying a control identifier on a control interface of the switching control when the first presentation module presents the first mobile control and the switching control in the control interface of the virtual object, wherein the control identifier is used for representing the virtual carrier to be selected currently or the virtual carrier carried currently by the virtual object.
Optionally, the display module includes: and the display unit is used for displaying a switching area in an extension of the skill release area of the virtual scene, wherein the switching area is used for displaying control identifications of a plurality of switching controls, and each control identification is used for representing a virtual carrier.
Optionally, the apparatus further comprises: a determining module, configured to determine, after the first switching module responds to a state switching operation triggered based on the first moving control, an operation direction of the state switching operation; and a generating module, configured to generate a corresponding operation identifier on the first mobile control based on the operation direction, where the operation identifier is used to characterize a horizontal movement orientation of the virtual object.
Optionally, the apparatus further comprises: and the fourth control module is used for controlling the first part to move from the starting position to the target position in the virtual scene in response to the dragging operation aiming at the first part after the first moving control is presented in the control interface of the virtual object by the first presenting module.
Optionally, the first movement control further includes: a third portion fixedly disposed in the center region, wherein the third portion is configured to characterize an operational bulls-eye of the first movement control.
Optionally, the first switching module includes one of: a first control unit for controlling the virtual object to switch from a first force state to a second force state; the second control unit is used for controlling the virtual object to be switched from the first amplitude state to the second amplitude state; a third control unit for controlling the virtual object to switch from the first motion posture to the second motion posture; the fourth control unit is used for controlling the virtual object to be switched from the first carrier state to the second carrier state; a fifth control unit, configured to control the virtual object to switch from the first movement direction to the second movement direction in the same plane; and the sixth control unit is used for controlling the virtual object to be switched from the first motion axis to the second motion axis.
Optionally, the first switching module includes at least one of: a first response unit for responding to a touch operation at the first portion and/or the second portion; a second response unit for responding to a drag operation at the first portion and/or the second portion; a third response unit for responding to a pressure-controlled operation at the first portion and/or the second portion.
It should be noted that, the above modules may be implemented by software or hardware, and for the latter, the following may be implemented, but not limited to: the modules are all positioned in the same processor; alternatively, the modules are respectively located in different processors in any combination.
Example 3
Fig. 7 is a structural diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 7, the electronic device includes a processor 71, a communication interface 72, a memory 73 and a communication bus 74, where the processor 71, the communication interface 72, and the memory 73 complete communication with each other through the communication bus 74, and the memory 73 is used for storing a computer program;
the processor 71, when executing the program stored in the memory 73, implements the following steps: when a virtual object in a virtual scene is in a first motion state, presenting a first movement control in a control interface of the virtual object, wherein the first movement control is used for controlling the virtual object to move in the horizontal direction under a world coordinate system of the virtual scene, and the first movement control comprises a first part and a second part surrounding the first part; controlling the virtual object to switch from the first motion state to a second motion state in response to a touch operation at the first portion and/or the second portion.
Optionally, the method further includes: presenting a second moving control and a switching control in the virtual scene, wherein the second moving control is used for controlling the virtual object to move in the vertical direction under the world coordinate system, the switching control is arranged on the first part, and the switching control is used for controlling the carrier state of the virtual object; and controlling the virtual object to be switched from the third motion state to the fourth motion state in response to a state switching operation triggered based on the switching control.
Optionally, presenting the second movement control in the virtual scene includes one of: presenting a second movement control at a first location of the virtual scene, wherein the first location overlaps with a region of the control interface; presenting a second movement control at a second location of the virtual scene, wherein the second location does not overlap with a region of the control interface.
Optionally, after presenting the second moving control at the first position of the virtual scene, the method further includes: detecting control actions for the first mobile control and the second mobile control at a first designated position and a second designated position of the control interface respectively; generating a control instruction for the virtual object based on the action type of the control action; if the control command is a first control command, controlling the virtual object to move towards a first direction; if the control command is a second control command, controlling the virtual object to move towards a second direction; and if the control command is a third control command, controlling the virtual object to be maintained at the current position.
Optionally, after presenting the second moving control at the first position of the virtual scene, the method further includes: detecting a first movement instruction on the first movement control at a first time, wherein the first movement control is in a selectable state at the first time, and the second movement control is in a non-selectable state or a hidden state at the first time; and detecting a second movement instruction on the second movement control at a second time, wherein the second movement control is in a selectable state at the second time, and the first movement control is in a non-selectable state or a hidden state at the second time.
Optionally, after presenting the second moving control in the virtual scene, the method further includes: and in response to the movement operation triggered based on the second movement control, controlling the virtual object to move in the vertical direction in an airspace above the ground level of the virtual scene, and/or controlling the virtual object to move in the vertical direction in a water area below the ground level of the virtual scene.
Optionally, after presenting the second moving control in the virtual scene, the method further includes: detecting a control instruction on the control interface, wherein the control interface comprises the first moving control, the switching control and the second moving control; controlling the virtual object to move in a horizontal direction and a vertical direction simultaneously based on the detected control instruction.
Optionally, after presenting the second moving control in the virtual scene, the method further includes: and controlling the second mobile control to move from the starting position to the target position in the virtual scene in response to the dragging operation of the second mobile control.
Optionally, presenting a second movement control in the virtual scene includes: presenting the second movement control in a virtual scene when a virtual object in the virtual scene is in a specified virtual state.
Optionally, controlling the virtual object to switch from the third motion state to the fourth motion state includes one of: controlling the virtual object to be switched from a non-carrier state to a carrier state; controlling the virtual object to be switched from a carrier-loaded state to a carrier-free state; and controlling the virtual object to be switched from a first carrier state to a second carrier state, wherein the first carrier state is used for indicating that the virtual object is carried on a first virtual carrier, and the second carrier state is used for indicating that the virtual object is carried on a second virtual carrier.
Optionally, presenting the first moving control and the switching control in the control interface of the virtual object includes: and presenting the first moving control in a first area of the directional wheel of the virtual object, and presenting the switching control in a second area of the directional wheel of the virtual object, wherein the control interface comprises the first area and the second area.
Optionally, when a switching control is presented in the virtual scene, the method further includes: and displaying a control identification on a control interface of the switching control, wherein the control identification is used for representing the virtual carrier to be selected currently or the currently carried virtual carrier of the virtual object.
Optionally, displaying a control identifier on the control interface of the switching control includes: and displaying a switching area in an extension of the skill release area of the virtual scene, wherein the switching area is used for displaying control identifications of a plurality of switching controls, and each control identification is used for representing a virtual carrier.
Optionally, after responding to a state switching operation triggered based on the first moving control, the method further includes: determining an operation direction of the state switching operation; and generating a corresponding operation identifier on the first mobile control based on the operation direction, wherein the operation identifier is used for representing the horizontal movement orientation of the virtual object.
Optionally, after the first movement control is presented in the control interface of the virtual object, the method further includes: controlling the first portion to move from a start position to a target position in the virtual scene in response to a drag operation for the first portion.
Optionally, the first movement control further includes: a third portion fixedly disposed in the center region, wherein the third portion is configured to characterize an operational bulls-eye of the first movement control.
Optionally, controlling the virtual object to switch from the first motion state to the second motion state includes one of: controlling the virtual object to switch from a first force state to a second force state; controlling the virtual object to switch from a first amplitude state to a second amplitude state; controlling the virtual object to switch from a first motion posture to a second motion posture; controlling the virtual object to be switched from a first carrier state to a second carrier state; controlling the virtual object to be switched from a first motion direction to a second motion direction of the same plane; and controlling the virtual object to be switched from the first motion axis to the second motion axis.
Optionally, responding to the touch operation at the first portion and/or the second portion includes at least one of: responding to a touch operation at the first portion and/or the second portion; responding to a drag operation at the first portion and/or the second portion; responding to a pressure-controlled operation at the first portion and/or the second portion.
The communication bus mentioned in the above terminal may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the terminal and other equipment.
The Memory may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In another embodiment provided by the present application, there is further provided a computer-readable storage medium having stored therein instructions, which when run on a computer, cause the computer to execute the method for controlling a virtual object according to any one of the above embodiments.
In yet another embodiment provided by the present application, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the method for controlling a virtual object according to any one of the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the application to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (15)

1. A method for controlling a virtual object, comprising:
when a virtual object in a virtual scene is in a first motion state, presenting a first movement control in a control interface of the virtual object, wherein the first movement control is used for controlling the virtual object to move in the horizontal direction under a world coordinate system of the virtual scene, and the first movement control comprises a first part and a second part surrounding the first part;
controlling the virtual object to switch from the first motion state to a second motion state in response to a touch operation at the first portion and/or the second portion.
2. The method of claim 1, further comprising:
presenting a second moving control and a switching control in the virtual scene, wherein the second moving control is used for controlling the virtual object to move in the vertical direction under the world coordinate system, the switching control is arranged on the first part, and the switching control is used for controlling the carrier state of the virtual object;
and controlling the virtual object to be switched from the third motion state to the fourth motion state in response to a state switching operation triggered based on the switching control.
3. The method of claim 2, wherein presenting a second movement control in the virtual scene comprises one of:
presenting a second movement control at a first location of the virtual scene, wherein the first location overlaps with a region of the control interface;
presenting a second movement control at a second location of the virtual scene, wherein the second location does not overlap with a region of the control interface.
4. The method of claim 3, wherein after presenting the second movement control at the first location of the virtual scene, the method further comprises:
detecting control actions for the first mobile control and the second mobile control at a first designated position and a second designated position of the control interface respectively;
generating a control instruction for the virtual object based on the action type of the control action;
if the control command is a first control command, controlling the virtual object to move towards a first direction; if the control command is a second control command, controlling the virtual object to move towards a second direction; and if the control command is a third control command, controlling the virtual object to be maintained at the current position.
5. The method of claim 3, wherein after presenting the second movement control at the first location of the virtual scene, the method further comprises:
detecting a first movement instruction on the first movement control at a first time, wherein the first movement control is in a selectable state at the first time, and the second movement control is in a non-selectable state or a hidden state at the first time;
and detecting a second movement instruction on the second movement control at a second time, wherein the second movement control is in a selectable state at the second time, and the first movement control is in a non-selectable state or a hidden state at the second time.
6. The method of claim 2, wherein after presenting the second movement control in the virtual scene, the method further comprises:
and in response to the movement operation triggered based on the second movement control, controlling the virtual object to move in the vertical direction in an airspace above the ground level of the virtual scene, and/or controlling the virtual object to move in the vertical direction in a water area below the ground level of the virtual scene.
7. The method of claim 2, wherein controlling the virtual object to switch from the third motion state to the fourth motion state comprises one of:
controlling the virtual object to be switched from a non-carrier state to a carrier state;
controlling the virtual object to be switched from a carrier-loaded state to a carrier-free state;
and controlling the virtual object to be switched from a first carrier state to a second carrier state, wherein the first carrier state is used for indicating that the virtual object is carried on a first virtual carrier, and the second carrier state is used for indicating that the virtual object is carried on a second virtual carrier.
8. The method of claim 2, wherein when rendering a toggle control in the virtual scene, the method further comprises:
and displaying a control identification on a control interface of the switching control, wherein the control identification is used for representing the virtual carrier to be selected currently or the currently carried virtual carrier of the virtual object.
9. The method of claim 1, wherein after presenting the first movement control in the control interface of the virtual object, the method further comprises:
controlling the first portion to move from a start position to a target position in the virtual scene in response to a drag operation for the first portion.
10. The method of claim 1, wherein the first movement control further comprises: a third portion fixedly disposed in the center region, wherein the third portion is configured to characterize an operational bulls-eye of the first movement control.
11. The method of claim 1, wherein controlling the virtual object to switch from the first motion state to the second motion state comprises one of:
controlling the virtual object to switch from a first force state to a second force state;
controlling the virtual object to switch from a first amplitude state to a second amplitude state;
controlling the virtual object to switch from a first motion posture to a second motion posture;
controlling the virtual object to be switched from a first carrier state to a second carrier state;
controlling the virtual object to be switched from a first motion direction to a second motion direction of the same plane;
and controlling the virtual object to be switched from the first motion axis to the second motion axis.
12. The method of claim 1, wherein responding to the touch operation at the first portion and/or the second portion comprises at least one of:
responding to a touch operation at the first portion and/or the second portion;
in response to a drag operation at the first portion and/or the second portion;
in response to a pressure-controlled operation at the first portion and/or the second portion.
13. An apparatus for controlling a virtual object, comprising:
the first presentation module is used for presenting a first moving control in a control interface of a virtual object when the virtual object in a virtual scene is in a first motion state, wherein the first moving control is used for controlling the virtual object to move in the horizontal direction under a world coordinate system of the virtual scene, and the first moving control comprises a first part and a second part surrounding the first part;
a first switching module, configured to control the virtual object to switch from the first motion state to a second motion state in response to a touch operation at the first portion and/or the second portion.
14. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 12 when executed.
15. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 12.
CN202110580068.1A 2021-05-26 2021-05-26 Virtual object control method and device, storage medium and electronic device Pending CN113440850A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110580068.1A CN113440850A (en) 2021-05-26 2021-05-26 Virtual object control method and device, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110580068.1A CN113440850A (en) 2021-05-26 2021-05-26 Virtual object control method and device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN113440850A true CN113440850A (en) 2021-09-28

Family

ID=77810316

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110580068.1A Pending CN113440850A (en) 2021-05-26 2021-05-26 Virtual object control method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN113440850A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107185232A (en) * 2017-05-25 2017-09-22 网易(杭州)网络有限公司 Virtual objects motion control method, device, electronic equipment and storage medium
WO2021036581A1 (en) * 2019-08-30 2021-03-04 腾讯科技(深圳)有限公司 Method for controlling virtual object, and related apparatus
CN111318019A (en) * 2020-03-02 2020-06-23 腾讯科技(深圳)有限公司 Virtual object control method, device, terminal and storage medium
CN111399639A (en) * 2020-03-05 2020-07-10 腾讯科技(深圳)有限公司 Method, device and equipment for controlling motion state in virtual environment and readable medium
CN111773677A (en) * 2020-07-23 2020-10-16 网易(杭州)网络有限公司 Game control method and device, computer storage medium and electronic equipment
CN112402959A (en) * 2020-11-19 2021-02-26 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment and computer readable storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114669052A (en) * 2022-04-21 2022-06-28 芜湖听松网络科技有限公司 Game control method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
US20230015409A1 (en) Information prompt method and apparatus in virtual scene, electronic device, and storage medium
CN108744512A (en) Information cuing method and device, storage medium and electronic device
TWI831066B (en) Method for state switching in virtual scene, device, apparatus, medium and program product
US20220297004A1 (en) Method and apparatus for controlling virtual object, device, storage medium, and program product
KR102698789B1 (en) Method and apparatus for processing information of virtual scenes, devices, media and program products
CN113546417A (en) Information processing method and device, electronic equipment and storage medium
CN114225372B (en) Virtual object control method, device, terminal, storage medium and program product
CN113633964B (en) Virtual skill control method, device, equipment and computer readable storage medium
CN114344906B (en) Control method, device, equipment and storage medium for partner object in virtual scene
CN114840092A (en) Method, device, equipment and storage medium for interacting with vehicle-mounted display equipment
CN113440850A (en) Virtual object control method and device, storage medium and electronic device
CN113018862B (en) Virtual object control method and device, electronic equipment and storage medium
CN111318020B (en) Virtual object control method, device, equipment and storage medium
US20230330543A1 (en) Card casting method and apparatus, device, storage medium, and program product
US20230086441A1 (en) Method and apparatus for displaying picture in virtual scene, device, storage medium, and program product
KR20240067252A (en) Interface display methods and devices, terminals, storage media, and computer program products
CN115120979A (en) Display control method and device of virtual object, storage medium and electronic device
CN114042315A (en) Virtual scene-based graphic display method, device, equipment and medium
CN114210051A (en) Carrier control method, device, equipment and storage medium in virtual scene
CN114288660A (en) Virtual character flight control method and device, storage medium and electronic device
US12097428B2 (en) Method and apparatus for state switching in virtual scene, device, medium, and program product
KR102706744B1 (en) Method and apparatus, device, storage medium and program product for controlling virtual objects
KR20220035324A (en) Position adjustment method and apparatus, device and storage medium for control in application
CN115569380A (en) Game role control method, device, computer equipment and storage medium
CN117753007A (en) Interactive processing method and device for virtual scene, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination