WO2022007429A1 - Pathfinding control method and apparatus in a game - Google Patents

Pathfinding control method and apparatus in a game

Info

Publication number
WO2022007429A1
WO2022007429A1 (PCT/CN2021/081063; CN2021081063W)
Authority
WO
WIPO (PCT)
Prior art keywords
location identifier
location
game scene
touch operation
user interface
Prior art date
Application number
PCT/CN2021/081063
Other languages
English (en)
French (fr)
Inventor
Dang Xiangqian (党向前)
Original Assignee
NetEase (Hangzhou) Network Co., Ltd. (网易(杭州)网络有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NetEase (Hangzhou) Network Co., Ltd.
Priority to US18/004,308 (published as US20230219000A1)
Publication of WO2022007429A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A63F13/90 Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92 Video game devices specially adapted to be hand-held while playing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/0485 Scrolling or panning
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present disclosure relates to the technical field of games, and in particular, to a pathfinding control method in a game, and a pathfinding control device in a game.
  • Many games provide an automatic pathfinding function, for example automatically finding a path to a task location after the player clicks on the task; at the same time, marking a target location on the large map is also a high-frequency routine operation.
  • However, there is currently no design that organically integrates the two so as to optimize the player's automatic pathfinding to specific target points.
  • the embodiments of the present disclosure are proposed to provide an in-game pathfinding control method and a corresponding in-game pathfinding control device that overcome or at least partially solve the above problems.
  • the embodiment of the present disclosure discloses a pathfinding control method in a game.
  • a graphical user interface is provided through a mobile terminal, and the content displayed on the graphical user interface includes at least a part of a game scene and a first virtual character located in the game scene,
  • the graphical user interface provides a movement control area, and the method includes:
  • At least one location identifier is provided in a designated area of the graphical user interface, and the location identifier corresponds to a location in the game scene;
  • the first virtual character is controlled to automatically find a path in the game scene to a position corresponding to the first target position identifier.
  • controlling the first virtual character to move in the game scene in response to a first touch operation acting on the movement control area includes:
  • a movement direction is determined, and the first virtual character is controlled to move in the game scene according to the movement direction.
  • the first touch operation includes a sliding operation
  • the sliding operation includes any one of a direct sliding operation, a sliding operation after a long press, or a sliding operation after a heavy press, wherein the sliding operation has a corresponding touch point.
  • the trigger condition includes: detecting that the touch point of the sliding operation slides out of the movement control area; or the touch point moves to a specified position; or the moving distance of the touch point is greater than a preset distance threshold; or the moving speed of the touch point is greater than a preset speed threshold; or the pressing time of the touch point is greater than a preset time threshold; or the pressing pressure of the touch point is greater than a preset pressure threshold.
  • the designated area is located above the movement control area.
  • the graphical user interface further provides a game scene thumbnail; the designated area is a display area other than the game scene thumbnail in the graphical user interface.
  • the location identifier includes a static location identifier
  • the static location identifier corresponds to a static location in the game scene.
  • the location identifier includes a dynamic location identifier
  • the dynamic location identifier corresponds to the dynamic location of the second virtual character in the game scene.
  • the location identifier includes an avatar of the second virtual character.
  • the graphical user interface further provides a game scene thumbnail
  • the method includes:
  • the providing of at least one location identifier in a designated area of the graphical user interface when the first touch operation satisfies a trigger condition includes:
  • providing, in a designated area of the graphical user interface, a location identifier corresponding one-to-one to each of the pathfinding locations.
  • a triggered pathfinding location is determined from the at least one pathfinding location, and the triggered pathfinding location is deleted;
  • the location identifier corresponding to the triggered pathfinding location is deleted from the designated area.
  • the determining the first target location identifier from the at least one location identifier includes:
  • a first target location identifier is determined from the at least one location identifier.
  • the determining the first target location identifier from the at least one location identifier includes:
  • a first target location identifier is determined from the at least one location identifier according to the touch point of the first touch operation.
  • the determining the first target location identifier from the at least one location identifier according to the touch point of the first touch operation includes:
  • according to the position of the touch point of the first touch operation, determining, from the at least one location identifier, the location identifier corresponding to the touch point as the first target location identifier; or,
  • any location identifier corresponding to the route passed by the touch point is determined as the first target location identifier.
  • the determining the first target location identifier from the at least one location identifier according to the touch point of the first touch operation includes:
  • a first target location identifier is determined from the at least one location identifier according to the position or moving direction of the touch point.
  • the method further includes:
  • the first virtual character is controlled to stop the automatic pathfinding.
  • the method further includes:
  • An embodiment of the present disclosure further discloses a pathfinding control device in a game. A graphical user interface is provided through a mobile terminal, and the content displayed on the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene; the graphical user interface provides a movement control area, and the device includes:
  • a movement control module configured to control the first virtual character to move in the game scene in response to a first touch operation acting on the movement control area
  • a location identifier providing module configured to provide at least one location identifier in a designated area of the graphical user interface in response to the first touch operation satisfying a trigger condition, and the location identifier corresponds to a location in the game scene;
  • a location identifier determination module configured to determine a first target location identifier from the at least one location identifier
  • An automatic pathfinding control module configured to control the first virtual character to automatically find a path in the game scene to a position corresponding to the first target position identifier.
  • the embodiments of the present disclosure also disclose an electronic device, including: one or more processors; and
  • one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the electronic device to perform the method of any of the embodiments of the present disclosure.
  • the embodiments of the present disclosure also disclose a computer-readable storage medium on which instructions are stored, and the instructions, when executed by one or more processors, cause the processors to perform the method according to any one of the embodiments of the present disclosure.
  • FIG. 1 is a flow chart of steps of an embodiment of a pathfinding control method in a game of the present disclosure
  • FIG. 2 is a schematic diagram of providing at least one location identifier in a designated area according to the present disclosure
  • FIG. 3 is a schematic diagram of determining a first target location identifier according to the present disclosure
  • FIG. 4 is a schematic diagram of the present disclosure in which the location identifier is an avatar of the second virtual character
  • FIG. 5 is a schematic diagram of a graphical user interface of the present disclosure including game scene thumbnails corresponding to game scenes;
  • FIG. 6 is a schematic diagram of an expanded game scene thumbnail of the present disclosure
  • FIG. 7 is a schematic diagram of closing the thumbnail of the game scene shown in FIG. 5 according to the present disclosure.
  • FIG. 8 is a schematic diagram of displaying automatic pathfinding prompt information in a designated area according to the present disclosure.
  • FIG. 9 is a structural block diagram of an embodiment of a pathfinding control device in a game of the present disclosure.
  • the pathfinding control method in a game in one of the embodiments of the present disclosure may be executed on a terminal device or a server.
  • the terminal device may be a local terminal device.
  • When the pathfinding control method in the game runs on the server, it can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system includes a server and a client device.
  • cloud gaming refers to a game method based on cloud computing.
  • In the running mode of a cloud game, the entity that runs the game program and the entity that presents the game screen are separated.
  • the storage and operation of the pathfinding control method in the game are completed on the cloud game server.
  • the role of the client device is to receive and send data and to present the game screen.
  • the client device can be a display device with data transmission function close to the user side, such as a mobile terminal, a TV, a computer, a handheld computer, etc.
  • however, the terminal device that performs the pathfinding control method is the cloud game server in the cloud.
  • When playing the game, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses the game screen and other data, and returns them to the client device through the network; the client device then decodes the data and outputs the game screen.
  • the terminal device may be a local terminal device.
  • a local terminal device stores a game program and is used to present a game screen.
  • the local terminal device is used to interact with the player through a graphical user interface, that is, conventionally, the game program is downloaded, installed and executed through an electronic device.
  • the local terminal device may provide the graphical user interface to the player in various ways, for example, it may be rendered and displayed on the display screen of the terminal, or provided to the player through holographic projection.
  • the local terminal device may include a display screen for presenting a graphical user interface, the graphical user interface including game screens, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
  • a graphical user interface is provided through a mobile terminal, and the displayed content of the graphical user interface includes at least part of the game scene and a first virtual character located in the game scene; the graphical user interface provides a movement control area. The method may specifically include the following steps:
  • Step 101 in response to a first touch operation acting on the movement control area, controlling the first virtual character to move in the game scene;
  • the mobile terminal may be the aforementioned local terminal device, or may be the aforementioned client device in the cloud interaction system.
  • the operating system of the mobile terminal may include Android, iOS, Windows Phone, Windows, etc., and can usually support the running of various game applications.
  • the content displayed on the graphical user interface includes part or all of the game scene; the specific shape of the game scene may be square, and other shapes (e.g., circular) are also possible.
  • the game scene may include a first virtual character
  • the first virtual character may be a game virtual character controlled by a player through a mobile terminal, and may be presented through a graphical user interface
  • the presented content may include the entirety of the first virtual character or only a part of it.
  • For example, in a third-person game, the content presented by the GUI may include the entirety of the first virtual character, while in a first-person game, the content presented by the GUI may include only a part of the first virtual character.
  • the game scene may also include at least one virtual object, and the virtual object may be a game virtual character controlled by an enemy player in the game, or a Non-Player Character (NPC) preset by the game developer in a specific game scene.
  • a movement control area may be provided in the graphical user interface, and the player may operate in the movement control area to control the movement of the first virtual character.
  • a virtual joystick control may be included in the movement control area, and the virtual joystick control has an operable joystick. The player can adjust the moving direction of the first virtual character by rotating the joystick, and control the first virtual character to move according to the moving direction.
  • When the player needs to control the first virtual character in the game scene to move, the player can perform a first touch operation on the movement control area of the graphical user interface. After the game application receives the first touch operation acting on the movement control area, it can respond to the first touch operation and control the first virtual character to move in the game scene. Specifically, the moving direction may be determined according to the first touch operation, and the first virtual character is controlled to move according to the moving direction in the game scene.
  • Step 102 when the first touch operation satisfies the trigger condition, provide at least one location identifier in a designated area of the graphical user interface, and the location identifier corresponds to a location in the game scene;
  • the control operation of automatic pathfinding is combined with the first touch operation on the movement control area, so that the location for automatic pathfinding can be selected directly through the first touch operation on the movement control area.
  • a trigger condition can be set, and whether the first touch operation satisfies the trigger condition can be detected in real time.
  • If the first touch operation satisfies the trigger condition, it can be considered that the player needs to select a position for automatic pathfinding; at least one location identifier is then provided in the designated area of the graphical user interface. A location identifier corresponds to a location in the game scene and serves as a token representing that location.
  • the designated area may be a preset area in the GUI for displaying the location identifiers. As shown in Figure 2, the designated area can be an arc-shaped area located above the movement control area, and the designated area includes three location identifiers, which may differ in shape and/or color.
  • Step 103 determining a first target location identifier from the at least one location identifier
  • a first target location identifier may be determined from the at least one location identifier according to the first touch operation; the first target location identifier is the location identifier corresponding to the position to which the first virtual character will be controlled to automatically find a path.
  • the first touch operation may have corresponding touch points, so that the first target location identifier is determined from the at least one location identifier according to the location where the touch point is located or the route passed. As shown in FIG. 3 , from the at least one location identifier, a location identifier located in the middle is determined as the first target location identifier.
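Selecting the identifier closest to the touch point, as described above, can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the identifier names, and the screen coordinates are all assumptions.

```python
import math

def pick_target_identifier(touch_point, identifiers):
    """Return the location identifier closest to the touch point.

    `touch_point` is an (x, y) screen coordinate; `identifiers` maps an
    identifier name to its (x, y) position in the designated area.
    (Names and layout are illustrative, not from the patent.)
    """
    def dist(pos):
        return math.hypot(touch_point[0] - pos[0], touch_point[1] - pos[1])
    return min(identifiers, key=lambda name: dist(identifiers[name]))

# Three identifiers laid out on an arc above the movement control area.
arc = {"camp": (80, 200), "boss": (160, 240), "shop": (240, 200)}
print(pick_target_identifier((150, 230), arc))  # closest is "boss"
```

A route-based variant (determining any identifier the touch point's path passes through) would instead test each sampled point of the slide against each identifier's bounding region.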
  • Step 104 controlling the first virtual character to automatically find a path in the game scene to a position corresponding to the first target position identifier.
  • the first virtual character can be controlled to automatically find a path in the game scene to a position corresponding to the first target position identifier.
  • the first target position identifier can be selected from the designated area for automatic pathfinding.
  • the position for automatic pathfinding can be set in advance, and operations such as pathfinding and switching destinations do not require opening the game scene thumbnail again. This avoids interrupting the player's current game behavior and reduces the scene occlusion caused by opening the game scene thumbnail to select a pathfinding position, improving the player's gaming experience.
  • Because the automatic pathfinding state can be quickly entered again, the player can interrupt the ongoing automatic pathfinding at will, thereby improving the player's freedom in the game.
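The interrupt-at-will behaviour can be sketched as a tiny state machine: any touch on the movement control area cancels auto pathfinding and returns manual control. The class, state, and method names below are illustrative assumptions, not the patent's own API.

```python
class PathfindingController:
    """Minimal sketch of the manual-move / auto-pathfind interplay."""

    def __init__(self):
        self.state = "idle"
        self.target = None

    def on_move_touch(self):
        # A touch on the movement control area interrupts auto pathfinding;
        # the player regains manual control immediately.
        self.state = "manual_move"
        self.target = None

    def start_auto_pathfinding(self, target_id):
        # A target identifier was selected: begin automatic pathfinding.
        self.state = "auto_pathfinding"
        self.target = target_id

ctrl = PathfindingController()
ctrl.start_auto_pathfinding("camp")
ctrl.on_move_touch()   # player interrupts the pathfinding at will
print(ctrl.state)      # back to manual movement
```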
  • the step 101 may include the following sub-steps:
  • a movement direction is determined, and the first virtual character is controlled to move in the game scene according to the movement direction.
  • the first touch operation may have a touch point
  • the orientation of the touch point in the movement control area may indicate the movement direction
  • the movement direction may be determined by the position of the touch point
  • the first virtual character may then be controlled to move in the game scene according to the movement direction.
  • the movement control area may include a virtual joystick control
  • a joystick is included in the virtual joystick control
  • the first touch operation is performed in the virtual joystick control
  • the joystick in the virtual joystick control may rotate following the movement of the touch point
  • the position of the joystick in the virtual joystick control can indicate the moving direction. For example, if the joystick is in the due-north direction in the virtual joystick control, it can be determined that the moving direction of the virtual character is due north.
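Deriving the movement direction from the joystick's touch point can be sketched as a simple vector normalization. The coordinate convention here (y grows toward north) is an assumption for illustration; a real screen typically has y growing downward.

```python
import math

def movement_direction(center, touch_point):
    """Unit direction vector from the joystick center to the touch point.

    Returns (0.0, 0.0) when the joystick is at rest.  Coordinate
    conventions and names are illustrative assumptions.
    """
    dx = touch_point[0] - center[0]
    dy = touch_point[1] - center[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return (0.0, 0.0)
    return (dx / length, dy / length)

# Joystick pushed straight "north" of its center:
print(movement_direction((100, 100), (100, 160)))  # (0.0, 1.0)
```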
  • the first touch operation includes a sliding operation
  • the sliding operation includes any one of a direct sliding operation, a sliding operation after a long press, or a sliding operation after a heavy press, wherein , the sliding operation has corresponding touch points.
  • the first touch operation may include a sliding operation, the sliding operation has corresponding touch points, and the touch position of the operation medium on the graphical user interface is the touch point.
  • the sliding operation includes any one of a direct sliding operation, a sliding operation after a long press, or a sliding operation after a heavy press. A direct sliding operation refers to an operation in which the operating medium slides directly on the display screen of the mobile terminal;
  • a sliding operation after a long press means that the operating medium first presses the display screen of the mobile terminal for a preset duration (for example, 2 seconds) and then slides on the display screen;
  • a sliding operation after a heavy press refers to sliding on the display screen of the mobile terminal after the pressure with which the operating medium presses the display screen reaches a preset pressure value.
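The three sliding-operation variants can be distinguished by what preceded the slide. The sketch below uses the example values from the text (2 seconds, and a pressure threshold); the function, parameter names, and the 10 N default are illustrative assumptions.

```python
def classify_slide(press_duration_s, press_force_n, *,
                   long_press_s=2.0, heavy_press_n=10.0):
    """Classify a sliding operation by the press that preceded it."""
    if press_force_n >= heavy_press_n:
        return "slide_after_heavy_press"
    if press_duration_s >= long_press_s:
        return "slide_after_long_press"
    return "direct_slide"

print(classify_slide(0.1, 1.0))   # direct_slide
print(classify_slide(2.5, 1.0))   # slide_after_long_press
print(classify_slide(0.1, 12.0))  # slide_after_heavy_press
```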
  • the trigger condition includes: detecting that the touch point of the sliding operation slides out of the movement control area; or the touch point moves to a specified position; or the moving distance of the touch point is greater than a preset distance threshold; or the moving speed of the touch point is greater than a preset speed threshold; or the pressing time of the touch point is greater than a preset time threshold; or the pressing pressure of the touch point is greater than a preset pressure threshold.
  • the first touch operation includes a sliding operation, and the sliding operation has corresponding touch points.
  • a trigger condition can be set including: the touch point that detects the sliding operation slides out of the movement control area, or the touch point moves to a specified position, or the moving distance of the touch point is greater than a preset distance threshold , or, the moving speed of the touch point is greater than the preset speed threshold, or, the pressing time of the touch point is greater than the preset time threshold, or, the pressing pressure of the touch point is greater than the preset pressure threshold.
  • the specified position may be a preset position.
  • the specified position is an edge position of the movement control area.
  • the preset distance threshold may be a distance threshold set in advance. For example, if the preset distance threshold is 10 cm, then when the moving distance of the touch point is greater than 10 cm, the first touch operation is considered to satisfy the trigger condition.
  • the preset speed threshold may be a speed threshold set in advance. For example, if the preset speed threshold is 10 cm per second, then when the moving speed of the touch point is greater than 10 cm per second, the first touch operation is considered to satisfy the trigger condition.
  • the preset time threshold may be a pressing-duration threshold set in advance, for example, 2 seconds.
  • the preset pressure threshold may be a pressure threshold set in advance. For example, if the preset pressure threshold is 10 newtons, then when the pressing pressure of the touch point is greater than 10 newtons, the first touch operation is considered to satisfy the trigger condition.
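The trigger conditions above are a disjunction: any single one being met suffices. A minimal sketch, assuming the touch state is summarized in a small record (the field names and the `TouchState` type are illustrative, not from the disclosure; the example thresholds 10 cm, 10 cm/s, 2 s, and 10 N are the ones quoted in the text):

```python
from dataclasses import dataclass

@dataclass
class TouchState:
    # Hypothetical summary of the sliding operation so far.
    inside_control_area: bool   # has the touch point stayed in the area?
    moved_distance_cm: float
    speed_cm_per_s: float
    press_time_s: float
    pressure_n: float

def satisfies_trigger(t, dist_thr=10.0, speed_thr=10.0,
                      time_thr=2.0, pressure_thr=10.0):
    """Return True when any one trigger condition holds, which is when
    the location identifiers should be shown in the designated area."""
    return (not t.inside_control_area
            or t.moved_distance_cm > dist_thr
            or t.speed_cm_per_s > speed_thr
            or t.press_time_s > time_thr
            or t.pressure_n > pressure_thr)
```

For instance, a slow short drag inside the control area does not trigger, while sliding out of the area does, regardless of the other measurements.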
  • the designated area is located above the movement control area.
  • the designated area may be located at a certain distance from the upper edge of the movement control area, for example, the designated area is located at 5 cm from the upper edge of the movement control area.
  • the graphical user interface further provides a game scene thumbnail; the designated area is a display area in the graphical user interface other than the game scene thumbnail.
  • the game scene thumbnail can be set at a specific position in the GUI, such as the upper left corner of the GUI, or the upper right corner, and so on.
  • the designated area may be a display area other than the game scene thumbnail in the GUI.
  • the graphical user interface may further include a setting entry, through which the player can freely set the position of the designated area in the graphical user interface.
  • the location identifier includes a static location identifier, and the static location identifier corresponds to a static location in the game scene.
  • the location identifier may include a static location identifier, and the static location identifier corresponds to a static location in the game scene, that is, a fixed location in the game scene, such as the location of a certain store in the game scene.
  • the location identifier includes a dynamic location identifier
  • the dynamic location identifier corresponds to the dynamic location of the second virtual character in the game scene.
  • a second virtual character may also be included in the game scene; the second virtual character may be a game virtual character in the same team as the first virtual character (that is, a teammate of the first virtual character), or may be an NPC in the game scene.
  • a dynamic position identifier can be set to correspond to the second virtual character in the game scene, so that the player can quickly and automatically find the way to the dynamic position of the second virtual character in the game scene.
  • the location identifier includes an avatar of the second virtual character.
  • when the location identifier corresponds to the dynamic position of the second virtual character in the game scene, the location identifier may be the avatar of the second virtual character, so that the player can better distinguish the location identifiers, further improving the speed of selecting a location for automatic pathfinding.
  • as shown in FIG. 4, in the arc-shaped area above the movement control area there are three position markers, which correspond to the avatars of three second virtual characters respectively.
  • the method may further include the following steps:
  • the avatar of the second virtual character is determined as the dynamic position identifier.
  • when the second virtual character is in danger, a dynamic position marker can be set to correspond to the dynamic position of the second virtual character in the game scene, and the player can quickly find a way to that dynamic position for rescue or cooperative operations.
  • when the second virtual character satisfies a preset positional relationship with the first virtual character, the avatar of the second virtual character is determined as a dynamic location identifier.
  • the preset positional relationship may be a preset relationship between the positions of the two virtual characters; for example, the preset positional relationship is behind, to the left, or to the right. In some games, the player needs to go to a teammate's location to complete a certain task. If the second virtual character is located behind the first virtual character controlled by the player and thus satisfies the preset positional relationship, a dynamic location identifier can be set to correspond to the second virtual character's dynamic position in the game scene; in this way, the player can notice the second virtual character in a non-visible area in time and quickly find the way to its dynamic position in the game scene.
  • a dynamic position marker can be set to correspond to the dynamic position of the second virtual character in the game scene, so that when a teammate is in danger, the player can quickly find the way to the teammate's dynamic position for rescue, which improves the game experience.
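The two conditions discussed above (a teammate in danger, or a teammate in a preset positional relationship) can be combined into a simple eligibility check. This is a sketch under stated assumptions: the disclosure does not say how "danger" is detected, so modeling it as a health threshold, and the parameter names, are assumptions for illustration.

```python
def show_teammate_marker(health, health_threshold, relation, wanted_relations):
    """Decide whether to show a dynamic position marker for a teammate.

    health / health_threshold: assumed proxy for "the teammate is in danger".
    relation: the teammate's position relative to the player,
              e.g. "behind", "left", "right".
    wanted_relations: the preset positional relationships that qualify.
    """
    in_danger = health < health_threshold
    in_preset_relation = relation in wanted_relations
    return in_danger or in_preset_relation

# A healthy teammate standing behind the player still gets a marker,
# because "behind" is one of the preset positional relationships.
print(show_teammate_marker(90, 50, "behind", {"behind", "left", "right"}))
```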
  • the graphical user interface further provides a game scene thumbnail
  • the method may further include the following steps:
  • At least one wayfinding location is selected on the game scene thumbnail.
  • the player may perform a trigger operation on the game scene thumbnail
  • the trigger operation may refer to an operation for selecting a pathfinding location, such as a click operation, a long-press operation, and the like.
  • the game application may respond to the trigger operation, and select at least one pathfinding position on the game scene thumbnail according to the trigger position in the trigger operation.
  • Each of the selected pathfinding positions may have a one-to-one corresponding position identifier, and the position identifiers corresponding to these pathfinding positions may be displayed on the thumbnail of the game scene.
  • the GUI includes a movement control area (the movement control area has virtual joystick controls), and a game scene thumbnail corresponding to the game scene.
  • the game scene thumbnail is located in the upper right corner of the GUI and is displayed as a minimap.
  • the player can expand the game scene thumbnail by triggering the game scene thumbnail.
  • the expanded game scene thumbnail is shown in Figure 6. In Figure 6, the player has performed three trigger operations on the expanded game scene thumbnail, setting 3 pathfinding positions.
  • FIG. 7 shows a schematic diagram of closing the game scene thumbnail as shown in FIG. 6 .
  • a designated area of the graphical user interface is provided with a location identifier for each wayfinding position. Therefore, the player can select a position marker from the designated area and control the first virtual character to automatically find a way to the position corresponding to that marker, further speeding up the control of the first virtual character for automatic pathfinding.
  • the step 102 may include the following sub-steps:
  • a position identifier corresponding to each of the pathfinding positions one-to-one is provided in a designated area of the graphical user interface.
  • a position identifier corresponding to each pathfinding position can be provided in a designated area of the graphical user interface, so that the player can select the location identifier corresponding to a wayfinding location from the designated area to automatically find the way to the selected wayfinding location.
  • the method may further include the following steps:
  • a triggered wayfinding location is determined from the at least one wayfinding location, and the triggered wayfinding location is deleted, together with the location identifier corresponding to the triggered wayfinding location.
  • a trigger operation may be performed on at least one pathfinding position in the thumbnail image of the game scene, and the trigger operation may include a click operation, a long press operation, and the like.
  • after the game application receives a trigger operation acting on the at least one wayfinding location, it can respond to the trigger operation, determine the triggered wayfinding location from the at least one wayfinding location, and delete it from the at least one wayfinding location.
  • the location identifier corresponding to the triggered wayfinding location in the game scene thumbnail can be hidden, and further, the location identifier corresponding to the triggered wayfinding location can be deleted from the designated area.
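The bookkeeping described above keeps two views consistent: the wayfinding positions on the thumbnail and their identifiers in the designated area. A toy sketch (the class and attribute names are illustrative, not from the disclosure):

```python
class WayfindingRegistry:
    """Pairs wayfinding positions on the minimap with the location
    identifiers shown in the designated area, so deleting a position
    also removes its identifier and both views stay in sync."""

    def __init__(self):
        self.positions = []      # positions marked on the thumbnail
        self.identifiers = {}    # position -> identifier in the area

    def add(self, pos, ident):
        self.positions.append(pos)
        self.identifiers[pos] = ident

    def delete(self, pos):
        # Remove the triggered position and its matching identifier.
        if pos in self.positions:
            self.positions.remove(pos)
            self.identifiers.pop(pos, None)
```

Usage: after marking two positions and deleting one, only the surviving position's identifier remains in the designated area.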
  • the step 103 may include the following sub-steps:
  • a first target location identifier is determined from the at least one location identifier.
  • the first target location identifier may be determined from the at least one location identifier in response to the end of the first touch operation.
  • the touch point at the end of the first touch operation may be determined, and the first target location identifier may be determined from the at least one location identifier according to that touch point; alternatively, the route along which the touch point of the first touch operation passes may be determined, and the first target location identifier may be determined from the at least one location identifier according to that route.
  • when the first virtual character is controlled to move through the first touch operation, whether the first touch operation satisfies the trigger condition can be detected in real time. When the first touch operation satisfies the trigger condition, it is considered that the player needs to perform automatic pathfinding.
  • at least one location identifier is provided in a designated area of the graphical user interface for the player to select; the player continues to select the location identifier corresponding to the desired pathfinding position through the first touch operation, and further, in response to the end of the first touch operation, the first target location identifier is determined from the at least one location identifier, so that the first virtual character can be controlled to automatically find a way to the corresponding position.
  • the determining of the first target location identifier from the at least one location identifier includes:
  • a first target location identifier is determined from the at least one location identifier according to the touch point of the first touch operation.
  • the first touch operation has corresponding touch points, and when the first touch operation satisfies the trigger condition, the first target location identifier may be determined from at least one location identifier according to the touch point of the first touch operation. .
  • the determining of the first target location identifier from the at least one location identifier according to the touch point of the first touch operation includes:
  • according to the touch point of the first touch operation, from the at least one location identifier, the location identifier corresponding to the touch point is determined as the first target location identifier; or, according to the touch point of the first touch operation, from the at least one location identifier, any location identifier corresponding to the route passed by the touch point is determined as the first target location identifier.
  • the touch point of the first touch operation may move continuously; when the first target position identifier needs to be determined, the last position of the touch point of the first touch operation may be used, and the location identifier corresponding to that touch point is determined from the at least one location identifier as the first target location identifier.
  • according to the touch point of the first touch operation, any position identifier corresponding to the route passed by the touch point may be determined from the at least one position identifier as the first target position identifier.
  • when the route passes multiple location identifiers, one of them may be randomly determined as the first target location identifier; when there is only one corresponding location identifier, that location identifier may be determined as the first target location identifier.
  • the determining the first target position identifier according to the touch point of the first touch operation includes:
  • a first target location identifier is determined from the at least one location identifier according to the position or moving direction of the touch point.
  • according to the position of the touch point, the location identifier corresponding to that position can be determined from the at least one location identifier as the first target location identifier, or the location identifier closest to that position can be determined as the first target location identifier. In addition, according to the moving direction of the touch point, the location identifier pointed to by the moving direction can be determined from the at least one location identifier as the first target location identifier, so that the touch point of the first touch operation does not need to be moved all the way to where the identifier is displayed; the first target location identifier can still be selected, further speeding up the control of the first virtual character for automatic pathfinding.
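The two selection policies described above (nearest identifier to the final touch position, or identifier best aligned with the moving direction) can be sketched as follows. This is illustrative only: it assumes identifier positions are given relative to the joystick center so that direction alignment reduces to a dot product, and the function name is an assumption.

```python
import math

def pick_target(identifiers, touch_pos=None, direction=None):
    """identifiers: mapping name -> (x, y) screen position, assumed
    relative to the joystick center.
    Prefer the identifier closest to the final touch point; otherwise
    pick the one best aligned with the touch point's moving direction."""
    if touch_pos is not None:
        return min(identifiers,
                   key=lambda n: math.dist(identifiers[n], touch_pos))
    if direction is not None:
        dx, dy = direction
        # Larger dot product = better aligned with the moving direction.
        return max(identifiers,
                   key=lambda n: identifiers[n][0] * dx + identifiers[n][1] * dy)
    return None
```

With identifiers A above the joystick and B to its right, releasing the touch near A selects A, while merely sliding rightward selects B without reaching it.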
  • the method may further include the following steps:
  • the first virtual character is controlled to stop the automatic pathfinding.
  • the player may interrupt the automatic pathfinding by performing a second touch operation on the movement control area.
  • the second touch operation may be an operation for interrupting automatic pathfinding, for example, a click operation or a double-click operation, which is not limited in the embodiments of the present disclosure; the operation acts on the movement control area, so that the automatic pathfinding of the virtual character is not interrupted by mistakenly touching other areas of the graphical user interface, thereby reducing the probability of misoperation.
  • the game application may, after receiving the second touch operation acting on the movement control area, respond to the second touch operation, and control the virtual character to stop automatically finding the way to the position corresponding to the first target position marker.
  • the method may further include the following steps:
  • prompt information for automatic pathfinding may be displayed in the designated area, and the prompt information may include the name of the position corresponding to the first target position identifier, to remind the player of the location currently being navigated to automatically and to avoid going to the wrong location, which would cause unnecessary time consumption.
  • the prompt information may be "Automatically finding a way to the airport".
  • the automatic pathfinding prompt information displayed in the designated area is “Automatically find a path to XXX”.
  • the method may further include the following steps:
  • the player may perform a third touch operation on the designated area to switch the position of the automatic pathfinding.
  • the third touch operation may refer to an operation for switching the position of automatic pathfinding, for example, the third touch operation is a double-click operation, a sliding operation, or a click operation, which is not limited in this embodiment of the present disclosure.
  • the game application may respond to the third touch operation and determine the second target location identifier from the remaining location identifiers, where the remaining location identifiers are the location identifiers among the at least one location identifier other than the first target location identifier. As an example, in the designated area there are four location identifiers, namely A, B, C, and D. Assuming that the first target location identifier is A, the remaining location identifiers are B, C, and D.
  • the player can directly select the second target position identifier by performing the third touch operation in the designated area, so that the position to which the first virtual character automatically finds its way can be switched from the position corresponding to the first target position identifier to the position corresponding to the second target position identifier. In this way, when switching the position of automatic pathfinding, there is no need to open the game scene thumbnail again for selection, which avoids interrupting the player's current game behavior, reduces the scene occlusion caused by opening the scene thumbnail to select the position of automatic pathfinding, solves the problem that the position of automatic pathfinding cannot be quickly switched, and improves the player's game experience.
  • since the automatic pathfinding state can be quickly resumed, the player can interrupt the currently ongoing automatic pathfinding at will, thereby improving the player's game freedom.
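Switching from the first target identifier (A) to one of the remaining identifiers (B, C, D) can be sketched as a simple cycle over the identifier list. The cycling policy is one possible reading; the disclosure only requires that the second target come from the remaining identifiers.

```python
def switch_target(identifiers, current):
    """Return the next identifier after `current` in display order,
    so repeated third touch operations cycle A -> B -> C -> D -> A
    while always landing on a remaining (non-current) identifier."""
    if current not in identifiers or len(identifiers) < 2:
        return current  # nothing else to switch to
    idx = identifiers.index(current)
    return identifiers[(idx + 1) % len(identifiers)]

# With targets A, B, C, D and A currently active, a third touch
# switches the automatic pathfinding destination to B.
print(switch_target(["A", "B", "C", "D"], "A"))  # B
```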
  • referring to FIG. 9, a structural block diagram of an embodiment of a pathfinding control device in a game of the present disclosure is shown. A graphical user interface is provided through a mobile terminal; the displayed content of the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene, and the graphical user interface provides a movement control area. The device may specifically include the following modules:
  • a movement control module 901 configured to control the first virtual character to move in the game scene in response to a first touch operation acting on the movement control area;
  • a location identifier providing module 902 configured to provide at least one location identifier in a designated area of the graphical user interface in response to the first touch operation satisfying a trigger condition, and the location identifier corresponds to a location in the game scene;
  • a location identifier determination module 903, configured to determine a first target location identifier from the at least one location identifier
  • the automatic pathfinding control module 904 is configured to control the first virtual character to automatically find a path in the game scene to a position corresponding to the first target position identifier.
  • the movement control module 901 includes:
  • a movement control sub-module configured to determine a movement direction in response to a first touch operation acting on the movement control area, and control the first virtual character to move in the game scene according to the movement direction.
  • the first touch operation includes a sliding operation, and the sliding operation includes any one of a direct sliding operation, a sliding operation after a long press, or a sliding operation after a heavy press, wherein the sliding operation has corresponding touch points.
  • the trigger condition includes: detecting that the touch point of the sliding operation slides out of the movement control area, or the touch point moves to a specified position, or the moving distance of the touch point is greater than a preset distance threshold, or the moving speed of the touch point is greater than a preset speed threshold, or the pressing time of the touch point is greater than a preset time threshold, or the pressing pressure of the touch point is greater than a preset pressure threshold.
  • the designated area is located above the movement control area.
  • the graphical user interface further provides a game scene thumbnail; the designated area is a display area in the graphical user interface other than the game scene thumbnail.
  • the location identifier includes a static location identifier, and the static location identifier corresponds to a static location in the game scene.
  • the location identifier includes a dynamic location identifier
  • the dynamic location identifier corresponds to the dynamic location of the second virtual character in the game scene.
  • the location identifier includes an avatar of the second virtual character.
  • the graphical user interface further provides a game scene thumbnail image; the device further includes:
  • a pathfinding position selection module used for selecting at least one pathfinding position on the game scene thumbnail
  • the location identification providing module 902 includes:
  • the location identifier providing sub-module is configured to provide location identifiers corresponding to each of the pathfinding locations one-to-one in a designated area of the graphical user interface when the first touch operation satisfies a trigger condition.
  • a wayfinding location deletion module configured to determine a triggered wayfinding location from the at least one wayfinding location in response to a triggering operation acting on the at least one wayfinding location, and delete the triggered wayfinding location;
  • a location identifier deletion module configured to delete the location identifier corresponding to the triggered pathfinding location from the designated area.
  • the location identifier determination module 903 includes:
  • the first location identifier determination sub-module is configured to determine the first target location identifier from the at least one location identifier in response to the end of the first touch operation.
  • the location identifier determination module 903 includes:
  • the second location identifier determination sub-module is configured to determine the first target location identifier from the at least one location identifier according to the touch point of the first touch operation.
  • the second location identifier determination submodule includes:
  • a first position identifier determining unit, configured to determine, from the at least one position identifier, the position identifier corresponding to the touch point as the first target position identifier according to the touch point of the first touch operation; or,
  • a second position identifier determining unit, configured to determine, from the at least one position identifier, any position identifier corresponding to the route passed by the touch point as the first target position identifier according to the touch point of the first touch operation.
  • the second location identifier determination submodule includes:
  • a third location identifier determining unit configured to determine a first target location identifier from the at least one location identifier according to the position or moving direction of the touch point.
  • the device further includes:
  • the automatic pathfinding stop module is configured to control the first virtual character to stop the automatic pathfinding in response to a second touch operation acting on the movement control area during the automatic pathfinding process.
  • the device further includes:
  • a second location identifier determination module configured to determine a second target location identifier from the remaining location identifiers in response to the third touch operation acting on the designated area, where the remaining location identifiers are obtained by dividing the at least one location identifier. location identifiers other than the first target location identifier.
  • Embodiments of the present disclosure also provide an electronic device, including:
  • one or more processors; and
  • one or more machine-readable media having instructions stored thereon which, when executed by the one or more processors, cause the electronic device to perform the method of any of the embodiments of the present disclosure.
  • Embodiments of the present disclosure also provide a computer-readable storage medium having instructions stored thereon which, when executed by one or more processors, cause the processors to perform the method according to any one of the embodiments of the present disclosure.
  • the embodiments of the present disclosure may be provided as a method, an apparatus, or a computer program product. Accordingly, embodiments of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present disclosure may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) having computer-usable program code embodied therein.
  • Embodiments of the present disclosure are described with reference to flowcharts and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the present disclosure. It will be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing terminal equipment to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing terminal equipment create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing terminal equipment to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Abstract

A pathfinding control method and device in a game, including: in response to a first touch operation acting on a movement control area, controlling a first virtual character to move in a game scene (101); when the first touch operation satisfies a trigger condition, providing at least one location identifier in a designated area of a graphical user interface, the location identifier corresponding to a location in the game scene (102); determining a first target location identifier from the at least one location identifier (103); and controlling the first virtual character to automatically find a path in the game scene to the location corresponding to the first target location identifier (104). This solution allows the virtual character to be quickly controlled for automatic pathfinding, integrating the functions of automatic pathfinding and destination switching into the virtual joystick control.

Description

A pathfinding control method and device in a game
The present disclosure claims priority to Chinese patent application No. 202010642984.9, filed on July 6, 2020 and entitled "A pathfinding control method and device in a game", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of game technology, and in particular to a pathfinding control method in a game and a pathfinding control device in a game.
Background
Some games provide an automatic pathfinding function, for example, automatically finding a path to a task location after the player clicks on a task; meanwhile, marking a target location on the large map is also a frequent, routine operation. However, there is no design that organically combines the two and optimizes automatic pathfinding toward a specific target point for the player.
At present, there are mainly two automatic pathfinding schemes. In one scheme, the player clicks a position marker or virtual character identifier on the map to start automatic pathfinding. However, after the pathfinding state is interrupted, the map needs to be reopened to operate; opening the map inevitably interrupts the player's current game behavior and degrades the game experience. Moreover, when there are multiple marked target points, quick switching is impossible, and the map must be repeatedly opened and closed to switch. In the other scheme, automatic pathfinding is bound to markers: automatic pathfinding starts after a position is marked on the map, and the marker disappears when pathfinding is interrupted. However, this scheme easily leads to misoperation; once automatic pathfinding is interrupted by mistake, the map must be reopened to repeat the marking and pathfinding operation, which is a poor experience. In addition, multiple locations cannot be marked on the map, because binding a new marker interrupts the existing pathfinding; the game's degree of freedom is low, the character can only automatically find its way after marking, and once interrupted the location must be marked again.
It should be noted that the information disclosed in the above background section is only intended to enhance understanding of the background of the present disclosure, and therefore may include information that does not constitute prior art known to those of ordinary skill in the art.
Summary
In view of the above problems, embodiments of the present disclosure are proposed to provide a pathfinding control method in a game and a corresponding pathfinding control device in a game that overcome the above problems or at least partially solve them.
An embodiment of the present disclosure discloses a pathfinding control method in a game. A graphical user interface is provided through a mobile terminal; the content displayed by the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene, and the graphical user interface provides a movement control area. The method includes:
in response to a first touch operation acting on the movement control area, controlling the first virtual character to move in the game scene;
when the first touch operation satisfies a trigger condition, providing at least one location identifier in a designated area of the graphical user interface, the location identifier corresponding to a location in the game scene;
determining a first target location identifier from the at least one location identifier;
controlling the first virtual character to automatically find a path in the game scene to the location corresponding to the first target location identifier.
Optionally, the controlling the first virtual character to move in the game scene in response to a first touch operation acting on the movement control area includes:
in response to the first touch operation acting on the movement control area, determining a moving direction, and controlling the first virtual character to move in the game scene according to the moving direction.
Optionally, the first touch operation includes a sliding operation, and the sliding operation includes any one of a direct sliding operation, a sliding operation after a long press, or a sliding operation after a heavy press, wherein the sliding operation has corresponding touch points.
Optionally, the trigger condition includes: detecting that the touch point of the sliding operation slides out of the movement control area, or the touch point moves to a specified position, or the moving distance of the touch point is greater than a preset distance threshold, or the moving speed of the touch point is greater than a preset speed threshold, or the pressing time of the touch point is greater than a preset time threshold, or the pressing pressure of the touch point is greater than a preset pressure threshold.
Optionally, the designated area is located above the movement control area.
Optionally, the graphical user interface further provides a game scene thumbnail; the designated area is a display area in the graphical user interface other than the game scene thumbnail.
Optionally, the location identifier includes a static location identifier, and the static location identifier corresponds to a static location in the game scene.
Optionally, the location identifier includes a dynamic location identifier, and the dynamic location identifier corresponds to the dynamic location of a second virtual character in the game scene.
Optionally, the location identifier includes an avatar of the second virtual character.
Optionally, the graphical user interface further provides a game scene thumbnail;
before providing at least one location identifier in a designated area of the graphical user interface when the first touch operation satisfies the trigger condition, the method includes:
selecting at least one wayfinding location on the game scene thumbnail;
the providing at least one location identifier in a designated area of the graphical user interface when the first touch operation satisfies the trigger condition includes:
when the first touch operation satisfies the trigger condition, providing, in the designated area of the graphical user interface, location identifiers in one-to-one correspondence with the wayfinding locations.
Optionally, the method further includes:
in response to a trigger operation acting on the at least one wayfinding location, determining a triggered wayfinding location from the at least one wayfinding location, and deleting the triggered wayfinding location;
deleting, from the designated area, the location identifier corresponding to the triggered wayfinding location.
Optionally, the determining a first target location identifier from the at least one location identifier includes:
in response to the end of the first touch operation, determining the first target location identifier from the at least one location identifier.
Optionally, the determining a first target location identifier from the at least one location identifier includes:
determining the first target location identifier from the at least one location identifier according to the touch point of the first touch operation.
Optionally, the determining the first target location identifier from the at least one location identifier according to the touch point of the first touch operation includes:
according to the touch point of the first touch operation, determining, from the at least one location identifier, the location identifier corresponding to the touch point as the first target location identifier; or,
according to the touch point of the first touch operation, determining, from the at least one location identifier, any location identifier corresponding to the route passed by the touch point as the first target location identifier.
Optionally, the determining the first target location identifier from the at least one location identifier according to the touch point of the first touch operation includes:
determining the first target location identifier from the at least one location identifier according to the position or moving direction of the touch point.
Optionally, the method further includes:
during the automatic pathfinding, in response to a second touch operation acting on the movement control area, controlling the first virtual character to stop the automatic pathfinding.
Optionally, the method further includes:
in response to a third touch operation acting on the designated area, determining a second target location identifier from remaining location identifiers, where the remaining location identifiers are the location identifiers among the at least one location identifier other than the first target location identifier.
An embodiment of the present disclosure also discloses a pathfinding control device in a game. A graphical user interface is provided through a mobile terminal; the content displayed by the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene, and the graphical user interface provides a movement control area. The device includes:
a movement control module, configured to control the first virtual character to move in the game scene in response to a first touch operation acting on the movement control area;
a location identifier providing module, configured to provide at least one location identifier in a designated area of the graphical user interface in response to the first touch operation satisfying a trigger condition, the location identifier corresponding to a location in the game scene;
a location identifier determining module, configured to determine a first target location identifier from the at least one location identifier;
an automatic pathfinding control module, configured to control the first virtual character to automatically find a path in the game scene to the location corresponding to the first target location identifier.
An embodiment of the present disclosure also discloses an electronic device, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon which, when executed by the one or more processors, cause the electronic device to perform the method according to any one of the embodiments of the present disclosure.
An embodiment of the present disclosure also discloses a computer-readable storage medium having instructions stored thereon which, when executed by one or more processors, cause the processors to perform the method according to any one of the embodiments of the present disclosure.
Brief Description of the Drawings
FIG. 1 is a flowchart of the steps of an embodiment of a pathfinding control method in a game of the present disclosure;
FIG. 2 is a schematic diagram of providing at least one location identifier in a designated area according to the present disclosure;
FIG. 3 is a schematic diagram of determining a first target location identifier according to the present disclosure;
FIG. 4 is a schematic diagram of a location identifier being an avatar of a second virtual character according to the present disclosure;
FIG. 5 is a schematic diagram of a graphical user interface containing a game scene thumbnail corresponding to the game scene according to the present disclosure;
FIG. 6 is a schematic diagram of an expanded game scene thumbnail according to the present disclosure;
FIG. 7 is a schematic diagram of closing the game scene thumbnail shown in FIG. 5 according to the present disclosure;
FIG. 8 is a schematic diagram of displaying automatic pathfinding prompt information in a designated area according to the present disclosure;
FIG. 9 is a structural block diagram of an embodiment of a pathfinding control device in a game of the present disclosure.
具体实施方式
为使本公开的上述目的、特征和优点能够更加明显易懂,下面结合附图和具体实施方式对本公开作进一步详细的说明。
在本公开其中一种实施例中的游戏中的寻路控制方法可以运行于终端设备或者是服务器。其中,终端设备可以为本地终端设备。当游戏中的寻路控制方法运行于服务器时,该游戏中的寻路控制方法则可以基于云交互系统来实现与执行,其中,云交互系统包括服务器和客户端设备。
在一可选的实施方式中,云交互系统下可以运行各种云应用,例如:云游戏。以云游戏为例,云游戏是指以云计算为基础的游戏方式。在云游戏的运行模式下,游戏程序的运行主体和游戏画面呈现主体是分离的,游戏中的寻路控制方法的储存与运行是在云游戏服务器上完成的,客户端设备的作用用于数据的接收、发送以及游戏画面的呈现,举例而言,客户端设备可以是靠近用户侧的具有数据传输功能的显示设备,如,移动终端、电视机、计算机、掌上电脑等;但是进行游戏中的寻路控制方法的终端设备为云端的云游戏服务器。在进行游戏时,玩家操作客户端设备向云游戏服务器发送操作指令,云游戏服务器根据操作指令运行游戏,将游戏画面等数据进行编码压缩,通过网络返回客户端设备,最后,通过客户端设备进行解码并输出游戏画面。
在一可选的实施方式中,终端设备可以为本地终端设备。以游戏为例,本地终端设备存储有游戏程序并用于呈现游戏画面。本地终端设备用于通过图形用户界面与玩家进行交互,即,常规的通过电子设备下载安装游戏程序并运行。该本地终端设备将图形用户界面提供给玩家的方式可以包括多种,例如,可以渲染显示在终端的显示屏上,或者,通过全息投影提供给玩家。举例而言,本地终端设备可以包括显示屏和处理器,该显示屏用于呈现图形用户界面,该图形用户界面包括游戏画面,该处理器用于运行该游戏、生成图形用户界面以及控制图形用户界面在显示屏上的显示。
Referring to Fig. 1, a flowchart of the steps of an embodiment of a pathfinding control method in a game according to the present disclosure is shown. A graphical user interface is provided through a mobile terminal, content displayed on the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene, and the graphical user interface provides a movement control region. The method may specifically include the following steps:
Step 101, in response to a first touch operation acting on the movement control region, controlling the first virtual character to move in the game scene;
It should be noted that the mobile terminal may be the aforementioned local terminal device, or may be the client device in the aforementioned cloud interaction system. The operating system of the mobile terminal may include Android, iOS, Windows Phone, Windows and so on, and can generally support the running of various game applications.
By running a game application on the mobile terminal and rendering on its display, a graphical user interface is obtained. The content displayed on the graphical user interface contains at least part or all of a game scene, and the specific shape of the game scene may be square or another shape (for example, circular).
Specifically, the game scene may include a first virtual character, which may be a game virtual character controlled by the player through the mobile terminal and presented through the graphical user interface; the presented content may contain all of the first virtual character, or only part of it. For example, in a third-person game, the content presented by the graphical user interface may contain all of the first virtual character, whereas in a first-person game it may contain only part of the first virtual character. In addition, the game scene may further include at least one virtual object, which may be a game virtual character controlled by an enemy player, or a Non-Player Character (NPC) preset by the game developer in a specific game scene.
A movement control region may be provided in the graphical user interface, and the player may operate in the movement control region to control the first virtual character to move. Optionally, the movement control region may include a virtual joystick control with an operable joystick; the player may adjust the movement direction of the first virtual character by turning the joystick, thereby controlling the first virtual character to move in that direction.
When the player needs to control the first virtual character in the game scene to move, a first touch operation may be performed on the movement control region of the graphical user interface, where the first touch operation is an operation for controlling the first virtual character to move. After receiving the first touch operation acting on the movement control region, the game application may respond to the first touch operation and control the first virtual character to move in the game scene. Specifically, a movement direction may be determined according to the first touch operation, and the first virtual character is controlled to move in the game scene in that movement direction.
Step 102, when the first touch operation satisfies a trigger condition, providing at least one location identifier in a designated region of the graphical user interface, where the location identifier corresponds to a location in the game scene;
To enable a virtual character to be quickly directed into automatic pathfinding, in the embodiments of the present disclosure the control operation of automatic pathfinding is combined with the first touch operation on the movement control region, so that the destination for automatic pathfinding can be selected directly through the first touch operation acting on the movement control region.
Specifically, a trigger condition may be set, and whether the first touch operation satisfies the trigger condition is detected in real time. When the first touch operation satisfies the trigger condition, it can be considered that the player needs to select a location for automatic pathfinding, and at least one location identifier is then provided in a designated region of the graphical user interface. The location identifier corresponds to a location in the game scene and serves as a marker representing that location. The designated region may be a region preset in the graphical user interface for displaying location identifiers. As shown in Fig. 2, the designated region may be an arc-shaped region above the movement control region containing three location identifiers; in a specific implementation, each location identifier may be given a different shape and/or color so that the identifiers can be distinguished.
Step 103, determining a first target location identifier from the at least one location identifier;
After the location identifiers are provided, the first target location identifier may further be determined from the at least one location identifier according to the first touch operation; the first target location identifier may be the location identifier corresponding to the location to which the first virtual character is to be controlled to automatically find a path.
Specifically, the first touch operation may have a corresponding touch point, so that the first target location identifier is determined from the at least one location identifier according to the position of the touch point or the route it traverses. As shown in Fig. 3, among the at least one location identifier, the location identifier in the middle is determined as the first target location identifier.
Step 104, controlling the first virtual character to automatically find a path in the game scene to the location corresponding to the first target location identifier.
After the first target location identifier is determined, the first virtual character may be controlled to automatically find a path in the game scene to the location corresponding to the first target location identifier.
By combining the control operation of automatic pathfinding with the first touch operation on the movement control region, the first target location identifier can be selected from the designated region for automatic pathfinding when the first touch operation satisfies the trigger condition. Moreover, the destinations for automatic pathfinding can be set in advance, so that operations such as starting pathfinding or switching destinations do not require opening the game scene thumbnail again. This avoids interrupting the player's current game behavior, reduces the occlusion of the scene caused by opening the game scene thumbnail to select a pathfinding destination, and improves the player's game experience. Furthermore, since the automatic pathfinding state can be quickly re-entered, the player can freely interrupt the automatic pathfinding currently in progress, which improves the player's freedom in the game.
In a preferred embodiment of the present disclosure, step 101 may include the following sub-step:
in response to the first touch operation acting on the movement control region, determining a movement direction, and controlling the first virtual character to move in the game scene according to the movement direction.
Specifically, the first touch operation may have a touch point, and the orientation of the touch point within the movement control region may indicate a movement direction; the movement direction can be determined from the position of the touch point, and the first virtual character is then controlled to move in the game scene in that direction.
As an example, the movement control region may include a virtual joystick control containing a joystick. When the first touch operation is performed in the virtual joystick control, the joystick can turn as the touch point moves, and the orientation of the joystick within the virtual joystick control may indicate the movement direction. For example, if the joystick is due north within the virtual joystick control, the movement direction may be determined to be due north of the virtual character's current orientation.
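The mapping from a joystick touch point to a movement direction described above can be sketched as follows. This is a minimal illustration under assumed conventions, not the implementation of the disclosure; the function name `joystick_direction`, the screen-coordinate convention, and the dead-zone parameter are assumptions made for the example.

```python
import math

def joystick_direction(center, touch_point, dead_zone=0.1):
    """Map a touch point inside a virtual joystick control to a unit
    movement direction; return None while the touch stays inside the
    dead zone and therefore indicates no direction."""
    dx = touch_point[0] - center[0]
    dy = touch_point[1] - center[1]
    length = math.hypot(dx, dy)
    if length < dead_zone:
        return None
    # Normalise so only the orientation of the stick matters,
    # not how far it is pushed.
    return (dx / length, dy / length)
```

For instance, a touch point straight above the stick center yields the unit vector pointing straight up, matching the "due north" example in the text.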
In a preferred embodiment of the present disclosure, the first touch operation includes a slide operation, and the slide operation includes any one of a direct slide operation, a slide operation after a long press, or a slide operation after a hard press, where the slide operation has a corresponding touch point.
Specifically, the first touch operation may include a slide operation with a corresponding touch point, the touch point being the position where the operation medium touches the graphical user interface. The slide operation includes any one of a direct slide operation, a slide operation after a long press, or a slide operation after a hard press. A direct slide operation is an operation in which the operation medium slides directly on the display screen of the mobile terminal; a slide operation after a long press is an operation in which the operation medium first presses the display screen for a preset duration (a preset length of time, such as 2 seconds) and then slides on it; a slide operation after a hard press is an operation in which the operation medium slides on the display screen after the pressure it applies reaches a preset pressure (a preset pressure value).
In a preferred embodiment of the present disclosure, the trigger condition includes: it is detected that the touch point of the slide operation slides out of the movement control region, or the touch point moves to a designated position, or the movement distance of the touch point is greater than a preset distance threshold, or the movement speed of the touch point is greater than a preset speed threshold, or the pressing time of the touch point is greater than a preset time threshold, or the pressing force of the touch point is greater than a preset pressure threshold.
The first touch operation includes a slide operation, and the slide operation has a corresponding touch point. In the embodiments of the present disclosure, the trigger condition may be set as any one of the above alternatives. The designated position may be a preset position, for example the edge of the movement control region: when the touch point moves to the edge of the movement control region, the first touch operation is considered to satisfy the trigger condition. The preset distance threshold may be a preset critical distance, for example 10 centimeters: when the movement distance of the touch point exceeds 10 centimeters, the trigger condition is considered satisfied. Likewise, the preset speed threshold may be a preset critical speed, for example 10 centimeters per second; the preset time threshold may be a preset critical pressing duration, for example 2 seconds; and the preset pressure threshold may be a preset critical pressure, for example 10 newtons. In each case, exceeding the threshold means the first touch operation satisfies the trigger condition.
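Because the listed trigger conditions are alternatives, a single disjunction over the measured touch properties suffices to test them. The sketch below is an illustration under assumed units (centimeters, seconds, newtons) and hypothetical type and function names; it is not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class TouchState:
    out_of_region: bool      # touch point slid out of the movement control region
    at_designated_pos: bool  # touch point reached the designated position (e.g. region edge)
    distance_cm: float       # movement distance of the touch point
    speed_cm_s: float        # movement speed of the touch point
    press_s: float           # pressing time of the touch point
    force_n: float           # pressing force of the touch point

def trigger_condition_met(tp: TouchState,
                          dist_threshold=10.0,
                          speed_threshold=10.0,
                          time_threshold=2.0,
                          force_threshold=10.0) -> bool:
    """Return True as soon as any one of the alternative conditions holds."""
    return (tp.out_of_region
            or tp.at_designated_pos
            or tp.distance_cm > dist_threshold
            or tp.speed_cm_s > speed_threshold
            or tp.press_s > time_threshold
            or tp.force_n > force_threshold)
```

A slide that travels 11 cm would trigger via the distance branch alone, even though none of the other conditions hold.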
In a preferred embodiment of the present disclosure, the designated region is located above the movement control region. Specifically, the designated region may be located a certain distance above the upper edge of the movement control region, for example 5 centimeters above it.
In a preferred embodiment of the present disclosure, the graphical user interface further provides a game scene thumbnail, and the designated region is a display region of the graphical user interface other than the game scene thumbnail.
The game scene thumbnail may be placed at a specific position in the graphical user interface, such as the upper left or upper right corner. The designated region may be any display region of the graphical user interface other than the game scene thumbnail.
In addition, the graphical user interface may further include a settings entry through which the player can freely set the position of the designated region in the graphical user interface.
In a preferred embodiment of the present disclosure, the location identifiers include a static location identifier, and the static location identifier corresponds to a static location in the game scene.
In the embodiments of the present disclosure, the location identifiers may include a static location identifier corresponding to a static location in the game scene, that is, a fixed location, such as the location of a certain shop in the game scene.
In a preferred embodiment of the present disclosure, the location identifiers include a dynamic location identifier, and the dynamic location identifier corresponds to the dynamic location of a second virtual character in the game scene.
In the embodiments of the present disclosure, the game scene may further contain a second virtual character, which may be a game virtual character on the same team as the first virtual character (that is, a teammate of the first virtual character), or may be an NPC in the game scene.
In some games, the player needs to go to a teammate's location to complete a task, or needs to interact with an NPC in the game scene while carrying out a task. Therefore, a dynamic location identifier may be set to correspond to the dynamic location of the second virtual character in the game scene, so that the player can quickly and automatically find a path to that dynamic location.
In a preferred embodiment of the present disclosure, the location identifier includes the avatar of the second virtual character. When a location identifier corresponds to the dynamic location of a second virtual character in the game scene, the location identifier may be the avatar of the second virtual character, so that the player can better distinguish the location identifiers, further speeding up the selection of a pathfinding destination. As shown in Fig. 4, the arc-shaped region above the movement control region contains three location identifiers, corresponding respectively to the avatars of three second virtual characters.
In a preferred embodiment of the present disclosure, the method may further include the following step:
determining the avatar of the second virtual character as a dynamic location identifier according to the position or state attributes of the second virtual character.
Optionally, in some games, whether the avatar of the second virtual character is determined as a dynamic location identifier may depend on whether the second virtual character's position has entered a specific area. For example, when the second virtual character (such as a teammate) enters the jungle area, a dynamic location identifier may be set to correspond to the second virtual character's dynamic location in the game scene, and the player can quickly find a path to that location to rescue the teammate or cooperate in battle. Optionally, this may also be decided according to the state attributes of the second virtual character, which may include the second virtual character's health points, skill cooldown (CD) time, level, being attacked or attacking an enemy, and so on. For example, when the second virtual character is at low health (health below a specified value), a dynamic location identifier may be set to correspond to its dynamic location in the game scene, so that the player can quickly find a path to it to rescue the character or cooperate in battle.
In a preferred embodiment of the present disclosure, the avatar of the second virtual character is determined as a dynamic location identifier when the position of the first virtual character and the position of the second virtual character satisfy a preset positional relationship, or when the second virtual character is under virtual attack.
The preset positional relationship may be a preset relationship between the positions of the two virtual characters, for example behind, to the left of, or to the right of. In some games, the player needs to go to a teammate's location to complete a task; if the second virtual character is behind the first virtual character controlled by the player, satisfying the preset positional relationship, a dynamic location identifier may be set to correspond to the second virtual character's dynamic location in the game scene. In this way, the player can promptly notice a second virtual character in a non-visible area and quickly find a path to its dynamic location. In other games, a dynamic location identifier may be set to correspond to the second virtual character's dynamic location when the second virtual character is under virtual attack from an enemy unit, so that when a teammate is in danger the player can quickly find a path to the teammate's dynamic location to rescue them, improving the game experience.
In a preferred embodiment of the present disclosure, the graphical user interface further provides a game scene thumbnail;
before step 102, the method may further include the following step:
selecting at least one pathfinding location on the game scene thumbnail.
Specifically, the player may perform a trigger operation on the game scene thumbnail, where the trigger operation is an operation for selecting a pathfinding location, such as a click operation or a long-press operation. After receiving the trigger operation acting on the game scene thumbnail, the game application may respond to it and select at least one pathfinding location on the game scene thumbnail according to the trigger position of the operation. Each selected pathfinding location may have a one-to-one corresponding location identifier, and these location identifiers may be displayed on the game scene thumbnail.
As shown in Fig. 5, the graphical user interface includes a movement control region (with a virtual joystick control) and a game scene thumbnail corresponding to the game scene; the thumbnail is located in the upper right corner of the graphical user interface and is displayed as a minimap. The player can expand the game scene thumbnail through a trigger operation on it; the expanded thumbnail is shown in Fig. 6, in which the player has performed three trigger operations on the expanded thumbnail and set three pathfinding locations.
After setting at least one pathfinding location, the player may close the game scene thumbnail by clicking a region of the graphical user interface outside the thumbnail; alternatively, the expanded thumbnail may contain a close control, and the player closes the thumbnail by clicking it. Fig. 7 shows a schematic diagram of closing the game scene thumbnail shown in Fig. 6.
In the embodiments of the present disclosure, after at least one pathfinding location is selected, if it is detected that the game scene thumbnail switches from the expanded state to the non-expanded state, location identifiers in one-to-one correspondence with the pathfinding locations may be provided in the designated region of the graphical user interface. The player can thus select a location identifier from the designated region and control the first virtual character to automatically find a path to the corresponding location, further speeding up the control of automatic pathfinding.
In a preferred embodiment of the present disclosure, step 102 may include the following sub-step:
when the first touch operation satisfies the trigger condition, providing, in the designated region of the graphical user interface, location identifiers in one-to-one correspondence with the pathfinding locations.
Specifically, whether the first touch operation satisfies the trigger condition is detected in real time. When it does, location identifiers in one-to-one correspondence with the pathfinding locations may be provided in the designated region of the graphical user interface, so that the player can select the location identifier corresponding to a pathfinding location from the designated region and automatically find a path to the selected pathfinding location.
In a preferred embodiment of the present disclosure, the method may further include the following steps:
in response to a trigger operation acting on the at least one pathfinding location, determining the triggered pathfinding location from the at least one pathfinding location and deleting the triggered pathfinding location; and deleting, from the designated region, the location identifier corresponding to the triggered pathfinding location.
Specifically, when the player needs to delete a selected pathfinding location, a trigger operation may be performed on at least one pathfinding location in the game scene thumbnail; the trigger operation may include a click operation, a long-press operation and so on. After receiving the trigger operation acting on the at least one pathfinding location, the game application may respond to it, determine the triggered pathfinding location from the at least one pathfinding location, and delete it. While deleting the triggered pathfinding location, the location identifier corresponding to it in the game scene thumbnail may be hidden, and furthermore, the location identifier corresponding to the triggered pathfinding location is deleted from the designated region.
In a preferred embodiment of the present disclosure, step 103 may include the following sub-step:
in response to the end of the first touch operation, determining the first target location identifier from the at least one location identifier.
Specifically, when it is detected that the operation medium performing the first touch operation (a medium for inputting device signals, such as a finger or a stylus) leaves the graphical user interface, the end of the first touch operation is considered to have been received. The end of the first touch operation may then be responded to by determining the first target location identifier from the at least one location identifier. In a specific implementation, the touch point at the moment the first touch operation ends may be determined, and the first target location identifier is determined from the at least one location identifier according to that touch point; alternatively, the route traversed by the touch point during the first touch operation may be determined, and the first target location identifier is determined from the at least one location identifier according to that route.
While the first virtual character is being moved through the first touch operation, whether the first touch operation satisfies the trigger condition can be detected in real time. When it does, the player is considered to want automatic pathfinding; at this point, at least one location identifier is provided in the designated region of the graphical user interface for the player to choose from, and the player continues, through the same first touch operation, to select the location identifier corresponding to a pathfinding destination. Then, in response to the end of the first touch operation, the first target location identifier is determined from the at least one location identifier, so that the first virtual character can be controlled to automatically find a path to the corresponding location. Thus, the whole sequence of triggering automatic pathfinding, selecting the specific pathfinding destination, and controlling the first virtual character to perform automatic pathfinding can be completed through the first touch operation alone, further speeding up automatic pathfinding and improving the player's game experience.
In a preferred embodiment of the present disclosure, the determining a first target location identifier from the at least one location identifier includes:
determining the first target location identifier from the at least one location identifier according to the touch point of the first touch operation.
Specifically, the first touch operation has a corresponding touch point; when the first touch operation satisfies the trigger condition, the first target location identifier may be determined from the at least one location identifier according to the touch point of the first touch operation.
In a preferred embodiment of the present disclosure, the determining the first target location identifier from the at least one location identifier according to the touch point of the first touch operation includes:
determining, from the at least one location identifier according to the touch point of the first touch operation, the location identifier corresponding to the touch point as the first target location identifier; or determining, from the at least one location identifier according to the touch point of the first touch operation, any location identifier corresponding to the route traversed by the touch point as the first target location identifier.
Specifically, while the first touch operation continues, its touch point may move continuously. When the first target location identifier needs to be determined, the location identifier corresponding to the final position of the touch point may be determined, from the at least one location identifier, as the first target location identifier.
Alternatively, when the first target location identifier needs to be determined, any location identifier corresponding to the route traversed by the touch point may be determined, from the at least one location identifier, as the first target location identifier. In a specific implementation, when the route traversed by the touch point corresponds to multiple location identifiers, one of the traversed location identifiers may be determined at random as the first target location identifier; when the route corresponds to only one location identifier, that location identifier may be determined as the first target location identifier.
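The route-based selection just described can be sketched as: collect every location identifier the touch route passed through, then pick one of them (at random when there are several). The marker layout, the hit radius, and the function name are assumptions for illustration, not the disclosed implementation.

```python
import math
import random

def select_by_route(route_points, markers, hit_radius=20.0):
    """Return one location marker whose on-screen position the touch
    route passed through (within hit_radius of some route point);
    None when the route crossed no marker."""
    crossed = [
        m for m in markers
        if any(math.hypot(m["x"] - x, m["y"] - y) <= hit_radius
               for (x, y) in route_points)
    ]
    if not crossed:
        return None
    if len(crossed) == 1:
        return crossed[0]
    # Several markers were crossed: any one of them qualifies,
    # so one may be chosen at random, as the text allows.
    return random.choice(crossed)
```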
In a preferred embodiment of the present disclosure, the determining the first target location identifier according to the touch point of the first touch operation includes:
determining the first target location identifier from the at least one location identifier according to the position or movement direction of the touch point.
Specifically, the location identifier corresponding to the position of the touch point, or the location identifier nearest to that position, may be determined from the at least one location identifier as the first target location identifier. In addition, the location identifier that the movement direction of the touch point points toward may be determined from the at least one location identifier as the first target location identifier. In this way, the first target location identifier can be selected without the touch point having to move all the way to the position of a location identifier, further speeding up the control of automatic pathfinding.
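Both variants, picking the identifier nearest the touch position and picking the identifier the movement direction points at within some angular tolerance, can be sketched as below. The function names, the dict layout of the markers, and the 30° tolerance are illustrative assumptions.

```python
import math

def nearest_marker(point, markers):
    """Pick the marker closest to the touch point's position."""
    return min(markers,
               key=lambda m: math.hypot(m["x"] - point[0],
                                        m["y"] - point[1]))

def marker_in_direction(point, direction, markers, max_angle_deg=30.0):
    """Pick the marker that the touch point's movement direction points
    at, within an angular tolerance, so the touch need not travel all
    the way to the marker itself; None when no marker is in range."""
    dir_len = math.hypot(*direction)
    best, best_angle = None, max_angle_deg
    for m in markers:
        vx, vy = m["x"] - point[0], m["y"] - point[1]
        dist = math.hypot(vx, vy)
        if dist == 0:
            return m  # touch point is already on the marker
        cos_a = (vx * direction[0] + vy * direction[1]) / (dist * dir_len)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle <= best_angle:
            best, best_angle = m, angle
    return best
```

The direction-based variant is what lets the selection finish before the touch point ever reaches a marker.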
In a preferred embodiment of the present disclosure, the method may further include the following step:
during the automatic pathfinding, in response to a second touch operation acting on the movement control region, controlling the first virtual character to stop the automatic pathfinding.
Specifically, while the first virtual character is performing automatic pathfinding, the player can interrupt it through a second touch operation on the movement control region. The second touch operation may be an operation for interrupting automatic pathfinding, such as a click operation or a double-click operation, which is not limited in the embodiments of the present disclosure. Restricting this operation to the movement control region prevents the automatic pathfinding from being interrupted by an accidental touch on other regions of the graphical user interface when the player does not intend to interrupt it, reducing the probability of misoperation.
After receiving the second touch operation acting on the movement control region, the game application may respond to it and control the virtual character to stop the automatic pathfinding toward the location corresponding to the first target location identifier.
In a preferred embodiment of the present disclosure, the method may further include the following step:
displaying automatic pathfinding prompt information in the designated region.
Specifically, after the first virtual character is controlled to perform automatic pathfinding, automatic pathfinding prompt information may be displayed in the designated region. The prompt information may include the name of the location corresponding to the first target location identifier, to remind the player of the destination currently being pathfound to and avoid the unnecessary time cost of going to a wrong location. For example, the prompt information may be "Auto-pathfinding to the airport". As shown in Fig. 8, while the first virtual character performs automatic pathfinding, the prompt information displayed in the designated region is "Auto-pathfinding to XXX".
In a preferred embodiment of the present disclosure, the method may further include the following step:
in response to a third touch operation acting on the designated region, determining a second target location identifier from remaining location identifiers, where the remaining location identifiers are the location identifiers, among the at least one location identifier, other than the first target location identifier.
Specifically, while the first virtual character is performing automatic pathfinding, the player can switch the pathfinding destination through a third touch operation on the designated region. The third touch operation is an operation for switching the pathfinding destination, such as a double-click operation, a slide operation or a click operation, which is not limited in the embodiments of the present disclosure.
After receiving the third touch operation acting on the designated region, the game application may respond to it and determine a second target location identifier from the remaining location identifiers, the remaining location identifiers being the location identifiers, among the at least one location identifier, other than the first target location identifier. As an example, if the designated region contains four location identifiers A, B, C and D, and the first target location identifier is A, then the remaining location identifiers are B, C and D.
While the first virtual character is performing automatic pathfinding, the player can select a second target location identifier directly through a third touch operation on the designated region, so that the pathfinding destination of the first virtual character is switched from the location corresponding to the first target location identifier to the location corresponding to the second target location identifier. Switching destinations thus does not require opening the game scene thumbnail again, which avoids interrupting the player's current game behavior, reduces the occlusion of the scene caused by opening the thumbnail to select a pathfinding destination, solves the problem that the pathfinding destination cannot be switched quickly, and improves the player's game experience. Furthermore, since the automatic pathfinding state can be quickly resumed, the player can freely interrupt the automatic pathfinding currently in progress, improving the player's freedom in the game.
It should be noted that, for simplicity of description, the method embodiments are all expressed as a series of action combinations; however, those skilled in the art should know that the embodiments of the present disclosure are not limited by the described order of actions, because according to the embodiments of the present disclosure, certain steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions involved are not necessarily required by the embodiments of the present disclosure.
Referring to Fig. 9, a structural block diagram of an embodiment of a pathfinding control device in a game according to the present disclosure is shown. A graphical user interface is provided through a mobile terminal, content displayed on the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene, and the graphical user interface provides a movement control region. The device may specifically include the following modules:
a movement control module 901, configured to control, in response to a first touch operation acting on the movement control region, the first virtual character to move in the game scene;
a location identifier providing module 902, configured to provide, in response to the first touch operation satisfying a trigger condition, at least one location identifier in a designated region of the graphical user interface, where the location identifier corresponds to a location in the game scene;
a location identifier determining module 903, configured to determine a first target location identifier from the at least one location identifier; and
an automatic pathfinding control module 904, configured to control the first virtual character to automatically find a path in the game scene to the location corresponding to the first target location identifier.
In a preferred embodiment of the present disclosure, the movement control module 901 includes:
a movement control sub-module, configured to determine, in response to the first touch operation acting on the movement control region, a movement direction, and to control the first virtual character to move in the game scene according to the movement direction.
In a preferred embodiment of the present disclosure, the first touch operation includes a slide operation, and the slide operation includes any one of a direct slide operation, a slide operation after a long press, or a slide operation after a hard press, where the slide operation has a corresponding touch point.
In a preferred embodiment of the present disclosure, the trigger condition includes: it is detected that the touch point of the slide operation slides out of the movement control region, or the touch point moves to a designated position, or the movement distance of the touch point is greater than a preset distance threshold, or the movement speed of the touch point is greater than a preset speed threshold, or the pressing time of the touch point is greater than a preset time threshold, or the pressing force of the touch point is greater than a preset pressure threshold.
In a preferred embodiment of the present disclosure, the designated region is located above the movement control region.
In a preferred embodiment of the present disclosure, the graphical user interface further provides a game scene thumbnail; the designated region is a display region of the graphical user interface other than the game scene thumbnail.
In a preferred embodiment of the present disclosure, the location identifiers include a static location identifier, and the static location identifier corresponds to a static location in the game scene.
In a preferred embodiment of the present disclosure, the location identifiers include a dynamic location identifier, and the dynamic location identifier corresponds to the dynamic location of a second virtual character in the game scene.
In a preferred embodiment of the present disclosure, the location identifier includes the avatar of the second virtual character.
In a preferred embodiment of the present disclosure, the graphical user interface further provides a game scene thumbnail; the device further includes:
a pathfinding location selecting module, configured to select at least one pathfinding location on the game scene thumbnail;
the location identifier providing module 902 includes:
a location identifier providing sub-module, configured to provide, when the first touch operation satisfies the trigger condition, location identifiers in one-to-one correspondence with the pathfinding locations in the designated region of the graphical user interface.
In a preferred embodiment of the present disclosure, the device further includes:
a pathfinding location deleting module, configured to determine, in response to a trigger operation acting on the at least one pathfinding location, the triggered pathfinding location from the at least one pathfinding location, and to delete the triggered pathfinding location; and
a location identifier deleting module, configured to delete, from the designated region, the location identifier corresponding to the triggered pathfinding location.
In a preferred embodiment of the present disclosure, the location identifier determining module 903 includes:
a first location identifier determining sub-module, configured to determine, in response to the end of the first touch operation, the first target location identifier from the at least one location identifier.
In a preferred embodiment of the present disclosure, the location identifier determining module 903 includes:
a second location identifier determining sub-module, configured to determine the first target location identifier from the at least one location identifier according to the touch point of the first touch operation.
In a preferred embodiment of the present disclosure, the second location identifier determining sub-module includes:
a first location identifier determining unit, configured to determine, from the at least one location identifier according to the touch point of the first touch operation, the location identifier corresponding to the touch point as the first target location identifier; or,
a second location identifier determining unit, configured to determine, from the at least one location identifier according to the touch point of the first touch operation, any location identifier corresponding to the route traversed by the touch point as the first target location identifier.
In a preferred embodiment of the present disclosure, the second location identifier determining sub-module includes:
a third location identifier determining unit, configured to determine the first target location identifier from the at least one location identifier according to the position or movement direction of the touch point.
In a preferred embodiment of the present disclosure, the device further includes:
an automatic pathfinding stopping module, configured to control, during the automatic pathfinding, in response to a second touch operation acting on the movement control region, the first virtual character to stop the automatic pathfinding.
In a preferred embodiment of the present disclosure, the device further includes:
a second location identifier determining module, configured to determine, in response to a third touch operation acting on the designated region, a second target location identifier from remaining location identifiers, where the remaining location identifiers are the location identifiers, among the at least one location identifier, other than the first target location identifier.
As for the device embodiment, since it is basically similar to the method embodiment, the description is relatively brief; for relevant parts, refer to the description of the method embodiment.
An embodiment of the present disclosure further provides an electronic device, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon which, when executed by the one or more processors, cause the electronic device to perform the method according to any one of the embodiments of the present disclosure.
An embodiment of the present disclosure further provides a computer-readable storage medium having instructions stored thereon which, when executed by one or more processors, cause the processors to perform the method according to any one of the embodiments of the present disclosure.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts between the embodiments, reference may be made from one to another.
Those skilled in the art should understand that the embodiments of the present disclosure may be provided as a method, a device, or a computer program product. Therefore, the embodiments of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present disclosure may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The embodiments of the present disclosure are described with reference to flowcharts and/or block diagrams of the method, terminal device (system), and computer program product according to the embodiments of the present disclosure. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing terminal device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing terminal device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing terminal device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing terminal device, such that a series of operation steps are performed on the computer or other programmable terminal device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable terminal device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the embodiments of the present disclosure have been described, those skilled in the art can make additional changes and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the embodiments of the present disclosure.
Finally, it should also be noted that in this document, relational terms such as first and second are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or terminal device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or terminal device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or terminal device that includes the element.
The pathfinding control method in a game and the pathfinding control device in a game provided by the present disclosure have been introduced in detail above. Specific examples are used herein to illustrate the principles and implementations of the present disclosure, and the description of the above embodiments is only intended to help understand the method of the present disclosure and its core idea. At the same time, for those of ordinary skill in the art, there will be changes in the specific implementation and the scope of application according to the idea of the present disclosure. In summary, the content of this specification should not be construed as limiting the present disclosure.

Claims (20)

  1. A pathfinding control method in a game, wherein a graphical user interface is provided through a mobile terminal, content displayed on the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene, and the graphical user interface provides a movement control region, the method comprising:
    in response to a first touch operation acting on the movement control region, controlling the first virtual character to move in the game scene;
    when the first touch operation satisfies a trigger condition, providing at least one location identifier in a designated region of the graphical user interface, the location identifier corresponding to a location in the game scene;
    determining a first target location identifier from the at least one location identifier; and
    controlling the first virtual character to automatically find a path in the game scene to the location corresponding to the first target location identifier.
  2. The method according to claim 1, wherein the controlling, in response to a first touch operation acting on the movement control region, the first virtual character to move in the game scene comprises:
    in response to the first touch operation acting on the movement control region, determining a movement direction, and controlling the first virtual character to move in the game scene according to the movement direction.
  3. The method according to claim 1, wherein the first touch operation comprises a slide operation, the slide operation comprises any one of a direct slide operation, a slide operation after a long press, or a slide operation after a hard press, and the slide operation has a corresponding touch point.
  4. The method according to claim 3, wherein the trigger condition comprises: it is detected that the touch point of the slide operation slides out of the movement control region, or the touch point moves to a designated position, or the movement distance of the touch point is greater than a preset distance threshold, or the movement speed of the touch point is greater than a preset speed threshold, or the pressing time of the touch point is greater than a preset time threshold, or the pressing force of the touch point is greater than a preset pressure threshold.
  5. The method according to claim 1, wherein the designated region is located above the movement control region.
  6. The method according to claim 1, wherein the graphical user interface further provides a game scene thumbnail; the designated region is a display region of the graphical user interface other than the game scene thumbnail.
  7. The method according to claim 1, wherein the location identifiers comprise a static location identifier, and the static location identifier corresponds to a static location in the game scene.
  8. The method according to claim 1, wherein the location identifiers comprise a dynamic location identifier, and the dynamic location identifier corresponds to a dynamic location of a second virtual character in the game scene.
  9. The method according to claim 8, wherein the location identifier comprises the avatar of the second virtual character.
  10. The method according to claim 1, wherein the graphical user interface further provides a game scene thumbnail;
    before the providing, when the first touch operation satisfies a trigger condition, at least one location identifier in a designated region of the graphical user interface, the method comprises:
    selecting at least one pathfinding location on the game scene thumbnail;
    the providing, when the first touch operation satisfies a trigger condition, at least one location identifier in a designated region of the graphical user interface comprises:
    when the first touch operation satisfies the trigger condition, providing, in the designated region of the graphical user interface, location identifiers in one-to-one correspondence with the pathfinding locations.
  11. The method according to claim 10, further comprising:
    in response to a trigger operation acting on the at least one pathfinding location, determining the triggered pathfinding location from the at least one pathfinding location, and deleting the triggered pathfinding location; and
    deleting, from the designated region, the location identifier corresponding to the triggered pathfinding location.
  12. The method according to claim 1, wherein the determining a first target location identifier from the at least one location identifier comprises:
    in response to the end of the first touch operation, determining the first target location identifier from the at least one location identifier.
  13. The method according to claim 1, wherein the determining a first target location identifier from the at least one location identifier comprises:
    determining the first target location identifier from the at least one location identifier according to a touch point of the first touch operation.
  14. The method according to claim 13, wherein the determining the first target location identifier from the at least one location identifier according to a touch point of the first touch operation comprises:
    determining, from the at least one location identifier according to the touch point of the first touch operation, the location identifier corresponding to the touch point as the first target location identifier; or,
    determining, from the at least one location identifier according to the touch point of the first touch operation, any location identifier corresponding to a route traversed by the touch point as the first target location identifier.
  15. The method according to claim 13, wherein the determining the first target location identifier from the at least one location identifier according to a touch point of the first touch operation comprises:
    determining the first target location identifier from the at least one location identifier according to a position or a movement direction of the touch point.
  16. The method according to claim 1, further comprising:
    during the automatic pathfinding, in response to a second touch operation acting on the movement control region, controlling the first virtual character to stop the automatic pathfinding.
  17. The method according to claim 1, further comprising:
    in response to a third touch operation acting on the designated region, determining a second target location identifier from remaining location identifiers, the remaining location identifiers being the location identifiers, among the at least one location identifier, other than the first target location identifier.
  18. A pathfinding control device in a game, wherein a graphical user interface is provided through a mobile terminal, content displayed on the graphical user interface includes at least part of a game scene and a first virtual character located in the game scene, and the graphical user interface provides a movement control region, the device comprising:
    a movement control module, configured to control, in response to a first touch operation acting on the movement control region, the first virtual character to move in the game scene;
    a location identifier providing module, configured to provide, in response to the first touch operation satisfying a trigger condition, at least one location identifier in a designated region of the graphical user interface, the location identifier corresponding to a location in the game scene;
    a location identifier determining module, configured to determine a first target location identifier from the at least one location identifier; and
    an automatic pathfinding control module, configured to control the first virtual character to automatically find a path in the game scene to the location corresponding to the first target location identifier.
  19. An electronic device, comprising:
    one or more processors; and
    one or more machine-readable media having instructions stored thereon which, when executed by the one or more processors, cause the electronic device to perform the method according to any one of claims 1-17.
  20. A computer-readable storage medium having instructions stored thereon which, when executed by one or more processors, cause the processors to perform the method according to any one of claims 1-17.
PCT/CN2021/081063 2020-07-06 2021-03-16 Pathfinding control method and device in game WO2022007429A1

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/004,308 US20230219000A1 (en) 2020-07-06 2021-03-16 Pathfinding control method and device in game

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010642984.9A 2020-07-06 Pathfinding control method and device in game
CN202010642984.9 2020-07-06



Also Published As

US20230219000A1, published 2023-07-13
CN111760268B, granted 2021-06-08
CN111760268A, published 2020-10-13

