CN112121422B - Interface display method, device, equipment and storage medium - Google Patents

Interface display method, device, equipment and storage medium

Info

Publication number
CN112121422B
CN112121422B (Application CN202011060668.7A)
Authority
CN
China
Prior art keywords
target
virtual object
virtual
controlled
path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011060668.7A
Other languages
Chinese (zh)
Other versions
CN112121422A (en)
Inventor
练建锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202011060668.7A
Publication of CN112121422A
Application granted
Publication of CN112121422B

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/55 — Controlling game characters or game objects based on the game progress
    • A63F13/56 — Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/80 — Special adaptations for executing a specific game genre or game mode
    • A63F13/837 — Shooting of targets
    • A63F13/847 — Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal

Abstract

The application discloses an interface display method, apparatus, device, and storage medium, belonging to the field of computer technology. In the embodiments of the application, on the one hand, when the controlled virtual object is located outside the target area, a target path is planned for the controlled virtual object to assist its movement to the target area, so the user does not need to work out a route alone; this makes operation more convenient and less difficult. On the other hand, direction indication information visually and clearly indicates the direction in which the controlled virtual object needs to move, so that it can follow the target path. This reduces the difficulty of controlling the movement of the controlled virtual object, enriches the information displayed in the interface, increases the amount of information the interface conveys, and improves the display effect.

Description

Interface display method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an interface display method, apparatus, device, and storage medium.
Background
With the development of computer technology and the diversification of terminal functions, more and more games can be played on terminals. Shooting games are among the more popular: the terminal can display a virtual scene in the interface and display virtual characters in that scene.
At present, in some scenarios a user needs to control a virtual object to move to a certain position or area in the virtual scene. While moving, the user must determine the moving direction from a minimap of the virtual scene and repeatedly check that minimap to confirm whether the object has reached the position or area.
If the user has a poor sense of direction or is unfamiliar with the game, the virtual object cannot be accurately controlled to move to the position or area, so the control effect is poor; moreover, the interface displays little information, so the display effect is also poor.
Disclosure of Invention
The embodiments of the application provide an interface display method, apparatus, device, and storage medium, which can reduce the difficulty of user operation and improve the interface display effect. The technical solutions provided in the present application are explained below.
In one aspect, an interface display method is provided, and the method includes:
acquiring a target area and a current first position of a controlled virtual object, wherein the virtual life value of a virtual object located outside the target area decreases continuously;
in response to the first position being located outside the target area, acquiring a target path in a virtual scene according to the first position and the target area, the target path being a path from the first position to the target area in the virtual scene; and
displaying, in the visual field picture of the controlled virtual object and as the position of the controlled virtual object changes, direction indication information corresponding to the position of the controlled virtual object according to the target path and the visual angle of the virtual scene, the direction indication information being used for indicating the moving direction of the controlled virtual object when moving to the target area.
In one possible implementation, the direction indication information is displayed on a target display position in a visual field of the controlled virtual object.
The displaying, in the visual field picture of the controlled virtual object, direction indication information corresponding to the position of the controlled virtual object according to the target path and the visual angle of the virtual scene as the position of the controlled virtual object changes includes:
along with the position change of the controlled virtual object, displaying direction indication information corresponding to the position of the controlled virtual object on a target display position in a visual field picture of the controlled virtual object according to the target path and the visual angle of the virtual scene.
In a possible implementation manner, the displaying, at a target display position in a visual field picture of the controlled virtual object according to the target path and the perspective of the virtual scene, direction indication information corresponding to a position where the controlled virtual object is located includes:
determining the direction indication information and a target display position of the direction indication information according to the target path, the position of the controlled virtual object and the visual angle of the virtual scene;
and displaying the direction indication information on the target display position in the visual field picture of the controlled virtual object.
In one possible implementation manner, the target display position is determined based on the target path, the position of the controlled virtual object, and the viewing angle of the virtual scene.
In one possible implementation, the target display position is determined according to the type of the direction indication information.
In one possible implementation, the direction indication information includes first direction indication information, second direction indication information, and third direction indication information. The target display position corresponding to the first direction indication information is a first target display position, the target display position corresponding to the second direction indication information is a second target display position, and the target display position corresponding to the third direction indication information is a third target display position.
In one possible implementation manner, the displaying the direction indication information corresponding to the position of the controlled virtual object includes:
and displaying the direction indication information according to the target display style.
In one possible implementation manner, the displaying the direction indication information according to the target display style includes:
and displaying the direction indication information according to the target special effect.
In one possible implementation manner, the displaying the direction indication information according to the target display style includes:
and displaying the direction indication information according to the target transparency.
In one possible implementation, the target display style is determined according to a style adjustment operation.
In one aspect, an interface display apparatus is provided, the apparatus including:
the first acquisition module is configured to acquire a target area and a current first position of a controlled virtual object, where the virtual life value of a virtual object located outside the target area decreases continuously;
a second obtaining module, configured to obtain, in response to the first position being located outside the target area, a target path in a virtual scene according to the first position and the target area, where the target path is a path from the first position to the target area in the virtual scene; and
and a display module, configured to display, in a visual field picture of the controlled virtual object, direction indication information corresponding to a position of the controlled virtual object according to the target path and a viewing angle of the virtual scene along with a change in the position of the controlled virtual object, where the direction indication information is used to indicate a moving direction of the controlled virtual object when the controlled virtual object moves to the target area.
In one possible implementation, the display module is to:
in response to the controlled virtual object being located on the target path, displaying, in the visual field picture of the controlled virtual object, direction indication information corresponding to a first relationship between the visual angle of the virtual scene and the direction indicated by the target path; and
in response to the controlled virtual object being located outside the target path, displaying, in the visual field picture of the controlled virtual object, direction indication information corresponding to a second relationship between the visual angle of the virtual scene and a target connecting line direction, where the target connecting line direction is the direction of the shortest connecting line between the target path and the controlled virtual object.
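The target connecting line direction can be obtained by projecting the controlled object's position onto each segment of the target path and taking the nearest foot point. The following Python sketch is illustrative only (the function name and 2D polyline representation are assumptions, not the patent's implementation):

```python
import math

def connecting_line_direction(position, path):
    """Heading (in degrees) of the target connecting line: project the
    controlled object's position onto every segment of the target path,
    keep the nearest foot point, and return the direction from the
    object toward that point."""
    px, py = position
    best, best_d = None, float("inf")
    for (ax, ay), (bx, by) in zip(path, path[1:]):
        dx, dy = bx - ax, by - ay
        seg_len2 = dx * dx + dy * dy
        # Clamp the projection parameter so the foot point stays on the segment.
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
        qx, qy = ax + t * dx, ay + t * dy
        d = math.hypot(px - qx, py - qy)
        if d < best_d:
            best, best_d = (qx, qy), d
    return math.degrees(math.atan2(best[1] - py, best[0] - px))
```

For example, an object at (5, 3) beside a straight path from (0, 0) to (10, 0) gets a connecting line pointing straight down (heading −90°) toward the nearest point (5, 0).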
In one possible implementation, the display module is configured to:
in response to the visual angle of the virtual scene being consistent with the direction indicated by the target path, displaying first direction indication information in the visual field picture of the controlled virtual object, the first direction indication information being used for indicating that the controlled virtual object moves forward; and
in response to the visual angle of the virtual scene being inconsistent with the direction indicated by the target path, displaying, according to a target rotation direction, second direction indication information corresponding to the target rotation direction in the visual field picture of the controlled virtual object, the second direction indication information being used for indicating that the controlled virtual object rotates leftwards or rightwards, where the visual angle of the virtual scene becomes consistent with the direction indicated by the target path after rotating along the target rotation direction by a target angle smaller than 180 degrees.
in one possible implementation, the display module is configured to:
in response to the visual angle of the virtual scene being consistent with the target connecting line direction, displaying first direction indication information in the visual field picture of the controlled virtual object, the first direction indication information being used for indicating that the controlled virtual object moves forward; and
in response to the visual angle of the virtual scene being inconsistent with the target connecting line direction, displaying, according to a target rotation direction, second direction indication information corresponding to the target rotation direction in the visual field picture of the controlled virtual object, the second direction indication information being used for indicating that the controlled virtual object rotates leftwards or rightwards, where the visual angle of the virtual scene becomes consistent with the target connecting line direction after rotating along the target rotation direction by a target angle smaller than 180 degrees.
In one possible implementation, the display module is configured to:
in response to the visual angle of the virtual scene becoming consistent with the direction indicated by the target path or with the target connecting line direction after rotating the target angle counterclockwise, displaying third direction indication information in the visual field picture of the controlled virtual object, the third direction indication information being used for indicating that the controlled virtual object rotates leftwards; and
in response to the visual angle of the virtual scene becoming consistent with the direction indicated by the target path or with the target connecting line direction after rotating the target angle clockwise, displaying fourth direction indication information in the visual field picture of the controlled virtual object, the fourth direction indication information being used for indicating that the controlled virtual object rotates rightwards.
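The forward/left/right decision described above amounts to taking the signed angular difference between the view heading and the target heading: a counterclockwise rotation of less than 180 degrees means "rotate left", a clockwise one means "rotate right". A minimal Python sketch (the function name and tolerance value are illustrative assumptions, not part of the claimed solution):

```python
def direction_hint(view_angle_deg, target_angle_deg, tolerance_deg=5.0):
    """Return which indicator to show: 'forward', 'left', or 'right'.

    Angles are world-space headings in degrees. The signed difference is
    normalized to [-180, 180); a positive value means the view reaches
    the target direction by rotating counterclockwise (left) through a
    target angle smaller than 180 degrees, a negative value by rotating
    clockwise (right)."""
    delta = (target_angle_deg - view_angle_deg + 180.0) % 360.0 - 180.0
    if abs(delta) <= tolerance_deg:
        return "forward"  # view already consistent with the target direction
    return "left" if delta > 0 else "right"
```

For example, a view heading of 10° with a target heading of 350° yields a signed difference of −20°, so the right-rotation indicator is shown rather than a 340° left turn.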
In one possible implementation, the display module is configured to:
in response to the controlled virtual object being located on the target path, acquiring a first included angle between the visual angle of the virtual scene and the direction indicated by the target path, and displaying direction indication information corresponding to the first included angle in the visual field picture of the controlled virtual object; and
in response to the controlled virtual object being located outside the target path, acquiring a second included angle between the visual angle of the virtual scene and the target connecting line direction, and displaying direction indication information corresponding to the second included angle in the visual field picture of the controlled virtual object.
In one possible implementation, the target path is obtained by connecting passable points in the virtual scene.
In one possible implementation manner, the second obtaining module is configured to:
acquiring a map of the virtual scene, the map including the passable places in the virtual scene;
performing path finding with the first position in the map as a starting point and any position in the target area as an end point, according to the passable places between the starting point and the end point in the map, to obtain at least one candidate path; and
taking the shortest of the at least one candidate path as the target path.
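The drawings describe this path finding with a breadth-first search (FIG. 4). On a grid of passable cells, BFS explores cells in order of hop count, so the first goal cell reached yields a shortest candidate path. The grid representation, 4-connected neighborhood, and function names below are illustrative assumptions, not the patent's implementation:

```python
from collections import deque

def bfs_path(passable, start, goal_cells):
    """Breadth-first search over a grid map.

    `passable` is a set of (x, y) cells the character can traverse,
    `start` is the first position, and `goal_cells` are the cells inside
    the target area. Returns a shortest cell sequence from start into
    the target area, or None if the area is unreachable."""
    queue = deque([start])
    parent = {start: None}
    while queue:
        cell = queue.popleft()
        if cell in goal_cells:
            path = []
            while cell is not None:  # walk parent links back to the start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in passable and nxt not in parent:
                parent[nxt] = cell
                queue.append(nxt)
    return None  # target area unreachable from the first position
```

Because BFS visits cells in nondecreasing distance from the start, no separate "pick the shortest candidate" pass is needed in this simplified form.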
In one possible implementation, the apparatus further includes:
the third acquisition module is configured to acquire, in response to a position marking operation, a second position indicated by the position marking operation; and
the second obtaining module is further configured to, in response to the second position being located in the target area, obtain a target path in the virtual scene according to the first position and the second position, where the target path is a path from the first position to the second position in the virtual scene.
In one possible implementation, the apparatus further includes:
an updating module, configured to, in response to the distance between the position of the controlled virtual object and the target path being greater than a first distance threshold, update the target path according to the position of the controlled virtual object and the target area, where the updated target path is a path from the position of the controlled virtual object to the target area in the virtual scene.
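One way to test this replanning condition is to take the minimum distance from the object's position to the segments of the target path. The following Python sketch is illustrative only (function names and the 2D polyline form are assumptions):

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Projection parameter clamped to [0, 1] keeps the point on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def needs_replanning(position, path, first_distance_threshold):
    """True when the controlled virtual object has strayed from the
    target path by more than the first distance threshold, i.e. when the
    updating module should recompute the path from the current position."""
    deviation = min(point_segment_distance(position, a, b)
                    for a, b in zip(path, path[1:]))
    return deviation > first_distance_threshold
```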
In one possible implementation, the display module is further configured to display a target virtual prop every target distance along the target path, where the target virtual prop is used to increase the virtual life value of a virtual object.
In one possible implementation manner, the display module is further configured to display that the virtual life value of the controlled virtual object is increased by a target virtual life value in response to a distance between the controlled virtual object and any of the target virtual props being smaller than a second distance threshold.
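Spacing props one target distance apart along the path can be done by walking the polyline and interpolating, and the pickup condition is a simple distance check against the second distance threshold. A hedged sketch with illustrative names (target_distance must be positive):

```python
import math

def place_props(path, target_distance):
    """Return prop positions spaced `target_distance` apart along the
    polyline `path`, by accumulating arc length and interpolating."""
    props, travelled, next_at = [], 0.0, target_distance
    for (ax, ay), (bx, by) in zip(path, path[1:]):
        seg = math.hypot(bx - ax, by - ay)
        while seg > 0 and next_at <= travelled + seg:
            t = (next_at - travelled) / seg  # fraction along this segment
            props.append((ax + t * (bx - ax), ay + t * (by - ay)))
            next_at += target_distance
        travelled += seg
    return props

def within_pickup_range(obj_pos, prop_pos, second_distance_threshold):
    """True when the controlled object is close enough to a prop for its
    virtual life value to be increased."""
    return math.hypot(obj_pos[0] - prop_pos[0],
                      obj_pos[1] - prop_pos[1]) < second_distance_threshold
```

For a straight 10-unit path with a target distance of 3, props land at arc lengths 3, 6, and 9.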
In one possible implementation, the apparatus further includes:
the first determining module is configured to acquire the rate at which the virtual life value of a virtual object outside the target area decreases, the moving speed of the controlled virtual object, and the length of the target path; and determine the target virtual life value according to the rate, the moving speed, the length, and the preset target distance.
In one possible implementation, the apparatus further includes:
the second determining module is configured to acquire the rate at which the virtual life value of a virtual object outside the target area decreases, the moving speed of the controlled virtual object, and the length of the target path; and determine the target distance according to the rate, the moving speed, and the preset target virtual life value.
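One plausible reading of these two determinations is a break-even balance: while covering one spacing interval the object loses decay_rate × (target_distance / moving_speed) life, so each prop must restore at least that much, and the same relation can be solved for the spacing. The patent also lists the path length as an input, which this simplified sketch omits; all names and the exact formula are assumptions:

```python
def target_life_value(decay_rate, moving_speed, target_distance):
    """Life each prop must restore so the controlled object breaks even
    over one spacing interval: life lost = decay_rate * (distance / speed)."""
    return decay_rate * target_distance / moving_speed

def target_spacing(decay_rate, moving_speed, target_life):
    """Inverse relation: the spacing props can have when each restores
    `target_life`, solved from the same break-even condition."""
    return target_life * moving_speed / decay_rate
```

For example, with life decaying at 2 points/s and the object moving at 4 units/s, props 10 units apart must each restore 5 points for the object to break even.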
In one aspect, an electronic device is provided and includes one or more processors and one or more memories, where at least one program code is stored in the one or more memories and loaded into and executed by the one or more processors to implement various alternative implementations of the interface display method described above.
In one aspect, a computer-readable storage medium is provided, in which at least one program code is stored, and the at least one program code is loaded and executed by a processor to implement various optional implementations of the interface display method described above.
In one aspect, a computer program product or computer program is provided that includes one or more program codes stored in a computer-readable storage medium. One or more processors of the electronic device can read the one or more program codes from the computer-readable storage medium, and the one or more processors execute the one or more program codes, so that the electronic device can execute the interface display method of any one of the above-mentioned possible embodiments.
In the embodiments of the application, on the one hand, when the controlled virtual object is located outside the target area, a target path is planned for the controlled virtual object to assist its movement to the target area, so the user does not need to work out a route alone; this makes operation more convenient and less difficult. On the other hand, direction indication information visually and clearly indicates the direction in which the controlled virtual object needs to move, so that it can follow the target path. This reduces the difficulty of controlling the movement of the controlled virtual object, enriches the information displayed in the interface, increases the amount of information the interface conveys, and improves the display effect.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of an interface display method according to an embodiment of the present disclosure;
fig. 2 is a flowchart of an interface display method provided in an embodiment of the present application;
FIG. 3 is a flowchart of an interface display method according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a breadth-first search algorithm provided by an embodiment of the present application;
fig. 5 is a schematic diagram of obtaining a target path according to an embodiment of the present application;
fig. 6 is a schematic diagram of a determination manner of direction indication information provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
fig. 8 is a schematic diagram of a determination manner of direction indication information provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
fig. 10 is a schematic diagram of a determination manner of direction indication information provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 12 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
fig. 13 is a schematic diagram of a determination manner of direction indication information according to an embodiment of the present application;
fig. 14 is a schematic diagram of a determination manner of direction indication information provided in an embodiment of the present application;
fig. 15 is a schematic diagram of a determination manner of direction indication information provided in an embodiment of the present application;
FIG. 16 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
FIG. 17 is a flowchart of an interface display method according to an embodiment of the present application;
FIG. 18 is a flowchart of an interface display method according to an embodiment of the present application;
fig. 19 is a schematic structural diagram of an interface display device according to an embodiment of the present application;
fig. 20 is a block diagram of a terminal according to an embodiment of the present disclosure;
fig. 21 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The terms "first," "second," and the like in this application are used for distinguishing between similar items and items that have substantially the same function or similar functionality, and it should be understood that "first," "second," and "nth" do not have any logical or temporal dependency or limitation on the number or order of execution. It will be further understood that, although the following description uses the terms first, second, etc. to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, the first image can be referred to as a second image, and similarly, the second image can be referred to as a first image without departing from the scope of various such examples. The first image and the second image can both be images, and in some cases, can be separate and distinct images.
The term "at least one" is used herein to mean one or more, and the term "plurality" is used herein to mean two or more, e.g., a plurality of packets means two or more packets.
It is to be understood that the terminology used in the description of the various examples herein is for the purpose of describing particular examples only and is not intended to be limiting. As used in the description of the various examples and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The term "and/or" describes an association between associated objects and indicates that three relationships can exist; for example, A and/or B can mean: A exists alone, both A and B exist, or B exists alone. In addition, the character "/" in this application generally indicates an "or" relationship between the preceding and following objects.
It should also be understood that, in the embodiments of the present application, the sequence numbers of the respective processes do not mean the execution sequence, and the execution sequence of the respective processes should be determined by the functions and the inherent logic thereof, and should not constitute any limitation to the implementation process of the embodiments of the present application.
It should also be understood that determining B from A does not mean determining B from A alone; B can also be determined from A and/or other information.
It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also understood that the term "if" may be interpreted to mean "when," "upon," "in response to a determination," or "in response to a detection." Similarly, the phrase "if it is determined…" or "if [the stated condition or event] is detected" may be interpreted to mean "upon determining…," "in response to determining…," "upon detecting [the stated condition or event]," or "in response to detecting [the stated condition or event]," depending on the context.
The terms referred to in the present application will be explained below.
Virtual scene: is a virtual scene that is displayed (or provided) by an application program when the application program runs on a terminal. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, the virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as desert, city, etc., and the user may control the virtual character to move in the virtual scene.
Virtual object: refers to an object in a virtual scene; it is an imaginary object used to simulate a real object or creature, such as the characters, animals, plants, oil drums, walls, and rocks displayed in a virtual scene. Virtual objects include virtual items and virtual characters: a virtual item is an object without the life attribute, for example a virtual building, a virtual vehicle, or a virtual prop; a virtual character is an object with the life attribute, for example a virtual person or a virtual animal.
Optionally, the virtual objects include movable virtual objects and non-movable virtual objects. Such as movable virtual vehicles, movable virtual characters, immovable virtual buildings, etc.
Virtual character: refers to an object used to simulate a person or an animal in a virtual scene. The virtual character can be a virtual person, a virtual animal, an anime character, and so on, such as the people and animals displayed in the virtual scene. The virtual character may be an avatar in the virtual scene that virtually represents the user. A virtual scene can include a plurality of virtual characters, each of which has its own shape and volume in the virtual scene and occupies part of the space of the virtual scene.
Alternatively, the virtual character may be a player character controlled through operations on the client, an artificial intelligence (AI) character set up in virtual-scene battles through training, or a non-player character (NPC) set up for interaction in the virtual scene. Alternatively, the virtual character may be a virtual character competing in the virtual scene. Optionally, the number of virtual characters participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control the virtual character to fall freely, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or move forward bent over on land; or to swim, float, or dive in the ocean. The user may also control the virtual character to move in the virtual scene by riding a virtual vehicle such as a virtual car, a virtual aircraft, or a virtual yacht; the above scenes are merely examples and are not limiting. The user can also control the virtual character to fight with other virtual characters through virtual props. The virtual props may be of many kinds, for example thrown props such as adhesive incendiary agents, grenades, bundled mines, smoke grenades, bombs, incendiary bottles, or sticky grenades, or shooting props such as machine guns, pistols, and rifles; this application does not specifically limit the types of virtual props.
The following describes an embodiment of the present application.
Fig. 1 is a schematic diagram of an implementation environment of an interface display method provided in an embodiment of the present application, and referring to fig. 1, the implementation environment includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 is installed and operated with an application program supporting a virtual scene. The application program may be any one of a First-Person shooter game (FPS), a third-Person shooter game, a Multiplayer Online Battle Arena game (MOBA), a virtual reality application program, a three-dimensional map program, or a Multiplayer gunfight type survival game. The first terminal 120 may be a terminal used by a first user, and the first user uses the first terminal 120 to operate a first virtual character located in a virtual scene for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the first avatar is a first virtual character, such as a simulated persona or an animated persona. Illustratively, the first virtual character may be a first virtual animal, such as a simulated monkey or other animal.
The first terminal 120 and the second terminal 160 are connected to the server 140 through a wireless network or a wired network.
The server 140 may include at least one of a server, a plurality of servers, a cloud computing platform, or a virtualization center. The server 140 is used to provide background services for applications that support virtual scenarios. Alternatively, the server 140 may undertake primary computational tasks and the first and second terminals 120, 160 may undertake secondary computational tasks; alternatively, the server 140 undertakes the secondary computing job and the first terminal 120 and the second terminal 160 undertake the primary computing job; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 is installed and operated with an application program supporting a virtual scene. The application program may be any one of an FPS, a third person shooter game, an MOBA, a virtual reality application program, a three-dimensional map program, or a multi-player gunfight type live game. The second terminal 160 may be a terminal used by a second user, who uses the second terminal 160 to operate a second virtual character located in the virtual scene for activities including, but not limited to: adjusting at least one of a body pose, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second avatar is a second virtual character, such as a simulated persona or an animated persona. Illustratively, the second virtual character may be a second virtual animal, such as a simulated monkey or other animal.
Optionally, the first virtual character controlled by the first terminal 120 and the second virtual character controlled by the second terminal 160 are in the same virtual scene, and the first virtual character may interact with the second virtual character in the virtual scene. In some embodiments, the first virtual character and the second virtual character may be in a hostile relationship, for example, the first virtual character and the second virtual character may belong to different teams and organizations, and the hostile virtual characters may interact with each other in a battle manner on land in a manner of shooting each other.
In other embodiments, the first avatar and the second avatar may be in a teammate relationship, for example, the first avatar and the second avatar may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to one of a plurality of terminals; this embodiment is illustrated only with the first terminal 120 and the second terminal 160. The first terminal 120 and the second terminal 160 may be of the same or different device types, including at least one of a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, and a desktop computer. For example, the first terminal 120 and the second terminal 160 may be smartphones or other handheld portable gaming devices. The following embodiments are illustrated with the terminal being a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 2 is a flowchart of an interface display method provided in an embodiment of the present application, where the method is applied to an electronic device, and the electronic device is a terminal or a server, and referring to fig. 2, taking the application of the method to a terminal as an example, the method includes the following steps.
200. The terminal acquires a target area and a current first position of a controlled virtual object, and the virtual life value of the virtual object located outside the target area is continuously reduced.
The first position is the position of the controlled virtual object in the virtual scene when the target path is determined. I.e. the position of the controlled virtual object in the virtual scene when it is necessary to determine whether the path planning of the controlled virtual object is required. The controlled virtual object is a virtual object controlled by the terminal.
For the destination area, the virtual objects located outside the destination area may be damaged, and the virtual life value of the virtual objects continuously decreases. If the virtual life value of a virtual object decreases to zero, the virtual object is eliminated from the virtual scene. When the virtual object in the target area is not attacked by other virtual objects or attack type virtual props, the virtual life value is not changed.
In some embodiments, the destination area may be referred to as a "safe area" to indicate that the virtual object is safe when located in the destination area and is damaged when located outside the destination area. For example, in an electronic game scene, at intervals, a target area is refreshed in a virtual scene, the virtual life value of a virtual object located outside the target area is continuously reduced, and the virtual object located in the target area is not affected.
In some embodiments, in step 200, the electronic device may perform the position acquisition shown in step 200 when the destination area is refreshed, so as to determine whether a path needs to be planned. Specifically, the electronic device may acquire the destination area and the first location in response to the destination area update instruction. Of course, if the path needs to be re-planned subsequently, the step 200 may be performed again to re-determine whether the path needs to be planned.
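The decision in steps 200 and 201 can be sketched as a simple containment test. The sketch below is illustrative only: it assumes a circular target area with a known center and radius, which the patent does not specify; all names (`SafeZone`, `needs_path_planning`) are invented for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class SafeZone:
    """Hypothetical circular target area: center (cx, cy) and radius."""
    cx: float
    cy: float
    radius: float

    def contains(self, x: float, y: float) -> bool:
        # Inside (or on the boundary of) the zone: the life value is safe.
        return math.hypot(x - self.cx, y - self.cy) <= self.radius

def needs_path_planning(zone: SafeZone, first_position: tuple) -> bool:
    """Plan a target path only when the first position lies outside the zone."""
    x, y = first_position
    return not zone.contains(x, y)

zone = SafeZone(cx=0.0, cy=0.0, radius=100.0)
print(needs_path_planning(zone, (30.0, 40.0)))    # distance 50 -> inside, no planning
print(needs_path_planning(zone, (120.0, 160.0)))  # distance 200 -> outside, plan a path
```

When the containment test fails, the terminal proceeds to the path acquisition of step 201; otherwise the acquired position and area can simply be discarded, as described above.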
201. And the terminal responds that the first position is positioned outside the destination area, and acquires a target path in the virtual scene according to the first position and the destination area, wherein the target path is a path from the first position to the destination area in the virtual scene.
After acquiring the first position and the target area, the terminal may determine whether the first position is located in the target area; depending on the result, the steps the terminal needs to perform may differ. It can be understood that if the first position is located in the target area, the controlled virtual object is inside the target area, its virtual life value is not affected, and naturally no path needs to be planned; the terminal may perform no further step, or may discard the acquired first position and target area. If the first position is located outside the target area, the controlled virtual object is outside the target area and its virtual life value is continuously reduced. To avoid being eliminated, the controlled virtual object needs to move to and enter the target area. Therefore, the terminal can plan a target path for the controlled virtual object to assist it in moving to the target area.
Wherein, the starting point of the target path is the first position, and the end point is the destination area. For the destination, the terminal may perform path planning by using any position in the destination area as the destination, and the determined destination of the target path is a position in the destination area. The end point may also be obtained according to a position marking operation of the user, for example, when the user marks a certain position, the terminal determines that the marked position is within the destination area, and then the end point may be used as the end point to plan a path for the user, and a target path from the first position to the marked position is planned.
The target path is a path automatically planned by the terminal for the user, and compared with a mode that the user roughly deduces the moving direction according to subjective feeling, the target path is more accurate and can assist the controlled virtual object to move, so that the user does not need to determine the path by himself, convenience is provided for the user operation, and the user operation difficulty is simplified.
202. And the terminal displays direction indication information corresponding to the position of the controlled virtual object in the visual field picture of the controlled virtual object along with the position change of the controlled virtual object according to the target path and the visual angle of the virtual scene, wherein the direction indication information is used for indicating the moving direction of the controlled virtual object when moving to the target area.
After the terminal acquires the target path, a movement instruction can be provided for a user in real time according to the position change of the virtual object in a direction instruction information mode. The direction indicating information can intuitively and clearly indicate which direction the user moves to, so that the user can directly know how to control the controlled virtual object without analyzing the target path by himself or herself.
When the controlled virtual object moves in the virtual scene, the controlled virtual object may move according to the target path, and after the position is changed, the position of the controlled virtual object is still on the target path. There is also a possible scenario: after the controlled virtual object moves, the controlled virtual object deviates from the target path, and the position of the controlled virtual object is not on the target path. The position of the controlled virtual object refers to the real-time position of the controlled virtual object. When the positions of the controlled virtual objects are different, the direction indication information provided for the controlled virtual objects may be different according to the target path.
Specifically, if the controlled virtual object is on the target path, the direction indication information is used to indicate which direction the controlled virtual object needs to move to when the controlled virtual object is at the current position, so as to move to the destination area according to the target path. If the controlled virtual object is not on the target path, the direction indication information is used for indicating the direction to which the controlled virtual object needs to move at the current position to move to the target path, so that the controlled virtual object can move to the target area according to the target path.
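The two branches above (on the path versus deviated from it) can be sketched as follows. This is a minimal illustration, not the patent's method: the path is assumed to be a polyline of waypoints, the on-path tolerance of 1.0 scene units is invented, and steering back toward the nearest waypoint is one simple way to realize "move toward the target path".

```python
import math

def indicated_direction(position, path):
    """Return a unit vector for the direction indication arrow.

    `path` is a polyline [(x0, y0), (x1, y1), ...]. If the controlled
    object sits (almost) on the path, point along the next segment;
    otherwise point toward the nearest path vertex to rejoin the path.
    """
    nearest = min(range(len(path)),
                  key=lambda i: math.dist(position, path[i]))
    px, py = position
    if math.dist(position, path[nearest]) > 1.0:
        # Deviated from the target path: steer back toward it.
        dx, dy = path[nearest][0] - px, path[nearest][1] - py
    else:
        # On the target path: follow it toward the next waypoint.
        tx, ty = path[min(nearest + 1, len(path) - 1)]
        dx, dy = tx - px, ty - py
    norm = math.hypot(dx, dy) or 1.0
    return (dx / norm, dy / norm)

path = [(0, 0), (10, 0), (10, 10)]
print(indicated_direction((0, 0), path))  # on the path -> along the first segment
print(indicated_direction((5, 8), path))  # off the path -> back toward the path
```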
In the embodiment of the application, on one hand, when the controlled virtual object is located outside the target area, the target path is planned for the controlled virtual object in a path planning mode to assist the controlled virtual object to move to the target area, so that a user does not need to determine a route by himself or herself, convenience is provided for user operation, and the user operation difficulty is simplified. On the other hand, the method for indicating the direction visually and clearly indicates which direction the controlled virtual object needs to move to in a direction indicating information mode, so that the controlled virtual object can move according to the target path, the difficulty of controlling the movement of the controlled virtual object can be simplified, the information displayed in the interface is enriched, the information content of the interface is improved, and the display effect is improved.
In the embodiment shown in fig. 2, when the target path is acquired, the end point of the target path may include the following two cases.
In the first case, the end point is an arbitrary position within the destination area. The interface display flow in this first case will be described below with reference to the embodiment shown in fig. 3.
In the second case, the endpoint is a user-selected location. The interface display flow in this second case will be described below with reference to the embodiment shown in fig. 18.
Fig. 3 is a flowchart of an interface display method provided in an embodiment of the present application, and referring to fig. 3, the method includes the following steps.
301. The terminal acquires a target area and a first position of the controlled virtual object, and the virtual life value of the virtual object located outside the target area is continuously reduced.
In the embodiment of the application, the target area can be refreshed in the virtual scene every other period of time or when a target event occurs in the virtual scene, the virtual life value of the virtual object located outside the target area is continuously reduced, and the virtual object located in the target area is not affected. The user needs to control the virtual object to move into the destination area, otherwise the virtual life value of the virtual object outside the destination area is continuously reduced until the virtual life value is reduced to zero.
For example, in an electronic game scenario, the target area may be referred to as a "safe zone," and for a period of time after the electronic game starts, no target area may exist in the virtual scene. After this time, a target area is refreshed in the virtual scene. The user needs to control the virtual object to reach the target area; otherwise, the virtual life value of the virtual object outside the target area is continuously reduced until it drops to zero and the virtual object is eliminated. After a further period of time, the target area is updated in the virtual scene, and the updated target area is smaller than the previous one. The updated target area may be located within the previous target area. The user then needs to control the virtual object to reach the updated target area. In some embodiments, the area outside the target area in these video game scenes may also be referred to as a "poison circle," to vividly express that the virtual life value decreases when the virtual object is outside the target area.
In some embodiments, this step 301 may be performed in response to a target area update instruction when the terminal receives one. The target area update instruction may be triggered periodically or by a target event in the virtual scene. For example, it may be triggered when the number of virtual objects in the virtual scene is less than a first number threshold; or when the number of eliminated virtual objects in the virtual scene is greater than a second number threshold; or when the number of other virtual objects eliminated by any single virtual object in the virtual scene reaches a third number threshold. The embodiment of the present application does not limit the trigger condition of the target area update instruction.
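The example trigger conditions above can be combined into a single check, sketched below. The function name, parameter names, and all threshold values are illustrative assumptions; the patent only states that such thresholds exist, not their values.

```python
def should_refresh_target_area(alive_count: int,
                               eliminated_count: int,
                               max_kills_by_one: int,
                               first_threshold: int = 50,
                               second_threshold: int = 30,
                               third_threshold: int = 10) -> bool:
    """Fire a target area update if any of the example triggers holds:
    too few virtual objects remain, enough have been eliminated in total,
    or one virtual object has eliminated enough others.
    """
    return (alive_count < first_threshold
            or eliminated_count > second_threshold
            or max_kills_by_one >= third_threshold)

print(should_refresh_target_area(alive_count=40, eliminated_count=0, max_kills_by_one=0))
```

A periodic trigger, as also mentioned above, would simply run alongside this event-based check on a timer.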
In other embodiments, this step 301 may be triggered by a path determination operation by the user. When the terminal detects the path determination operation, the terminal may acquire the destination area and the first position in response to the path determination operation, and then perform the subsequent steps.
In this embodiment, the terminal can obtain the destination area and the first position of the controlled virtual object to automatically perform path planning for the controlled virtual object, and assist in controlling the controlled virtual object to move to the destination area. When planning a path, any position in a destination area can be used as an end point, and a target path is finally determined, wherein the end point of the target path is a certain position in the destination area. The terminal can also perform path planning based on the position marked by the user, and the embodiment of the application does not limit which way is specifically adopted.
302. And the terminal responds that the first position is positioned outside the destination area, and acquires a target path in the virtual scene according to the first position and the destination area, wherein the target path is a path from the first position to the destination area in the virtual scene.
After the terminal acquires the target area and the first position required by path planning, the starting point and the end point can be determined, and path planning is carried out by combining a virtual scene. For the end point, the controlled virtual object may reach the target area, and any position in the target area may be used as the end point. Specifically, which position in the destination area the end point is, may be determined in the process of determining the target path, where the target path is determined, and the end point is also determined.
In some embodiments, the target path is obtained by connecting traversable locations in the virtual scene. Some locations in the virtual scene allow the virtual object to pass through, while others do not. That is, the virtual scene may include both traversable locations and impassable locations. An obstacle may exist at an impassable location. For example, the virtual scene includes obstacles such as virtual buildings, virtual trees, or virtual hills, through which the virtual object cannot pass, whereas virtual roads, virtual plains, virtual fields, virtual oceans, and the like allow the virtual object to pass. This description is for illustration only and is not intended to limit the present application; for example, some virtual hills may also allow virtual objects to pass through.
In these embodiments, when acquiring the target path, the terminal may first acquire a map of the virtual scene, the starting point, and the end point, where the map includes the traversable locations. The terminal then takes the first position in the map as the starting point and any position in the target area as the end point, performs a path search over the traversable locations between the starting point and the end point to obtain at least one candidate path, and takes the shortest of the at least one candidate path as the target path.
During the search, the map of the virtual scene includes a plurality of locations. After the terminal acquires the map, the starting point, and the end point, it determines a path from the first position to the target area by selecting traversable locations in the map. The target path thus connects a sequence of traversable locations from the first position to the target area, and is a path the controlled virtual object can actually traverse.
In some embodiments, the map of the virtual scene also marks impassable locations. When planning the path, the terminal can avoid the impassable locations and search only over the traversable ones, so every position on the resulting target path is one the controlled virtual object can pass through. If the controlled virtual object moves along the target path, it will not be blocked by an obstacle and forced to detour, which greatly facilitates its movement.
In the route search, a target condition for the route search can be set, and a route satisfying the target condition can be set as the target route. When searching for a route, the terminal connects different points to reach a destination area from a first position, and a plurality of candidate routes can be obtained. The lengths of the different candidate paths may be different or the same.
In some embodiments, the target condition may be the shortest length. Therefore, the time spent on moving according to the target path is less than that spent on moving according to other candidate paths, and the controlled virtual object can reach the target area most quickly.
It should be noted that the number of target paths may be one or more, that is, there is at least one target path. For example, when multiple candidate paths are obtained through the path search, if exactly one candidate path is the shortest, that candidate path is taken as the target path. If instead N candidate paths tie for the shortest length, all N of them may be taken as target paths, where N is an integer greater than 1.
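Selecting every candidate that ties for the minimum can be sketched in a few lines. The sketch assumes, purely for illustration, that each candidate path is a list of waypoints and that "length" is the waypoint count; the patent leaves the length measure unspecified.

```python
def shortest_candidates(candidates):
    """Return every candidate path whose length ties for the minimum,
    covering both the single-shortest case and the N-way tie case."""
    best = min(len(p) for p in candidates)
    return [p for p in candidates if len(p) == best]

paths = [[(0, 0), (1, 0), (2, 0)],
         [(0, 0), (0, 1), (1, 1), (2, 0)],
         [(0, 0), (1, 1), (2, 0)]]
print(len(shortest_candidates(paths)))  # two candidates tie for shortest
```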
In the route search, a map of the virtual scene may be used as a graph, points that can be traveled in the map may be used as nodes of the graph, and a route between the points that can be traveled may be an edge of the graph. In some embodiments, the weight of the edge in the graph may be 1, and may also be determined according to the attribute information of the route between the locations corresponding to the nodes. For example, taking the first node and the second node as an example, the weight may be relatively large if the path from the first node to the second node is an ascending path, and the weight may be relatively small if the path from the second node to the first node is a descending path.
Therefore, in step 302, the terminal may use a point in the map of the virtual scene as a node of the graph, use a route between the points as an edge of the graph, determine the weight of the edge using the attribute information of the route, search for a route connecting two nodes corresponding to the first position and the destination area, and use the shortest route as the target route.
In one possible implementation, the map of the virtual scene may be a coordinate graph, and the horizontal and vertical coordinates of the coordinate graph correspond to the length and width of the virtual scene. That is, the coordinate map corresponds to an overhead view of the virtual scene. For example, the overhead view of the virtual scene may be mesh divided, with the vertex of each mesh as a location in the virtual scene. The side length of each grid may be the target pixel. The target pixel may be set by a relevant technician as required, or may be adjusted according to a path planning fineness adjustment operation, which is not limited in the embodiment of the present application.
In one possible implementation, the path search may be implemented through breadth-first search (BFS), one of the simplest graph search algorithms. Given a graph G = (V, E) and a source vertex s, where V denotes the set of vertices and E the set of edges, breadth-first search systematically explores the edges of G, finds all vertices reachable from s, and computes the distance from s to each of them (the minimum number of edges). The algorithm can also generate a breadth-first tree rooted at s that includes all reachable vertices. For any vertex v reachable from s, the path from s to v in the breadth-first tree corresponds to a shortest path from s to v in G, that is, a path containing the smallest number of edges. The algorithm applies equally to directed and undirected graphs. It is called breadth-first because it expands outward along the boundary between discovered and undiscovered vertices: it first searches all vertices at distance k from s and only then searches vertices at distance k + 1, where k is a non-negative integer.
The breadth-first search algorithm is explained below with reference to fig. 4, which illustrates shortest-path finding based on coordinate points. As shown in fig. 4, the player moves from the start position numbered (1) to the safe-zone position numbered (9). According to breadth-first search on this graph, there are two equally short paths: (1) -> (2) -> (4) -> (7) -> (9), and (1) -> (2) -> (5) -> (8) -> (9).
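A breadth-first search over a grid-divided map, as described above, can be sketched as follows. The grid encoding (0 for a traversable cell, 1 for an obstacle) and four-neighbor movement are assumptions for illustration; the patent only requires that traversable locations be searched and impassable ones avoided.

```python
from collections import deque

def bfs_shortest_path(grid, start, goal):
    """Breadth-first search over a grid map.

    grid[r][c] == 0 marks a traversable location, 1 an obstacle.
    Returns a fewest-step path as a list of (row, col) cells,
    or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the path by walking the parent links back.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in parent:
                parent[nxt] = cell
                queue.append(nxt)
    return None

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],   # a wall of obstacles to route around
        [0, 0, 0, 0]]
route = bfs_shortest_path(grid, (0, 0), (2, 3))
print(route)  # a 6-cell path skirting the obstacle wall
```

Because BFS expands in order of increasing distance, the first time the goal is dequeued the reconstructed path is guaranteed to contain the fewest edges, matching the shortest-candidate selection described above. Weighted edges, as in the ascending/descending example of the preceding paragraphs, would instead call for a cost-aware search such as Dijkstra's algorithm.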
For example, as shown in fig. 5, a first position 501 of a controlled virtual object of a virtual scene is taken as a starting point, a destination area is named as a safe area 502, a point in the safe area 502 is taken as an end point, and some obstacles exist between the first position 501 and the safe area 502: building 503, tree 504, and hill 505. The terminal may use the first position 501 as a starting point, perform a path search starting from the starting point, and perform a search in a way of increasing the path length, so that by the path search, when the end point is searched to be an end point in the secure area, the target path 506 can be searched, and the length of the target path 506 is the shortest, and the controlled virtual object can reach the secure area quickly by the target path 506.
It should be noted that the process of acquiring the starting point and the end point corresponds to step 301, and the terminal may acquire the map of the virtual scene in step 302 and acquire the target path based on these three. The above description takes as an example the case where the end point is an arbitrary position in the target area; the end point of the target path may also be a position selected by the user, that is, the second case, for which reference may be made to the embodiment shown in fig. 18 below, which is not elaborated here.
303. And the terminal displays the direction indication information in a visual field picture of the controlled virtual object according to the target path, the first position and the visual angle of the virtual scene.
In step 303, the controlled virtual object is not changed in position, and the controlled virtual object is still located at the first position, so that the terminal can determine how to provide the direction indication for the user according to the first position. In step 303, the direction indication information is used to indicate a moving direction at the first position when the controlled virtual object moves along the target path.
When the controlled virtual object is controlled to move, if the controlled virtual object is controlled to advance, the moving direction of the controlled virtual object is consistent with the visual angle of the virtual scene. Thus, the perspective of the virtual scene coincides with the orientation of the controlled virtual object and with the direction of progress of the controlled virtual object.
After the target path is acquired, the terminal may determine whether a current forward direction of the controlled virtual object is consistent with a direction indicated by the target path. The advancing direction can be represented by a visual angle of a virtual scene, the first position is located on the target path, and a direction indicated by the target path is a tangential direction of the first position on the target path.
The terminal may display, according to a first relationship between the angle of view of the virtual scene and the direction indicated by the target path, direction indication information corresponding to the first relationship in the view screen of the controlled virtual object.
When the first relationship is different, it is necessary to indicate that the direction of the controlled virtual object may be different. Specifically, the following two cases may be included in this step 303.
The first condition is as follows: and the terminal responds to the visual angle of the virtual scene and the direction indicated by the target path, and displays first direction indication information in the visual field picture of the controlled virtual object, wherein the first direction indication information is used for indicating that the controlled virtual object moves forwards.
Case two: and the terminal responds to the fact that the visual angle of the virtual scene is not consistent with the direction indicated by the target path, and displays second direction indication information corresponding to the target rotation direction in a visual field picture of the controlled virtual object according to the target rotation direction, wherein the second direction indication information is used for indicating that the controlled virtual object rotates leftwards or rightwards, and the visual angle of the virtual scene is consistent with the direction indicated by the target path after rotating for a target angle smaller than 180 degrees along the target rotation direction.
In case two, the visual angle of the virtual scene is not consistent with the direction indicated by the target path, and specifically, whether the controlled virtual object rotates left or right is indicated, and it is necessary to determine which side the controlled virtual object rotates to enable the two to be consistent with each other most quickly. Specifically, the following two possible scenarios may be included.
Possible scenarios are one: and the terminal responds to the visual angle of the virtual scene, and displays third direction indication information in a visual field picture of the controlled virtual object after rotating the target angle anticlockwise and then the direction indicated by the target path is consistent, wherein the third direction indication information is used for indicating the controlled virtual object to rotate leftwards.
Possible scenario two: and the terminal responds to the visual angle of the virtual scene, and displays fourth direction indication information in the visual field picture of the controlled virtual object after the visual angle of the virtual scene is clockwise rotated by the target angle and then is consistent with the direction indicated by the target path, wherein the fourth direction indication information is used for indicating the controlled virtual object to rotate rightwards.
In some embodiments, the first relationship may be a first included angle between the viewing angle of the virtual scene and the direction indicated by the target path. Whether the viewing angle of the virtual scene is consistent with the direction indicated by the target path can be determined by comparing the first included angle with an angle threshold, and when the target rotation direction is determined, the target angle is the first included angle. The terminal may obtain the first included angle between the viewing angle of the virtual scene and the direction indicated by the target path, and display direction indication information corresponding to the first included angle in the view picture of the controlled virtual object.
Specifically, the terminal may display, in the view picture of the controlled virtual object, direction indication information corresponding to a first included angle between the viewing angle of the virtual scene and the tangential direction at the first position on the target path, where the first included angle is less than 180 degrees.
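As a concrete illustration, the tangential direction at a position on the target path can be approximated from the path's waypoints. The following is a hypothetical sketch in Python; the function name, the polyline representation of the path, and the degree convention are illustrative assumptions, not part of the embodiment:

```python
import math

def tangent_direction(path, index):
    """Approximate the tangential direction (in degrees, 0-360) at
    waypoint `index` of a polyline path by the direction toward the
    next waypoint."""
    (x0, y0), (x1, y1) = path[index], path[min(index + 1, len(path) - 1)]
    return math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360.0
```

For example, on a path that first runs along the x axis and then turns along the y axis, the tangent at the first waypoint is 0 degrees and at the second is 90 degrees.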
When the first included angle differs, the displayed direction indication information may also differ. In some embodiments, the direction indication information includes first direction indication information, second direction indication information, or third direction indication information, and may take the form of text, an image, or a special-effect animation. Specifically, this step 303 may include the following three cases.
Case one: in response to the first included angle being smaller than the angle threshold, the terminal displays first direction indication information in the view picture of the controlled virtual object, where the first direction indication information is used to indicate that the controlled virtual object is to move forward.
In this case, because the first included angle is smaller than the angle threshold, the viewing angle of the virtual scene may be considered consistent with the tangential direction at the first position on the target path. The controlled virtual object can thus move forward along the target path, so the terminal may display the first direction indication information to prompt the user to control the controlled virtual object to move forward. This case corresponds to the above case one, in which the viewing angle of the virtual scene is consistent with the direction indicated by the target path.
For example, as shown in fig. 6, when the virtual scene is viewed from above, the terminal determines a target path 603 from a first position 601 of the controlled virtual object to the safe zone 602, and may obtain a first included angle 606 between a viewing angle 604 of the virtual scene and the tangential direction 605 (i.e., the direction indicated by the target path) at the first position 601 on the target path 603. If the first included angle 606 is smaller than the angle threshold, the advancing direction of the controlled virtual object is substantially the same as the tangential direction 605, and the controlled virtual object only needs to be controlled to advance. Thus, as shown in fig. 7, the terminal may display first direction indication information 700, which may be a plurality of forward direction arrow icons.
Case two: in response to the first included angle being greater than or equal to the angle threshold and the viewing angle of the virtual scene becoming consistent with the tangential direction after rotating counterclockwise by the first included angle, the terminal displays second direction indication information in the view picture of the controlled virtual object, where the second direction indication information is used to indicate that the controlled virtual object is to rotate left.
In this case, because the first included angle is greater than or equal to the angle threshold, the viewing angle of the virtual scene may be considered inconsistent with the tangential direction (i.e., the direction indicated by the target path) at the first position on the target path. When the two are inconsistent, the viewing angle of the virtual scene and the tangential direction can be compared to determine how to adjust the viewing angle of the virtual scene so that the two become consistent.
For example, as shown in fig. 8, when the virtual scene is viewed from above, the terminal determines a target path 803 from the first position 801 of the controlled virtual object to the safe zone 802, and may obtain a first included angle 806 between a viewing angle 804 of the virtual scene and the tangential direction 805 (i.e., the direction indicated by the target path) at the first position 801 on the target path 803. The first included angle 806 is greater than the angle threshold, so the user needs to control the controlled virtual object to turn until its advancing direction is consistent with the tangential direction 805, and can then control the controlled virtual object to move along the target path 803. When turning, the viewing angle 804 of the virtual scene can be rotated counterclockwise by the first included angle 806 to the tangential direction 805. Fig. 8 is an overhead view from top to bottom; for the controlled virtual object in the virtual scene, the counterclockwise rotation is also a left rotation of the controlled virtual object. As shown in fig. 9, the terminal may display second direction indication information 900, which may include a plurality of left direction arrow icons 9001 and a left-turn direction indication icon 9002. The left direction arrow icons 9001 can be displayed on the target path, so that the user can see the target path.
Case three: in response to the first included angle being greater than or equal to the angle threshold and the viewing angle of the virtual scene becoming consistent with the tangential direction after rotating clockwise by the first included angle, the terminal displays third direction indication information in the view picture of the controlled virtual object, where the third direction indication information is used to indicate that the controlled virtual object is to rotate right.
In this case, because the first included angle is greater than or equal to the angle threshold, the viewing angle of the virtual scene may be considered inconsistent with the tangential direction (i.e., the direction indicated by the target path) at the first position on the target path. When the two are inconsistent, the viewing angle of the virtual scene and the tangential direction can be compared to determine how to adjust the viewing angle of the virtual scene so that the two become consistent.
Case two and case three both correspond to the above case two, in which the viewing angle of the virtual scene is inconsistent with the direction indicated by the target path.
For example, as shown in fig. 10, when the virtual scene is viewed from above, the terminal determines a target path 1003 from a first position 1001 of the controlled virtual object to the safe zone 1002, and may obtain a first included angle 1006 between a viewing angle 1004 of the virtual scene and the tangential direction 1005 (i.e., the direction indicated by the target path) at the first position 1001 on the target path 1003. The first included angle 1006 is greater than the angle threshold, so the user needs to control the controlled virtual object to turn until its advancing direction is consistent with the tangential direction 1005, and can then control the controlled virtual object to move along the target path 1003. When turning, the viewing angle 1004 of the virtual scene can be rotated clockwise by the first included angle 1006 to the tangential direction 1005. Fig. 10 is an overhead view from top to bottom; for the controlled virtual object in the virtual scene, the clockwise rotation is also a right rotation of the controlled virtual object. As shown in fig. 11, the terminal may display third direction indication information 1100, which may include a plurality of right direction arrow icons 1101 and a right-turn direction indication icon 1102. The right direction arrow icons 1101 can be displayed on the target path, so that the user can see the target path.
The angle threshold may be set by a person skilled in the art as needed, for example, to 5 degrees, which is not limited in the embodiments of the present application.
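The three cases above can be summarized as one classification over the signed angle between the viewing angle and the path direction. The following is a minimal sketch in Python, assuming angles in degrees and a convention in which counterclockwise differences are positive; the function name and return labels are hypothetical:

```python
def direction_indication(view_angle, path_angle, angle_threshold=5.0):
    """Classify which indication to display: 'forward' when the first
    included angle is below the threshold, otherwise 'turn_left' when a
    counterclockwise rotation aligns the viewing angle with the path
    direction, or 'turn_right' when a clockwise rotation does."""
    # Signed difference normalized to [-180, 180), so the indicated
    # rotation is always the one smaller than 180 degrees.
    diff = (path_angle - view_angle + 180.0) % 360.0 - 180.0
    if abs(diff) < angle_threshold:
        return "forward"
    return "turn_left" if diff > 0 else "turn_right"
```

Note that the normalization handles wrap-around, e.g. a viewing angle of 350 degrees and a path direction of 10 degrees yields a 20-degree left turn rather than a 340-degree right turn.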
In one possible implementation, a display style may be set for the direction indication information. Specifically, when displaying the direction indication information, the terminal may display it in a target display style in the view picture of the controlled virtual object.
The target display style may include at least one of a shape, a color, a display special effect, and a transparency of the direction indication information. For example, as to the shape, the direction indication information may be a direction arrow icon, or a triangular direction indication icon in which a direction is displayed. As to the color, the direction indication information may be green, yellow, red, or the like. As to the display special effect, it may be a blinking special effect, a highlighting special effect, or the like. As to the transparency, it may be 50% or another value. The target display style can be set by a relevant technician, or adjusted by the user according to the user's own usage habits. The embodiments of the present application do not limit this.
When the target display style is adjusted by the user according to the user's own usage habits, the target display style is determined according to a style adjustment operation. If the user wants to adjust the target display style, the user may perform a style adjustment operation on the terminal; upon detecting the style adjustment operation, the terminal may adjust the target display style in response to it.
In one possible implementation, when the target display style includes a display special effect, the direction indication information may be displayed with a target special effect. In step 303, the terminal displays the direction indication information with the target special effect in the view picture of the controlled virtual object. For example, the target special effect may be a blinking special effect. As shown in fig. 12, taking the displayed direction indication information as first direction indication information, the first direction indication information is a plurality of forward direction arrow icons 1201, and the plurality of forward direction arrow icons 1201 present the blinking special effect. The blinking manner of the blinking special effect can be set by a relevant technician as needed, for example, blinking once per second, three times per second, or once every two seconds. The embodiments of the present application do not limit this.
In one possible implementation, when the target display style includes transparency, the direction indication information may be displayed with a target transparency. In step 303, the terminal displays the direction indication information with the target transparency in the view picture of the controlled virtual object. For example, the target transparency may be 50%; it may be set by a relevant technician as needed, or adjusted by the user according to the user's own usage habits, which is not limited in the embodiments of the present application.
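One way to organize these options is to hold the target display style in a small configuration object. The sketch below is a hypothetical Python representation; the class name, field names, and default values are illustrative assumptions, not part of the embodiment:

```python
from dataclasses import dataclass

@dataclass
class TargetDisplayStyle:
    shape: str = "arrow"           # e.g. "arrow" or "triangle"
    color: str = "green"           # e.g. "green", "yellow", "red"
    special_effect: str = "blink"  # e.g. "blink" or "highlight"
    blinks_per_second: float = 1.0
    transparency: float = 0.5      # 0.5 corresponds to 50% transparency
```

A style adjustment operation would then amount to replacing one of these fields, e.g. `TargetDisplayStyle(color="red")`.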
In one possible implementation, a target display position may be set for the direction indication information. When displaying the direction indication information, the terminal may display it at the target display position in the view picture of the controlled virtual object.
In one possible implementation, the target display position is determined according to the type of the direction indication information. The direction indication information may include first direction indication information, second direction indication information, and third direction indication information, and different direction indication information may correspond to different target display positions. In some embodiments, the target display position corresponding to the first direction indication information is a first target display position, the target display position corresponding to the second direction indication information is a second target display position, and the target display position corresponding to the third direction indication information is a third target display position. For example, as shown in figs. 7, 9 and 11, the position where the first direction indication information 700 is displayed may be the first target display position, the position where the second direction indication information 900 is displayed may be the second target display position, and the position where the third direction indication information 1100 is displayed may be the third target display position.
In these embodiments, the target display position is determined based on the target path, the current first position of the controlled virtual object, and the viewing angle of the virtual scene. In step 303, the terminal may determine the direction indication information and its target display position according to the target path, the position of the controlled virtual object (the first position in step 303, or the real-time position after the position of the controlled virtual object subsequently changes), and the viewing angle of the virtual scene, and then display the direction indication information at the target display position in the view picture of the controlled virtual object.
304. In response to a change in the position of the controlled virtual object, the terminal obtains the position of the controlled virtual object.
The terminal can control the controlled virtual object to move in the virtual scene according to a movement control operation on the controlled virtual object, and the position of the controlled virtual object in the virtual scene may change accordingly. In this embodiment, during the movement of the controlled virtual object, the terminal may provide a direction indication for the controlled virtual object in real time according to the real-time position of the controlled virtual object and the target path.
As the position of the controlled virtual object changes, the terminal may display, in the view picture of the controlled virtual object, direction indication information corresponding to the position of the controlled virtual object according to the target path and the viewing angle of the virtual scene, where the direction indication information is used to indicate a moving direction of the controlled virtual object when moving toward the target area. The direction indication information display process shown in steps 304 and 305 can be applied at any time during the position change; a certain moment in the position change process is taken as an example for explanation.
305. The terminal displays direction indication information in the view picture of the controlled virtual object according to the target path, the position of the controlled virtual object, and the viewing angle of the virtual scene.
In step 305, the direction indication information is used to indicate a moving direction of the controlled virtual object when moving toward the target area. The position of the controlled virtual object may be on the target path or outside the target path. Specifically, if the position of the controlled virtual object is on the target path, the direction indication information is used to indicate the moving direction, at the position of the controlled virtual object, in which the controlled virtual object moves along the target path. If the controlled virtual object is not on the target path, the direction indication information is used to indicate the moving direction, at the position of the controlled virtual object, in which the controlled virtual object moves back to the target path.
When the position of the controlled virtual object is on the target path, that is, when the controlled virtual object is located on the target path, step 305 is performed in the same manner as step 303: in response to the controlled virtual object being located on the target path, the terminal may display, in the view picture of the controlled virtual object, direction indication information corresponding to a first relationship between the viewing angle of the virtual scene and the direction indicated by the target path.
Similarly, if the controlled virtual object is located on the target path, this step 305 includes the following two cases.
Case one: in response to the viewing angle of the virtual scene being consistent with the direction indicated by the target path, the terminal displays first direction indication information in the view picture of the controlled virtual object, where the first direction indication information is used to indicate that the controlled virtual object is to move forward.
Case two: in response to the viewing angle of the virtual scene being inconsistent with the direction indicated by the target path, the terminal displays, according to a target rotation direction, second direction indication information corresponding to the target rotation direction in the view picture of the controlled virtual object, where the second direction indication information is used to indicate that the controlled virtual object is to rotate left or right, and the viewing angle of the virtual scene becomes consistent with the direction indicated by the target path after rotating along the target rotation direction by a target angle smaller than 180 degrees. Similarly, case two includes two possible scenarios, which are not described here again.
Specifically, the first relationship may be a first included angle, and the process may be: in response to the controlled virtual object being located on the target path, the terminal obtains a first included angle between the viewing angle of the virtual scene and the direction indicated by the target path, and displays direction indication information corresponding to the first included angle in the view picture of the controlled virtual object, where the first included angle is less than 180 degrees. The difference is that in step 303 the direction indicated by the target path is the tangential direction at the first position on the target path, whereas in this step 305 it is the tangential direction at the current position of the controlled virtual object on the target path. In addition, determining which kind of direction indication information to display, the target display position, and the target display style of the direction indication information are all the same as in step 303, and are not described here again in this embodiment of the present application.
When the position of the controlled virtual object is outside the target path, that is, when the controlled virtual object is located off the target path, the terminal takes the direction of the shortest connecting line between the target path and the controlled virtual object as the direction in which the controlled virtual object can reach the target path fastest, and compares this direction with the viewing angle of the virtual scene to determine how to control the controlled virtual object to move back to the target path. The direction of the shortest connecting line between the target path and the controlled virtual object is referred to herein as the target connecting line direction, that is, the direction of the line from the position of the controlled virtual object to the nearest point on the target path.
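The nearest point on the path, and thus the target connecting line direction, can be found by projecting the object's position onto each path segment and keeping the closest projection. The following is a hypothetical Python sketch; the polyline representation and function names are assumptions:

```python
import math

def nearest_point_on_path(path, pos):
    """Closest point on a polyline `path` (list of (x, y) waypoints)
    to position `pos`."""
    best, best_d2 = path[0], float("inf")
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dx, dy = x1 - x0, y1 - y0
        seg_len2 = dx * dx + dy * dy
        # Project pos onto the segment, clamped to the segment's endpoints.
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((pos[0] - x0) * dx + (pos[1] - y0) * dy) / seg_len2))
        px, py = x0 + t * dx, y0 + t * dy
        d2 = (pos[0] - px) ** 2 + (pos[1] - py) ** 2
        if d2 < best_d2:
            best, best_d2 = (px, py), d2
    return best

def target_connecting_line_direction(path, pos):
    """Direction (in degrees) from `pos` toward the nearest point on the path."""
    nx, ny = nearest_point_on_path(path, pos)
    return math.degrees(math.atan2(ny - pos[1], nx - pos[0])) % 360.0
```

For an object standing beside a straight path, this yields the perpendicular direction back onto the path, which is the shortest connecting line described above.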
Accordingly, in step 305, in response to the controlled virtual object being located outside the target path, the terminal may display, in the view picture of the controlled virtual object, direction indication information corresponding to a second relationship between the viewing angle of the virtual scene and the target connecting line direction, where the target connecting line direction is the direction of the shortest connecting line between the target path and the controlled virtual object.
When the controlled virtual object is located outside the target path, different direction indication information can be displayed depending on the second relationship between the viewing angle of the virtual scene and the target connecting line direction. Specifically, the following two cases may be included.
Case one: in response to the viewing angle of the virtual scene being consistent with the target connecting line direction, the terminal displays first direction indication information in the view picture of the controlled virtual object, where the first direction indication information is used to indicate that the controlled virtual object is to move forward.
Case two: in response to the viewing angle of the virtual scene being inconsistent with the target connecting line direction, the terminal displays, according to a target rotation direction, second direction indication information corresponding to the target rotation direction in the view picture of the controlled virtual object, where the second direction indication information is used to indicate that the controlled virtual object is to rotate left or right, and the viewing angle of the virtual scene becomes consistent with the target connecting line direction after rotating along the target rotation direction by a target angle smaller than 180 degrees.
In this case two, the viewing angle of the virtual scene is inconsistent with the target connecting line direction. The terminal can specifically determine to which side the controlled virtual object should rotate so that the two become consistent most quickly. Specifically, the following two possible scenarios are included.
Possible scenario one: in response to the viewing angle of the virtual scene becoming consistent with the target connecting line direction after rotating counterclockwise by the target angle, the terminal displays third direction indication information in the view picture of the controlled virtual object, where the third direction indication information is used to indicate that the controlled virtual object is to rotate left.
Possible scenario two: in response to the viewing angle of the virtual scene becoming consistent with the target connecting line direction after rotating clockwise by the target angle, the terminal displays fourth direction indication information in the view picture of the controlled virtual object, where the fourth direction indication information is used to indicate that the controlled virtual object is to rotate right.
In some embodiments, the second relationship may be a second included angle; whether the viewing angle of the virtual scene becomes consistent with the target connecting line direction after rotating by the target angle corresponds to the relationship between the second included angle and the angle threshold, and the target angle is the second included angle. The above process may be: in response to the controlled virtual object being located outside the target path, the terminal obtains a second included angle between the viewing angle of the virtual scene and the target connecting line direction, and displays direction indication information corresponding to the second included angle in the view picture of the controlled virtual object.
The second included angle is less than 180 degrees. When the second included angle differs, the direction indication information displayed by the terminal may also differ. This step 305 may include the following three cases.
Case one: in response to the second included angle being smaller than the angle threshold, the terminal displays first direction indication information in the view picture of the controlled virtual object, where the first direction indication information is used to indicate that the controlled virtual object is to move forward.
In this case, because the second included angle is smaller than the angle threshold, the viewing angle of the virtual scene may be considered consistent with the direction of quickly returning to the target path (i.e., the target connecting line direction). The controlled virtual object can therefore quickly return to the target path by moving forward, so the terminal may display the first direction indication information to prompt the user to control the controlled virtual object to move forward. This corresponds to the above case one, in which the viewing angle of the virtual scene is consistent with the target connecting line direction.
For example, as shown in fig. 13, when the virtual scene is viewed from above, the terminal determines a target path 1303 from the first position 1301 of the controlled virtual object to the safe zone 1302. When the controlled virtual object moves to another position 1304, the terminal may obtain the target connecting line direction 1306 from the position 1304 to the nearest position 1305 on the target path, and then obtain a second included angle 1308 between the viewing angle 1307 of the virtual scene and the target connecting line direction 1306. If the second included angle 1308 is smaller than the angle threshold, the advancing direction of the controlled virtual object is substantially the same as the target connecting line direction 1306, and the controlled virtual object only needs to be controlled to advance. The specific interface display may be as shown in fig. 7, and is not described here again.
Case two: in response to the second included angle being greater than or equal to the angle threshold and the viewing angle of the virtual scene becoming consistent with the target connecting line direction after rotating counterclockwise by the second included angle, the terminal displays second direction indication information in the view picture of the controlled virtual object, where the second direction indication information is used to indicate that the controlled virtual object is to rotate left.
In this case, because the second included angle is greater than or equal to the angle threshold, the viewing angle of the virtual scene may be considered inconsistent with the direction of quickly returning to the target path (i.e., the target connecting line direction). When the two are inconsistent, the viewing angle of the virtual scene and the target connecting line direction can be compared to determine how to adjust the viewing angle of the virtual scene so that the two become consistent.
For example, as shown in fig. 14, when the virtual scene is viewed from above, the terminal determines a target path 1403 from the first position 1401 of the controlled virtual object to the safe zone 1402. When the controlled virtual object moves to another position 1404, the terminal may obtain the target connecting line direction 1406 from the position 1404 to the nearest position 1405 on the target path, and then obtain a second included angle 1408 between the viewing angle 1407 of the virtual scene and the target connecting line direction 1406. The second included angle 1408 is greater than the angle threshold, so the user needs to control the controlled virtual object to turn until its advancing direction is consistent with the target connecting line direction 1406, and can then control the controlled virtual object to move toward the target path 1403. When turning, the viewing angle 1407 of the virtual scene can be rotated counterclockwise by the second included angle 1408 to the target connecting line direction 1406. Fig. 14 is an overhead view from top to bottom; for the controlled virtual object in the virtual scene, the counterclockwise rotation is also a left rotation of the controlled virtual object. The interface display in this case may be as shown in fig. 9, and is not described here again.
Case three: in response to the second included angle being greater than or equal to the angle threshold and the viewing angle of the virtual scene becoming consistent with the target connecting line direction after rotating clockwise by the second included angle, the terminal displays third direction indication information in the view picture of the controlled virtual object, where the third direction indication information is used to indicate that the controlled virtual object is to rotate right.
In this case, because the second included angle is greater than or equal to the angle threshold, the viewing angle of the virtual scene may be considered inconsistent with the direction of quickly returning to the target path. When the two are inconsistent, the viewing angle of the virtual scene and the target connecting line direction can be compared to determine how to adjust the viewing angle of the virtual scene so that the two become consistent.
Case two and case three likewise correspond to the above case two, in which the viewing angle of the virtual scene is inconsistent with the target connecting line direction.
For example, as shown in fig. 15, when the virtual scene is viewed from above, the terminal determines a target path 1503 from a first position 1501 of the controlled virtual object to the safe zone 1502. When the controlled virtual object moves to another position 1504, the terminal may obtain the target connecting line direction 1506 from the position 1504 to the nearest position 1505 on the target path, and then obtain a second included angle 1508 between the viewing angle 1507 of the virtual scene and the target connecting line direction 1506. The second included angle 1508 is greater than the angle threshold, so the user needs to control the controlled virtual object to turn until its advancing direction is consistent with the target connecting line direction 1506, and can then control the controlled virtual object to move toward the target path 1503. When turning, the viewing angle 1507 of the virtual scene can be rotated clockwise by the second included angle 1508 to the target connecting line direction 1506. Fig. 15 is an overhead view from top to bottom; for the controlled virtual object in the virtual scene, the clockwise rotation is also a right rotation of the controlled virtual object. The interface display in this case may be as shown in fig. 11, and is not described here again.
In one possible implementation, the virtual life value of the controlled virtual object continuously decreases while the controlled virtual object is located outside the destination area (the safe zone). In some embodiments, after the target path is obtained in step 302, the terminal may further provide target virtual props on the target path, where a target virtual prop is used to increase the virtual life value of a virtual object. The target virtual props thus assist the controlled virtual object in moving along the target path and provide a way to increase its virtual life value while it is outside the destination area. Specifically, the terminal displays a target virtual prop at positions spaced apart by a target distance along the target path. The controlled virtual object can increase its virtual life value by acquiring the target virtual props, so as to safely reach the destination area.
In one possible implementation manner, the user can control the controlled virtual object to move along the target path, and when the controlled virtual object contacts or approaches a target virtual prop, the target virtual prop can be acquired, thereby increasing the virtual life value. Specifically, in response to the distance between the controlled virtual object and any target virtual prop being less than a second distance threshold, the terminal displays the virtual life value of the controlled virtual object increasing by the target virtual life value.
The second distance threshold may be a relatively small value: when the distance between the controlled virtual object and a target virtual prop is less than the second distance threshold, the two may be considered close enough for the controlled virtual object to acquire the target virtual prop.
Optionally, the target virtual prop can be obtained automatically. Optionally, the target virtual item can be obtained according to a pickup operation. The embodiment of the application does not limit the obtaining mode of the target virtual prop.
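The placement rule above (one target virtual prop every target distance along the target path) can be sketched as follows. This is a hedged sketch under assumed names: the patent does not specify an implementation, and the waypoint-list representation of the path is an assumption.

```python
import math

def place_props(path, target_distance):
    """Return positions along `path` (a list of (x, y) waypoints) spaced
    `target_distance` apart, at which target virtual props are displayed.

    Illustrative sketch; the function name, the path representation, and
    the spacing logic are assumptions, not the patent's API.
    """
    props, travelled = [], 0.0
    prev = path[0]
    for point in path[1:]:
        seg = math.dist(prev, point)
        travelled += seg
        while seg and travelled >= target_distance:
            travelled -= target_distance
            # Interpolate the prop position on the current segment,
            # `travelled` short of its end point.
            t = 1.0 - travelled / seg
            props.append((prev[0] + (point[0] - prev[0]) * t,
                          prev[1] + (point[1] - prev[1]) * t))
        prev = point
    return props
```

For example, on a straight 10-unit path with a target distance of 3, props would be placed 3, 6, and 9 units from the start. The pickup check in the text then reduces to comparing `math.dist(object_pos, prop_pos)` against the second distance threshold.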
In one possible implementation manner, the terminal may dynamically set the interval of the target virtual item or the target virtual life value according to the state of the controlled virtual object and the length of the target path. That is, the interval (i.e., target distance) of the above-described target virtual items may be variable, or the target virtual life value that each target virtual item may increase may be variable.
In some embodiments, the determination of the target distance may be: the terminal obtains the speed of the virtual life value reduction of the virtual object outside the target area, the moving speed of the controlled virtual object and the length of the target path; and determining the target distance according to the speed, the moving speed and the preset target virtual life value.
In other embodiments, the determination process of the target virtual life value may be: the terminal obtains the speed of the virtual life value reduction of the virtual object outside the target area, the moving speed of the controlled virtual object and the length of the target path; and determining the target virtual life value according to the speed, the moving speed, the length and the preset target distance.
For example, assume the interval between target virtual props on the target path is L, the moving speed of the virtual object is V, and the virtual life value of a virtual object outside the destination area decreases at a speed of Q points per second. From the length S of the target path and the moving speed V, it can be determined that the controlled virtual object can enter the destination area within time T = S / V, during which a total of Q × T points of virtual life value is deducted. The player moves a distance V × T = S and picks up (V × T) / L target virtual props. In one specific example, the target virtual life value P restored by each prop may be set so that P × (V × T) / L = Q × T, that is, the virtual life value restored by the picked-up target virtual props equals the virtual life value deducted while moving to the destination area along the target path. The balance achieved is that a player (i.e., the controlled virtual object) who follows the marked path out of the poison circle (i.e., the area outside the destination area) suffers, on balance, no harm from the poison circle. Of course, this is merely an exemplary illustration; the relationship between the above parameters may be set by a person skilled in the relevant art according to requirements, and this is not limited in the embodiments of the present application.
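The balance in the worked example above can be expressed as a small helper. The symbols follow the text (S for path length, V for moving speed, Q for drain per second, L for prop spacing); the function name and the per-prop value it returns are assumptions introduced for illustration.

```python
def balanced_life_per_prop(path_length, move_speed, drain_per_second, prop_spacing):
    """Compute the virtual life value each target virtual prop should restore
    so that, over the whole target path, the total restored equals the total
    drained by the poison circle. Names are assumptions.

    Time to traverse the path:      T = S / V
    Life drained over the path:     Q * T
    Props picked up along the way:  S / L  (= (V * T) / L)
    Balance: P * (S / L) = Q * T   =>   P = Q * L / V
    """
    t = path_length / move_speed           # T = S / V
    props = path_length / prop_spacing     # (V * T) / L
    return drain_per_second * t / props    # P such that P * props == Q * T
```

The same balance can be solved the other way around, as in the alternative embodiment: with a preset per-prop value P, the spacing follows as L = P × V / Q.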
In a possible implementation manner, if the user controls the controlled virtual object so that it moves a certain distance away from the target path, path planning may be performed again for the controlled virtual object and the target path updated, determining a new and better path to assist the controlled virtual object in quickly entering the destination area. Specifically, in response to the distance between the position of the controlled virtual object and the target path being greater than a first distance threshold, the terminal updates the target path according to the position of the controlled virtual object and the destination area, where the updated target path is a path from the position of the controlled virtual object to the destination area in the virtual scene.
The first distance threshold may be set by a person skilled in the art as needed, for example, the first distance threshold may be a length of 5 coordinate points, which is not limited in this embodiment of the present application.
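The re-planning trigger above can be sketched as a predicate over the object's position and the path. This is a simplified sketch with assumed names: it approximates the distance to the path as the distance to the nearest waypoint (a full implementation would measure distance to the path segments), and the default threshold mirrors the "length of 5 coordinate points" example.

```python
import math

def needs_replan(position, path, first_distance_threshold=5.0):
    """Return True when the controlled virtual object has drifted farther
    from the target path than the first distance threshold, meaning the
    terminal should plan a new target path from the current position.

    `path` is a list of (x, y) waypoints; distance-to-nearest-waypoint is
    used as a simplifying approximation of distance-to-path.
    """
    nearest = min(math.dist(position, p) for p in path)
    return nearest > first_distance_threshold
```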
In a possible implementation manner, when the controlled virtual object has moved into the destination area, it no longer needs to be guided into the destination area, so no target path needs to be provided and, naturally, no direction indication information needs to be displayed. Specifically, the terminal may cancel the display of the direction indication information in response to the position of the controlled virtual object moving into the destination area. For example, as shown in fig. 16, after the controlled virtual object enters the destination area, the terminal may cancel the display of the direction indication information: at the position where the direction indication information was originally displayed (previously indicated by an arrow), the direction indication information is no longer shown.
A specific example is provided below with reference to fig. 17. As shown in fig. 17, assume that the destination area is a "safe area", the area outside it is called the "poison circle", the process of controlling the controlled virtual object (player) to move to the destination area is called escaping, the direction indication information is an escape indication, the target virtual prop is called a blood-restoring prop, and increasing the virtual life value is called restoring blood. Specifically, when the player is in the poison circle, the terminal may perform step 1701 of calculating an optimal path by breadth-first search, and then perform step 1702 of determining whether the player's current perspective coincides with the escape route (i.e., the target path). If not, the terminal may perform step 1703 of prompting the player to turn the perspective, and then perform step 1704 of determining whether the player has deviated from the original route by 5 coordinate points (i.e., the first distance threshold); if not, the terminal may perform step 1702 again, and if so, the terminal may perform step 1701 again. If the judgment in step 1702 is that the perspective is consistent, the terminal may perform step 1705 of judging whether the player has passed a blood-restoring prop; if so, the terminal may perform step 1706 of restoring blood to the player; if not, the terminal may perform step 1707 of judging whether the player has escaped from the poison circle; if not, the terminal may perform step 1708 of continuing to display the escape path, and if so, the terminal may perform step 1709 of closing the display of the escape path.
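The decision flow of fig. 17 can be approximated by a pure function over the four checks the flowchart makes each tick. This is a loose sketch with assumed names: it compresses the flowchart's loop into a single decision step and returns the action label to perform next, rather than implementing the steps themselves.

```python
def escape_step(view_consistent, deviated, on_prop, escaped):
    """One decision of the fig. 17 escape-guidance loop, reduced to flag
    logic for illustration. Returns the next action's label (names assumed):
    replanning (step 1701), prompting a turn (1703), restoring blood (1706),
    closing the escape path display (1709), or keeping it shown (1708).
    """
    if not view_consistent:            # step 1702 judged inconsistent
        if deviated:                   # step 1704: off route by >= threshold
            return "replan_path"       # back to step 1701
        return "prompt_turn"           # step 1703
    if on_prop:                        # step 1705
        return "restore_life"          # step 1706
    if escaped:                        # step 1707
        return "hide_path"             # step 1709
    return "keep_showing_path"         # step 1708
```

A game loop would call this each frame with freshly computed flags and dispatch on the returned label.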
In the embodiment of the application, on one hand, when the controlled virtual object is located outside the destination area, a target path is planned for the controlled virtual object by means of path planning to assist it in moving to the destination area, so the user does not need to determine a route alone; this provides convenience for user operation and reduces its difficulty. On the other hand, the direction indication information visually and clearly indicates in which direction the controlled virtual object needs to move so that it can follow the target path, which simplifies the difficulty of controlling the movement of the controlled virtual object, enriches the information displayed in the interface, increases the information content of the interface, and thereby improves the display effect.
The embodiment of fig. 3 above has described how the interface assists the player in escaping from the poison circle and entering the safe zone, i.e. the first case, in which the end point is any position within the destination area. The second case, in which the end point is a user-selected position, will be described below with the embodiment shown in fig. 18.
Fig. 18 is a flowchart of an interface display method provided in an embodiment of the present application, and referring to fig. 18, the method includes the following steps.
1801. The terminal acquires a target area and a first position of the controlled virtual object, and the virtual life value of the virtual object located outside the target area is continuously reduced.
Step 1801 is similar to step 301, and will not be described herein.
1802. The terminal responds to the position marking operation and acquires a second position indicated by the position marking operation.
In this embodiment, the user may mark a position in the virtual scene to determine the end point. After acquiring the end point, the terminal can determine whether the end point is a position within the destination area, and also needs to judge whether the controlled virtual object is outside the destination area; if the controlled virtual object is outside the destination area, the terminal can perform the subsequent path planning and direction indication information display steps.
If the second location is outside the destination area and the first location is outside the destination area, the terminal may perform the above-described steps 302 to 305.
If both the second position and the first position are within the destination area, the terminal may perform no further steps, or may discard the acquired data.
1803. In response to the second position being located within the destination area and the first position being located outside the destination area, the terminal obtains a target path in the virtual scene according to the first position and the second position, where the target path is a path from the first position to the second position in the virtual scene.
If the second position is located within the destination area, the user clearly wants to enter the destination area and then reach that position. Planning the path with that position as the end point therefore yields a target path that better meets the user's needs.
The process of obtaining the target path is similar to the step 302, and is not described herein again.
1804. And the terminal displays the direction indication information in a visual field picture of the controlled virtual object according to the target path, the first position and the visual angle of the virtual scene.
1805. And the terminal responds to the position change of the controlled virtual object and acquires the position of the controlled virtual object.
1806. And the terminal displays the direction indication information in a visual field picture of the controlled virtual object according to the target path, the position of the controlled virtual object and the visual angle of the virtual scene.
Steps 1804 to 1806 are similar to steps 303 to 305, and the embodiment of the present application is not described herein again.
In the embodiment of the application, on one hand, when the controlled virtual object is located outside the destination area, a target path is planned for the controlled virtual object by means of path planning to assist it in moving to the destination area, so the user does not need to determine a route alone; this provides convenience for user operation and reduces its difficulty. On the other hand, the direction indication information clearly indicates in which direction the controlled virtual object needs to move so that it can follow the target path, which simplifies the difficulty of controlling the movement of the controlled virtual object, enriches the information displayed in the interface, increases the information content of the interface, and thereby improves the display effect.
All the above optional technical solutions can be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 19 is a schematic structural diagram of an interface display device according to an embodiment of the present application, and referring to fig. 19, the interface display device includes:
a first obtaining module 1901, configured to obtain a destination area and a current first position of a controlled virtual object, where a virtual life value of a virtual object located outside the destination area continuously decreases;
a second obtaining module 1902, configured to, in response to that the first location is outside the destination area, obtain a target path in a virtual scene according to the first location and the destination area, where the target path is a path from the first location to the destination area in the virtual scene;
a display module 1903, configured to display, in the view of the controlled virtual object, direction indication information corresponding to the location of the controlled virtual object according to the target path and the viewing angle of the virtual scene, along with the change of the location of the controlled virtual object, where the direction indication information is used to indicate a moving direction of the controlled virtual object when the controlled virtual object moves to the destination area.
In one possible implementation, the display module 1903 is configured to:
responding to the controlled virtual object positioned on the target path, and displaying direction indication information corresponding to a first relation in a visual field picture of the controlled virtual object according to the first relation between the visual angle of the virtual scene and the direction indicated by the target path;
and responding to that the controlled virtual object is positioned outside the target path, and displaying direction indication information corresponding to a second relation in a visual field picture of the controlled virtual object according to the second relation between the visual angle of the virtual scene and a target connecting line direction, wherein the target connecting line direction is the direction of a connecting line with the shortest distance between the target path and the controlled virtual object.
In one possible implementation, the display module 1903 is configured to:
in response to that the visual angle of the virtual scene is consistent with the direction indicated by the target path, displaying first direction indication information in a visual field picture of the controlled virtual object, wherein the first direction indication information is used for indicating that the controlled virtual object moves forwards;
in response to the visual angle of the virtual scene being inconsistent with the direction indicated by the target path, displaying, according to a target rotation direction, second direction indication information corresponding to the target rotation direction in a visual field picture of the controlled virtual object, wherein the second direction indication information is used for indicating that the controlled virtual object rotates leftwards or rightwards, and the visual angle of the virtual scene becomes consistent with the direction indicated by the target path after rotating by a target angle smaller than 180 degrees along the target rotation direction.
in one possible implementation, the display module 1903 is configured to:
responding to the consistency of the visual angle of the virtual scene and the direction of a target connecting line, and displaying first direction indication information in a visual field picture of the controlled virtual object, wherein the first direction indication information is used for indicating the controlled virtual object to move forwards;
and in response to the fact that the visual angle of the virtual scene is not consistent with the target connecting line direction, displaying second direction indication information corresponding to the target rotating direction in a visual field picture of the controlled virtual object according to the target rotating direction, wherein the second direction indication information is used for indicating the controlled virtual object to rotate leftwards or rightwards, and the visual angle of the virtual scene is consistent with the target connecting line direction after rotating for a target angle smaller than 180 degrees along the target rotating direction.
In one possible implementation, the display module 1903 is configured to:
responding to that the visual angle of the virtual scene is consistent with the direction indicated by the target path or the target connecting line direction after rotating the target angle anticlockwise, and displaying third direction indication information in a visual field picture of the controlled virtual object, wherein the third direction indication information is used for indicating the controlled virtual object to rotate leftwards;
and in response to that the visual angle of the virtual scene is consistent with the direction indicated by the target path or the target connecting line direction after the target angle is rotated clockwise, displaying fourth direction indication information in a visual field picture of the controlled virtual object, wherein the fourth direction indication information is used for indicating that the controlled virtual object rotates rightwards.
In one possible implementation, the display module 1903 is configured to:
responding to the situation that the controlled virtual object is located on the target path, acquiring a first included angle between the visual angle of the virtual scene and the direction indicated by the target path, and displaying direction indication information corresponding to the first included angle in a visual field picture of the controlled virtual object;
and responding to the situation that the controlled virtual object is positioned outside the target path, acquiring a second included angle between the visual angle of the virtual scene and the target connecting line direction, and displaying direction indication information corresponding to the second included angle in the visual field picture of the controlled virtual object.
In one possible implementation, the target path is obtained by connecting feasible places in the virtual scene.
In one possible implementation, the second obtaining module 1902 is configured to:
acquiring a map of the virtual scene, wherein the map of the virtual scene comprises a place capable of being passed;
taking the first position in the map as a starting point and any position in the destination area as an end point, and performing path search over positions in the map through which the starting point and the end point can be connected, to obtain at least one candidate path;
and taking the path with the shortest length in the at least one candidate path as the target path.
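The path-search rule above (search over passable positions, keep the shortest candidate) can be realized with breadth-first search on a grid map, matching the "optimal path by breadth" idea in the fig. 17 example: on an unweighted grid, BFS returns a shortest path directly, realizing the "take the shortest" rule in one pass. This is a hedged sketch under assumed names; the set-of-cells map representation and 4-connected neighborhood are assumptions.

```python
from collections import deque

def bfs_shortest_path(passable, start, goal):
    """Breadth-first search over a grid of passable cells.

    `passable` is a set of (x, y) cells that can be traversed; returns the
    shortest waypoint list from `start` to `goal`, or None if unreachable.
    Names and the grid model are illustrative assumptions.
    """
    queue = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while queue:
        cur = queue.popleft()
        if cur == goal:
            # Reconstruct the path by walking the predecessor links back.
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in passable and nxt not in came_from:
                came_from[nxt] = cur
                queue.append(nxt)
    return None
```

In the end-point-in-area case, `goal` would be the nearest passable cell of the destination area (or the user-marked second position).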
In one possible implementation, the apparatus further includes:
the third acquisition module is used for responding to the position marking operation and acquiring a second position indicated by the position marking operation;
the second obtaining module 1902, further configured to, in response to the second location being within the destination area, obtain a target path in the virtual scene according to the first location and the second location, where the target path is a path from the first location to the second location in the virtual scene.
In one possible implementation, the apparatus further includes:
and the updating module is used for responding to the fact that the distance between the position of the controlled virtual object and the target path is larger than a first distance threshold value, updating the target path according to the position of the controlled virtual object and the target area, and the updated target path is a path from the position of the controlled virtual object to the target area in the virtual scene.
In one possible implementation, the display module 1903 is further configured to display target virtual props at positions on the target path at target distance intervals, where the target virtual props are used to increase the virtual life value of the virtual object.
In one possible implementation, the display module 1903 is further configured to display that the virtual life value of the controlled virtual object is increased by the target virtual life value in response to the distance between the controlled virtual object and any of the target virtual props being less than a second distance threshold.
In one possible implementation, the apparatus further includes:
the first determining module is used for acquiring the speed of reducing the virtual life value of the virtual object outside the target area, the moving speed of the controlled virtual object and the length of the target path; and determining the target virtual life value according to the speed, the moving speed, the length and the preset target distance.
In one possible implementation, the apparatus further includes:
the second determining module is used for acquiring the speed of the virtual life value reduction of the virtual object outside the target area, the moving speed of the controlled virtual object and the length of the target path; and determining the target distance according to the speed, the moving speed and the preset target virtual life value.
According to the device provided by the embodiment of the application, on one hand, a target path is planned for the controlled virtual object in a path planning mode, the target path is a correct path obtained through path planning, the controlled virtual object can be assisted to move, a user does not need to determine a path by himself or herself, convenience is provided for the user operation, and the user operation difficulty is simplified. On the other hand, the method for indicating the direction visually and clearly indicates which direction the controlled virtual object needs to move to in a direction indicating information mode, so that the controlled virtual object can move according to the target path, the difficulty of controlling the movement of the controlled virtual object can be simplified, the information displayed in the interface is enriched, the information content of the interface is improved, and the display effect is improved.
It should be noted that: in the interface display device provided in the above embodiment, when displaying an interface, only the division of each function module is exemplified, and in practical applications, the function distribution can be completed by different function modules according to needs, that is, the internal structure of the interface display device is divided into different function modules to complete all or part of the functions described above. In addition, the interface display apparatus and the interface display method provided in the above embodiments belong to the same concept, and specific implementation processes thereof are described in detail in the method embodiments and are not described herein again.
The electronic device in the above method embodiment can be implemented as a terminal. For example, fig. 20 is a block diagram of a terminal according to an embodiment of the present disclosure. The terminal 2000 may be a portable mobile terminal such as: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 2000 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
In general, terminal 2000 includes: a processor 2001, and a memory 2002.
The processor 2001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 2001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 2001 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in a wake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 2001 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 2001 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 2002 may include one or more computer-readable storage media, which may be non-transitory. The memory 2002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 2002 is used to store at least one instruction for execution by processor 2001 to implement the interface display methods provided by method embodiments herein.
In some embodiments, terminal 2000 may further optionally include: a peripheral interface 2003 and at least one peripheral. The processor 2001, memory 2002 and peripheral interface 2003 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 2003 through a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 2004, display 2005, camera assembly 2006, audio circuitry 2007, and power supply 2009.
The peripheral interface 2003 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 2001 and the memory 2002. In some embodiments, the processor 2001, memory 2002 and peripheral interface 2003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 2001, the memory 2002, and the peripheral interface 2003 may be implemented on separate chips or circuit boards, which is not limited by this embodiment.
The Radio Frequency circuit 2004 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 2004 communicates with a communications network and other communications devices via electromagnetic signals. The radio frequency circuit 2004 converts an electric signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 2004 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. Radio frequency circuitry 2004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 2004 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 2005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 2005 is a touch display screen, the display screen 2005 also has the ability to capture touch signals on or over the surface of the display screen 2005. The touch signal may be input to the processor 2001 as a control signal for processing. At this point, the display 2005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 2005 may be one, provided on the front panel of terminal 2000; in other embodiments, the display screens 2005 can be at least two, respectively disposed on different surfaces of the terminal 2000 or in a folded design; in other embodiments, display 2005 may be a flexible display disposed on a curved surface or a folded surface of terminal 2000. Even more, the display screen 2005 can be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The Display screen 2005 can be made of a material such as an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), and the like.
Camera assembly 2006 is used to capture images or video. Optionally, camera assembly 2006 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 2006 may also include a flash. The flash lamp can be a single-color temperature flash lamp or a double-color temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 2007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 2001 for processing or inputting the electric signals to the radio frequency circuit 2004 so as to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different positions of the terminal 2000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 2001 or the radio frequency circuit 2004 into sound waves. The loudspeaker can be a traditional film loudspeaker and can also be a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuit 2007 may also include a headphone jack.
The power supply 2009 is used to power the various components in the terminal 2000. The power supply 2009 may be an alternating-current supply, a direct-current supply, a disposable battery, or a rechargeable battery. A rechargeable battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil; the rechargeable battery may also support fast-charging technology.
In some embodiments, terminal 2000 also includes one or more sensors 2010. The one or more sensors 2010 include, but are not limited to: acceleration sensor 2011, gyro sensor 2012, pressure sensor 2013, optical sensor 2015, and proximity sensor 2016.
The acceleration sensor 2011 can detect the magnitude of acceleration along the three axes of a coordinate system established with the terminal 2000, for example the components of gravitational acceleration along those three axes. The processor 2001 may control the display screen 2005 to present the user interface in landscape or portrait view according to the gravity signal acquired by the acceleration sensor 2011. The acceleration sensor 2011 may also be used to collect motion data for games or for the user.
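As an illustration of how the gravity components can drive the landscape/portrait decision described above, the following minimal sketch (not part of the patent; the axis convention and function name are assumptions) picks an orientation from the two in-plane gravity components:

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Pick landscape vs. portrait from the gravity components (m/s^2)
    along the device's x axis (short edge) and y axis (long edge).
    When gravity acts mostly along the long edge, the device is upright."""
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```

In practice the processor would debounce this decision over several sensor readings before rotating the UI.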
The gyroscope sensor 2012 can detect the body orientation and rotation angle of the terminal 2000 and, in cooperation with the acceleration sensor 2011, capture the user's 3D motion of the terminal 2000. From the data collected by the gyroscope sensor 2012, the processor 2001 can implement motion sensing (such as changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 2013 may be disposed on the side frame of the terminal 2000 and/or under the display screen 2005. When disposed on the side frame, it can detect the user's grip on the terminal 2000, from which the processor 2001 performs left/right-hand recognition or shortcut operations. When disposed under the display screen 2005, the processor 2001 controls operability controls on the UI according to the pressure the user applies to the display screen 2005. The operability controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
The optical sensor 2015 is used to collect the ambient light intensity. In one embodiment, the processor 2001 controls the display brightness of the display screen 2005 according to the ambient light intensity collected by the optical sensor 2015: when the ambient light is strong, the display brightness is increased; when it is weak, the display brightness is decreased. In another embodiment, the processor 2001 may also dynamically adjust the shooting parameters of the camera assembly 2006 according to the collected ambient light intensity.
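One plausible mapping from the collected ambient light intensity to a display brightness is a logarithmic curve, since perceived brightness is roughly logarithmic in illuminance. The sketch below is illustrative only; the anchor points (1 lux and 10,000 lux) and the curve shape are assumptions, not values from the patent:

```python
import math

def display_brightness(lux: float, min_b: float = 0.05, max_b: float = 1.0) -> float:
    """Map ambient illuminance (lux) to a normalized display brightness.
    1 lux maps to min_b and 10,000 lux (bright daylight) to max_b,
    interpolated on a base-10 logarithmic scale and clamped at both ends."""
    t = min(max(math.log10(max(lux, 1.0)) / 4.0, 0.0), 1.0)
    return min_b + t * (max_b - min_b)
```

The mapping is monotonic, so brighter surroundings always yield an equal or higher screen brightness.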
The proximity sensor 2016, also called a distance sensor, is typically disposed on the front panel of the terminal 2000 and collects the distance between the user and the front surface of the terminal 2000. In one embodiment, when the proximity sensor 2016 detects that this distance is gradually decreasing, the processor 2001 controls the display screen 2005 to switch from the on-screen state to the off-screen state; when the distance is gradually increasing, the processor 2001 controls the display screen 2005 to switch from the off-screen state back to the on-screen state.
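The on/off switching above is naturally implemented with hysteresis, so that noise around a single distance threshold does not make the screen flicker. A minimal sketch, with illustrative thresholds that are assumptions rather than values from the patent:

```python
class ProximityScreenController:
    """Toggle the screen state from proximity-sensor readings, using two
    thresholds (hysteresis): turn off below near_cm, back on above far_cm."""

    def __init__(self, near_cm: float = 3.0, far_cm: float = 6.0):
        self.near_cm, self.far_cm = near_cm, far_cm
        self.screen_on = True

    def update(self, distance_cm: float) -> bool:
        if self.screen_on and distance_cm < self.near_cm:
            self.screen_on = False   # face close to the panel: darken
        elif not self.screen_on and distance_cm > self.far_cm:
            self.screen_on = True    # user moved away: light up again
        return self.screen_on
```

Readings between the two thresholds leave the current state unchanged, which is exactly what prevents flicker.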
Those skilled in the art will appreciate that the configuration shown in fig. 20 is not intended to be limiting of terminal 2000 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The electronic device in the above method embodiments can also be implemented as a server. For example, fig. 21 is a schematic structural diagram of a server provided in this embodiment. The server 2100 may vary considerably with configuration or performance, and can include one or more processors (CPUs) 2101 and one or more memories 2102, where the memory 2102 stores at least one program code that is loaded and executed by the processor 2101 to implement the interface display method provided by each method embodiment. Certainly, the server can also have a wired or wireless network interface, an input/output interface, and other components to facilitate input and output, as well as other components for implementing the functions of the device, which are not described here again.
In an exemplary embodiment, there is also provided a computer readable storage medium, such as a memory, including at least one program code, the at least one program code being executable by a processor to perform the interface display method in the above embodiments. For example, the computer-readable storage medium can be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product or computer program is also provided that includes one or more program codes stored in a computer-readable storage medium. The one or more processors of the electronic device can read the one or more program codes from the computer-readable storage medium, and the one or more processors execute the one or more program codes, so that the electronic device can perform the interface display method.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
It should be understood that determining B from A does not mean determining B from A alone; B can also be determined from A and/or other information.
Those skilled in the art will appreciate that all or part of the steps of the above embodiments can be implemented by hardware, or by a program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium; the storage medium can be a read-only memory, a magnetic disk, an optical disk, or the like.
The above description covers only optional embodiments of the present application and is not intended to limit it; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within its protection scope.

Claims (13)

1. An interface display method, characterized in that the method comprises:
when a target area is refreshed, acquiring the target area and a current first position of a controlled virtual object, wherein the target area is refreshed in a virtual scene at regular intervals, the virtual life value of a virtual object located outside the target area decreases continuously, and a virtual object located inside the target area is unaffected;
responding to the first position located outside the target area, and acquiring a target path in the virtual scene according to the first position and the target area, wherein the target path is a path from the first position to the target area in the virtual scene;
when the controlled virtual object moves on the target path or does not move on the target path, displaying direction indication information corresponding to the position of the controlled virtual object in a visual field picture of the controlled virtual object along with the position change of the controlled virtual object according to the target path and the visual angle of the virtual scene, wherein the direction indication information is used for indicating the moving direction of the controlled virtual object when moving to the target area;
dynamically setting a target distance between target virtual props on a target path or a target virtual life value which can be increased by the target virtual props according to the state of the controlled virtual object and the length of the target path, and displaying the target virtual props at positions of every target distance on the target path, wherein the target virtual props are used for increasing the virtual life values of the virtual objects;
when the distance between the controlled virtual object and any target virtual prop is smaller than a second distance threshold value, displaying that the virtual life value of the controlled virtual object is increased by the target virtual life value;
when the controlled virtual object moves into the target area along the target path, the virtual life value deducted by the controlled virtual object is equal to the virtual life value added by the target virtual item on the target path.
2. The method according to claim 1, wherein the displaying, in the view of the controlled virtual object, direction indication information corresponding to a position of the controlled virtual object according to the target path and a viewing angle of the virtual scene as the position of the controlled virtual object changes includes:
responding to the controlled virtual object positioned on the target path, and displaying direction indication information corresponding to a first relation in a visual field picture of the controlled virtual object according to the first relation between the visual angle of the virtual scene and the direction indicated by the target path;
and responding to that the controlled virtual object is positioned outside the target path, and displaying direction indication information corresponding to a second relation in a visual field picture of the controlled virtual object according to the second relation between the visual angle of the virtual scene and a target connecting line direction, wherein the target connecting line direction is the direction of a connecting line with the shortest distance between the target path and the controlled virtual object.
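For claim 2, the "connecting line with the shortest distance" between the controlled virtual object and the target path can be found by projecting the object's position onto each segment of the path. A minimal 2-D sketch; representing the path as a polyline of waypoints is an assumption made for illustration:

```python
def closest_point_on_path(pos, path):
    """Return (closest point, distance) from `pos` to a polyline `path`
    given as a list of (x, y) waypoints. The vector from `pos` to the
    returned point defines the target connecting-line direction."""
    best, best_d2 = None, float("inf")
    px, py = pos
    for (ax, ay), (bx, by) in zip(path, path[1:]):
        abx, aby = bx - ax, by - ay
        seg_len2 = abx * abx + aby * aby
        # Parameter of the perpendicular foot, clamped to the segment.
        t = 0.0 if seg_len2 == 0 else max(
            0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / seg_len2))
        cx, cy = ax + t * abx, ay + t * aby
        d2 = (px - cx) ** 2 + (py - cy) ** 2
        if d2 < best_d2:
            best, best_d2 = (cx, cy), d2
    return best, best_d2 ** 0.5
```

The same distance also serves claim 9: when it exceeds the first distance threshold, the path is re-planned from the object's current position.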
3. The method according to claim 2, wherein the displaying, according to a first relationship between the viewing angle of the virtual scene and the direction indicated by the target path, direction indication information corresponding to the first relationship in the view screen of the controlled virtual object comprises:
in response to that the visual angle of the virtual scene is consistent with the direction indicated by the target path, displaying first direction indication information in a visual field picture of the controlled virtual object, wherein the first direction indication information is used for indicating that the controlled virtual object moves forwards;
in response to that the visual angle of the virtual scene is inconsistent with the direction indicated by the target path, according to a target rotation direction, displaying second direction indication information corresponding to the target rotation direction in a visual field picture of the controlled virtual object, wherein the second direction indication information is used for indicating that the controlled virtual object rotates leftwards or rightwards, and the visual angle of the virtual scene is consistent with the direction indicated by the target path after rotating for a target angle smaller than 180 degrees along the target rotation direction;
the displaying, according to a second relationship between the viewing angle of the virtual scene and the target link direction, direction indication information corresponding to the second relationship in a view picture of the controlled virtual object includes:
responding to the consistency of the visual angle of the virtual scene and the direction of a target connecting line, and displaying first direction indication information in a visual field picture of the controlled virtual object, wherein the first direction indication information is used for indicating the controlled virtual object to move forwards;
and in response to the fact that the visual angle of the virtual scene is not consistent with the target connecting line direction, displaying second direction indication information corresponding to the target rotation direction in a visual field picture of the controlled virtual object according to the target rotation direction, wherein the second direction indication information is used for indicating the controlled virtual object to rotate leftwards or rightwards, and the visual angle of the virtual scene is consistent with the target connecting line direction after rotating for a target angle smaller than 180 degrees along the target rotation direction.
4. The method according to claim 3, wherein the displaying, in the visual field of the controlled virtual object, second direction indication information corresponding to the target rotation direction according to the target rotation direction includes:
responding to that the visual angle of the virtual scene is consistent with the direction indicated by the target path or the target connecting line direction after rotating the target angle anticlockwise, and displaying third direction indication information in a visual field picture of the controlled virtual object, wherein the third direction indication information is used for indicating the controlled virtual object to rotate to the left;
and in response to that the visual angle of the virtual scene is consistent with the direction indicated by the target path or the target connecting line direction after clockwise rotating the target angle, displaying fourth direction indication information in a visual field picture of the controlled virtual object, wherein the fourth direction indication information is used for indicating the controlled virtual object to rotate rightwards.
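The choice between leftward and rightward rotation in claims 3 and 4 reduces to taking whichever turn covers an angle smaller than 180 degrees. A minimal sketch, assuming angles in degrees measured counterclockwise (the angle convention and function name are illustrative):

```python
def turn_direction(view_deg: float, target_deg: float) -> str:
    """Return which direction indication to display: 'forward' when the
    viewing angle already matches the target direction, otherwise 'left'
    (counterclockwise) or 'right' (clockwise), always choosing the turn
    whose angle is smaller than 180 degrees."""
    diff = (target_deg - view_deg) % 360.0
    if diff == 0:
        return "forward"
    return "left" if diff < 180.0 else "right"
```

The target direction here may be either the direction indicated by the target path (claim 3) or the target connecting-line direction (claim 2).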
5. The method according to claim 2, wherein the displaying, in the view frame of the controlled virtual object, direction indication information corresponding to the position of the controlled virtual object according to the target path and the perspective of the virtual scene as the position of the controlled virtual object changes includes:
responding to the situation that the controlled virtual object is located on the target path, acquiring a first included angle between the visual angle of the virtual scene and the direction indicated by the target path, and displaying direction indication information corresponding to the first included angle in a visual field picture of the controlled virtual object;
and responding to the situation that the controlled virtual object is positioned outside the target path, acquiring a second included angle between the visual angle of the virtual scene and the target connecting line direction, and displaying direction indication information corresponding to the second included angle in a visual field picture of the controlled virtual object.
6. The method of claim 1, wherein the target path is obtained by connecting passable points in the virtual scene.
7. The method of claim 6, wherein the obtaining a target path in a virtual scene according to the first location and the destination area comprises:
acquiring a map of the virtual scene, wherein the map comprises the passable locations in the virtual scene;
taking the first position in the map as a starting point, taking any position in the destination area as an end point, and searching a path according to a place which can be communicated between the starting point and the end point in the map to obtain at least one candidate path;
and taking the path with the shortest length in the at least one candidate path as the target path.
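The path search of claim 7 can be realized with any shortest-path algorithm over the passable locations of the map. The sketch below uses breadth-first search on a grid of passable cells; the grid representation and 4-way connectivity are assumptions for illustration, not the patent's actual data structure:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over passable cells (grid[r][c] == 0).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}  # predecessor map, doubling as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk predecessors back to the start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

Breadth-first search returns a shortest path directly; with several candidate end points inside the destination area, the claim's "shortest of at least one candidate path" selection amounts to running the search to each end point and keeping the minimum.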
8. The method of claim 7, further comprising:
responding to position marking operation, and acquiring a second position indicated by the position marking operation;
and responding to the second position in the destination area, and acquiring a target path in a virtual scene according to the first position and the second position, wherein the target path is a path from the first position to the second position in the virtual scene.
9. The method of claim 1, further comprising:
and in response to the fact that the distance between the position of the controlled virtual object and the target path is larger than a first distance threshold value, updating the target path according to the position of the controlled virtual object and the target area, wherein the updated target path is a path from the position of the controlled virtual object to the target area in the virtual scene.
10. The method according to claim 1, wherein the dynamically setting a target distance between target virtual props on a target path or a target virtual life value that the target virtual props can increase according to the state of the controlled virtual object and the length of the target path comprises:
acquiring the speed at which the virtual life value of a virtual object outside the target area decreases, the moving speed of the controlled virtual object, and the length of the target path; and determining the target virtual life value according to the speed, the moving speed, the length, and the preset target distance; or,
acquiring the speed at which the virtual life value of a virtual object outside the target area decreases, the moving speed of the controlled virtual object, and the length of the target path; and determining the target distance according to the speed, the moving speed, and the preset target virtual life value.
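Claim 10, combined with the balance condition of claim 1 (life deducted over the whole path equals life added by the props), fixes one of the two quantities once the other is preset. With drain rate ρ, moving speed v, path length L, prop spacing d and per-prop value h, the balance ρ·L/v = (L/d)·h makes L cancel, giving h = ρ·d/v (or, symmetrically, d = h·v/ρ). A minimal sketch with illustrative variable names:

```python
def prop_life_value(drain_rate: float, spacing: float, move_speed: float) -> float:
    """Per-prop life value so that total drain while traversing the path
    equals total healing: drain * (L / speed) = (L / spacing) * value."""
    return drain_rate * spacing / move_speed

def prop_spacing(drain_rate: float, life_value: float, move_speed: float) -> float:
    """The symmetric case: the per-prop value is preset, solve for spacing."""
    return life_value * move_speed / drain_rate
```

For example, a drain of 10 life per second, props every 5 units, and a moving speed of 2 units per second require each prop to restore 25 life.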
11. An interface display apparatus, the apparatus comprising:
the first obtaining module is used for obtaining a current first position of a target area and a controlled virtual object when the target area is refreshed, wherein the target area is refreshed once in a virtual scene at intervals, the virtual life value of the virtual object located outside the target area is continuously reduced, and the virtual object located in the target area is not influenced;
a second obtaining module, configured to, in response to that the first location is outside the destination area, obtain, according to the first location and the destination area, a target path in the virtual scene, where the target path is a path from the first location to the destination area in the virtual scene;
a display module, configured to display, in a view screen of the controlled virtual object, direction indication information corresponding to a position of the controlled virtual object according to the target path and a view angle of the virtual scene along with a change in the position of the controlled virtual object when the controlled virtual object moves on the target path or does not move on the target path, where the direction indication information is used to indicate a moving direction of the controlled virtual object when the controlled virtual object moves to the destination area;
the display module is further configured to dynamically set a target distance between target virtual props on a target path or a target virtual life value that the target virtual props can increase according to the state of the controlled virtual object and the length of the target path, and display the target virtual props at positions on the target path at intervals of the target distance, where the target virtual props are used to increase the virtual life value of the virtual object;
the display module is further used for displaying that the virtual life value of the controlled virtual object is increased by the target virtual life value when the distance between the controlled virtual object and any target virtual prop is smaller than a second distance threshold;
when the controlled virtual object moves into the target area along the target path, the virtual life value deducted by the controlled virtual object is equal to the virtual life value added by the target virtual item on the target path.
12. An electronic device, characterized in that the electronic device comprises one or more processors and one or more memories, in which at least one program code is stored, which is loaded and executed by the one or more processors to implement the interface display method of any one of claims 1 to 10.
13. A computer-readable storage medium having stored therein at least one program code, the at least one program code being loaded and executed by a processor to implement the interface display method according to any one of claims 1 to 10.
CN202011060668.7A 2020-09-30 2020-09-30 Interface display method, device, equipment and storage medium Active CN112121422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011060668.7A CN112121422B (en) 2020-09-30 2020-09-30 Interface display method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112121422A CN112121422A (en) 2020-12-25
CN112121422B true CN112121422B (en) 2023-01-10

Family

ID=73843506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011060668.7A Active CN112121422B (en) 2020-09-30 2020-09-30 Interface display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112121422B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113318447B (en) * 2021-05-25 2022-07-29 网易(杭州)网络有限公司 Game scene processing method and device, storage medium and electronic equipment
CN113262475A (en) * 2021-06-07 2021-08-17 网易(杭州)网络有限公司 Method, device, equipment and storage medium for using virtual props in game
CN113559508A (en) * 2021-07-27 2021-10-29 网易(杭州)网络有限公司 Orientation prompting method, device, equipment and storage medium of virtual object
CN113680064A (en) * 2021-08-18 2021-11-23 网易(杭州)网络有限公司 Method, device, equipment and storage medium for controlling virtual character in game
CN114042315B (en) * 2021-10-29 2023-06-16 腾讯科技(深圳)有限公司 Virtual scene-based graphic display method, device, equipment and medium
CN115222926A (en) * 2022-07-22 2022-10-21 领悦数字信息技术有限公司 Method, apparatus, and medium for planning a route in a virtual environment
CN116999824A (en) * 2022-08-19 2023-11-07 腾讯科技(深圳)有限公司 Method, apparatus, device, medium and program product for booting in virtual scene

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
JP4890624B2 (en) * 2010-03-15 2012-03-07 株式会社コナミデジタルエンタテインメント GAME SYSTEM AND COMPUTER PROGRAM THEREOF
CN107096222A (en) * 2017-06-08 2017-08-29 深圳市乃斯网络科技有限公司 Location path method and system for planning in game
CN107789837B (en) * 2017-09-12 2021-05-11 网易(杭州)网络有限公司 Information processing method, apparatus and computer readable storage medium
CN108310770A (en) * 2018-01-05 2018-07-24 腾讯科技(深圳)有限公司 Control method, device, storage medium and the electronic device of virtual controlling object
WO2020027347A1 (en) * 2018-07-31 2020-02-06 펍지 주식회사 Method and device for controlling gaming virtual space
CN109876442A (en) * 2019-04-15 2019-06-14 网易(杭州)网络有限公司 Route indicating means, equipment and storage medium in game based on map
CN110237528A (en) * 2019-06-21 2019-09-17 腾讯科技(深圳)有限公司 The control method and device of object, storage medium and electronic device
CN111298439B (en) * 2020-01-21 2021-04-13 腾讯科技(深圳)有限公司 Data processing method, device, medium and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40035733

Country of ref document: HK

GR01 Patent grant