CN107890669B - Display control method, device and storage medium - Google Patents

Display control method, device and storage medium

Info

Publication number
CN107890669B
CN107890669B (application CN201711114665.5A)
Authority
CN
China
Prior art keywords
display
user interface
icon
graphic object
controlling
Prior art date
Legal status
Active
Application number
CN201711114665.5A
Other languages
Chinese (zh)
Other versions
CN107890669A (en)
Inventor
刘毅
Current Assignee
Tencent Technology Shanghai Co Ltd
Original Assignee
Tencent Technology Shanghai Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shanghai Co Ltd
Priority to CN201711114665.5A
Publication of CN107890669A
Application granted
Publication of CN107890669B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5375 For graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303 For displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/305 For providing a graphical or textual hint to the player
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/63 For controlling the execution of the game in time
    • A63F2300/64 For computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/80 Specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display control method, apparatus and storage medium. The method includes the following steps: controlling a display device to display a graphical user interface, and displaying a first graphical object representing an immovable unit of a first camp and a route connecting the first graphical object with a preset position; displaying a second graphical object representing a movable unit of the first camp in the graphical user interface, and controlling the second graphical object to move from the first graphical object along the route; displaying an operation option in the graphical user interface, and in response to a first operation on the operation option, displaying a third graphical object representing a movable unit of a second camp in the graphical user interface; when a second operation on the third graphical object is monitored, determining a movement target point in the graphical user interface, and controlling the third graphical object to move towards the movement target point; and when detecting that the first graphical object or the second graphical object is located within the attack range of the third graphical object, controlling the third graphical object to attack that graphical object and modifying the attribute value of that graphical object.

Description

Display control method, device and storage medium
Technical Field
The present application relates to computer technologies, and in particular, to a display control method and apparatus, and a storage medium.
Background
Tower defense (Tower Defense) is a type of strategy game. In a tower defense game, enemies appear in batches from the starting point of a set path, in a set order and at set times, and travel along the path; their target is the destination of the path, where they occupy the player's territory or attack the player's possessions. The player's goal is to protect that territory or those possessions by placing defensive towers on or beside the set path to block the enemy's advance. Tower defense games typically provide various types of defensive towers that, once deployed, automatically block, impede, attack, or destroy enemies. When all enemies of the current level have been eliminated, the player is judged to have won. In existing tower defense games, because enemies arrive at set time intervals, even when the player's combat power greatly exceeds the enemy's, the player can only wait for the next wave of enemies to arrive; the game progresses slowly and the user experience is poor.
Disclosure of Invention
The embodiments of the present application provide a display control method and apparatus that introduce a unit with active attack capability, so that the game process can be accelerated and the user experience improved.
The display control method provided by an embodiment of the present application may include the following steps:
controlling a display device to display a graphical user interface, wherein the graphical user interface is used for displaying a first graphical object representing an immovable unit of a first camp and a route connecting the first graphical object with a preset position;
controlling the display device to display at least one second graphical object representing a movable unit of the first camp in the graphical user interface, and controlling the second graphical object to move from the first graphical object to the preset position along the route;
controlling the display device to display at least one operation option in the graphical user interface, and in response to a first operation on the operation option received through an input device, controlling the display device to display at least one third graphical object representing a movable unit of a second camp in the graphical user interface;
when a second operation on the third graphical object is monitored, determining a movement target point in the graphical user interface, and controlling the third graphical object to move towards the movement target point;
and when detecting that the first graphical object or the second graphical object is located within the attack range of the third graphical object, controlling the third graphical object to attack the first graphical object or the second graphical object, and modifying the attribute value of the first graphical object or the second graphical object.
The game control device provided by an embodiment of the present application may include:
a display control module, configured to control a display device to display a graphical user interface, wherein the graphical user interface is used for displaying a first graphical object representing an immovable unit of a first camp and a route connecting the first graphical object with a preset position; control the display device to display at least one second graphical object representing a movable unit of the first camp in the graphical user interface, and control the second graphical object to move from the first graphical object to the preset position along the route; and control the display device to display at least one operation option in the graphical user interface, and in response to a first operation on the operation option received through an input device, control the display device to display at least one third graphical object representing a movable unit of a second camp in the graphical user interface; and
a behavior processing module, configured to determine a movement target point in the graphical user interface when a second operation on the third graphical object is monitored, and control the third graphical object to move to the movement target point; and, when detecting that the first graphical object or the second graphical object is located within the attack range of the third graphical object, control the third graphical object to attack the first graphical object or the second graphical object, and modify the attribute value of the first graphical object or the second graphical object.
Embodiments of the present application further provide a computer-readable storage medium in which computer-readable instructions are stored; when the instructions are executed by a processor, the method of any embodiment of the present application can be implemented.
In the solutions of the embodiments of the present application, an immovable unit of a first camp is provided in the tower defense game, and operation options are provided so that the user can create movable units of a second camp and control them to attack the immovable unit of the first camp, thereby accelerating the game process and improving the user experience.
Drawings
Fig. 1 is a flowchart of a display control method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a graphical user interface according to an embodiment of the present application;
FIG. 3 is a flowchart of a display control method according to an embodiment of the present application;
FIGS. 4a and 4b are schematic diagrams of a graphical user interface;
FIG. 5 is a flowchart of a method for generating movable units according to an embodiment of the present application;
FIG. 6 is a flowchart of a method for creating movable units according to an embodiment of the present application;
FIGS. 7a and 7b are schematic diagrams of a graphical user interface according to an embodiment of the present application;
fig. 8 is a schematic diagram of a game control device according to an embodiment of the present application.
Detailed Description
For simplicity and clarity of description, the invention is described below through several representative embodiments. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. Some embodiments are not described in detail but only outlined, in order to avoid unnecessarily obscuring aspects of the invention. Hereinafter, "comprising" means "including but not limited to", and "according to ..." means "according to at least ..., but not limited to only ..."; that is, other features may be present in addition to the features mentioned.
The solutions of the embodiments of the present application provide an immovable unit of a first camp in the graphical user interface of a tower defense game, and provide operation options so that the user can create movable units of a second camp and control them to attack the immovable unit of the first camp. Fig. 1 is a flowchart of a display control method according to an embodiment of the present application. As shown in fig. 1, the method may include the following steps.
Step S11, controlling a display device to display a graphical user interface, where the graphical user interface is used to display a first graphical object representing an immovable unit of a first camp and a route connecting the first graphical object with a preset position.
Step S12, controlling the display device to display at least one second graphical object representing a movable unit of the first camp in the graphical user interface, and controlling the second graphical object to move from the first graphical object to the preset position along the route.
Step S13, controlling the display device to display at least one operation option (hereinafter also referred to as a first operation option) in the graphical user interface, and in response to a first operation on the operation option received through the input device, controlling the display device to display at least one third graphical object representing a movable unit of a second camp in the graphical user interface.
Step S14, when a second operation on the third graphical object is monitored, determining a movement target point in the graphical user interface, and controlling the third graphical object to move to the movement target point.
Step S15, when it is detected that the first graphical object or the second graphical object is located within the attack range of the third graphical object, controlling the third graphical object to attack the first graphical object or the second graphical object, and modifying the attribute value of the first graphical object or the second graphical object.
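For concreteness, the following minimal sketch shows one way steps S14 and S15 could be realized in a per-frame update. It is illustrative only: the names (Unit, update_third_object, hp) and the damage model are assumptions and are not taken from the patent.

    import math
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class Unit:
        x: float
        y: float
        camp: str                        # "first" or "second"
        hp: float = 100.0
        attack_power: float = 10.0
        attack_range: float = 3.0
        speed: float = 1.5
        target_point: Optional[Tuple[float, float]] = None  # set by the second operation (step S14)

    def distance(a: Unit, b: Unit) -> float:
        return math.hypot(a.x - b.x, a.y - b.y)

    def update_third_object(unit: Unit, enemy_objects: list, dt: float) -> None:
        """One frame of movement (step S14) and attack handling (step S15) for a second-camp unit."""
        # Step S14: move towards the movement target point, if one has been determined.
        if unit.target_point is not None:
            tx, ty = unit.target_point
            dx, dy = tx - unit.x, ty - unit.y
            dist = math.hypot(dx, dy)
            if dist > 1e-6:
                step = min(unit.speed * dt, dist)
                unit.x += dx / dist * step
                unit.y += dy / dist * step
        # Step S15: attack a first-camp graphical object that lies within the attack
        # range, modifying its attribute value (here, its hit points).
        for enemy in enemy_objects:
            if enemy.hp > 0 and distance(unit, enemy) <= unit.attack_range:
                enemy.hp -= unit.attack_power * dt
                break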
A unit is a basic entity that can be controlled by one of the two parties in the game. Different kinds of units have different attributes and provide different functions. A movable unit generally refers to a unit that can move to participate in combat, such as an infantry soldier, a vehicle, or a weapon. An immovable unit generally refers to a unit that cannot move, such as a building used to produce movable units or to enhance combat capability, or a turret at a fixed location.
In various embodiments, the operation options may take the form of icons or buttons; the first operation in step S13 may be a touch operation, a mouse click operation, or the like. In step S14, the purpose of the second operation on the third graphical object is to move the third graphical object. The second operation may be a single operation or a combination of a series of operations; for example, a selection operation performed on the target object followed by a selection of the target location, i.e., two consecutive click operations, may together constitute one operation for moving the third graphical object.
The embodiments of the present application may be performed by a terminal device. A processor in the terminal device provides the graphical user interface of the embodiments by running processing logic stored in a memory, listening for instructions input through an input device, and performing the processing steps of the embodiments. The terminal device may include, but is not limited to, a smart phone, a mobile Internet device (MID), a phablet, a tablet computer, an ultra-mobile personal computer (UMPC), a personal digital assistant (PDA), a handheld personal computer, a notebook computer, an interactive entertainment device, a game terminal, and the like.
Here, presenting each unit includes creating data corresponding to the unit and performing operations such as assignment and modification on the unit's parameters in order to present the behavior and state of the unit. The data for each unit includes the identity of the unit and various parameters and their values, such as appearance, behavior logic, defense, attack power, hit points, and the like.
In various embodiments, the camp to which the current user belongs is referred to as the second camp, and the first camp represents a party that competes against the second camp, e.g., a party controlled by game logic or by another user. The embodiments provide an immovable unit for the first camp, take its graphical object as a selectable attack target for the second camp, and, when the immovable unit is destroyed, determine that the second camp wins. Meanwhile, the embodiments also provide a first operation option, present a graphical object of a movable unit of the second camp in the graphical user interface according to a received operation on the first operation option, and present the movement and attack behaviors of that graphical object in the graphical user interface according to monitored operations on it. By providing the above functions, the embodiments allow a player of a tower defense game to win by actively attacking the other party's building; when the player's combat power is abundant, the player can attack actively to accelerate and shorten the game, improving the user experience.
In some embodiments, when it is detected that the attribute value corresponding to the first graphical object reaches a set threshold, the display device may be controlled to display a second graphical user interface indicating that the second camp has won.
Fig. 2 is a schematic diagram of a graphical user interface according to an embodiment of the present application. The first graphical object 21 is an immovable unit of the first camp and may represent a base, a barracks, a factory, or the like of the first camp. The graphic 22 represents the set route. The second graphical object 23 is a movable unit of the first camp, representing, for example, an infantry soldier or a weapon of the first camp. A user operation for setting a defense unit may be monitored, and the display device may be controlled, according to the monitored operation, to display an immovable unit of the second camp, such as the turret 26, around the set route to intercept and attack the second graphical object 23. The display device may also be controlled to display the first operation option 24 in the graphical user interface, and to display a third graphical object 25 representing a movable unit of the second camp in the graphical user interface upon an operation on the first operation option 24.
In some embodiments, the movable units of the first camp move along a preset route. The positions of a series of points may be stored in the terminal device in advance to represent the set route. These points may be stored as an array or in another form, and the movable units of the first camp move according to these points. For example, a second graphical object of a movable unit of the first camp may be presented near the first graphical object; the first position in the array is acquired, a route is calculated from the current position of the second graphical object to that first position, and the display device is controlled to display the movement of the second graphical object along the route. After the second graphical object reaches the first position, the next position in the array is acquired and the display device is controlled to display the movement of the second graphical object to that next position; this is repeated until all positions in the array have been traversed, at which point the movement of the second graphical object is complete.
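The waypoint following described above can be sketched roughly as follows, assuming the route is stored as a plain list of (x, y) points; the attribute names (x, y, speed, waypoint_index) are hypothetical.

    import math

    def follow_route(unit, route, dt):
        """Move `unit` along the preset `route` (a list of (x, y) waypoints).

        `unit` is assumed to expose x, y, speed and a waypoint_index attribute
        recording the next waypoint to visit. Returns True when the route is finished.
        """
        remaining = unit.speed * dt          # distance the unit may cover this frame
        while unit.waypoint_index < len(route) and remaining > 0:
            tx, ty = route[unit.waypoint_index]
            dx, dy = tx - unit.x, ty - unit.y
            dist = math.hypot(dx, dy)
            if dist <= remaining:
                # Reached this waypoint; acquire the next position in the array.
                unit.x, unit.y = tx, ty
                unit.waypoint_index += 1
                remaining -= dist
            else:
                # Move part of the way towards the current waypoint this frame.
                unit.x += dx / dist * remaining
                unit.y += dy / dist * remaining
                remaining = 0
        return unit.waypoint_index >= len(route)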
In some examples, the movement of the movable units of the second camp is influenced by the terrain. The terminal device may store preset terrain information corresponding to the current map and, according to the operations and instructions received through the input device, calculate the moving route from the movement target point and the terrain information using a preset path-finding mechanism. The terminal device may also use a terrain-baking algorithm and control the display device to display the preset terrain information in the graphical user interface.
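One simple way to realize such a path-finding mechanism over baked terrain data is a breadth-first search on a walkability grid. The sketch below is only an assumption for illustration; the patent does not prescribe any particular algorithm or terrain representation.

    from collections import deque

    def find_path(walkable, start, goal):
        """Breadth-first search over a 2D walkability grid.

        `walkable[y][x]` is True where units may stand; `start` and `goal` are
        (x, y) grid cells. Returns a list of cells from start to goal, or None
        when the movement target point is unreachable.
        """
        width, height = len(walkable[0]), len(walkable)
        came_from = {start: None}
        queue = deque([start])
        while queue:
            current = queue.popleft()
            if current == goal:
                # Reconstruct the route by walking the parent links backwards.
                path = []
                while current is not None:
                    path.append(current)
                    current = came_from[current]
                return path[::-1]
            cx, cy = current
            for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                if 0 <= nx < width and 0 <= ny < height \
                        and walkable[ny][nx] and (nx, ny) not in came_from:
                    came_from[(nx, ny)] = current
                    queue.append((nx, ny))
        return None  # no route: the target point is not reachable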
In some embodiments, the application further provides one or more second operation options in the graphical user interface for creating immovable units of the second camp, which represent buildings for producing movable units of the second camp. Fig. 3 is a flowchart of a display control method according to an embodiment of the present application. As shown in fig. 3, the method 30 may include the following steps.
Steps S31, S32 are similar to steps S11, S12.
Fig. 4a and 4b are schematic diagrams of a graphical user interface. The first graphical object 41 represents an immovable unit of the first camp. Here, the first camp is the party controlled by the game logic. The immovable unit is deployed at the location where movable units of the first camp appear, and represents a building for producing the movable units of the first camp. The second graphical object 43 represents a movable unit of the first camp, and may be shown moving along the set path 42. One or more second graphical objects 43 may be presented in the graphical user interface in preset numbers, in batches, at a plurality of preset points in time. Also presented in the graphical user interface are: attribute information of the immovable unit of the first camp, such as its hit points and defense; and information about the movable units of the first camp, such as the time (e.g., a countdown) at which the next wave of first-camp movable units 43 will appear, the total number of waves, the number of remaining waves, and the like. As shown in fig. 4a, the hint 407 indicates that there are 7 waves of first-camp movable units in total and that the currently present first-camp movable units belong to the 3rd wave.
Step S33, the display device is controlled to display the second operation option 46 in the graphical user interface, and, in response to an operation on the second operation option 46 received through the input device and a selection of a placement position in the graphical user interface, the display device is controlled to display a fourth graphical object 47 representing an immovable unit of the second camp at the placement position, as shown in fig. 4a.
Here, an immovable unit of the second camp means a building for producing movable units of the second camp, such as a barracks, a weapons factory, and the like.
Step S34, in response to an operation on the fourth graphical object 47 received through the input device, controlling the display device to display the first operation option 44 in the graphical user interface, as shown in fig. 4b.
In some examples, the first operation option 44 may include one or more options, each representing a different kind of movable unit. According to a monitored operation on one of the first operation options, the terminal device can control the display device to display, in the graphical user interface, the graphical object of the movable unit of the kind corresponding to the operated first operation option.
In some examples, as shown in fig. 4b, the display device may also be controlled to display, in the graphical user interface: current resource information 401 of the second camp, the maximum allowed number of movable units and the current total number 402, and information about the immovable unit of the second camp, such as its type and level 403, hit points 404, attack power 405, defense 406, and the like.
Step S35, in response to an operation on the first operation option 44 received through the input device, controlling the display device to display a third graphical object 45 representing a movable unit of the second camp in the graphical user interface, as shown in fig. 4b.
Steps S36, S37 are similar to steps S14, S15.
In some examples, the terminal device may also control the display device to display a third operation option 48 in the graphical user interface, as shown in fig. 4b, in response to a selection of the fourth graphical object 47 received through the input device. The terminal device may change an attribute value of the immovable unit of the second camp in response to an operation on the third operation option 48 received through the input device. The attribute values of the immovable unit may include: its appearance, defense value, the types of movable units it produces, and the hit points, attack range, attack power, defense, and the like of each type of movable unit.
In some examples, the terminal device may also control the display device to display a fourth operation option 49 in the graphical user interface, as shown in fig. 4b, in response to a selection of the fourth graphical object 47 received through the input device. In response to an operation on the fourth operation option 49 received through the input device, the terminal device may delete the data of the immovable unit of the second camp and control the display device to cancel the display of the fourth graphical object 47. For example, when the user (player) wishes to build another unit on the site of the building, the building can be cancelled by operating the fourth operation option 49 and another building can be built in its place. For another example, when the user wishes to sell the building in exchange for resources, the building may be cancelled by operating the fourth operation option 49; accordingly, the terminal device deletes the data of the building, controls the display device to cancel the display of the fourth graphical object 47, adds the corresponding resource value to the current user, and presents the updated resource value in the graphical user interface.
In some examples, the operation on the third graphical object may be a single operation or a combination of operations performed sequentially in a set order.
For example, selection operations for the third graphical object and for the movement target point may be monitored separately, and the position selected in the graphical user interface by the latter selection operation may be taken as the movement target point. The selection operation on the third graphical object may be a click or touch operation, or a sliding selection operation. In some examples, when a click or touch operation on the third graphical object is monitored, it is determined that a selection operation on the third graphical object has been monitored. In still other examples, a selection area is determined in the graphical user interface based on the start and end positions of a monitored sliding selection operation, and when the third graphical object is located in the selection area, it is determined that a selection operation on the third graphical object has been monitored. That is, when the sliding selection operation is monitored, all third graphical objects within the selection area are determined to be selected.
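A sliding (box) selection of this kind might be implemented as in the sketch below; the rectangle convention and helper name are illustrative assumptions, not part of the patent.

    def units_in_selection(units, start, end):
        """Return the second-camp units whose graphical objects lie in the
        rectangle spanned by the start and end points of a sliding selection.

        `units` is an iterable of objects with x and y attributes;
        `start` and `end` are the (x, y) screen positions of the slide.
        """
        left, right = sorted((start[0], end[0]))
        bottom, top = sorted((start[1], end[1]))
        return [u for u in units if left <= u.x <= right and bottom <= u.y <= top]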
When the selected third graphical object and the movement target point have been determined in sequence from the monitored selection operations, the terminal device can read the preset terrain information of the current map, determine a path from the current position of the selected third graphical object to the movement target point according to the preset path-finding mechanism, and then present the movement of the selected third graphical object to the movement target point along that path.
For another example, a drag operation on the third graphical object may be monitored, and the end position of the drag operation may be determined as the movement target point. Here, a drag operation is an operation that moves the target continuously following the operator's hand motion. The drag operation may include, but is not limited to, pressing a mouse button and dragging the mouse, a sliding gesture on a screen, or a movement made through a handheld device (e.g., a game controller), and the like.
When the selected third graphical object and the movement target point are determined from a monitored drag operation, the terminal device may read the preset terrain information of the current map and determine whether the movement target point is reachable. If it is reachable, the display device may be controlled to display a line of a first color (which may be a straight line, a polyline, or a curve) in the graphical user interface, with one end of the line at the current position of the selected third graphical object and the other end at the movement target point. If it is not reachable, the display device is controlled to display a line of a second color in the graphical user interface, again with one end at the current position of the selected third graphical object and the other end at the movement target point.
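Together with a path finder such as the one sketched earlier, the colored drag-feedback line could be produced along the following lines; the color values, the to_cell mapping, and the find_path parameter are assumptions for illustration.

    REACHABLE_COLOR = (0, 255, 0)      # first color, e.g. green
    UNREACHABLE_COLOR = (255, 0, 0)    # second color, e.g. red

    def drag_feedback(selected_unit, drag_end, walkable, to_cell, find_path):
        """Return the endpoints and color of the feedback line while dragging.

        `to_cell` maps a screen position to a grid cell of the baked terrain;
        `find_path` is a path-finding routine such as the BFS sketched above.
        """
        start_cell = to_cell((selected_unit.x, selected_unit.y))
        goal_cell = to_cell(drag_end)
        reachable = find_path(walkable, start_cell, goal_cell) is not None
        color = REACHABLE_COLOR if reachable else UNREACHABLE_COLOR
        # One end of the line is the unit's current position, the other end
        # is the movement target point chosen by the drag operation.
        return ((selected_unit.x, selected_unit.y), drag_end, color)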
In some examples, the movement of the third graphical object may also be accomplished through another icon. For example, the display device may be controlled to display an icon in the graphical user interface, where the icon corresponds to a category of movable units of the second camp. When a selection operation (such as a click or touch) on the icon is monitored, all movable units of the category corresponding to the icon are determined to be selected; that is, when the movable unit of the second camp corresponding to the third graphical object belongs to that category, the third graphical object is determined to be selected. For another example, when a drag operation on the icon is monitored, all movable units of the category corresponding to the icon are determined to be the objects of the drag operation; that is, when the movable unit of the second camp corresponding to the third graphical object belongs to that category, the third graphical object is determined to be an object of the drag operation.
An operation on the third graphical object is an operation that can move the position of the third graphical object, and includes, but is not limited to, a move operation, a move-and-attack operation, and the like. When the movement target point is an empty location in the graphical user interface, the operation is a move operation. When the movement target point falls on the graphical object of an attackable unit in the graphical user interface, the operation is a move-and-attack operation, i.e., the third graphical object moves to the position of that graphical object and attacks it.
In some embodiments, the basic data of a movable unit is separated from its behavior logic; that is, no camp is assigned when the movable unit is generated from the basic data, and the behavior logic of a particular camp is then loaded so that the unit exhibits the corresponding behavior. FIG. 5 is a flowchart of a method for generating movable units according to an embodiment of the present application. In this embodiment, the terminal device stores in advance the basic data of a movable unit, first behavior logic corresponding to movable units of the first camp, and second behavior logic corresponding to movable units of the second camp. The basic data comprises a series of basic attributes common to movable units, such as hit points, attack power, defense, attack range, movement speed, and the like. Behavior logic, also called a behavior pattern, is a collective term for the unit's default behavior and/or its handling of various events. For example, the behavior logic of a movable unit of the first camp may include moving along the set path, leaving the set path when a unit of the second camp appears within its attack range, destroying that unit, and returning to the set path. The behavior logic of a movable unit of the second camp may include moving in the scene under the control of operations and instructions received through the input device, and actively initiating an attack when a unit of the first camp appears within its attack range. The behavior logic may be stored as a separate file. The file format may be an existing format, such as XML, or a specifically defined proprietary format.
As shown in fig. 5, the method 50 may include the following steps.
Step S51, in response to a determination that a preset condition is satisfied, generating a creation instruction in which indication information indicating the first camp is set.
Step S52, in response to an operation on the first operation option received through the input device, generating a creation instruction in which indication information indicating the second camp is set.
For example, when the preset condition is met (for example, a preset point in time is reached), a creation instruction is generated according to the preset information; this creation instruction may include indication information indicating that the movable unit is produced by the immovable unit of the first camp. When an operation (for example, a touch operation, a mouse click, and the like) on the first operation option is received through the input device, a creation instruction is generated; this creation instruction may include indication information indicating that the movable unit is produced by an immovable unit of the second camp.
Step S53, creating an instance of the movable unit based on the creation instruction and the basic data of the movable unit.
The first behavior logic or the second behavior logic is loaded for the instance according to the indication information in the creation instruction.
Step S54, determining whether the indication information in the creation instruction indicates the first camp; if it indicates the first camp, step S55 is performed, and if it indicates the second camp, step S56 is performed.
Step S55, loading the behavior logic of the first camp.
Step S56, loading the behavior logic of the second camp.
In some examples, the corresponding appearance representation may also be loaded for the instance.
Step S57, controlling the display device to display the graphical object corresponding to the instance, and displaying the actions of the graphical object under the control of the loaded behavior logic.
That is, the display device is controlled to display the second graphical object or the third graphical object corresponding to the instance, and the actions of the second graphical object or the third graphical object are displayed under the control of the loaded first behavior logic or second behavior logic.
By loading the basic attributes and the behavior logic of movable units separately, the creation mechanism of movable units becomes more flexible, the amount of stored data is reduced, and the basic data or the behavior logic can be updated independently, which makes updates faster and saves network transmission resources.
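The separation of basic data from camp-specific behavior logic might look like the following sketch; the class names and the way behavior logic is keyed by the camp indication are assumptions, not the patent's prescribed implementation.

    from dataclasses import dataclass, replace
    from typing import Callable, Optional

    @dataclass(frozen=True)
    class BaseUnitData:
        hp: float
        attack_power: float
        defense: float
        attack_range: float
        speed: float

    @dataclass
    class UnitInstance:
        data: BaseUnitData
        camp: Optional[str] = None
        behavior: Optional[Callable] = None   # per-frame behavior logic

    def first_camp_behavior(unit, world, dt):
        # Sketch only: move along the set path and fight back when intercepted.
        ...

    def second_camp_behavior(unit, world, dt):
        # Sketch only: move under player control, attack first-camp units in range.
        ...

    BEHAVIOR_LOGIC = {"first": first_camp_behavior, "second": second_camp_behavior}

    def create_unit(base: BaseUnitData, creation_instruction: dict) -> UnitInstance:
        """Steps S53 to S56: create the instance from the basic data, then load the
        behavior logic selected by the camp indication in the creation instruction."""
        instance = UnitInstance(data=replace(base))   # copy the shared basic data
        instance.camp = creation_instruction["camp"]  # "first" or "second"
        instance.behavior = BEHAVIOR_LOGIC[instance.camp]
        return instance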
In some examples, in order to keep the growth of the second camp's power under control, each creation of a movable unit of the second camp may be set to consume a certain amount of resources (such as virtual coins) owned by the current user, or a maximum number of movable units of the second camp may be set. In some examples, a queue may be used to maintain the creation order of the movable units of the second camp. Information about the movable units to be created can be added to the queue according to the operating instructions received through the input device. Furthermore, it may be arranged that resources are not deducted when a movable unit is added to the queue, but only when the movable unit is actually created. If the amount of resources is insufficient when a movable unit is about to be created, the creation process is suspended, and it is checked at preset intervals whether the amount of resources has become sufficient to start the creation process. When the amount of resources is sufficient, the movable unit at the front of the creation queue is created.
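A resource-gated creation queue of this kind could be sketched as follows; the cost table and the tick-based polling are illustrative assumptions.

    from collections import deque

    class CreationQueue:
        """Queue of second-camp movable units waiting to be created.

        Resources are deducted only when a unit is actually created, not when it
        is queued; if resources are insufficient, creation is suspended and
        retried on a later tick.
        """

        def __init__(self, unit_costs):
            self.unit_costs = unit_costs      # e.g. {"infantry": 50, "tank": 150}
            self.pending = deque()

        def enqueue(self, unit_type):
            self.pending.append(unit_type)    # no resource deduction here

        def tick(self, player):
            """Called at preset intervals; creates the front unit if affordable."""
            if not self.pending:
                return None
            unit_type = self.pending[0]
            cost = self.unit_costs[unit_type]
            if player.resources < cost:
                return None                   # suspended; check again next tick
            player.resources -= cost
            self.pending.popleft()
            return unit_type                  # caller spawns the graphical object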
In some examples, the movable units of the second camp may be created in sequence according to a plurality of creation instructions, and their creation order may be adjusted according to operations received through the input device. FIG. 6 is a flowchart of a method for creating movable units according to an embodiment of the present application. As shown in fig. 6, the method 60 may include the following steps.
Step S61, controlling the display device to display an icon list in the graphical user interface.
The icon list may include one or more icons, each icon representing a movable unit of the second camp that is to be created.
Step S62, in response to an operation on the second operation option received through the input device, controlling the display device to add the icon of the movable unit of the second camp corresponding to the second operation option to the first end of the icon list.
Step S63, in response to a movement operation on one or more icons in the icon list received through the input device, controlling the display device to move the one or more icons to a position in the icon list determined according to the movement operation.
Fig. 7a and 7b are schematic diagrams of a graphical user interface according to an embodiment of the present application. As shown in fig. 7a, the graphical user interface displays an icon list 71, which includes icons 72, 73, and 74 corresponding to a first-type movable unit, a second-type movable unit, and another first-type movable unit, respectively. The icon list 71 indicates that there are currently 3 second-camp movable units to be created, in the order first type, second type, first type. In response to a movement operation on the icon 73, the terminal device controls the display device to move the icon 73 one position to the left and displays the updated icon list 71, in which the icons are arranged in the order 73, 72, 74. The icon list 71 now indicates that there are currently 3 second-camp movable units to be created, in the order of one second-type movable unit followed by two first-type movable units.
Step S64, controlling the display device to display, in the graphical user interface, the graphical object of the second-camp movable unit corresponding to the icon located at the second end of the icon list, to delete that icon from the second end of the icon list, and to move the remaining icons one position towards the second end.
In this way, the graphical objects of the second-camp movable units corresponding to the icons can be presented in the graphical user interface in sequence, according to the adjusted arrangement order of the icons in the icon list. In some examples, the display device is controlled to display a movable unit in the graphical user interface only after a predetermined time has elapsed since the creation instruction was received, simulating the time consumed by production. As the battle situation changes, the units most urgently needed by the second camp may also change. By providing the icon list and adjusting the icons of the movable units to be created according to the input operations, the movable units to be created can be presented in the graphical user interface in the adjusted order, which improves the user experience.
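The reorderable creation list can be sketched as below. The insertion convention (new icons added at the first end, units created from the second end) follows the description above, while the concrete data structure is an illustrative assumption.

    class IconList:
        """Icon list of second-camp movable units waiting to be created.

        Index 0 is treated as the second end (next unit to create) and the
        last index as the first end (where newly requested units are appended).
        """

        def __init__(self):
            self.icons = []            # e.g. ["type1", "type2", "type1"]

        def add(self, unit_type):
            self.icons.append(unit_type)          # step S62: add at the first end

        def move(self, from_index, to_index):
            """Step S63: move one icon to the position chosen by the player."""
            icon = self.icons.pop(from_index)
            self.icons.insert(to_index, icon)

        def create_next(self):
            """Step S64: create the unit at the second end and drop its icon."""
            if not self.icons:
                return None
            return self.icons.pop(0)   # remaining icons shift towards the second end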
Fig. 8 is a schematic diagram of a game control device according to an embodiment of the present application. As shown in fig. 8, the apparatus 80 may include the following modules.
The display control module 861 may control the display device to display a graphical user interface, where the graphical user interface is used to display a first graphical object representing an immovable unit of a first camp and a route connecting the first graphical object with a preset position; control the display device to display at least one second graphical object representing a movable unit of the first camp in the graphical user interface, and control the second graphical object to move from the first graphical object to the preset position along the route; and control the display device to display one or more first operation options in the graphical user interface, and, in response to a first operation on the first operation options received through an input device, control the display device to display at least one third graphical object representing a movable unit of a second camp in the graphical user interface.
The behavior processing module 862 is configured to determine a movement target point in the graphical user interface and control the third graphical object to move to the movement target point when a second operation on the third graphical object is monitored; and, when detecting that the first graphical object or the second graphical object is located within the attack range of the third graphical object, control the third graphical object to attack the first graphical object or the second graphical object, and modify the attribute value corresponding to the first graphical object or the second graphical object.
In some embodiments, the apparatus 80 may further include a determining module 863, configured to control the display device to display a second graphical user interface representing the victory of the second camp when detecting that the attribute value corresponding to the first graphical object reaches a set threshold.
In some examples, the apparatus 80 may also include a processor 81, an input device 84, a display device 85, a storage device 86, and a bus 89. The storage device 86 includes an operating system 867, an input processing module 868, and a logic control module 866.
There may be one or more processors 81, which may be located in the same physical device or distributed among multiple physical devices.
The apparatus 80 may receive operations and instructions from a user using the input device 84 and display a graphical user interface via the display device 85. Input devices 84 may include, but are not limited to, a keyboard, mouse, joystick, hand-held remote control device, touch screen, etc. The display device 85 may be a display screen, a touch screen, or the like. In some examples, the input device 84 and the display device 85 are both implemented as a single touch screen.
The logic control module 866 may include a display control module 861, a behavior processing module 862, and a determination module 863, which may be implemented by computer readable instructions.
In some examples, the apparatus 80 may further include a storage module (not shown), in which the basic data of a movable unit, first behavior logic corresponding to movable units of the first camp, and second behavior logic corresponding to movable units of the second camp are stored in advance. The display control module 861 may create an instance of the movable unit from the basic data of the movable unit; load the first behavior logic or the second behavior logic for the instance; and control the display device to display the second graphical object or the third graphical object corresponding to the instance, and display the actions of the second graphical object or the third graphical object under the control of the loaded first behavior logic or second behavior logic.
In some examples, the display control module 861 may generate a creation instruction in response to a determination that a preset condition is satisfied, with indication information indicating the first camp set in the creation instruction; or generate a creation instruction in response to an operation on the first operation option received through the input device, with indication information indicating the second camp set in the creation instruction. The display control module 861 may create the instance according to the creation instruction and load the first behavior logic or the second behavior logic for the instance according to the indication information in the creation instruction.
In some examples, the display control module 861 may control the display device to display a second operation option in the graphical user interface, and, in response to an operation on the second operation option received through an input device and a selection of a placement position in the graphical user interface, control the display device to display a fourth graphical object representing an immovable unit of the second camp at the placement position; and, in response to an operation on the fourth graphical object received through the input device, control the display device to display the first operation option in the graphical user interface.
In some examples, the display control module 861 may control the display device to display one or more of the following in the graphical user interface in response to a selection of the fourth graphical object received through an input device:
a third operation option, in which case the method further includes: changing an attribute value of the immovable unit of the second camp in response to an operation on the third operation option received through an input device;
a fourth operation option, in which case the method further includes: deleting the data of the immovable unit of the second camp and controlling the display device to cancel the display of the fourth graphical object in response to an operation on the fourth operation option received through the input device.
In some examples, the behavior processing module 862 may take the position selected by a selection operation in the graphical user interface as the movement target point when selection operations on the third graphical object and on the movement target point are monitored; or, when a drag operation on the third graphical object is monitored, determine the end position of the drag operation as the movement target point.
In some examples, the behavior processing module 862 may perform at least one of:
monitoring a click or touch operation on the third graphical object;
monitoring a sliding selection operation, determining a selection area in the graphical user interface according to the start and end positions of the sliding selection operation, and determining that a selection operation on the third graphical object has been monitored when the third graphical object is located in the selection area;
providing an icon in the graphical user interface, where the icon corresponds to a category of movable units of the second camp, monitoring a click or touch operation on the icon, and determining that the third graphical object is selected when the movable unit of the second camp corresponding to the third graphical object belongs to that category.
In some examples, the behavior processing module 862 may perform at least one of:
monitoring a drag operation on the third graphical object;
providing an icon in the graphical user interface, where the icon corresponds to a category of movable units of the second camp, monitoring a drag operation on the icon, and determining that the third graphical object is an object of the drag operation when the movable unit of the second camp corresponding to the third graphical object belongs to that category.
In some examples, the display control module 861 may also present an icon list in the graphical user interface, including one or more icons, each icon representing a movable unit of the second camp to be created; in response to a movement operation on one or more icons in the icon list received through an input device, move the one or more icons to a position in the icon list determined according to the movement operation; and present, in the graphical user interface, the graphical objects of the second-camp movable units corresponding to the icons in sequence, according to the adjusted arrangement order of the icons in the icon list.
In some examples, the display control module 861 may add the icon of the second-camp movable unit corresponding to the first operation option to the first end of the icon list in response to an operation on the first operation option received through an input device; and
control the display device to display, in the graphical user interface, the graphical object of the second-camp movable unit corresponding to the icon located at the second end of the icon list, delete that icon from the second end of the icon list, and move the remaining icons one position towards the second end.
Embodiments of the present application also provide a computer-readable storage medium having computer-readable instructions stored therein. These computer readable instructions, when executed by a processor, may implement the methods of the embodiments of the present application.
It should be noted that not all steps and modules in the above flows and structures are necessary, and some steps or modules may be omitted according to actual needs. The execution order of the steps is not fixed and can be adjusted as required. The division into modules is only a functional division adopted for convenience of description; in an actual implementation, one module may be split into multiple modules, the functions of multiple modules may be implemented by the same module, and these modules may be located in the same device or in different devices. In addition, the use of "first" and "second" in the above description is merely for convenience in distinguishing two objects of the same kind and does not indicate a substantial difference.
In various examples, the modules may be implemented by specialized hardware or hardware executing machine-readable instructions. For example, the hardware may be specially designed permanent circuits or logic devices (e.g., special purpose processors, such as FPGAs or ASICs) for performing the specified operations. Hardware may also include programmable logic devices or circuits temporarily configured by software (e.g., including a general purpose processor or other programmable processor) to perform certain operations.
Machine-readable instructions corresponding to the modules may be stored in a non-volatile computer-readable storage medium, which may cause an operating system or the like operating on the computer to perform some or all of the operations described herein. The nonvolatile computer readable storage medium includes a floppy disk, a hard disk, a magneto-optical disk, an optical disk (e.g., CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, DVD + RW), a magnetic tape, a nonvolatile memory card, and a ROM. Alternatively, the program code may be downloaded from a server computer via a communications network.
In view of the above, the scope of the claims should not be limited to the embodiments in the examples described above, but should be given the broadest interpretation given the description as a whole.

Claims (21)

1. A display control method, comprising:
controlling a display device to display a graphical user interface, wherein the graphical user interface is used for displaying a first graphical object representing an immovable unit of a first camp and a route connecting the first graphical object with a preset position;
controlling the display device to display at least one second graphical object representing a first camp movable unit in the graphical user interface, and controlling the second graphical object to move from the first graphical object to the preset position along the route;
controlling the display device to display at least one operation option in the graphical user interface, and in response to a first operation on the operation option received through an input device, controlling the display device to display at least one third graphical object representing a second camp movable unit in the graphical user interface;
when a second operation on the third graphical object is detected, determining a movement target point in the graphical user interface, and controlling the third graphical object to move towards the movement target point;
when detecting that the first graphical object or the second graphical object is located within the attack range of the third graphical object, controlling the third graphical object to attack the first graphical object or the second graphical object, and modifying the attribute value of the first graphical object or the second graphical object;
providing an icon in the graphical user interface, wherein the icon corresponds to one category of second camp movable units, detecting an operation on the icon, and determining all movable units of the category corresponding to the icon as objects of the operation;
controlling the display device to display an icon list in the graphical user interface, wherein the icon list comprises one or more icons, and each icon represents a second camp movable unit to be created;
in response to a movement operation on one or more icons in the icon list received through an input device, moving the one or more icons to a position in the icon list determined according to the movement operation; and
controlling the display device to sequentially display, in the graphical user interface, the graphical objects of the second camp movable units corresponding to the respective icons according to the adjusted arrangement order of the icons in the icon list.
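For illustration only, the combat condition in claim 1 (attacking a first or second graphical object once it lies within the third graphical object's attack range and modifying its attribute value) could be checked per update roughly as follows; the field names and the simple hit-point damage model are assumptions, not taken from the claim.

```python
import math
from dataclasses import dataclass

@dataclass
class Unit:
    x: float
    y: float
    hp: float             # the "attribute value" modified by an attack
    attack_range: float
    attack_power: float

def update_combat(attacker: Unit, candidates: list[Unit]) -> None:
    """Attack the first candidate found within range and modify its attribute value."""
    for target in candidates:
        if math.hypot(target.x - attacker.x, target.y - attacker.y) <= attacker.attack_range:
            target.hp -= attacker.attack_power
            break  # one attack per update in this simplified sketch
```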
2. The method of claim 1, further comprising:
when detecting that the attribute value corresponding to the first graphical object reaches a set threshold, controlling the display device to display a second graphical user interface indicating that the second camp has won.
3. The method of claim 1, further comprising: pre-storing basic data of a movable unit, first behavior logic corresponding to the first camp movable unit, and second behavior logic corresponding to the second camp movable unit; wherein controlling the display device to display at least one of the second graphical object or the third graphical object in the graphical user interface comprises:
creating an instance of the movable unit from the basic data of the movable unit;
loading the first behavior logic or the second behavior logic for the instance; and
controlling the display device to display, in the graphical user interface, the second graphical object or the third graphical object corresponding to the instance, and controlling the behavior of the second graphical object or the third graphical object according to the loaded first behavior logic or second behavior logic.
4. The method of claim 3, further comprising:
generating a creation instruction in response to determining that a preset condition is met, and setting, in the creation instruction, indication information indicating the first camp; or
generating a creation instruction in response to a first operation on the operation option received through an input device, wherein indication information indicating the second camp is set in the creation instruction;
wherein creating the instance of the movable unit and loading the first behavior logic or the second behavior logic for the instance comprises: creating the instance according to the creation instruction, and loading the first behavior logic or the second behavior logic for the instance according to the indication information in the creation instruction.
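A plausible, purely illustrative way to realize claims 3 and 4 is a small data-driven factory: shared basic data is stored per unit type, a creation instruction carries the camp indication, and the matching behavior logic is attached to the new instance. All names and fields below are assumptions, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import Callable

# Pre-stored basic data shared by both camps for each unit type.
BASE_DATA = {
    "soldier": {"hp": 100.0, "speed": 1.5, "attack_power": 10.0, "attack_range": 2.0},
}

@dataclass
class UnitInstance:
    kind: str
    stats: dict
    behavior: Callable[["UnitInstance"], None]
    state: str = "idle"

def first_camp_logic(unit: UnitInstance) -> None:
    # First camp units advance along the route toward the preset position.
    unit.state = "follow_route"

def second_camp_logic(unit: UnitInstance) -> None:
    # Second camp units wait for a player-issued movement target point.
    unit.state = "await_order"

def create_unit(creation_instruction: dict) -> UnitInstance:
    """Create an instance from the basic data and load the behavior logic
    indicated by the creation instruction."""
    kind = creation_instruction["kind"]
    camp = creation_instruction["camp"]  # "first" (preset condition) or "second" (player operation)
    behavior = first_camp_logic if camp == "first" else second_camp_logic
    unit = UnitInstance(kind=kind, stats=dict(BASE_DATA[kind]), behavior=behavior)
    behavior(unit)
    return unit
```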
5. The method of claim 1, further comprising:
controlling the display device to display at least one second operation option in the graphical user interface, and in response to a third operation on the second operation option received through an input device and a selection of a placement position in the graphical user interface, controlling the display device to display a fourth graphical object representing a second camp immovable unit at the placement position;
wherein controlling the display device to display the at least one operation option in the graphical user interface comprises: controlling the display device to display the operation option in the graphical user interface in response to an operation on the fourth graphical object received through the input device.
6. The method of claim 5, further comprising:
in response to a selection of the fourth graphical object received through an input device, controlling the display device to display one or more of the following in the graphical user interface:
a third operation option, the method further comprising: changing an attribute value of the second camp immovable unit in response to an operation on the third operation option received through an input device;
a fourth operation option, the method further comprising: deleting the data of the second camp immovable unit and controlling the display device to cancel the display of the fourth graphical object in response to an operation on the fourth operation option received through an input device.
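As a rough, non-authoritative illustration of claim 6's two options for a placed second camp immovable unit (the function names and the upgrade amount are assumptions), one handler might raise the unit's attribute value and the other might delete its data and hide its graphical object:

```python
from dataclasses import dataclass

@dataclass
class ImmovableUnit:
    hp: float
    visible: bool = True

def on_third_option(unit: ImmovableUnit, hp_bonus: float = 50.0) -> None:
    # Third operation option: change an attribute value of the
    # second camp immovable unit (e.g. an upgrade to its hit points).
    unit.hp += hp_bonus

def on_fourth_option(units: list[ImmovableUnit], unit: ImmovableUnit) -> None:
    # Fourth operation option: delete the unit's data and cancel
    # the display of its graphical object.
    unit.visible = False
    units.remove(unit)
```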
7. The method of claim 1, wherein detecting the second operation on the third graphical object and determining the movement target point in the graphical user interface comprises:
detecting selection operations on the third graphical object and on the movement target point respectively, and taking the position selected by the selection operations in the graphical user interface as the movement target point; or
detecting a drag operation on the third graphical object, and determining the end position of the drag operation as the movement target point.
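The two ways of obtaining the movement target point in claim 7, together with the movement toward it required by claim 1, can be illustrated with the following minimal sketch; the function names and the tick-based movement model are assumptions.

```python
import math

Point = tuple[float, float]

def target_from_selection(selected_point: Point) -> Point:
    # Separate selection operations on the unit and on a position in the
    # graphical user interface: the selected position is the target point.
    return selected_point

def target_from_drag(drag_path: list[Point]) -> Point:
    # Drag operation on the unit: the end position of the drag is the target point.
    return drag_path[-1]

def step_towards(position: Point, target: Point, speed: float) -> Point:
    """Advance the unit one tick toward the movement target point."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (position[0] + speed * dx / dist, position[1] + speed * dy / dist)
```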
8. The method of claim 1, wherein detecting the operation on the icon and determining all movable units of the category corresponding to the icon as objects of the operation comprises:
detecting a click or touch operation on the icon, and determining that all movable units of the category corresponding to the icon are selected.
9. The method of claim 1, wherein detecting the operation on the icon and determining all movable units of the category corresponding to the icon as objects of the operation comprises:
detecting a drag operation on the icon, and when the second camp movable unit corresponding to the third graphical object belongs to the category, determining all movable units of the category corresponding to the icon as objects of the drag operation.
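Claims 8 and 9 amount to selecting every second camp movable unit of the icon's category at once; a minimal sketch, with all names below assumed for illustration, is:

```python
from dataclasses import dataclass

@dataclass
class MovableUnit:
    unit_id: int
    category: str   # e.g. "soldier", "archer"
    camp: str       # "first" or "second"

def select_by_icon(units: list[MovableUnit], icon_category: str) -> list[MovableUnit]:
    # A click/touch on the icon (claim 8) or a drag of the icon onto a unit
    # of the same category (claim 9) targets every second camp movable unit
    # of that category currently in play.
    return [u for u in units if u.camp == "second" and u.category == icon_category]
```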
10. The method of claim 1, wherein controlling the display device to sequentially display the graphical objects of the second camp movable units corresponding to the respective icons in the graphical user interface according to the adjusted arrangement order of the icons in the icon list comprises:
in response to a first operation on the operation option received through an input device, adding an icon of the second camp movable unit corresponding to the operation option to a first end of the icon list; and
controlling the display device to display, in the graphical user interface, the graphical object of the second camp movable unit corresponding to the icon located at a second end of the icon list, deleting the icon from the second end of the icon list, and moving the remaining icons one position toward the second end.
11. A game control apparatus, comprising:
a display control module, configured to display a graphical user interface, wherein the graphical user interface is used for displaying a first graphical object representing an immovable unit of a first camp and a route connecting the first graphical object and a preset position; control a display device to display at least one second graphical object representing a first camp movable unit in the graphical user interface, and control the second graphical object to move from the first graphical object to the preset position along the route; control the display device to display at least one operation option in the graphical user interface, and in response to a first operation on the operation option received through an input device, control the display device to display at least one third graphical object representing a second camp movable unit in the graphical user interface;
a behavior processing module, configured to determine a movement target point in the graphical user interface when a second operation on the third graphical object is detected, and control the third graphical object to move to the movement target point; when detecting that the first graphical object or the second graphical object is located within the attack range of the third graphical object, control the third graphical object to attack the first graphical object or the second graphical object, and modify the attribute value of the first graphical object or the second graphical object; and provide an icon in the graphical user interface, wherein the icon corresponds to one category of second camp movable units, detect an operation on the icon, and determine all movable units of the category corresponding to the icon as objects of the operation;
the display control module is further configured to:
control the display device to display an icon list in the graphical user interface, wherein the icon list comprises one or more icons, and each icon represents a second camp movable unit to be created;
in response to a movement operation on one or more icons in the icon list received through an input device, move the one or more icons to a position in the icon list determined according to the movement operation; and
control the display device to sequentially display, in the graphical user interface, the graphical objects of the second camp movable units corresponding to the respective icons according to the adjusted arrangement order of the icons in the icon list.
12. The apparatus of claim 11, further comprising:
a judging module, configured to control the display device to display a second graphical user interface indicating that the second camp has won when detecting that the attribute value corresponding to the first graphical object reaches a set threshold.
13. The apparatus of claim 11, further comprising:
a storage module, configured to pre-store basic data of a movable unit, first behavior logic corresponding to the first camp movable unit, and second behavior logic corresponding to the second camp movable unit; wherein
the display control module is configured to: create an instance of the movable unit from the basic data of the movable unit; load the first behavior logic or the second behavior logic for the instance; and control the display device to display, in the graphical user interface, the second graphical object or the third graphical object corresponding to the instance, and control the behavior of the second graphical object or the third graphical object according to the loaded first behavior logic or second behavior logic.
14. The apparatus of claim 13, wherein the display control module is configured to:
generate a creation instruction in response to determining that a preset condition is met, and set, in the creation instruction, indication information indicating the first camp; or
generate a creation instruction in response to a first operation on the operation option received through an input device, wherein indication information indicating the second camp is set in the creation instruction; and
create the instance according to the creation instruction, and load the first behavior logic or the second behavior logic for the instance according to the indication information in the creation instruction.
15. The apparatus of claim 11, wherein the display control module is further configured to:
control the display device to display at least one second operation option in the graphical user interface, and in response to a third operation on the second operation option received through an input device and a selection of a placement position in the graphical user interface, control the display device to display a fourth graphical object representing a second camp immovable unit at the placement position; and
control the display device to display the operation option in the graphical user interface in response to an operation on the fourth graphical object received through the input device.
16. The apparatus of claim 15, wherein the display control module is further configured to:
in response to a selection of the fourth graphical object received through an input device, control the display device to display one or more of the following in the graphical user interface:
a third operation option, wherein the display control module is further configured to change an attribute value of the second camp immovable unit in response to an operation on the third operation option received through an input device;
a fourth operation option, wherein the display control module is further configured to delete the data of the second camp immovable unit and control the display device to cancel the display of the fourth graphical object in response to an operation on the fourth operation option received through an input device.
17. The apparatus of claim 11, wherein the behavior processing module is configured to:
detect selection operations on the third graphical object and on the movement target point respectively, and take the position selected by the selection operations in the graphical user interface as the movement target point; or
detect a drag operation on the third graphical object, and determine the end position of the drag operation as the movement target point.
18. The apparatus of claim 11, wherein the behavior processing module is configured to:
detect a click or touch operation on the icon, and determine that all movable units of the category corresponding to the icon are selected.
19. The apparatus of claim 11, wherein the behavior processing module is configured to:
detect a drag operation on the icon, and determine all movable units of the category corresponding to the icon as objects of the drag operation.
20. The apparatus of claim 11, wherein the display control module is configured to:
in response to a first operation on the operation option received through an input device, add an icon of the second camp movable unit corresponding to the operation option to a first end of the icon list; and
control the display device to display, in the graphical user interface, the graphical object of the second camp movable unit corresponding to the icon located at a second end of the icon list, delete the icon from the second end of the icon list, and move the remaining icons one position toward the second end.
21. A computer-readable storage medium storing computer-readable instructions which, when executed by a processor, implement the steps of the method of any one of claims 1-10.
CN201711114665.5A 2017-11-13 2017-11-13 Display control method, device and storage medium Active CN107890669B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711114665.5A CN107890669B (en) 2017-11-13 2017-11-13 Display control method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711114665.5A CN107890669B (en) 2017-11-13 2017-11-13 Display control method, device and storage medium

Publications (2)

Publication Number Publication Date
CN107890669A CN107890669A (en) 2018-04-10
CN107890669B true CN107890669B (en) 2021-04-16

Family

ID=61805286

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711114665.5A Active CN107890669B (en) 2017-11-13 2017-11-13 Display control method, device and storage medium

Country Status (1)

Country Link
CN (1) CN107890669B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108744512A (en) * 2018-06-01 2018-11-06 腾讯科技(深圳)有限公司 Information cuing method and device, storage medium and electronic device
CN110619069A (en) * 2018-06-18 2019-12-27 富士施乐株式会社 Information processing apparatus and non-transitory computer readable medium
CN109343773B (en) * 2018-10-11 2021-07-09 广州要玩娱乐网络技术股份有限公司 Control method and device of portable touch equipment, storage medium and terminal
CN112076469A (en) * 2020-09-18 2020-12-15 腾讯科技(深圳)有限公司 Virtual object control method and device, storage medium and computer equipment
CN114344913B (en) * 2022-01-04 2023-06-20 腾讯科技(深圳)有限公司 Game data processing method, device, equipment and readable storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012090844A (en) * 2010-10-28 2012-05-17 Square Enix Co Ltd Game system, program for game system, and information recording medium
CN104423841A (en) * 2013-09-11 2015-03-18 中兴通讯股份有限公司 Method and device for dragging icon
CN104645616A (en) * 2015-03-16 2015-05-27 成都优聚软件有限责任公司 Method and system for setting moving path of game object in tower defence game
CN107050862A (en) * 2017-05-19 2017-08-18 网易(杭州)网络有限公司 Display control method and system, the storage medium of scene of game

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
(Rated 9/10) The perfect combination of RTS and tower defense — a hands-on look at the mobile game 钢铁战队; Colddagger; https://www.bilibili.com/video/av14822626/; 2017-09-26; video, 09:00 to 16:34 *
Detailed analysis of the game systems of 灰蛊 — an illustrated beginner's tutorial and guide; anonymous contributor (热心网友); https://gl.ali213.net/html/2015-1/60219_35.html; 2015-01-26; pages 2-3 *
Colddagger. (Rated 9/10) The perfect combination of RTS and tower defense — a hands-on look at the mobile game 钢铁战队. https://www.bilibili.com/video/av14822626/. 2017, video, 09:00 to 16:34. *
The most complete beginner's tutorial for 皇室战争 — a super-detailed guide for new players; Augustus; http://www.benshouji.com/zhinan/5607964526/; 2016-03-09; page 6 *

Also Published As

Publication number Publication date
CN107890669A (en) 2018-04-10

Similar Documents

Publication Publication Date Title
CN107890669B (en) Display control method, device and storage medium
US11439906B2 (en) Information prompting method and apparatus, storage medium, and electronic device
WO2021244322A1 (en) Method and apparatus for aiming at virtual object, device, and storage medium
US9480921B2 (en) Potential damage indicator of a targeted object
US9764226B2 (en) Providing enhanced game mechanics
CN110433493B (en) Virtual object position marking method, device, terminal and storage medium
CN111249735B (en) Path planning method and device for control object, processor and electronic device
Uriarte et al. Game-tree search over high-level game states in RTS games
US9004997B1 (en) Providing enhanced game mechanics
JP2022532870A (en) How to display operation controls based on virtual scenes, devices, computer devices and computer programs
WO2022057624A1 (en) Method and apparatus for controlling virtual object to use virtual prop, and terminal and medium
US11266909B2 (en) Storage medium storing game program, information processing apparatus, game processing method, and game system
US20230330530A1 (en) Prop control method and apparatus in virtual scene, device, and storage medium
US9033797B1 (en) Multiple user viewing modes of an environment
CN113262488B (en) Control method, device, equipment and storage medium for virtual objects in virtual scene
CN110465090B (en) Virtual object control method, device, terminal and storage medium
US20230031248A1 (en) Level screen display method, apparatus, device and storage medium
KR20230087602A (en) Virtual character control method and device, terminal, storage medium, and program product
CN114247146A (en) Game display control method and device, electronic equipment and medium
US11110358B2 (en) Storage medium having stored therein information processing program, information processing apparatus, information processing system, and information processing method
WO2024093941A1 (en) Method and apparatus for controlling virtual object in virtual scene, device, and product
CN114534258A (en) Game guide information display method, device, equipment and medium
JP2017000409A (en) Game apparatus, control method thereof, and program
US20240293743A1 (en) Build element availability for real-time strategy game
CN112870708B (en) Information display method, device, equipment and storage medium in virtual scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant