CN113952709A - Game interaction method and device, storage medium and electronic equipment - Google Patents

Game interaction method and device, storage medium and electronic equipment

Info

Publication number
CN113952709A
Authority
CN
China
Prior art keywords
game
area
combat
designated
virtual character
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111227922.2A
Other languages
Chinese (zh)
Inventor
郑夏桐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111227922.2A priority Critical patent/CN113952709A/en
Publication of CN113952709A publication Critical patent/CN113952709A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/5372 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/847 Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal

Abstract

The disclosure relates to the technical field of games, and provides a game interaction method, a game interaction device, a computer storage medium and an electronic device. The game interaction method includes: in response to a touch operation on a game screen, acquiring touch information generated by the touch operation; generating a designated combat area according to the touch information, the designated combat area being used for indicating a combat range of a target virtual character, where the target virtual character includes a first virtual character and the virtual characters in the same game formation as the first virtual character; and displaying the designated combat area to prompt the target virtual character to enter the designated combat area for game combat. The disclosure allows a player to customize the combat range of a game according to the player's own ideas, which helps prevent some players from falling behind and straying from the team, and makes it easier for players in the same game formation to rescue one another in time.

Description

Game interaction method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of game technologies, and in particular, to a game interaction method, a game interaction apparatus, a computer storage medium, and an electronic device.
Background
With the development and popularization of computer and internet technologies, related game technologies are also developing vigorously, and various battle games emerge endlessly.
At present, in related battle games, most game players need to form teams for game battles, and all the players move within the same game map. However, because the game map covers a very wide area, some players may be unable to keep up with their team in real time during the game, so falling behind the team easily occurs, which is not conducive to team battles.
In view of the above, there is a need in the art to develop a new game interaction method and apparatus.
It is to be noted that the information disclosed in the background section above is only used to enhance understanding of the background of the present disclosure.
Disclosure of Invention
The present disclosure is directed to a game interaction method, a game interaction apparatus, a computer storage medium and an electronic device, so as to overcome, at least to a certain extent, the defect in the related art that a game combat area cannot be user-defined.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a method of game interaction, in which a graphical user interface is provided through a display screen of a terminal device, the graphical user interface including a game screen corresponding to a first virtual character, the game screen including at least part of a game scene and a three-dimensional virtual object located in the game scene. The method includes: in response to a touch operation on the game screen, acquiring touch information generated by the touch operation; generating a designated combat area according to the touch information, where the designated combat area is used for indicating a combat range of a target virtual character, and the target virtual character includes the first virtual character and a virtual character in the same game formation as the first virtual character; and displaying the designated combat area to prompt the target virtual character to enter the designated combat area for game combat.
In an exemplary embodiment of the present disclosure, the touch information includes a first touch point located on the outer surface of the three-dimensional virtual object, and/or a second touch point located in an area of the game scene other than the area where the three-dimensional virtual object is located.
In an exemplary embodiment of the present disclosure, when the touch information includes at least three first touch points, the generating a designated combat area according to the touch information includes: determining a reference area according to the at least three first touch points; and mapping the boundary line of the reference area to the surface of the three-dimensional virtual object, and obtaining the designated combat area according to the mapped boundary line.
In an exemplary embodiment of the present disclosure, the determining a reference area according to the at least three first touch points includes: connecting each first touch point with the remaining first touch points by lines with a preset curvature to obtain a reference graph; and selecting the closed region with the largest area from the reference graph as the reference area.
In an exemplary embodiment of the present disclosure, the method further comprises: shrinking the designated combat area at intervals of a preset duration.
In an exemplary embodiment of the present disclosure, the method further comprises: adjusting the shape of the designated combat zone in response to an adjustment operation on the designated combat zone.
In an exemplary embodiment of the present disclosure, before the adjusting the shape of the designated combat area in response to the adjustment operation on the designated combat area, the method further comprises: hiding the three-dimensional virtual object.
According to a second aspect of the present disclosure, there is provided an apparatus for game interaction. A graphical user interface is provided through a display screen of a terminal device, the graphical user interface including a game screen corresponding to a first virtual character, the game screen including at least part of a game scene and a three-dimensional virtual object located in the game scene. The apparatus includes: an acquisition module configured to acquire, in response to a touch operation on the game screen, touch information generated by the touch operation; an area generating module configured to generate a designated combat area according to the touch information, where the designated combat area is used for indicating a combat range of a target virtual character, and the target virtual character includes the first virtual character and a virtual character in the same game formation as the first virtual character; and a display module configured to display the designated combat area to prompt the target virtual character to enter the designated combat area for game combat.
According to a third aspect of the present disclosure, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the method of game interaction of the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of game interaction of the first aspect described above via execution of the executable instructions.
According to the technical scheme, the game interaction method, the game interaction device, the computer storage medium and the electronic equipment in the exemplary embodiment of the disclosure have at least the following advantages and positive effects:
in some embodiments of the present disclosure, on the one hand, touch information generated by a touch operation on the game screen is acquired in response to that operation, and a designated combat area (used for indicating the combat range of the target virtual character) is generated according to the touch information. This provides a new game interaction manner in which a player can customize the combat range of a game according to the player's own ideas, improving the interest of the game. On the other hand, the designated combat area is displayed to prompt the target virtual characters (including the first virtual character and the virtual characters in the same game formation as the first virtual character) to enter the designated combat area for game combat. Compared with fighting across the whole game scene, the activity range is reduced, some players are prevented from falling behind and straying far from the team, and players in the same game formation can help one another in time.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 shows a flow diagram of a method of game interaction in an embodiment of the present disclosure;
FIG. 2 illustrates a flow chart for generating designated areas of engagement in an embodiment of the present disclosure;
FIG. 3 illustrates a flow chart for determining a reference region in an embodiment of the present disclosure;
FIG. 4A illustrates a schematic diagram of a reference region determined in an embodiment of the present disclosure;
FIG. 4B illustrates a schematic view of a designated combat zone generated in an embodiment of the present disclosure;
FIG. 5 illustrates a flow chart for automatically adjusting designated areas of engagement in accordance with an embodiment of the present disclosure;
FIG. 6 shows a schematic structural diagram of an apparatus for game interaction in an exemplary embodiment of the present disclosure;
fig. 7 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
In the embodiments of the present disclosure, a method of game interaction is first provided, which overcomes, at least to a certain extent, the defect in the related art that the game combat area cannot be user-defined.
Fig. 1 shows a flowchart of a method of game interaction in an embodiment of the present disclosure, and an execution subject of the method of game interaction may be a game server that processes game interaction operations.
Referring to fig. 1, a method of game interaction according to one embodiment of the present disclosure includes the steps of:
Step S110: in response to a touch operation on the game screen, acquiring touch information generated by the touch operation;
Step S120: generating a designated combat area according to the touch information, where the designated combat area is used for indicating the combat range of the target virtual character, and the target virtual character includes the first virtual character and a virtual character in the same game formation as the first virtual character;
Step S130: displaying the designated combat area to prompt the target virtual character to enter the designated combat area for game combat.
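As a rough illustration only, the three-step flow of steps S110 to S130 could be sketched as follows. All names here (`CombatZone`, `on_touch`, `display`) are hypothetical and not from the patent; the patent does not disclose source code.

```python
# Hypothetical sketch of the S110-S130 flow; names are illustrative only.
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class CombatZone:
    boundary: List[Point]  # ordered boundary vertices of the designated combat area

def on_touch(points: List[Point]) -> CombatZone:
    """S110 + S120: collect touch info and generate a designated combat area."""
    if len(points) < 3:
        # fewer than 3 points cannot enclose a region (see the validity checks below)
        raise ValueError("fixed point invalid, please re-input")
    return CombatZone(boundary=points)

def display(zone: CombatZone) -> str:
    """S130: display the zone to prompt target characters to enter it."""
    return f"combat zone with {len(zone.boundary)} boundary points"

zone = on_touch([(0, 0), (4, 0), (2, 3)])
print(display(zone))  # -> combat zone with 3 boundary points
```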
In the technical solution provided in the embodiment shown in fig. 1, on the one hand, touch information generated by a touch operation on the game screen is acquired in response to that operation, and a designated combat area (used for indicating the combat range of the target virtual character) is generated according to the touch information. This provides a new game interaction manner in which a player can customize the combat range of a game according to the player's own ideas, improving the interest of the game. On the other hand, the designated combat area is displayed to prompt the target virtual characters (including the first virtual character and the virtual characters in the same game formation as the first virtual character) to enter the designated combat area for game combat. Compared with fighting across the whole game scene, the activity range is reduced, some players are prevented from falling behind and straying far from the team, and players in the same game formation can help one another in time.
The following describes the specific implementation of each step in fig. 1 in detail:
in the present disclosure, a graphical user interface may be provided through a display screen of the terminal device, where the graphical user interface includes a game screen corresponding to the first virtual character.
The terminal device may be various electronic devices having a display screen and supporting web browsing, including but not limited to a smart phone, a tablet computer, a laptop portable computer, a desktop computer, and the like.
A graphical user interface (GUI) is a user interface of a computer displayed in a graphical manner; it is the dialogue interface between a computer and its user and an important component of a computer system. It allows players to interact with electronic devices through icons and visual indicators rather than through text interfaces or typed commands.
The first virtual character may be a virtual game character currently played by the master player.
The game screen may be a display screen including a game scene and three-dimensional virtual objects located in the game scene, where the game scene may include the environment (e.g., sky, roads), buildings, machines, props, and the like in the game. A three-dimensional virtual object may be, for example, a building or a mountain model in the game scene.
In step S110, touch information generated by the touch operation is acquired in response to the touch operation on the game screen.
In this step, after the game screen is entered, a range-definition start control may be provided in the game screen. If a click operation (e.g., a single-click or double-click operation) on the start control is detected, a range-definition mode may be entered, and the current master player may then perform touch operations on the game screen to define the combat range of the game according to the player's own ideas.
The touch operation performed by the player on the game screen may be a point-placement (fixed-point) operation, for example, a plurality of independent click operations at different positions in the game screen, where the different positions may be located on the outer surface of the three-dimensional virtual object and/or in areas of the game scene (for example, on the virtual ground) other than the area where the three-dimensional virtual object is located. Correspondingly, the touch information generated by the touch operation may include first touch points located on the outer surface of the three-dimensional virtual object, and/or second touch points located in areas of the game scene other than the area where the three-dimensional virtual object is located.
It should be noted that touch points may be input in several ways: by tapping directly in the game scene with a finger, by clicking in the game scene with a mouse or a stylus, and so on. The input method may be chosen according to the actual situation, and the present disclosure is not particularly limited in this respect.
In an alternative embodiment, during point placement the player may also zoom the game scene with two fingers or the mouse, for example enlarging the scene by sliding two fingers outwards and shrinking it by sliding two fingers inwards, so that the player can conveniently input multiple touch points at different positions and arbitrary distances according to the player's own ideas.
In an alternative embodiment, after the player inputs the touch points, a "position adjustment" control icon may be provided, so that the player may click it to readjust the positions of the touch points, making the subsequently generated designated combat area match the player's expectations more closely.
In an alternative embodiment, a "point placement complete" button may further be provided, which the player clicks after finishing placing points. The server may then determine the number of touch points from the received touch information, and if the number of touch points is less than 3 (i.e., a closed delineation range cannot subsequently be generated), prompt information such as "points invalid, please re-input" may be returned to the player, to ensure that the placed points can generate a valid delineation range.
In an alternative embodiment, after determining that the number of touch points is not less than 3, it may further be determined whether the touch points input by the player lie on the same straight line. If they do (i.e., a closed delineation range cannot subsequently be generated), prompt information such as "points invalid, please re-input" may be returned to the player so that the player can place the points again.
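The two validity checks above (at least three points, and not all collinear) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and tolerance are assumptions.

```python
# Hypothetical validity check for the placed touch points.
from typing import List, Tuple

Point = Tuple[float, float]

def cross(o: Point, a: Point, b: Point) -> float:
    """Z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def points_valid(points: List[Point], eps: float = 1e-9) -> bool:
    """A closed delineation range exists only if there are at least three
    touch points and they do not all lie on one straight line."""
    if len(points) < 3:
        return False
    o, a = points[0], points[1]
    # at least one point must be off the line through the first two
    return any(abs(cross(o, a, b)) > eps for b in points[2:])
```

For example, `points_valid([(0, 0), (1, 1), (2, 2)])` is false (collinear), while a triangle of points is valid.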
In step S120, a designated combat area is generated according to the touch information.
In this step, after the touch information generated by the touch operation is acquired, a designated combat area may be generated according to the touch information. The designated combat area is used for limiting the combat range of the target virtual character, so that the target virtual character (e.g., the first virtual character and the virtual characters in the same game formation as the first virtual character) can only act within the designated combat area. This reduces the activity range of the players belonging to the same combat team and avoids problems such as players falling behind.
In this step, when the touch information includes at least three first touch points located on the surface of the three-dimensional virtual object, reference may be made to fig. 2, which shows a flowchart for generating a designated combat area in an embodiment of the present disclosure, including steps S201 to S202:
in step S201, a reference area is determined according to at least three first touch points.
In this step, reference may be made to fig. 3, where fig. 3 shows a flowchart of determining a reference region in the embodiment of the present disclosure, and the method includes steps S301 to S302:
In step S301, each first touch point is connected to the remaining first touch points by lines with a preset curvature to obtain a reference pattern.
In this step, the lines with a preset curvature may be straight lines, curves with a certain arc, and the like; this may be set according to the actual situation, and the present disclosure is not particularly limited in this respect.
In step S302, a closed region having the largest area is selected from the reference pattern as a reference region.
In this step, in an optional implementation, when the number of first touch points is exactly 3, connecting each first touch point to the other first touch points forms a single largest closed region containing no intersecting lines, so the reference pattern formed after connection may be directly determined as the reference area.
In another optional implementation, when the number of first touch points is greater than 3, reference may be made to fig. 4A, which shows a schematic diagram of a reference area determined in an embodiment of the present disclosure: the cone represents the three-dimensional virtual object, and points 1-5 are the first touch points. After each first touch point is connected to the other first touch points, the resulting reference pattern contains intersecting lines, i.e., several small closed regions, so the closed region with the largest area (the largest shaded portion in the figure, the region enclosed by 1-5-4-2-3-1) may be selected from the reference pattern as the reference area.
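The "largest closed region" selection can be illustrated with the shoelace formula, assuming the candidate closed regions have already been extracted from the reference pattern as vertex lists (the region-extraction step itself is not shown; these helper names are hypothetical).

```python
# Hypothetical sketch: pick the closed region with the largest area.
from typing import List, Tuple

Point = Tuple[float, float]

def shoelace_area(polygon: List[Point]) -> float:
    """Absolute area of a simple closed polygon via the shoelace formula."""
    n = len(polygon)
    s = sum(polygon[i][0] * polygon[(i + 1) % n][1]
            - polygon[(i + 1) % n][0] * polygon[i][1]
            for i in range(n))
    return abs(s) / 2.0

def largest_region(regions: List[List[Point]]) -> List[Point]:
    """Select the closed region with the largest area as the reference area."""
    return max(regions, key=shoelace_area)
```

For instance, the triangle `[(0, 0), (4, 0), (0, 3)]` has area 6.0, and `largest_region` would prefer it over any smaller candidate.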
Referring next to fig. 2, in step S202, the boundary line of the reference area is mapped to the three-dimensional virtual object surface, and the designated combat area is obtained according to the mapped boundary line.
In this step, the boundary line of the reference area may be mapped onto the surface of the three-dimensional virtual object, and the designated combat area may be obtained according to the mapped boundary line. For example, fig. 4B shows a schematic diagram of the designated combat area generated in an embodiment of the present disclosure, specifically the designated combat area obtained after mapping the reference pattern formed in fig. 4A onto the surface of the three-dimensional virtual object. As can be seen from fig. 4B, the present disclosure avoids cutting through the three-dimensional virtual object, ensures the feasibility and effectiveness of the finally generated combat area, and can determine a highly restrictive designated combat area, thereby improving the interest of the game.
It should be noted that after the boundary line of the reference area is mapped onto the surface of the three-dimensional virtual object, it may further be determined whether the interior of the three-dimensional virtual object is a movable area. In an alternative embodiment, if the three-dimensional virtual object is a solid model whose interior cannot be moved through (e.g., a mountain model), the region of the mapped boundary line on the outer surface of the model may be determined as the designated combat area. In another alternative embodiment, if the three-dimensional virtual object is a model whose interior can be moved through (e.g., a building model), the region of the mapped boundary line inside the model may be determined as the designated combat area.
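One simple way to picture the surface-mapping step is a vertical projection of each boundary vertex onto the object, modelled here as a height field. This is an assumption for illustration; the patent does not specify the projection method, and the `height_at` function and cone example are hypothetical.

```python
# Hypothetical sketch: project a 2D boundary onto an object surface z = height_at(x, y).
from typing import Callable, List, Tuple

Point2D = Tuple[float, float]
Point3D = Tuple[float, float, float]

def map_boundary_to_surface(boundary: List[Point2D],
                            height_at: Callable[[float, float], float]) -> List[Point3D]:
    """Lift each 2D boundary vertex vertically onto the object's outer surface."""
    return [(x, y, height_at(x, y)) for x, y in boundary]

# e.g. a cone of height 5 centred at the origin with slope 1 (like fig. 4A's cone)
cone = lambda x, y: max(0.0, 5.0 - (x * x + y * y) ** 0.5)
```

With this model, a point at the cone's apex maps to height 5.0 and a point on its rim maps to height 0.0.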
It should also be noted that when the touch information consists of second touch points located in areas of the game scene other than the area where the three-dimensional virtual object is located (for example, a plurality of second touch points on the ground), the second touch points may be directly connected by lines with a preset curvature to form a planar combat range on the ground with no height.
When the touch information includes both first touch points on the outer surface of the three-dimensional virtual object and second touch points in other areas of the game scene, for example 2 first touch points and 1 second touch point, the 3 touch points may be connected to obtain a reference graph, and the closed region with the largest area may be selected from the reference graph as the reference area. Further, the intersection between the reference area and the outer surface of the three-dimensional virtual object may be obtained. For the first target area falling inside the three-dimensional virtual model, the first target area may be mapped onto the surface of the three-dimensional virtual object according to the 2 first touch points and the intersection, with reference to steps S201 to S202, to obtain a first combat area; the second target area falling outside the three-dimensional virtual model may be horizontally mapped onto the virtual ground in the game scene to obtain a second combat area with no height. The first combat area and the second combat area together may then be determined as the generated designated combat area.
Referring next to fig. 1, in step S130, a designated combat area is displayed to prompt the target virtual character to enter the designated combat area for game play.
In this step, after the designated combat area is generated, in an alternative embodiment the designated combat area may be displayed on the clients of the game player corresponding to the first virtual character (i.e., the current master player) and of the game players corresponding to the virtual characters in the same game formation as the first virtual character (i.e., the teammates of the current master player). In another alternative embodiment, the designated combat area may be displayed on a game map. In yet another alternative embodiment, the designated combat area may be displayed synchronously on the players' clients and the game map.
The designated combat area may be displayed in a differentiated manner, for example by thickening or highlighting its boundary line, or rendering it in a color different from that of other areas, so as to attract the players' attention and help them adjust their combat strategy in time.
In an alternative embodiment, after the designated combat zone is displayed, a timing control may be shown on the player terminal. The timing control prompts the current master player and the teammate players with the remaining time for the target virtual character selected by the current master player to enter the designated combat zone, for example, by a prompt message such as "please reach the designated combat zone within 10 seconds". Further, if the target virtual character reaches the designated combat zone within the preset time period, a reward may be sent to the target virtual character, for example, gold coins or points; if the target virtual character does not reach the designated combat zone within the preset time period, the target virtual character may be punished, for example, by deducting points. The specific values may be set according to the actual situation, which is not limited in the present disclosure.
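The reward-and-penalty rule above can be sketched as follows; the deadline, reward, and penalty values are illustrative assumptions and are not specified by this disclosure.

```python
def settle_arrival(arrival_time, deadline=10.0, reward=100, penalty=30):
    """Sketch of the reward/penalty rule: award points if the target
    virtual character reached the designated combat zone within the
    preset duration, otherwise deduct points. `arrival_time` is None
    if the character never arrived. All numeric values are assumptions."""
    if arrival_time is not None and arrival_time <= deadline:
        return reward    # e.g. gold coins or points awarded
    return -penalty      # points deducted as a penalty
```

For example, `settle_arrival(8.0)` returns the reward, while a late or missing arrival returns a negative point change.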
In an alternative embodiment, the designated combat zone may be narrowed at intervals of a preset duration, so as to achieve a game effect in which a character's safety is guaranteed only while the player remains inside the delineated zone.
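The periodic narrowing could be implemented, for example, by contracting the zone's boundary toward its centroid once per interval; the contraction factor and the polygon representation below are assumptions, as the patent does not specify how the zone shrinks.

```python
def shrink_zone(boundary, factor=0.9):
    """Contract the designated combat zone toward its centroid by a fixed
    factor; intended to be called once every preset interval. `boundary`
    is a list of (x, y) vertices; `factor` is an assumed tuning value."""
    cx = sum(x for x, _ in boundary) / len(boundary)
    cy = sum(y for _, y in boundary) / len(boundary)
    return [(cx + (x - cx) * factor, cy + (y - cy) * factor)
            for x, y in boundary]
```

Each call scales every vertex's offset from the centroid, so repeated calls shrink the zone geometrically while preserving its shape.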
In an alternative embodiment, after the designated combat zone is generated, the current master player, or a teammate player in the same game formation as the current master player, may further adjust the shape of the designated combat zone. Specifically, a "combat zone adjustment" control may be provided; after the player clicks the control, the boundary line of the designated combat zone may be adjusted manually, for example, by adjusting the radian of the boundary line, to obtain the adjusted designated combat area. Furthermore, a "combat zone effective" control may be provided on the game interface; when the player clicks this control, the adjustment is determined to take effect, and relevant prompt information may be sent to the current master player and the teammate players to remind them to enter the designated combat area for combat.
It should be noted that after the player clicks the "combat zone adjustment" control, the three-dimensional virtual object in the game scene may be hidden, so that the player can intuitively see which touch point is being adjusted and compare it with the path and area of the designated combat zone before adjustment, better matching the player's adjustment intention.
In an alternative embodiment, the present disclosure may further collect the area of the designated combat zone and the radian of each boundary line of the designated combat zone after the player adjusts, and then automatically adjust the designated combat zone according to the correspondence between the area and the radian. Specifically, referring to fig. 5, fig. 5 is a flowchart illustrating automatic adjustment of a designated combat zone, including steps S501 to S503:
In step S501, the area of the designated combat zone is acquired, and the radian of each boundary line of the adjusted designated combat zone is acquired.
In this step, the areas of the designated combat zones defined by a plurality of different players during the running of the game may be collected, together with the radian of each boundary line of the designated combat zone after each player performs manual adjustment. For example, the position of each boundary line of the adjusted designated combat zone in the game scene, and the relationship between the boundary line and a horizontal line in the game scene, may be collected to characterize each boundary line.
In step S502, a correspondence table of the area and the radian is stored.
In this step, a correspondence table between area and radian may be stored, for example: when the area is S1, the radians of the boundary lines are R1, R2, R3; when the area is S2, the radians of the boundary lines are R4, R5, R6. The correspondence table may further store the characteristics of each boundary line, and may be configured according to the actual situation, which is not particularly limited in this disclosure.
In step S503, after a preset time period, the designated combat zone is automatically adjusted according to the correspondence table.
In this step, after a preset time period (for example, one month after the method in the present disclosure goes online, or when the data stored in the correspondence table reaches a certain amount; this can be set according to the actual situation), the designated combat area may be automatically adjusted according to the correspondence table. By collecting the area of the designated combat zone and the radian of each boundary line after player adjustment, a designated combat zone meeting player requirements can be generated conveniently and rapidly in the future, reducing the number of player operations and improving the generation efficiency of the designated combat zone.
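Steps S501 to S503 can be sketched as a small lookup structure; the nearest-area matching policy and all names below are assumptions, since the patent leaves the concrete matching rule open.

```python
class ZoneAdjuster:
    """Sketch of the area-to-radian correspondence table (steps
    S501-S503). The nearest-area lookup policy is an assumption."""

    def __init__(self):
        self.table = {}  # area -> list of boundary-line radians

    def record(self, area, radians):
        """S501/S502: store the radians observed after player adjustment."""
        self.table[area] = list(radians)

    def auto_adjust(self, area):
        """S503: return the stored radians for the closest recorded area,
        or None if no data has been collected yet."""
        if not self.table:
            return None
        nearest = min(self.table, key=lambda a: abs(a - area))
        return self.table[nearest]
```

Once enough player-adjusted samples are recorded, a new zone's boundary radians can be proposed automatically from the closest recorded area, reducing the player's manual adjustment operations.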
With the game interaction method and device of the present disclosure, as the player interacts in the game scene, on the one hand, a new game interaction mode is provided: a game combat area with a height limit can be defined according to the player's own ideas, improving the interest of the game. On the other hand, compared with combat across the whole game scene, the activity range is reduced, preventing some players from falling behind and straying far from the team, and making it convenient for players in the same game formation to provide timely rescue.
The present disclosure also provides a game interaction device, and fig. 6 shows a schematic structural diagram of the game interaction device in an exemplary embodiment of the present disclosure; as shown in fig. 6, the apparatus 600 for game interaction may include an obtaining module 610, an area generating module 620, and a display module 630. Wherein:
an obtaining module 610, configured to respond to a touch operation for the game screen, and obtain touch information generated by the touch operation;
the area generating module 620 is configured to generate a designated combat area according to the touch information; the designated combat area is used for indicating the combat range of the target virtual character; the target virtual character comprises the first virtual character and a virtual character in the same game formation with the first virtual character;
a display module 630, configured to display the designated combat area, so as to prompt the target virtual character to enter the designated combat area for game combat.
In an exemplary embodiment of the present disclosure, the obtaining module 610 is configured to:
in an exemplary embodiment of the present disclosure, the touch information includes a first touch point located on an outer surface of the three-dimensional virtual object, and/or a second touch point located in another area except for the area where the three-dimensional virtual object is located in the game scene.
In an exemplary embodiment of the present disclosure, when the touch information includes at least three first touch points, the area generating module 620 is configured to:
determining a reference area according to the at least three first touch points; and mapping the boundary line of the reference area to the surface of the three-dimensional virtual object, and obtaining the designated combat area according to the mapped boundary line.
In an exemplary embodiment of the present disclosure, the region generation module 620 is configured to:
connecting each first touch point with the remaining first touch points by lines with a preset radian to obtain a reference graph; and selecting the closed region with the largest area from the reference graph as the reference area.
In an exemplary embodiment of the present disclosure, the designated combat area is narrowed every preset duration.
In an exemplary embodiment of the present disclosure, the region generation module 620 is configured to:
adjusting the shape of the designated combat zone in response to an adjustment operation on the designated combat zone.
In an exemplary embodiment of the present disclosure, the region generation module 620 is configured to:
hiding the three-dimensional virtual object before adjusting the shape of the designated combat zone.
The specific details of each module in the above game interaction device have been described in detail in the corresponding game interaction method, and therefore are not described herein again.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
The present application also provides a computer-readable storage medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer readable storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the above embodiments.
In addition, the embodiment of the disclosure also provides an electronic device capable of implementing the method.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 700 according to this embodiment of the disclosure is described below with reference to fig. 7. The electronic device 700 shown in fig. 7 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, electronic device 700 is embodied in the form of a general purpose computing device. The components of the electronic device 700 may include, but are not limited to: the at least one processing unit 710, the at least one memory unit 720, a bus 730 connecting different system components (including the memory unit 720 and the processing unit 710), and a display unit 740.
The storage unit stores program code executable by the processing unit 710, causing the processing unit 710 to perform steps according to various exemplary embodiments of the present disclosure as described in the "exemplary methods" section of this specification. For example, the processing unit 710 may perform the steps shown in fig. 1: step S110, responding to a touch operation on the game screen and acquiring touch information generated by the touch operation; step S120, generating a designated combat area according to the touch information, where the designated combat area is used for indicating the combat range of the target virtual character, and the target virtual character includes a first virtual character and a virtual character in the same game formation as the first virtual character; and step S130, displaying the designated combat area to prompt the target virtual character to enter the designated combat area for game combat.
The storage unit 720 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)7201 and/or a cache memory unit 7202, and may further include a read only memory unit (ROM) 7203.
The storage unit 720 may also include a program/utility 7204 having a set (at least one) of program modules 7205, such program modules 7205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 730 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 700 may also communicate with one or more external devices 800 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a player to interact with the electronic device 700, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 700 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 750. Also, the electronic device 700 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 760. As shown, the network adapter 760 communicates with the other modules of the electronic device 700 via the bus 730. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 700, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. A game interaction method is characterized in that a graphical user interface is provided through a display screen of a terminal device, the graphical user interface comprises a game picture corresponding to a first virtual character, wherein the game picture comprises at least part of game scenes and three-dimensional virtual objects positioned in the game scenes, and the method comprises the following steps:
responding to touch operation aiming at the game picture, and acquiring touch information generated by the touch operation;
generating a designated combat area according to the touch information; the designated combat area is used for indicating the combat range of the target virtual character; the target virtual character comprises the first virtual character and a virtual character in the same game formation with the first virtual character;
and displaying the designated combat area to prompt the target virtual character to enter the designated combat area for game combat.
2. The method according to claim 1, wherein the touch information comprises a first touch point located on an outer surface of the three-dimensional virtual object and/or a second touch point located in another area except the area where the three-dimensional virtual object is located in the game scene.
3. The method of claim 2, wherein when the touch information includes at least three first touch points, the generating a designated combat area according to the touch information includes:
determining a reference area according to the at least three first touch points;
and mapping the boundary line of the reference area to the surface of the three-dimensional virtual object, and obtaining the designated combat area according to the mapped boundary line.
4. The method of claim 3, wherein determining a reference area according to the at least three first touch points comprises:
connecting each first touch point with the remaining first touch points by lines with a preset radian to obtain a reference graph;
and selecting the closed area with the largest area from the reference graph as the reference area.
5. The method of claim 1, further comprising:
and reducing the designated combat area every other preset time length.
6. The method of claim 1, further comprising:
adjusting the shape of the designated combat zone in response to an adjustment operation on the designated combat zone.
7. The method of claim 6, wherein prior to adjusting the shape of the designated combat area in response to the adjustment operation on the designated combat area, the method further comprises:
and hiding the three-dimensional virtual object.
8. A game interaction device is characterized in that a graphical user interface is provided through a display screen of a terminal device, the graphical user interface comprises a game picture corresponding to a first virtual character, wherein the game picture comprises at least part of game scenes and three-dimensional virtual objects positioned in the game scenes, and the device comprises:
the acquisition module is used for responding to the touch operation aiming at the game picture and acquiring touch information generated by the touch operation;
the area generating module is used for generating a designated combat area according to the touch information; the designated combat area is used for indicating the combat range of the target virtual character; the target virtual character comprises the first virtual character and a virtual character in the same game formation with the first virtual character;
and the display module is used for displaying the appointed combat area so as to prompt the target virtual character to enter the appointed combat area for game combat.
9. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements a method of game interaction as claimed in any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of game interaction of any of claims 1-7 via execution of the executable instructions.
CN202111227922.2A 2021-10-21 2021-10-21 Game interaction method and device, storage medium and electronic equipment Pending CN113952709A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111227922.2A CN113952709A (en) 2021-10-21 2021-10-21 Game interaction method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111227922.2A CN113952709A (en) 2021-10-21 2021-10-21 Game interaction method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN113952709A true CN113952709A (en) 2022-01-21

Family

ID=79465398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111227922.2A Pending CN113952709A (en) 2021-10-21 2021-10-21 Game interaction method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113952709A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114721566A (en) * 2022-04-11 2022-07-08 网易(杭州)网络有限公司 Virtual object control method and device, storage medium and equipment
CN114721566B (en) * 2022-04-11 2023-09-29 网易(上海)网络有限公司 Virtual object control method and device, storage medium and equipment
CN117667293A (en) * 2024-02-01 2024-03-08 中国人民解放军军事科学院国防科技创新研究院 Multi-task collaborative window visualization method and device

Similar Documents

Publication Publication Date Title
US11833426B2 (en) Virtual object control method and related apparatus
CN108465238B (en) Information processing method in game, electronic device and storage medium
CN109966738B (en) Information processing method, processing device, electronic device, and storage medium
CN108287657B (en) Skill applying method and device, storage medium and electronic equipment
CN111185004A (en) Game control display method, electronic device, and storage medium
KR102602113B1 (en) Information interaction methods and related devices
EP3939681A1 (en) Virtual object control method and apparatus, device, and storage medium
CN111957032A (en) Game role control method, device, equipment and storage medium
CN111530073B (en) Game map display control method, storage medium and electronic device
CN112569611B (en) Interactive information display method, device, terminal and storage medium
CN107694089A (en) Information processing method, device, electronic equipment and storage medium
CN107715454A (en) Information processing method, device, electronic equipment and storage medium
EP3970819B1 (en) Interface display method and apparatus, and terminal and storage medium
CN113952709A (en) Game interaction method and device, storage medium and electronic equipment
CN110772789B (en) Method, device, storage medium and terminal equipment for skill control in game
CN112933591A (en) Method and device for controlling game virtual character, storage medium and electronic equipment
CN107185232B (en) Virtual object motion control method and device, electronic equipment and storage medium
CN110302530A (en) Virtual unit control method, device, electronic equipment and storage medium
CN106984044B (en) Method and equipment for starting preset process
CN108170338A (en) Information processing method, device, electronic equipment and storage medium
US20230338847A1 (en) Display method for virtual chessboard , terminal, storage medium and program product
CN114189731B (en) Feedback method, device, equipment and storage medium after giving virtual gift
CN111068325B (en) Method, device, equipment and storage medium for collecting articles in game scene
Pfeiffer et al. Conversational pointing gestures for virtual reality interaction: implications from an empirical study
CN113117329A (en) Display control method and device in game, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination