CN114546240A - Interactive implementation method, device, equipment and storage medium of game - Google Patents


Info

Publication number: CN114546240A
Authority: CN (China)
Prior art keywords: touch, touch operation, virtual, magnifier, game
Prior art date
Legal status: Granted
Application number: CN202210178847.3A
Other languages: Chinese (zh)
Other versions: CN114546240B (en)
Inventor: 张泽权
Current Assignee: Netease Hangzhou Network Co Ltd
Original Assignee: Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202310938504.7A (CN116893774A)
Priority to CN202210178847.3A (CN114546240B)
Publication of CN114546240A
Application granted
Publication of CN114546240B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices, for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices, for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/25 Output arrangements for video game devices
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a game interaction implementation method, apparatus, device, and storage medium, relating to the technical field of games. The method includes the following steps: in response to a first touch operation acting on a magnifier control, displaying a virtual magnifier at a position corresponding to a target area in the game scene and displaying the magnified target area in the virtual magnifier, where the target area is an area within a preset range in the game scene centered on the position corresponding to the touch point of the first touch operation; and, while the first touch operation is maintained, in response to a second touch operation acting on the game scene, determining the virtual object at a preset position within the virtual magnifier as the selected virtual object. The method enables precise positioning and selection of targets in touch games and improves the player's game experience.

Description

Interactive implementation method, device, equipment and storage medium of game
Technical Field
The present application relates to the field of game technologies, and in particular, to a game interaction implementation method, apparatus, device, and storage medium.
Background
With the rapid increase in the pressures of life and work, game applications running on touch screen devices have become increasingly popular as mainstream applications for enriching people's lives and relieving stress.
For a game application running on a touch screen device, when a touch operation is required, the tap area lies beneath the finger, so the player cannot clearly see the area below the finger while inputting the touch operation. This is very unfriendly in games that require precise positioning. For example, when a player needs to long-press a small moving object, the object is hidden by the finger once pressed; if the object then moves out of the long-press area, the long-press operation is interrupted and the player cannot complete the operation smoothly.
Therefore, with the conventional display interaction mode, a player performing a touch operation cannot clearly see the region to be operated and cannot precisely position and select a target in the game, which seriously degrades the user's game experience.
Disclosure of Invention
An object of the present application is to provide a method, apparatus, device, and storage medium for implementing game interaction, so as to achieve precise positioning and selection of targets in touch games and improve the user's game experience.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
In a first aspect, an embodiment of the present application provides an interaction implementation method for a game, where a graphical interaction interface is provided through a terminal device and displays: a game scene and a magnifier control; the method includes:
in response to a first touch operation acting on the magnifier control, displaying a virtual magnifier at a position corresponding to a target area in the game scene, and displaying the magnified target area in the virtual magnifier, where the target area is an area within a preset range in the game scene centered on the position corresponding to the touch point of the first touch operation;
and, while the first touch operation is maintained, in response to a second touch operation acting on the game scene, determining the virtual object at a preset position within the virtual magnifier in the game scene as the selected virtual object.
Optionally, the first touch operation is a drag touch operation.
Optionally, the method further includes:
in response to the first touch operation, displaying an auxiliary sight in the virtual magnifier to indicate the preset position of the virtual magnifier.
Optionally, the method further includes:
while the first touch operation is maintained, in response to the second touch operation, displaying a click special effect at the preset position in the virtual magnifier to indicate that the position point associated with the second touch operation in the virtual magnifier is the preset position.
Optionally, the second touch operation is: a click touch operation acting on any position in the game scene, or a click touch operation acting on a position in the game scene whose distance from the touch point of the first touch operation is within a preset range.
Optionally, the method further includes:
while the first touch operation is maintained, in response to a first moving touch operation in which two touch points remain relatively static, controlling the virtual magnifier to move in the game scene, and displaying in the moved virtual magnifier the magnified area within the preset range centered on the position corresponding to the first moving touch operation;
where the two touch points include: a first touch point of the first touch operation and a second touch point different from the first touch point.
Optionally, the first moving touch operation is a two-finger synchronous drag touch operation or a two-finger synchronous slide touch operation.
Optionally, the method further includes:
and, while the first touch operation is maintained, in response to a third touch operation acting on the game scene, executing on the selected virtual object the target action corresponding to the third touch operation.
Optionally, the third touch operation is: a double-click touch operation acting on any position in the game scene, or a double-click touch operation acting on a position in the game scene whose distance from the touch point of the first touch operation is within the preset range.
Optionally, the method further includes:
in response to the release of the first touch operation, canceling the display of the virtual magnifier.
Optionally, the method further includes:
while the first touch operation is maintained, in response to a second moving touch operation in which two touch points move relative to each other, changing the magnification factor of the virtual magnifier and displaying in the virtual magnifier the target area magnified by the changed magnification factor;
where the two touch points include: the first touch point of the first touch operation and a third touch point different from the first touch point.
Optionally, the second moving touch operation is: a single-finger drag touch operation or a single-finger slide touch operation.
Optionally, the method further includes:
while the first touch operation is maintained, in response to a third moving touch operation in which two touch points rotate relative to each other, changing the position angle of the virtual magnifier relative to the touch point of the first touch operation, and displaying in the changed virtual magnifier the magnified area within the preset range centered on the position corresponding to the ending touch point of the third moving touch operation;
where the two touch points include a first touch point of the first touch operation and a fourth touch point different from the first touch point.
Optionally, changing the position angle of the virtual magnifier relative to the touch point of the first touch operation in response to the third moving touch operation in which the two touch points rotate relative to each other includes:
in response to the third moving touch operation, changing the position angle of the virtual magnifier relative to the first touch point so that it remains consistent with the position angle of the ending touch point of the third moving touch operation relative to the first touch point.
Optionally, the third moving touch operation is: a drag touch operation or slide touch operation in which the fourth touch point moves continuously around the first touch point as a center.
Optionally, the method further includes:
and, while the first touch operation is maintained, in response to a third touch operation acting on the virtual magnifier, changing the size of the virtual magnifier.
In a second aspect, an embodiment of the present application further provides an interaction implementation apparatus for a game, where a graphical interaction interface is provided through a terminal device and displays: a game scene and a magnifier control; the interaction implementation apparatus includes:
a display module, configured to display, in response to a first touch operation acting on the magnifier control, a virtual magnifier at a position corresponding to a target area in the game scene, and display the magnified target area in the virtual magnifier, where the target area is an area within a preset range in the game scene centered on the position corresponding to the touch point of the first touch operation;
and a selection module, configured to determine, while the first touch operation is maintained and in response to a second touch operation acting on the game scene, the virtual object at the preset position in the virtual magnifier as the selected virtual object.
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory and a processor, where the memory stores a computer program executable by the processor, and the processor, when executing the computer program, implements any of the interaction implementation methods provided in the first aspect.
In a fourth aspect, an embodiment of the present application further provides a storage medium storing a computer program that, when read and executed, implements any of the interaction implementation methods provided in the first aspect.
The beneficial effects of the present application are as follows:
In the game interaction implementation method, apparatus, device, and storage medium provided by the present application, in response to a first touch operation acting on the magnifier control of the graphical interaction interface, a virtual magnifier is displayed at the position corresponding to the target area in the game scene, and the magnified target area is displayed in the virtual magnifier, so that the magnifying function triggered by the first touch operation clearly presents the magnified target area. Then, while the first touch operation is maintained, a second touch operation acting on the game scene can be responded to, determining the virtual object at the preset position in the virtual magnifier as the selected object. In this way, the player can clearly see the target area being operated on while executing the touch operation, and precise positioning and selection of virtual objects in the game is ensured, which effectively improves the user's game operation experience.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered limiting of the scope; those skilled in the art can obtain other related drawings from these drawings without inventive effort.
FIG. 1 is a first schematic diagram of a graphical interaction interface provided herein;
FIG. 2 is a schematic flowchart of an interactive implementation method of a game according to an embodiment of the present disclosure;
FIG. 3 is a second schematic diagram of a graphical interaction interface provided by the present application;
FIG. 4 is a third schematic view of a graphical interaction interface provided herein;
FIG. 5 is a fourth schematic view of a graphical interaction interface provided herein;
FIG. 6 is a fifth schematic view of a graphical interaction interface provided herein;
FIG. 7 is a schematic diagram of an interaction implementation apparatus for a game according to an embodiment of the present disclosure;
fig. 8 is a schematic view of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, in various portions of this application and in the drawings, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The method flow diagrams referred to in the following embodiments of the present application are exemplary only, and do not necessarily include all of the contents and steps, nor do they necessarily have to be performed in the order described. For example, some steps may be broken down, and the steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
The functional blocks in the block diagrams referred to in the embodiments described below are only functional entities and do not necessarily correspond to physically separate entities; that is, these functional entities may be implemented in software, in one or more physical modules or integrated circuits, or distributed across different networks and/or processors and/or microcontrollers.
The interactive implementation method of the game provided by the present application is illustrated by a plurality of examples as follows.
The interaction implementation method provided by the embodiments of the present application can be implemented by a processing device, such as a processor, in an electronic device on which a game application is installed, executing the corresponding software code, or by the processor of the electronic device executing the corresponding software code in combination with other hardware entities. The electronic device may be any terminal device supporting a touch function, such as a desktop computer, a notebook computer, a personal digital assistant, a smartphone, a tablet computer, or a game machine, or may be a game server communicatively connected to the touch terminal device. In the following examples, the electronic device is mainly described as a touch terminal device; the process executed by a game server is similar and is not described in detail in the present application.
Specifically, if the electronic device is a touch terminal device, a graphical user interface (GUI), also called a graphical interaction interface, may be generated by running the game application on the processor of the electronic device and rendering it on the display device of the electronic device. The graphical interaction interface is generated by rendering when the game application runs on the electronic device, and a user can control a virtual object by operating the graphical interaction interface on the display device, thereby playing the game.
The interaction implementation method provided by the present application can provide a graphical interaction interface through a terminal device, where at least a game scene and a magnifier control are displayed in the graphical interaction interface. The game scene includes at least one virtual object, which may or may not include a controlled virtual object. Virtual objects in a game scene may include movable virtual objects, such as virtual characters, virtual animals, and virtual vehicles, and may also include fixed (immovable) virtual objects, such as virtual buildings, virtual plants, and virtual props.
The interaction implementation method provided by the embodiments of the present application can provide a graphical interaction interface through a terminal device. Fig. 1 is a first schematic diagram of the graphical interaction interface provided by the present application. As shown in Fig. 1, a game scene 1 and a magnifier control 2 are displayed on the graphical interaction interface. A plurality of virtual objects are displayed in game scene 1; taking an item-picking scene as an example, the plurality of virtual objects may include a game character object 11 and at least one virtual item 12, for example, when game character object 11 moves to game scene 1 in the game world, or when virtual item 12 is triggered in game scene 1 after game character object 11 executes a certain game event. The game character object 11 may be the controlled virtual object, a virtual object controlled by another player, or a non-player character. Virtual items 12 may include, for example, virtual gems, virtual treasure boxes, virtual material boxes, virtual pets, and virtual monsters.
The magnifier control 2 may be fixedly displayed in game scene 1, or may be displayed dynamically based on the virtual objects in game scene 1. For example, the magnifier control 2 may be displayed when the total number of virtual objects in game scene 1 is detected to reach a first preset number threshold, when the arrangement density of virtual objects in game scene 1 is detected to reach a preset density threshold, or when the number of virtual objects in game scene 1 whose bounding-box size is smaller than a preset size threshold is detected to reach a second preset number threshold. It should be noted that the dynamic conditions for displaying the magnifier control 2 are not limited to the above examples; other dynamic scenarios are possible, and the embodiments of the present application do not limit this.
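The three example display conditions above can be sketched as a simple predicate. The function below is a hypothetical illustration, not from the patent; the thresholds and the representation of virtual objects as a list of bounding-box sizes are assumptions chosen only to make the conditions concrete.

```python
# Illustrative sketch of the dynamic display conditions for the magnifier
# control. Threshold values and the object representation are assumptions.

def should_show_magnifier_control(object_sizes, scene_area,
                                  count_threshold=10,
                                  density_threshold=0.05,
                                  size_threshold=16.0,
                                  small_count_threshold=5):
    """object_sizes: bounding-box sizes (max dimension) of the virtual
    objects in the scene; scene_area: area of the game scene region."""
    total = len(object_sizes)
    # Condition 1: total number of virtual objects reaches a threshold.
    if total >= count_threshold:
        return True
    # Condition 2: arrangement density of virtual objects reaches a threshold.
    if scene_area > 0 and total / scene_area >= density_threshold:
        return True
    # Condition 3: enough objects are smaller than a preset size threshold.
    small = sum(1 for s in object_sizes if s < size_threshold)
    return small >= small_count_threshold
```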
It should also be noted that the graphical interaction interface shown in Fig. 1 is only one possible example; in other examples, other controls may also be displayed in the graphical interaction interface, and the icon of the magnifier control may be another type of icon, which the embodiments of the present application do not limit.
Fig. 2 is a schematic flowchart of an interactive implementation method of a game provided in an embodiment of the present application, and as shown in fig. 2, the interactive implementation method of the game may include:
s201, responding to a first touch operation acted on the magnifier control, displaying the virtual magnifier at a position corresponding to a target area in a game scene, and displaying the amplified target area in the virtual magnifier.
The target area is an area within a preset range with the corresponding position of the touch point of the first touch operation as the center in the game scene.
If a player desires to select a virtual object from a game scene, but the virtual object is selected by directly touching the virtual object, the virtual object may be blocked by fingers when touching, in this case, the player may input a first touch operation through an input action on the magnifier control, trigger the opening of the magnifying function, and thus realize the selection of the virtual object by means of the magnifying function.
The first touch operation may be, for example, a drag touch operation applied to the magnifier control.
When a first touch operation acting on the magnifier control is received, in response to that operation, the area within a preset range centered on the position corresponding to the touch point of the first touch operation is determined as the target area to be magnified in the game scene, the virtual magnifier is displayed at a position corresponding to the target area, and the magnified target area is displayed in the virtual magnifier. The position corresponding to the target area may be, for example, the center of the target area (i.e., the touch point of the first touch operation), a preset edge position of the target area, or a position point at a preset distance from the center in a preset direction. The preset direction may be, for example, the direction from the center position toward any boundary of the graphical interface, such as upper-left, directly above, upper-right, lower-left, directly below, or lower-right of the center position. Of course, the preset direction is only an example, and the embodiments of the present application do not limit it.
The magnified target area refers to the display area obtained by magnifying the target area by a preset magnification coefficient.
Taking the first touch operation as a drag touch operation as an example, the touch point of the first touch operation used as the center of the target area is the ending touch point of that operation. A drag touch operation consists of a long-press touch operation and a slide touch operation continuous with it, so the ending touch point of the first touch operation is the ending touch point of the slide touch operation. It should be noted that, in the embodiments of the present application, unless otherwise specified, the touch point of the first touch operation refers to its ending touch point.
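The geometry described above, deriving the target area from the ending touch point of the drag, placing the magnifier at a preset offset so the finger does not occlude it, and magnifying by a preset coefficient, can be sketched as follows. All function names, the circular shape of the target area, and the default offset are illustrative assumptions; the patent leaves these as configurable presets.

```python
import math

# Illustrative sketch: target area, magnifier placement, and magnification,
# per the description above. Names and defaults are assumptions.

def target_area(end_touch, preset_radius):
    """Region of the game scene centered on the ending touch point of the
    first (drag) touch operation; modeled here as a circle."""
    return {"center": end_touch, "radius": preset_radius}

def magnifier_position(end_touch, preset_distance, preset_direction_deg=135.0):
    """Place the magnifier at a preset distance in a preset direction
    (135 degrees = upper-left in screen coordinates, where y grows downward),
    so the player's finger does not block the magnified view."""
    cx, cy = end_touch
    rad = math.radians(preset_direction_deg)
    return (cx + preset_distance * math.cos(rad),
            cy - preset_distance * math.sin(rad))

def magnify(scene_point, area_center, coefficient):
    """Map a scene point inside the target area into the magnified view,
    keeping the area center fixed at the magnifier's center."""
    return tuple(c + coefficient * (p - c)
                 for p, c in zip(scene_point, area_center))
```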
Continuing with fig. 1, when the player inputs the first touch operation on the magnifier control 2, in response to the first touch operation, the virtual magnifier 13 may be displayed at a preset position to the upper left of the touch point of the first touch operation in game scene 1, and the magnified target area may be displayed in the virtual magnifier 13.
S202: while the first touch operation is maintained, in response to a second touch operation acting on the game scene, determine the virtual object at the preset position in the virtual magnifier as the selected virtual object.
Through the first touch operation on the magnifier control, the player can trigger the virtual magnifier to be displayed on the game interface, with the magnified target area shown in it so that the target area is presented clearly. With the target area clearly displayed in this way, if the player wants to select a virtual object from the target area, the player can, while maintaining the first touch operation, perform a second touch operation on the game scene to trigger the selection of the virtual object at the preset position in the virtual magnifier. The preset position may be, for example, the center of the virtual magnifier, a preset edge position of the virtual magnifier, or another position within it.
In the holding process of the first touch operation, the finger does not leave the display screen of the graphical interactive interface after the first touch operation is input. Then, in the holding process of the first touch operation, the second touch operation that acts on the game scene is a touch operation input by another finger. That is to say, the first touch operation and the second touch operation are touch operations input by different fingers, and when the second touch operation is input, the fingers of the first touch operation are not lifted up and loose and are always in contact with the display screen of the graphical interaction interface. For example, a player can input a first touch operation by acting on the magnifier control through a finger of a left hand, the finger of the left hand does not need to be lifted and loosened, and then act on a game scene through the finger of the right hand to input a second touch operation, so that the second touch operation is input in the process of keeping the first touch operation. Here, the touch fingers of the first touch operation and the second touch operation are only one example provided for convenience of understanding, and in practical applications, the touch fingers mainly pass through different fingers, which is not limited in the embodiments of the present application.
In this embodiment of the application, the second touch operation may be a single-tap touch operation acting on the game scene, for example: a tap at any position in the game scene, or a tap at any position within a preset range of the touch point of the first touch operation in the game scene. The preset range may be, for example, a circular area of a preset radius centered on the touch point of the first touch operation, and the specific size of the preset radius may be configured in the game settings.
With continued reference to fig. 1, in the process of maintaining the first touch operation, that is, while the magnifying function is turned on and the virtual magnifier 13 is displayed in the game scene 1, the second touch operation that the player applies to the game scene is effectively equivalent to a touch operation on the preset position in the virtual magnifier 13. When the second touch operation is received, the virtual object at the preset position in the virtual magnifier 13 is selected, such as the selected virtual object 14 shown in fig. 1.
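The press-to-magnify, tap-to-select, and release-to-hide flow described above can be sketched as follows. This is an illustrative Python sketch under stated assumptions, not the patent's implementation: the class name `MagnifierSession`, the scene-object mapping, and the idea of keying selection by the magnifier's preset (centre) position are all invented for illustration.

```python
# Hypothetical sketch of the first-touch / second-touch / release flow.
# All names are illustrative, not from the patent.
class MagnifierSession:
    """Tracks the virtual magnifier shown while the first touch is held."""

    def __init__(self, scene_objects):
        # scene_objects: mapping of (x, y) scene position -> object name
        self.scene_objects = scene_objects
        self.visible = False
        self.preset_position = None  # scene point shown at the magnifier centre

    def on_first_touch(self, touch_point):
        # First touch on the magnifier control: show the magnifier centred on
        # the scene position corresponding to the touch point.
        self.visible = True
        self.preset_position = touch_point

    def on_second_touch(self):
        # Second touch anywhere in the scene while the first is held:
        # select the object at the preset (centre) position, if any.
        if not self.visible:
            return None
        return self.scene_objects.get(self.preset_position)

    def on_first_touch_released(self):
        # Releasing the first touch cancels the magnifier display.
        self.visible = False
        self.preset_position = None


session = MagnifierSession({(120, 80): "virtual_object_14"})
session.on_first_touch((120, 80))
selected = session.on_second_touch()   # -> "virtual_object_14"
session.on_first_touch_released()      # magnifier hidden again
```

Note that the second touch never needs to hit the object itself: selection is resolved from the magnifier's preset position, which is what makes precise picking possible on a small screen.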
Optionally, the interactive implementation method for the game provided in the embodiment of the present application may further include:
and responding to the release operation of the first touch operation, and canceling the display of the virtual magnifier.
Under the condition that the magnified target area is displayed in the virtual magnifier, if the player does not want to select a virtual object from the target area, the first touch operation can be released, that is, the finger of the first touch operation is lifted so that it leaves the screen. In response to the release operation of the first touch operation, the display of the virtual magnifier is canceled, thereby hiding the virtual magnifier and exiting the magnifying function.
In the interaction implementation method for the game provided by this embodiment, in response to the first touch operation acting on the magnifier control of the graphical interaction interface, the virtual magnifier is displayed at a position corresponding to the target area in the game scene, and the magnified target area is displayed in the virtual magnifier, so that the target area is clearly displayed after the magnifying function is triggered by the first touch operation. Then, in the process of maintaining the first touch operation, the second touch operation acting on the game scene can be responded to, so as to determine that the virtual object at the preset position in the virtual magnifier is the selected object. In this way, the player can clearly see the target area of the operation when executing the touch operation, and accurate positioning and selection of the virtual object in the game are ensured, so that the game operation experience of the user is effectively improved.
On the basis of the interactive realization method of the game provided by the embodiment, the method can further comprise the following steps:
and responding to the first touch operation, and displaying an auxiliary crosshair in the virtual magnifier to indicate the preset position of the virtual magnifier.
For example, the auxiliary crosshair can be displayed at the preset position of the virtual magnifier; if the preset position is the center position of the virtual magnifier, the auxiliary crosshair can be displayed at the center position of the virtual magnifier.
With continued reference to fig. 1, in response to the first touch operation, in the case of displaying the virtual magnifier 13, the auxiliary crosshair 15 may further be displayed at the preset position in the virtual magnifier 13 to indicate the preset position of the virtual magnifier 13, thereby indicating to the player that inputting the second touch operation is effectively equivalent to selecting the virtual object at the preset position. If the player does not desire to select the virtual object at the preset position, the second touch operation need not be input; if the player desires to select the virtual object at the preset position, inputting the second touch operation selects it.
According to the method provided by this embodiment, upon receiving the first touch operation, the auxiliary crosshair is displayed at the preset position of the virtual magnifier in response, to prompt the preset position in the virtual magnifier, so that the player can conveniently and accurately distinguish the virtual object to be selected at the preset position, improving the game experience of the player.
In some other possible implementation examples, the interactive implementation method for the game provided by the embodiment of the present application may further include:
and in the process of keeping the first touch control operation, responding to the second touch control operation, and displaying a click special effect at the preset position in the virtual magnifier to indicate a position point associated with the second touch control operation in the virtual magnifier as a preset position.
The click special effect may be, for example, a preset click special effect animation. Displaying the click special effect at the preset position in the virtual magnifier actually plays the special effect at that position, reminding the player that the position point associated with the second touch operation is the preset position in the virtual magnifier.
Fig. 3 is a schematic diagram of a graphical interaction interface provided by the present application. As shown in fig. 3, in the holding process of the first touch operation, that is, in the case that the virtual magnifier 13 is displayed in the game scene 1, the click special effect 15 may be displayed at the preset position in the virtual magnifier 13 in response to the second touch operation. By indicating that the position point associated with the second touch operation in the virtual magnifier 13 is the preset position, the player is shown that the virtual object at the preset position has been selected by the second touch operation.
According to the method provided by this embodiment, in the holding process of the first touch operation, the click special effect is displayed at the preset position of the virtual magnifier in response to the second touch operation, to indicate that the position point associated with the second touch operation in the virtual magnifier is the preset position. This clearly shows the player that the virtual object selected by the second touch operation is the one at the position where the click special effect is displayed, so that the player can clearly distinguish the selected virtual object, the interaction of the game is more user-friendly, and the game experience of the player is effectively improved.
In other possible implementation examples, the interactive implementation method for the game provided by the embodiment of the present application may further include:
in the holding process of the first touch operation, responding to a first moving touch operation in which the dual touch points are relatively stationary, controlling the virtual magnifier to move in the game scene, and displaying in the virtual magnifier a magnified area within a preset range centered on the position corresponding to the first moving touch operation;
wherein the dual touch points include: a first touch point of the first touch operation and a second touch point different from the first touch point.
In the process of maintaining the first touch operation, the finger of the first touch operation maintains the pressing state. In this case, when the screen is also pressed by another finger and the two fingers slide while remaining relatively stationary with respect to each other, the input of the first moving touch operation with relatively stationary dual touch points is realized.
When the input first moving touch operation is detected, the first moving touch operation can be responded to by controlling the virtual magnifier to move along the moving direction of the first moving touch operation. In the moving process of the virtual magnifier, the magnified area within the preset range centered on the position corresponding to the first moving touch operation is displayed in the virtual magnifier, so that the virtual magnifier moves and the target area in the virtual magnifier changes along with the movement. The position corresponding to the first moving touch operation may be, for example, the position to which the finger of the first touch operation moves during the first moving touch operation.
For example, the first moving touch operation may be a two-finger synchronous drag touch operation, or a two-finger synchronous slide touch operation.
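The two-finger synchronous drag can be sketched in Python as follows: classify a two-point gesture as "relatively stationary" when the distance between the touch points stays roughly constant, then translate the magnifier by the first touch point's displacement. The tolerance value and function names are assumptions for illustration, not taken from the patent.

```python
# Illustrative recognition of the "relatively stationary" two-finger drag
# and the resulting magnifier translation; thresholds are assumed.
import math

def classify_two_finger_move(p1_start, p2_start, p1_end, p2_end, tol=5.0):
    """Return 'synchronous-drag' when the two touch points move while their
    mutual distance stays roughly constant (the relatively stationary case),
    otherwise 'relative-motion'."""
    d_start = math.dist(p1_start, p2_start)
    d_end = math.dist(p1_end, p2_end)
    return "synchronous-drag" if abs(d_end - d_start) <= tol else "relative-motion"

def move_magnifier(magnifier_center, p1_start, p1_end):
    """Translate the magnifier by the displacement of the first touch point."""
    dx, dy = p1_end[0] - p1_start[0], p1_end[1] - p1_start[1]
    return (magnifier_center[0] + dx, magnifier_center[1] + dy)
```

In a real touch pipeline this classification would run per frame on the latest pointer positions, re-rendering the magnified region around the new centre each time.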
In the interaction implementation method for the game provided by this embodiment, in the process of maintaining the first touch operation, the movement of the virtual magnifier in the game scene is controlled in response to the first moving touch operation with relatively stationary dual touch points, and a magnified area within a preset range centered on the position corresponding to the first moving touch operation is displayed in the virtual magnifier. Through this dual-touch-point drag-and-slide function, the virtual magnifier is moved in the game scene and the magnified area in the virtual magnifier is dynamically displayed during the movement, so that the player can dynamically adjust the magnified area of the virtual magnifier, accurate positioning and selection of a virtual object are realized, and the game experience of the player is effectively improved.
In still another possible implementation example, the interactive implementation method for a game provided by the embodiment of the present application may further include:
and in the holding process of the first touch operation, responding to a third touch operation acting on the game scene, and executing a target action corresponding to the third touch operation on the selected virtual object.
The third touch operation is a touch operation input in succession to the second touch operation. If the second touch operation is a single-tap touch operation acting on the game scene, the third touch operation is a double-tap touch operation acting on the game scene. For example, the third touch operation may be: a double tap at any position in the game scene, or a double tap at any position within a preset range of the touch point of the first touch operation in the game scene.
The target action corresponding to the third touch operation may be, for example, executing a pickup action on the selected virtual object, or releasing a skill on the selected virtual object. In an example scene, the magnifying function can be started in response to the first touch operation, and the virtual magnifier with the magnified target area is displayed; in this case, the virtual object can be selected in response to the second touch operation, and then the corresponding target action is executed on the selected virtual object in response to the third touch operation input in succession to the second touch operation.
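The single-tap-selects / double-tap-acts distinction above can be sketched by grouping tap timestamps: two taps close enough in time form a double tap that triggers the target action. The interval threshold and function name are assumptions, not values from the patent.

```python
# Illustrative grouping of taps into "select" (single tap) and
# "target-action" (double tap) events; the threshold is assumed.
DOUBLE_TAP_INTERVAL = 0.3  # seconds

def interpret_taps(tap_times):
    """Turn a sorted list of tap timestamps into interaction events.
    A second tap within DOUBLE_TAP_INTERVAL of the previous one forms a
    double tap, mapped to the target action (e.g. pick up, release skill)."""
    events, i = [], 0
    while i < len(tap_times):
        if i + 1 < len(tap_times) and tap_times[i + 1] - tap_times[i] <= DOUBLE_TAP_INTERVAL:
            events.append("target-action")  # double tap: act on selection
            i += 2
        else:
            events.append("select")         # single tap: select object
            i += 1
    return events
```

For example, taps at 0.0 s and 0.2 s form one double tap, while an isolated tap at 1.0 s remains a selection.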
According to the method provided by this embodiment, with the magnifying function started in response to the first touch operation, the virtual object can be selected in response to the second touch operation, and the corresponding action can then be executed on the selected virtual object in response to the third touch operation.
In still another possible implementation example, the interactive implementation method for a game provided by the embodiment of the present application may further include:
in the holding process of the first touch operation, responding to a second moving touch operation in which the dual touch points move relative to each other, changing the magnification factor of the virtual magnifier, and displaying in the virtual magnifier the target area magnified by the changed magnification factor; wherein the dual touch points include: a first touch point of the first touch operation and a third touch point different from the first touch point.
Different from the first moving touch operation, in which the dual touch points are relatively stationary, the second moving touch operation is a moving touch operation in which the dual touch points move relative to each other. In the process of maintaining the first touch operation, that is, in the magnifying-function state, the finger of the first touch operation maintains the pressing state; in this case, when the screen is pressed by another finger, one of the two fingers is kept fixed and the other finger moves on the screen, so that the input of the second moving touch operation with relatively moving dual touch points is realized. The relative motion of the dual touch points may be, for example, that the first touch point is fixed while the third touch point slides linearly. The second moving touch operation is, for example, a single-finger drag touch operation or a single-finger slide touch operation.
When the first touch operation is received, the magnified target area displayed in the virtual magnifier is an area obtained by magnifying the target area by a preset magnification factor. If the player is not satisfied with the magnified area displayed in the virtual magnifier on the graphical interaction interface, the magnification factor of the virtual magnifier can be changed in response to the second moving touch operation of the dual touch points while the first touch operation is held, and the target area magnified by the changed magnification factor is displayed in the virtual magnifier.
It should be noted that, in a possible implementation example, the magnification factor of the virtual magnifier may be changed according to the change of the distance between the dual touch points. Specifically, when the distance between the dual touch points decreases, the magnification factor of the virtual magnifier can be decreased correspondingly; when the distance increases, the magnification factor can be increased correspondingly.
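One way to realize this distance-to-factor mapping is to scale the current factor by the ratio of the current to the initial distance between the two touch points, clamped to a sensible range. The proportional mapping and the clamp values below are assumptions for illustration; the patent only requires that the factor grow and shrink with the distance.

```python
# Illustrative pinch-to-zoom mapping for the magnification factor;
# the proportional law and clamp limits are assumed, not from the patent.
def updated_magnification(base_factor, start_distance, current_distance,
                          min_factor=1.0, max_factor=8.0):
    """Scale the magnifier's factor by the ratio of the current to the
    initial distance between the dual touch points, clamped to a range."""
    if start_distance <= 0:
        return base_factor  # degenerate gesture: keep the factor unchanged
    factor = base_factor * (current_distance / start_distance)
    return max(min_factor, min(max_factor, factor))
```

Doubling the finger separation thus doubles the magnification, while pinching inward reduces it until the lower clamp is reached.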
Fig. 4 is a schematic diagram three of a graphical interaction interface provided by the present application. As shown in fig. 4, in the holding process of the first touch operation, that is, in the case of displaying the virtual magnifier 13 in the game scene 1, the magnification factor of the virtual magnifier may be changed in response to the second movement touch operation of the relative movement of the two touch points, and the target area magnified by the changed magnification factor is displayed in the virtual magnifier.
Assume the first touch operation is input by a first finger, and that the first finger remains still and pressed, that is, the first touch operation is held. When a second finger slides in a direction away from the first finger, a second moving touch operation with relatively moving dual touch points is input. In this example, because the distance between the first finger and the second finger increases when the second moving touch operation is input, the magnification factor of the virtual magnifier 13 is increased, and the target area magnified by the increased magnification factor is displayed in the virtual magnifier 13.
According to the method provided by this embodiment, in the holding process of the first touch operation, that is, while the magnifying function is started, the second moving touch operation with relatively moving dual touch points can be responded to in order to change the magnification factor of the virtual magnifier, realizing dynamic magnification adjustment of the target area in the virtual magnifier, making it convenient for the player to accurately distinguish the virtual object in the target area, and further improving the game experience of the player.
In still another possible implementation example, the interactive implementation method for a game provided by the embodiment of the present application may further include:
in the holding process of the first touch operation, responding to a third moving touch operation in which the dual touch points rotate relative to each other, changing the position angle of the virtual magnifier relative to the touch point of the first touch operation, and displaying in the changed virtual magnifier a magnified area within a preset range centered on the position corresponding to the ending touch point of the third moving touch operation after the position angle is changed; wherein the dual touch points include a first touch point of the first touch operation and a fourth touch point different from the first touch point.
The third moving touch operation differs from the second moving touch operation in that the second moving touch operation is a moving touch operation in which the dual touch points move relative to each other along a linear motion track, while the third moving touch operation in this embodiment is a moving touch operation in which the dual touch points rotate relative to each other along an arc motion track. In the holding process of the first touch operation, that is, in the magnifying-function state, the finger of the first touch operation is held in the pressed state; in this case, when the screen is pressed by another finger and that finger rotates and slides, the input of the third moving touch operation is realized. The relative rotation of the dual touch points may be, for example, that the first touch point of the first touch operation is fixed while the fourth touch point rotates around the first touch point as a center.
For example, the third moving touch operation is: a continuous drag touch operation or slide touch operation of the fourth touch point around the touch point of the first touch operation as a center.
Optionally, the changing the position angle of the virtual magnifier relative to the touch point of the first touch operation in response to the third moving touch operation of the relative rotation of the dual touch points includes:
and responding to the third moving touch operation, changing the position angle of the virtual magnifier relative to the first touch point so that it remains consistent with the position angle of the ending touch point of the third moving touch operation relative to the first touch point.
Fig. 5 is a fourth schematic diagram of a graphical interaction interface provided by the present application. As shown in fig. 5, in the holding process of the first touch operation, that is, in the case that the virtual magnifier 13 is displayed in the game scene 1, the position angle of the virtual magnifier relative to the first touch point may be changed in response to the third moving touch operation in which the dual touch points rotate relative to each other, and a magnified area within a preset range centered on the position corresponding to the ending touch point of the third moving touch operation after the position angle is changed is displayed in the changed virtual magnifier.
The position angle of the virtual magnifier relative to the first touch point means the orientation of the virtual magnifier around the first touch point as a center; the rotation direction and angle of the virtual magnifier relative to the first touch point are kept consistent with those of the fourth touch point. For example, if the first touch operation is input by a first finger and the fourth touch point is input by a second finger, when the second finger rotates clockwise by a preset angle relative to the first finger, the virtual magnifier also rotates clockwise by the preset angle around the first touch point.
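Keeping the magnifier's angle consistent with the rotating finger amounts to a rigid 2-D rotation of the magnifier's offset about the first touch point, by the angle the fourth touch point sweeps. The helper below is a hypothetical sketch of that geometry; the function name and arguments are assumptions.

```python
# Hypothetical helper for the rotation gesture: rotate the magnifier about
# the first touch point (anchor) by the angle the other finger swept.
import math

def rotate_magnifier(anchor, magnifier_pos, finger_start, finger_end):
    def angle(p):
        return math.atan2(p[1] - anchor[1], p[0] - anchor[0])

    # Angle swept by the rotating finger around the fixed first touch point.
    delta = angle(finger_end) - angle(finger_start)
    dx = magnifier_pos[0] - anchor[0]
    dy = magnifier_pos[1] - anchor[1]
    cos_d, sin_d = math.cos(delta), math.sin(delta)
    # Standard 2-D rotation of the magnifier's offset about the anchor.
    return (anchor[0] + dx * cos_d - dy * sin_d,
            anchor[1] + dx * sin_d + dy * cos_d)
```

For instance, if the finger sweeps a quarter turn counter-clockwise around the anchor, a magnifier offset to the anchor's right ends up directly above it, so the lens can be swung out from under the player's hand.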
According to the method provided by this embodiment, in the holding process of the first touch operation, that is, while the magnifying function is open, the position angle of the virtual magnifier relative to the touch point of the first touch operation is changed in response to the third moving touch operation in which the dual touch points rotate relative to each other, realizing dynamic adjustment of the display position of the virtual magnifier, avoiding the situation that the area below the finger cannot be viewed through the virtual magnifier, and effectively ensuring the game experience of the player.
In other possible implementation examples, the interactive implementation method for the game provided by the embodiment of the present application may further include:
and in the holding process of the first touch operation, responding to a third touch operation acting on the virtual magnifier, and changing the size of the virtual magnifier.
In the holding process of the first touch operation, the finger inputting the first touch operation is not released and the position of its touch point remains unchanged. When a third touch operation acting on the virtual magnifier is detected, the third touch operation can be responded to by changing the size of the virtual magnifier. For example, the size of the virtual magnifier can be changed according to the pressing force, the pressing duration and/or the moving distance of the third touch operation.
The third touch operation may be, for example, a sliding touch operation or a dragging touch operation applied to a preset edge of the virtual magnifier. If the third touch operation is a moving touch operation that acts on the preset edge position of the virtual magnifier and moves in a direction away from the center position of the virtual magnifier, that is, a touch operation moving toward the outside of the virtual magnifier, the size of the virtual magnifier is enlarged in response to the third touch operation; conversely, if the third touch operation is a moving touch operation that acts on the preset edge position of the virtual magnifier and moves in a direction toward the center position of the virtual magnifier, that is, a moving touch operation toward the inside of the virtual magnifier, the size of the virtual magnifier is reduced in response to the third touch operation.
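The edge-drag resize can be sketched by comparing the drag's start and end distances from the magnifier's centre: dragging away from the centre grows the lens, dragging toward it shrinks the lens. The radius clamp values and names below are assumptions for illustration.

```python
# Illustrative edge-drag resize for a circular magnifier; the minimum and
# maximum radius limits are assumed, not from the patent.
import math

def resized_radius(radius, center, drag_start, drag_end,
                   min_radius=40.0, max_radius=400.0):
    """Grow the magnifier when the edge drag moves away from the centre,
    shrink it when the drag moves toward the centre, then clamp."""
    before = math.dist(drag_start, center)
    after = math.dist(drag_end, center)
    # Adjust the radius by the radial component of the drag.
    return max(min_radius, min(max_radius, radius + (after - before)))
```

Dragging the edge 50 px outward thus adds 50 px to the radius, while a purely tangential drag (constant distance from the centre) leaves the size unchanged.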
It should be noted that, when the first touch operation is first responded to, the virtual magnifier is displayed in a preset size and the target area in the virtual magnifier is magnified by a preset magnification factor; the preset size and the preset magnification factor are both preset parameters, and may, for example, be configured in the game backend.
Fig. 6 is a schematic diagram of a graphical interaction interface provided by the present application. As shown in fig. 6, in the holding process of the first touch operation, that is, in the case where the virtual magnifier 13 is displayed in the game scene 1, the size of the virtual magnifier may be changed in response to the third touch operation applied to the virtual magnifier, so as to change the display range of the virtual magnifier.
The method provided by this embodiment can respond to the third touch operation acting on the virtual magnifier in the holding process of the first touch operation, that is, while the magnifying function is open, and change the size of the virtual magnifier, realizing dynamic adjustment of the display range of the virtual magnifier. This avoids insufficient display space of the virtual magnifier and incomplete display of the magnified target area, thereby effectively ensuring clear display of the target area under the magnifying function and improving the game experience of the player.
The following describes a device, an apparatus, a storage medium, and the like for implementing the interactive implementation method of the game provided by the present application, and specific implementation procedures and technical effects thereof are referred to above and will not be described again below.
Fig. 7 is a schematic diagram of an interaction implementation apparatus for a game provided in an embodiment of the present application. As shown in fig. 7, the interaction implementation apparatus 700 for a game may provide a graphical interaction interface through a terminal device, where the graphical interaction interface displays a game scene and a magnifier control, and the apparatus 700 may include:
the display control module 701 is configured to respond to a first touch operation acting on the magnifier control, display a virtual magnifier at a position corresponding to a target area in the game scene, and display the magnified target area in the virtual magnifier, where the target area is an area within a preset range in the game scene centered on the position corresponding to the touch point of the first touch operation;
the determining module 702 is configured to, in a holding process of the first touch operation, respond to a second touch operation applied to the game scene, and determine that a virtual object at a preset position in the virtual magnifier in the game scene is the selected virtual object.
Optionally, the first touch operation is a drag touch operation.
Optionally, the display control module 701 is further configured to respond to the first touch operation and display an auxiliary crosshair in the virtual magnifier to indicate the preset position of the virtual magnifier.
Optionally, the display control module 701 is further configured to respond to the second touch operation in the process of maintaining the first touch operation, and display a click special effect at the preset position in the virtual magnifier, to indicate that the position point associated with the second touch operation in the virtual magnifier is the preset position.
Optionally, the second touch operation is: a tap at any position in the game scene, or a tap at any position within a preset range of the touch point of the first touch operation in the game scene.
Optionally, the display control module 701 is further configured to, in the process of maintaining the first touch operation, respond to a first moving touch operation in which the dual touch points are relatively stationary, control the movement of the virtual magnifier in the game scene, and display in the virtual magnifier a magnified area within a preset range centered on the position corresponding to the first moving touch operation;
wherein the dual touch points include: a first touch point of the first touch operation and a second touch point different from the first touch point.
Optionally, the first moving touch operation is a double-finger synchronous dragging touch operation, or a double-finger synchronous sliding touch operation.
Optionally, the display control module 701 is further configured to, in the process of maintaining the first touch operation, respond to a third touch operation applied to the game scene, and execute a target action corresponding to the third touch operation on the selected virtual object.
Optionally, the third touch operation is: a double tap at any position in the game scene, or a double tap at any position within a preset range of the touch point of the first touch operation in the game scene.
Optionally, the display control module 701 is further configured to cancel the display of the virtual magnifier in response to a release operation of the first touch operation.
Optionally, the display control module 701 is further configured to respond to a second moving touch operation in which the dual touch points move relative to each other in the holding process of the first touch operation, change the magnification factor of the virtual magnifier, and display in the virtual magnifier the target area magnified by the changed magnification factor;
wherein the dual touch points include: a first touch point of the first touch operation and a third touch point different from the first touch point.
Optionally, the second moving touch operation is: a single-finger drag touch operation, or a single-finger slide touch operation.
Optionally, the display control module 701 is further configured to, in the holding process of the first touch operation, respond to a third moving touch operation in which the dual touch points rotate relative to each other, change the position angle of the virtual magnifier relative to the touch point of the first touch operation, and display in the changed virtual magnifier a magnified area within a preset range centered on the position corresponding to the ending touch point of the third moving touch operation after the position angle is changed;
wherein the dual touch points include a first touch point of the first touch operation and a fourth touch point different from the first touch point.
Optionally, the display control module 701 is specifically configured to respond to the third moving touch operation and change the position angle of the virtual magnifier relative to the first touch point so that it remains consistent with the position angle of the ending touch point of the third moving touch operation relative to the first touch point.
Optionally, the third moving touch operation is: a continuous drag touch operation or slide touch operation of the fourth touch point around the first touch point as a center.
Optionally, the display control module 701 is further configured to change the size of the virtual magnifier in response to a third touch operation applied to the virtual magnifier in the process of maintaining the first touch operation.
The above apparatus is configured to execute the method provided by the foregoing embodiments; its implementation principle and technical effects are similar and are not repeated here.
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As yet another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 8 is a schematic view of an electronic device according to an embodiment of the present application. The electronic device 800 includes: memory 801, processor 802. The memory 801 and the processor 802 are connected by a bus.
The memory 801 stores a program, and the processor 802 calls the program stored in the memory 801 to execute the above method embodiments. The specific implementations and technical effects are similar and are not repeated here.
Optionally, the present invention also provides a program product, for example a computer-readable storage medium comprising a program which, when executed by a processor, performs the above method embodiments.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a logical functional division, and other divisions are possible in practice; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
An integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute some of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description covers only specific embodiments of the present application, but the scope of the present application is not limited thereto; any changes or substitutions that a person skilled in the art can easily conceive within the technical scope disclosed in the present application shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. An interactive implementation method of a game, wherein a graphical interactive interface is provided through a terminal device, and the graphical interactive interface displays: a game scene and a magnifier control; the method comprises:
in response to a first touch operation acting on the magnifier control, displaying a virtual magnifier at a position corresponding to a target area in the game scene, and displaying the magnified target area in the virtual magnifier, wherein the target area is an area in the game scene within a preset range centered on a position corresponding to the touch point of the first touch operation;
and, while the first touch operation is maintained, in response to a second touch operation acting on the game scene, determining the virtual object at a preset position within the virtual magnifier in the game scene as the selected virtual object.
2. The method of claim 1, wherein the first touch operation is a drag touch operation.
3. The method of claim 1, further comprising:
in response to the first touch operation, displaying an auxiliary crosshair in the virtual magnifier to indicate the preset position of the virtual magnifier.
4. The method of claim 1, further comprising:
and, while the first touch operation is maintained, in response to the second touch operation, displaying a click special effect at the preset position in the virtual magnifier to indicate that the position point in the virtual magnifier associated with the second touch operation is the preset position.
5. The method of claim 1, wherein the second touch operation is: a click touch operation acting on any position in the game scene, or a click touch operation acting on a position in the game scene whose distance from the touch point of the first touch operation is within a preset range.
6. The method of claim 1, further comprising:
while the first touch operation is maintained, in response to a first moving touch operation in which dual touch points remain relatively static, controlling the virtual magnifier to move in the game scene, and displaying in the virtual magnifier the magnified area within the preset range centered on the position corresponding to the first moving touch operation;
wherein the dual touch points comprise: a first touch point of the first touch operation and a second touch point different from the first touch point.
7. The method of claim 6, wherein the first moving touch operation is a two-finger synchronous drag touch operation or a two-finger synchronous slide touch operation.
8. The method of claim 1, further comprising:
and, while the first touch operation is maintained, in response to a third touch operation acting on the game scene, performing a target action corresponding to the third touch operation on the selected virtual object.
9. The method of claim 8, wherein the third touch operation is: a double-click touch operation acting on any position in the game scene, or a double-click touch operation acting on a position in the game scene whose distance from the touch point of the first touch operation is within the preset range.
10. The method of claim 1, further comprising:
and, in response to a release of the first touch operation, canceling the display of the virtual magnifier.
11. The method of claim 1, further comprising:
while the first touch operation is maintained, in response to a second moving touch operation in which dual touch points move relative to each other, changing the magnification factor of the virtual magnifier, and displaying in the virtual magnifier the target area magnified by the changed magnification factor;
wherein the dual touch points comprise: a first touch point of the first touch operation and a third touch point different from the first touch point.
12. The method of claim 11, wherein the second moving touch operation is: a single-finger drag touch operation or a single-finger slide touch operation.
13. The method of claim 1, further comprising:
while the first touch operation is maintained, in response to a third moving touch operation in which dual touch points rotate relative to each other, changing the position angle of the virtual magnifier relative to the touch point of the first touch operation, and displaying in the changed virtual magnifier the magnified area within the preset range centered on the position corresponding to the ending touch point of the third moving touch operation;
wherein the dual touch points comprise a first touch point of the first touch operation and a fourth touch point different from the first touch point.
14. The method according to claim 13, wherein changing the position angle of the virtual magnifier relative to the touch point of the first touch operation in response to a third moving touch operation in which the dual touch points rotate relative to each other comprises:
in response to the third moving touch operation, changing the position angle of the virtual magnifier relative to the first touch point so that it remains consistent with the position angle of the ending touch point of the third moving touch operation relative to the first touch point.
15. The method of claim 13, wherein the third moving touch operation is: a drag touch operation or a slide touch operation in which the fourth touch point moves continuously around the first touch point as a center.
16. The method of claim 1, further comprising:
and, while the first touch operation is maintained, in response to a third touch operation acting on the virtual magnifier, changing the size of the virtual magnifier.
17. An interactive implementation apparatus of a game, wherein a graphical interactive interface is provided through a terminal device, and the graphical interactive interface displays: a game scene and a magnifier control; the interactive implementation apparatus of the game comprises:
a display module, configured to, in response to a first touch operation acting on the magnifier control, display a virtual magnifier at a position corresponding to a target area in the game scene, and display the magnified target area in the virtual magnifier, wherein the target area is an area in the game scene within a preset range centered on a position corresponding to the touch point of the first touch operation;
and a selecting module, configured to, while the first touch operation is maintained, in response to a second touch operation acting on the game scene, determine the virtual object at the preset position within the virtual magnifier in the game scene as the selected virtual object.
18. An electronic device, comprising: a memory and a processor, the memory storing a computer program executable by the processor, wherein the processor, when executing the computer program, implements the interactive implementation method of a game according to any one of claims 1 to 16.
19. A computer-readable storage medium, storing a computer program, wherein, when the computer program is read and executed, the interactive implementation method of a game according to any one of claims 1 to 16 is implemented.
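The hold-then-tap selection flow of claims 1 to 4 and 10 (holding a first touch on the magnifier control shows the magnifier; a second touch during the hold selects the object at the magnifier's preset position; releasing the hold dismisses the magnifier) can be sketched as a small state machine. This is a non-authoritative illustration: the class, the scene-as-dictionary model, and the assumption that the preset (crosshair) position maps back to the first touch point's scene position are all hypothetical.

```python
class MagnifierInteraction:
    """Minimal state sketch of the claimed flow: a held first touch shows
    the magnifier; a second touch during the hold selects the virtual
    object at the magnifier's preset (crosshair) position; releasing the
    first touch cancels the magnifier display."""

    def __init__(self, scene):
        self.scene = scene          # maps (x, y) -> virtual object or None
        self.magnifier_at = None    # touch point of the held first touch
        self.selected = None

    def first_touch_down(self, point):
        # Show the virtual magnifier centered on the first touch point.
        self.magnifier_at = point

    def second_touch(self, _point):
        if self.magnifier_at is None:
            return None  # no magnifier shown; the touch selects nothing
        # The preset position is assumed to map back to the scene position
        # under the first touch point; select whatever object sits there.
        self.selected = self.scene.get(self.magnifier_at)
        return self.selected

    def first_touch_up(self):
        # Releasing the first touch dismisses the magnifier (claim 10).
        self.magnifier_at = None
```

Note that the second touch's own coordinates are deliberately ignored: per claim 2 of the description, any tap in the scene during the hold selects at the crosshair, which is what makes precise selection possible on small targets.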
CN202210178847.3A 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium Active CN114546240B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310938504.7A CN116893774A (en) 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium
CN202210178847.3A CN114546240B (en) 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210178847.3A CN114546240B (en) 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310938504.7A Division CN116893774A (en) 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium

Publications (2)

Publication Number Publication Date
CN114546240A true CN114546240A (en) 2022-05-27
CN114546240B CN114546240B (en) 2023-08-22

Family

ID=81680111

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210178847.3A Active CN114546240B (en) 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium
CN202310938504.7A Pending CN116893774A (en) 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310938504.7A Pending CN116893774A (en) 2022-02-25 2022-02-25 Interactive implementation method, device and equipment for game and storage medium

Country Status (1)

Country Link
CN (2) CN114546240B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050251760A1 (en) * 2004-05-07 2005-11-10 Sony Corporation Portable electronic device, display method, program, and graphical user interface thereof
US20070265081A1 (en) * 2006-04-28 2007-11-15 Shimura Yukimi Touch-controlled game character motion providing dynamically-positioned virtual control pad
CN102298504A (en) * 2011-09-27 2011-12-28 汉王科技股份有限公司 Method and system for magnifying display
CN103677643A (en) * 2013-12-20 2014-03-26 上海天奕达电子科技有限公司 Method and device for locally amplifying content of screen based on floating touch
CN107272962A (en) * 2017-06-30 2017-10-20 努比亚技术有限公司 Show amplification method, terminal and computer-readable recording medium
CN107741819A (en) * 2017-09-01 2018-02-27 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN108008868A (en) * 2016-10-28 2018-05-08 南宁富桂精密工业有限公司 Interface control method and electronic device
CN108536354A (en) * 2018-04-04 2018-09-14 网易(杭州)网络有限公司 The method and apparatus of location character position in virtual reality scenario
US20190118078A1 (en) * 2017-10-23 2019-04-25 Netease (Hangzhou) Network Co.,Ltd. Information Processing Method and Apparatus, Storage Medium, and Electronic Device
CN109701264A (en) * 2018-12-21 2019-05-03 努比亚技术有限公司 Mirror control method, device, mobile terminal and storage medium are amplified in game
CN110052021A (en) * 2019-04-12 2019-07-26 网易(杭州)网络有限公司 Game object processing method, mobile terminal device, electronic equipment and storage medium
CN113082696A (en) * 2021-04-01 2021-07-09 网易(杭州)网络有限公司 Display control method and device and electronic equipment
CN113318434A (en) * 2021-06-10 2021-08-31 网易(杭州)网络有限公司 Game information processing method and device and storage medium
CN113680067A (en) * 2021-08-19 2021-11-23 网易(杭州)网络有限公司 Method, device, equipment and storage medium for controlling virtual props in game
CN113821133A (en) * 2017-09-05 2021-12-21 华为终端有限公司 Method for controlling cursor movement, content selection method, method for controlling page scrolling and electronic equipment


Also Published As

Publication number Publication date
CN114546240B (en) 2023-08-22
CN116893774A (en) 2023-10-17

Similar Documents

Publication Publication Date Title
US11776352B2 (en) Graphical user interface for a gaming system
US9292161B2 (en) Pointer tool with touch-enabled precise placement
KR101720849B1 (en) Touch screen hover input handling
US9389777B2 (en) Gestures for manipulating tables, charts, and graphs
US9335913B2 (en) Cross slide gesture
JP4093823B2 (en) View movement operation method
US20110157027A1 (en) Method and Apparatus for Performing an Operation on a User Interface Object
US20100053221A1 (en) Information processing apparatus and operation method thereof
CN107748641B (en) Numerical value adjustment control method and device, electronic equipment and storage medium
CN111701226A (en) Control method, device and equipment for control in graphical user interface and storage medium
CN114377383A (en) Information processing method, device, equipment and storage medium
US10073617B2 (en) Touchscreen precise pointing gesture
CN111880715A (en) Method and device for editing virtual control in interface, mobile terminal and storage medium
CN113244611B (en) Virtual article processing method, device, equipment and storage medium
US10698601B2 (en) Second touch zoom control
US20200293155A1 (en) Device and method for providing reactive user interface
CN114546240A (en) Interactive implementation method, device, equipment and storage medium of game
CN114625294A (en) Operation method and device of virtual navigation key of intelligent terminal, terminal and storage medium
CN114011052B (en) Game prop display and control method and device, terminal equipment and storage medium
CN114082185A (en) Virtual map display method and device, terminal equipment and storage medium
KR20160126848A (en) Method for processing a gesture input of user
CN118286691A (en) Method for controlling movement of virtual game character, storage medium and electronic device
CN114307131A (en) Game control method and device
JP2015153239A (en) Portable terminal device, display method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant