CN111420395B - Interaction method and device in game, readable storage medium and electronic equipment - Google Patents


Info

Publication number
CN111420395B
CN111420395B
Authority
CN
China
Prior art keywords
game
feedback
determining
detection area
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010277707.2A
Other languages
Chinese (zh)
Other versions
CN111420395A (en)
Inventor
黄华颖
郭宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202010277707.2A
Publication of CN111420395A
Application granted
Publication of CN111420395B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/25 Output arrangements for video game devices
    • A63F 13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F 13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/303 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F 2300/307 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying an additional window with a view from the top of the game field, e.g. radar screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the technical field of games and provides an in-game interaction method, an apparatus, a readable storage medium, and an electronic device. In the interaction method, a graphical user interface is provided through a terminal device and displays at least part of a game scene, and the method comprises the following steps: in response to a first operation acting on the graphical user interface, determining a game detection area according to the first operation; and determining a corresponding feedback indication according to a target virtual object in the game detection area. By responding to the player's trigger operation, the method gives the player a feedback indication, and the player can independently interact with the terminal device, so that the game interaction is richer and the user experience is improved.

Description

Interaction method and device in game, readable storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of game technologies, and in particular, to an interaction method in a game, an interaction apparatus in a game, a readable storage medium, and an electronic device.
Background
With the development of technology, physical touch screens have been widely applied to smartphones, virtual-reality devices, and other intelligent terminal devices, particularly in the field of games. For example, the movement of a virtual character in a game is controlled by touching a control, and the terminal provides vibration feedback when the virtual character moves to an unreachable area. However, when a player needs to acquire scene information from a game scene, the virtual character must be controlled to move within the scene in order to trigger the acquisition of the corresponding information, so the game operation is complex and the learning cost is high. Moreover, the terminal prompts the player with text or directional indicators, which only prompt the player visually and cannot perceptually enhance the player's game experience and exploration of the game.
In view of this, there is a need in the art to develop a new method and apparatus for interaction in a game.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The purpose of the present disclosure is to provide an interaction method in a game, an interaction device in a game, a readable storage medium and an electronic device, so as to improve the use experience of a game player at least to some extent.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to one aspect of the present disclosure, an in-game interaction method is provided, in which a graphical user interface is provided through a terminal device and displays at least part of a game scene. The in-game interaction method includes: in response to a first operation acting on the graphical user interface, determining a game detection area in the game scene according to the first operation; and determining a corresponding feedback indication according to a target virtual object in the game detection area.
In an exemplary embodiment of the present disclosure, the first operation includes a sliding operation or a non-contact operation.
In an exemplary embodiment of the disclosure, the feedback indication comprises at least one of: a visual indication, an audible indication, and a tactile indication.
In an exemplary embodiment of the disclosure, determining the corresponding feedback indication according to the target virtual object in the game detection area comprises: in response to a second operation acting on the graphical user interface, determining a feedback position according to the second operation; and determining the corresponding feedback indication according to the feedback position and the target virtual object in the game detection area.
In an exemplary embodiment of the disclosure, determining the corresponding feedback indication according to the target virtual object in the game detection area comprises: determining a feedback position according to the position of the operation point of the first operation when the first operation ends; and determining the corresponding feedback indication according to the feedback position and the target virtual object in the game detection area.
In an exemplary embodiment of the disclosure, determining the corresponding feedback indication according to the target virtual object in the game detection area comprises: determining the corresponding feedback indication according to a preset feedback position and the target virtual object in the game detection area.
In an exemplary embodiment of the disclosure, determining the corresponding feedback indication according to the target virtual object in the game detection area comprises: acquiring target information of the target virtual object contained in the game detection area and generating a feedback signal according to the target information; and determining the corresponding feedback indication at the feedback position according to the feedback signal.
In an exemplary embodiment of the present disclosure, one or more feedback devices are provided on the terminal device, and determining the corresponding feedback indication at the feedback position according to the feedback signal includes: determining a target feedback device corresponding to the feedback position; and sending the feedback signal to the target feedback device so that the target feedback device generates a feedback indication of a corresponding degree according to the feedback signal.
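The selection of a target feedback device from a feedback position, and the grading of the indication, can be sketched as follows. This is a minimal illustration rather than the patented implementation: the device layout, the distance-based strength model, and the 0-255 strength range are all assumptions for the example.

```python
def nearest_feedback_device(feedback_pos, devices):
    """Pick the feedback device (e.g. a vibration motor) closest to the
    feedback position. `devices` maps a device id to its (x, y) position
    on the terminal body (a hypothetical layout for illustration)."""
    return min(devices,
               key=lambda d: (devices[d][0] - feedback_pos[0]) ** 2 +
                             (devices[d][1] - feedback_pos[1]) ** 2)

def feedback_signal(distance, max_distance, max_strength=255):
    """Derive a feedback strength that falls off with the distance of the
    target virtual object from the feedback position, so that nearer
    targets produce a feedback indication of a higher degree."""
    ratio = max(0.0, 1.0 - distance / max_distance)
    return int(round(ratio * max_strength))
```

A usage example: with motors at the left and right edges of a handset, a feedback position near the right edge would route the signal to the right motor, and a closer target would yield a stronger vibration.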
In an exemplary embodiment of the disclosure, before determining the game detection area in the game scene in response to the first operation acting on the graphical user interface, the method further includes: in response to a trigger operation acting on the terminal device, controlling the terminal device to enter a feedback response state.
In an exemplary embodiment of the present disclosure, controlling the terminal device to enter the feedback response state in response to the trigger operation includes: in response to the trigger operation acting on the terminal device, sending prompt information to the terminal device according to the trigger operation, wherein the prompt information is used to indicate a feedback area in the graphical user interface.
In an exemplary embodiment of the present disclosure, a feedback area is provided in the graphical user interface.
In an exemplary embodiment of the disclosure, determining the game detection area in the game scene in response to the first operation includes: in response to the first operation acting on the feedback area, determining the game detection area in the game scene according to the first operation.
In an exemplary embodiment of the disclosure, before determining the game detection area in the game scene in response to the first operation acting on the graphical user interface, the method further comprises: in response to a third operation acting on the graphical user interface, determining a scene position in the game scene corresponding to the operation point of the third operation.
In an exemplary embodiment of the present disclosure, the graphical user interface includes a minimap, which is a thumbnail of the game scene. Determining the scene position corresponding to the operation point of the third operation includes: in response to the third operation acting on the minimap, determining the corresponding scene position in the game scene according to the touch position of the operation point of the third operation in the minimap.
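Because the minimap is a thumbnail of the game scene, the touch-to-scene mapping described above is a simple proportional coordinate transform. A minimal sketch, assuming an axis-aligned rectangular minimap and scene (the rectangle and scene dimensions in the usage example are illustrative):

```python
def minimap_to_scene(touch, minimap_rect, scene_size):
    """Map a touch position on the minimap to the corresponding scene
    position in the full game scene.
    minimap_rect = (x, y, width, height) in screen coordinates;
    scene_size  = (scene_width, scene_height)."""
    mx, my, mw, mh = minimap_rect
    sw, sh = scene_size
    u = (touch[0] - mx) / mw   # normalised horizontal position, 0..1
    v = (touch[1] - my) / mh   # normalised vertical position, 0..1
    return (u * sw, v * sh)
```

For example, with a 100x100-pixel minimap at the screen origin representing a 1000x1000 scene, a touch at (50, 25) on the minimap corresponds to scene position (500, 250).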
In an exemplary embodiment of the present disclosure, the method further comprises: acquiring the scene position of a game character in the game scene.
In an exemplary embodiment of the present disclosure, determining the game detection area in the game scene according to the first operation includes: determining the scene position as a detection base point; and determining the game detection area in the game scene according to the detection base point and the first operation.
In an exemplary embodiment of the present disclosure, determining the game detection area in the game scene according to the first operation includes: determining a change direction of the game detection area in the game scene according to the moving direction of the first operation.
In an exemplary embodiment of the present disclosure, determining the change direction of the game detection area according to the moving direction of the first operation includes: enlarging the game detection area in the game scene when the moving direction is a first direction; and reducing the game detection area in the game scene when the moving direction is a second direction different from the first direction.
In an exemplary embodiment of the present disclosure, determining the change direction of the game detection area according to the moving direction of the first operation includes: acquiring the initial operation point position of the first operation; enlarging the game detection area in the game scene while the operation point of the first operation moves away from the initial operation point position; and reducing the game detection area in the game scene while the operation point moves back toward the initial operation point position.
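The grow/shrink behaviour relative to the initial operation point can be sketched as follows. The per-update step size is an assumption for the example; the embodiment does not prescribe a specific increment.

```python
def update_radius(radius, initial_point, prev_point, cur_point, step=1.0):
    """Grow the detection-area radius while the operation point moves away
    from the initial operation point, and shrink it while the operation
    point moves back toward it (never below zero)."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    if dist(cur_point, initial_point) > dist(prev_point, initial_point):
        radius += step                     # moving away: enlarge the area
    elif dist(cur_point, initial_point) < dist(prev_point, initial_point):
        radius = max(0.0, radius - step)   # moving closer: reduce the area
    return radius
```

Called once per touch-move event, this reproduces the behaviour above: dragging outward from where the slide began enlarges the game detection area, and dragging back shrinks it.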
In an exemplary embodiment of the present disclosure, determining the game detection area in the game scene according to the first operation includes: determining the region area of the game detection area in the game scene according to the moving distance of the first operation.
In an exemplary embodiment of the present disclosure, determining the region area of the game detection area according to the moving distance of the first operation includes: determining a detection distance in the game scene according to the moving distance; acquiring the detection base point in the game scene; and determining the game detection area as the circle whose center is the detection base point and whose radius is the detection distance.
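The circular detection area described above, with the detection base point as the centre and a radius derived from the slide distance, can be sketched as follows. The pixel-to-scene scale factor and the dictionary representation of target virtual objects are assumptions for the example.

```python
import math

# Hypothetical scale factor mapping on-screen slide distance (pixels)
# to scene-space distance; the embodiment does not specify a value.
PIXELS_PER_SCENE_UNIT = 32.0

def detection_area(base_point, move_distance_px):
    """Return (center, radius, area) of the circular game detection
    region: center = the detection base point, radius = the detection
    distance derived from the first operation's moving distance."""
    radius = move_distance_px / PIXELS_PER_SCENE_UNIT
    area = math.pi * radius ** 2
    return base_point, radius, area

def objects_in_area(objects, center, radius):
    """Filter target virtual objects whose scene position lies inside
    the circular detection area."""
    cx, cy = center
    return [o for o in objects
            if math.hypot(o["x"] - cx, o["y"] - cy) <= radius]
```

The objects returned by `objects_in_area` are the target virtual objects from which the feedback indication would then be determined.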
According to an aspect of the present disclosure, an in-game interaction apparatus is provided, in which a graphical user interface is provided through a terminal device and displays at least part of a game scene. The in-game interaction apparatus includes: a response module configured to determine a game detection area in the game scene in response to a first operation acting on the graphical user interface; and a feedback module configured to determine a corresponding feedback indication according to a target virtual object in the game detection area.
According to an aspect of the present disclosure, there is provided a computer-readable medium, on which a computer program is stored, which program, when executed by a processor, implements the in-game interaction method as described in the above embodiments.
According to an aspect of the present disclosure, there is provided an electronic device including: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the in-game interaction method as described in the above embodiments.
According to the technical solutions above, the in-game interaction method and apparatus, readable storage medium, and electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
The in-game interaction method of the exemplary embodiments of the present disclosure provides a graphical user interface through a terminal device, the graphical user interface displaying at least part of a game scene; the method determines a game detection area in the game scene in response to a first operation acting on the graphical user interface, and determines a corresponding feedback indication according to a target virtual object in the game detection area. On the one hand, the game detection area can be determined in the game scene according to the first operation acting on the graphical user interface, and this area may lie beyond the player's field of view, so the game character controlled by the player can obtain information about parts of the game scene not presented on the screen without approaching the game detection area, which increases the player's degree of exploration in the game and improves player engagement. On the other hand, the feedback indication can be determined according to the target virtual object in the game detection area, and the player obtains the feedback indication corresponding to the target virtual object through interaction with the terminal device, which makes the game interaction richer, increases the playability of the game, and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 schematically shows a flow diagram of an interaction method in a game according to an embodiment of the present disclosure;
fig. 2 schematically shows a schematic structural diagram of a terminal device according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow diagram for determining a game detection zone according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a structural diagram for determining a direction of change of a game detection area according to an embodiment of the present disclosure;
FIG. 5 schematically illustrates another configuration for determining a direction of change of a play detection area according to an embodiment of the present disclosure;
FIG. 6 schematically illustrates a flow diagram for determining a feedback indication according to an embodiment of the present disclosure;
FIG. 7 schematically illustrates a structural schematic of a feedback device position profile according to an embodiment of the present disclosure;
FIG. 8 schematically illustrates a structural schematic of a feedback device position profile according to an embodiment of the present disclosure;
FIG. 9 schematically shows a block diagram of an interaction device in a game according to an embodiment of the present disclosure;
FIG. 10 schematically shows a block schematic of an electronic device according to an embodiment of the disclosure;
FIG. 11 schematically shows a program product schematic according to an embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
The in-game interaction method provided by the embodiments of the present disclosure provides a graphical user interface through a terminal device, and the graphical user interface displays at least part of a game scene. Fig. 1 shows a flow diagram of the in-game interaction method; as shown in fig. 1, the method comprises at least the following steps:
Step S110: in response to a first operation acting on the graphical user interface, determining a game detection area in the game scene according to the first operation;
Step S120: determining a corresponding feedback indication according to a target virtual object in the game detection area.
On the one hand, the in-game interaction method in the embodiments of the present disclosure can determine a game detection area in the game scene according to the first operation acting on the graphical user interface, and this area may lie beyond the player's field of view, so the player can obtain information about parts of the game scene not presented on the screen without approaching the game detection area, which increases the player's degree of exploration in the game and improves player engagement. On the other hand, the feedback indication can be determined according to the target virtual object in the game detection area, and the player realizes the interaction with the terminal device independently, which makes the game interaction richer, increases the playability of the game, and improves the user experience.
The interaction method in the embodiments of the present application may run on a terminal device or on a server. The terminal device may be a local terminal device. When the interaction method runs on a server, the game may be implemented as a cloud game.
In an alternative embodiment, cloud gaming refers to a game mode based on cloud computing. In the cloud-game mode of operation, the entity that runs the game program is separated from the entity that presents the game picture: the storage and execution of the interaction method are completed on a cloud game server, while the cloud game client receives and sends data and presents the game picture. For example, the cloud game client may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the terminal device that performs the game data processing is the cloud game server. When a game is played, the player operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the instruction, encodes and compresses data such as game pictures, and returns the data to the cloud game client over the network; finally, the cloud game client decodes the data and outputs the game picture.
In an alternative embodiment, the terminal device may be a local terminal device. The local terminal device stores the game program and is used to present the game picture. It interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, the interface may be rendered on a display screen of the terminal or provided to the player through holographic projection. By way of example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen. To make the technical solution of the present disclosure clearer, the following takes the terminal device to be a local terminal device as an example and describes the in-game interaction method of the exemplary embodiment in detail:
in step S110: in response to a first operation applied to the graphical user interface, a game detection area is determined in the game scene based on the first operation.
In an exemplary embodiment of the present disclosure, the first operation includes a sliding operation or a non-contact operation. The non-contact operation may be a floating (hover) touch operation, a technique that detects the operation of an operating medium in front of the screen of a touch terminal using capacitive touch sensing or a sensor carried in the terminal (e.g., a light sensor or an ultrasonic sensor). Of course, the first operation may also be a non-contact operation implemented in other ways, so that the player can interact with the terminal device without touching it. The operating medium of the first operation may be the player's finger, a stylus, or the like, and the present disclosure is not limited in this respect.
In an exemplary embodiment of the present disclosure, before the game detection area is determined in the game scene in response to the first operation acting on the graphical user interface, the terminal device is controlled to enter a feedback response state in response to a trigger operation acting on the terminal device. The trigger operation may be a click or non-contact operation performed by the player on the graphical user interface, or a shortcut trigger operation preset by the game or the terminal device, for example a circle-drawing operation on the graphical user interface. The trigger operation may also be a voice trigger operation, such as receiving a target voice of the player and controlling the terminal device to enter the feedback response state according to that voice. It may also be a touch or pressing operation by the player outside the graphical user interface, for example pressing a volume key of the terminal device, so as to control the terminal device to enter the feedback response state. In addition, a shortcut key may be provided on the terminal device, and the terminal device is controlled to enter the feedback state in response to the player's trigger operation on the shortcut key. Of course, the trigger operation may also act on an external device connected to the terminal device, which the present disclosure does not specifically limit.
In an exemplary embodiment of the present disclosure, a feedback area is provided in the graphical user interface, and the feedback area may be the entire graphical user interface or a specific area on the graphical user interface. Fig. 2 schematically illustrates a structural diagram of the terminal device. As shown in fig. 2, a graphical user interface 200 is provided on the terminal device, and a feedback area 201 is provided in a middle area of the graphical user interface 200. Of course, the feedback area 201 may be any area of the graphical user interface 200, which is not specifically limited by the present disclosure.
In an exemplary embodiment of the present disclosure, controlling the terminal device to enter the feedback response state in response to the trigger operation applied to the terminal device includes: in response to the trigger operation applied to the terminal device, sending prompt information to the terminal device according to the trigger operation, where the prompt information is used to indicate the feedback area 201 in the graphical user interface 200. Specifically, the prompt information is sent to the terminal device according to the trigger operation so that the terminal device displays the prompt information on the graphical user interface 200, where the prompt information includes the feedback area 201 on the graphical user interface 200, game detection area information of the game detection area, and operation prompt information for prompting the player to perform the next operation. As shown in fig. 2, a feedback area 201 is displayed in the graphical user interface 200, and operation prompt information 202 prompting the player to perform the next operation is displayed in the feedback area 201, so that the player performs a touch operation in the feedback area 201 according to the operation prompt information 202 to determine the game detection area. In addition, game detection area information 203 is also displayed in the graphical user interface 200 and is updated in real time according to the player's touch operation; the game detection area information 203 may include the area of the game detection area, and may also include information such as the length and width of the game detection area, which is not specifically limited by the present disclosure.
In an exemplary embodiment of the present disclosure, in response to a first operation applied to the feedback area, the game detection area is determined in the game scene according to the first operation. Specifically, determining the game detection area in the game scene according to the first operation includes: determining a scene position as a detection base point; and determining the game detection area in the game scene based on the detection base point and the first operation. Fig. 3 shows a schematic flowchart of the process of determining the game detection area; as shown in fig. 3, the process specifically includes the following steps:
in step S310, a scene position is acquired, and the scene position is determined as a detection base point.
In an exemplary embodiment of the present disclosure, the scene position may be acquired in either of the following ways:
first, a scene position of a game character in a game scene is acquired.
In an exemplary embodiment of the present disclosure, the scene position of the game character in the game scene may be acquired in real time, or may be acquired when the trigger operation on the terminal device is received.
Second, in response to a third operation applied to the graphical user interface 200, a scene position in the game scene corresponding to the operation point of the third operation is determined.
In an exemplary embodiment of the present disclosure, before the game detection area is determined in the game scene in response to the first operation applied to the graphical user interface 200, a scene position corresponding to an operation point of a third operation in the game scene is determined in response to the third operation applied to the graphical user interface 200. The third operation may be a click operation applied by the player to the graphical user interface 200; the click may be on any position in the graphical user interface 200, and the scene position may be any position in the game scene. That is, the scene position may be a certain position in the game scene displayed by the graphical user interface 200; for example, a touch position is clicked directly in the displayed game scene, and the corresponding position in the game scene serves as the scene position.
Of course, the scene position may also be a position in a part of the game scene not displayed by the graphical user interface 200. For example, the graphical user interface 200 includes a minimap, and the minimap is a thumbnail of the game scene. Specifically, in response to a third operation acting on the minimap, the scene position corresponding to the touch position in the game scene is determined according to the touch position of the operation point corresponding to the third operation in the minimap. The present disclosure does not specifically limit the manner of determining the scene position.
It should be noted that determining the scene position corresponding to the operation point of the third operation in the game scene may be performed before or after the terminal device is controlled to enter the feedback response state, which is not specifically limited in this disclosure.
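The ways of obtaining the scene position that serves as the detection base point can be sketched as follows; the priority order among the sources and the minimap scale factor are assumptions introduced for this example only:

```python
def detection_base_point(character_pos=None, scene_tap=None,
                         minimap_tap=None, minimap_scale=10.0):
    """Resolve the detection base point from one of the sources above:
    a tap directly in the displayed game scene, a tap on the minimap
    (scaled up to scene coordinates), or the game character's own
    position in the game scene."""
    if scene_tap is not None:
        return scene_tap
    if minimap_tap is not None:
        # Minimap coordinates are a thumbnail of the scene; map them back.
        return (minimap_tap[0] * minimap_scale,
                minimap_tap[1] * minimap_scale)
    return character_pos

base = detection_base_point(minimap_tap=(1, 2))
```

Any of the three sources yields a scene position, which is then fixed as the detection base point for step S310.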
In step S320, a changing direction of the game detection area is determined in the game scene according to the moving direction of the first operation.
In an exemplary embodiment of the present disclosure, the changing direction of the game detection area may be determined in either of the following ways:
Firstly, when the moving direction is a first direction, the game detection area is increased in the game scene; when the moving direction is a second direction different from the first direction, the game detection area is decreased in the game scene. For example, fig. 4 is a schematic structural diagram illustrating the determination of the changing direction of the game detection area. As shown in fig. 4, assuming that the first direction is the upward direction 401, the game detection area is increased, and when the second direction is the downward direction 402, the game detection area is decreased; similarly, when the first direction is the rightward direction 403, the game detection area is increased, and when the second direction is the leftward direction 404, the game detection area is decreased. Of course, the first and second directions may be any two different directions, which the present disclosure does not specifically limit.
Secondly, the position of an initial operation point of the first operation is acquired; when the operation point of the first operation moves away from the initial operation point position, the game detection area is increased in the game scene; when the operation point of the first operation moves toward the initial operation point position, the game detection area is decreased in the game scene. For example, fig. 5 shows another structural diagram for determining the changing direction of the game detection area. As shown in fig. 5, an initial operation point position 501 of the first operation is determined, and the game detection area is determined according to the movement of the operation point of the first operation: when the operation point of the first operation moves in the direction 502 away from the initial operation point position, the game detection area is increased, and when it moves in the direction 503 toward the initial operation point position, the game detection area is decreased.
In step S330, the region area of the game detection area is determined in the game scene according to the moving distance of the first operation.
In an exemplary embodiment of the present disclosure, determining a region area of a game detection region in a game scene according to a movement distance of a first operation includes: determining the detection distance in the game scene according to the moving distance; acquiring a detection base point in a game scene; and determining a game detection area by taking the detection base point as a circle center and the detection distance as a radius.
It should be noted that the game detection area may be circular, semicircular, or square. A circular or square game detection area is enlarged or reduced, centered on the detection base point, according to the moving distance of the first operation; a semicircular game detection area is enlarged or reduced, taking the straight line on which the detection base point lies as a reference line, according to the moving direction and moving distance of the first operation.
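Steps S310 to S330 for a circular detection area can be sketched as follows; the mapping of the first and second directions to "up"/"down" and the distance-to-radius scale factor are assumptions for illustration:

```python
import math

def update_detection_radius(radius, move_direction, move_distance, scale=1.0):
    """Enlarge the detection area for the first direction ('up' here)
    and shrink it for the second direction ('down'); the radius never
    goes below zero."""
    if move_direction == "up":
        radius += move_distance * scale
    elif move_direction == "down":
        radius -= move_distance * scale
    return max(radius, 0.0)

def in_detection_area(base_point, radius, point):
    """Circular game detection area: the detection base point is the
    centre and the detection distance (from the moving distance of the
    first operation) is the radius."""
    return math.dist(base_point, point) <= radius

radius = update_detection_radius(0.0, "up", 5.0)      # slide up: enlarge
radius = update_detection_radius(radius, "down", 2.0)  # slide down: shrink
```

With a base point at the origin and the resulting radius, membership of any scene position in the detection area reduces to a Euclidean distance check.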
Continuing to refer to FIG. 1, in step S120, a corresponding feedback indication is determined according to the target virtual object in the game detection area.
In an exemplary embodiment of the present disclosure, the feedback indication includes at least one of a visual indication, an auditory indication, and a tactile indication. The visual indication may be the flashing frequency of an indicator light on the terminal device, or the graphical user interface 200 displaying information of the target virtual object; the auditory indication may be a volume-level indication or an audio-tempo indication generated by the terminal device; and the tactile indication may be a tactile indication perceived by the player when contacting the terminal device. Specifically, an electrostatic tactile indication of a corresponding level may be given at the corresponding position of the terminal device contacted by the player, so that the player can determine the target information of the target virtual object according to the level of static electricity sensed on the terminal device.
In an exemplary embodiment of the present disclosure, a corresponding feedback indication is determined at a feedback position according to the target virtual object in the game detection area. Specifically, the feedback position may be determined in any of the following three ways:
first, in response to a second operation applied to the graphic user interface 200, a feedback position is determined according to the second operation.
In an exemplary embodiment of the present disclosure, after the game detection area is determined in the game scene according to the first operation, in response to a second operation applied to the graphical user interface 200 (the second operation includes a touch operation or a non-contact operation), the feedback position is determined according to the position of the operation point corresponding to the second operation. The feedback position may be any position on the graphical user interface 200, which is not limited in this disclosure.
Secondly, the feedback position is determined according to the position of the operation point of the first operation when the first operation ends.
In an exemplary embodiment of the present disclosure, determining the feedback position according to the operation point position of the first operation at the end of the first operation includes: when the operation point corresponding to the first operation stops moving, determining the position of the operation point at that moment as the feedback position; or when the operation point corresponding to the first operation leaves the graphical user interface 200, determining the position of the operation point at that moment as the feedback position.
Thirdly, a corresponding feedback indication is determined at a preset feedback position according to the target virtual object in the game detection area.
In an exemplary embodiment of the present disclosure, the preset feedback position may be set when the player enters the game for the first time, and the setting may be changed according to the player's requirement after the first setting. The preset feedback position may be any position on the graphical user interface 200, may also be any button on the terminal device, and may also be a certain position on an external device connected to the terminal device, which is not specifically limited by the present disclosure.
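The three ways of fixing the feedback position can be combined into one resolver; note that the disclosure presents them as alternatives, so the priority order used here (explicit second operation, then the end point of the first operation, then the player's preset) is an assumption for this sketch:

```python
def feedback_position(second_op_point=None, first_op_end_point=None,
                      preset_point=None):
    """Pick the feedback position from the three strategies above."""
    if second_op_point is not None:        # 1) explicit second operation
        return second_op_point
    if first_op_end_point is not None:     # 2) where the first op ended
        return first_op_end_point
    return preset_point                    # 3) player-configured preset
```

Whichever strategy fires, the result is a position on (or mapped to) the graphical user interface 200 at which the feedback indication is subsequently produced.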
In an exemplary embodiment of the present disclosure, fig. 6 shows a flowchart for determining a feedback indication. As shown in fig. 6, determining a corresponding feedback indication at the feedback position according to the target virtual object in the game detection area specifically includes the following steps:
in step S610, target information of a target virtual object included in the game detection area is acquired, and a feedback signal is generated based on the target information.
In an exemplary embodiment of the present disclosure, after the game detection area is determined in the game scene, target information of the target virtual object included in the game detection area is acquired. The target virtual object may be a virtual character in the game, or a certain prop in the game, which the present disclosure does not specifically limit. The target information may be one or more of number information, level information, rarity, and the like of the target virtual object. For example, when the acquired target virtual object is a high-level boss, a strong feedback signal is generated.
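Step S610 can be sketched as a mapping from target information to a feedback signal strength; the specific weighting below is an illustrative assumption, since the disclosure only requires that, for example, a high-level boss yield a strong signal:

```python
def feedback_strength(targets):
    """Map target information (level, rarity in [0, 1]) of the virtual
    objects in the detection area to a signal strength in [0, 1]."""
    if not targets:
        return 0.0  # nothing detected: no feedback signal
    strongest = max(0.1 * t["level"] + t["rarity"] for t in targets)
    return min(strongest, 1.0)

# A high-level, rare boss saturates the feedback signal.
boss = {"level": 9, "rarity": 0.9}
```

The resulting scalar is the feedback signal passed on to step S620, where it is rendered as a visual, auditory, or tactile indication of corresponding degree.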
In step S620, a corresponding feedback indication is determined at the feedback position according to the feedback signal.
In an exemplary embodiment of the present disclosure, the terminal device is provided with one or more feedback devices, the feedback devices may generate feedback indications of corresponding degrees according to the feedback signals, and the feedback devices may be disposed in a plane of the terminal device corresponding to the graphical user interface 200 or in an external button of the terminal device, which is not limited in this disclosure.
For example, fig. 7 and fig. 8 show structural schematic diagrams of the position distribution of the feedback devices. As shown in fig. 7, the feedback devices 701 are uniformly disposed in the terminal device in the area corresponding to the graphical user interface 200; as shown in fig. 8, the feedback devices 701 may also be disposed in the power button 801, the volume button 802, or the game shortcut button 803 of the terminal device.
In an exemplary embodiment of the present disclosure, a target feedback device corresponding to the feedback position is determined according to the feedback position, and a feedback signal is sent to the target feedback device, so that the target feedback device generates a feedback indication of a corresponding degree according to the feedback signal.
By way of example, the feedback device may be a tactile actuator disposed in the terminal device, together with a conductive coating and a passivation coating disposed on the surface of the terminal device corresponding to the tactile actuator, in particular on the surface of the display screen of the terminal device. The tactile actuator can generate different current signals according to the specific use scenario and the player's operation on the terminal device, so that the player feels tactile feedback of different strengths at the feedback position. After the feedback position is determined, a feedback signal is sent to the target tactile actuator corresponding to the feedback position; upon receiving the feedback signal, the target tactile actuator generates tactile feedback of a corresponding degree, which may be electrostatic tactile feedback.
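Selecting the target feedback device for a given feedback position can be sketched as a nearest-actuator lookup; the grid layout and actuator names below are assumptions for illustration (cf. the uniform distribution of fig. 7):

```python
import math

# Illustrative actuator positions in screen coordinates.
ACTUATORS = {"A": (0, 0), "B": (10, 0), "C": (0, 10), "D": (10, 10)}

def target_actuator(feedback_pos):
    """Select the feedback device nearest the feedback position; the
    feedback signal is then sent only to that device, which renders
    haptic feedback of the corresponding degree."""
    return min(ACTUATORS,
               key=lambda name: math.dist(ACTUATORS[name], feedback_pos))
```

For a denser actuator grid the same lookup applies unchanged; only the `ACTUATORS` table grows.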
The following describes embodiments of the apparatus of the present disclosure, which can be used to perform the interaction method in the game described above in the present disclosure. For details not disclosed in the embodiments of the device of the present disclosure, please refer to the embodiments of the interaction method in the game described above in the present disclosure.
FIG. 9 schematically shows a block diagram of an interaction device in a game according to one embodiment of the present disclosure.
Referring to fig. 9, an in-game interaction apparatus 900 according to an embodiment of the present disclosure provides a graphical user interface 200 through a terminal device, and the graphical user interface 200 displays at least part of a game scene. The in-game interaction apparatus 900 includes: a response module 901 and a feedback module 902. Specifically:
a response module 901, configured to determine, in response to a first operation applied to the graphical user interface 200, a game detection area in a game scene according to the first operation;
a feedback module 902, configured to determine a corresponding feedback indication according to the target virtual object in the game detection area.
The specific details of the in-game interaction apparatus have been described in detail in the corresponding in-game interaction method, and therefore are not repeated here.
It should be noted that although several modules or units of the apparatus for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Accordingly, various aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system."
An electronic device 1000 according to this embodiment of the invention is described below with reference to fig. 10. The electronic device 1000 shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 10, the electronic device 1000 is embodied in the form of a general purpose computing device. The components of the electronic device 1000 may include, but are not limited to: the at least one processing unit 1010, the at least one memory unit 1020, a bus 1030 connecting different system components (including the memory unit 1020 and the processing unit 1010), and a display unit 1040.
The storage unit stores program code that may be executed by the processing unit 1010, causing the processing unit 1010 to perform the steps according to various exemplary embodiments of the present invention described in the "exemplary methods" section above in this specification. For example, the processing unit 1010 may execute step S110 shown in fig. 1: in response to a first operation applied to the graphical user interface 200, determining a game detection area in a game scene according to the first operation; and step S120: determining a corresponding feedback indication according to the target virtual object in the game detection area.
The storage unit 1020 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 10201 and/or a cache memory unit 10202, and may further include a read-only memory unit (ROM) 10203.
The memory unit 1020 may also include a program/utility 10204 having a set (at least one) of program modules 10205, such program modules 10205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which or some combination thereof may comprise an implementation of a network environment.
Bus 1030 may be any one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, and a local bus using any of a variety of bus architectures.
The electronic device 1000 may also communicate with one or more external devices 1200 (e.g., keyboard, pointing device, Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1000, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1000 to communicate with one or more other computing devices. Such communication may occur through input/output (I/O) interfaces 1050. Also, the electronic device 1000 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 1060. As shown, the network adapter 1060 communicates with the other modules of the electronic device 1000 over the bus 1030. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 1000, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 11, a program product 1100 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (23)

1. An interaction method in a game, which is characterized in that a terminal device provides a graphical user interface, and the graphical user interface displays at least part of game scenes, and comprises the following steps:
responding to a first operation acting on the graphical user interface, and determining a game detection area in the game scene according to the first operation, the game detection area being a portion beyond the player's field of view;
responding to a second operation acting on the graphical user interface, and determining a feedback position according to the second operation;
and determining corresponding feedback indication according to the feedback position of the target virtual object in the game detection area.
2. The interaction method of claim 1, wherein the first operation comprises a sliding operation or a non-contact operation.
3. The interactive method of claim 1, wherein the feedback indication comprises at least one of: visual indication, audible indication and tactile indication.
4. The interaction method according to claim 1, wherein said determining respective feedback indications from target virtual objects in said game detection area comprises:
determining a feedback position according to the position of the operating point of the first operation when the first operation is finished;
and determining corresponding feedback indication according to the feedback position of the target virtual object in the game detection area.
5. The interaction method according to claim 1, wherein said determining respective feedback indications from target virtual objects in said game detection area comprises:
and determining a corresponding feedback indication at a preset feedback position according to the target virtual object in the game detection area.
6. The interaction method according to any one of claims 4 to 5, wherein the determining respective feedback indications from the target virtual objects in the game detection area comprises:
acquiring target information of a target virtual object contained in the game detection area, and generating a feedback signal according to the target information;
and determining a corresponding feedback indication at the feedback position according to the feedback signal.
7. The interaction method according to claim 6, wherein one or more feedback devices are provided on the terminal device;
the determining a corresponding feedback indication at the feedback location according to the feedback signal includes:
determining a target feedback device corresponding to the feedback position according to the feedback position;
and sending the feedback signal to the target feedback device so that the target feedback device generates a feedback indication of a corresponding degree according to the feedback signal.
8. The interaction method according to claim 1, wherein before the responding to the first operation on the graphical user interface, determining a game detection area in the game scene according to the first operation, the method further comprises:
and responding to the trigger operation acted on the terminal equipment, and controlling the terminal equipment to enter a feedback response state.
9. The interaction method according to claim 8, wherein the controlling the terminal device to enter a feedback response state in response to the trigger operation acting on the terminal device comprises:
and responding to a trigger operation acted on the terminal equipment, and sending prompt information to the terminal equipment according to the trigger operation, wherein the prompt information is used for indicating a feedback area in the graphical user interface.
10. The interactive method of claim 1, further comprising:
providing a feedback area on the graphical user interface.
11. The interaction method according to claim 9 or 10, wherein the determining a game detection area in the game scene according to a first operation acting on the graphical user interface in response to the first operation comprises:
in response to the first operation acting on the feedback area, determining the game detection area in the game scene according to the first operation.
12. The interaction method according to claim 1, wherein before determining a game detection area in the game scene according to the first operation in response to the first operation acting on the graphical user interface, the method further comprises:
determining, in response to a third operation acting on the graphical user interface, a scene position corresponding to an operation point of the third operation in the game scene.
13. The interaction method according to claim 12, wherein the graphical user interface includes a minimap, the minimap being a thumbnail of the game scene;
the determining, in response to the third operation acting on the graphical user interface, a scene position corresponding to the operation point of the third operation in the game scene comprises:
in response to the third operation acting on the minimap, determining, according to a touch position of the operation point of the third operation in the minimap, the scene position corresponding to the touch position in the game scene.
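The minimap-to-scene mapping of claim 13 amounts to a linear rescaling of the touch point. A hypothetical sketch, assuming axis-aligned rectangles for both the minimap and the scene (the layout and sizes are illustrative):

```python
def minimap_to_scene(touch, minimap_origin, minimap_size, scene_size):
    """Linearly scale a touch point inside the minimap to full-scene coordinates."""
    u = (touch[0] - minimap_origin[0]) / minimap_size[0]
    v = (touch[1] - minimap_origin[1]) / minimap_size[1]
    return (u * scene_size[0], v * scene_size[1])

# The centre of a 200x200 minimap maps to the centre of a 4000x4000 scene.
pos = minimap_to_scene((100, 100), (0, 0), (200, 200), (4000, 4000))
```

The resulting scene position can then serve as the detection base point of claim 15.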
14. The interaction method of claim 1, wherein the method further comprises:
acquiring a scene position of a game character in the game scene.
15. The interaction method according to claim 12 or 14, wherein the determining a game detection area in the game scene according to the first operation comprises:
determining the scene position as a detection base point;
determining the game detection area in the game scene according to the detection base point and the first operation.
16. The interaction method according to claim 1, wherein the determining a game detection area in the game scene according to the first operation comprises:
determining a change direction of the game detection area in the game scene according to the moving direction of the first operation.
17. The interaction method according to claim 16, wherein the determining the change direction of the game detection area in the game scene according to the movement direction of the first operation comprises:
when the moving direction is a first direction, increasing the game detection area in the game scene;
when the moving direction is a second direction, reducing the game detection area in the game scene, wherein the second direction is different from the first direction.
18. The interaction method according to claim 16, wherein the determining the change direction of the game detection area in the game scene according to the movement direction of the first operation comprises:
acquiring an initial operation point position of the first operation;
increasing the game detection area in the game scene when the operation point of the first operation moves away from the initial operation point position;
reducing the game detection area in the game scene when the operation point of the first operation moves toward the initial operation point position.
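The grow/shrink behaviour of claim 18 can be sketched as a radius update driven by the drag point's distance from where the touch began. The step size and 2D point representation are assumptions for illustration only:

```python
import math

def update_radius(radius, initial_point, prev_point, curr_point, step=5.0):
    """Enlarge the detection radius while the drag moves away from its
    starting point; shrink it as the drag moves back."""
    d_prev = math.dist(initial_point, prev_point)
    d_curr = math.dist(initial_point, curr_point)
    if d_curr > d_prev:
        return radius + step            # moving away: increase the detection area
    if d_curr < d_prev:
        return max(radius - step, 0.0)  # moving back: reduce it, never below zero
    return radius

r = update_radius(20.0, (0, 0), (10, 0), (30, 0))  # drag moves away -> 25.0
r = update_radius(r, (0, 0), (30, 0), (15, 0))     # drag moves back  -> 20.0
```

Claim 17's variant is the same idea keyed on a fixed pair of directions instead of distance from the initial point.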
19. The interaction method according to claim 1, wherein the determining a game detection area in the game scene according to the first operation comprises:
determining the area of the game detection area in the game scene according to the moving distance of the first operation.
20. The interaction method according to claim 19, wherein the determining the area of the game detection area in the game scene according to the moving distance of the first operation comprises:
determining a detection distance in the game scene according to the moving distance of the first operation;
acquiring a detection base point in the game scene;
and determining the area of the game detection area by taking the detection base point as the center of a circle and the detection distance as the radius.
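Claim 20's circular detection area can be sketched as follows; the screen-to-scene scale factor and the membership test are illustrative assumptions, not details from the patent:

```python
import math

def detection_radius(move_distance, scale=10.0):
    """Map an on-screen drag distance to an in-scene detection distance
    (the scale factor is an assumed illustration)."""
    return move_distance * scale

def in_detection_area(base_point, radius, target):
    """True when the target lies inside the circular detection area
    centred on the detection base point."""
    return math.dist(base_point, target) <= radius

radius = detection_radius(3.0)                    # 30.0 scene units
inside = in_detection_area((0, 0), radius, (10, 20))  # dist ~22.36 <= 30 -> True
```

Targets for which `in_detection_area` holds would then feed the feedback signal of claim 6.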
21. An interaction device in a game, wherein a graphical user interface is provided through a terminal device, the graphical user interface displaying at least part of a game scene, the interaction device comprising:
a response module, configured to determine, in response to a first operation acting on the graphical user interface, a game detection area in the game scene according to the first operation, the game detection area being a part of the game scene beyond the player's field of view; and to determine, in response to a second operation acting on the graphical user interface, a feedback position according to the second operation;
and a feedback module, configured to determine a corresponding feedback indication at the feedback position according to a target virtual object in the game detection area.
22. A computer-readable storage medium, on which a computer program is stored, which program, when executed by a processor, carries out the in-game interaction method according to any one of claims 1 to 20.
23. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the in-game interaction method of any one of claims 1 to 20.
CN202010277707.2A 2020-04-08 2020-04-08 Interaction method and device in game, readable storage medium and electronic equipment Active CN111420395B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010277707.2A CN111420395B (en) 2020-04-08 2020-04-08 Interaction method and device in game, readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111420395A CN111420395A (en) 2020-07-17
CN111420395B true CN111420395B (en) 2023-04-18

Family

ID=71557766

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010277707.2A Active CN111420395B (en) 2020-04-08 2020-04-08 Interaction method and device in game, readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111420395B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111773712A (en) * 2020-07-20 2020-10-16 网易(杭州)网络有限公司 Interaction control method and device, electronic equipment and computer readable storage medium
CN112274918A (en) * 2020-11-18 2021-01-29 网易(杭州)网络有限公司 Information processing method and device, storage medium and electronic equipment
CN113398585A (en) * 2021-07-14 2021-09-17 网易(杭州)网络有限公司 Game interaction method and device
CN115671711B (en) * 2021-07-21 2023-11-21 腾讯科技(深圳)有限公司 Cloud game-based device control method and device, electronic device and readable medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN108465238A (en) * 2018-02-12 2018-08-31 网易(杭州)网络有限公司 Information processing method, electronic equipment in game and storage medium
CN110052025A (en) * 2019-05-24 2019-07-26 网易(杭州)网络有限公司 Method for sending information, display methods, device, equipment, medium in game

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US10807001B2 (en) * 2017-09-12 2020-10-20 Netease (Hangzhou) Network Co., Ltd. Information processing method, apparatus and computer readable storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant