CN111729296B - Game interface interaction method and device and electronic terminal - Google Patents

Game interface interaction method and device and electronic terminal

Info

Publication number
CN111729296B
CN111729296B (application CN202010545527.8A)
Authority
CN
China
Prior art keywords: interaction, node, interaction node, determining, virtual light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010545527.8A
Other languages
Chinese (zh)
Other versions
CN111729296A (en)
Inventor
李伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202010545527.8A
Publication of CN111729296A
Application granted
Publication of CN111729296B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene

Abstract

The application provides a game interface interaction method and device and an electronic terminal, relates to the technical field of games, and addresses the technical problem that the operation of making selections in a game is inconvenient for the player. The method comprises the following steps: in response to a rotation operation on a first interaction node, determining the ray path along which the virtual light is reflected by the rotated first interaction node; determining, based on the ray path, at least one second interaction node through which the virtual light passes; and determining the first interaction node and the at least one second interaction node as selected interaction nodes.

Description

Game interface interaction method and device and electronic terminal
Technical Field
The present application relates to the field of game technologies, and in particular, to a game interface interaction method and apparatus, and an electronic terminal.
Background
In games, a player is often required to select among a plurality of different options presented in the game interface, which is how the interface interaction process of the game is carried out. For example, in a skill add point interface in a game, the player may select a skill to which points are to be added. As another example, in a game level (mission) interface, the player may select a mission level to be completed.
Currently, a player typically selects an option by clicking on it. If multiple options need to be selected, each option must be clicked separately, so selecting multiple options takes several click operations; the player's operation process is therefore cumbersome and inconvenient.
Disclosure of Invention
The purpose of the present invention is to provide a game interface interaction method and device and an electronic terminal, so as to solve the technical problem that the operation of making selections in a game is inconvenient for the player.
In a first aspect, an embodiment of the present application provides a game interface interaction method, wherein a graphical user interface is obtained by executing a software application and rendering it on a display of a terminal, a game scene of the game comprises virtual light, the graphical user interface comprises a plurality of interaction nodes capable of reflecting the virtual light, and the virtual light irradiates a first interaction node; the method comprises the following steps:
in response to a rotation operation on the first interaction node, determining the ray path along which the virtual light is reflected by the rotated first interaction node;
determining, based on the ray path, at least one second interaction node through which the virtual light passes;
and determining the first interaction node and the at least one second interaction node as selected interaction nodes.
In one possible implementation, the operation form of the rotation operation includes any one or more of the following:
dragging the interaction node; controlling the interaction node to rotate about a specific axis; and controlling the interaction node to rotate through the number of clicks and the click positions.
In one possible implementation, the step of determining, in response to a rotation operation on the first interaction node, the ray path along which the virtual light is reflected by the rotated first interaction node includes:
determining a first orientation of the rotated first interaction node in response to the rotation operation on the first interaction node;
and determining, according to the first orientation, the ray path along which the virtual light is reflected by the rotated first interaction node.
In one possible implementation, the step of determining, according to the first orientation, the ray path along which the virtual light is reflected by the rotated first interaction node includes:
determining, according to the first orientation, a first incidence angle at which the virtual light irradiates the rotated first interaction node;
determining, according to the first incidence angle, a first reflection angle of the virtual light at the rotated first interaction node; wherein the first reflection angle is equal to the first incidence angle;
determining, according to the first reflection angle, a first reflection direction of the virtual light after it is reflected by the rotated first interaction node;
and determining, according to the first reflection direction, the ray path along which the virtual light is reflected by the rotated first interaction node.
In one possible implementation, the ray path includes: an incidence path along which the virtual light irradiates each second interaction node, and a reflection path along which the virtual light leaves each second interaction node.
In one possible implementation, the selected interaction node is used to represent any one or more of:
the selected interactive node, the marked interactive node and the activated interactive node.
In one possible implementation, the selected interactive nodes are displayed in a designated form to highlight the selected interactive nodes among the interactive nodes.
In one possible implementation, the specified form includes any one or more of the following:
brightness change form, color change form, stereoscopic display form, shadow display form, node information display form.
In one possible implementation, the setting positions of the interaction nodes in the graphical user interface are arranged and combined according to a preset sequence; the preset sequence is used for representing the selection logic relationship among the interaction nodes.
In one possible implementation, the source direction of the virtual light is a fixed direction.
In one possible implementation, a plane model is provided on a side of the interaction node that can reflect the virtual light.
In one possible implementation, the graphical user interface is an information selection interface in the game; the information selection interface includes any one or more of the following:
a skill add point interface, a checkpoint selection interface, and a task selection interface.
In a second aspect, a game interface interaction device is provided, wherein a graphical user interface is obtained by executing a software application and rendering it on a display of a terminal, a game scene of the game comprises virtual light, the graphical user interface comprises a plurality of interaction nodes capable of reflecting the virtual light, and the virtual light irradiates a first interaction node; the device comprises:
a first determining module, configured to determine, in response to a rotation operation on the first interaction node, the ray path along which the virtual light is reflected by the rotated first interaction node;
a second determining module, configured to determine, based on the light path, at least one second interaction node through which the virtual light passes;
and a third determining module, configured to determine the first interaction node and the at least one second interaction node as selected interaction nodes.
In a third aspect, an embodiment of the present application further provides an electronic terminal, including a memory, and a processor, where the memory stores a computer program that can be executed on the processor, and the processor executes the method according to the first aspect.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium storing machine-executable instructions which, when invoked and executed by a processor, cause the processor to perform the method of the first aspect described above.
The embodiment of the application brings the following beneficial effects:
according to the interface interaction method, the device and the electronic terminal for the game, the rotating operation for the first interaction node can be responded, the first interaction node after virtual light passes through the rotation can be determined to generate a reflected light path, then at least one second interaction node after virtual light passes through is determined based on the reflected light path, and then the first interaction node and the at least one second interaction node are determined to be the selected interaction nodes.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic diagram of an example structure of an electronic terminal according to an embodiment of the present application;
FIG. 3 is a flowchart of a method for interface interaction of a game according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of another graphical user interface provided by an embodiment of the present application;
fig. 6 is a schematic structural diagram of an interface interaction device for a game according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms "comprising" and "having" and any variations thereof, as used in the embodiments of the present application, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed but may optionally include other steps or elements not listed or inherent to such process, method, article, or apparatus.
In a game, if a player wants to select a plurality of options, each option has to be clicked separately, and the selection of multiple options is realized through multiple click operations. Taking the skill point-adding process in a game as an example, the player interacts by clicking skill icons; clicking or long-pressing a skill icon to upgrade it is currently the common interaction mode for adding skill points.
Currently, selecting multiple results usually requires multiple clicks, and it is difficult for this kind of interface interaction to activate multiple results simultaneously with one intuitive operation. It is also difficult to tie such interaction to a game world view in which light rays serve as a game cue. In current games, the interaction logic of the level selection interface and the skill add point interface is rarely combined with ray tracing, and an interface interaction mode that uses light rays as the interaction rule is lacking.
In addition, at present a player can only select multiple options through multiple click operations, which makes the operation process cumbersome and inconvenient.
In view of the above, the embodiments of the present application provide a game interface interaction method and device and an electronic terminal, which can solve the technical problem that the operation of making selections in a game is inconvenient for the player.
The interface interaction method of the game in the embodiment of the application can be applied to the electronic terminal. The electronic terminal comprises a display, an input device and a processor, wherein the display is used for presenting a graphical user interface. The input device may be a keyboard, mouse, touch screen, or the like for receiving operations for a graphical user interface.
In practical application, the electronic terminal may be a computer device, or may be a touch terminal such as a touch screen mobile phone or a tablet computer. As an example, the electronic terminal is a touch terminal, and the display and the input device thereof may be integrated as a touch screen for presenting a graphical user interface and receiving operations for the graphical user interface.
In some embodiments, when the graphical user interface is operated by the electronic terminal, the graphical user interface may be used to operate content local to the electronic terminal, and may also be used to operate content of the peer server.
For example, as shown in fig. 1, fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. The application scenario may include an electronic terminal (e.g., a mobile phone 102) and a server 101, which may communicate with the server 101 through a wired network or a wireless network. The electronic terminal is used for running a virtual desktop, and through the virtual desktop, interaction with the server 101 can be performed to realize operation on content in the server 101.
The electronic terminal of the present embodiment will be described by taking the mobile phone 102 as an example. The handset 102 includes Radio Frequency (RF) circuitry 110, memory 120, a touch screen 130, a processor 140, and the like. It will be appreciated by those skilled in the art that the handset structure shown in fig. 2 does not limit the handset, which may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. Those skilled in the art will also appreciate that the touch screen 130 belongs to the user interface (UI), and that the handset 102 may include more or fewer user interface components than shown.
The radio frequency circuit 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol including, but not limited to, global system for mobile communications (Global System of Mobile communication, GSM), general packet radio service (General Packet Radio Service, GPRS), code division multiple access (Code Division Multiple Access, CDMA), wideband code division multiple access (Wideband Code Division Multiple Access, WCDMA), long term evolution (Long Term Evolution, LTE), email, short message service (Short Messaging Service, SMS), and the like.
The memory 120 may be used to store software programs and modules that the processor 140 executes to perform various functional applications and data processing of the handset 102 by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the cell phone 102, etc. In addition, memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The touch screen 130 may be used to display a graphical user interface and to receive user operations on the graphical user interface. A particular touch screen 130 may include a display panel and a touch panel. The display panel may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like. The touch panel may collect contact or non-contact operations by the user on or near it (for example, operations performed by the user on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and generate preset operation instructions. The touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position and the touch gesture of the user, detects the signals produced by the touch operation and transmits the signals to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into information that the processor can handle, sends it to the processor 140, and can receive and execute commands sent by the processor 140. In addition, the touch panel may be implemented with various technologies such as resistive, capacitive, infrared and surface acoustic wave types, or with any technology developed in the future. Further, the touch panel may overlay the display panel. The user may operate on or near the touch panel overlaid on the display panel according to the graphical user interface displayed by the display panel; when the touch panel detects an operation on or near it, the operation is transferred to the processor 140 to determine the user input, and the processor 140 then provides a corresponding visual output on the display panel in response to the user input. The touch panel and the display panel may be implemented as two independent components or may be integrated.
The processor 140 is a control center of the mobile phone 102, and uses various interfaces and lines to connect various parts of the entire mobile phone, and by running or executing software programs and/or modules stored in the memory 120, and invoking data stored in the memory 120, performs various functions of the mobile phone 102 and processes the data, thereby performing overall monitoring of the mobile phone.
Embodiments of the present invention are further described below with reference to the accompanying drawings.
Fig. 3 is a flowchart of a game interface interaction method according to an embodiment of the present application. A graphical user interface is obtained by executing a software application and rendering it on a display of a terminal (e.g., the mobile phone 102 shown in fig. 2); a game scene of the game includes virtual light, the graphical user interface includes a plurality of interaction nodes capable of reflecting the virtual light, and the virtual light irradiates a first interaction node. As shown in fig. 3, the method includes:
In step S310, in response to a rotation operation on the first interaction node, the ray path along which the virtual light is reflected by the rotated first interaction node is determined.
The interaction nodes can be selectable nodes such as skill options, task options, checkpoint options and the like, and all interaction nodes have a reflection function.
The rotation operation may be any of various types of operations capable of changing the orientation of the first interaction node, for example, rotation of the first interaction node by drag, click, rotation, or the like.
Illustratively, as shown in fig. 4, there is a beam of light in the graphical user interface that impinges on interaction node A (the first interaction node). After interaction node A is rotated, as shown in fig. 5, the orientation of interaction node A changes, so the path of the light reflected off the rotated interaction node A changes as well.
In step S320, at least one second interaction node through which the virtual light passes is determined based on the ray path.
The ray path here is the reflected ray path determined in step S310 after the first interaction node has been rotated. The ray path may be produced by the virtual light being reflected multiple times.
For example, as shown in fig. 5, the virtual light is reflected by interaction node A (the first interaction node) and then strikes interaction node B (a second interaction node); after being reflected by interaction node B, it strikes interaction node E (another second interaction node).
In step S330, the first interaction node and the at least one second interaction node are determined as the selected interaction nodes.
Determining the first interaction node and the at least one second interaction node as the selected interaction nodes fuses light reflection into the interface interaction behavior. Using the principle of light reflection, operations that would otherwise require clicking interaction nodes one by one are integrated into a single beam of light: several interaction nodes can be selected by reflection with only one intuitive rotation operation, so the player can select multiple interaction nodes through a single operation and the operation is more convenient. In addition, by combining the interaction logic with ray tracing, an interface interaction mode that uses light rays as the interaction rule is realized. As a game UI interaction method whose rule is light reflection, it makes the interaction behavior fit better with a game world view in which light serves as a clue, can be combined more effectively with games that use light as a clue, and improves the player's game experience.
The above steps are described in detail below.
In some embodiments, rotating the interaction node may be accomplished through a variety of operation modes. Based on this, the operation form of the rotation operation includes any one or more of the following:
dragging the interaction node; controlling the interaction node to rotate about a specific axis; and controlling the interaction node to rotate through the number of clicks and the click positions.
For example, as shown in fig. 4, the player may press the interaction node A at the upper left and drag it, so that the interaction node A rotates about a specific axis and its orientation changes.
Through operation modes such as pressing and dragging a node or rotating it about a specific axis, the player can influence the orientation of the node, and hence the angle at which it reflects the light, with a more convenient and faster operation.
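As an illustration of how one of these operation forms might be implemented, the following sketch maps a horizontal drag distance to a new yaw angle for the first interaction node. The function name, the sensitivity constant and the choice of a single rotation axis are assumptions made for this example; the patent does not prescribe any particular mapping.

```python
# Hypothetical sensitivity: degrees of node rotation per pixel of horizontal drag.
DEGREES_PER_PIXEL = 0.5

def drag_to_rotation(start_x: float, end_x: float, current_angle_deg: float) -> float:
    """Map a horizontal drag on the first interaction node to a new yaw angle.

    A minimal sketch of the "dragging the interaction node" operation form; an
    implementation could equally rotate about another axis or use the number of
    clicks and the click positions instead.
    """
    delta_px = end_x - start_x
    return (current_angle_deg + delta_px * DEGREES_PER_PIXEL) % 360.0

# Example: dragging 90 px to the right rotates the node by 45 degrees.
new_yaw = drag_to_rotation(start_x=100.0, end_x=190.0, current_angle_deg=0.0)  # 45.0
```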
In some embodiments, the reflected virtual ray may be determined based on the rotational orientation of the interaction node. As an example, the step S310 may include the steps of:
step a), in response to the rotation operation on the first interaction node, determining a first orientation of the rotated first interaction node;
step b), determining, according to the first orientation, the ray path along which the virtual light is reflected by the rotated first interaction node.
Changing the orientation of the interaction node changes the angle at which it reflects the virtual light, and therefore the reflected virtual light itself. The reflected virtual light can thus be determined effectively and accurately from the orientation of the interaction node.
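To make the relationship between the first orientation and the reflection concrete, the sketch below converts the node's orientation, here assumed to be a single yaw angle in the interface plane, into the unit normal of its reflective face; the later reflection computation only needs this normal. The 2D representation and the function name are assumptions for illustration.

```python
import math

def orientation_to_normal(yaw_deg: float) -> tuple[float, float]:
    """Return the unit normal of the node's reflective face for a given yaw angle.

    Assumes a 2D interface plane in which a yaw of 0 degrees means the face
    points along +x; the rotation operation changes yaw_deg and therefore the normal.
    """
    rad = math.radians(yaw_deg)
    return (math.cos(rad), math.sin(rad))

# After the rotation operation, the new ("first") orientation fixes the face normal.
face_normal = orientation_to_normal(45.0)  # approximately (0.707, 0.707)
```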
Based on the above steps a) and b), the direction of the reflected virtual light can be determined using the principle that the incident angle of the light is equal to the reflection angle. As an example, the above step b) may include the steps of:
step c), determining, according to the first orientation, a first incidence angle at which the virtual light irradiates the rotated first interaction node;
step d), determining, according to the first incidence angle, a first reflection angle of the virtual light at the rotated first interaction node; wherein the first reflection angle is equal to the first incidence angle;
step e), determining, according to the first reflection angle, a first reflection direction of the virtual light after it is reflected by the rotated first interaction node;
step f), determining, according to the first reflection direction, the ray path along which the virtual light is reflected by the rotated first interaction node.
By using the principle that the angle of incidence equals the angle of reflection, the angle at which the interaction node reflects the virtual light can be determined accurately and quickly, the interaction behavior of the game better matches everyday physical intuition, and the player's game experience is improved.
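Steps c) to f) correspond to the standard law of reflection, which for an incoming unit direction D and a unit face normal N gives the reflected direction R = D - 2(D·N)N. The sketch below is a generic implementation of that formula under the assumed vector representation, not code taken from the patent.

```python
def reflect(direction, normal):
    """Reflect an incoming unit direction about a unit face normal.

    Implements R = D - 2 (D . N) N, which guarantees that the first reflection
    angle equals the first incidence angle (steps c) to f) above).
    """
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return tuple(d - 2.0 * d_dot_n * n for d, n in zip(direction, normal))

# Example: light travelling straight down hits a face whose normal points straight up.
incoming = (0.0, -1.0)
face_normal = (0.0, 1.0)
print(reflect(incoming, face_normal))  # (0.0, 1.0): reflected straight back
```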
In some embodiments, the virtual light reflected off the first interaction node may continue to be reflected off other interaction nodes multiple times. Based on this, the ray path in steps S310 and S320 includes: an incidence path along which the virtual light irradiates each second interaction node, and a reflection path along which the virtual light leaves each second interaction node.
For example, as shown in fig. 5, the virtual light reflected by interaction node A strikes interaction node B, is reflected by interaction node B, then strikes interaction node E, and is reflected by interaction node E in turn.
Through these successive reflections at the interaction nodes, a plurality of interaction nodes can be selected simultaneously with a single beam of virtual light.
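The multi-bounce behavior can be sketched as a small tracing loop that repeatedly reflects the virtual light and collects every second interaction node it strikes. The helper find_hit_node (returning the nearest node intersected by a ray, or None), the node attributes position and normal, and the bounce limit are hypothetical names introduced only for this illustration; reflect is the function sketched above.

```python
def trace_selected_nodes(first_node, light_direction, nodes, max_bounces=8):
    """Trace the reflected virtual light and return all selected interaction nodes.

    first_node is the rotated first interaction node hit by the fixed virtual light;
    every further node struck by the reflected light becomes a second interaction node.
    """
    selected = [first_node]
    direction = reflect(light_direction, first_node.normal)
    origin = first_node.position
    for _ in range(max_bounces):
        hit = find_hit_node(origin, direction, nodes, exclude=selected)  # hypothetical helper
        if hit is None:
            break                                   # the light leaves the interface
        selected.append(hit)                        # another second interaction node
        direction = reflect(direction, hit.normal)  # reflect again and keep tracing
        origin = hit.position
    return selected
```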
In some embodiments, the interaction node where the virtual ray reflection occurs may be marked, activated, selected, etc. As an example, the interaction node selected in step S330 is used to represent any one or more of the following: the selected interactive node, the marked interactive node and the activated interactive node.
For example, as shown in fig. 5, at this time interaction node A, interaction node B and interaction node E are selected, or the node information corresponding to interaction node A, interaction node B and interaction node E may be activated.
In some embodiments, the selected, marked or activated interaction nodes may be highlighted. Based on this, the selected interaction nodes are displayed in a designated form so that they stand out among the interaction nodes.
Wherein the specified form comprises any one or more of the following: brightness change form, color change form, stereoscopic display form, shadow display form, node information display form.
Illustratively, when a virtual ray is incident on an interactive node, corresponding interface information is displayed, e.g., the color of the interactive node changes from green to red, the interactive node changes to a stereoscopic display, the interactive node is lit, etc., to indicate that the interactive node has been selected, marked, or activated.
By highlighting the interaction nodes through which the virtual light passes, the player can more conveniently and quickly identify a plurality of interaction nodes which are selected, marked or activated.
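As a simple illustration of the designated display forms, the sketch below changes the color, brightness and information display of the selected nodes so that they stand out. The attribute names and the particular colors are assumptions; an engine would typically drive its own material or UI properties instead.

```python
SELECTED_COLOR = (255, 64, 64)   # e.g. the node changes from green to red
DEFAULT_COLOR = (64, 255, 64)

def apply_highlight(nodes, selected):
    """Display the selected interaction nodes in a designated form."""
    for node in nodes:
        is_selected = node in selected
        node.color = SELECTED_COLOR if is_selected else DEFAULT_COLOR   # color change form
        node.brightness = 1.0 if is_selected else 0.6                   # brightness change form
        node.show_info = is_selected                                    # node information display form
```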
In some embodiments, the interaction nodes may be arranged in the graphical user interface at preset positions. Based on this, the positions of the interaction nodes in the graphical user interface are arranged and combined according to a preset order, and the preset order represents the selection logic among the plurality of interaction nodes. In practical applications, the source direction of the virtual light may be a fixed direction.
By using a beam with a fixed source in the graphical user interface together with the light-reflection interaction rule, the reflections of the light between the interaction nodes can follow a certain logical order.
For example, in the skill point-adding interaction process, the skill options typically have a certain order, and there is selection logic between higher-order skills and lower-order skills. Through the light-reflection interaction mode and rule provided by the embodiments of the present application, this logical relationship can be embodied in the angle of light reflection, and the action of selecting a skill upgrade can be achieved through different interaction actions.
As another example, checkpoint selection typically follows a precedence logic. Through the light-reflection interaction mode and rule provided by the embodiments of the present application, the actions of unlocking and selecting checkpoints can likewise be performed.
Nodes such as checkpoints or skills that would otherwise have to be clicked one by one are organized so that, once the nodes are arranged and combined, a single beam of light reflected among them can reach all of these nodes. The player can then use the light-reflection principle to select multiple nodes with a single operation, making the operation process more convenient and faster.
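The arrangement described above can be captured as plain data: node positions laid out in a preset order, each node optionally listing the lower-order nodes it logically follows, and a single fixed source direction for the virtual light. The field names and the example skill names below are assumptions made for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class InteractionNode:
    name: str
    position: tuple                 # placement of the node in the graphical user interface
    yaw_deg: float = 0.0            # current orientation of the reflective face
    prerequisites: list = field(default_factory=list)  # selection logic between nodes

# Fixed source direction of the virtual light (assumed here to shine straight down).
LIGHT_DIRECTION = (0.0, -1.0)

# Preset arrangement: higher-order skills are placed after the lower-order skills they
# depend on, so the reflected light can only reach them in the intended order.
SKILL_TREE = [
    InteractionNode("Skill I", position=(0, 0)),
    InteractionNode("Skill II", position=(2, 1), prerequisites=["Skill I"]),
    InteractionNode("Skill III", position=(4, 2), prerequisites=["Skill II"]),
]
```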
In some embodiments, each interaction node may reflect the virtual light through a planar structure; that is, the side of the interaction node that reflects the virtual light is a plane model. With this planar structure, the incidence angle and the reflection angle of the virtual light, and hence the reflected virtual light, can be determined quickly and accurately.
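Because the reflective side of each node is a plane model, the hit point of the virtual light can be found with an ordinary ray-plane intersection, as sketched below; the finite-face check via half_size is an added assumption, since the patent only states that the reflective side is planar.

```python
def intersect_plane_face(origin, direction, face_point, face_normal, half_size=1.0):
    """Return the hit point of a ray on a node's planar reflective face, or None.

    origin/direction describe the incoming virtual light; face_point/face_normal
    describe the plane model on the node; half_size bounds the face (an assumption).
    """
    denom = sum(d * n for d, n in zip(direction, face_normal))
    if abs(denom) < 1e-6:
        return None                                   # ray is parallel to the face
    t = sum((p - o) * n for o, p, n in zip(origin, face_point, face_normal)) / denom
    if t <= 0.0:
        return None                                   # the face is behind the ray origin
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    if any(abs(h - p) > half_size for h, p in zip(hit, face_point)):
        return None                                   # outside the finite face
    return hit
```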
In some embodiments, the methods provided by embodiments of the present application may be applied to a variety of scenarios in a game. Based on this, the graphical user interface is an information selection interface in the game; the information selection interface includes any one or more of the following: a skill add point interface, a checkpoint selection interface, and a task selection interface.
For example, in the skill add point interface, the actions of selecting and upgrading skills can be realized through the light-reflection interaction mode. Similarly, in the level/task interface, actions such as level selection and task unlocking in the game can be realized through the light-reflection interaction mode.
The method provided by the embodiments of the present application therefore has a wide range of practical application scenarios: the interaction logic of interfaces such as the game level selection interface and the skill add point interface can be combined with ray tracing, realizing an interface interaction mode that uses light as the interaction rule and following the development trend of ray-traced games.
FIG. 6 is a schematic structural diagram of a game interface interaction device. A graphical user interface is obtained by executing a software application and rendering it on a display of a terminal; the game scene of the game comprises virtual light, the graphical user interface comprises a plurality of interaction nodes capable of reflecting the virtual light, and the virtual light irradiates a first interaction node. As shown in fig. 6, the game interface interaction device 600 includes:
a first determining module 601, configured to determine, in response to a rotation operation on the first interaction node, the ray path along which the virtual light is reflected by the rotated first interaction node;
a second determining module 602, configured to determine, based on the light path, at least one second interaction node through which the virtual light passes;
a third determining module 603 is configured to determine the first interaction node and the at least one second interaction node as selected interaction nodes.
In some embodiments, the operation form of the rotation operation includes any one or more of the following:
dragging the interaction node; controlling the interaction node to rotate about a specific axis; and controlling the interaction node to rotate through the number of clicks and the click positions.
In some embodiments, the first determining module 601 is specifically configured to:
determining a first orientation of the rotated first interaction node in response to a rotation operation on the first interaction node;
and determining, according to the first orientation, the ray path along which the virtual light is reflected by the rotated first interaction node.
In some embodiments, the first determining module 601 is further configured to:
determining, according to the first orientation, a first incidence angle at which the virtual light irradiates the rotated first interaction node;
determining, according to the first incidence angle, a first reflection angle of the virtual light at the rotated first interaction node; wherein the first reflection angle is equal to the first incidence angle;
determining, according to the first reflection angle, a first reflection direction of the virtual light after it is reflected by the rotated first interaction node;
and determining, according to the first reflection direction, the ray path along which the virtual light is reflected by the rotated first interaction node.
In some embodiments, the ray path includes: an incidence path along which the virtual light irradiates each second interaction node, and a reflection path along which the virtual light leaves each second interaction node.
In some embodiments, the selected interaction node is used to represent any one or more of the following:
the selected interactive node, the marked interactive node and the activated interactive node.
In some embodiments, the selected interactive nodes are displayed in a designated form to highlight the selected interactive nodes in the interactive nodes.
In some embodiments, the specified form includes any one or more of the following:
brightness change form, color change form, stereoscopic display form, shadow display form, node information display form.
In some embodiments, the setting positions of the plurality of interaction nodes in the graphical user interface are arranged and combined according to a preset sequence; the preset sequence is used for representing the selection logic relationship among the plurality of interaction nodes.
In some embodiments, the source direction of the virtual light is a fixed direction.
In some embodiments, the side of the interaction node that reflects the virtual ray is a planar model.
In some embodiments, the graphical user interface is an in-game information selection interface; the information selection interface includes any one or more of the following:
a skill add point interface, a checkpoint selection interface, and a task selection interface.
The game interface interaction device provided by the embodiment of the application has the same technical characteristics as the game interface interaction method provided by the embodiment, so that the same technical problems can be solved, and the same technical effects are achieved.
Corresponding to the interface interaction method of the game, the embodiment of the application also provides a computer readable storage medium, wherein the computer readable storage medium stores machine executable instructions, and the computer executable instructions, when being called and executed by a processor, cause the processor to execute the steps of the interface interaction method of the game.
The interface interaction device of the game provided by the embodiment of the application can be specific hardware on the equipment or software or firmware installed on the equipment. The device provided in the embodiments of the present application has the same implementation principle and technical effects as those of the foregoing method embodiments, and for a brief description, reference may be made to corresponding matters in the foregoing method embodiments where the device embodiment section is not mentioned. It will be clear to those skilled in the art that, for convenience and brevity, the specific operation of the system, apparatus and unit described above may refer to the corresponding process in the above method embodiment, which is not described in detail herein.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
As another example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments provided in the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the interface interaction method of the game described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
It should be noted that: like reference numerals and letters in the following figures denote like items, and thus once an item is defined in one figure, no further definition or explanation of it is required in the following figures, and furthermore, the terms "first," "second," "third," etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present application, used to illustrate rather than limit its technical solutions; the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art will appreciate that any person familiar with the art may still modify the technical solutions described in the foregoing embodiments, easily conceive of changes to them, or make equivalent substitutions for some of their technical features within the technical scope disclosed in the present application; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application, and they are all intended to be encompassed within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. The interface interaction method for the game is characterized in that a graphical user interface is obtained by executing a software application and rendering the software application on a display of a terminal, wherein a game scene of the game comprises virtual light rays, the graphical user interface comprises a plurality of interaction nodes capable of reflecting the virtual light rays, and the virtual light rays irradiate on a first interaction node; the method comprises the following steps:
determining a ray path of the virtual ray reflected by the rotated first interaction node in response to the rotation operation of the first interaction node;
determining at least one second interaction node through which the virtual light passes based on the light path;
determining the first interaction node and the at least one second interaction node as selected interaction nodes;
the setting positions of the interaction nodes in the graphical user interface are arranged and combined according to a preset sequence, and the preset sequence is used for representing the selection logic relationship among the interaction nodes; the graphical user interface is an information selection interface in the game, and the information selection interface comprises any one or more of the following: a skill add point interface, a checkpoint selection interface, and a task selection interface.
2. The method of claim 1, wherein the operation form of the rotation operation includes any one or more of:
dragging the interaction node; controlling the interaction node to rotate about a specific axis; and controlling the interaction node to rotate through the number of clicks and the click positions.
3. The method of claim 1, wherein determining, in response to the rotation operation on the first interaction node, the ray path along which the virtual light is reflected by the rotated first interaction node comprises:
determining a first orientation of the rotated first interaction node in response to a rotation operation for the first interaction node;
and determining a reflected ray path of the virtual ray generated by the rotated first interaction node according to the first orientation.
4. A method according to claim 3, wherein the step of determining, according to the first orientation, the ray path along which the virtual light is reflected by the rotated first interaction node comprises:
determining, according to the first orientation, a first incidence angle at which the virtual light irradiates the rotated first interaction node;
determining, according to the first incidence angle, a first reflection angle of the virtual light at the rotated first interaction node; wherein the first reflection angle is equal to the first incidence angle;
determining, according to the first reflection angle, a first reflection direction of the virtual light after it is reflected by the rotated first interaction node;
and determining, according to the first reflection direction, the ray path along which the virtual light is reflected by the rotated first interaction node.
5. The method of claim 1, wherein the light path comprises: the virtual light irradiates an incident path of each second interaction node, and the virtual light passes through a reflection path of each second interaction node.
6. The method of claim 1, wherein the selected interaction node is configured to represent any one or more of:
the selected interactive node, the marked interactive node and the activated interactive node.
7. The method of claim 6, wherein the selected interactive nodes are displayed in a designated form to highlight the selected interactive nodes in the interactive nodes.
8. The method of claim 7, wherein the specified form comprises any one or more of:
brightness change form, color change form, stereoscopic display form, shadow display form, node information display form.
9. The method of claim 1, wherein the source direction of the virtual light is a fixed direction.
10. The method of claim 1, wherein the side of the interaction node that reflects the virtual light is a planar model.
11. The interface interaction device for the game is characterized in that a graphical user interface is obtained by executing a software application and rendering the software application on a display of a terminal, wherein a game scene of the game comprises virtual light rays, the graphical user interface comprises a plurality of interaction nodes capable of reflecting the virtual light rays, and the virtual light rays irradiate on a first interaction node; the device comprises:
a first determining module, configured to determine, in response to a rotation operation on the first interaction node, the ray path along which the virtual light is reflected by the rotated first interaction node;
a second determining module, configured to determine, based on the light path, at least one second interaction node through which the virtual light passes;
a third determining module, configured to determine the first interaction node and the at least one second interaction node as selected interaction nodes;
the setting positions of the interaction nodes in the graphical user interface are arranged and combined according to a preset sequence, and the preset sequence is used for representing the selection logic relationship among the interaction nodes; the graphical user interface is an information selection interface in the game, and the information selection interface comprises any one or more of the following: a skill add point interface, a checkpoint selection interface, and a task selection interface.
12. An electronic terminal comprising a memory, a processor, the memory having stored therein a computer program executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method of any of the preceding claims 1 to 10.
13. A computer readable storage medium storing machine executable instructions which, when invoked and executed by a processor, cause the processor to perform the method of any one of claims 1 to 10.
CN202010545527.8A 2020-06-15 2020-06-15 Game interface interaction method and device and electronic terminal Active CN111729296B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010545527.8A CN111729296B (en) 2020-06-15 2020-06-15 Game interface interaction method and device and electronic terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010545527.8A CN111729296B (en) 2020-06-15 2020-06-15 Game interface interaction method and device and electronic terminal

Publications (2)

Publication Number Publication Date
CN111729296A CN111729296A (en) 2020-10-02
CN111729296B true CN111729296B (en) 2024-02-09

Family

ID=72649317

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010545527.8A Active CN111729296B (en) 2020-06-15 2020-06-15 Game interface interaction method and device and electronic terminal

Country Status (1)

Country Link
CN (1) CN111729296B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115501580A (en) * 2021-06-23 2022-12-23 中移物联网有限公司 Game acceleration method, optical line terminal and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4240204A1 (en) * 1992-11-30 1994-06-01 Thomas Hohenacker Electronic laser game appts. - deflects visible light beam to target by mirrors rotated by player
CN101119777A (en) * 2005-02-14 2008-02-06 英诺威申玩具股份有限公司 Light-reflecting board game
CN201807196U (en) * 2010-06-18 2011-04-27 浙江理工大学 Laser toy for children
CN106924970A (en) * 2017-03-08 2017-07-07 网易(杭州)网络有限公司 Virtual reality system, method for information display and device based on virtual reality
CN107433036A (en) * 2017-06-21 2017-12-05 网易(杭州)网络有限公司 The choosing method and device of object in a kind of game
CN107688426A (en) * 2017-08-07 2018-02-13 网易(杭州)网络有限公司 The method and apparatus for choosing target object
CN108310769A (en) * 2017-11-01 2018-07-24 深圳市创凯智能股份有限公司 Virtual objects adjusting method, device and computer readable storage medium
CN108854064A (en) * 2018-05-25 2018-11-23 深圳市腾讯网络信息技术有限公司 Interaction control method, device, computer-readable medium and electronic equipment
CN109675301A (en) * 2018-12-29 2019-04-26 成都视初文化科技有限公司 A kind of immersion game system
CN110812838A (en) * 2019-11-13 2020-02-21 网易(杭州)网络有限公司 Method and device for controlling virtual unit in game and electronic equipment

Also Published As

Publication number Publication date
CN111729296A (en) 2020-10-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant