CN107823882B - Information processing method, information processing device, electronic equipment and storage medium


Info

Publication number
CN107823882B
CN107823882B
Authority
CN
China
Prior art keywords: touch operation, game scene, user interface, graphical user, view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711148060.8A
Other languages
Chinese (zh)
Other versions
CN107823882A (en)
Inventor
Zhang Nannan (张楠楠)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201711148060.8A
Publication of CN107823882A
Application granted
Publication of CN107823882B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
        • A63: SPORTS; GAMES; AMUSEMENTS
            • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
                • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
                    • A63F 13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
                    • A63F 13/525: Controlling the output signals based on the game progress, involving aspects of the displayed game scene; changing parameters of virtual cameras
                    • A63F 13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
                • A63F 2300/1075: Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements for converting player-generated signals into game device control signals, specially adapted to detect the point of contact of the player on a surface using a touch screen

Landscapes

  • Engineering & Computer Science
  • Multimedia
  • Human Computer Interaction
  • Theoretical Computer Science
  • User Interface Of Digital Computer

Abstract

The invention discloses an information processing method, an information processing device, electronic equipment and a storage medium, wherein the method comprises the following steps: detecting a first touch operation acting on a first touch operation area, controlling a virtual character to move and/or rotate in a game scene according to the first touch operation, and controlling the presented field of view of the game scene on a graphical user interface according to the position of the virtual character in the game scene; detecting a second touch operation acting on a second touch operation area, and adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation; and, when a preset action of the second touch operation is detected, triggering the presented field of view of the game scene to be adjusted to a view-locked state and locking the current view as the presented field of view of the game scene in the graphical user interface. The invention solves the technical problem that existing mobile-terminal interaction modes cannot lock the view on a specific area for observation.

Description

Information processing method, information processing device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of games, in particular to an information processing method, an information processing device, electronic equipment and a storage medium.
Background
With the development of smart mobile terminals and the game industry, a large number of mobile games on a wide range of themes have emerged to meet players' demands. Team play among multiple players is the core mechanic of many games; for example, a MOBA (Multiplayer Online Battle Arena) game is typically a 5v5 group battle.
In existing mobile game applications, movement is generally controlled with the left hand, while adjusting the presented field of view of the game scene or casting skills is performed with the right hand. This interaction mode does not allow the field of view to be adjusted while the virtual character is being controlled, which increases the player's action load and reduces operating efficiency; it also raises the operating threshold for novice players and degrades the game experience.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
An object of the present invention is to provide an information processing method, an information processing apparatus, an electronic device and a computer-readable storage medium, which overcome, at least to some extent, one or more of the problems caused by the limitations and disadvantages of the related art.
Additional features and advantages of the invention will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present invention, there is provided an information processing method applied to a touch terminal capable of presenting a graphical user interface, where the graphical user interface includes a first touch operation area and a second touch operation area, and content presented by the graphical user interface at least partially includes a game scene and a virtual character, the method including:
detecting a first touch operation acting on the first touch operation area, controlling the virtual character to move and/or rotate in the game scene according to the first touch operation, and controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
detecting a second touch operation acting on the second touch operation area, and adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation;
when a preset action of the second touch operation is detected, the presented field of view of the game scene is triggered to be adjusted to a view-locked state, and the current view is locked as the presented field of view of the game scene in the graphical user interface.
Optionally, the preset action includes one or a combination of: a touch operation exceeding a preset duration, a pressing operation exceeding a preset pressure, and an operation in which the touch object leaves the second touch operation area.
Optionally, the method further comprises:
in the view-locked state, when the virtual character is detected to enter a locked area, reverting to controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
wherein the locked area is the game scene area presented by the graphical user interface in the view-locked state.
Optionally, reverting to controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene comprises:
acquiring the center point coordinates A (x0, y0) of the locked area and the position coordinates B (x1, y1) of the virtual character;
and controlling the virtual camera corresponding to the presented field of view to move in the direction of the vector AB.
Optionally, controlling the virtual camera corresponding to the presented field of view to move in the direction of the vector AB comprises:
moving the virtual camera at a speed of P·N pixels per second, where N is the refresh rate and P is a number of pixels (i.e. P pixels per refresh frame).
Optionally, the method further comprises:
in the view-locked state, when a first touch operation acting on the first touch operation area is detected, the view-locked state is maintained.
Optionally, the method further comprises:
in the view-locked state, when a second touch operation acting on the second touch operation area is detected, continuing to adjust the presented field of view of the game scene in the graphical user interface according to the second touch operation;
and, when the preset action of the second touch operation is detected, triggering the presented field of view of the game scene to be adjusted to the view-locked state again.
Optionally, adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation includes:
changing the position of the virtual camera corresponding to the presented field of view according to the second touch operation;
and determining the presented field of view of the game scene picture on the graphical user interface according to the position of the virtual camera.
Optionally, controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene comprises:
determining the position of the virtual camera corresponding to the presented field of view according to the position of the virtual character;
and controlling the presented field of view of the game scene picture on the graphical user interface according to the movement of the virtual character.
Optionally, the method further comprises:
when a cancel operation acting on a preset area is detected, controlling the presented field of view of the game scene on the graphical user interface to be restored to the state before the second touch operation.
According to a second aspect of the present invention, there is provided an information processing apparatus applied to a touch terminal capable of presenting a graphical user interface, the graphical user interface including a first touch operation area and a second touch operation area, content presented by the graphical user interface including at least a game scene and a virtual character, the apparatus comprising:
a first interaction unit, configured to detect a first touch operation acting on the first touch operation area, control the virtual character to move and/or rotate in the game scene according to the first touch operation, and control the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
a second interaction unit, configured to detect a second touch operation acting on the second touch operation area and adjust the presented field of view of the game scene in the graphical user interface according to the second touch operation;
and a display unit, configured to, when a preset action of the second touch operation is detected, trigger the presented field of view of the game scene to be adjusted to a view-locked state, in which the current view is locked as the presented field of view of the game scene in the graphical user interface.
According to a third aspect of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method of any one of the above.
According to a fourth aspect of the present invention, there is provided an electronic apparatus comprising: a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to perform the information processing method of any one of the above via execution of executable instructions.
In the information processing method, the information processing apparatus, the electronic device and the computer-readable storage medium according to the exemplary embodiments of the present invention, a second touch operation acting on the second touch operation area is detected and the presented field of view of the game scene in the graphical user interface is adjusted according to the second touch operation; when a preset action of the second touch operation is detected, the presented field of view of the game scene is triggered to be adjusted to a view-locked state, and the current view is locked as the presented field of view of the game scene in the graphical user interface.
With the method provided by the invention, on the one hand, the game view (that is, the portion of the game scene displayed on the screen) can be adjusted according to the player's operation and a specific view can be locked, so that the player can conveniently observe the game scene at a specific location and adjust the game strategy accordingly; this spares the player from continuously performing view-adjustment operations and lets the player concentrate on controlling his or her own virtual character, observing the game scene at the specific location, and performing other related game operations (such as viewing, purchasing or using items). On the other hand, whether the view reverts to following the player's virtual character can be determined automatically according to the position of the virtual character in the game scene, which is simple and convenient to operate and improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 schematically illustrates a flow chart of a method of information processing in an exemplary embodiment of the disclosure;
FIG. 2 schematically illustrates a diagram of a first graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 3 schematically illustrates a diagram of a second graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a diagram of a third graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 5 schematically illustrates a diagram of a fourth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, an information processing method is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one shown here.
The exemplary embodiment first discloses an information processing method, which can be applied to a touch terminal capable of presenting a game scene picture in a graphical user interface; the game scene picture may belong to a mobile game application of any genre, such as a shooting or puzzle game. The touch terminal may be any of various electronic devices having a touch screen, such as a mobile phone, a tablet computer, a notebook computer, a game console or a PDA. A software application is executed on a processor of the touch terminal and rendered on a display of the touch terminal to obtain the graphical user interface, where the graphical user interface includes a first touch operation area and a second touch operation area, and the content presented by the graphical user interface at least partially includes a game scene and a virtual character.
As shown in fig. 1, the information processing method may include the following steps:
Step S110: detecting a first touch operation acting on the first touch operation area, controlling the virtual character to move and/or rotate in the game scene according to the first touch operation, and controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
Step S130: detecting a second touch operation acting on the second touch operation area, and adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation;
Step S150: when a preset action of the second touch operation is detected, triggering the presented field of view of the game scene to be adjusted to a view-locked state, and locking the current view as the presented field of view of the game scene in the graphical user interface.
With the method provided by the invention, on the one hand, the game view (that is, the portion of the game scene displayed on the screen) can be adjusted according to the player's operation and a specific view can be locked, so that the player can conveniently observe the game scene at a specific location and adjust the game strategy accordingly; this spares the player from continuously performing view-adjustment operations and lets the player concentrate on controlling his or her own virtual character, observing the game scene at the specific location, and performing other related game operations (such as viewing, purchasing or using items). On the other hand, whether the view reverts to following the player's virtual character can be determined automatically according to the position of the virtual character in the game scene, which is simple and convenient to operate and improves the user experience.
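For concreteness, the following is a minimal sketch of how steps S110, S130 and S150 listed above could be dispatched from a touch-down point, assuming two rectangular touch operation areas; the region sizes, names and the Python form are illustrative assumptions and not part of the claimed method.

```python
# Minimal sketch (hypothetical names and region sizes; not the claimed implementation).
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

# Assumed layout on a 1280x720 screen: joystick region at the lower left, view-drag strip along the top.
FIRST_TOUCH_AREA = Rect(0, 420, 300, 300)
SECOND_TOUCH_AREA = Rect(0, 0, 1280, 200)

def dispatch_touch(x: float, y: float) -> str:
    """Route a touch-down point to step S110 or step S130 of the method."""
    if FIRST_TOUCH_AREA.contains(x, y):
        return "S110: control character movement/rotation; view follows the character"
    if SECOND_TOUCH_AREA.contains(x, y):
        return "S130: adjust the presented field of view; a preset action then locks it (S150)"
    return "outside both touch operation areas"

print(dispatch_touch(150, 500))   # first touch operation area
print(dispatch_touch(600, 100))   # second touch operation area
```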
Next, the steps of the information processing method in the present exemplary embodiment are further described with reference to fig. 2 to 5.
In the present exemplary embodiment, executing a software application on a processor of the mobile terminal and rendering on a touch display of the mobile terminal results in a graphical user interface 200, the content presented by the graphical user interface 200 at least partially comprising a game scene 210, and a virtual character 220; the graphical user interface 200 includes a first touch operation area 230 and a second touch operation area 240.
The content presented by the graphical user interface 200 may include the whole game scene 210 or only part of it. For example, in an embodiment of the present invention, as shown in fig. 2, because the game scene 210 is relatively large, only part of the game scene 210 is displayed on the graphical user interface 200 of the mobile terminal during the game.
In step S110, a first touch operation acting on the first touch operation area is detected, the virtual character is controlled to move and/or rotate in the game scene according to the first touch operation, and the presented field of view of the game scene on the graphical user interface is controlled according to the position of the virtual character in the game scene.
In the exemplary embodiment, as shown in fig. 2, a first touch operation area 230 is provided on the graphical user interface 200, and when a first touch operation acting on the first touch operation area 230 is detected, the virtual character 220 is controlled to move and/or rotate in the game scene 210 according to the movement of the touch point of the first touch operation.
Specifically, the first touch operation area 230 may be an area with a visual indication effect in the graphical user interface 200, or may be an area without a visual indication effect; an operation area, such as a virtual joystick or a direction control virtual key, may also be displayed in the first touch operation area 230, which is not limited in this exemplary embodiment.
In an embodiment of the present invention, the first touch operation area 230 is a virtual joystick operation area located at the lower left of the graphical user interface 200, and the virtual character 220 is controlled to move and/or rotate in the game scene 210 according to the first touch operation received by the virtual joystick operation area.
It is understood that, in other embodiments, the first touch operation area 230 may also be a virtual cross key area/virtual direction key (D-PAD) area, and the virtual character 220 is controlled to move and/or rotate in the game scene 210 according to the first touch operation received by the virtual cross key area.
As an alternative embodiment, the first touch operation area 230 may be an area with a visual indication in the graphical user interface 200; for example, the first touch operation area 230 may have a bounding box, a fill color over a certain range, a predetermined transparency over a certain range, or any other manner of visually distinguishing the first touch operation area 230. The virtual character 220 is controlled to move and/or rotate in the game scene 210 according to the first touch operation received by the first touch operation area 230. A first touch operation area 230 with a visual indication enables the user to locate the area quickly and can lower the operating difficulty for a game beginner.
As another alternative embodiment, the first touch operation area 230 may be an area of the graphical user interface 200 without a visual indication. A first touch operation area 230 without a visual indication does not cover or affect the game picture, providing a better picture effect and saving screen space. However, because such an area has no visual indication and is not easily perceived by the player, as an improved implementation a visual guidance control may be displayed in the first touch operation area 230; for example, in an embodiment of the present invention in which a virtual joystick is used as the direction control scheme of the virtual character 220, the virtual joystick is displayed in the first touch operation area 230 to visually guide the player.
It will be appreciated that a plurality of skill controls may also be provided on the graphical user interface 200 on a different side from the first touch operation area 230 to provide the player with skill-casting control functions. In an embodiment of the present invention, the first touch operation area 230 is disposed at the lower left of the graphical user interface 200 and the skill controls are disposed at the lower right of the graphical user interface 200, so that the virtual character can conveniently be controlled to move and/or rotate in the game scene with the left hand while skills are cast by operating the skill controls with the right hand.
In the present exemplary embodiment, controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene includes: determining the position of the virtual camera corresponding to the presented field of view according to the position of the virtual character; and controlling the presented field of view of the game scene picture on the graphical user interface according to the movement of the virtual character.
In a first-person game, the virtual camera acts as the user's "eye" in the game: the virtual camera may be disposed at the head of the virtual character, its orientation rotates as the virtual character rotates, and the game scene content rendered on the display of the touch terminal is equivalent to the scene content captured by the virtual camera. In a third-person game, the virtual camera may be disposed above and behind the virtual character so as to capture the whole game scene. A mapping relationship can be set between the vector distance of the virtual joystick control and the rotation angle of the virtual camera so as to control the rotation of the virtual camera. In some games (for example, 2.5D games), the virtual camera may be positioned above the head of the virtual character so as to capture, at a fixed top-down angle, the game scene within a certain area around the virtual character.
Specifically, the virtual character 220 is controlled to move in the game scene according to the movement of the touch point of the first touch operation, which in turn moves the virtual camera corresponding to the graphical user interface 200; the position of that virtual camera is controlled according to the position of the virtual character 220 in the game scene 210, so that the presented field of view of the game scene 210 is rendered on the graphical user interface 200 in real time.
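As an illustration of this camera-follow logic, the sketch below derives the camera position from the character position with a fixed overhead offset, in the style of a 2.5D top-down view; the offset value, the Vec3 type and the function name are assumptions made for this example.

```python
# Minimal sketch of a follow camera for a top-down (2.5D) view; the offset,
# the Vec3 type and the function name are assumptions made for illustration.
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

CAMERA_OFFSET = Vec3(0.0, 10.0, -6.0)   # assumed height and pull-back above the character

def camera_position(character_pos: Vec3) -> Vec3:
    """The presented field of view is derived from the character's current position."""
    return Vec3(character_pos.x + CAMERA_OFFSET.x,
                character_pos.y + CAMERA_OFFSET.y,
                character_pos.z + CAMERA_OFFSET.z)

# As the character moves, the camera (and therefore the rendered view) moves with it.
print(camera_position(Vec3(12.0, 0.0, 7.5)))   # Vec3(x=12.0, y=10.0, z=1.5)
```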
In step S130, a second touch operation acting on the second touch operation area is detected, and the presented field of view of the game scene in the graphical user interface is adjusted according to the second touch operation.
In the present exemplary embodiment, as shown in fig. 2, a second touch operation area 240 is provided on the graphical user interface 200, and when a second touch operation acting on the second touch operation area 240 is detected, the presented field of view of the game scene 210 in the graphical user interface 200 is adjusted according to the movement of the touch point of the second touch operation.
The second touch operation area 240 may be an area with a visual indication effect in the graphical user interface 200 or an area without such an effect; the shape, size and visual presentation of the second touch operation area 240 are not limited in this exemplary embodiment. For example, in an embodiment of the invention, as shown in fig. 2, the second touch operation area 240 is a rectangular area along the top of the graphical user interface 200, and, in order to avoid blocking the game scene 210, it is an area without a visual indication. In other embodiments, the second touch operation area 240 may be an area whose geometric shape follows the outer contour of the graphical user interface 200, an area of any shape and size, or an area with a predetermined transparency and a visual indication.
In the present exemplary embodiment, adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation includes: changing the position of the virtual camera corresponding to the presented field of view according to the second touch operation; and determining the presented field of view of the game scene picture on the graphical user interface according to the position of the virtual camera.
In some games (for example, 2.5D games), the virtual camera may be positioned above the head of the virtual character so as to capture, from a top-down view, the game scene in an area around the virtual character. In that case, the position of the virtual camera corresponding to the presented field of view is changed according to the second touch operation, and the presented field of view of the game scene picture on the graphical user interface is determined according to the position of the virtual camera.
In the present exemplary embodiment, adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation includes: adjusting the movement track of the field of view of the game scene in the graphical user interface according to the movement track of the touch point of the second touch operation.
A sliding operation is taken as an example of the second touch operation. Specifically, the second touch operation includes a sliding operation acting on the second touch operation area 240: the sliding operation acting on the second touch operation area is detected, and the movement track of the field of view of the game scene in the graphical user interface is adjusted according to the movement track of the touch point of the sliding operation.
In an embodiment of the present invention, when the sliding operation acting on the second touch operation area 240 is detected, the sliding track of the touch point of the sliding operation is detected in real time, and the field of view of the game scene 210 in the graphical user interface 200 is controlled to move in a manner consistent with the sliding track of the touch point; that is, the movement track of the virtual camera is consistent with the sliding track of the touch point of the sliding operation.
For example, fig. 2 shows an initial presented field of view of the game scene 210 in the graphical user interface 200, in which the virtual character 220 in the game scene 210 is displayed at a middle position of the graphical user interface 200. When a sliding operation acting on the second touch operation area 240 is detected, for example an upward slide over a certain distance, the field of view of the game scene 210 in the graphical user interface 200 is controlled to move in a manner consistent with the sliding track of the touch point; that is, as shown in fig. 3, the field of view of the game scene 210 is controlled to move upwards by a certain distance, and the virtual character 220 is then no longer displayed in the graphical user interface 200.
In other embodiments, when the sliding operation acting on the second touch operation area 240 is detected, the field of view of the game scene 210 in the graphical user interface 200 may instead be controlled to move in a manner opposite to the sliding track of the touch point, which corresponds to directly dragging the game scene 210.
For example, starting again from the initial presented field of view of fig. 2, in which the virtual character 220 in the game scene 210 is displayed at a middle position of the graphical user interface 200, when a downward slide over a certain distance acting on the second touch operation area 240 is detected, the field of view of the game scene 210 in the graphical user interface 200 is controlled to move in a manner opposite to the sliding track of the touch point; that is, as shown in fig. 3, the field of view of the game scene 210 is controlled to move upwards by a certain distance.
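The two slide mappings described above (the view moving with the slide versus the scene being dragged against it) can be sketched as follows; the scale factor and the function signature are assumptions for illustration.

```python
# Minimal sketch of the two slide mappings; the scale factor and the signature
# are assumptions, not values taken from the disclosure.
def pan_camera(camera_xy, touch_delta_xy, drag_scene=False, scale=0.05):
    """Return a new camera ground-plane position for a slide of touch_delta_xy pixels.

    drag_scene=False: the view moves with the slide (camera follows the finger).
    drag_scene=True:  the view moves against the slide (the scene itself is dragged).
    """
    sign = -1.0 if drag_scene else 1.0
    cx, cy = camera_xy
    dx, dy = touch_delta_xy
    return (cx + sign * dx * scale, cy + sign * dy * scale)

print(pan_camera((0.0, 0.0), (0.0, 120.0)))                    # view moves with the slide
print(pan_camera((0.0, 0.0), (0.0, 120.0), drag_scene=True))   # view moves against the slide
```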
In step S150, when a preset action of the second touch operation is detected, the presented field of view of the game scene is triggered to be adjusted to a view-locked state, and the current view is locked as the presented field of view of the game scene in the graphical user interface.
Specifically, fig. 2 shows an initial presented field of view of the game scene 210 in the graphical user interface 200, in which the virtual character 220 in the game scene 210 is displayed at a middle position of the graphical user interface 200. If a battle breaks out at position A, the player needs to observe the game scene picture at position A. When a second touch operation acting on the second touch operation area 240 is detected, for example an upward sliding operation, the presented field of view of the game scene 210 in the graphical user interface 200 is adjusted according to the sliding operation so that position A falls within the presented field of view of the graphical user interface 200, as shown in fig. 3. When the preset action of the second touch operation is detected, the presented field of view of the game scene is triggered to be adjusted to the view-locked state, and the current view is locked as the presented field of view of the game scene in the graphical user interface; that is, the presented field of view containing position A is locked as the presented field of view of the game scene 210 in the graphical user interface 200.
In the present exemplary embodiment, the preset action includes one or a combination of: a touch operation exceeding a preset duration, a pressing operation exceeding a preset pressure, and an operation in which the touch object leaves the second touch operation area.
Specifically, in an embodiment of the present invention, the preset action of the second touch operation may be the touch object leaving the second touch operation area 240. For example, while the presented field of view of the game scene 210 in the graphical user interface 200 is being adjusted by the second touch operation, when the current view needs to be locked the user only has to lift the touch object (for example, a finger) off the second touch operation area 240, which triggers the presented field of view of the game scene 210 to be adjusted to the view-locked state and locks the current view as the presented field of view of the game scene 210 in the graphical user interface 200.
As an alternative embodiment, the preset action of the second touch operation may be a pressing operation exceeding a preset pressure. For example, while the presented field of view of the game scene 210 in the graphical user interface 200 is being adjusted by the second touch operation, when a pressing operation exceeding the preset pressure is detected during the second touch operation, that is, when the user presses harder with the finger in the second touch operation area 240, the presented field of view of the game scene 210 is triggered to be adjusted to the view-locked state and the current view is locked as the presented field of view of the game scene 210 in the graphical user interface 200.
In other embodiments, the preset action of the second touch operation may also be a touch operation exceeding a preset duration. For example, while the presented field of view of the game scene 210 in the graphical user interface 200 is being adjusted by the second touch operation, when a touch operation exceeding the preset duration is detected during the second touch operation, that is, when the user performs a long press in the second touch operation area 240, the presented field of view of the game scene 210 is triggered to be adjusted to the view-locked state and the current view is locked as the presented field of view of the game scene 210 in the graphical user interface 200.
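A minimal sketch of recognizing the three preset actions follows; the threshold values and the TouchSample fields are illustrative assumptions, since the disclosure does not fix concrete values for the preset duration or the preset pressure.

```python
# Minimal sketch of recognizing the three preset actions; the threshold values
# and the TouchSample fields are assumptions, since the disclosure does not fix them.
from dataclasses import dataclass

LONG_PRESS_SECONDS = 0.8      # assumed "preset time"
PRESSURE_THRESHOLD = 0.7      # assumed "preset pressure", normalized to 0..1

@dataclass
class TouchSample:
    duration: float           # seconds the touch has lasted so far
    pressure: float           # current pressure reading, 0..1
    released: bool            # True when the touch object leaves the second area

def is_preset_action(sample: TouchSample) -> bool:
    """Any one of the three conditions (or a combination) triggers the view-locked state."""
    return (sample.released
            or sample.duration >= LONG_PRESS_SECONDS
            or sample.pressure >= PRESSURE_THRESHOLD)

print(is_preset_action(TouchSample(duration=0.2, pressure=0.3, released=True)))   # True: finger lifted
print(is_preset_action(TouchSample(duration=0.2, pressure=0.3, released=False)))  # False: keep adjusting
```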
In an alternative embodiment, in order to make it easy to view the game scene picture around the virtual character, when the presented field of view of the game scene is triggered to be adjusted to the view-locked state, an auxiliary display window is provided on the graphical user interface; the auxiliary display window displays the presented field of view of the game scene on the graphical user interface as determined by the position of the virtual character in the game scene.
As shown in fig. 4, when the adjustment of the presented field of view of the game scene to the view-locked state is triggered, for example when the game scene picture at position A is locked, an auxiliary display window 250 is provided on the graphical user interface 200; the auxiliary display window 250 displays the presented field of view of the game scene 210 on the graphical user interface 200 as determined by the position of the virtual character 220 in the game scene 210.
In the present exemplary embodiment, in the view-locked state, when the virtual character is detected to enter the locked area, the view reverts to the presented field of view of the game scene on the graphical user interface controlled according to the position of the virtual character in the game scene; the locked area is the game scene area presented by the graphical user interface in the view-locked state.
Specifically, in the view-locked state, when the virtual character 220 is detected to enter the locked area, the view reverts to the presented field of view of the game scene 210 on the graphical user interface 200 determined by the position of the virtual character 220 in the game scene 210, that is, the field of view before the second touch operation. The locked area is the area of the game scene 210 presented by the graphical user interface 200 in the view-locked state. For example, before the view-locked state, the area of the game scene 210 presented by the graphical user interface 200 is centered on the virtual character 220 (as shown in fig. 2); after the view-locked state is entered, the area of the game scene 210 presenting battle position A (as shown in fig. 3) on the graphical user interface 200 is the locked area.
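A simple way to test whether the virtual character has entered the locked area is an axis-aligned containment check on scene coordinates, as sketched below; representing the locked area as a rectangle is an assumption made for this example.

```python
# Minimal sketch of the entry test; representing the locked area as an
# axis-aligned rectangle in scene coordinates is an assumption for illustration.
def character_in_locked_area(char_xy, lock_rect) -> bool:
    """lock_rect = (x_min, y_min, x_max, y_max) of the scene area shown while locked."""
    x, y = char_xy
    x_min, y_min, x_max, y_max = lock_rect
    return x_min <= x <= x_max and y_min <= y <= y_max

# Checked every frame while the view is locked; True reverts the view to
# following the character (see the vector AB step described next).
print(character_in_locked_area((42.0, 17.5), (30.0, 10.0, 60.0, 40.0)))   # True
```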
In the present exemplary embodiment, reverting to controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene includes: acquiring the center point coordinates A (x0, y0) of the locked area and the position coordinates B (x1, y1) of the virtual character; and controlling the virtual camera corresponding to the presented field of view to move in the direction of the vector AB.
In the present exemplary embodiment, controlling the virtual camera corresponding to the presented field of view to move in the direction of the vector AB includes moving the virtual camera at a speed of P·N pixels per second, where N is the refresh rate and P is a number of pixels (i.e. P pixels per refresh frame).
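Putting the two preceding paragraphs together, the camera restore step can be sketched as a per-frame move of P pixels along the unit vector from the locked-area center A towards the character position B, giving P·N pixels per second at a refresh rate of N; the concrete values of P and N below are assumptions.

```python
# Minimal sketch of the restore step: each frame the camera moves P pixels along
# the unit vector from the locked-area center A towards the character position B,
# i.e. P*N pixels per second at a refresh rate of N frames per second.
# The values of P and N below are assumptions for illustration.
import math

P_PIXELS_PER_FRAME = 4.0   # assumed P
REFRESH_RATE = 60          # assumed N (frames per second) -> 240 pixels per second

def step_camera(camera_xy, lock_center_a, character_b):
    """Advance the camera one frame (P pixels) in the direction of the vector AB."""
    ax, ay = lock_center_a
    bx, by = character_b
    dx, dy = bx - ax, by - ay
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return camera_xy                   # A and B coincide; nothing to do
    ux, uy = dx / dist, dy / dist          # unit vector along AB
    cx, cy = camera_xy
    return (cx + ux * P_PIXELS_PER_FRAME, cy + uy * P_PIXELS_PER_FRAME)

print(step_camera((100.0, 100.0), (100.0, 100.0), (160.0, 180.0)))   # (102.4, 103.2)
```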
In the present exemplary embodiment, in the view-locked state, when a first touch operation acting on the first touch operation area is detected, the view-locked state is maintained.
Specifically, in the view-locked state, when the first touch operation acting on the first touch operation area 230 is detected, the view-locked state is maintained; that is, in the view-locked state, a first touch operation acting on the first touch operation area 230 does not change the presented field of view of the game scene 210 on the graphical user interface 200.
In the exemplary embodiment, in the view-locked state, when a second touch operation acting on the second touch operation area is detected, the presented field of view of the game scene in the graphical user interface continues to be adjusted according to the second touch operation; and, when the preset action of the second touch operation is detected, the presented field of view of the game scene is triggered to be adjusted to the view-locked state again.
Specifically, in the view-locked state (for example, fig. 3 shows the field of view of the game scene 210 in the graphical user interface 200 in the view-locked state), when a second touch operation (for example, a rightward sliding operation) acting on the second touch operation area is detected, the presented field of view of the game scene in the graphical user interface continues to be adjusted according to the second touch operation; when the preset action of the second touch operation is detected, the presented field of view of the game scene is triggered to be adjusted to the view-locked state again, as shown in fig. 4.
As described above, the preset action includes one or a combination of: a touch operation exceeding a preset duration, a pressing operation exceeding a preset pressure, and an operation in which the touch object leaves the second touch operation area.
In the present exemplary embodiment, when a cancel operation acting on a preset area is detected, the presented field of view of the game scene on the graphical user interface is controlled to be restored to the state before the second touch operation.
Specifically, a preset area is provided on the graphical user interface 200, and when a cancel operation acting on the preset area is detected, the presented field of view of the game scene 210 on the graphical user interface 200 is controlled to return to the state before the second touch operation.
The preset area may be an area fixedly arranged on the graphical user interface, or an area generated after the view-locked state is entered; in the latter case, in order to save screen space and avoid blocking or affecting the game picture, the area is hidden after the presented field of view of the game scene on the graphical user interface is restored to the state before the second touch operation. It is to be understood that the preset area may be an area with a visual indication effect in the graphical user interface or an area without such an effect; the shape, size and visual presentation of the preset area are not limited in the present exemplary embodiment. As a preferred embodiment, in order to allow the user to identify the area quickly, the preset area is set as a circular control with a visual presentation effect.
The cancel operation may be one or a combination of operations such as tapping, long pressing or deep pressing acting on the preset area.
In this exemplary embodiment, controlling the presented field of view of the game scene on the graphical user interface to return to the state before the second touch operation includes: controlling the presented field of view of the game scene on the graphical user interface to return to the presented field of view before the second touch operation; or controlling the presented field of view of the game scene on the graphical user interface to return to the presented field of view calculated according to the field-of-view calculation logic used before the second touch operation.
Specifically, controlling the presented field of view of the game scene picture on the graphical user interface to return to the presented field of view before the second touch operation means controlling the presented field of view to return to the state before the second touch operation: the position and the angle/direction of the virtual camera in the game scene are restored to their state before the second touch operation. In other words, the presented field of view of the game scene picture on the graphical user interface is controlled on the basis of the position of the virtual camera in game scene coordinates, and its shooting direction in those coordinates, before the second touch operation.
Controlling the presented field of view of the game scene picture on the graphical user interface to return to the presented field of view calculated according to the field-of-view calculation logic used before the second touch operation means restoring the field of view to the control state before the second touch operation. For example, before the second touch operation the game may calculate the field of view according to a predetermined calculation logic (for example, the virtual camera is arranged above the head of the virtual character and its position moves as the virtual character moves); in that case, restoring the field of view to the state before the second touch operation means calculating the field of view with the calculation logic used before the second touch operation, i.e. controlling the presented field of view of the game scene picture on the graphical user interface on the basis of the association between the current position of the virtual character in game scene coordinates and the shooting direction of the virtual camera.
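The two restore strategies described above can be contrasted in a short sketch: either the saved camera pose from before the second touch operation is reinstated directly, or the pre-lock follow logic is re-run from the character's current position. The CameraState shape and the overhead offset are assumptions for illustration.

```python
# Minimal sketch contrasting the two restore strategies described above; the
# CameraState shape and the overhead offset are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class CameraState:
    position: tuple        # camera position in game scene coordinates
    direction: tuple       # camera shooting direction

def restore_saved_pose(saved_before_second_touch: CameraState) -> CameraState:
    """Strategy 1: reinstate the exact camera pose recorded before the second touch operation."""
    return saved_before_second_touch

def restore_follow_logic(character_pos) -> CameraState:
    """Strategy 2: recompute the view with the pre-lock logic (camera above the character's head)."""
    x, y = character_pos
    return CameraState(position=(x, y + 10.0), direction=(0.0, -1.0))

print(restore_saved_pose(CameraState((3.0, 8.0), (0.0, -1.0))))
print(restore_follow_logic((5.0, 2.0)))
```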
With the method provided by the invention, on the one hand, the game view (that is, the portion of the game scene displayed on the screen) can be adjusted according to the player's operation and a specific view can be locked, so that the player can conveniently observe the game scene at a specific location and adjust the game strategy accordingly; this spares the player from continuously performing view-adjustment operations and lets the player concentrate on controlling his or her own virtual character, observing the game scene at the specific location, and performing other related game operations (such as viewing, purchasing or using items). On the other hand, whether the view reverts to following the player's virtual character can be determined automatically according to the position of the virtual character in the game scene, which is simple and convenient to operate and improves the user experience.
According to an embodiment of the present invention, there is provided an information processing apparatus applied to a touch terminal capable of presenting a graphical user interface, where the graphical user interface includes a first touch operation area and a second touch operation area and the content presented by the graphical user interface at least partially includes a game scene and a virtual character, the apparatus comprising:
a first interaction unit, configured to detect a first touch operation acting on the first touch operation area, control the virtual character to move and/or rotate in the game scene according to the first touch operation, and control the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
a second interaction unit, configured to detect a second touch operation acting on the second touch operation area and adjust the presented field of view of the game scene in the graphical user interface according to the second touch operation;
and a display unit, configured to, when a preset action of the second touch operation is detected, trigger the presented field of view of the game scene to be adjusted to a view-locked state, in which the current view is locked as the presented field of view of the game scene in the graphical user interface.
The details of each unit of the information processing apparatus have already been described in the corresponding information processing method and are therefore not repeated here.
It should be noted that, although several modules or units of the device for performing the actions are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more of the modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
According to an embodiment of the present invention, there is further provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-mentioned method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the above-mentioned "exemplary methods" section of the present description. The program product may employ a portable compact disc read-only memory (CD-ROM), include program code, and be run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to an embodiment of the present invention, there is also provided an electronic apparatus including: the processing components, which may further include one or more processors, and memory resources, represented by memory, for storing instructions, such as application programs, that are executable by the processing components. The application program stored in the memory may include one or more modules that each correspond to a set of instructions. Further, the processing component is configured to execute the instructions to perform the information processing method described above.
The electronic device may further include: a power component configured to perform power management for the electronic device; a wired or wireless network interface configured to connect the electronic device to a network; and an input/output (I/O) interface. The electronic device may operate based on an operating system stored in the memory, such as Android, iOS, Windows, Mac OS X, Unix, Linux or FreeBSD.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, a division of a unit may be a division of a logic function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or may not be executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present invention.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.

Claims (12)

1. An information processing method, applied to a touch terminal capable of presenting a graphical user interface, wherein the graphical user interface comprises a first touch operation area and a second touch operation area, and content presented by the graphical user interface at least partially comprises a game scene and a virtual character, the method comprising the following steps:
detecting a first touch operation acting on the first touch operation area, controlling the virtual character to move and/or rotate in the game scene according to the first touch operation, and controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
detecting a second touch operation acting on the second touch operation area, and adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation, wherein the second touch operation is a sliding operation; and
when a preset action of the second touch operation is detected, triggering the presented field of view of the game scene to be adjusted to a view-locked state, and locking the current view as the presented field of view of the game scene in the graphical user interface, wherein the preset action comprises a combination of: a touch operation exceeding a preset duration, a press operation exceeding a preset pressure, and an operation of the touch object leaving the second touch operation area.
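(Illustrative only, not part of the claims.) A minimal sketch, in Python, of the interaction recited in claim 1: a first touch region that moves the virtual character while the camera follows it, a second touch region whose sliding operation pans the presented field of view, and a preset action on the second region that triggers the view-locked state. The class, callback names, and thresholds below are assumptions for illustration, not the patented implementation.

    LOCK_TIME_S = 0.5      # assumed "preset duration" threshold, in seconds
    LOCK_PRESSURE = 0.8    # assumed "preset pressure" threshold, normalised 0..1

    class ViewController:
        def __init__(self):
            self.character_pos = [0.0, 0.0]   # virtual character position in the game scene
            self.camera_pos = [0.0, 0.0]      # virtual camera centre; defines the presented view
            self.view_locked = False

        def on_first_region_drag(self, dx, dy):
            # First touch operation: move the character; while the view is not
            # locked, the camera tracks the character's position.
            self.character_pos[0] += dx
            self.character_pos[1] += dy
            if not self.view_locked:
                self.camera_pos = list(self.character_pos)

        def on_second_region_swipe(self, dx, dy):
            # Second touch operation (sliding): pan the presented field of view directly.
            self.camera_pos[0] += dx
            self.camera_pos[1] += dy

        def on_second_region_release(self, press_duration, max_pressure):
            # Preset action: a press longer than the preset duration, harder than the
            # preset pressure, combined with the touch object leaving the region,
            # locks the current view as the presented field of view.
            if press_duration > LOCK_TIME_S and max_pressure > LOCK_PRESSURE:
                self.view_locked = True

For example, a swipe on the second region followed by a long, firm press and release would leave view_locked set to True, so subsequent character movement in the first region no longer drags the camera.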
2. The method of claim 1, wherein the method further comprises:
in the view-locked state, when the virtual character is detected to enter a locked area, restoring to controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
wherein the locked area is the game scene area presented by the graphical user interface in the view-locked state.
3. The method of claim 2, wherein the restoring to controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene comprises:
acquiring a center point coordinate A(x0, y0) of the locked area and a position coordinate B(x1, y1) of the virtual character; and
controlling the virtual camera corresponding to the presented field of view to move in the direction of the vector AB.
4. The method of claim 3, wherein controlling the virtual camera corresponding to the presented field of view to move in the direction of the vector AB comprises:
moving the virtual camera at a speed of P×N pixels per second, where N is the refresh rate and P is a number of pixels (i.e., the camera moves P pixels per refresh frame).
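(Illustrative only, not part of the claims.) A small numeric sketch of the camera recovery in claims 3 and 4: A(x0, y0) is the centre of the locked area, B(x1, y1) is the virtual character's position, and the virtual camera is stepped in the direction of the vector AB at P×N pixels per second. The function name, default values, and frame-step structure are assumptions.

    import math

    def step_camera(camera, lock_center, character, p_pixels=4, refresh_hz=60, dt=1 / 60):
        """Advance the virtual camera one frame in the direction of vector AB."""
        ax, ay = lock_center               # A(x0, y0): centre of the locked area
        bx, by = character                 # B(x1, y1): virtual character position
        dx, dy = bx - ax, by - ay          # vector AB
        dist = math.hypot(dx, dy)
        if dist < 1e-6:
            return camera                  # A and B coincide; direction undefined, keep camera
        step = p_pixels * refresh_hz * dt  # P*N pixels per second, over dt seconds
        return (camera[0] + dx / dist * step,
                camera[1] + dy / dist * step)

With the assumed defaults (P = 4, N = 60), the camera advances 4 pixels per frame, i.e. 240 pixels per second; higher-level logic would stop stepping once the view again follows the character, as described in claim 2.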
5. The method of claim 1, wherein the method further comprises:
in the view-locked state, when the first touch operation acting on the first touch operation area is detected, maintaining the view-locked state.
6. The method of claim 1, wherein the method further comprises:
in the view-locked state, when the second touch operation acting on the second touch operation area is detected, continuing to adjust the presented field of view of the game scene in the graphical user interface according to the second touch operation; and
when the preset action of the second touch operation is detected, triggering the presented field of view of the game scene to be adjusted to the view-locked state.
7. The method of claim 1, wherein adjusting the presented field of view of the game scene in the graphical user interface according to the second touch operation comprises:
changing the position of the virtual camera corresponding to the presented field of view according to the second touch operation; and
determining the presented field of view of the game scene on the graphical user interface according to the position of the virtual camera.
8. The method of claim 1, wherein controlling the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene comprises:
determining the position of the virtual camera corresponding to the presented field of view according to the position of the virtual character; and
controlling the presented field of view of the game scene on the graphical user interface according to the movement of the virtual character.
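(Illustrative only, not part of the claims.) A sketch of one reading of claims 7 and 8: the presented field of view is derived entirely from the virtual camera's position, and that position is set either from the second (sliding) touch operation or from the virtual character's position. The viewport size and helper names are assumptions.

    VIEWPORT_W, VIEWPORT_H = 1280, 720    # assumed screen size in pixels

    def visible_rect(camera_pos):
        # The world-space rectangle presented on the graphical user interface
        # for a camera centred at camera_pos.
        cx, cy = camera_pos
        return (cx - VIEWPORT_W / 2, cy - VIEWPORT_H / 2,
                cx + VIEWPORT_W / 2, cy + VIEWPORT_H / 2)

    def camera_from_swipe(camera_pos, swipe_dx, swipe_dy):
        # Claim 7: the second touch operation changes the camera position,
        # which in turn determines the presented field of view.
        return (camera_pos[0] + swipe_dx, camera_pos[1] + swipe_dy)

    def camera_from_character(character_pos):
        # Claim 8: the camera position is determined by the character position,
        # so the presented view follows the character as it moves.
        return (character_pos[0], character_pos[1])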
9. The method of claim 1, wherein the method further comprises:
when a cancel operation acting on a preset area is detected, controlling the presented field of view of the game scene on the graphical user interface to be restored to the state before the second touch operation.
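(Illustrative only, not part of the claims.) A minimal sketch of claim 9: the presented view is snapshotted before the second touch operation begins, and a cancel operation on a preset area restores it. The snapshot/restore mechanism and names are assumptions.

    class CancelableView:
        def __init__(self, camera_pos):
            self.camera_pos = camera_pos
            self._saved_pos = None

        def begin_second_touch(self):
            # Remember the presented view as it was before any sliding adjustment.
            self._saved_pos = self.camera_pos

        def swipe(self, dx, dy):
            self.camera_pos = (self.camera_pos[0] + dx, self.camera_pos[1] + dy)

        def cancel(self):
            # Cancel operation on the preset area: restore the view to the state
            # before the second touch operation.
            if self._saved_pos is not None:
                self.camera_pos = self._saved_pos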
10. An information processing apparatus, applied to a touch terminal capable of presenting a graphical user interface, wherein the graphical user interface comprises a first touch operation area and a second touch operation area, and content presented by the graphical user interface at least partially comprises a game scene and a virtual character, the apparatus comprising:
a first interaction unit, configured to detect a first touch operation acting on the first touch operation area, control the virtual character to move and/or rotate in the game scene according to the first touch operation, and control the presented field of view of the game scene on the graphical user interface according to the position of the virtual character in the game scene;
a second interaction unit, configured to detect a second touch operation acting on the second touch operation area and adjust the presented field of view of the game scene in the graphical user interface according to the second touch operation, wherein the second touch operation is a sliding operation; and
a display unit, configured to, when a preset action of the second touch operation is detected, trigger the presented field of view of the game scene to be adjusted to a view-locked state and lock the current view as the presented field of view of the game scene in the graphical user interface, wherein the preset action comprises a combination of: a touch operation exceeding a preset duration, a press operation exceeding a preset pressure, and an operation of the touch object leaving the second touch operation area.
11. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the information processing method of any one of claims 1 to 9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the information processing method of any one of claims 1 to 9 via execution of the executable instructions.
CN201711148060.8A 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium Active CN107823882B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711148060.8A CN107823882B (en) 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711148060.8A CN107823882B (en) 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107823882A CN107823882A (en) 2018-03-23
CN107823882B true CN107823882B (en) 2021-05-11

Family

ID=61652069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711148060.8A Active CN107823882B (en) 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN107823882B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109045689A (en) * 2018-07-26 2018-12-21 努比亚技术有限公司 game control method, mobile terminal and computer readable storage medium
CN110947180A (en) * 2018-09-26 2020-04-03 网易(杭州)网络有限公司 Information processing method and device in game
CN110448906A (en) * 2018-11-13 2019-11-15 网易(杭州)网络有限公司 The control method and device at visual angle, touch control terminal in game
CN110448904B (en) * 2018-11-13 2023-04-25 网易(杭州)网络有限公司 Game view angle control method and device, storage medium and electronic device
CN109847354B (en) * 2018-12-19 2020-05-22 网易(杭州)网络有限公司 Method and device for controlling virtual lens in game
CN109675310A (en) * 2018-12-19 2019-04-26 网易(杭州)网络有限公司 The method and device of virtual lens control in a kind of game
CN109718548B (en) 2018-12-19 2019-11-26 网易(杭州)网络有限公司 The method and device of virtual lens control in a kind of game
CN109675308A (en) * 2019-01-10 2019-04-26 网易(杭州)网络有限公司 Display control method, device, storage medium, processor and terminal in game
CN109568956B (en) 2019-01-10 2020-03-10 网易(杭州)网络有限公司 In-game display control method, device, storage medium, processor and terminal
CN109568957B (en) 2019-01-10 2020-02-07 网易(杭州)网络有限公司 In-game display control method, device, storage medium, processor and terminal
CN109675311A (en) * 2019-01-10 2019-04-26 网易(杭州)网络有限公司 Display control method, device, storage medium, processor and terminal in game
CN111905366A (en) * 2019-05-07 2020-11-10 网易(杭州)网络有限公司 In-game visual angle control method and device
CN110170168B (en) * 2019-05-30 2022-05-27 腾讯科技(深圳)有限公司 Virtual object shooting control method and device, electronic equipment and storage medium
CN110251936B (en) * 2019-06-24 2022-12-20 网易(杭州)网络有限公司 Method and device for controlling virtual camera in game and storage medium
CN110215690B (en) * 2019-07-11 2023-02-03 网易(杭州)网络有限公司 Visual angle switching method and device in game scene and electronic equipment
CN110575671B (en) * 2019-10-08 2023-08-11 网易(杭州)网络有限公司 Method and device for controlling viewing angle in game and electronic equipment
CN111803935B (en) * 2020-07-23 2024-02-09 网易(杭州)网络有限公司 Method, device, equipment and storage medium for controlling rotation of virtual object
CN112156455A (en) * 2020-10-14 2021-01-01 网易(杭州)网络有限公司 Game display method and device, electronic equipment and storage medium
CN112402967B (en) * 2020-12-04 2024-04-12 网易(杭州)网络有限公司 Game control method, game control device, terminal equipment and medium
CN112791403A (en) * 2021-01-12 2021-05-14 网易(杭州)网络有限公司 Method and device for controlling virtual character in game and terminal equipment
CN112933592B (en) * 2021-01-26 2024-05-10 网易(杭州)网络有限公司 Information processing method and device in game, electronic equipment and storage medium
CN113769384A (en) * 2021-09-16 2021-12-10 网易(杭州)网络有限公司 In-game visual field control method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101778656A (en) * 2007-08-17 2010-07-14 微软公司 Programmable movement of a game character's view direction in a game environment
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9791897B2 (en) * 2012-06-29 2017-10-17 Monkeymedia, Inc. Handheld display device for navigating a virtual environment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101778656A (en) * 2007-08-17 2010-07-14 微软公司 Programmable movement of a game character's view direction in a game environment
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
[Produced by Jianan] LOL pure-beginner tutorial (Part 1): must-see effects settings and terminology for League of Legends newcomers; 佳楠在身边; https://m.youku.com/video/id_XNjE1Mzg3NzE2.html?sharekey=daad8a9e6eb9feaa9d308d1aa5d32e7b5&from=message&source=; 2013-09-30; 0:00:00 to 0:56:45 *
[Bad Kid] Honor of Kings #8: introduction to the draft-pick mode; 我才是坏孩子; https://m.youku.com/video/id_XMTY5NDQ5MzU4NA==.html?sharekey=78717b55519b9f7c7a558ab6030fc1ff8&from=message&source=; 2016-08-22; whole document *
Unity3D tip: locking scene objects to prevent accidental selection; 神米米; https://blog.csdn.net/shenmifangke/article/details/69668626?utm_source=app&from=singlemessage; 2017-04-08; whole document *

Also Published As

Publication number Publication date
CN107823882A (en) 2018-03-23

Similar Documents

Publication Publication Date Title
CN107823882B (en) Information processing method, information processing device, electronic equipment and storage medium
US10695674B2 (en) Information processing method and apparatus, storage medium and electronic device
CN108355354B (en) Information processing method, device, terminal and storage medium
CN107648847B (en) Information processing method and device, storage medium and electronic equipment
US10716997B2 (en) Information processing method and apparatus, electronic device, and storage medium
JP6722252B2 (en) Information processing method and apparatus, storage medium, electronic device
CN109908574B (en) Game role control method, device, equipment and storage medium
CN109621411B (en) Information processing method, information processing device, electronic equipment and storage medium
US10702774B2 (en) Information processing method, apparatus, electronic device and storage medium
CN108905212B (en) Game screen display control method and device, storage medium and electronic equipment
CN107583271B (en) Interactive method and device for selecting target in game
CN108404407B (en) Auxiliary aiming method and device in shooting game, electronic equipment and storage medium
US11794096B2 (en) Information processing method
US10716996B2 (en) Information processing method and apparatus, electronic device, and storage medium
US9199164B2 (en) Image display device, computer readable storage medium, and game control method
CN108854063B (en) Aiming method and device in shooting game, electronic equipment and storage medium
CN107899246B (en) Information processing method, information processing device, electronic equipment and storage medium
CN107832001B (en) Information processing method, information processing device, electronic equipment and storage medium
CN107913516B (en) Information processing method, information processing device, electronic equipment and storage medium
CN112619137A (en) Game picture switching method and device, electronic equipment and storage medium
CN107832000B (en) Information processing method, information processing device, electronic equipment and storage medium
CN112791410A (en) Game control method and device, electronic equipment and storage medium
CN107982916B (en) Information processing method, information processing device, electronic equipment and storage medium
CN116099195A (en) Game display control method and device, electronic equipment and storage medium
CN117547815A (en) Interaction control method and device in game and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant