CN108355354B - Information processing method, device, terminal and storage medium


Publication number
CN108355354B
Authority
CN
China
Prior art keywords
orientation
graphical user interface
view
game scene
Legal status
Active
Application number
CN201810141896.3A
Other languages
Chinese (zh)
Other versions
CN108355354A
Inventor
吴志武 (Wu Zhiwu)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201810141896.3A
Publication of CN108355354A
Application granted
Publication of CN108355354B


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/822: Strategy games; Role-playing games
    • A63F 13/837: Shooting of targets
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80: Features of games using an electronically generated display having two or more dimensions, specially adapted for executing a specific type of game
    • A63F 2300/807: Role playing or strategy games
    • A63F 2300/8076: Shooting

Abstract

The application relates to an information processing method, which comprises the following steps: determining a first presentation field of view of a game scene picture on a graphical user interface according to the current orientation of a virtual object in the game scene; providing a view angle controller in the graphical user interface, detecting a first touch operation acting on the view angle controller, and changing the first presentation field of view of the game scene picture on the graphical user interface into a second presentation field of view according to the first touch operation; and, in response to an orientation switching instruction, determining the view angle corresponding to the second presentation field of view as the current orientation of the virtual object. The application also discloses a processing device, a mobile terminal and a computer storage medium. In this way, the interactive instructions the user has to input are effectively reduced and the user experience is improved: the game flow is smoother, the discomfort caused by frequently switching the screen is reduced, the memory and operation burden on the player is lowered, and the rendering burden of the game scene picture is reduced as well.

Description

Information processing method, device, terminal and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to an information processing method, an information processing apparatus, a terminal, and a storage medium.
Background
With the rise of the internet, the continuous development of hardware and software technologies has driven the emergence of intelligent devices and software, and a large number of mobile games on different themes have appeared to meet users' needs. On a mobile terminal running such a game, the user generally controls the game with the thumbs of both hands; because of hardware constraints such as a small display, few control dimensions and limited processor computing power, mobile game controls struggle to reach the fluency of control available in PC games. It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object disclosed herein is to provide an information processing method, apparatus, mobile terminal, and storage medium, thereby overcoming, at least to some extent, one or more problems due to limitations and disadvantages of the related art.
In order to solve the above problem, an embodiment of the present application provides an information processing method, including: determining a first presentation field of view of a game scene picture on a graphical user interface according to the current orientation of a virtual object in the game scene; providing a view angle controller in the graphical user interface, detecting a first touch operation acting on the view angle controller, and changing the first presentation field of view of the game scene picture on the graphical user interface into a second presentation field of view according to the first touch operation; and, in response to an orientation switching instruction, determining the view angle corresponding to the second presentation field of view as the current orientation of the virtual object.
Optionally, the step of responding to the orientation switching instruction includes: providing an orientation switching instruction control in the graphical user interface; and responding to an orientation switching instruction acting in the area of the orientation switching instruction control.
Optionally, the step of providing an orientation switching instruction control in the graphical user interface includes: detecting a first touch operation acting on the view angle controller, and providing the orientation switching instruction control in the graphical user interface.
Optionally, at least one attack control is displayed in the graphical user interface, and the step of providing an orientation switching instruction control in the graphical user interface includes: replacing at least one attack control with the orientation switching instruction control.
Optionally, the attack control is configured with an attack icon, and the step of providing an orientation switching instruction control in the graphical user interface further includes: replacing the attack icon with an orientation switching instruction icon.
Optionally, the step of determining the view angle corresponding to the second presentation field of view as the current orientation of the virtual object further includes: replacing the orientation switching instruction control with an attack control.
Optionally, the step of determining the view angle corresponding to the second presentation field of view as the current orientation of the virtual object further includes: displaying a crosshair icon at a preset position of the graphical user interface.
Optionally, after the step of determining the view angle corresponding to the second presentation field of view as the current orientation of the virtual object, the method further includes: detecting a second touch operation acting on a preset orientation control area, and controlling the crosshair to move according to the second touch operation.
Optionally, the step of determining a first presentation field of view of the game scene picture on the graphical user interface according to the current orientation of the virtual object in the game scene further comprises: determining the first presentation field of view of the game scene picture on the graphical user interface according to the current position and the current orientation of the virtual object in the game scene; and the step of changing the first presentation field of view of the game scene picture on the graphical user interface into the second presentation field of view according to the first touch operation further comprises: changing the first presentation field of view of the game scene picture on the graphical user interface into the second presentation field of view according to the current position of the virtual object in the game scene and the first touch operation.
An embodiment of the present application also provides an information processing apparatus, including: a presentation module, configured to determine a first presentation field of view of a game scene picture on the graphical user interface according to the current orientation of the virtual object in the game scene; a view angle control module, configured to provide a view angle controller in the graphical user interface, detect a first touch operation acting on the view angle controller, and change the first presentation field of view of the game scene picture on the graphical user interface into a second presentation field of view according to the first touch operation; and an orientation switching module, configured to determine, in response to an orientation switching instruction, the view angle corresponding to the second presentation field of view as the current orientation of the virtual object.
An embodiment of the present application further provides a mobile terminal, including: a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to perform the above-described processing method via execution of the executable instructions.
An embodiment of the present application further provides a computer storage medium storing a computer program which, when executed by a processor, implements the above-described processing method.
With the information processing method, apparatus, mobile terminal and computer storage medium described above, the orientation of the virtual object in the game and the view angle of the virtual object are controlled separately: a first touch operation input on the view angle controller changes the field of view of the virtual object from the first presentation field of view to the second presentation field of view without changing the current orientation of the virtual object, and after an orientation switching instruction is received, the view angle corresponding to the second presentation field of view is determined as the current orientation of the virtual object. In this way, when the user sees another virtual object, there is no need to switch the current game scene picture back to the picture corresponding to the original orientation; the user can simply stay in the current game scene picture and continue the subsequent game operations. This effectively reduces the interactive instructions the user has to input, improves the user experience, makes the game flow smoother, reduces the discomfort of frequently switching the screen, lowers the memory and operation burden on the player, and at the same time reduces the rendering burden of the game scene picture.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
fig. 1 is a flowchart of an information processing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a virtual object in relation to a first rendering field of view and a second rendering field of view as provided by one embodiment of the present application;
FIG. 3 is a graphical user interface diagram of a first rendered field of view provided by an embodiment of the present application;
FIG. 4 is a graphical user interface diagram of a second rendered field of view provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a graphical user interface provided by an embodiment of the present application after receiving an orientation switch command;
FIG. 6 is a block diagram of a processing apparatus according to an embodiment of the present disclosure;
fig. 7 is a block diagram of a mobile terminal according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of one of the storage media according to the embodiment of the disclosure.
DETAILED DESCRIPTION OF EMBODIMENT(S) OF THE INVENTION
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be further noted that various trigger events disclosed in this specification may be preset, or may be set in real time according to an operation instruction of a user during a program running process, and different trigger events may trigger execution of different functions.
FIG. 1 illustrates an information processing method according to an embodiment of the disclosure. In the present embodiment, an information processing method 100 is described by way of different examples. The information processing method provided in this embodiment is executed by a mobile terminal, which may be any terminal device such as a computer, a tablet computer or another electronic device. A software application is executed on a processor of the mobile terminal and rendered on a touch display of the mobile terminal to obtain a graphical user interface; the content displayed on the graphical user interface at least partially comprises part or all of a game scene, and the game scene comprises a game picture and at least one virtual object 10.
The game picture may contain virtual resources 20 at relatively fixed locations, such as the ground, mountains, stones, flowers, grass, trees, buildings, and the like. The virtual object 10 may belong to an enemy camp or to the player's own camp, and the virtual object 10 may carry out corresponding behaviours in the game scene in response to the user's operations; for example, the user may control the virtual object 10 to walk, run, squat, lie prone, attack, shoot, and so on in the game scene, which is not limited herein.
The current content of the game picture corresponds to the presentation field of view 30 of the virtual object 10. The presentation field of view 30 is related to the orientation of the virtual object 10: the game program renders the virtual resources 20 lying in the current orientation of the virtual object 10 to form the picture currently displayed in the graphical user interface of the mobile terminal, and takes this picture as the current presentation field of view 30 of the virtual object 10, such as the first presentation field of view 31 in fig. 2. It should be noted that the orientation of the virtual object is usually bound to a virtual camera. In a first-person game, the virtual camera acts as the "eye" of the user in the game and may be set on the head of the virtual object; in a third-person game, the virtual camera may be disposed above and behind the virtual object, capturing a game scene that includes the virtual object. On the one hand, the current position of the virtual object in the game scene is determined from the movement of the virtual object, and the position of the virtual camera in the game scene is determined from that current position; on the other hand, the orientation of the virtual camera in the game scene is determined from the current orientation of the virtual object. The current position and orientation of the virtual camera are therefore determined by the position and orientation of the virtual object in the game scene, and the presentation field of view of the game scene picture on the graphical user interface is in turn determined by the current position and orientation of the virtual camera. When a view angle adjustment touch instruction is received, the game program renders the presentation field of view 30 corresponding to the changed view angle, such as the second presentation field of view 32 in fig. 2. It should be noted that adjusting the view angle through the view angle adjustment touch instruction does not change the orientation of the virtual object 10: as shown in fig. 2, when the presentation field of view of the game scene picture changes from the first presentation field of view to the second presentation field of view, the current orientation of the virtual object does not follow the change and remains the first orientation.
It should be further noted that the second presentation view rendered according to the view angle adjustment touch instruction is a temporary presentation view, and when a preset condition is met, the presentation view of the game scene picture is automatically restored from the second presentation view to the state before the view angle adjustment touch instruction is received.
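As an illustration of the relationship just described between the virtual object, the virtual camera and the presentation field of view, the following Python sketch shows one possible way of deriving the camera pose for the first-person and third-person cases. All names (Vec3, VirtualObject, first_person_camera) and the numeric offsets are illustrative assumptions, not part of the patent.

# Illustrative sketch only: deriving the virtual camera pose (and hence the presentation
# field of view) from the virtual object's position and orientation.
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    x: float
    y: float
    z: float

@dataclass
class VirtualObject:
    position: Vec3          # current position in the game scene
    orientation_yaw: float  # current orientation, in radians around the vertical axis

@dataclass
class VirtualCamera:
    position: Vec3
    yaw: float              # shooting direction around the vertical axis

def first_person_camera(obj: VirtualObject, head_height: float = 1.7) -> VirtualCamera:
    # First-person: the camera is set on the head of the virtual object and shares its orientation.
    eye = Vec3(obj.position.x, obj.position.y + head_height, obj.position.z)
    return VirtualCamera(position=eye, yaw=obj.orientation_yaw)

def third_person_camera(obj: VirtualObject, back: float = 4.0, height: float = 2.0) -> VirtualCamera:
    # Third-person: the camera sits behind and above the virtual object, looking along its orientation.
    eye = Vec3(obj.position.x - back * math.cos(obj.orientation_yaw),
               obj.position.y + height,
               obj.position.z - back * math.sin(obj.orientation_yaw))
    return VirtualCamera(position=eye, yaw=obj.orientation_yaw)

obj = VirtualObject(position=Vec3(0.0, 0.0, 0.0), orientation_yaw=0.0)
cam = third_person_camera(obj)  # camera 4 units behind and 2 units above the object

Rendering the scene from such a camera pose would then produce the presentation field of view described above.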
The information processing method provided by the embodiment includes the following steps:
step S310, determining a first presentation field of view 31 of the game scene picture on the graphical user interface according to the current orientation of the virtual object 10 in the game scene;
step S320, providing a view angle controller 40 in the graphical user interface, detecting a first touch operation acting on the view angle controller 40, and changing the first presentation field of view 31 of the game scene picture on the graphical user interface into a second presentation field of view 32 according to the first touch operation;
step S330, in response to an orientation switching instruction, determining the view angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10.
Through the above embodiment, a first touch operation input on the view angle controller 40 changes the presentation field of view of the game scene picture from the first presentation field of view 31 to the second presentation field of view 32 without changing the current orientation of the virtual object 10, and after an orientation switching instruction is received, the view angle corresponding to the second presentation field of view 32 is determined as the current orientation of the virtual object 10. In this way, when the user sees another virtual object 10, there is no need to switch the current game scene picture back to the picture corresponding to the original orientation; the user can simply stay in the current game scene picture and continue the subsequent game operations. This effectively reduces the interactive instructions the user has to input, improves the user experience, makes the game flow smoother, reduces the discomfort of frequently switching the screen, lowers the memory and operation burden on the player, and at the same time reduces the rendering burden of the game scene picture.
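As a rough, non-authoritative sketch of how steps S310 to S330 fit together, the following Python fragment keeps the object's orientation and the view angle as separate state; the class and method names are assumptions made for illustration.

# Illustrative sketch of steps S310-S330; names are assumptions, not the patent's API.
class ViewState:
    def __init__(self, object_yaw: float):
        self.object_yaw = object_yaw   # current orientation of the virtual object
        self.view_yaw = object_yaw     # view angle of the current presentation field of view

    def first_presentation_view(self) -> float:
        # S310: the first presentation field of view follows the object's current orientation.
        return self.object_yaw

    def apply_first_touch(self, yaw_delta: float) -> float:
        # S320: the first touch operation on the view angle controller changes only the view,
        # producing the second presentation field of view; the object's orientation is untouched.
        self.view_yaw += yaw_delta
        return self.view_yaw

    def apply_orientation_switch(self) -> None:
        # S330: on an orientation switching instruction, the view angle of the second
        # presentation field of view becomes the object's current orientation.
        self.object_yaw = self.view_yaw

state = ViewState(object_yaw=0.0)
state.apply_first_touch(1.2)       # look around without turning the virtual object
state.apply_orientation_switch()   # the object now faces where the view points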
Hereinafter, each step of the information processing method in the present exemplary embodiment will be further described.
As shown in fig. 2, in step S310, a first presentation field of view 31 of a game scene screen on a graphical user interface is determined according to a current orientation of the virtual object 10 in the game scene.
In particular, the step of determining a first rendered field of view 31 of a game scene view on a graphical user interface in dependence on the current orientation of the virtual object 10 in the game scene comprises:
step 3101, responding to the orientation control instruction;
step 3102, adjusting the orientation of the virtual object 10 in the game scene to determine a current orientation;
at step 3103, a first rendered field of view 31 of a game scene on the graphical user interface is determined based on the current orientation.
As shown in fig. 3, in the present embodiment, responding to the orientation control instruction in step S3101 includes: responding to a touch operation applied by the user to the orientation control area 50 on the graphical user interface. The orientation control area 50 is used to change the orientation of the virtual object 10 according to the touch operation.
The orientation control area 50 in the graphical user interface is disposed at an edge of the display interface of the mobile terminal. In this embodiment, the orientation control area 50 is disposed at the lower edge of the display interface; in other embodiments it may be disposed at the left or right edge, or at another position set through a user-defined operation. A specific orientation control icon may be provided in the orientation control area 50, and the current orientation of the virtual object 10 is changed through a touch operation on it. In the present embodiment, the orientation control icon has a salient feature parameter that helps the user quickly locate the orientation control area 50; here the salient feature parameter is a shape parameter different from the other virtual controls, while in other embodiments it may be a flashing parameter and/or a colour parameter, etc., different from the other virtual controls.
In other embodiments, the step of responding to the orientation control instruction comprises: providing an external device in communication with the mobile terminal and associating an input device of the external device with the orientation control instruction, so that when the mobile terminal detects that the input device of the external device is actuated, the virtual object 10 is controlled to change its current orientation. The input device may be a physical key, a touch key, or another sensor device capable of receiving instruction input. In other embodiments, the virtual object 10 is controlled by a preset audio instruction to attack the virtual resource 20.
In step S3102, adjusting the orientation of the virtual object 10 in the game scene means changing the current orientation of the virtual object 10 in the game scene. It should be noted that the orientation and the moving direction of the virtual object 10 in the game scene are different concepts: they are independent of each other and may be superimposed on each other. For example, the orientation of virtual object 10A in the game scene is controlled to be north while, at the same time, virtual object 10A is controlled to move at a preset speed V1 with west as its moving direction. The effect presented in the game scene is that virtual object 10A currently faces north while changing its position in the game scene at the preset speed V1 in the westward direction.
In the present embodiment, the control of the orientation of the virtual object 10 and the control of the view angle of the virtual object 10 are independent of each other, and the game program controls each of them according to the received input or a preset input instruction. For example, when an orientation control instruction for the virtual object 10 is received, the virtual object 10 is controlled to face west, i.e. the first orientation in fig. 2, so the first presentation field of view of the game scene picture is the presentation field of view corresponding to the current orientation (west) of the virtual object; at this time the virtual object 10 can further be controlled to move westward. If, at the same time, a view angle control instruction is received, the view angle is controlled to turn north and the scene objects corresponding to the north direction are rendered, i.e. the second presentation field of view 32. At this point, although the presentation field of view of the game scene picture is the second presentation field of view, the current orientation of the virtual object 10 remains west and it keeps moving westward.
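The independence of orientation, moving direction and view angle described in the two preceding paragraphs can be sketched as follows; the coordinate convention and names are assumptions for illustration only.

# Illustrative sketch (assumed names and conventions): orientation, moving direction and
# view angle are independent pieces of state that may be superimposed.
import math

NORTH, WEST = math.pi / 2, math.pi   # yaw angles; x axis points east, y axis points north

class VirtualObjectState:
    def __init__(self, orientation: float):
        self.position = [0.0, 0.0]
        self.orientation = orientation   # current orientation of the virtual object
        self.view_yaw = orientation      # view angle of the presentation field of view

    def move(self, direction: float, speed: float, dt: float) -> None:
        # Moving changes only the position; the orientation stays as it is.
        self.position[0] += speed * dt * math.cos(direction)
        self.position[1] += speed * dt * math.sin(direction)

    def adjust_view(self, yaw: float) -> None:
        # A view angle control instruction changes only the presentation field of view.
        self.view_yaw = yaw

obj = VirtualObjectState(orientation=WEST)
obj.move(direction=WEST, speed=1.0, dt=0.016)   # keeps moving west
obj.adjust_view(NORTH)                          # second presentation field of view faces north
assert obj.orientation == WEST                  # the orientation itself never changed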
In step 3103, when determining the first presentation field of view 31 of the game scene picture on the graphical user interface according to the current orientation, the orientation of the virtual camera in the game scene is controlled according to the orientation control instruction, the scene object information in that orientation is acquired by the virtual camera, and the scene object information acquired by the virtual camera is rendered to the display device to present the game scene picture corresponding to the current orientation, which is taken as the first presentation field of view 31.
Optionally, step 3103, the step of determining a first rendered field of view 31 of a game scene on the graphical user interface according to the current orientation, further comprises: a first rendered field of view 31 of a scene of the game on the graphical user interface is determined in dependence on the current position and current orientation of the virtual object 10 in the game scene. Specifically, the current position of the virtual object 10 in the game scene is determined according to the current view angle type of the game, and then the first presentation view 31 of the game scene picture on the graphical user interface is determined according to the acquired current position and current orientation of the virtual character in the game scene. When the current visual angle type of the game is a first-person visual angle, determining the current position of the virtual object 10 in the game scene by acquiring the position information of the virtual object 10; when the current view angle type of the game is the third person's view angle, the view position of the virtual camera is acquired as the current position of the virtual object 10 in the game scene. The first rendered field of view 31 of the game scene screen on the graphical user interface is determined by the current position and current orientation of the virtual object 10 in the game scene.
As shown in fig. 4, in step S320, a view angle controller 40 is provided in the graphical user interface, a first touch operation acting on the view angle controller 40 is detected, and the first presentation field of view 31 of the game scene picture on the graphical user interface is changed into the second presentation field of view 32 according to the first touch operation.
The view angle controller 40 is configured to receive a touch operation for adjusting the view angle of the virtual object 10 so as to adjust the presented game scene. It should be noted that when the view angle controller 40 receives a touch operation to adjust the view angle of the virtual object 10, the orientation of the virtual object 10 is not changed. The view angle controller 40 and the orientation control area 50 may be provided in the graphical user interface at the same time. In the present embodiment they are disposed in different areas of the graphical user interface; for example, the orientation control area 50 is disposed in a side margin area and the view angle controller 40 is disposed at a position spaced apart from the orientation control area 50 by a preset distance. In other embodiments, the view angle controller 40 and the orientation control area 50 are disposed in the same area of the graphical user interface, and which of them is activated is decided according to the received touch operation; for example, when the touch operation is a press and the received pressure value is smaller than a preset value, the orientation control area 50 is activated, and when the received pressure value is greater than the preset value, the view angle controller 40 is activated. Taking fig. 2 as an example, the current orientation of the virtual object 10 is the first orientation and the view angle is the first view angle; when the view angle is adjusted through the view angle controller 40, the view angle of the virtual object is adjusted from the first view angle to the second view angle, while the current orientation of the virtual object is still the first orientation.
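For the embodiment in which the view angle controller 40 and the orientation control area 50 share the same screen region, one conceivable arbitration by pressure threshold looks like this (the threshold value and function name are assumptions):

# Illustrative sketch: deciding which control a press activates when the view angle
# controller and the orientation control area occupy the same region (values assumed).
PRESET_PRESSURE = 0.5

def dispatch_press(pressure: float) -> str:
    if pressure < PRESET_PRESSURE:
        return "orientation_control_area"  # lighter press: change the object's orientation
    return "view_angle_controller"         # firmer press: change only the presentation view

assert dispatch_press(0.2) == "orientation_control_area"
assert dispatch_press(0.9) == "view_angle_controller"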
In this embodiment, the viewing angle controller 40 may be kept displayed on the graphical user interface at all times, or may be displayed on the graphical user interface in response to a trigger of a specific condition.
The view controller 40 may be, for example, a virtual joystick, a direction control virtual key, and the like, and the present exemplary embodiment is not particularly limited thereto.
In an alternative embodiment, the view controller 40 is a virtual joystick, and controls the rendering view 30 of the scene screen on the graphical user interface according to the third touch operation received by the virtual joystick and the position of the virtual object 10 in the game scene.
In an alternative embodiment, the view controller 40 is a virtual cross key/virtual direction key (D-PAD), and controls the rendering view 30 of the scene on the graphical user interface according to the third touch operation received by the virtual cross key and the position of the virtual object 10 in the game scene.
In an alternative embodiment, the view controller 40 is a touch control with a visual indication, such as a touch control with a border frame, or a touch control filled with color, or a touch control with a predetermined transparency, or other control capable of visually indicating the range of the motion controller, and the presented view 30 of the game scene screen on the gui is changed according to the touch control such as sliding, clicking, etc. received by the touch control. The touch control with the visual indication can enable a user to quickly locate the touch control, and can reduce the operation difficulty of a game novice.
In an alternative embodiment, the view angle controller 40 is a touch-operated control without a visual indication in the graphical user interface. A touch control without a visual indication does not cover or affect the game picture, provides a better picture effect, saves screen space and suits experienced players.
In other embodiments, the changing of the first presentation field of view 31 of the game scene screen on the graphical user interface to the second presentation field of view 32 according to the first touch operation may be further implemented by:
step 1, acquiring the current position of a virtual object 10 in a game scene;
step 2, changing the first presentation field of view 31 of the game scene picture on the graphical user interface into the second presentation field of view 32 according to the current position and the first touch operation.
A coordinate system is set in the game space, and the specific position of a game element in the game space is defined by this coordinate system; in the present embodiment, the current position of the virtual object 10 is the coordinate value of the virtual object 10 in the game space. In other embodiments, the current position of the virtual object 10 may also be location information determined in other ways. The first touch operation is a touch operation acting on the view angle controller 40 that changes the view angle corresponding to the presentation field of view of the game scene picture; in the present embodiment this view angle is determined by the orientation of the virtual camera, that is, the orientation of the virtual camera is changed according to the first touch operation. To determine the game picture content rendered on the graphical user interface, the virtual camera position and the virtual camera orientation must be determined; in this embodiment, the position of the virtual camera is the current position of the virtual object 10, while in other embodiments it is a position at a preset distance from the current position of the virtual object 10.
Through the above embodiment, a view angle controller 40 for receiving touch operations is provided, and when a first touch operation directed at the view angle controller 40 is received, the view angle corresponding to the presentation field of view of the game scene picture is adjusted according to the first touch operation without changing the orientation of the virtual object 10, so that the graphical user interface changes from the first presentation field of view 31 to the second presentation field of view 32. In this way, richer interactive experiences and gameplay can be provided to the user.
In step S330, in response to the orientation switching instruction, the viewing angle corresponding to the second presentation field of view 32 is determined as the current orientation of the virtual object 10.
As shown in fig. 5, the orientation switching instruction is used to adjust the orientation of the virtual object 10 so that it coincides with the view angle corresponding to the presentation field of view of the game scene picture. In this embodiment, the step of responding to the orientation switching instruction includes:
step S3301, providing an orientation switching instruction control 60 in the graphical user interface;
step S3302, responding to an orientation switching instruction acting in the area of the orientation switching instruction control 60.
Through the above embodiment, the orientation switching instruction control 60 is provided for the user to input the orientation switching instruction, and the view angle corresponding to the second presentation field of view of the game scene picture is set as the orientation of the virtual object 10. This effectively reduces the processor's memory usage and the number of renderings, while improving the user's operating experience and reducing the number of interactions with the graphical user interface.
In step S3301, an orientation switching instruction control 60 is provided in the graphical user interface. The orientation switching instruction control 60 is disposed at a preset position of the graphical user interface. In this embodiment, the preset area is a blank area of the graphical user interface, separated from the other controls or control areas set or displayed on it; for example, the orientation switching instruction control 60 is set in a middle area of the graphical user interface. In other embodiments, the preset area is another control or control area already displayed or set on the graphical user interface; for example, attack controls 70 are displayed in the left and right blank areas of the graphical user interface, and the orientation switching instruction control 60 coincides with the area where the attack control 70 in the left blank area is located. The shape of the orientation switching instruction control 60 may be a circle, a square, or the like; in the present embodiment its shape is the same as the shape of the other control in the overlapping region. In other embodiments, the preset area is determined from statistics of touch positions on the graphical user interface; for example, when the number of touches in a certain area exceeds a preset value, that position is determined as the preset area.
As shown in fig. 4, the orientation switching control is activated according to the received touch operation. In the present embodiment, a first touch operation on the view angle controller 40 is detected and an orientation switching instruction control 60 is provided in the graphical user interface. Specifically, when the first touch operation on the view angle controller 40 is determined to satisfy a preset condition, the orientation switching instruction control 60 is provided in the graphical user interface, the preset condition being a touch operation of a preset length or a touch operation with a preset pressure input on the view angle controller 40. For example, when the first touch operation is a sliding touch operation and the slide is detected to reach the preset length, the orientation switching instruction control 60 is provided in the graphical user interface; when the first touch operation is a click touch operation and the distance between the click position and the origin of the view angle controller 40 is detected to reach the preset length, the orientation switching instruction control 60 is provided in the graphical user interface; and when the pressure value of the first touch operation on the view angle controller 40 reaches a preset pressure value, the orientation switching instruction control 60 is provided in the graphical user interface.
In other embodiments, the orientation switching instruction control 60 is provided in the graphical user interface as soon as the view angle controller 40 is determined to have received a touch operation.
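The preset conditions described in the two paragraphs above could be checked along the following lines; the threshold values and the "any touch" shortcut for the alternative embodiment are assumptions for illustration.

# Illustrative sketch (assumed thresholds): deciding whether the first touch operation on the
# view angle controller 40 should cause the orientation switching instruction control 60
# to be provided in the graphical user interface.
PRESET_LENGTH = 40.0    # assumed minimum slide length / distance from the controller origin
PRESET_PRESSURE = 0.6   # assumed minimum pressure value

def should_show_orientation_switch_control(kind: str,
                                           distance: float = 0.0,
                                           pressure: float = 0.0) -> bool:
    if kind == "any_touch":                 # alternative embodiment: any touch suffices
        return True
    if kind in ("slide", "click"):          # slide length or click distance from origin
        return distance >= PRESET_LENGTH
    if kind == "press":                     # pressure-sensitive touch
        return pressure >= PRESET_PRESSURE
    return False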
As for the position at which the orientation switching instruction control 60 is displayed in the graphical user interface, the same considerations as above apply and are not repeated here. At least one attack control 70 is replaced with the orientation switching instruction control 60. Specifically, in the present embodiment two attack controls 70 are displayed in the graphical user interface, arranged on the left and right sides of the graphical user interface according to a setting instruction, where the setting instruction can follow a user-defined setting or a preset position of the game program. When the first touch operation on the view angle controller 40 is determined, the orientation switching instruction control 60 located at the position of the left attack control 70 is activated, and the icon of the left attack control 70 is replaced with the orientation switching instruction icon corresponding to the orientation switching instruction control 60.
Through this embodiment, the area occupied on the graphical user interface can be reduced, and replacing the icon in place makes it easier for the user to recognise the icons.
In step S3302, an orientation switching instruction acting in the area of the orientation switching instruction control 60 is responded to. The orientation switching instruction is a touch operation that satisfies a preset switching condition; in this embodiment the preset switching condition is a preset duration, that is, the touch operation applied in the area of the orientation switching instruction control 60 lasts for the preset time. In other embodiments the preset switching condition is a preset pressure or a preset path, and the specific preset switching condition may be set through a user-defined setting.
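A minimal sketch of the preset switching condition used in this embodiment (a preset touch duration) follows; the duration value is an assumption.

# Illustrative sketch: the touch in the orientation switching instruction control area counts
# as an orientation switching instruction only after a preset duration (value assumed).
PRESET_HOLD_SECONDS = 0.3

def is_orientation_switching_instruction(touch_in_control_area: bool,
                                         touch_duration: float) -> bool:
    return touch_in_control_area and touch_duration >= PRESET_HOLD_SECONDS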
In determining the view angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10 in step S330, the second presentation field of view 32 corresponds to the second view angle of the virtual object 10. In the present embodiment, when the first touch operation directed at the view angle controller 40 is detected, a temporary presentation field of view is rendered according to the change of the view angle corresponding to the game scene presentation field of view, such as the second presentation field of view shown in fig. 2; when a preset condition is satisfied, the presentation field of view of the game scene picture is automatically restored from the second presentation field of view to the state before the first touch operation was received. In this embodiment, the preset condition is that the view angle controller is determined to be released while the orientation of the virtual object is not consistent with the view angle corresponding to the current presentation field of view of the game scene. That is, when it is determined that the view angle controller 40 is released and the orientation of the virtual object does not coincide with the view angle corresponding to the current presentation field of view of the game scene, the picture automatically returns to the first presentation field of view 31 as shown in fig. 2.
It should be noted that restoring the presentation field of view to the state before the first touch operation in the present application includes: controlling the presentation field of view of the game scene picture on the graphical user interface to be restored to the presentation field of view before the first touch operation; or controlling the presentation field of view of the game scene picture on the graphical user interface to be restored to the presentation field of view calculated according to the presentation-field-of-view calculation logic used before the first touch operation.
Restoring the presentation field of view of the game scene picture on the graphical user interface to the presentation field of view before the first touch operation means absolutely restoring the presented field-of-view range to the state before the first touch operation: both the absolute position and the absolute angle/direction of the virtual camera in the game picture are restored to the state before the first touch operation. For example, before the first touch operation the position of the virtual camera is a point A in the absolute coordinates of the game scene and the shooting direction is a direction vector AO; the presented field-of-view range is then absolutely restored to the state before the first touch operation on the basis of point A and direction AO, that is, the presentation field of view of the game scene picture on the graphical user interface is controlled on the basis of the virtual camera's position in the absolute coordinates of the game scene and its shooting direction in those absolute coordinates before the first touch operation.
Restoring the presentation field of view of the game scene picture on the graphical user interface to the presentation field of view calculated according to the calculation logic used before the first touch operation means restoring the field of view to the control state before the first touch operation. For example, before the first touch operation the game calculates the field of view according to a preset calculation logic (for example, a virtual camera is arranged at the head of the virtual character and rotates with the rotation of the virtual character's head); in such a case, the field of view is restored to the state before the first touch operation, or is calculated using the calculation logic that applied before the first touch operation. For example, before the first touch operation the position of the virtual camera is a point A in relative coordinates associated with the virtual character (e.g. a point behind the virtual character at distance W and height H), and the shooting direction is a direction vector AO associated with the orientation of the virtual character and/or the weapon sight direction (e.g. the projection of the direction vector AO in the horizontal direction is the same as the orientation of the virtual character in the horizontal direction). When the field of view is restored, the position of the virtual camera remains at the point behind the virtual character at distance W and height H, and the shooting direction of the virtual camera remains associated with the orientation of the virtual character and/or the weapon sight direction. That is, the presentation field of view of the game scene picture on the graphical user interface is controlled on the basis of the virtual character's current position in the absolute coordinates of the game scene, the virtual character's current orientation and/or weapon sight direction, the positional relation of the virtual camera relative to the virtual character before the first touch operation, and the association, before the first touch operation, between the orientation of the virtual character and/or the weapon sight direction of the virtual character and the shooting direction of the virtual camera.
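The two restoration cases just described might be sketched as follows; the names, the coordinate convention and the parameters W and H are placeholders, not the patent's implementation.

# Illustrative sketch of the two ways of restoring the presentation field of view
# (names, conventions and parameters are assumptions).
from dataclasses import dataclass
import math

@dataclass
class CameraPose:
    position: tuple   # (x, y, z) in game-scene coordinates
    yaw: float        # shooting direction around the vertical axis

def restore_absolute(saved_pose: CameraPose) -> CameraPose:
    # Case 1: absolute restore - reuse the camera position (point A) and shooting
    # direction (vector AO) recorded before the first touch operation.
    return saved_pose

def restore_by_logic(char_position: tuple, char_yaw: float,
                     back_distance_w: float, height_h: float) -> CameraPose:
    # Case 2: restore the calculation logic - keep the camera at distance W behind and
    # height H above the character's current position, facing the character's
    # current orientation (or weapon sight direction).
    x, y, z = char_position
    eye = (x - back_distance_w * math.cos(char_yaw),
           y + height_h,
           z - back_distance_w * math.sin(char_yaw))
    return CameraPose(position=eye, yaw=char_yaw)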
The scope of the claims of this application should include at least the two cases described above.
Through the above embodiment, the view angle corresponding to the second presentation field of view of the game scene picture is determined as the current orientation of the virtual object, so that when the game scene is observed from a free view angle, the virtual object can be turned quickly and automatically without additional operations, and the rendering burden caused by the conventional automatic switch back to the original presentation field of view is avoided. With this embodiment, the user can carry out game operations directly in the switched presentation field of view, which reduces the number of interactions between the user and the graphical user interface.
As shown in fig. 5, the step of determining the view angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10 further comprises: after it is determined that the view angle corresponding to the second presentation field of view 32 has been set as the current orientation of the virtual object 10 and a preset hiding condition is satisfied, the orientation switching instruction control 60 is hidden. In this embodiment, whether the view angle corresponding to the second presentation field of view 32 has been set as the current orientation of the virtual object 10 may be determined in at least one of the following manners:
the first method is as follows: detecting an interruption of the touch operation to the direction switching instruction control 60;
the second method comprises the following steps: and obtaining and comparing the current orientation information value of the virtual character of the current frame with the orientation information value of the virtual character of the previous frame, and determining that the current orientation information value is inconsistent with the orientation information value of the virtual character of the previous frame.
Hiding the orientation switching instruction control 60 means stopping responses to the switching instruction: touch operations on the orientation switching instruction control 60 are ignored. In the present embodiment, the orientation switching instruction control 60 is replaced with the attack control 70.
In the present embodiment, the preset hiding condition is that the view angle corresponding to the second presentation field of view 32 has been determined as the current orientation of the virtual object 10, which triggers hiding of the orientation switching instruction control 60.
In other embodiments, the preset hiding condition is detecting that the view angle controller 40 is released.
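A possible sketch of detecting that the switch has taken effect and then swapping the controls back, combining the two manners above with the hiding step, is given below; the ui object and its methods are hypothetical.

# Illustrative sketch (hypothetical names): detecting that the orientation switch has taken
# effect and then replacing the orientation switching instruction control 60 with the
# attack control 70.
def switch_took_effect(touch_still_on_control: bool,
                       prev_frame_orientation: float,
                       curr_frame_orientation: float) -> bool:
    if not touch_still_on_control:                           # manner 1: the touch was interrupted
        return True
    return curr_frame_orientation != prev_frame_orientation  # manner 2: orientation value changed

def on_switch_took_effect(ui) -> None:
    # `ui` is a hypothetical interface object with these two methods.
    ui.hide_control("orientation_switch")   # hide the orientation switching instruction control
    ui.show_control("attack")               # show the attack control 70 again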
Further, the step of determining the view angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10 comprises: after the view angle corresponding to the second presentation field of view 32 is determined as the current orientation of the virtual object 10, a crosshair icon 80 is displayed at a preset position of the graphical user interface. In this embodiment, the crosshair icon 80 is fixedly displayed at the centre of the graphical user interface. In other embodiments, the crosshair icon 80 is fixedly displayed elsewhere on the graphical user interface, or the position at which it is displayed is dynamically adjusted according to an input crosshair control instruction. In other embodiments, the crosshair icon 80 is displayed on the graphical user interface throughout the period in which a first touch operation directed at the view angle controller is received to change the first presentation field of view of the game scene picture on the graphical user interface into the second presentation field of view.
Further, after the step of determining the view angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10, a second touch operation acting on a preset orientation control area is detected, and the crosshair is controlled to move according to the second touch operation. In the present embodiment, the crosshair icon 80 is fixedly disposed at a preset position of the graphical user interface, the second touch operation corresponding to the crosshair control instruction is the same as the touch operation directed at the orientation control area 50, and when a touch operation is input to the orientation control area 50, the presentation field of view 30 displayed on the graphical user interface is controlled to move relative to the crosshair icon 80. In other embodiments, the crosshair icon 80 is movably displayed on the graphical user interface and is associated with a second touch operation applied to the orientation control area; the crosshair icon 80 is controlled to move in the graphical user interface by the second touch operation, where the second touch operation is a touch operation applied to the orientation control area 50 that satisfies a second preset condition, here that the second touch operation satisfies a preset pressure value. Distinguishing the second touch operation from the touch operation used to control the orientation of the virtual character in this way can effectively simplify the layout of the graphical user interface, reduce interruptions to the user's operation and improve the user's operating experience.
It should be noted that, the above steps may be executed simultaneously or sequentially.
A specific application of the above embodiment is described with reference to figs. 3 to 5. Two attack controls 70 are disposed on the graphical user interface, on its left and right sides respectively; the orientation control area 50 is disposed in a blank area on the right side of the graphical user interface, and the view angle controller 40 is disposed at a specific distance from the minimap. Taking the game scene picture displayed on the graphical user interface in fig. 3 as the first presentation field of view 31, when the view angle controller 40 detects a touch operation, the first presentation field of view 31 displayed on the graphical user interface changes into the second presentation field of view 32 shown in fig. 4, and the orientation of the virtual object 10 is not changed while the presentation field of view 30 is changed through the view angle controller 40. When the view angle controller 40 detects the touch operation, the orientation switching instruction control 60 is activated; it is located in the area of the left attack control 70, and the icon of that attack control 70 is replaced by the orientation switching instruction control icon. When the orientation switching instruction control 60 is clicked, the view angle of the second presentation field of view 32 is determined as the current orientation of the virtual object, the orientation switching instruction control icon is hidden, and the attack control 70 icon is displayed again, as shown in fig. 5.
Through the above embodiment, when the orientation switching instruction is received, the view angle corresponding to the second presentation field of view 32 displayed on the graphical user interface is set as the current orientation of the virtual object 10.
As shown in fig. 6, an exemplary embodiment further discloses a processing apparatus for virtual resources 20 in a game scene, where the apparatus includes:
a presentation module for determining a first presentation field of view 31 of a game scene picture on the graphical user interface according to a current orientation of the virtual object 10 in the game scene;
a view angle control module for providing a view angle controller 40 in the graphical user interface, detecting a first touch operation acting on the view angle controller 40, and controlling the first presentation field of view 31 of the game scene picture on the graphical user interface to be changed into a second presentation field of view 32 according to the first touch operation;
and an orientation switching module for responding to the orientation switching instruction and determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10.
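A minimal Python skeleton of these three modules is given below for illustration; the class and method names, as well as the camera object with its yaw, sensitivity and current_view() members, are assumptions made for this example, and the actual division of functions may differ.

```python
class PresentationModule:
    """Determines the first presentation field of view from the object's current orientation."""
    def first_view(self, virtual_object, camera):
        camera.yaw = virtual_object.orientation
        return camera.current_view()

class ViewAngleControlModule:
    """Changes the first presentation field of view into the second according to a first touch operation."""
    def on_first_touch(self, touch, camera):
        camera.yaw += touch.dx * camera.sensitivity
        return camera.current_view()

class OrientationSwitchModule:
    """Adopts the viewing angle of the second presentation field of view as the current orientation."""
    def on_switch_instruction(self, virtual_object, camera):
        virtual_object.orientation = camera.yaw
```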
The specific details of each module in the foregoing embodiment have been described in detail in the corresponding information processing method, and the other unit modules included in the information processing apparatus likewise correspond to the information processing method; they are therefore not described here again.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure. As shown in fig. 7, the electronic device 710 of the present embodiment includes a memory 711 and a processor 712, which may be connected by a bus. The graphical user interface is obtained by executing a software application on the processor of the terminal and rendering it on a display device of the terminal.
Wherein the processor is configured to implement the following steps via execution of the executable instructions: determining a first presentation field of view 31 of a game scene picture on the graphical user interface according to the current orientation of the virtual object 10 in the game scene;
providing a view angle controller 40 in the graphical user interface, detecting a first touch operation acting on the view angle controller 40, and controlling the first presentation field of view 31 of the game scene picture on the graphical user interface to be changed into a second presentation field of view 32 according to the first touch operation;
and, in response to the orientation switching instruction, determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10.
Optionally, the step of responding to the orientation switching instruction includes: providing an orientation switching instruction control 60 in the graphical user interface; and responding to an orientation switching instruction acting within the area of the orientation switching instruction control 60.
Optionally, the step of providing the orientation switching instruction control 60 in the graphical user interface includes: detecting a first touch operation on the view angle controller 40, and providing the orientation switching instruction control 60 in the graphical user interface.
Optionally, at least one attack control 70 is displayed in the graphical user interface, and the step of providing the orientation switching instruction control 60 in the graphical user interface includes: replacing at least one attack control 70 with the orientation switching instruction control 60.
Optionally, the attack control 70 is configured with an attack icon, and the step of providing the orientation switching instruction control 60 in the graphical user interface further includes: replacing the attack icon with an orientation switching instruction icon.
Optionally, the step of determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10 further includes: replacing the orientation switching instruction control 60 with the attack control 70.
Optionally, the step of determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10 further includes: displaying a sight icon 80 at a preset position of the graphical user interface.
Optionally, after the step of determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10, the method further includes: detecting a second touch operation acting on a preset orientation control area, and controlling the sight to move according to the second touch operation.
Optionally, the step of determining the first presentation field of view 31 of the game scene picture on the graphical user interface according to the current orientation of the virtual object 10 in the game scene further includes: determining the first presentation field of view 31 of the game scene picture on the graphical user interface according to the current position and the current orientation of the virtual object 10 in the game scene; and the step of controlling the first presentation field of view 31 of the game scene picture on the graphical user interface to change into the second presentation field of view 32 according to the first touch operation further includes: controlling the first presentation field of view 31 of the game scene picture on the graphical user interface to change into the second presentation field of view 32 according to the current position of the virtual object 10 in the game scene and the first touch operation.
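As a worked illustration of determining a presentation field of view from both the current position and the current orientation, the following Python sketch computes a simple third-person camera pose; the function name and the distance and height parameters are assumptions chosen for the example, not values taken from the embodiment.

```python
import math

def presentation_view(position, orientation_deg, distance=6.0, height=2.0):
    """Return (camera_position, look_at) defining a presentation field of view.

    position:        (x, y, z) of the virtual object in the game scene
    orientation_deg: yaw of the viewing angle in degrees
    distance/height: hypothetical camera offsets behind and above the object
    """
    yaw = math.radians(orientation_deg)
    forward = (math.sin(yaw), 0.0, math.cos(yaw))
    x, y, z = position
    camera_position = (x - forward[0] * distance, y + height, z - forward[2] * distance)
    look_at = (x + forward[0], y, z + forward[2])
    return camera_position, look_at

# First presentation field of view: camera follows the object's current orientation.
first_view = presentation_view((10.0, 0.0, 5.0), orientation_deg=0.0)
# Second presentation field of view: same current position, changed viewing angle
# (e.g. after a first touch operation on the view angle controller).
second_view = presentation_view((10.0, 0.0, 5.0), orientation_deg=45.0)
```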
Through the mobile terminal provided by the application, when seeing other virtual objects 10, a user does not need to switch the current game scene picture to the game scene picture corresponding to the original orientation, but can directly stay in the current game scene picture to continue subsequent game operation. In this way, the interactive instructions input by the user can be effectively reduced and the user experience improved; the game process is smoother, the discomfort caused by frequently switching the screen is reduced, the memory and operation burden on the player is reduced, and the rendering burden of the game scene picture is reduced as well.
In alternative embodiments, the mobile terminal may further include one or more processors and memory resources, represented by memory, for storing instructions, such as application programs, that are executable by the processing components. The application program stored in the memory may include one or more modules that each correspond to a set of instructions. Further, the processing component is configured to execute the instructions to perform the information processing method described above.
The mobile terminal may further include: a power component configured to perform power management for the mobile terminal; a wired or wireless network interface configured to connect the mobile terminal to a network; and an input/output (I/O) interface. The mobile terminal may operate based on an operating system stored in the memory, such as Android, iOS, Windows, Mac OS X, Unix, Linux, FreeBSD, or the like.
FIG. 8 is a schematic structural diagram of a computer storage medium according to an embodiment of the present disclosure. As shown in fig. 8, a program product 1100 according to an embodiment of the application is depicted, on which a computer program is stored; when executed by a processor, the computer program implements the steps of the information processing method described above. Optionally, the step of responding to the orientation switching instruction includes: providing an orientation switching instruction control 60 in the graphical user interface; and responding to an orientation switching instruction acting within the area of the orientation switching instruction control 60.
Optionally, the step of providing the orientation switching instruction control 60 in the graphical user interface includes: detecting a first touch operation on the view angle controller 40, and providing the orientation switching instruction control 60 in the graphical user interface.
Optionally, at least one attack control 70 is displayed in the graphical user interface, and the step of providing the orientation switching instruction control 60 in the graphical user interface includes: replacing at least one attack control 70 with the orientation switching instruction control 60.
Optionally, the attack control 70 is configured with an attack icon, and the step of providing the orientation switching instruction control 60 in the graphical user interface further includes: replacing the attack icon with an orientation switching instruction icon.
Optionally, the step of determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10 further includes: replacing the orientation switching instruction control 60 with the attack control 70.
Optionally, the step of determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10 further includes: displaying a sight icon 80 at a preset position of the graphical user interface.
Optionally, after the step of determining the viewing angle corresponding to the second presentation field of view 32 as the current orientation of the virtual object 10, the method further includes: detecting a second touch operation acting on a preset orientation control area, and controlling the sight to move according to the second touch operation.
Optionally, the step of determining the first presentation field of view 31 of the game scene picture on the graphical user interface according to the current orientation of the virtual object 10 in the game scene further includes: determining the first presentation field of view 31 of the game scene picture on the graphical user interface according to the current position and the current orientation of the virtual object 10 in the game scene; and the step of controlling the first presentation field of view 31 of the game scene picture on the graphical user interface to change into the second presentation field of view 32 according to the first touch operation further includes: controlling the first presentation field of view 31 of the game scene picture on the graphical user interface to change into the second presentation field of view 32 according to the current position of the virtual object 10 in the game scene and the first touch operation.
Through the computer storage medium provided by the application, when seeing other virtual objects 10, a user does not need to switch the current game scene picture to the game scene picture corresponding to the original orientation, but can directly stay in the current game scene picture to continue subsequent game operation. In this way, the interactive instructions input by the user can be effectively reduced and the user experience improved; the game process is smoother, the discomfort caused by frequently switching the screen is reduced, the memory and operation burden on the player is reduced, and the rendering burden of the game scene picture is reduced as well.
Program code embodied in a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, an electronic device, or a network device, etc.) to execute the method according to the embodiments of the present application.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An information processing method, characterized in that the method comprises:
determining a first presentation field of view of a game scene picture on a graphical user interface according to the current orientation of the virtual object in the game scene;
providing a view angle controller in the graphical user interface, detecting a first touch operation acting on the view angle controller, and controlling a first presentation field of view of the game scene picture on the graphical user interface to be changed into a second presentation field of view according to the first touch operation without changing the orientation of the virtual object;
detecting a first touch operation acting on the view angle controller, and providing an orientation switching instruction control in the graphical user interface;
responding to the orientation switching instruction acting in the orientation switching instruction control area;
determining the viewing angle corresponding to the second presentation field of view as the current orientation of the virtual object.
2. The processing method of claim 1, wherein at least one attack control is displayed in the graphical user interface, and the step of providing an orientation switching instruction control in the graphical user interface comprises: replacing the at least one attack control with the orientation switching instruction control.
3. The processing method of claim 2, wherein the attack control is configured with an attack icon, and the step of providing an orientation switching instruction control in the graphical user interface further comprises: replacing the attack icon with an orientation switching instruction icon.
4. The processing method of claim 2, wherein the step of determining the viewing angle corresponding to the second presentation field of view as the current orientation of the virtual object further comprises: replacing the orientation switching instruction control with the attack control.
5. The processing method of claim 1, wherein the step of determining the viewing angle corresponding to the second presentation field of view as the current orientation of the virtual object further comprises: displaying a sight icon at a preset position of the graphical user interface.
6. The processing method of claim 5, wherein after the step of determining the viewing angle corresponding to the second presentation field of view as the current orientation of the virtual object, the method further comprises: detecting a second touch operation acting on a preset orientation control area, and controlling the sight to move according to the second touch operation.
7. The processing method of claim 1, wherein the step of determining a first presentation field of view of the game scene picture on the graphical user interface according to the current orientation of the virtual object in the game scene further comprises:
determining the first presentation field of view of the game scene picture on the graphical user interface according to the current position and the current orientation of the virtual object in the game scene;
and the step of controlling the first presentation field of view of the game scene picture on the graphical user interface to be changed into the second presentation field of view according to the first touch operation further comprises:
controlling the first presentation field of view of the game scene picture on the graphical user interface to be changed into the second presentation field of view according to the current position of the virtual object in the game scene and the first touch operation.
8. An information processing apparatus characterized by comprising:
the presentation module is used for determining a first presentation field of view of a game scene picture on the graphical user interface according to the current orientation of the virtual object in the game scene;
the view angle control module is used for providing a view angle controller in the graphical user interface, detecting a first touch operation acting on the view angle controller, and controlling the first presentation field of view of the game scene picture on the graphical user interface to be changed into a second presentation field of view according to the first touch operation without changing the orientation of the virtual object;
the orientation switching module is used for detecting a first touch operation acting on the view angle controller and providing an orientation switching instruction control in the graphical user interface; and for determining the viewing angle corresponding to the second presentation field of view as the current orientation of the virtual object in response to the orientation switching instruction acting in the orientation switching instruction control area.
9. A mobile terminal, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the processing method of any one of claims 1-7 via execution of the executable instructions.
10. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the processing method of any one of claims 1-7.
CN201810141896.3A 2018-02-11 2018-02-11 Information processing method, device, terminal and storage medium Active CN108355354B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810141896.3A CN108355354B (en) 2018-02-11 2018-02-11 Information processing method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810141896.3A CN108355354B (en) 2018-02-11 2018-02-11 Information processing method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN108355354A CN108355354A (en) 2018-08-03
CN108355354B true CN108355354B (en) 2021-08-10

Family

ID=63005811

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810141896.3A Active CN108355354B (en) 2018-02-11 2018-02-11 Information processing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN108355354B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110947180A (en) * 2018-09-26 2020-04-03 网易(杭州)网络有限公司 Information processing method and device in game
CN109589605B (en) * 2018-12-14 2022-08-05 网易(杭州)网络有限公司 Game display control method and device
CN109718537A (en) * 2018-12-29 2019-05-07 努比亚技术有限公司 Game video method for recording, mobile terminal and computer readable storage medium
CN111957032B (en) * 2019-02-22 2024-03-08 网易(杭州)网络有限公司 Game role control method, device, equipment and storage medium
JP6588177B1 (en) * 2019-03-07 2019-10-09 株式会社Cygames Information processing program, information processing method, information processing apparatus, and information processing system
CN111905366A (en) * 2019-05-07 2020-11-10 网易(杭州)网络有限公司 In-game visual angle control method and device
CN110215690B (en) * 2019-07-11 2023-02-03 网易(杭州)网络有限公司 Visual angle switching method and device in game scene and electronic equipment
CN110393917A (en) * 2019-08-26 2019-11-01 网易(杭州)网络有限公司 A kind of pumping card method and device in game
CN110404262B (en) * 2019-09-03 2023-05-12 网易(杭州)网络有限公司 Method and device for controlling display in game, electronic equipment and storage medium
CN110665226A (en) * 2019-10-09 2020-01-10 网易(杭州)网络有限公司 Method, device and storage medium for controlling virtual object in game
CN110694271B (en) * 2019-10-21 2023-05-12 网易(杭州)网络有限公司 Camera gesture control method and device in game scene and electronic equipment
CN111729308A (en) * 2020-06-15 2020-10-02 北京智明星通科技股份有限公司 Picture display method and device for shooting game and game terminal
CN111714887B (en) * 2020-06-24 2024-01-30 网易(杭州)网络有限公司 Game view angle adjusting method, device, equipment and storage medium
CN111939567A (en) * 2020-09-03 2020-11-17 网易(杭州)网络有限公司 Game virtual scene transformation method and device and electronic terminal
CN112494945A (en) * 2020-12-03 2021-03-16 网易(杭州)网络有限公司 Game scene conversion method and device and electronic equipment
CN113413593A (en) * 2021-07-20 2021-09-21 网易(杭州)网络有限公司 Game picture display method and device, computer equipment and storage medium
CN113648661B (en) * 2021-08-18 2024-04-12 网易(杭州)网络有限公司 Method and device for processing information in game, electronic equipment and storage medium
CN113813607B (en) * 2021-08-27 2024-03-15 腾讯科技(深圳)有限公司 Game view angle switching method and device, storage medium and electronic equipment
CN113648654A (en) * 2021-09-03 2021-11-16 网易(杭州)网络有限公司 Game picture processing method, device, equipment, storage medium and program product

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101405061A (en) * 2006-03-27 2009-04-08 科乐美数码娱乐株式会社 Game system, game device, game device control method, and information storage medium
CN105148517A (en) * 2015-09-29 2015-12-16 腾讯科技(深圳)有限公司 Information processing method, terminal and computer storage medium
CN105378625A (en) * 2013-06-25 2016-03-02 微软技术许可有限责任公司 Indicating out-of-view augmented reality images
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8882590B2 (en) * 2006-04-28 2014-11-11 Nintendo Co., Ltd. Touch-controlled game character motion providing dynamically-positioned virtual control pad

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101405061A (en) * 2006-03-27 2009-04-08 科乐美数码娱乐株式会社 Game system, game device, game device control method, and information storage medium
CN105378625A (en) * 2013-06-25 2016-03-02 微软技术许可有限责任公司 Indicating out-of-view augmented reality images
CN105148517A (en) * 2015-09-29 2015-12-16 腾讯科技(深圳)有限公司 Information processing method, terminal and computer storage medium
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Also Published As

Publication number Publication date
CN108355354A (en) 2018-08-03

Similar Documents

Publication Publication Date Title
CN108355354B (en) Information processing method, device, terminal and storage medium
CN107823882B (en) Information processing method, information processing device, electronic equipment and storage medium
CN108905212B (en) Game screen display control method and device, storage medium and electronic equipment
CN107648847B (en) Information processing method and device, storage medium and electronic equipment
US10500484B2 (en) Information processing method and apparatus, storage medium, and electronic device
CN108771858B (en) Skill control switching method, device, terminal and storage medium
CN107213643B (en) Display control method and device, storage medium, the electronic equipment of game picture
CN107977141B (en) Interaction control method and device, electronic equipment and storage medium
CN107132981B (en) Display control method and device, storage medium, the electronic equipment of game picture
CN108854063B (en) Aiming method and device in shooting game, electronic equipment and storage medium
CN109589605B (en) Game display control method and device
US10191612B2 (en) Three-dimensional virtualization
CN107583271A (en) The exchange method and device of selection target in gaming
CN107203321B (en) Display control method and device, storage medium, the electronic equipment of game picture
CN107832001B (en) Information processing method, information processing device, electronic equipment and storage medium
CN107913516B (en) Information processing method, information processing device, electronic equipment and storage medium
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
CN110575671A (en) Method and device for controlling view angle in game and electronic equipment
CN111870956B (en) Method and device for split screen display of game sightseeing, electronic equipment and storage medium
CN107626105B (en) Game picture display method and device, storage medium and electronic equipment
CN111880715A (en) Method and device for editing virtual control in interface, mobile terminal and storage medium
CN112870701A (en) Control method and device of virtual role
CN114053693B (en) Object control method and device in virtual scene and terminal equipment
CN113721820B (en) Man-machine interaction method and device and electronic equipment
CN115243093B (en) Video bullet screen processing method and device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant