CN107694087B - Information processing method and terminal equipment - Google Patents


Info

Publication number
CN107694087B
Authority
CN
China
Prior art keywords
control, interactive, interaction, controlling, sliding operation
Prior art date
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN201710990822.2A
Other languages
Chinese (zh)
Other versions
CN107694087A (en)
Inventor
陶毅阳
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201710990822.2A
Publication of CN107694087A
Application granted
Publication of CN107694087B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426: Processing input control signals by mapping into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets
    • A63F13/90: Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92: Video game devices specially adapted to be hand-held while playing

Abstract

The present application provides an information processing method and a terminal device, applied to a touch terminal capable of presenting an interactive interface. The method includes the following steps: detecting a touch sliding operation in a first preset area of the interactive interface; controlling the view angle to rotate according to the movement of the touch sliding operation in a first direction; and, if the movement distance of the touch sliding operation in a second direction is greater than a preset distance, executing a preset virtual operation. The operation efficiency of the terminal is thereby improved.

Description

Information processing method and terminal equipment
Technical Field
The present application relates to the field of game technologies, and in particular, to an information processing method and a terminal device.
Background
In a game, a player often needs to manage several behaviors of a game character at the same time; in a shooting game, for example, three dimensions of operation must be handled simultaneously: displacement, view-angle switching, and attack. However, the hardware of a mobile terminal is limited: when the device is held in both hands, only the two thumbs can interact with the screen, in two dimensions, at the same time. In existing mobile games, a control for the game character's displacement is placed on the left half of the screen, so the player can control displacement with the left thumb; a control for the character's attack is placed on the right half of the screen, so the player can control attacks with the right thumb; and the view angle is changed by dragging a blank position in the interactive interface.
However, when the player switches the view angle by dragging a blank position and then spots an enemy to attack, the finger must leave the screen and press the attack control. This interaction flow seriously slows down the player's actions; in other words, the prior art suffers from low terminal operation efficiency.
Disclosure of Invention
The present application provides an information processing method and a terminal device that improve the operation efficiency of a terminal.
In a first aspect, the present application provides an information processing method applied to a touch terminal capable of presenting an interactive interface, including: detecting touch sliding operation of a first preset area in an interactive interface; controlling the visual angle to rotate according to the movement of the touch sliding operation in the first direction; and if the moving distance of the touch sliding operation in the second direction is greater than a preset distance, executing a preset virtual operation.
The beneficial effect of the present application includes: by executing this method, the player can perform the preset virtual operation without the finger leaving the screen, which streamlines the player's operations and improves the operation efficiency of the terminal.
Optionally, an interactive control group is arranged in the interactive interface; the group includes a first interactive control and a second interactive control, where the distance between the first interactive control and the second interactive control on the interactive interface is equal to a preset distance, and the first preset area at least partially contains the first interactive control.
Controlling the view-angle rotation according to the movement of the touch sliding operation in the first direction can be done in two optional modes:
in an optional mode, the interaction control group is controlled to move in the interaction interface according to touch sliding operation; and controlling the visual angle rotation according to the movement of the first interactive control in the first direction.
In another optional mode, the interactive control group is controlled to move along a first direction in the interactive interface according to the touch sliding operation; and controlling the visual angle rotation according to the movement of the first interactive control in the first direction.
The beneficial effect of this application includes: according to the method, when any interactive control in the interactive control group is displaced, the whole interactive control group is displaced in a linkage manner, so that the preset distance between all interactive controls in the interactive control group is kept unchanged.
Optionally, the first interactive control and the second interactive control in the interactive control group are arranged along a second direction in the interactive interface; if the moving distance of the touch sliding operation in the second direction is greater than a preset distance, executing a preset virtual operation, including: and if the touch point of the touch sliding operation moves to the range of the second interactive control, executing a preset virtual operation.
Optionally, the preset virtual operation includes: virtual shooting operation.
Optionally, the controlling the view angle to rotate according to the movement of the touch sliding operation in the first direction includes: and controlling the rotating amplitude of the visual angle according to the moving distance of the touch sliding operation in the first direction.
Next, a terminal device is described. The device portion corresponds to the method, the corresponding content has the same technical effects, and details are not repeated here.
In a second aspect, the present application provides a terminal device, comprising: the device comprises a detection module and a control module; the detection module is used for detecting touch sliding operation of a first preset area in the interactive interface; the control module is used for controlling the visual angle to rotate according to the movement of the touch sliding operation in the first direction; the control module is further configured to execute a preset virtual operation if the movement distance of the touch sliding operation in the second direction is greater than a preset distance.
Optionally, an interaction control group is arranged in the interaction interface, and the interaction control group includes a first interaction control and a second interaction control, wherein the distance between the first interaction control and the second interaction control on the interaction interface is equal to a preset distance; wherein the first preset area at least partially contains the first interaction control.
Optionally, the control module is specifically configured to: controlling an interactive control group to move in an interactive interface according to touch sliding operation; and controlling the visual angle rotation according to the movement of the first interactive control in the first direction.
Optionally, the control module is specifically configured to: controlling the interactive control group to move in the interactive interface along a first direction according to the touch sliding operation; and controlling the visual angle rotation according to the movement of the first interactive control in the first direction.
Optionally, the first interactive control and the second interactive control in the interactive control group are arranged along a second direction in the interactive interface; and the control module is specifically used for executing the preset virtual operation when the touch point of the touch sliding operation moves to the range of the second interactive control.
Optionally, the preset virtual operation includes: virtual shooting operation.
Optionally, the control module is specifically configured to: and controlling the rotating amplitude of the visual angle according to the moving distance of the touch sliding operation in the first direction.
In summary, the present application provides an information processing method and a terminal device, where the method includes: detecting a touch sliding operation in a first preset area of the interactive interface; controlling the view angle to rotate according to the movement of the touch sliding operation in the first direction; and, if the movement distance of the touch sliding operation in the second direction is greater than a preset distance, executing a preset virtual operation. The player can execute the preset virtual operation without the finger leaving the screen, which streamlines the player's operations and improves the operation efficiency of the terminal.
Drawings
Fig. 1 is a flowchart of an information processing method according to an embodiment of the present application;
Fig. 2A is a display diagram of an interface provided in an embodiment of the present application;
Fig. 2B is a display diagram of an interface provided by another embodiment of the present application;
Fig. 3A is a flowchart of an information processing method according to an embodiment of the present application;
Fig. 3B is a flowchart of an information processing method according to an embodiment of the present application;
Fig. 4A is an interface display schematic diagram provided in an embodiment of the present application;
Fig. 4B is an interface display schematic diagram provided in an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a terminal device 500 according to an embodiment of the present application;
Fig. 6 is a block diagram illustrating a terminal device 600 according to an example embodiment.
Detailed Description
In the prior art, a player often needs to manage several operation behaviors at once, for example the three dimensions of displacement, view-angle switching, and attack: the game character's displacement is controlled through a displacement control; the character's attack is controlled through an attack control; and the view angle is changed by dragging a blank position in the interactive interface. When the player switches the view angle by dragging a blank position and then spots an enemy to attack, the finger must leave the screen and press the attack control, and this interaction flow seriously slows down the player's actions. In other words, the prior art suffers from low terminal operation efficiency.
To solve this problem, the present application provides an information processing method and a terminal device: detecting a touch sliding operation in a first preset area of the interactive interface; controlling the view angle to rotate according to the movement of the touch sliding operation in a first direction; and, if the movement distance of the touch sliding operation in a second direction is greater than a preset distance, executing a preset virtual operation. That is, while the player controls the view rotation, the finger can perform the preset virtual operation without leaving the screen, thereby improving the operation efficiency of the terminal.
Example one
Specifically, fig. 1 is a flowchart of an information processing method provided in an embodiment of the present application, and is applied to a touch terminal capable of presenting an interactive interface, where the touch terminal may be an intelligent terminal such as a smart phone, a tablet computer, and a game console, as shown in fig. 1, the method includes the following steps:
step S101: detecting touch sliding operation of a first preset area in an interactive interface;
step S102: controlling the visual angle to rotate according to the movement of the touch sliding operation in the first direction;
step S103: and if the moving distance of the touch sliding operation in the second direction is greater than a preset distance, executing a preset virtual operation.
Step S101 is explained in detail as follows: the interactive interface can be a game interface, and the player can realize the purpose of controlling the behavior of the game role through the operation of the interactive interface.
In this embodiment, the first preset area may be located in a right half screen or a left half screen of the interactive interface; the first preset area can be an implicit area or an explicit area; it can be in the shape of circle, square, pentagon, etc.; its area size may be one-fourth of the interactive interface, etc. In summary, the present application does not limit the position, display state, shape, and size of the first preset region.
In the present embodiment, the touch sliding operation includes a sliding operation in a first direction and a sliding operation in a second direction. The touch sliding operation is one continuous operation; that is, when the player switches from the view-rotation operation to the preset virtual operation, the finger does not need to leave the screen.
Steps S102 and S103 are described as follows: the first direction may be horizontal and the second direction vertical; alternatively, the second direction may be horizontal and the first direction vertical. This embodiment does not limit the specific directions of the first direction and the second direction.
Optionally, step S102 includes: controlling the rotation amplitude of the view angle according to the movement distance of the touch sliding operation in the first direction, where a larger movement distance in the first direction produces a larger rotation amplitude. This view rotation is a horizontal rotation of the game character's view. When the player needs to rotate the view in the vertical direction, the physical posture and motion state of the device can be acquired through the acceleration sensors built into the terminal device, so that the game character's view angle is switched according to the rotation angle of the terminal device.
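The proportional mapping just described can be sketched in code. The following is a hypothetical Python illustration, not part of the patent: the function name and the sensitivity constant are assumptions, since the patent only states that a larger first-direction movement yields a larger rotation amplitude.

```python
# Hypothetical sketch of step S102's amplitude control.
# SENSITIVITY is an assumed tuning constant; the patent does not fix a value.
SENSITIVITY = 0.5  # assumed degrees of view rotation per pixel of movement

def view_rotation_degrees(dx_pixels: float) -> float:
    """Map the first-direction (here horizontal) slide distance to a
    view-rotation amplitude. The sign encodes the rotation direction."""
    return SENSITIVITY * dx_pixels
```

The linear mapping is only one possible choice; a curved (e.g. accelerating) mapping would equally satisfy "larger movement, larger rotation".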
If the movement distance of the touch sliding operation in the second direction is greater than a preset distance, the preset virtual operation is executed; correspondingly, if the movement distance in the second direction is smaller than or equal to the preset distance, the preset virtual operation is not executed. The preset distance can be set according to actual conditions, for example 2 cm or 3 cm; the method of determining the preset distance is not limited in this application.
Optionally, the preset virtual operation may be an attack operation, such as a virtual shooting operation, which is not limited in the present application.
The above steps are further illustrated with an example: after spotting an enemy, the player aims by rotating the view in the first direction; then, without the finger leaving the screen, the player slides in the second direction, and once the movement distance in the second direction exceeds the preset distance, the game character shoots at the enemy.
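Steps S101 to S103 can be sketched as a single continuous-gesture handler. This is a hypothetical Python illustration, not the patent's implementation: the class name, the pixel threshold, and the sensitivity value are all assumptions, and touch events are plain coordinate pairs rather than a real touch API.

```python
PRESET_DISTANCE = 60  # pixels; assumed value, the patent leaves it open
SENSITIVITY = 0.5     # assumed degrees of rotation per pixel

class SlideHandler:
    """One continuous slide both rotates the view (first direction, here
    horizontal) and fires the preset virtual operation (e.g. a shot) once
    the second-direction (here vertical) distance exceeds the preset."""

    def __init__(self):
        self.start = None
        self.last = None
        self.view_rotation = 0.0  # accumulated horizontal view rotation
        self.fired = False        # whether the virtual operation ran

    def on_move(self, x, y):
        if self.start is None:    # S101: slide detected in the preset area
            self.start = self.last = (x, y)
            return
        # S102: rotate the view with the movement in the first direction
        self.view_rotation += SENSITIVITY * (x - self.last[0])
        self.last = (x, y)
        # S103: fire once the second-direction distance exceeds the preset
        if abs(y - self.start[1]) > PRESET_DISTANCE:
            self.fired = True
```

Note that the finger never lifts between aiming and firing; the handler sees one uninterrupted stream of move events, which is exactly the continuity the method relies on.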
The present application thus provides an information processing method that includes: detecting a touch sliding operation in a first preset area of the interactive interface; controlling the view angle to rotate according to the movement of the touch sliding operation in the first direction; and, if the movement distance of the touch sliding operation in the second direction is greater than a preset distance, executing a preset virtual operation. The player can execute the preset virtual operation without the finger leaving the screen, which streamlines the player's operations and improves the operation efficiency of the terminal.
Example two
On the basis of the first embodiment, optionally, an interactive control group is arranged in the interactive interface, and the interactive control group can reside on the screen or appear when a finger touches the screen; the interaction control group can be located at a preset position in the interaction interface or determined according to the initial position touched by the finger, and comprises a first interaction control and a second interaction control, wherein the distance between the first interaction control and the second interaction control on the interaction interface is equal to a preset distance; wherein the first preset area at least partially contains a first interaction control.
That the first preset area at least partially contains the first interaction control means the first interaction control is arranged in the first preset area, either wholly located within it or only partially located within it.
Based on the interactive control group provided in this application, optionally, step S101 includes: detecting the touch sliding operation on the interactive control group in the first preset area. Accordingly, the first interactive control is used for controlling the view-angle rotation, and the second interactive control is used for controlling the preset virtual operation. The interactive control group can be displayed in the right half of the interactive interface, and a third interactive control for controlling the game character's displacement can be arranged on the interactive interface, for example in its left half. On this basis, the player can perform the three-dimensional operation of character displacement, view-angle conversion, and the preset virtual operation with both hands. For example, fig. 2A is a display diagram of an interface provided in an embodiment of the present application; as shown in fig. 2A, a control 1, a control 2, and a control 3 are displayed on the interactive interface, where control 1 and control 2 form the interactive control group.
Further, step S102 (the original text mislabels these sub-steps as belonging to S103; they refine the view-rotation step) includes the following two alternatives:
In the first alternative, fig. 3A is a flowchart of an information processing method according to an embodiment of the present application; as shown in fig. 3A, step S102 includes the following steps:
step S102a: controlling the interactive control group to move in the interactive interface according to the touch sliding operation;
step S102b: controlling the view-angle rotation according to the movement of the first interactive control in the first direction.
That is, the interactive control group is treated as a whole, and the touch sliding operation moves the entire group within the interactive interface.
Note that step S102a is not completed before step S102b begins; rather, step S102b is performed while part or all of step S102a is performed.
In the second alternative, fig. 3B is a flowchart of an information processing method according to an embodiment of the present application; as shown in fig. 3B, step S102 includes the following steps:
step S102c: controlling the interactive control group to move along the first direction in the interactive interface according to the touch sliding operation;
step S102d: controlling the view-angle rotation according to the movement of the first interactive control in the first direction.
Likewise, step S102c is not completed before step S102d begins; step S102d is performed while part or all of step S102c is performed.
It should be noted that the difference between the two alternatives is as follows:
The first alternative does not restrict the movement direction of the interactive control group; the group may move in the first direction, in the second direction, or in another direction. Specifically, the movement direction of the group is determined by the direction of the touch sliding operation: if the operation includes movement in the first direction, the group is displaced in the first direction, and if it includes movement in the second direction, the group is displaced in the second direction.
The second alternative restricts the group to moving only in the first direction, so even if the touch sliding operation includes movement in the second direction or another direction, the group is not displaced along it.
In the application, the interactive control group can be moved through the first interactive control, and the interactive control group can also be moved through the second interactive control. This is not limited by the present application.
The process of moving the interactive control group is explained with an example: fig. 2B is a display diagram of an interface according to another embodiment of the present application. As shown in fig. 2B, a control 1, a control 2, and a control 3 are displayed on the interactive interface, where control 1 and control 2 form the interactive control group. When the player moves control 1, control 2 moves in linkage with it; likewise, when the player moves control 2, control 1 moves in linkage with it. As shown in fig. 2A and fig. 2B, the interactive control group moves from the position shown in fig. 2A to the position shown in fig. 2B.
According to the method, when any interactive control in the interactive control group is displaced, the whole interactive control group is displaced in a linkage manner, so that the preset distance between all interactive controls in the interactive control group is kept unchanged.
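The linkage described above can be sketched as follows. This is a hypothetical Python illustration with assumed names: whichever control is dragged, the whole group is displaced, so the preset spacing between the controls never changes.

```python
class ControlGroup:
    """Two linked interactive controls that always move as one unit."""

    def __init__(self, pos1, pos2):
        self.pos1 = pos1  # first interactive control, (x, y)
        self.pos2 = pos2  # second interactive control, (x, y)

    def move_by(self, dx, dy):
        """Displace the whole group; called no matter which control is dragged."""
        self.pos1 = (self.pos1[0] + dx, self.pos1[1] + dy)
        self.pos2 = (self.pos2[0] + dx, self.pos2[1] + dy)

    def spacing(self):
        """Offset of the second control from the first; invariant under move_by."""
        return (self.pos2[0] - self.pos1[0], self.pos2[1] - self.pos1[1])
```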
EXAMPLE III
On the basis of the second embodiment, further, the first interactive control and the second interactive control in the interactive control group are arranged along the second direction in the interactive interface.
Specifically, fig. 4A is an interface display schematic diagram provided in an embodiment of the present application. As shown in fig. 4A, assume the interaction control group includes a first interaction control and a second interaction control arranged along the second direction in the interactive interface; in fig. 4A the second direction is vertical. The trigger-operation direction of the first interaction control is the first direction, which in fig. 4A is horizontal; that is, a trigger operation on the first interaction control is valid only in the horizontal direction, and a trigger operation in any other direction is invalid. Similarly, if a sliding operation is performed on the second interactive control, its trigger-operation direction is also the first direction.
Fig. 4B is an interface display schematic diagram provided in an embodiment of the present application. As shown in fig. 4B, assume the interaction control group again includes a first interaction control and a second interaction control arranged along the second direction, which in fig. 4B is horizontal. The trigger-operation direction of the first interaction control is the first direction, which in fig. 4B is vertical; that is, a trigger operation on the first interaction control is valid only in the vertical direction, and a trigger operation in any other direction is invalid. Similarly, if a sliding operation is performed on the second interactive control, its trigger-operation direction is also the first direction.
Based on this, step S103 includes: and if the touch point of the touch sliding operation moves to the range of the second interactive control, executing a preset virtual operation.
If the touch point of the touch sliding operation moves into the range of the second interactive control, either only the preset virtual operation is executed, or both the preset virtual operation and the view-angle conversion operation are executed. Specifically:
optionally, if the touch sliding operation includes a "long press" operation on the second interactive control, that is, the second interactive control is not displaced, the game character is controlled to execute the preset virtual operation.
Optionally, if the touch sliding operation includes a long-press operation and a slide operation, the game character is controlled to execute a preset virtual operation and a view angle switching operation.
Combining the two optional modes, the second interactive control serves different functions in different scenarios. When the player only long-presses it and it is not displaced, it only triggers the preset virtual operation; when the player long-presses and slides it so that it is displaced, it triggers both the preset virtual operation and the view-angle conversion behavior. Since the second interactive control has a corresponding function in each scenario, the control utilization rate is improved.
In practice, the second interactive control inevitably undergoes small displacements while the player plays, even when the player does not intend to switch the view angle, and such displacements can cause misoperations. For this reason, the present application also provides further alternatives:
In an optional mode three, if the touch sliding operation includes a long-press operation plus a sliding operation on the second interactive control: if the displacement of the second interactive control is smaller than a preset threshold, the game character is controlled to execute only the preset virtual operation; if the displacement is greater than or equal to the preset threshold, the game character is controlled to execute both the preset virtual operation and the view-angle conversion operation.
The preset threshold may be determined by the minimum movement amplitude of the finger when the player usually performs the perspective conversion.
In a fourth optional mode, if the touch sliding operation includes a long-press operation and a sliding operation, the game character is controlled to execute only the preset virtual operation. That is, in this case the second interactive control is used only to control the preset virtual operation.
The third and fourth optional modes can reduce the probability of misoperation.
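The dispatch logic of the third optional mode can be sketched as follows. This is an illustrative sketch only: the function name, action labels, and the threshold value are assumptions, not taken from the patent text.

```python
# Hypothetical sketch of the third optional mode: actions triggered by a touch
# on the second interactive control depend on the control's displacement.

DISPLACEMENT_THRESHOLD = 12.0  # pixels; assumed minimum intentional slide


def handle_second_control(long_pressed: bool, displacement: float) -> list:
    """Return the actions triggered by a touch on the second interactive control."""
    actions = []
    if not long_pressed:
        return actions
    # A long press always triggers the preset virtual operation (e.g. shooting).
    actions.append("preset_virtual_operation")
    # Only a displacement at or above the threshold also switches the view
    # angle, filtering out small accidental slides.
    if displacement >= DISPLACEMENT_THRESHOLD:
        actions.append("view_angle_switch")
    return actions
```

A threshold chosen near the smallest intentional slide distance keeps accidental jitter below it while still registering deliberate view-angle switches.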
Embodiment Four
On the basis of the second embodiment or the third embodiment, optionally, a display trigger operation input by the player on the interactive interface is acquired; the position of the display trigger operation on the interactive interface is detected; and the interaction control group is displayed at that position.
For example, assume that the interaction control group is not currently displayed on the interactive interface. When the player clicks on the right side of the interactive interface, the interaction control group can be displayed on the interactive interface at the position of the click operation. Alternatively, the interaction control group may be displayed at another position on the interactive interface. The present application does not limit the display position of the interaction control group.
In the present application, the interaction control group can be displayed in response to a display trigger operation input by the player on the interactive interface, thereby improving the display efficiency of the interaction control group.
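The display trigger behavior above can be sketched in a few lines. The `ControlGroup` type and function name are illustrative assumptions; they do not appear in the patent.

```python
# Illustrative sketch of Embodiment Four: show the interaction control group
# at the position of the player's display trigger operation.

from dataclasses import dataclass


@dataclass
class ControlGroup:
    visible: bool = False
    x: float = 0.0
    y: float = 0.0


def on_display_trigger(group: ControlGroup, touch_x: float, touch_y: float) -> None:
    """Detect the trigger position and display the control group there."""
    group.x, group.y = touch_x, touch_y
    group.visible = True
```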
Embodiment Five
Fig. 5 is a schematic structural diagram of a terminal device 500 according to an embodiment of the present application, where the terminal device may be a smartphone, a tablet computer, a game console, or the like. As shown in fig. 5, the terminal device includes:
the detection module 501 is configured to detect a touch sliding operation in a first preset area in an interactive interface;
a control module 502, configured to control a viewing angle to rotate according to a movement of a touch sliding operation in a first direction;
the control module 502 is further configured to execute a preset virtual operation if the moving distance of the touch sliding operation in the second direction is greater than a preset distance.
Optionally, an interaction control group is arranged in the interaction interface, and the interaction control group includes a first interaction control and a second interaction control, wherein the distance between the first interaction control and the second interaction control on the interaction interface is equal to a preset distance;
wherein the first preset area at least partially contains the first interaction control.
In an optional manner, the control module 502 is specifically configured to: controlling an interactive control group to move in an interactive interface according to touch sliding operation; and controlling the visual angle rotation according to the movement of the first interactive control in the first direction.
In another optional manner, the control module 502 is specifically configured to: control the interaction control group to move in the interactive interface along a first direction according to the touch sliding operation; and control the visual angle rotation according to the movement of the first interactive control in the first direction.
Optionally, the first interactive control and the second interactive control in the interactive control group are arranged along a second direction in the interactive interface; the control module 502 is specifically configured to execute a preset virtual operation when the touch point of the touch sliding operation moves to the range of the second interactive control.
Optionally, the preset virtual operation includes: a virtual shooting operation.
Optionally, the control module 502 is specifically configured to:
and controlling the rotating amplitude of the visual angle according to the moving distance of the touch sliding operation in the first direction.
The terminal device provided by the present application may be configured to execute the technical solutions of the method embodiments described above, and the implementation principles and technical effects are similar, which are not described herein again.
Fig. 6 is a block diagram illustrating a terminal device 600 according to an example embodiment. For example, the device 600 may be a smartphone, tablet, game console, or the like.
Referring to fig. 6, device 600 may include one or more of the following components: processing component 602, memory 604, power component 606, multimedia component 608, audio component 610, input/output (I/O) interface 612, sensor component 614, and communication component 616.
The processing component 602 generally controls overall operation of the device 600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 602 may include one or more processors 620 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 can include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support operation at the device 600. Examples of such data include instructions for any application or method operating on device 600, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 604 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A power supply component 606 provides power to the various components of the device 600. The power components 606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 600.
The multimedia component 608 includes a touch sensitive display screen that provides an output interface between the device 600 and a user. In some embodiments, the touch display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 608 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 600 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 610 is configured to output and/or input audio signals. For example, the audio component 610 includes a Microphone (MIC) configured to receive external audio signals when the device 600 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 604 or transmitted via the communication component 616. In some embodiments, audio component 610 further includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 614 includes one or more sensors for providing status assessments of various aspects of the device 600. For example, the sensor component 614 may detect the open/closed state of the device 600 and the relative positioning of components such as the display and keypad of the device 600; it may also detect a change in the position of the device 600 or of a component of the device 600, the presence or absence of user contact with the device 600, the orientation or acceleration/deceleration of the device 600, and a change in the temperature of the device 600. The sensor component 614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate communications between the device 600 and other devices in a wired or wireless manner. The device 600 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 604 comprising instructions, executable by the processor 620 of the device 600 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer-readable storage medium stores a program product capable of carrying out the methods described herein; when the instructions in the storage medium are executed by a processor of the device 600, the device 600 is enabled to perform those methods.
The terminal device provided by the present application may be configured to execute the technical solutions of the method embodiments described above, and the implementation principles and technical effects are similar, which are not described herein again.

Claims (15)

1. An information processing method is applied to a touch terminal capable of presenting an interactive interface, and is characterized by comprising the following steps:
detecting touch sliding operation of a first preset area in the interactive interface;
controlling the visual angle to rotate according to the movement of the touch sliding operation in the first direction;
and if the moving distance of the touch sliding operation in the second direction is greater than a preset distance, executing a preset virtual operation.
2. The method of claim 1,
an interaction control group is arranged in the interaction interface and comprises a first interaction control and a second interaction control, wherein the distance between the first interaction control and the second interaction control on the interaction interface is equal to a preset distance;
wherein the first preset area at least partially contains the first interaction control.
3. The method of claim 2, wherein the controlling the view rotation according to the movement of the touch slide operation in the first direction comprises:
controlling the interaction control group to move in an interaction interface according to the touch sliding operation;
and controlling the visual angle to rotate according to the movement of the first interactive control in the first direction.
4. The method of claim 2, wherein the controlling the view rotation according to the movement of the touch slide operation in the first direction comprises:
controlling the interaction control group to move along a first direction in an interaction interface according to the touch sliding operation;
and controlling the visual angle to rotate according to the movement of the first interactive control in the first direction.
5. The method according to claim 3 or 4,
the first interactive control and the second interactive control in the interaction control group are arranged along a second direction;
if the moving distance of the touch sliding operation in the second direction is greater than a preset distance, executing a preset virtual operation, including: and if the touch point of the touch sliding operation moves to the range of the second interactive control, executing a preset virtual operation.
6. The method of claim 1, wherein the preset virtual operation comprises: a virtual shooting operation.
7. The method of claim 1, wherein the controlling the view rotation according to the movement of the touch slide operation in the first direction comprises:
and controlling the rotating amplitude of the visual angle according to the moving distance of the touch sliding operation in the first direction.
8. A terminal device, comprising:
the detection module is used for detecting touch sliding operation of a first preset area in the interactive interface;
the control module is used for controlling the visual angle to rotate according to the movement of the touch sliding operation in the first direction;
the control module is further configured to execute a preset virtual operation if the movement distance of the touch sliding operation in the second direction is greater than a preset distance.
9. The terminal device of claim 8,
an interaction control group is arranged in the interaction interface and comprises a first interaction control and a second interaction control, wherein the distance between the first interaction control and the second interaction control on the interaction interface is equal to a preset distance;
wherein the first preset area at least partially contains the first interaction control.
10. The terminal device of claim 9, wherein the control module is specifically configured to:
controlling the interaction control group to move in an interaction interface according to the touch sliding operation;
and controlling the visual angle to rotate according to the movement of the first interactive control in the first direction.
11. The terminal device of claim 9, wherein the control module is specifically configured to:
controlling the interaction control group to move along a first direction in an interaction interface according to the touch sliding operation;
and controlling the visual angle to rotate according to the movement of the first interactive control in the first direction.
12. The terminal device according to claim 10 or 11,
the first interactive control and the second interactive control in the interaction control group are arranged along a second direction;
the control module is specifically configured to execute a preset virtual operation if the touch point of the touch sliding operation moves to the range of the second interactive control.
13. The terminal device of claim 8, wherein the preset virtual operation comprises: a virtual shooting operation.
14. The terminal device of claim 8, wherein the control module is specifically configured to:
and controlling the rotating amplitude of the visual angle according to the moving distance of the touch sliding operation in the first direction.
15. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the information processing method according to any one of claims 1 to 7.
CN201710990822.2A 2017-10-23 2017-10-23 Information processing method and terminal equipment Active CN107694087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710990822.2A CN107694087B (en) 2017-10-23 2017-10-23 Information processing method and terminal equipment

Publications (2)

Publication Number Publication Date
CN107694087A CN107694087A (en) 2018-02-16
CN107694087B true CN107694087B (en) 2021-03-16

Family

ID=61182102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710990822.2A Active CN107694087B (en) 2017-10-23 2017-10-23 Information processing method and terminal equipment

Country Status (1)

Country Link
CN (1) CN107694087B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108553891A (en) * 2018-04-27 2018-09-21 腾讯科技(深圳)有限公司 Object method of sight and device, storage medium and electronic device
CN111905366A (en) * 2019-05-07 2020-11-10 网易(杭州)网络有限公司 In-game visual angle control method and device
CN110393916B (en) * 2019-07-26 2023-03-14 腾讯科技(深圳)有限公司 Method, device and equipment for rotating visual angle and storage medium
CN111263177A (en) * 2020-01-22 2020-06-09 杭州皮克皮克科技有限公司 Video interactive live broadcast method and system
CN113633975B (en) * 2021-08-19 2023-10-20 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, terminal and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201334843A (en) * 2012-02-20 2013-09-01 Fu Li Ye Internat Corp Game control method with touch panel media and game play media
CN104436657B (en) * 2014-12-22 2018-11-13 青岛烈焰畅游网络技术有限公司 Game control method, device and electronic equipment
CN105582670B (en) * 2015-12-17 2019-04-30 网易(杭州)网络有限公司 Aimed fire control method and device
CN106959812A (en) * 2016-01-11 2017-07-18 北京英雄互娱科技股份有限公司 Method and apparatus for man-machine interaction
CN105760076B (en) * 2016-02-03 2018-09-04 网易(杭州)网络有限公司 game control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant