CN108245889B - Free visual angle orientation switching method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN108245889B
Authority
CN
China
Prior art keywords
orientation
touch operation
free
operation area
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810145509.3A
Other languages
Chinese (zh)
Other versions
CN108245889A (en)
Inventor
林晓强 (Lin Xiaoqiang)
潘杰伟 (Pan Jiewei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201810145509.3A
Publication of CN108245889A
Application granted
Publication of CN108245889B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30: Features of games characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308: Details of the user interface
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/6045: Methods for processing data by mapping control signals received from the input arrangement into game commands
    • A63F2300/80: Features of games specially adapted for executing a specific type of game
    • A63F2300/8076: Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the field of human-computer interaction technologies, and in particular to a free viewing angle orientation switching method and apparatus, a storage medium, and an electronic device. The method may include: in response to an input start event acting on a touch operation area, acquiring the position of the input start event in the touch operation area; acquiring, according to that position, the target orientation of the free viewing angle on a target plane perpendicular to the interactive interface; and controlling the free viewing angle to switch directly from the orientation it had before the input start event occurred to the target orientation. The method simplifies the operation steps, improves the efficiency of switching the orientation of the free viewing angle, reduces the operation cost, and makes the switching more responsive, meeting the user's demand for instant view switching in tense, fast-paced shooting games.

Description

Free visual angle orientation switching method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a free view direction switching method and apparatus, a storage medium, and an electronic device.
Background
With the rapid development of mobile communication technology, more and more game applications appear on touch terminals. While a game application runs, the touch terminal displays various virtual objects in a certain layout, presenting a virtual scene to the user and providing a virtual operation interface.
At present, in a virtual scene, the orientation of the free viewing angle is usually controlled by a virtual joystick. For example, the orientation is adjusted by controlling the motion track of the virtual joystick within the joystick area; that is, the orientation of the free viewing angle is adjusted continuously along the motion track of the joystick until it reaches the target orientation.
In this manner, because the orientation of the free viewing angle is adjusted continuously along the motion track of the virtual joystick, the operation cost of adjusting the orientation is high. For tense, fast-paced shooting games in particular, the adjustment of the free viewing angle responds poorly, and the user experience is poor.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a free viewing angle orientation switching method and apparatus, a storage medium, and an electronic device, so as to overcome, at least to a certain extent, the problems of high operation cost and poor responsiveness when switching the orientation of a free viewing angle.
According to an aspect of the present disclosure, a free viewing angle orientation switching method is provided, applied to a touch terminal capable of presenting an interactive interface, where the interactive interface includes a touch operation area. The free viewing angle orientation switching method includes:
in response to an input start event acting on the touch operation area, acquiring the position of the input start event in the touch operation area;
acquiring, according to the position of the input start event in the touch operation area, the target orientation of the free viewing angle on a target plane perpendicular to the interactive interface;
controlling the free viewing angle to switch directly from the orientation before the input start event occurs to the target orientation.
In an exemplary embodiment of the present disclosure, the method further comprises:
in response to an input end event continuous with the input start event, switching the free viewing angle directly from the target orientation back to the orientation it had before the input start event occurred.
In an exemplary embodiment of the present disclosure, the method further comprises:
adjusting the position of the free viewing angle on the target plane to adjust the visual field range of the free viewing angle.
In an exemplary embodiment of the present disclosure, the method further comprises:
adjusting the position of the free viewing angle in the direction perpendicular to the target plane to adjust the visual field height of the free viewing angle.
In an exemplary embodiment of the present disclosure, the acquiring, according to the position of the input start event in the touch operation area, a target orientation of the free viewing angle on a target plane perpendicular to the interactive interface includes:
acquiring the direction of the input start event in the touch operation area according to the position of the input start event in the touch operation area;
determining the target orientation of the free viewing angle on a target plane perpendicular to the interactive interface according to the direction of the input start event in the touch operation area;
where the directions of the input start event in the touch operation area are in one-to-one correspondence with the orientations of the free viewing angle on the target plane.
In an exemplary embodiment of the present disclosure, the touch operation area includes eight touch operation sub-areas, and the eight touch operation sub-areas correspond one-to-one to eight orientations of the free viewing angle on the target plane;
the acquiring, according to the position of the input start event in the touch operation area, a target orientation of the free viewing angle on a target plane perpendicular to the interactive interface includes:
acquiring the touch operation sub-area to which the input start event belongs according to the position of the input start event in the touch operation area;
determining the orientation corresponding to the touch operation sub-area to which the input start event belongs as the target orientation.
In an exemplary embodiment of the present disclosure, the touch operation area is a virtual cross control, the virtual cross control includes four direction controls, and the four direction controls correspond one-to-one to four orientations of the free viewing angle on the target plane;
the acquiring, according to the position of the input start event in the touch operation area, a target orientation of the free viewing angle on a target plane perpendicular to the interactive interface includes:
acquiring the direction control to which the input start event belongs according to the position of the input start event in the virtual cross control;
determining the orientation corresponding to the direction control to which the input start event belongs as the target orientation.
In an exemplary embodiment of the present disclosure, the touch operation area includes a virtual joystick area.
In an exemplary embodiment of the present disclosure, the input start event may be any one of a sliding operation, a pressing operation, and a dragging operation.
According to an aspect of the present disclosure, a free viewing angle orientation switching apparatus is provided for a touch terminal capable of presenting an interactive interface, where the interactive interface includes a touch operation area. The free viewing angle orientation switching apparatus includes:
a position acquisition module, configured to acquire, in response to an input start event acting on the touch operation area, the position of the input start event in the touch operation area;
an orientation acquisition module, configured to acquire, according to the position of the input start event in the touch operation area, the target orientation of the free viewing angle on a target plane perpendicular to the interactive interface;
an orientation switching module, configured to control the free viewing angle to switch directly from the orientation before the input start event occurs to the target orientation.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the free viewing angle orientation switching method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the free viewing angle orientation switching method of any one of the above via execution of the executable instructions.
According to the method, the target orientation of the free viewing angle on the target plane perpendicular to the interactive interface is acquired from the position of the input start event in the touch operation area, and the free viewing angle is switched directly from the orientation before the input start event occurred to the target orientation. On one hand, once the target orientation is acquired, the free viewing angle is controlled to switch directly (that is, immediately) from its previous orientation to the target orientation. Compared with the related art, this avoids the process of continuously adjusting from the current orientation to the target orientation, which simplifies the operation steps, improves the efficiency of switching the orientation of the free viewing angle, reduces the operation cost, makes the switching more responsive, and meets the user's demand for instant view switching in tense, fast-paced shooting games. On the other hand, because the target orientation on the target plane perpendicular to the interactive interface is obtained from the position of the input start event in the touch operation area, the user can quickly determine, from the correspondence between orientations of the free viewing angle and positions in the touch operation area, where to perform the input start event. The operation steps are therefore simple and easy to learn, and the user experience is better.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 is a flowchart of a free viewing angle orientation switching method according to the present disclosure;
FIG. 2 is a first flowchart of a method of obtaining a target orientation provided in an exemplary embodiment of the present disclosure;
FIG. 3 is a first schematic diagram illustrating the direction of an input start event in a touch operation area according to an exemplary embodiment of the present disclosure;
FIG. 4 is a second schematic diagram illustrating the direction of an input start event in a touch operation area according to an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a touch operation area including eight touch operation sub-areas provided in an exemplary embodiment of the present disclosure;
FIG. 6 is a second flowchart of a method of obtaining a target orientation provided in an exemplary embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a virtual cross control provided in an exemplary embodiment of the present disclosure;
FIG. 8 is a third flowchart of a method of obtaining a target orientation provided in an exemplary embodiment of the present disclosure;
FIG. 9 is a schematic view of an expanded free viewing angle field of view provided in an exemplary embodiment of the present disclosure;
FIG. 10 is a block diagram of a free viewing angle orientation switching apparatus according to the present disclosure;
FIG. 11 is a block diagram of an electronic device in an exemplary embodiment of the disclosure;
FIG. 12 is a schematic diagram of a program product in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, in one or more modules that combine software with hardware, or in different network and/or processor devices and/or microcontroller devices.
The exemplary embodiment first discloses a free viewing angle orientation switching method, applied to a touch terminal capable of presenting an interactive interface, where the interactive interface may include a touch operation area. The touch terminal may be, for example, any of various electronic devices having a touch screen, such as a mobile phone, a tablet computer, a notebook computer, a game console, or a PDA. Through its application program interface, the touch terminal can control its touch screen to present an interactive interface, virtual objects, a touch operation area, a virtual battle scene, a virtual natural environment, and the like. A free viewing angle is one whose orientation can be changed without changing the orientation or moving direction of the virtual object; that is, the link between the orientation of the viewing angle and the orientation and moving direction of the virtual object is severed. Referring to FIG. 1, the free viewing angle orientation switching method may include the following steps:
step S110, in response to an input start event acting on the touch operation area, acquiring the position of the input start event in the touch operation area;
step S120, acquiring, according to the position of the input start event in the touch operation area, the target orientation of the free viewing angle on a target plane perpendicular to the interactive interface;
step S130, controlling the free viewing angle to switch directly from the orientation before the input start event occurs to the target orientation.
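The three steps above can be sketched in code. The following is a minimal, hypothetical Python sketch (the class and method names are illustrative, not from the disclosure); it assumes the orientation on the target plane can be represented as a yaw angle in degrees and that touch positions are given in screen pixels, with screen-axis conventions ignored for simplicity:

```python
import math

class FreeViewCamera:
    def __init__(self, yaw_degrees=0.0):
        self.yaw = yaw_degrees      # current orientation on the target plane
        self.saved_yaw = None       # orientation before the input start event

    def on_input_start(self, touch_x, touch_y, area_center):
        # S110: position of the input start event relative to the area's center.
        dx = touch_x - area_center[0]
        dy = touch_y - area_center[1]
        # S120: map the touch direction to a target orientation (yaw angle).
        target_yaw = math.degrees(math.atan2(dy, dx)) % 360
        # S130: switch directly, remembering the previous orientation so the
        # optional input-end embodiment can restore it.
        self.saved_yaw = self.yaw
        self.yaw = target_yaw
        return target_yaw

    def on_input_end(self):
        # Optional embodiment: snap back to the pre-event orientation on release.
        if self.saved_yaw is not None:
            self.yaw, self.saved_yaw = self.saved_yaw, None

cam = FreeViewCamera(yaw_degrees=90.0)
cam.on_input_start(110, 100, area_center=(100, 100))  # touch to the right of center
print(cam.yaw)   # 0.0 -- snapped instantly, no gradual adjustment
cam.on_input_end()
print(cam.yaw)   # 90.0 -- restored to the orientation before the event
```

The key contrast with the joystick-based related art is that `on_input_start` assigns the target yaw in one step rather than interpolating toward it over time.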
Through the free viewing angle orientation switching method in this exemplary embodiment, on one hand, once the target orientation is acquired, the free viewing angle is controlled to switch directly (that is, immediately) from the orientation before the input start event occurred to the target orientation. Compared with the related art, there is no process of continuously adjusting from the current orientation to the target orientation, which simplifies the operation steps, improves the efficiency of switching the orientation of the free viewing angle, reduces the operation cost, makes the switching more responsive, and meets the user's demand for instant view switching in tense, fast-paced shooting games. On the other hand, because the target orientation on the target plane perpendicular to the interactive interface is obtained from the position of the input start event in the touch operation area, the user can quickly determine, from the correspondence between orientations of the free viewing angle and positions in the touch operation area, where to perform the input start event. The operation steps are therefore simple and easy to learn, and the user experience is better.
Next, each step in the free viewing angle orientation switching method in the present exemplary embodiment will be further described.
In step S110, in response to an input start event acting on the touch operation area, a position of the input start event in the touch operation area is obtained.
In this exemplary embodiment, the touch operation area may be a virtual joystick area, a virtual cross control, or any area set by the developer for switching the orientation of the free viewing angle; this exemplary embodiment does not particularly limit it. The shape of the touch operation area may be a circle, an ellipse, or a square, which is likewise not particularly limited. The display state of the touch operation area may include: always displayed in the interactive interface (i.e., a resident area of the interactive interface); or displayed in the interactive interface only when the touch operation area is in a triggered state. The display state of the touch operation area is not limited to these options. The input start event may be any one of a sliding operation, a pressing operation, a dragging operation, and the like.
The user can perform the input start event in the touch operation area with a finger, a stylus, or the like. For example, when the touch operation area is a virtual joystick area and the input start event is a sliding operation, the user can slide a finger in the virtual joystick area to perform a sliding operation acting on that area; during the slide, the position of the virtual joystick within its area changes as the position of the finger's touch point changes. For another example, when the touch operation area is a virtual joystick area and the input start event is a pressing operation, the user can press the virtual joystick area with a finger to perform a pressing operation on it.
After an input start event acting on the touch operation area is detected, the position of the input start event in the touch operation area is acquired by a position acquisition module in response to that event. The position of the input start event in the touch operation area refers to the end position of the event's touch point within the area. For example, when the input start event is a sliding operation, its position in the touch operation area is the end position of the sliding operation's touch point within the area.
In step S120, according to the position of the input start event in the touch operation area, a target orientation of the free viewing angle on a target plane perpendicular to the interactive interface is obtained.
In this exemplary embodiment, the target plane perpendicular to the interactive interface may be parallel to an edge of the interactive interface, or may not be parallel to any edge of the interactive interface.
Acquiring the target orientation of the free viewing angle on the target plane perpendicular to the interactive interface according to the position of the input start event in the touch operation area can be done in any of the following three ways.
As shown in FIG. 2, the first way may include step S210 and step S220, where:
in step S210, a direction of the input start event in the touch operation area is obtained according to a position of the input start event in the touch operation area.
In this exemplary embodiment, after the position of the input start event in the touch operation area is acquired, the direction of the input start event in the touch operation area may be obtained as follows. As shown in FIG. 3, a rectangular coordinate system is set in the touch operation area, and the direction of the input start event is determined as the direction from the origin O of the coordinate system to the end position of the event's touch point; in FIG. 3, this direction lies in the first quadrant and forms an angle of 45 degrees with the positive X-axis. Similarly, as shown in FIG. 4, a rectangular coordinate system is set in the touch operation area, and the direction of the input start event is determined as the direction from the origin O to the end position of the event's touch point; in FIG. 4, this direction lies in the fourth quadrant and forms an angle of 60 degrees with the positive X-axis.
In step S220, the target orientation of the free viewing angle on the target plane perpendicular to the interactive interface is determined according to the direction of the input start event in the touch operation area, where the directions of the input start event in the touch operation area are in one-to-one correspondence with the orientations of the free viewing angle on the target plane.
In the present exemplary embodiment, a one-to-one correspondence may be established between the direction of the input start event in the touch operation area and the orientation of the free viewing angle on the target plane, i.e., every direction maps to exactly one of the 360 degrees of orientation. For example, a rectangular coordinate system may be established on the target plane with the position of the free viewing angle as its origin, and each direction forming a given angle with the positive X-axis in the touch operation area may be matched to the direction forming the same angle with the positive X-axis on the target plane; for instance, the direction at 45 degrees to the positive X-axis in the touch operation area corresponds to the direction at 45 degrees to the positive X-axis on the target plane. The manner of establishing this one-to-one correspondence is not limited to this example. On this basis, the orientation of the free viewing angle corresponding to the direction of the input start event in the touch operation area is acquired and determined as the target orientation.
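As a concrete illustration of this one-to-one angle mapping, the following hypothetical Python function (the name and the screen-coordinate convention are assumptions, not from the disclosure) converts a touch offset from the coordinate system's origin O into a yaw angle on the target plane. The Y component is negated because screen coordinates typically grow downward, while the figures measure angles counter-clockwise from the positive X-axis:

```python
import math

def touch_angle_to_yaw(dx, dy):
    """Map a touch offset from the origin O (screen coords, y down) to a yaw
    on the target plane, in degrees counter-clockwise from the positive X-axis."""
    return math.degrees(math.atan2(-dy, dx)) % 360

print(touch_angle_to_yaw(1, -1))   # up and to the right -> 45 degrees (first quadrant, as in FIG. 3)
print(touch_angle_to_yaw(1, 0))    # straight right      -> 0 degrees
```

With this identity mapping on angles, every touch direction picks out exactly one of the 360 degrees of orientation, as the correspondence above requires.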
In the second way, the touch operation area may include eight touch operation sub-areas, which correspond one-to-one to eight orientations of the free viewing angle on the target plane. In this exemplary embodiment, the touch operation area may be divided into eight touch operation sub-areas of equal area, or into eight sub-areas of different areas; this is not particularly limited. As shown in FIG. 5, the division into eight sub-areas is described taking a virtual joystick area as the touch operation area. The touch operation area is divided into eight sub-areas of equal area by four lines passing through the midpoint of the virtual joystick area, where the angles between adjacent lines are equal. As shown in FIG. 5, the eight touch operation sub-areas are a first touch operation sub-area 501, a second touch operation sub-area 502, a third touch operation sub-area 503, a fourth touch operation sub-area 504, a fifth touch operation sub-area 505, a sixth touch operation sub-area 506, a seventh touch operation sub-area 507, and an eighth touch operation sub-area 508.
A one-to-one correspondence is then established between the touch operation sub-areas and eight orientations of the free viewing angle on the target plane (for example: forward, backward, leftward, rightward, forward-left, forward-right, backward-left, and backward-right). For instance, the first touch operation sub-area 501 corresponds to forward, the second sub-area 502 to forward-left, the third sub-area 503 to leftward, the fourth sub-area 504 to backward-left, the fifth sub-area 505 to backward, the sixth sub-area 506 to backward-right, the seventh sub-area 507 to rightward, and the eighth sub-area 508 to forward-right. Further, to display each sub-area's corresponding orientation more intuitively, an identifier indicating that orientation, such as a triangle or an arrow, may be displayed in the sub-area; this exemplary embodiment does not particularly limit the identifier.
Based on this, as shown in fig. 6, the second mode may include step S610 and step S620, where:
in step S610, the touch operation sub-area to which the input start event belongs is obtained according to the position of the input start event in the touch operation area. In the present exemplary embodiment, the touch operation sub-area to which the input start event belongs may be determined according to the landing position of the touch point of the input start event.
In step S620, an orientation corresponding to the touch operation sub-area to which the input start event belongs is determined as a target orientation.
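The division and lookup in steps S610 and S620 can be sketched as an angle test against the center point of the virtual joystick area. Below is a minimal illustration in Python; the function name, the clockwise ordering of the orientation labels, and the screen-coordinate convention are assumptions for illustration, not taken from the patent.

```python
import math

# Eight orientations in clockwise order starting from "forward" (up on
# screen). The 45-degree sector layout follows fig. 5; the ordering of the
# labels here is illustrative.
ORIENTATIONS = [
    "forward", "forward-right", "rightward", "backward-right",
    "backward", "backward-left", "leftward", "forward-left",
]

def sub_area_orientation(touch, center):
    """Map a touch point inside the virtual joystick area to an orientation.

    `touch` and `center` are (x, y) screen coordinates with y growing
    downward, as is typical on touch screens.
    """
    dx = touch[0] - center[0]
    dy = center[1] - touch[1]  # flip y so "up" on screen means forward
    # atan2(dx, dy) is 0 degrees at "forward" and increases clockwise.
    angle = math.degrees(math.atan2(dx, dy)) % 360.0
    # Each sub-area spans 45 degrees, centered on its orientation.
    return ORIENTATIONS[int(((angle + 22.5) % 360.0) // 45.0)]
```

For example, a touch straight above the center resolves to the forward orientation, and a touch on the diagonal between up and right resolves to forward-right.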
In a third mode, as shown in fig. 7, the touch operation area is a virtual cross control, the virtual cross control includes four direction controls, and the four direction controls correspond to four directions of the free view on the target plane one by one. For example, the first direction control 701 corresponds to a forward facing orientation, the second direction control 702 corresponds to a leftward facing orientation, the third direction control 703 corresponds to a rearward facing orientation, and the fourth direction control 704 corresponds to a rightward facing orientation.
Based on this, as shown in fig. 8, the third mode may include step S810 and step S820, where:
in step S810, the direction control to which the input start event belongs is obtained according to the position of the input start event in the virtual cross control.
In step S820, the orientation corresponding to the direction control to which the input start event belongs is determined as a target orientation.
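Steps S810 and S820 amount to a hit test over the four direction controls of the virtual cross control. A minimal sketch follows; the function name and the rectangle layout are hypothetical, chosen only to echo the arrangement in fig. 7.

```python
def cross_control_orientation(point, controls):
    """Return the orientation of the direction control containing `point`.

    `controls` maps an orientation name to an axis-aligned rectangle
    (left, top, right, bottom) in screen coordinates; returns None when
    the point hits none of the four controls.
    """
    x, y = point
    for orientation, (left, top, right, bottom) in controls.items():
        if left <= x < right and top <= y < bottom:
            return orientation
    return None

# A hypothetical layout echoing fig. 7: four 40x40 direction controls
# arranged in a cross around the point (100, 100).
CROSS = {
    "forward":   (80, 40, 120, 80),
    "leftward":  (40, 80, 80, 120),
    "backward":  (80, 120, 120, 160),
    "rightward": (120, 80, 160, 120),
}
```

A touch in the top control of this layout yields the forward orientation; a touch in the gap at the center of the cross yields no orientation.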
Among the three modes described above, the first and second modes have no explicitly marked operation directions and are therefore better suited to users who are familiar with the game, while the third mode, whose operations and direction indications are explicit, is better suited to novice users.
In summary, the target orientation of the free viewing angle on the target plane perpendicular to the interactive interface is obtained from the position of the input start event in the touch operation area. Because each orientation of the free viewing angle maps to a position in the touch operation area, the user can quickly determine where to perform the input start event and execute it there. The operation steps are simple and easy to learn, and the user experience is better.
In step S130, the free viewing angle is controlled to be directly switched from the orientation before the input start event occurs to the target orientation.
In the present exemplary embodiment, after the target orientation is acquired, the free viewing angle is controlled to switch directly (i.e., immediately) from the orientation before the input start event occurs to the target orientation. In the related art, by contrast, switching to the target orientation requires continuously adjusting the free viewing angle from the orientation before a sliding operation to the target orientation along the trajectory of that sliding operation; that is, the orientation of the free viewing angle changes with the trajectory of the sliding operation. In the present exemplary embodiment, the free viewing angle is switched directly (i.e., immediately) to the target orientation, with no process of continuous adjustment from the current orientation. Therefore, the target orientation of the free viewing angle can be determined solely from the position of the input start event in the touch operation area, and the free viewing angle is directly switched from the orientation before the input start event to the target orientation. This simplifies the operation steps, greatly reduces the time needed to switch the free viewing angle, improves the efficiency of switching its orientation, reduces the operation cost, and provides good instantaneity. It thereby meets the user's requirement for instantaneous free viewing angle switching in tense, fast-paced shooting games.
Further, the method may further include: switching the free viewing angle directly from the target orientation to the orientation before the input start event occurs, in response to an input end event continuous with the input start event. In this exemplary embodiment, the input end event may be, for example, the finger leaving the interface within the touch operation area, or the finger moving out of the touch operation area, which is not particularly limited in this exemplary embodiment. Upon receiving an input end event continuous with the input start event, the free viewing angle is switched directly (i.e., immediately) from the target orientation back to the orientation before the input start event occurred.
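The press-to-switch and release-to-restore behaviour described above can be sketched as a small state holder: the orientation before the input start event is saved, the switch is applied immediately, and the input end event restores the saved orientation. This is an illustrative model, not the patented implementation; the class and method names are invented.

```python
class FreeViewCamera:
    """Minimal sketch of instant switch on input start, restore on input end."""

    def __init__(self, orientation):
        self.orientation = orientation
        self._saved = None  # orientation before the current input start event

    def on_input_start(self, target_orientation):
        # Remember the orientation before the input start event, then switch
        # directly (immediately), with no intermediate continuous adjustment.
        self._saved = self.orientation
        self.orientation = target_orientation

    def on_input_end(self):
        # Switch directly back to the orientation before the input start event.
        if self._saved is not None:
            self.orientation = self._saved
            self._saved = None
```

For example, a camera facing forward that receives an input start event for the backward-left orientation faces backward-left until the input end event, then faces forward again.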
In the process of switching the orientation of the free viewing angle, the orientation of the free viewing angle changes, but the position of the free viewing angle does not change.
Therefore, the free viewing angle can be rapidly switched from the orientation before the input start event to the target orientation through the input start event, and rapidly restored from the target orientation to the orientation before the input start event through the input end event. The operation is simple and easy to learn, the orientation of the free viewing angle is switched efficiently, and the game experience and operation feel are well optimized in fast-paced shooting game battles.
In order to enlarge or reduce the field of view of the free viewing angle or to change its reference point, the method may further include: adjusting the position of the free viewing angle on the target plane to adjust the field of view of the free viewing angle. In the present exemplary embodiment, taking a virtual object in the field of view of the free viewing angle as a base point, and keeping the position of that virtual object unchanged, enlarging the distance between the position of the free viewing angle and the position of the virtual object expands the field of view, while reducing that distance narrows the field of view. The reference point of the free viewing angle can be changed by changing only the position of the free viewing angle without changing its distance to the virtual object. For example, as shown in fig. 9, the region between the two broken lines is the field of view of the free viewing angle; when free viewing angle 1 moves from its distance to virtual object X1 to its distance to virtual object X2, the figure shows intuitively that the field of view of the free viewing angle becomes larger.
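The field-of-view adjustment just described — scaling the distance between the free viewing angle and a fixed virtual object on the target plane — can be sketched as follows. The function name and coordinate convention are illustrative assumptions.

```python
def zoom_free_view(view_pos, object_pos, factor):
    """Scale the distance between the free viewing angle and the virtual
    object by `factor`, keeping the object fixed on the target plane:
    factor > 1 enlarges the field of view, factor < 1 narrows it.
    """
    ox, oy = object_pos
    vx, vy = view_pos
    return (ox + (vx - ox) * factor, oy + (vy - oy) * factor)
```

Doubling the factor doubles the camera-object distance along the same line, which corresponds to the wider field of view shown between free viewing angle 1 and free viewing angle 2 in fig. 9.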
It should be noted that the position of the free view angle is the position of the virtual camera in the interactive interface, and the orientation of the free view angle is the orientation of the virtual camera.
In order to adjust the field-of-view height of the free viewing angle, the method may further include: adjusting the position of the free viewing angle in the direction perpendicular to the target plane to adjust the field-of-view height of the free viewing angle. In the present exemplary embodiment, the field-of-view height can be changed by changing only the position of the free viewing angle in the direction perpendicular to the target plane, without changing its position on the target plane. For example, when the upper half of a virtual object in the field of view of the free viewing angle cannot be seen, indicating that the field-of-view height is low, the position of the free viewing angle can be raised in the above manner to increase the field-of-view height.
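Likewise, adjusting only the component perpendicular to the target plane can be sketched with a hypothetical helper in which the third coordinate is the height above the target plane:

```python
def set_view_height(view_pos, new_height):
    """Change only the component perpendicular to the target plane;
    the position on the target plane (x, y) is left untouched."""
    x, y, _old_height = view_pos
    return (x, y, new_height)
```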
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
In this exemplary embodiment, a free viewing angle direction switching apparatus is further disclosed, which is applied to a touch terminal capable of presenting an interactive interface, where the interactive interface includes a touch operation area, and referring to fig. 10, the free viewing angle direction switching apparatus 1000 may include: a position acquisition module 1001, an orientation acquisition module 1002, and an orientation switching module 1003, wherein:
a position obtaining module 1001, configured to obtain, in response to an input start event acting on the touch operation area, a position of the input start event in the touch operation area;
an orientation obtaining module 1002, configured to obtain a target orientation of the free view angle on a target plane perpendicular to the interactive interface according to a position of the input start event in the touch operation area;
an orientation switching module 1003, configured to control the free viewing angle to switch directly from the orientation before the input start event occurs to the target orientation.
The specific details of the above modules of the free viewing angle orientation switching apparatus have been described in detail in the corresponding free viewing angle orientation switching method, and are therefore not repeated here.
It should be noted that although several modules or units of the apparatus are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system."
An electronic device 1100 according to this embodiment of the invention is described below with reference to fig. 11. The electronic device 1100 shown in fig. 11 is only an example and should not bring any limitations to the function and the scope of use of the embodiments of the present invention.
As shown in fig. 11, electronic device 1100 is embodied in the form of a general purpose computing device. The components of the electronic device 1100 may include, but are not limited to: the at least one processing unit 1110, the at least one memory unit 1120, a bus 1130 connecting different system components (including the memory unit 1120 and the processing unit 1110), and a display unit 1140.
Wherein the storage unit stores program code that is executable by the processing unit 1110 to cause the processing unit 1110 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification. For example, the processing unit 1110 may execute step S110 shown in fig. 1, in response to an input start event acting on the touch operation area, acquiring a position of the input start event in the touch operation area; step S120, acquiring the target orientation of the free visual angle on a target plane vertical to the interactive interface according to the position of the input starting event in the touch operation area; and step S130, controlling the free view angle to be directly switched to the target orientation from the orientation before the input starting event occurs.
The storage unit 1120 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM) 11201 and/or a cache memory unit 11202, and may further include a read-only memory unit (ROM) 11203.
Storage unit 1120 may also include a program/utility 11204 having a set (at least one) of program modules 11205, such program modules 11205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1130 may be representative of one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1100 may also communicate with one or more external devices 1170 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 1100, and/or any devices (e.g., router, modem, etc.) that enable the electronic device 1100 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 1150. Also, the electronic device 1100 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 1160. As shown, the network adapter 1160 communicates with the other modules of the electronic device 1100 over the bus 1130. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1100, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 12, a program product 1200 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++, or the like, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily appreciated that the processes illustrated in the figures do not indicate or limit the order of the events of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (12)

1. A free visual angle direction switching method is applied to a touch terminal capable of presenting an interactive interface, wherein the interactive interface comprises a touch operation area, and the free visual angle direction switching method comprises the following steps:
responding to an input initial event acting on the touch operation area, and acquiring the position of the input initial event in the touch operation area;
acquiring the target orientation of the free visual angle on a target plane perpendicular to the interactive interface according to the position of the input starting event in the touch operation area; wherein the free visual angle is a user-facing, freely adjustable visual angle independent of the orientation and direction of motion of the virtual object;
controlling the free viewing angle to switch directly from the orientation before the input initiation event occurs to the target orientation.
2. The free viewing angle orientation switching method according to claim 1, further comprising:
switching the free-viewing angle from the target orientation directly to an orientation before the input start event occurs in response to an input end event that is continuous with the input start event.
3. The free viewing angle orientation switching method according to claim 1, further comprising:
and adjusting the position of the free visual angle on the target plane, and adjusting the visual field range of the free visual angle.
4. The free viewing angle orientation switching method according to claim 1, further comprising:
and adjusting the position of the free visual angle in the direction perpendicular to the target plane, and adjusting the visual field height of the free visual angle.
5. The method for switching the orientation of the free viewing angle according to claim 1, wherein the obtaining the target orientation of the free viewing angle on a target plane perpendicular to the interactive interface according to the position of the input start event in the touch operation area comprises:
acquiring the direction of the input initial event in the touch operation area according to the position of the input initial event in the touch operation area;
determining the target orientation of the free visual angle on a target plane perpendicular to the interactive interface according to the direction of the input starting event in the touch operation area;
and the direction of the input starting event in the touch operation area corresponds to the orientation of the free visual angle on the target plane one by one.
6. The method according to claim 1, wherein the touch operation area comprises eight touch operation sub-areas, and the eight touch operation sub-areas correspond to eight directions of the free viewing angle on the target plane one by one;
the acquiring, according to the position of the input start event in the touch operation area, a target orientation of the free view angle on a target plane perpendicular to the interactive interface includes:
acquiring the touch operation sub-area to which the input starting event belongs according to the position of the input starting event in the touch operation area;
determining the orientation corresponding to the touch operation sub-area to which the input starting event belongs as a target orientation.
7. The method according to claim 1, wherein the touch operation area is a virtual cross control, the virtual cross control includes four direction controls, and the four direction controls are in one-to-one correspondence with four orientations of the free view on the target plane;
the acquiring, according to the position of the input start event in the touch operation area, a target orientation of the free view angle on a target plane perpendicular to the interactive interface includes:
acquiring the direction control to which the input starting event belongs according to the position of the input starting event in the virtual cross control;
and determining the orientation corresponding to the direction control to which the input starting event belongs as a target orientation.
8. The method according to any one of claims 1 to 6, wherein the touch operation area comprises a virtual joystick area.
9. The method according to any one of claims 1 to 7, wherein the input start event is any one of a slide operation, a press operation, and a drag operation.
10. A free visual angle orientation switching device is applied to a touch terminal capable of presenting an interactive interface, wherein the interactive interface includes a touch operation area, and the free visual angle orientation switching device includes:
the position acquisition module is used for responding to an input starting event acting on the touch operation area and acquiring the position of the input starting event in the touch operation area;
the orientation acquisition module is used for acquiring the target orientation of the free visual angle on a target plane perpendicular to the interactive interface according to the position of the input starting event in the touch operation area; wherein the free visual angle is a user-facing, freely adjustable visual angle independent of the orientation and direction of motion of the virtual object;
and the orientation switching module is used for controlling the free visual angle to be directly switched to the target orientation from the orientation before the input starting event occurs.
11. A computer-readable storage medium, on which a computer program is stored, the computer program, when being executed by a processor, implementing the free viewing angle orientation switching method according to any one of claims 1 to 9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor; wherein the processor is configured to perform the free viewing angle orientation switching method of any one of claims 1-9 via execution of the executable instructions.
CN201810145509.3A 2018-02-12 2018-02-12 Free visual angle orientation switching method and device, storage medium and electronic equipment Active CN108245889B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810145509.3A CN108245889B (en) 2018-02-12 2018-02-12 Free visual angle orientation switching method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810145509.3A CN108245889B (en) 2018-02-12 2018-02-12 Free visual angle orientation switching method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108245889A CN108245889A (en) 2018-07-06
CN108245889B true CN108245889B (en) 2021-06-18

Family

ID=62744258

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810145509.3A Active CN108245889B (en) 2018-02-12 2018-02-12 Free visual angle orientation switching method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108245889B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020061726A1 (en) * 2018-09-25 2020-04-02 维亚科技国际有限公司 Virtual reality game system, processor and virtual game scene moving method
CN110139090A (en) * 2019-05-22 2019-08-16 北京光启元数字科技有限公司 A kind of visual angle processing method and its processing system
CN110227254B (en) * 2019-06-21 2020-07-07 腾讯科技(深圳)有限公司 Visual angle switching control method and device, storage medium and electronic device
CN110300266B (en) * 2019-07-04 2021-04-02 珠海西山居移动游戏科技有限公司 Lens moving method and system, computing device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609210A (en) * 2012-02-16 2012-07-25 上海华勤通讯技术有限公司 Configuration method for functional icons of mobile terminal and mobile terminal
JP2016139165A (en) * 2015-01-26 2016-08-04 株式会社コロプラ Interface program for icon selection
CN106528020A (en) * 2016-10-26 2017-03-22 腾讯科技(深圳)有限公司 View mode switching method and terminal
CN107008003A (en) * 2017-04-13 2017-08-04 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and computer-readable recording medium
CN107132988A (en) * 2017-06-06 2017-09-05 网易(杭州)网络有限公司 Virtual objects condition control method, device, electronic equipment and storage medium
CN107213643A (en) * 2017-03-27 2017-09-29 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture
CN107491262A (en) * 2017-08-11 2017-12-19 网易(杭州)网络有限公司 Virtual object control method and device, storage medium, electronic equipment

Also Published As

Publication number Publication date
CN108245889A (en) 2018-07-06

Similar Documents

Publication Publication Date Title
CN108245889B (en) Free visual angle orientation switching method and device, storage medium and electronic equipment
CN106155553B (en) Virtual object motion control method and device
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
CN106975219B (en) Display control method and device, storage medium, the electronic equipment of game picture
CN105148517B (en) A kind of information processing method, terminal and computer-readable storage medium
CN108579089B (en) Virtual item control method and device, storage medium and electronic equipment
CN107132981B (en) Display control method and device, storage medium, the electronic equipment of game picture
CN105760076A (en) Game control method and device
CN107329690B (en) Virtual object control method and device, storage medium and electronic equipment
CN109260713B (en) Virtual object remote assistance operation method and device, storage medium and electronic equipment
CN108829320A (en) Exchange method, device, storage medium, mobile terminal and interactive system
CN106873886B (en) Control method and device for stereoscopic display and electronic equipment
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
CN113138670B (en) Touch screen interaction gesture control method and device, touch screen and storage medium
CN111773677B (en) Game control method and device, computer storage medium and electronic equipment
CN108012195B (en) Live broadcast method and device and electronic equipment thereof
CN111481923B (en) Rocker display method and device, computer storage medium and electronic equipment
CN113457144B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN106527916A (en) Operating method and device based on virtual reality equipment, and operating equipment
CN111973984B (en) Coordinate control method and device for virtual scene, electronic equipment and storage medium
WO2023236602A1 (en) Display control method and device for virtual object, and storage medium and electronic device
CN117271045A (en) Equipment information display method and device based on digital twinning and electronic equipment
CN112987924A (en) Method, apparatus, device and storage medium for device interaction
CN112965773A (en) Method, apparatus, device and storage medium for information display
CN113769403B (en) Virtual object moving method and device, readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant