CN112245914B - Viewing angle adjusting method and device, storage medium and computer equipment - Google Patents

Viewing angle adjusting method and device, storage medium and computer equipment

Info

Publication number
CN112245914B
CN112245914B (application CN202011252862.5A)
Authority
CN
China
Prior art keywords
visual angle
viewing angle
adjusting
angle
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011252862.5A
Other languages
Chinese (zh)
Other versions
CN112245914A (en)
Inventor
姚舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202011252862.5A priority Critical patent/CN112245914B/en
Publication of CN112245914A publication Critical patent/CN112245914A/en
Application granted granted Critical
Publication of CN112245914B publication Critical patent/CN112245914B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5258 Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Abstract

The embodiments of the present application disclose a viewing angle adjusting method, a viewing angle adjusting device, a storage medium, and a computer device. The method comprises: providing, on a graphical user interface, a first viewing angle control area and a second viewing angle control area, the second viewing angle control area comprising a viewing angle height control; adjusting the projection position of the viewing angle on a designated plane in response to a touch operation acting on the first viewing angle control area; adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation acting on the viewing angle height control; and adjusting the orientation of the viewing angle in response to a second operation that starts at the viewing angle height control and acts on a designated area. By performing different operations on the viewing angle height control, several different viewing angle adjustment functions are realized, which reduces the complexity of the viewing angle adjustment operation and improves the efficiency of viewing angle adjustment.

Description

Viewing angle adjusting method and device, storage medium and computer equipment
Technical Field
The present disclosure relates to the field of computers, and in particular, to a viewing angle adjustment method, a viewing angle adjustment device, a computer readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, more and more applications provide a three-dimensional virtual environment, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, and multiplayer online battle arena (MOBA) games.
In the prior art, a terminal displays an environment picture of a three-dimensional virtual environment from an observer (OB) viewing angle. Taking an FPS game on a mobile terminal as an example, the user controls a first viewing angle control area with the left hand to move the viewing angle in the horizontal direction, taps a second viewing angle control area with the right hand to adjust the viewing angle height, and rotates the viewing angle by sliding on the screen of the mobile terminal.
In the research and practice of the prior art, the inventor of the present application has found that the prior-art controls for viewing angle adjustment are complicated and not coherent, so the efficiency of viewing angle adjustment is low.
Disclosure of Invention
The embodiment of the application provides a visual angle adjusting method and device, which can improve the visual angle adjusting efficiency.
In order to solve the technical problems, the embodiment of the application provides the following technical scheme:
A viewing angle adjustment method for providing a graphical user interface through a display component of a terminal, the graphical user interface displaying content including at least a portion of a game scene, the method comprising:
the graphical user interface provides a first visual angle control area and a second visual angle control area, wherein the second visual angle control area comprises a visual angle height control;
responding to the touch operation acted on the first visual angle control area, and adjusting the projection position of the visual angle on a designated plane;
adjusting a height of the viewing angle perpendicular to the designated plane in response to a first operation acting on the viewing angle height control;
the orientation of the view angle is adjusted in response to a second operation initiated at the view angle height control and acting on a designated area.
A viewing angle adjustment apparatus for providing a graphical user interface through a display component of a terminal, the graphical user interface displaying content including at least a portion of a game scene, the apparatus comprising:
the display module is used for providing a first visual angle control area and a second visual angle control area for the graphical user interface, wherein the second visual angle control area comprises a visual angle height control;
The first adjusting module is used for responding to the touch operation acted on the first visual angle control area and adjusting the projection position of the visual angle on a designated plane;
the second adjusting module is used for responding to the first operation acted on the visual angle height control, and adjusting the height of the visual angle perpendicular to the appointed plane;
and the third adjusting module is used for responding to a second operation which starts from the visual angle height control and acts on the designated area, and adjusting the orientation of the visual angle.
In some embodiments, the second operation is a sliding operation that slides from the height control to the designated area;
the third adjustment module includes:
the first acquisition sub-module is used for acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface;
the first determining submodule is used for determining the current sliding direction of the second operation based on the touch position and the reference point position;
and the first adjusting submodule is used for determining the target orientation of the visual angle according to the current sliding direction and adjusting the orientation of the visual angle according to the target orientation.
In some embodiments, the first determining sub-module comprises:
A determining unit, configured to determine a current sliding direction of the second operation and a current sliding distance of the second operation based on the touch position and the position of the reference point;
the first adjustment sub-module includes:
and the adjusting unit is used for determining the target orientation of the visual angle according to the current sliding direction and the current sliding distance and adjusting the orientation of the visual angle according to the target orientation.
In some embodiments, the adjusting unit is configured to:
determining the current sliding direction as a viewing angle rotation direction;
acquiring a maximum angle at which the viewing angle is rotatable in the viewing angle rotation direction, and a maximum distance at which a second operation is slidable in the current sliding direction;
determining a viewing angle rotation angle based on the current sliding distance, the maximum distance, and the maximum angle;
and determining the target orientation of the visual angle according to the visual angle rotation direction and the visual angle rotation angle.
In some embodiments, the adjusting unit is further configured to:
if the sliding operation of moving from the appointed area to the visual angle height control is detected, acquiring the stay time of the second operation in the visual angle height control;
Comparing the residence time with a preset time to obtain a comparison result;
if the comparison result shows that the stay time length is longer than the preset time length, adjusting the height of the visual angle perpendicular to the appointed plane;
if the comparison result shows that the residence time is smaller than the preset time, judging whether the second operation is detected or not;
and if the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface until the second operation cannot be detected.
In some embodiments, the adjusting unit is further configured to:
and if, after the sliding operation moves from the designated area to the viewing angle height control, the second operation can no longer be detected, hiding the designated area.
In some embodiments, the content displayed on the graphical user interface is an environment picture obtained when the target object observes the virtual environment in a preset viewing angle direction;
a first adjustment module comprising:
the second adjusting sub-module is used for responding to the touch operation acted on the first visual angle control area and adjusting the target object to move horizontally on the appointed plane;
A second adjustment module comprising:
and a third adjustment sub-module for adjusting the height of the target object perpendicular to the designated plane in response to a first operation acting on the second viewing angle control region.
A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the viewing angle adjustment method as described above.
A computer device comprising a memory in which a computer program is stored and a processor that performs the steps of the viewing angle adjustment method as described above by calling the computer program stored in the memory.
A first viewing angle control area and a second viewing angle control area are provided through the graphical user interface, the second viewing angle control area comprising a viewing angle height control; the projection position of the viewing angle on a designated plane is adjusted in response to a touch operation acting on the first viewing angle control area; the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation acting on the viewing angle height control; and the orientation of the viewing angle is adjusted in response to a second operation that starts at the viewing angle height control and acts on a designated area. By performing different operations on the viewing angle height control, several different viewing angle adjustment functions are realized, which reduces the complexity of the viewing angle adjustment operation and improves the efficiency of viewing angle adjustment.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1a is a schematic system diagram of a viewing angle adjustment method according to an embodiment of the present application.
Fig. 1b is a flow chart of a method for adjusting a viewing angle according to an embodiment of the present application.
Fig. 1c is a schematic diagram of a first application scenario of a viewing angle adjustment method according to an embodiment of the present application.
Fig. 1d is a schematic diagram of a world coordinate system in a virtual environment according to an embodiment of the present application.
Fig. 1e is a schematic diagram of a second application scenario of the viewing angle adjustment method provided in the embodiment of the present application.
Fig. 1f is a rotation schematic diagram of a rotation angle of view of a camera model according to an embodiment of the present application.
Fig. 1g is a schematic diagram of a third application scenario of the viewing angle adjustment method provided in the embodiment of the present application.
Fig. 1h is a schematic diagram of a fourth application scenario of the viewing angle adjustment method provided in the embodiment of the present application.
Fig. 2 is another flow chart of the method for adjusting a viewing angle according to the embodiment of the present application.
Fig. 3 is a schematic structural diagram of a viewing angle adjusting device according to an embodiment of the present disclosure.
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiments of the present application provide a viewing angle adjusting method, a viewing angle adjusting device, a storage medium, and a computer device. Specifically, the viewing angle adjustment method of the embodiments of the present application may be performed by a computer device, where the computer device may be a terminal, a server, or another device. The terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and the terminal device may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and basic cloud computing services such as big data and artificial intelligence platforms.
For example, when the viewing angle adjustment method is run on the terminal, the terminal device stores a game application and presents a part of a game scene in the game through a display component (e.g., a touch display screen). The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
For example, when the viewing angle adjustment method is run on a server, it may be a cloud game. Cloud gaming refers to a game style based on cloud computing. In the running mode of the cloud game, a running main body of the game application program and a game picture presentation main body are separated, and the storage and the running of the visual angle adjustment method are completed on a cloud game server. The game image presentation is completed at a cloud game client, which is mainly used for receiving and sending game data and presenting game images, for example, the cloud game client may be a display device with a data transmission function, such as a mobile terminal, a television, a computer, a palm computer, a personal digital assistant, etc., near a user side, but the terminal device for processing game data is a cloud game server in the cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server, the cloud game server runs the game according to the operation instruction, codes and compresses data such as game pictures and the like, returns the data to the cloud game client through a network, and finally decodes the data through the cloud game client and outputs the game pictures.
Referring to fig. 1a, fig. 1a is a schematic system diagram of a viewing angle adjusting device according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. Terminal 1000 held by a user may be connected to servers of different games through network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining input from a user through touch or slide operations performed at multiple points of one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000, through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different terminals 1000 may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals 1000 so as to be connected via an appropriate network and synchronized with each other to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 while different users play the multiplayer game online.
The embodiment of the application provides a visual angle adjusting method which can be executed by a terminal or a server. The embodiment of the present application will be described with an example in which a viewing angle adjustment method is executed by a terminal. The terminal comprises a display component and a processor, wherein the display component is used for presenting a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the display component, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the user-generated operational instructions for the graphical user interface include instructions for launching the gaming application, and the processor is configured to launch the gaming application after receiving the user-provided instructions for launching the gaming application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch-sensitive display screen. A touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously by a plurality of points on the screen. The user performs touch operation on the graphical user interface by using a finger, and when the graphical user interface detects the touch operation, the graphical user interface controls different virtual objects in the graphical user interface of the game to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role playing game, a strategy game, a sports game, an educational game, a first person shooter game (First person shooting game, FPS), and the like. Wherein the game may comprise a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by a user (or player) may be included in the virtual scene of the game. In addition, one or more obstacles, such as rails, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual object, e.g., to limit movement of the one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, scores, character health status, energy, etc., to provide assistance to the player, provide virtual services, increase scores related to the player's performance, etc. In addition, the graphical user interface may also present one or more indicators to provide indication information to the player. For example, a game may include a player controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using an Artificial Intelligence (AI) algorithm, implementing a human-machine engagement mode. For example, virtual objects possess various skills or capabilities that a game player uses to achieve a goal. For example, the virtual object may possess one or more weapons, props, tools, etc. 
These weapons, props, or tools may be used to eliminate other objects from the game. Such skills or capabilities may be activated by the player using one of a plurality of preset touch operations on the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of the user.
It should be noted that, the system schematic diagram of the viewing angle adjusting system shown in fig. 1a is only an example, and the viewing angle adjusting system and the scenario described in the embodiments of the present application are for more clearly describing the technical solutions of the embodiments of the present application, and do not constitute a limitation to the technical solutions provided in the embodiments of the present application, and those skilled in the art can know that, with the evolution of the viewing angle adjusting system and the appearance of the new service scenario, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
In this embodiment, description will be made from the viewpoint of a viewing angle adjusting apparatus which can be integrated in a computer device having a storage unit and a microprocessor mounted thereon and having arithmetic capability.
Referring to fig. 1b, fig. 1b is a flow chart illustrating a method for adjusting a viewing angle according to an embodiment of the present disclosure. The visual angle adjusting method comprises the following steps:
in step 101, the graphical user interface provides a first viewing angle control region and a second viewing angle control region, wherein the second viewing angle control region comprises a viewing angle height control.
The graphical user interface displays a virtual environment in a game scene. The virtual environment is the environment provided when the application program runs on the terminal, and may be a simulation of the real world, a semi-simulated and semi-fictitious environment, or a purely fictitious environment. The target object is a camera model; for example, in an FPS game the camera model is positioned at the head or neck of the virtual character in the first-person viewing angle and behind the virtual character in the third-person viewing angle. The partial game scene is a picture generated by observing the virtual environment through the camera model in a certain viewing angle direction.
The graphical user interface further includes a first viewing angle control area 11 and a second viewing angle control area 12, and the first viewing angle control area 11 and the second viewing angle control area 12 implement different viewing angle adjustment functions according to different touch operations of the user.
Specifically, the second viewing angle control area 12 includes a viewing angle height control 121; as the name implies, the viewing angle height control 121 is used to adjust the height of the viewing angle. Referring to fig. 1c and fig. 1d, fig. 1c is a schematic diagram of a first application scenario of the viewing angle adjustment method provided in the embodiment of the present application, and fig. 1d is a schematic diagram of a world coordinate system in a virtual environment provided in the embodiment of the present application. Taking a user playing a game from an observer (OB) viewing angle as an example, the virtual environment has a world coordinate system constructed with an X axis, a Y axis, and a Z axis, and the camera model located in the virtual environment therefore has corresponding coordinates (X1, Y1, Z1).
In step 102, the projection position of the viewing angle on the designated plane is adjusted in response to the touch operation applied to the first viewing angle control region.
The projection position of the viewing angle on the designated plane is the projection position of the camera model on the designated plane; the designated plane here is the plane 20 constructed by the X axis and the Y axis in the virtual environment. That is, the first viewing angle control area 11 is used to control the movement of the camera model forwards, backwards, leftwards, and rightwards in the virtual scene.
Referring to fig. 1c and fig. 1d, fig. 1c is a schematic diagram of a first application scenario of the viewing angle adjustment method provided in the embodiment of the present application, and fig. 1d is a schematic diagram of a world coordinate system in a virtual environment provided in the embodiment of the present application. Taking a user playing a game from an observer (OB) viewing angle as an example, the virtual environment has a world coordinate system constructed with an X axis, a Y axis, and a Z axis, and the camera model located in the virtual environment therefore has corresponding coordinates (X1, Y1, Z1). The designated plane here is the plane constructed by the X axis and the Y axis.
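To make the coordinate setup concrete, the following is a minimal Python sketch of the camera state described above. The CameraModel class and its field names are illustrative assumptions, not part of the patent or of any particular engine.

```python
from dataclasses import dataclass

# Minimal illustrative sketch: the camera model's state in the world coordinate
# system described above. (x, y) is its projection on the designated X-Y plane
# and z is its height perpendicular to that plane.
@dataclass
class CameraModel:
    x: float  # position along the X axis of the designated plane
    y: float  # position along the Y axis of the designated plane
    z: float  # height perpendicular to the designated plane (Z axis)

camera = CameraModel(x=1.0, y=2.0, z=5.0)  # corresponds to coordinates (X1, Y1, Z1)
```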
Specifically, a wheel disc control 111 is disposed in the first viewing angle control area 11, and the user moves the rocker in the wheel disc control 111 forwards, backwards, leftwards, and rightwards to move the viewing angle in the corresponding direction.
In some embodiments, the content displayed on the graphical user interface is an environment picture obtained when the target object observes the virtual environment in a preset viewing angle direction;
the step of adjusting the projection position of the viewing angle on the designated plane in response to the touch operation applied to the first viewing angle control region includes:
and adjusting the target object to move horizontally on the designated plane in response to the touch operation acted on the first visual angle control area.
The target object is a camera model; controlling the projection position of the viewing angle on the designated plane means controlling the movement of the target object in the horizontal direction (forwards, backwards, leftwards, rightwards) on the designated plane.
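As a rough illustration of this behaviour, the sketch below reuses the hypothetical CameraModel from the earlier sketch; the joystick offset convention and the speed factor are assumptions, not details given by the patent.

```python
# Hedged sketch: the wheel disc control is assumed to report a 2-D offset per
# axis in the range [-1, 1]. Only the projection on the designated X-Y plane
# changes; the height z is left untouched.
def move_on_designated_plane(camera, joystick_dx, joystick_dy, speed=1.0):
    camera.x += joystick_dx * speed
    camera.y += joystick_dy * speed
    return camera
```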
In step 103, the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation on the viewing angle height control.
The first operation is a pressing operation on the viewing angle height control 121. The viewing angle height control 121 adjusts the position of the camera model on the Z axis according to the duration of the pressing operation. Since the Z axis is divided into a positive half axis and a negative half axis, the viewing angle height control 121 may correspondingly include a first sub viewing angle height control 1211 for adjusting toward the positive half of the Z axis and a second sub viewing angle height control 1212 for adjusting toward the negative half of the Z axis. The height perpendicular to the designated plane is the distance between the camera model and the plane 20 constructed by the X axis and the Y axis.
In some embodiments, the step of adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation of the viewing angle height control comprises:
in response to a first operation acting on the perspective height control, a height of the target object perpendicular to the designated plane is adjusted.
The target object is a camera model; adjusting the height of the viewing angle perpendicular to the designated plane means controlling the target object to move along the positive or negative half axis of the Z axis relative to the designated plane.
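The height adjustment can be pictured with the same hypothetical CameraModel; the sketch below assumes the press duration is reported in seconds and uses an arbitrary rate, which the patent does not specify.

```python
# Hedged sketch: a press on the first sub viewing angle height control moves the
# camera toward the positive half of the Z axis, a press on the second sub
# control toward the negative half. The rate is an illustrative assumption.
def adjust_height(camera, press_seconds, upward=True, units_per_second=1.0):
    delta = press_seconds * units_per_second
    camera.z += delta if upward else -delta
    return camera
```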
In step 104, the orientation of the viewing angle is adjusted in response to a second operation that starts at the viewing angle height control and acts on the designated area.
Referring to fig. 1e, fig. 1e is a schematic diagram of a second application scenario of the viewing angle adjustment method provided in the embodiment of the present application. The designated area 13 is configured to receive a second operation of the user and to adjust the viewing angle orientation according to the second operation. Taking the first operation as a pressing operation as an example, when the user presses the viewing angle height control 121, the height adjustment function of the viewing angle height control 121 is triggered and the designated area 13 is displayed on the graphical user interface. To allow the user to operate the viewing angle height control 121 and the designated area 13 with one finger, the designated area 13 may be set on the outer side of the viewing angle height control 121, that is, the inner periphery of the designated area 13 is adjacent to the outer periphery of the second viewing angle control area 12. The designated area 13 may also be any other area of the graphical user interface except the first viewing angle control area 11 and the second viewing angle control area 12, and its periphery may be circular or polygonal, which is not limited herein.
In some embodiments, the graphical user interface provides a first viewing angle control region and a second viewing angle control region, the second viewing angle control region comprising a viewing angle height control.
in some embodiments, referring to fig. 1f, fig. 1f is a rotation schematic diagram of a rotation angle of a camera model according to an embodiment of the present application. The viewing angle direction of the lens 30 in the camera model can be adjusted by rotating the U-axis and the R-axis. Based on the second operation of different operation modes, the second operation can be divided into two types, wherein the first type is that a user slides a finger pressing the visual angle height control 121 to the designated area 13, namely the second operation is continuous with the first operation, so that the transition from the adjustment of the visual angle height to the adjustment of the visual angle rotation is smoother, and the operation is more continuous; the second is that the user lifts the finger pressing the view height control 121 and presses the designated area 13 for a designated period of time. Therefore, the mode of rotating the view angle of the camera model is different for different operation modes, and the following two modes are described.
For the case where the second operation is a sliding operation that slides from the height control to the designated area, please refer to fig. 1g; fig. 1g is a schematic diagram of a third application scenario of the viewing angle adjustment method according to the embodiment of the present application. In some embodiments, the step of adjusting the orientation of the viewing angle in response to a second operation that starts at the viewing angle height control and acts on the designated area includes:
(1) Acquiring a touch position of a second operation in the designated area and a reference point position in the graphical user interface;
(2) Determining a current sliding direction of the second operation based on the touch position and the reference point position;
(3) And determining the target orientation of the visual angle according to the current sliding direction, and adjusting the orientation of the visual angle according to the target orientation.
The rotation of the camera model can be divided into rotation around the U axis only and rotation around both the U axis and the R axis. In the mode of rotation around the U axis only, the touch position of the sliding operation on the terminal device can be obtained in real time, and the sliding direction of the user's sliding operation (i.e., the current sliding direction) can be determined from the touch position and the reference point position on the graphical user interface.
For example: the touch coordinates are (X2, Y2), and the reference point position is the center point coordinates (X3, Y3) of the view angle height control 121, so that the sine value of the angle can be determined according to the | (Y2-Y3) |/| (X2-X3) |, the angle can be determined according to the sine value, and the current sliding direction can be further determined. And determining the direction corresponding to the current sliding direction as the target direction of the visual angle, and controlling the camera module to rotate around the U axis until the lens direction of the camera module is overlapped with the target direction.
In some embodiments, a mapping relationship between sliding directions and viewing angle orientations may be set, so that after the current sliding direction is determined, the corresponding viewing angle orientation is determined as the target orientation according to the mapping relationship.
For example, if the touch coordinates are (X2, Y2) and the center point coordinates of the viewing angle height control 121 are (X3, Y3), the tangent of the angle can be determined from |Y2 - Y3| / |X2 - X3|, the angle can be determined from the tangent, and the current sliding direction can then be determined. A mapping relationship between each sliding direction and a viewing angle orientation can be established in advance, so that after the current sliding direction is determined, the target orientation of the viewing angle can be determined according to the mapping relationship.
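A minimal, self-contained sketch of the direction computation described above: the text derives the angle from the ratio |Y2 - Y3| / |X2 - X3|, and atan2 is used here as a compact equivalent that also resolves the quadrant. The coordinate values are made up for illustration.

```python
import math

def current_sliding_direction(touch, reference):
    """Direction of the slide, in degrees, from the reference point to the touch point."""
    dx = touch[0] - reference[0]
    dy = touch[1] - reference[1]
    return math.degrees(math.atan2(dy, dx))

# Example: touch at (X2, Y2) = (120, 80), reference point (X3, Y3) = (100, 60).
print(current_sliding_direction((120, 80), (100, 60)))  # 45.0
```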
In some embodiments, as shown in fig. 1h, fig. 1h is a schematic diagram of a fourth application scenario of the viewing angle adjustment method provided in the embodiments of the present application. When a sliding operation from the view angle height control 121 to the designated area 13 is detected, the display position of the view angle height control 121 on the graphical user interface is adjusted according to the touch position of the second operation.
The center point of the viewing angle height control 121 can be controlled to move along with the second operation, so that the viewing angle height control 121 follows the second operation and the user intuitively perceives the sliding operation.
In some embodiments, the step of determining the current sliding direction of the second operation based on the touch position and the position of the reference point includes:
(1) Determining a current sliding direction of the second operation and a current sliding distance of the second operation based on the touch position and the position of the reference point;
(2) The step of determining a target orientation of the viewing angle according to the current sliding direction and adjusting the orientation of the viewing angle according to the target orientation includes:
(3) And determining the target orientation of the visual angle according to the current sliding direction and the current sliding distance, and adjusting the orientation of the visual angle according to the target orientation.
For the mode in which the camera model is rotatable around both the U axis and the R axis, the touch position of the sliding operation on the terminal device can be obtained in real time, the sliding direction of the user's sliding operation (i.e., the current sliding direction) and the sliding distance (the current sliding distance) are determined from the touch position and the reference point position on the graphical user interface, and the viewing angle direction of the camera model is adjusted in real time based on the current sliding direction and the current sliding distance. Specifically, the touch position is the touch coordinate of the user's touch on the screen, and the reference point position may be the center point coordinate of the viewing angle height control 121.
Specifically, to allow the user to adjust the viewing angle precisely according to the viewing requirement, after the current sliding direction is determined, the distance between the touch coordinate and the center point coordinate, i.e., the current sliding distance, is calculated according to the Pythagorean theorem. The viewing angle rotation angle in the viewing angle rotation direction corresponding to the current sliding direction is then determined from the current sliding distance. The orientation obtained by rotating by the viewing angle rotation angle in the viewing angle rotation direction is determined as the target orientation.
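The distance part of this step is just the Pythagorean theorem; a minimal sketch, assuming the touch and reference points are given in the same screen-coordinate units:

```python
import math

def current_sliding_distance(touch, reference):
    """Straight-line distance between the touch coordinate and the center point coordinate."""
    return math.hypot(touch[0] - reference[0], touch[1] - reference[1])

print(current_sliding_distance((120, 80), (100, 60)))  # about 28.28
```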
In some embodiments, the step of determining the target orientation of the viewing angle according to the current sliding direction and the current sliding distance includes:
(1.1) determining the current sliding direction as a viewing angle rotation direction;
(1.2) acquiring a maximum angle at which the viewing angle is rotatable in the viewing angle rotation direction, and a maximum distance at which the second operation is slidable in the current sliding direction;
(1.3) determining a viewing angle rotation angle based on the current sliding distance, the maximum distance, and the maximum angle;
(1.4) determining a target orientation of the viewing angle based on the viewing angle rotation direction and the viewing angle rotation angle.
After the current sliding direction is determined, the lens of the camera model can be controlled to rotate towards the current sliding direction; specifically, the rotation can be performed around the U axis so that the lens of the camera model turns.
The elevation angle of the lens of the camera model in the viewing angle rotation direction can be determined from the current sliding distance as follows: the maximum angle through which the viewing angle can rotate in the viewing angle rotation direction is obtained, for example 90°, together with the maximum distance the second operation can slide in the current sliding direction, for example 6 cm, and the target orientation is determined based on the current sliding distance, the maximum distance, and the maximum angle.
Once the viewing angle rotation angle is determined, the lens of the camera model can be controlled to rotate by the viewing angle rotation angle; specifically, the rotation can be performed around the R axis so that the lens of the camera model is lifted.
In some embodiments, the step of determining the target orientation of the viewing angle according to the viewing angle rotation direction and the viewing angle rotation angle includes:
(1.1) determining a distance ratio of the current sliding distance to the maximum distance;
and (1.2) determining a visual angle rotation angle according to the distance ratio and the maximum angle, and determining a target orientation according to the current sliding direction and the visual angle rotation angle.
For example, if the maximum angle rotatable in the viewing angle rotation direction is 90°, the maximum distance the second operation can slide in the current sliding direction is 6 cm, and the current sliding distance is 2 cm, then the distance ratio of the current sliding distance to the maximum distance is 1/3 and the viewing angle rotation angle is 90° × 1/3 = 30°. The target orientation is therefore the orientation obtained by rotating by the viewing angle rotation angle, in the viewing angle rotation direction corresponding to the current sliding direction.
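A minimal sketch of this worked example; the 90° maximum angle and 6 cm maximum distance are the example values above, not fixed by the method.

```python
def view_rotation_angle(current_distance, max_distance, max_angle):
    # Rotation angle = maximum angle scaled by the distance ratio, clamped at the maximum.
    ratio = min(current_distance / max_distance, 1.0)
    return max_angle * ratio

print(view_rotation_angle(2.0, 6.0, 90.0))  # 30.0 degrees, matching the example
```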
In some embodiments, the method further comprises:
(1.1) if the sliding operation moving from the designated area to the visual angle height control is detected, acquiring the stay time of the sliding operation in the visual angle height control;
(1.2) comparing the residence time with a preset time to obtain a comparison result;
(1.3) if the comparison result shows that the stay time length is longer than the preset time length, adjusting the height of the visual angle vertical to the appointed plane;
(1.4) if the comparison result shows that the residence time is less than the preset time, judging whether the second operation is detected;
and (1.5) if the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface until the second operation cannot be detected.
When the lens lifting angle is too high and the user slides in the opposite direction to reduce it, the finger may pass over the viewing angle height control; without a safeguard, this would move the camera model instead of adjusting the lens lifting angle. To avoid this, a preset duration may be set, and how the viewing angle of the target object should be adjusted is determined by comparing the dwell time of the second operation on the viewing angle height control 121 with the preset duration.
Specifically, if the dwell time is longer than the preset duration, it can be determined that the user intends to adjust the height of the camera model perpendicular to the designated plane. If the dwell time is shorter than the preset duration, the intended adjustment cannot yet be determined, so it is checked whether a second operation moving from the viewing angle height control 121 to the designated area 13 is detected; if the second operation is detected, the process returns to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface, until the second operation can no longer be detected.
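A hedged sketch of this dwell-time decision; the threshold value and the way dwell time is measured are illustrative assumptions.

```python
def handle_return_to_height_control(dwell_seconds, preset_seconds=0.5):
    # Dwell longer than the preset duration: treat it as a height adjustment.
    if dwell_seconds > preset_seconds:
        return "adjust_height"
    # Otherwise keep checking whether the second operation (a slide back into
    # the designated area) is still detected and continue adjusting orientation.
    return "check_second_operation"

print(handle_return_to_height_control(0.8))  # adjust_height
print(handle_return_to_height_control(0.2))  # check_second_operation
```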
In some embodiments, the method further comprises:
if the second operation cannot be detected after the sliding operation moves from the designated area 13 to the view angle height control 121, the designated area 13 is not displayed.
If the second operation can no longer be detected, it is determined that the user has lifted the finger and is no longer touching the screen of the terminal device, so the state of each control in fig. 1c is restored, that is, the designated area 13 is hidden.
For the manner in which the user lifts the finger that is pressing the viewing angle height control and then presses the designated area for a certain period of time, in some embodiments, the step of adjusting the viewing angle direction of the target object according to the second operation, in response to the second operation acting on the designated area, includes:
(1.1) acquiring a pressing position, a pressing duration of the pressing operation in the designated area, and a reference point position in the graphical user interface;
(1.2) determining a viewing angle rotation direction based on the pressing position and the reference point;
(1.3) determining the visual angle rotation angle corresponding to the pressing duration according to a preset mapping relation;
(1.4) determining a target orientation of the viewing angle based on the viewing angle rotation direction and the viewing angle rotation angle.
The difference between the pressing operation and the sliding operation in determining the target viewing angle rotation is that in the pressing operation the viewing angle rotation angle is determined by the pressing duration. A mapping relationship between pressing duration and viewing angle rotation angle is determined in advance, so that once the user's pressing duration is obtained, the corresponding viewing angle rotation angle can be determined. For example, the viewing angle is rotated by 30° when the pressing duration is 1 s, and by 60° when the pressing duration is 2 s.
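A minimal sketch of this press-duration mapping; the linear 30 degrees-per-second rate comes from the example above, and the cap is an added assumption.

```python
def rotation_angle_from_press(duration_seconds, degrees_per_second=30.0, max_angle=90.0):
    # Linear mapping from pressing duration to viewing angle rotation angle.
    return min(duration_seconds * degrees_per_second, max_angle)

print(rotation_angle_from_press(1.0))  # 30.0
print(rotation_angle_from_press(2.0))  # 60.0
```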
As can be seen from the above, the embodiments of the present application provide a first viewing angle control area and a second viewing angle control area through the graphical user interface, where the second viewing angle control area includes a viewing angle height control; the projection position of the viewing angle on the designated plane is adjusted in response to a touch operation acting on the first viewing angle control area; the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation acting on the viewing angle height control; and the orientation of the viewing angle is adjusted in response to a second operation that starts at the viewing angle height control and acts on the designated area. In this way, by performing different operations on the viewing angle height control, several different viewing angle adjustment functions (adjusting the viewing angle height and the viewing angle orientation) are realized, the complexity of the viewing angle adjustment operation is reduced, and the efficiency of viewing angle adjustment is improved.
The methods described in connection with the above embodiments are described in further detail below by way of example.
Referring to fig. 2, fig. 2 is another flow chart of the viewing angle adjustment method according to the embodiment of the present application. This embodiment takes the case where the second operation is a sliding operation as an example, and the specific flow of the method may be as follows:
in step 201, the graphical user interface provides a first viewing angle control area and a second viewing angle control area, wherein the second viewing angle control area comprises a viewing angle height control.
The graphical user interface displays a virtual environment in a game scene. The virtual environment is the environment provided when the application program runs on the terminal, and may be a simulation of the real world, a semi-simulated and semi-fictitious environment, or a purely fictitious environment. The target object is a camera model; for example, in an FPS game the camera model is positioned at the head or neck of the virtual character in the first-person viewing angle and behind the virtual character in the third-person viewing angle. The partial game scene is a picture generated by observing the virtual environment through the camera model in a certain viewing angle direction.
Specifically, the second viewing angle control area 12 includes a viewing angle height control 121; as the name implies, the viewing angle height control 121 is used to adjust the height of the viewing angle. Referring to fig. 1c and fig. 1d, fig. 1c is a schematic diagram of a first application scenario of the viewing angle adjustment method provided in the embodiment of the present application, and fig. 1d is a schematic diagram of a world coordinate system in a virtual environment provided in the embodiment of the present application. Taking a user playing a game from an observer (OB) viewing angle as an example, the virtual environment has a world coordinate system constructed with an X axis, a Y axis, and a Z axis, and the camera model located in the virtual environment therefore has corresponding coordinates (X1, Y1, Z1).
In step 202, the target object is adjusted to move in the horizontal direction on the designated plane in response to the touch operation applied to the first viewing angle control region.
The projection position of the viewing angle on the designated plane is the projection position of the camera model on the designated plane; the designated plane here is the plane 20 constructed by the X axis and the Y axis in the virtual environment. That is, the first viewing angle control area 11 is used to control the movement of the camera model forwards, backwards, leftwards, and rightwards in the virtual scene.
Referring to fig. 1c and fig. 1d, fig. 1c is a schematic diagram of a first application scenario of the viewing angle adjustment method provided in the embodiment of the present application, and fig. 1d is a schematic diagram of a world coordinate system in a virtual environment provided in the embodiment of the present application. Taking a user playing a game from an observer (OB) viewing angle as an example, the virtual environment has a world coordinate system constructed with an X axis, a Y axis, and a Z axis, and the camera model located in the virtual environment therefore has corresponding coordinates (X1, Y1, Z1). The designated plane here is the plane constructed by the X axis and the Y axis.
Specifically, a wheel disc control 111 is disposed in the first viewing angle control area 11, and the user moves the rocker in the wheel disc control 111 forwards, backwards, leftwards, and rightwards to move the viewing angle in the corresponding direction. The target object is a camera model; controlling the projection position of the viewing angle on the designated plane means controlling the movement of the target object in the horizontal direction (forwards, backwards, leftwards, rightwards) on the designated plane.
In step 203, the height of the target object perpendicular to the specified plane is adjusted in response to a first operation acting on the perspective height control.
The first operation is a pressing operation on the viewing angle height control 121. The viewing angle height control 121 adjusts the position of the camera model on the Z axis according to the duration of the pressing operation. Since the Z axis is divided into a positive half axis and a negative half axis, the viewing angle height control 121 may correspondingly include a first sub viewing angle height control 1211 for adjusting toward the positive half of the Z axis and a second sub viewing angle height control 1212 for adjusting toward the negative half of the Z axis. The height perpendicular to the designated plane is the distance between the camera model and the plane 20 constructed by the X axis and the Y axis.
Specifically, the target object is a camera model; adjusting the height of the viewing angle perpendicular to the designated plane means controlling the target object to move along the positive or negative half axis of the Z axis relative to the designated plane.
In step 204, the touch location of the second operation within the designated area and the reference point location within the graphical user interface are obtained.
For example, if the touch coordinates are (X2, Y2) and the center point coordinates of the viewing angle height control 121 are (X3, Y3), the tangent of the angle can be determined from |Y2 - Y3| / |X2 - X3|, the angle can be determined from the tangent, and the current sliding direction can then be determined. A mapping relationship between each sliding direction and a viewing angle orientation can be established in advance, so that after the current sliding direction is determined, the target orientation of the viewing angle can be determined according to the mapping relationship.
The center point position of the viewing angle height control 121 can be controlled to move along with the second operation, so that the viewing angle height control 121 follows the second operation and the user can intuitively perceive the sliding operation.
In step 205, a current sliding direction of the second operation and a current sliding distance of the second operation are determined based on the touch position and the position of the reference point.
In order to enable the user to make accurate adjustments according to viewing requirements, after the current sliding direction is determined, the distance between the touch coordinates and the center point coordinates is calculated according to the Pythagorean theorem; this distance is the current sliding distance. The viewing angle rotation angle in the viewing angle rotation direction corresponding to the current sliding direction is then determined according to the current sliding distance, and the orientation reached by rotating the viewing angle by the viewing angle rotation angle in the viewing angle rotation direction is determined as the target orientation.
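The current sliding distance described here is simply the straight-line distance between the touch point and the reference point; a one-function sketch using math.hypot:

```python
import math

# Current sliding distance = straight-line distance between the touch point and
# the reference (center) point, i.e. the Pythagorean theorem.
def sliding_distance(x2, y2, x3, y3):
    return math.hypot(x2 - x3, y2 - y3)

print(sliding_distance(120, 300, 100, 260))  # sqrt(20**2 + 40**2) ≈ 44.7
```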
In step 206, the current sliding direction is determined as the viewing angle rotation direction.
After the current sliding direction is determined, the lens of the camera model can be controlled to rotate towards the current sliding direction; the rotation may specifically be performed around the U axis so that the lens of the camera model rotates.
The elevation angle of the lens of the camera model in the viewing angle rotation direction can be determined from the current sliding distance as follows: a maximum angle rotatable in the viewing angle rotation direction is obtained, for example 90°, and a maximum distance slidable in the current sliding direction by the second operation is obtained, for example 6 cm, and the target orientation is determined based on the current sliding distance, the maximum distance and the maximum angle. When the viewing angle rotation angle is determined, the lens of the camera model can be controlled to rotate by the viewing angle rotation angle; the rotation may specifically be performed around the R axis so that the lens of the camera model is tilted upwards.
In step 207, the maximum angle by which the angle of view is rotatable in the direction of rotation of the angle of view is obtained, and the maximum distance by which the second operation is slidable in the current sliding direction is obtained.
For example, if the maximum angle rotatable in the viewing angle rotation direction is 90°, the maximum distance slidable in the current sliding direction by the sliding operation is 6 cm, and the current sliding distance is 2 cm, the distance ratio of the current sliding distance to the maximum distance is 1/3.
In step 208, the viewing angle rotation angle is determined based on the current sliding distance, the maximum distance, and the maximum angle.
For example, if the maximum angle is 90° and the distance ratio is 1/3, the viewing angle rotation angle is 90° × 1/3 = 30°.
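A minimal sketch of this proportional mapping, reproducing the 2 cm / 6 cm / 90° worked example; clamping the sliding distance to the maximum distance is an added assumption, not stated in the text.

```python
# Illustrative sketch: viewing-angle rotation angle from the current sliding
# distance, the maximum slidable distance, and the maximum rotatable angle.

def rotation_angle(current_distance, max_distance, max_angle):
    ratio = min(current_distance, max_distance) / max_distance  # clamp is an assumption
    return max_angle * ratio

print(rotation_angle(2.0, 6.0, 90.0))  # 30.0 degrees, matching the worked example
```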
In step 209, a target orientation of the viewing angle is determined based on the viewing angle rotation direction and the viewing angle rotation angle.
When the viewing angle rotation angle is determined, the lens of the camera model can be controlled to rotate by the viewing angle rotation angle; the rotation may specifically be performed around the R axis so that the lens of the camera model is tilted upwards.
In step 210, if a sliding operation moving from the designated area to the view angle height control is detected, a stay time of the sliding operation within the view angle height control is obtained.
When the lens elevation angle is high, a user performing a sliding operation in the opposite direction may pass through the viewing angle height control; if that were treated as the first operation, the camera model would be moved instead of the lens elevation angle being adjusted. To avoid this, a preset duration may be set, and how the viewing angle of the target object is to be adjusted can be further determined by comparing the stay duration of the second operation within the viewing angle height control 121 with the preset duration. Therefore, in this step, the stay duration of the sliding operation within the viewing angle height control is acquired, for example, 4 s.
In step 211, the residence time is compared with a preset time to obtain a comparison result.
For example, the preset time length is 5s, the stay time length is 4s, and the comparison result is that the stay time length is smaller than the preset time length; if the residence time is 6s, the comparison result is that the residence time is longer than the preset time.
In step 212, if the comparison result is that the stay time length is longer than the preset time length, the height of the viewing angle perpendicular to the designated plane is adjusted.
If the stay duration is longer than the preset duration, it can be determined that the user currently wants to adjust the vertical height of the camera model relative to the designated plane.
In step 213, if the comparison result is that the residence time is less than the preset time, it is determined whether the second operation is detected.
If the stay duration is less than the preset duration, it cannot yet be determined which adjustment the user currently intends to perform, so it is further determined whether a sliding operation moving from the viewing angle height control to the designated area, that is, the second operation, is detected.
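A hedged sketch of the decision logic in steps 210 to 213: if the slide stays in the viewing angle height control longer than the preset duration, the height is adjusted; otherwise the flow keeps looking for the second (orientation) operation. The threshold value, function name, and return labels are illustrative, not the patent's API.

```python
# Illustrative sketch of the stay-duration decision (steps 210-213).
# Threshold, function names and return labels are assumptions for this sketch.

PRESET_DURATION = 5.0  # seconds, matching the 5 s example in the text

def decide_adjustment(stay_duration, second_operation_detected):
    """Decide what to do after the slide moves back into the height control."""
    if stay_duration > PRESET_DURATION:
        return "adjust_height"               # step 212: adjust height perpendicular to plane
    if second_operation_detected:
        return "keep_adjusting_orientation"  # step 214: re-acquire touch/reference positions
    return "wait"                            # step 213: user intent not yet clear

print(decide_adjustment(6.0, False))  # adjust_height
print(decide_adjustment(4.0, True))   # keep_adjusting_orientation
```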
In step 214, if the second operation is detected, the flow returns to the step of obtaining the touch position of the second operation in the designated area and the reference point position in the graphical user interface, until the second operation can no longer be detected.
And if the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface until the second operation cannot be detected.
As can be seen from the above, the embodiment of the present application provides a first viewing angle control area and a second viewing angle control area through the graphical user interface, where the second viewing angle control area includes a viewing angle height control; the projection position of the viewing angle on the designated plane is adjusted in response to the touch operation acting on the first viewing angle control area; the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation acting on the viewing angle height control; and the orientation of the viewing angle is adjusted in response to a second operation initiated at the viewing angle height control and acting on the designated area. Therefore, by performing different operations on the viewing angle height control, multiple viewing angle adjustment functions (adjusting the viewing angle height and the viewing angle orientation) are realized, the complexity of the viewing angle adjustment operation is reduced, and the viewing angle adjustment efficiency is improved.
In order to facilitate better implementation of the viewing angle adjustment method provided by the embodiment of the present application, the embodiment of the present application further provides a device based on the above viewing angle adjustment method. Wherein the meaning of the terms is the same as that of the above-mentioned viewing angle adjusting method, and specific implementation details can be referred to the description of the method embodiment.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a viewing angle adjusting device according to an embodiment of the present application, where the viewing angle adjusting device may include a display module 301, a first adjustment module 302, a second adjustment module 303, a third adjustment module 304, and the like.
The display module 301 is configured to provide a first viewing angle control area and a second viewing angle control area for the graphical user interface, where the second viewing angle control area includes a viewing angle height control.
The first adjustment module 302 is configured to adjust a projection position of the viewing angle on the designated plane in response to a touch operation applied to the first viewing angle control region.
A second adjustment module 303, configured to adjust a height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control.
The third adjustment module 304 is configured to adjust an orientation of the viewing angle in response to a second operation initiated by the viewing angle height control and acting on the designated area.
In some embodiments, the second operation is a sliding operation from the height control to the designated area;
the third adjustment module 304 includes:
the first acquisition sub-module is used for acquiring the touch position of the sliding operation in the designated area and the reference point position in the graphical user interface;
the first determining submodule is used for determining the current sliding direction of the second operation based on the touch position and the reference point position;
and the first adjusting sub-module is used for determining the target orientation of the visual angle according to the current sliding direction and adjusting the orientation of the visual angle according to the target orientation.
In some embodiments, the first determining sub-module comprises:
a determining unit, configured to determine a current sliding direction of the second operation and a current sliding distance of the second operation based on the touch position and the position of the reference point;
the first adjustment sub-module includes:
and the adjusting unit is used for determining the target orientation of the visual angle according to the current sliding direction and the current sliding distance and adjusting the orientation of the visual angle according to the target orientation.
In some embodiments, the adjusting unit is configured to:
determining the current sliding direction as a viewing angle rotation direction;
acquiring a maximum angle at which the viewing angle is rotatable in the viewing angle rotation direction, and a maximum distance at which the second operation is slidable in the current sliding direction;
determining a viewing angle rotation angle based on the current sliding distance, the maximum distance, and the maximum angle;
and determining the target orientation of the viewing angle according to the viewing angle rotation direction and the viewing angle rotation angle.
In some embodiments, the adjusting unit is further configured to:
if the sliding operation from the designated area to the visual angle height control is detected, acquiring the stay time of the sliding operation in the visual angle height control;
comparing the residence time with a preset time to obtain a comparison result;
if the comparison result shows that the stay time is longer than the preset time, adjusting the height of the visual angle vertical to the appointed plane;
if the comparison result shows that the residence time is smaller than the preset time, judging whether the second operation is detected;
and if the second operation is detected, returning to the step of acquiring the touch position of the sliding operation in the designated area and the reference point position in the graphical user interface until the second operation cannot be detected.
In some embodiments, the adjusting unit is further configured to:
if the second operation cannot be detected after the sliding operation moves from the designated area to the viewing angle height control, the designated area is hidden.
In some embodiments, the content displayed by the graphical user interface is an environment picture when the target object observes the virtual environment in a preset viewing angle direction;
the first adjustment module 302 includes:
the second adjusting sub-module is used for responding to the touch operation acted on the first visual angle control area and adjusting the target object to move horizontally on the appointed plane;
a second adjustment module comprising:
and a third adjustment sub-module for adjusting the height of the target object perpendicular to the designated plane in response to the first operation acting on the second viewing angle control region.
As can be seen from the foregoing, the display module 301 provides a first viewing angle control area and a second viewing angle control area through the graphical user interface, wherein the second viewing angle control area includes the viewing angle height control. The first adjustment module 302 adjusts the projection position of the viewing angle on the designated plane in response to the touch operation applied to the first viewing angle control region. The second adjustment module 303 adjusts the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control. The third adjustment module 304 adjusts the orientation of the viewing angle in response to a second operation initiated at the viewing angle height control and acting on the designated area. Therefore, by carrying out different operations on the visual angle height control, various different visual angle adjusting functions (adjusting the visual angle height and the visual angle direction) are realized, the complexity of visual angle adjusting operation is reduced, and the visual angle adjusting efficiency is improved.
Correspondingly, the embodiment of the application also provides a computer device, which may be a terminal or a server, where the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC, Personal Computer), or a personal digital assistant (PDA, Personal Digital Assistant). Referring to fig. 4, fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. It will be appreciated by those skilled in the art that the computer device structure shown in the figure is not limiting of the computer device, and may include more or fewer components than shown, or combine certain components, or have a different arrangement of components.
Processor 401 is a control center of computer device 400 and connects the various portions of the entire computer device 400 using various interfaces and lines to perform various functions of computer device 400 and process data by running or loading software programs and/or modules stored in memory 402 and invoking data stored in memory 402, thereby performing overall monitoring of computer device 400.
In the embodiment of the present application, the processor 401 in the computer device 400 loads the instructions corresponding to the processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 executes the application programs stored in the memory 402, so as to implement various functions:
the graphical user interface provides a first visual angle control area and a second visual angle control area, wherein the second visual angle control area comprises a visual angle height control; responding to the touch operation acted on the first visual angle control area, and adjusting the projection position of the visual angle on the appointed plane; adjusting a height of the viewing angle perpendicular to the designated plane in response to a first operation acting on the viewing angle height control; the orientation of the viewing angle is adjusted in response to a second operation initiated at the viewing angle height control and acting on the designated area.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 4, the computer device 400 further includes: a touch display 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407, respectively. Those skilled in the art will appreciate that the computer device structure shown in FIG. 4 is not limiting of the computer device and may include more or fewer components than shown, or may be combined with certain components, or a different arrangement of components.
The touch display 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by the user or provided to the user, as well as various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), an organic light-emitting diode (OLED, Organic Light-Emitting Diode), or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel using any suitable object or accessory such as a finger or a stylus) and to generate corresponding operation instructions, which in turn execute corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the touch point coordinates to the processor 401, and can receive and execute commands sent from the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the touch operation is passed to the processor 401 to determine the type of touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display 403 may also implement an input function as part of the input unit 406.
In the embodiment of the application, the processor 401 executes the game application program to generate a graphical user interface on the touch display screen 403, where the virtual scene on the graphical user interface includes at least one functional control or wheel control. The touch display 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuitry 404 may be used to transceive radio frequency signals to establish wireless communications with a network device or other computer device via wireless communications.
The audio circuitry 405 may be used to provide an audio interface between the user and the computer device through a speaker, a microphone, and the like. The audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, where it is converted into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 405 and converted into audio data; the audio data is then output to the processor 401 for processing and sent via the radio frequency circuit 404 to, for example, another computer device, or the audio data is output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral earphone and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Alternatively, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. The power supply 407 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., and will not be described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in the present embodiment, the graphical user interface provides a first viewing angle control area and a second viewing angle control area, wherein the second viewing angle control area includes a viewing angle height control; responding to the touch operation acted on the first visual angle control area, and adjusting the projection position of the visual angle on the appointed plane; adjusting a height of the viewing angle perpendicular to the designated plane in response to a first operation acting on the viewing angle height control; the orientation of the viewing angle is adjusted in response to a second operation initiated at the viewing angle height control and acting on the designated area. Therefore, by carrying out different operations on the visual angle height control, various different visual angle adjusting functions (adjusting the visual angle height and the visual angle direction) are realized, the complexity of visual angle adjusting operation is reduced, and the visual angle adjusting efficiency is improved.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer readable storage medium having stored therein a plurality of computer programs that can be loaded by a processor to perform steps in any of the viewing angle adjustment methods provided by embodiments of the present application. For example, the computer program may perform the steps of:
the graphical user interface provides a first visual angle control area and a second visual angle control area, wherein the second visual angle control area comprises a visual angle height control; responding to the touch operation acted on the first visual angle control area, and adjusting the projection position of the visual angle on the appointed plane; adjusting a height of the viewing angle perpendicular to the designated plane in response to a first operation acting on the viewing angle height control; the orientation of the viewing angle is adjusted in response to a second operation initiated at the viewing angle height control and acting on the designated area.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Wherein the storage medium may include: read Only Memory (ROM), random access Memory (RAM, random Access Memory), magnetic or optical disk, and the like.
The steps of any one of the viewing angle adjustment methods provided in the embodiments of the present application may be executed by the computer program stored in the storage medium, so that the beneficial effects that any one of the viewing angle adjustment methods provided in the embodiments of the present application may be achieved, which are detailed in the previous embodiments and are not repeated herein.
The foregoing describes in detail a method, apparatus, storage medium and computer device for adjusting a viewing angle provided in the embodiments of the present application, and specific examples are applied to illustrate the principles and embodiments of the present application, where the foregoing examples are only used to help understand the method and core idea of the present application; meanwhile, those skilled in the art will have variations in the specific embodiments and application scope in light of the ideas of the present application, and the present description should not be construed as limiting the present application in view of the above.

Claims (10)

1. A viewing angle adjustment method, wherein a graphical user interface is provided through a display component of a terminal, the graphical user interface displaying content including at least a portion of a game scene, the method comprising:
the graphical user interface provides a first visual angle control area and a second visual angle control area, wherein the second visual angle control area comprises a visual angle height control;
responding to touch operation acted on the first visual angle control area, and adjusting the projection position of the visual angle on a designated plane, wherein the designated plane is a plane constructed by an X axis and a Y axis in a virtual environment;
adjusting a height of the viewing angle perpendicular to the designated plane in response to a first operation acting on the viewing angle height control;
acquiring a touch position of a second operation in a designated area and a reference point position in the graphical user interface, wherein the second operation is a sliding operation of sliding from the visual angle height control to the designated area, and the designated area is an entire area or a partial area of the graphical user interface except the first visual angle control area and the second visual angle control area;
determining a current sliding direction of a second operation based on the touch position and the reference point position;
determining the target orientation of the visual angle according to the current sliding direction, and adjusting the orientation of the visual angle according to the target orientation;
if the sliding operation moving from the appointed area to the visual angle height control is detected, acquiring the stay time of the sliding operation in the visual angle height control;
comparing the residence time with a preset time to obtain a comparison result;
if the comparison result shows that the stay time length is longer than the preset time length, adjusting the height of the visual angle perpendicular to the appointed plane;
if the comparison result shows that the residence time is smaller than the preset time, judging whether the second operation is detected or not;
and if the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface.
2. The viewing angle adjustment method according to claim 1, wherein the step of determining the current sliding direction of the second operation based on the touch position and the position of the reference point comprises:
determining a current sliding direction of the second operation and a current sliding distance of the second operation based on the touch position and the position of the reference point;
the step of determining the target orientation of the visual angle according to the current sliding direction and adjusting the orientation of the visual angle according to the target orientation comprises the following steps:
and determining the target orientation of the visual angle according to the current sliding direction and the current sliding distance, and adjusting the orientation of the visual angle according to the target orientation.
3. The viewing angle adjustment method according to claim 2, wherein the step of determining the target orientation of the viewing angle according to the current sliding direction and the current sliding distance comprises:
determining the current sliding direction as a viewing angle rotation direction;
acquiring a maximum angle at which the viewing angle is rotatable in the viewing angle rotation direction, and a maximum distance at which a second operation is slidable in the current sliding direction;
determining a viewing angle rotation angle based on the current sliding distance, the maximum distance, and the maximum angle;
and determining the target orientation of the visual angle according to the visual angle rotation direction and the visual angle rotation angle.
4. The viewing angle adjustment method of claim 1, wherein the method further comprises:
And if the second operation cannot be detected after the sliding operation slides from the designated area to the visual angle height control, hiding the designated area.
5. The viewing angle adjusting method according to claim 1, wherein an inner periphery of the specified region is adjacent to an outer periphery of the second viewing angle control region, and the outer periphery of the specified region is circular or polygonal.
6. The viewing angle adjustment method according to claim 1, wherein the content displayed by the graphical user interface is an environment picture when the target object observes the virtual environment in a preset viewing angle direction;
the step of adjusting the projection position of the viewing angle on the designated plane in response to the touch operation applied to the first viewing angle control region includes:
responding to the touch operation acted on the first visual angle control area, and adjusting the target object to move horizontally on the appointed plane;
the step of adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation of the viewing angle height control, comprising:
and adjusting the height of the target object perpendicular to the designated plane in response to a first operation acting on the visual angle height control.
7. The viewing angle adjustment method of claim 1, wherein the first operation is continuous with the second operation.
8. A viewing angle adjustment apparatus for providing a graphical user interface through a display component of a terminal, the graphical user interface displaying content including at least a portion of a game scene, the apparatus comprising:
the display module is used for providing a first visual angle control area and a second visual angle control area for the graphical user interface, wherein the second visual angle control area comprises a visual angle height control;
the first adjusting module is used for responding to the touch operation acted on the first visual angle control area and adjusting the projection position of the visual angle on a designated plane, wherein the designated plane is a plane constructed by an X axis and a Y axis in the virtual environment;
the second adjusting module is used for responding to the first operation acted on the visual angle height control, and adjusting the height of the visual angle perpendicular to the appointed plane;
a third adjustment module comprising:
the first acquisition sub-module is used for acquiring a touch position of a second operation in a designated area and a reference point position in the graphical user interface, wherein the second operation is a sliding operation of sliding from the visual angle height control to the designated area, and the designated area is an entire area or a partial area of the graphical user interface except the first visual angle control area and the second visual angle control area;
The first determining submodule is used for determining the current sliding direction of the second operation based on the touch position and the reference point position;
the first adjusting submodule is used for determining the target orientation of the visual angle according to the current sliding direction and adjusting the orientation of the visual angle according to the target orientation;
if the sliding operation of moving from the appointed area to the visual angle height control is detected, acquiring the stay time of the second operation in the visual angle height control;
comparing the residence time with a preset time to obtain a comparison result;
if the comparison result shows that the stay time length is longer than the preset time length, adjusting the height of the visual angle perpendicular to the appointed plane;
if the comparison result shows that the residence time is smaller than the preset time, judging whether the second operation is detected or not;
and if the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface.
9. A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the viewing angle adjustment method of any of claims 1 to 7.
10. A computer device, characterized in that it comprises a memory in which a computer program is stored and a processor that performs the steps in the viewing angle adjustment method according to any one of claims 1 to 7 by calling the computer program stored in the memory.
CN202011252862.5A 2020-11-11 2020-11-11 Viewing angle adjusting method and device, storage medium and computer equipment Active CN112245914B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011252862.5A CN112245914B (en) 2020-11-11 2020-11-11 Viewing angle adjusting method and device, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011252862.5A CN112245914B (en) 2020-11-11 2020-11-11 Viewing angle adjusting method and device, storage medium and computer equipment

Publications (2)

Publication Number Publication Date
CN112245914A CN112245914A (en) 2021-01-22
CN112245914B true CN112245914B (en) 2024-03-12

Family

ID=74265332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011252862.5A Active CN112245914B (en) 2020-11-11 2020-11-11 Viewing angle adjusting method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN112245914B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570693A (en) * 2021-07-26 2021-10-29 北京达佳互联信息技术有限公司 Method, device and equipment for changing visual angle of three-dimensional model and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007044320A (en) * 2005-08-11 2007-02-22 Taito Corp Video game machine
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture
CN108920084A (en) * 2018-06-29 2018-11-30 网易(杭州)网络有限公司 Visual field control method and device in a kind of game
CN110141855A (en) * 2019-05-24 2019-08-20 网易(杭州)网络有限公司 Method of controlling viewing angle, device, storage medium and electronic equipment
CN110665226A (en) * 2019-10-09 2020-01-10 网易(杭州)网络有限公司 Method, device and storage medium for controlling virtual object in game
CN111603758A (en) * 2020-05-28 2020-09-01 网易(杭州)网络有限公司 Visual angle adjusting method for game role, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3762750B2 (en) * 2003-01-07 2006-04-05 コナミ株式会社 Image display control program

Also Published As

Publication number Publication date
CN112245914A (en) 2021-01-22

Similar Documents

Publication Publication Date Title
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN113082707B (en) Virtual object prompting method and device, storage medium and computer equipment
CN113398590A (en) Sound processing method, sound processing device, computer equipment and storage medium
CN115040873A (en) Game grouping processing method and device, computer equipment and storage medium
CN113398566A (en) Game display control method and device, storage medium and computer equipment
CN112245914B (en) Viewing angle adjusting method and device, storage medium and computer equipment
WO2024045528A1 (en) Game control method and apparatus, and computer device and storage medium
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
CN116474367A (en) Virtual lens control method and device, storage medium and computer equipment
CN116999835A (en) Game control method, game control device, computer equipment and storage medium
CN116999825A (en) Game control method, game control device, computer equipment and storage medium
CN115430150A (en) Game skill release method and device, computer equipment and storage medium
CN116115991A (en) Aiming method, aiming device, computer equipment and storage medium
CN116920384A (en) Information display method and device in game, computer equipment and storage medium
CN117482516A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN115970282A (en) Virtual lens control method and device, storage medium and computer equipment
CN116966544A (en) Region prompting method, device, storage medium and computer equipment
CN117504278A (en) Interaction method, interaction device, computer equipment and computer readable storage medium
CN116139484A (en) Game function control method, game function control device, storage medium and computer equipment
CN116920390A (en) Control method and device of virtual weapon, computer equipment and storage medium
CN115518375A (en) Game word skipping display method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant