CN111589112B - Interface display method, device, terminal and storage medium - Google Patents


Info

Publication number
CN111589112B
CN111589112B (application CN202010332228.6A)
Authority
CN
China
Prior art keywords
target
touch point
virtual
virtual rocker
rocker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010332228.6A
Other languages
Chinese (zh)
Other versions
CN111589112A (en)
Inventor
胡勋
万钰林
魏嘉城
粟山东
张勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010332228.6A priority Critical patent/CN111589112B/en
Publication of CN111589112A publication Critical patent/CN111589112A/en
Application granted granted Critical
Publication of CN111589112B publication Critical patent/CN111589112B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Abstract

Embodiments of the present application provide an interface display method, apparatus, terminal, and storage medium, relating to the technical field of application program development. The method comprises: displaying a user interface that comprises a virtual joystick; receiving a first operation signal acting on the virtual joystick; in response to the first operation signal moving from a start touch point to a target touch point, moving the virtual joystick from an initial position to a target position, the target touch point being a touch point that triggers display of an operation prompt element; and displaying the operation prompt element in the user interface according to the positional relationship between the center position of the virtual joystick at the target position and the real-time touch point of the first operation signal, the operation prompt element being used to indicate a candidate operation execution region. This technical solution improves the convenience of performing an aiming operation with the virtual joystick.

Description

Interface display method, device, terminal and storage medium
Technical Field
The embodiment of the application relates to the technical field of application program development, in particular to an interface display method, an interface display device, a terminal and a storage medium.
Background
In the game interface of a MOBA (Multiplayer Online Battle Arena) mobile game, a virtual joystick is often displayed. The virtual joystick includes a first region for controlling a virtual object to quickly release a skill: when the user's finger presses within the first region and does not move out of it, the user cannot aim the direction and position at which the virtual object's skill will be released.
In the related art, when the user's finger slides from inside the first region to outside it, the skill to be released by the virtual object is aimed at a position farther from the virtual object. If the user instead wants to aim the skill at a position closer to the virtual object, the finger must first slide from the first region to outside it and then slide back in the reverse direction into the first region.
In the above technique, the user's finger must first slide out of the first region and then slide back into it in the reverse direction before the virtual object's skill can be aimed at a position close to the virtual object, which is inconvenient to operate.
Disclosure of Invention
The embodiment of the application provides an interface display method, an interface display device, a terminal and a storage medium, and can improve convenience when a virtual rocker is adopted to execute aiming operation. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides an interface display method, where the method includes:
displaying a user interface, wherein the user interface comprises a virtual rocker;
receiving a first operation signal acting on the virtual rocker;
moving the virtual rocker from an initial position to a target position in response to the first operation signal moving from a start touch point to a target touch point; wherein the distance between the center position of the virtual rocker at the target position and the target touch point is smaller than the distance between the center position of the virtual rocker at the initial position and the target touch point; the target touch point is a touch point for triggering and displaying an operation prompt element;
and displaying the operation prompting element in the user interface according to the position relation between the central position of the virtual rocker at the target position and the real-time touch point of the first operation signal, wherein the operation prompting element is used for prompting a candidate operation execution area.
In another aspect, an embodiment of the present application provides an interface display apparatus, where the apparatus includes:
the interface display module is used for displaying a user interface, and the user interface comprises a virtual rocker;
the signal receiving module is used for receiving a first operating signal acting on the virtual rocker;
the rocker moving module is used for responding to the first operation signal to move from the initial touch point to the target touch point and moving the virtual rocker from the initial position to the target position; wherein the distance between the center position of the virtual rocker at the target position and the target touch point is smaller than the distance between the center position of the virtual rocker at the initial position and the target touch point; the target touch point is a touch point for triggering and displaying an operation prompt element;
and the element display module is used for displaying the operation prompt element in the user interface according to the position relation between the central position of the virtual rocker at the target position and the real-time touch point of the first operation signal, wherein the operation prompt element is used for prompting a candidate operation execution area.
In another aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the interface display method.
In yet another aspect, an embodiment of the present application provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the above interface display method.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
By displaying a virtual joystick in the user interface and receiving a first operation signal acting on it, the virtual joystick is moved when the first operation signal moves from the start touch point to the target touch point, so that the center position of the moved virtual joystick is closer to the target touch point. As a result, the initial display position of the operation prompt element in the user interface is closer to the center of its displayable region range. When an aiming operation is performed with the virtual joystick, this reduces the probability that the first operation signal must be moved in the reverse direction, improving the convenience of the virtual joystick during aiming operations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; other drawings can be derived from them by a person of ordinary skill in the art without creative effort.
FIG. 1 is a flow chart of an interface display method provided by an embodiment of the present application;
FIG. 2 is a schematic view of a virtual rocker provided in one embodiment of the present application;
FIG. 3 is a schematic diagram of a location of a target touch point provided by one embodiment of the present application;
FIG. 4 is a flow chart of a method for displaying an interface provided by another embodiment of the present application;
FIG. 5 is a schematic illustration of a user interface provided by one embodiment of the present application;
FIG. 6 is a schematic diagram of virtual rocker movement provided by one embodiment of the present application;
FIG. 7 is a flow chart of a method for displaying an interface provided by another embodiment of the present application;
FIG. 8 is a block diagram of an interface display apparatus provided in one embodiment of the present application;
FIG. 9 is a block diagram of an interface display apparatus provided in another embodiment of the present application;
fig. 10 is a block diagram of a terminal provided in an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of methods consistent with aspects of the present application, as detailed in the appended claims.
Embodiments of the present application provide a terminal: an electronic device with data computation, processing, and storage capabilities in which a target application program runs. The terminal may be a smartphone, a tablet, a PC (Personal Computer), a wearable device, or the like. Optionally, the terminal is a mobile terminal device with a touch display screen through which the user performs human-computer interaction. The target application may be any application in which an operation prompt element can be controlled through touch operation, such as a game application, a social application, a payment application, a video application, a music application, a shopping application, or a news application. In the method of the embodiments of the present application, the execution subject of each step may be the terminal, for example the target application program running in the terminal.
The technical solution of the present application will be described below by means of several embodiments.
Referring to fig. 1, a flowchart of an interface display method according to an embodiment of the present application is shown. The method comprises the following steps (101-104):
step 101, displaying a user interface, wherein the user interface comprises a virtual rocker.
The virtual joystick may be a virtual control displayed in the user interface, and the user interface may be an interface in the target application described above. The user interface may be displayed on the display panel of a terminal running the target application, mirrored from that terminal to another display panel via screen projection, displayed through projection, or displayed through technologies such as AR (Augmented Reality) and VR (Virtual Reality), which is not limited in the embodiments of the present application.
As shown in fig. 2, the virtual stick 20 may include a first region 21 and a second region 22 located at the periphery of the first region 21. The first region 21 may be circular, the second region 22 may be a concentric ring of the circular first region 21, and the boundary between the first region 21 and the second region 22 is the outer edge of the first region 21 (or the inner circle of the second region 22). In some examples, the first region 21 is an ellipse, and the second region 22 may be an elliptical ring; in some examples, the second region 22 is annular, and the shape of the inner edge and the shape of the outer edge of the second region 22 may not be the same; in other examples, the first region 21 and the second region 22 may also have other shapes, which is not limited by the embodiment of the present application.
A virtual object may also be included in the user interface, and a virtual rocker may be used to control the virtual object, such as to control the position of a skill released by the virtual object (in which case the virtual rocker may also be referred to as a skill control for controlling the virtual object to release the skill). The virtual object refers to a virtual role controlled by the user account in the application program. Taking an application as a game application as an example, the virtual object refers to a game character controlled by a user account in the game application. The virtual object may be in the form of a character, an animal, a cartoon or other forms, which is not limited in this application. The virtual object may be displayed in a three-dimensional form or a two-dimensional form, which is not limited in the embodiment of the present application. Alternatively, when the virtual environment is a three-dimensional virtual environment, the virtual object may be a three-dimensional volumetric model created based on animated skeletal techniques. The virtual object has its own shape and volume in the three-dimensional virtual environment, occupying a portion of the space in the three-dimensional virtual environment.
In some embodiments, the first area is used to control the virtual object to quickly release a skill. A skill may refer to an ability of the virtual object, such as reducing or increasing an attribute value of another virtual object (e.g., a health value, a defense value, a skill value), controlling the state of another virtual object (e.g., stunning it or changing its position), or changing the virtual environment (e.g., planting trees, adding fog, or adding or removing items in the virtual environment). When the user performs a first trigger operation in the first area, the virtual object may quickly release the corresponding skill. The first trigger operation may be a click operation, a slide operation, a long-press operation, or the like, which is not limited in the embodiments of the present application.
Step 102, receiving a first operation signal acting on a virtual rocker.
The first operation signal may be a touch signal, and when the user touches a virtual stick displayed in the user interface, the terminal receives and determines the touch signal as the first operation signal. The terminal can continuously identify the first operation signal, so as to acquire the change situation of the first operation signal.
And 103, responding to the first operation signal to move from the initial touch point to the target touch point, and moving the virtual rocker from the initial position to the target position.
The distance between the center position of the virtual joystick at the target position and the target touch point is smaller than the distance between the center position of the virtual joystick at the initial position and the target touch point. That is, the center position of the virtual joystick after the move is closer to the target touch point than its center position before the move.
Optionally, the target touch point is a touch point triggering display of an operation prompt element. As shown in FIG. 3, the target touch point 31 may be located at an outer edge of the first area 32 and the start touch point 33 is located within the first area 32.
In some embodiments, the position of the virtual joystick initially displayed in the user interface is an initial position, and the position of the virtual joystick after movement is a target position. When the user controls the operation body to move from the first area to the outer edge of the first area, that is, the first operation signal moves from the start touch point to the target touch point, the virtual joystick may move from the initial position to the target position corresponding to the target touch point. The operation body may be a finger, a stylus, or another object that can control the virtual joystick through touch, which is not limited in this embodiment of the present application.
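The move described above can be sketched in code. This is a minimal illustration assuming a circular first region; the names (`Joystick`, `FIRST_REGION_RADIUS`, `on_touch_move`) are hypothetical and not taken from the patent:

```python
import math

FIRST_REGION_RADIUS = 60.0  # assumed radius of the first (inner) region, in px

class Joystick:
    def __init__(self, center):
        self.initial_center = center  # center of the joystick at its initial position
        self.center = center

    def on_touch_move(self, touch_point):
        """Re-center the joystick once the touch leaves the first region."""
        dx = touch_point[0] - self.initial_center[0]
        dy = touch_point[1] - self.initial_center[1]
        if math.hypot(dx, dy) >= FIRST_REGION_RADIUS:
            # The touch point that crosses the outer edge of the first region
            # is the target touch point: move the joystick toward it so its
            # center ends up closer to (here: exactly on) that point.
            self.center = touch_point
```

In this sketch the joystick snaps its center onto the target touch point, matching the variant in step 403 below; the later alternative embodiment moves it only part of the way.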
And 104, displaying an operation prompt element in the user interface according to the position relation between the central position of the virtual rocker at the target position and the real-time touch point of the first operation signal.
In some embodiments, the operation hint element may be used to hint at the location of a candidate operation execution region. The region in which the user-controlled operation is performed may be referred to as an execution region, and the candidate operation execution region is a region in which the candidate operation is executed. For example, the candidate operation execution region may be a release region of the skill candidate to be released of the virtual object, and the displayable region range of the operation prompt element may be a circle centered on the virtual object. Alternatively, the operation prompt element may be used to prompt the targeted operation execution region. The position relation between the real-time touch point and the center position of the virtual rocker corresponds to the position relation between the operation prompt element and the displayable area range of the operation prompt element, so that the operation prompt element can be displayed at the corresponding position in the user interface when the position relation between the center position of the virtual rocker at the target position and the real-time touch point is obtained.
Alternatively, the operation prompt element may be a mark, such as an icon, a symbol, or the like, displayed in the user interface. The operation prompting element may be a circle, a circular ring, an arrow, a straight line segment or an arc segment, or other shapes, which is not limited in the embodiment of the present application. The operation prompting element may be displayed on the ground of the virtual environment, may also be displayed in the air of the virtual environment, may also be displayed on the water surface in the virtual environment, and may also be displayed under the water surface of the virtual environment, which is not limited in this embodiment of the application. A virtual environment is a scene displayed (or provided) by a client of a target application (e.g., a game application) when running on a terminal, and refers to a scene created for a first virtual object to perform an activity (e.g., a game competition), such as a virtual house, a virtual island, a virtual map, and the like. The virtual environment may be a simulation environment of a real world, a semi-simulation semi-fictional environment, or a pure fictional environment. The virtual environment may be a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in this embodiment of the present application.
To sum up, in the technical solution provided by the embodiments of the present application, a virtual joystick is displayed in the user interface and a first operation signal acting on it is received. When the first operation signal moves from the start touch point to the target touch point, the virtual joystick is moved so that its center position is closer to the target touch point, and the initial display position of the operation prompt element in the user interface is therefore closer to the center of the element's displayable region range. When the virtual joystick is used to perform an aiming operation, this reduces the probability that the first operation signal must move in the reverse direction and improves the convenience of the virtual joystick during aiming operations.
Referring to fig. 4, a flowchart of an interface display method according to an embodiment of the present application is shown. The method can comprise the following steps (401-406):
step 401, displaying a user interface, wherein the user interface comprises a virtual joystick.
The content of step 401 is the same as or similar to the content of step 101 in the embodiment of fig. 1, and is not described again here.
In step 402, a first operation signal acting on a virtual joystick is received.
The content of step 402 is the same as or similar to the content of step 102 in the embodiment of fig. 1, and is not repeated here.
And step 403, moving the virtual rocker from the initial position to the target position according to the position of the target touch point in response to the first operation signal moving from the initial touch point to the target touch point.
The center position of the virtual joystick at the target position may be the position of the target touch point. After the first operation signal moves from the start touch point to the target touch point, the position of the target touch point is obtained and determined as the center position of the virtual joystick at the target position; the target position is the position of the virtual joystick after the move. Optionally, the shape and size of the virtual joystick do not change when it moves.
In some embodiments, as shown in fig. 5, when the first operation signal 51 is located at the center position of the virtual stick at the target position, the initial display position of the operation prompt element 52 is located at the center position of the displayable region range 53 thereof. For example, if the center of the displayable region range of the operation prompt element is the virtual object 54, the initial display position of the operation prompt element is the position where the virtual object 54 is located.
And step 404, determining the direction of the operation prompt element relative to the virtual object according to the direction of the real-time touch point relative to the center position of the virtual rocker at the target position.
In some embodiments, the direction of the real-time touch point relative to a center position of the virtual stick at the target position is determined as a direction of the operation prompt element relative to a virtual object, wherein the virtual object is located at a center of a displayable region range of the operation prompt element. For example, when the real-time touch point is located in a 45 ° direction diagonally above the right of the center position of the virtual stick at the target position, the operation prompt element is also located in a 45 ° direction diagonally above the right of the position where the virtual object is located.
It should be noted that, in some embodiments, the content displayed in the user interface, such as the operation prompt element and its displayable region range, is a three-dimensional picture processed by three-dimensional techniques, and such a picture is deformed when displayed on a two-dimensional display screen; for example, a circle in the three-dimensional picture may appear as an ellipse on a planar display screen. Therefore, if the displayed content is a three-dimensional picture, the direction of the operation prompt element relative to the virtual object is a direction in three-dimensional space, not the direction actually shown on the two-dimensional display screen. The same applies to the distance between the operation prompt element and the virtual object, which is not repeated here.
And step 405, determining the distance between the operation prompt element and the virtual object according to the distance between the real-time touch point and the center position of the virtual rocker at the target position.
In some embodiments, step 405 may include the steps of:
1. calculating a first ratio, wherein the first ratio is the ratio of a first distance to the radius of the virtual rocker, and the first distance is the distance between the real-time touch point and the center position of the virtual rocker at the target position;
2. the product of the first ratio and the radius of the displayable region range of the operation cue element is determined as the distance between the operation cue element and the virtual object.
The distance between the operation prompt element and the virtual object may be the first distance scaled up or scaled down, or it may be equal to the first distance.
Optionally, when the real-time touch point is located outside the virtual rocker, the first ratio is 1.
And 406, displaying the operation prompt element in the user interface according to the direction of the operation prompt element relative to the virtual object and the distance between the operation prompt element and the virtual object.
From steps 404 and 405, the orientation of the action-cue element with respect to the virtual object and the distance between the action-cue element and the virtual object can be derived. The position at the distance of the direction of the virtual object is the position of the operation prompt element, and the operation prompt element is displayed at the position. Optionally, the position at the distance of the direction of the virtual object is a center position of the operation hint element.
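Steps 404 to 406 can be sketched together as a single mapping from the touch offset to the element's display position. The function name and parameters below are illustrative assumptions, not terms from the patent:

```python
import math

def prompt_position(obj_pos, stick_center, touch, stick_radius, display_radius):
    """Map the real-time touch point's offset from the joystick center to a
    position for the operation prompt element around the virtual object."""
    dx = touch[0] - stick_center[0]
    dy = touch[1] - stick_center[1]
    first_distance = math.hypot(dx, dy)
    if first_distance == 0:
        # Touch at the joystick center: the element sits on the virtual object.
        return obj_pos
    # Step 405: first ratio, clamped to 1 when the touch is outside the joystick.
    ratio = min(first_distance / stick_radius, 1.0)
    distance = ratio * display_radius
    # Step 404: the element's direction relative to the virtual object equals
    # the touch point's direction relative to the joystick center.
    ux, uy = dx / first_distance, dy / first_distance
    # Step 406: place the element at that direction and distance.
    return (obj_pos[0] + ux * distance, obj_pos[1] + uy * distance)
```

For example, a touch halfway between the joystick center and its edge, to the right, would place the element halfway across the displayable region, to the right of the virtual object.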
In some embodiments, as shown in fig. 6, if the operation prompt element needs to be displayed at position 61, the first operation signal 62 may slide from the first region 63 of the virtual joystick to the edge of the first region 63 along the moving direction 66; the virtual joystick then moves, its center position moving from position 64 to position 65, and the first operation signal 62 continues along the moving direction 66 to position 61.
In some embodiments, the virtual rocker is moved from the target position back to the initial position in response to the first operating signal disappearing.
To sum up, in the technical solution provided by the embodiments of the present application, the position of the target touch point is determined as the center position of the virtual joystick at the target position, so that the initial display position of the operation prompt element is located at the center of its displayable region range, namely at the position of the virtual object. When the operation prompt element needs to be displayed at a position close to the virtual object, the first operation signal only needs to keep moving in the direction toward the target touch point, without moving in the reverse direction, which improves the convenience of operation.
In the embodiment of the application, the distance between the operation prompting element and the virtual object is determined according to the distance between the real-time touch point and the center position of the virtual rocker at the target position, so that the display position of the operation prompting element is matched with the real-time touch point, and the operation convenience of the virtual rocker is further improved.
In some possible embodiments, moving the virtual stick from the initial position to the target position may further include the sub-steps of:
1. determining a target center position in an area between a center position of the virtual stick at the initial position and a position of the target touch point;
2. moving the virtual rocker from the initial position to the target position according to the target center position; and the center position of the virtual rocker at the target position is the target center position.
In this embodiment, the target center position is a position of the virtual stick in a region between the center position at the initial position and the position of the target touch point.
In some embodiments, determining the target center position in a region between a center position of the virtual stick at the initial position and a position of the target touch point may include the sub-steps of:
1. determining the moving direction of the virtual rocker according to the direction in which the central position of the virtual rocker at the initial position points to the target touch point;
2. determining the moving distance of the virtual rocker according to the distance between the central position of the virtual rocker at the initial position and the target touch point;
3. determining the target center position according to the center position of the virtual rocker at the initial position, the moving distance, and the moving direction.
Optionally, the direction in which the center position of the virtual rocker at the initial position points toward the target touch point is determined as the moving direction of the virtual rocker; k times the distance between the center position of the virtual rocker at the initial position and the target touch point is determined as the moving distance of the virtual rocker, where k is greater than 0 and less than 1; then, the position reached by moving the center position of the virtual rocker at the initial position by the moving distance in the moving direction is determined as the target center position. Determining the target center position according to the direction and distance of the target touch point relative to the center position of the virtual rocker at the initial position gives the user a larger operating range in the moving direction, further improving operating accuracy.
In some embodiments, the respective positions are represented by coordinates (e.g., planar coordinates, polar coordinates, etc.). The moving direction of the virtual rocker and the distance between the center position of the virtual rocker at the initial position and the target touch point can be calculated by subtracting the coordinates of the center position of the virtual rocker at the initial position from the coordinates of the target touch point.
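This coordinate arithmetic can be sketched as follows (a minimal illustration in Python; the function name, the example coordinates, and the value of k are assumptions for illustration and do not appear in the application):

```python
def target_center_position(initial_center, target_touch, k=0.5):
    """Compute the target center position of the virtual rocker.

    The moving direction is the direction from the rocker's center at
    the initial position toward the target touch point, and the moving
    distance is k times the distance between them (0 < k < 1), as
    described above.
    """
    # Offset vector from the initial center to the target touch point;
    # both the moving direction and the moving distance derive from it.
    dx = target_touch[0] - initial_center[0]
    dy = target_touch[1] - initial_center[1]
    # Moving the initial center by k times the offset vector equals
    # moving it by the computed distance along the computed direction.
    return (initial_center[0] + k * dx, initial_center[1] + k * dy)

center = target_center_position((100.0, 100.0), (180.0, 160.0), k=0.5)
print(center)  # (140.0, 130.0)
```

Because 0 < k < 1, the resulting center always lies strictly between the initial center and the target touch point, matching the region described above.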
In this implementation, while the probability of reverse movement of the first operation signal is reduced and operational convenience is improved, an overlong user reaction time caused by an overlong moving distance of the virtual rocker can also be avoided, improving operational efficiency.
In some possible embodiments, before receiving the first operation signal acting on the virtual joystick, the method may further include the following steps:
1. displaying a setting interface, wherein the setting interface comprises more than one distance option, and different distance options correspond to different moving distance coefficients;
2. receiving a selection signal for a target distance option element;
3. saving the moving distance coefficient corresponding to the target distance option; wherein the moving distance of the virtual rocker is determined based on the distance between the center position of the virtual rocker at the initial position and the target touch point, and the moving distance coefficient corresponding to the target distance option.
Before the virtual rocker is moved, the user can select a target distance option from the one or more distance options in the setting interface, and the moving distance coefficient corresponding to the target distance option is the coefficient adopted when the virtual rocker is moved. The product of the distance between the center position of the virtual rocker at the initial position and the target touch point, and the moving distance coefficient, is determined as the moving distance of the virtual rocker. A distance option may be displayed as a specific numerical value or in the form of an example diagram, which is not limited in this application. The moving distance coefficient is a positive number, for example 0.4, 0.5, 0.6, 0.7, or 0.8; the specific values of the coefficients corresponding to the distance options may be set by a person skilled in the relevant art, which is not limited in the embodiments of this application.
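The relationship between a selected distance option and the resulting moving distance can be sketched as follows (Python; the option labels and the particular mapping are hypothetical, with coefficient values taken from the examples above):

```python
# Hypothetical mapping from distance options in the setting interface
# to moving distance coefficients (the coefficient values are drawn
# from the examples above; the labels are assumptions).
DISTANCE_OPTIONS = {"short": 0.4, "medium": 0.5, "long": 0.8}

# The saved coefficient for the user's selected target distance option.
saved_coefficient = DISTANCE_OPTIONS["medium"]

def moving_distance(center_to_touch_distance, coefficient):
    """Moving distance of the virtual rocker: the product of the
    distance between the center position at the initial position and
    the target touch point, and the saved moving distance coefficient.
    """
    return center_to_touch_distance * coefficient

print(moving_distance(50.0, saved_coefficient))  # 25.0
```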
In some embodiments, prompt information corresponding to a recommended movement coefficient is displayed in the setting interface to recommend that the user select it. If the user does not select a distance option, the recommended movement coefficient is set as the moving distance coefficient adopted when the virtual rocker moves.
In this implementation, the user can independently select the moving distance coefficient adopted when the virtual rocker moves, so that the moving distance of the virtual rocker better matches the user's personalized needs and habits, improving the flexibility of the virtual rocker's movement.
In some embodiments, moving the virtual rocker from the initial position to the target position may include the steps of:
1. acquiring more than one candidate position;
2. selecting a target candidate position closest to the target touch point from the candidate positions;
3. moving the virtual rocker from the initial position to the target position according to the target candidate position; wherein the center position of the virtual rocker at the target position is the target candidate position.
The target application may store more than one candidate position in advance, and the candidate positions may be distributed on a circumference centered on the center position of the virtual stick at the initial position. The number of candidate positions may be 4, 6, 8, 12, 20, 36, or 72; the specific number may be set by a person skilled in the relevant art according to actual situations, which is not limited in the embodiments of the present application. The candidate position closest to the target touch point is determined as the center position of the virtual rocker at the target position.
In this implementation, the center position of the virtual rocker at the target position is selected from more than one candidate position; only the distances between the candidate positions and the target touch point need to be compared, without calculating the moving direction and moving distance of the virtual rocker, which can save storage space and processing resources of the terminal.
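The candidate-position approach can be sketched as follows (Python; the number of candidates, the radius, and the function names are illustrative assumptions):

```python
import math

def candidate_positions(initial_center, radius, n=8):
    """Generate n candidate center positions evenly distributed on a
    circumference centered on the rocker's center at the initial
    position (n and the radius are illustrative values)."""
    return [
        (initial_center[0] + radius * math.cos(2 * math.pi * i / n),
         initial_center[1] + radius * math.sin(2 * math.pi * i / n))
        for i in range(n)
    ]

def nearest_candidate(candidates, target_touch):
    """Select the target candidate position closest to the target
    touch point; only distance comparisons are needed, and no moving
    direction or moving distance is calculated."""
    return min(
        candidates,
        key=lambda p: math.hypot(p[0] - target_touch[0],
                                 p[1] - target_touch[1]),
    )

cands = candidate_positions((0.0, 0.0), 10.0, n=8)
print(nearest_candidate(cands, (9.0, 1.0)))  # (10.0, 0.0)
```

In a real implementation the candidate list would be precomputed once, so each move reduces to n distance comparisons.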
Referring to fig. 7, a flowchart of an interface display method according to an embodiment of the present application is shown. As shown in fig. 7, the method may include the steps of:
step 701, displaying a user interface;
step 702, determining whether the first operation signal acting on the first area of the virtual joystick moves, if so, executing step 703; if not, go to step 709;
step 703, calculating to obtain a distance d1 between the target touch point and the center position of the virtual joystick at the initial position according to the coordinate P1 of the real-time touch point and the coordinate P2 of the center position of the virtual joystick;
step 704, determining whether d1 is greater than or equal to the radius d2 of the virtual rocker, if so, executing step 705; if not, go to step 709;
step 705, updating the coordinate P2 of the center position of the virtual rocker;
step 706, activating a skill aiming function of the first area;
step 707, recognizing that the first operation signal starts to move from the target touch point;
step 708, calculating and displaying the skill aiming position according to the coordinate P1 of the real-time touch point and the updated coordinate P2 of the center position of the virtual rocker;
step 709, determining whether the user clicks the first area; if yes, executing step 710; if not, ending the procedure;
step 710, quickly releasing skills.
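The decision at steps 703-706 can be sketched as follows (Python; the function name is an assumption, and moving the center directly to the touch point follows the embodiment in which the center position at the target position is the position of the target touch point):

```python
import math

def handle_first_area_move(p1, p2, rocker_radius):
    """Handle movement of the first operation signal in the first area.

    p1 -- coordinates of the real-time touch point
    p2 -- coordinates of the virtual rocker's center position
    Returns the possibly updated center position and whether the
    skill-aiming function of the first area was activated.
    """
    d1 = math.hypot(p1[0] - p2[0], p1[1] - p2[1])  # step 703
    if d1 >= rocker_radius:                         # step 704
        p2 = p1   # step 705: update center position P2 (here, to the
                  # touch point, per the embodiment noted above)
        return p2, True                             # step 706
    return p2, False

center, aiming_active = handle_first_area_move((30.0, 40.0), (0.0, 0.0), 20.0)
print(center, aiming_active)  # (30.0, 40.0) True
```

After the center is updated, the skill aiming position is computed from P1 and the new P2 (step 708), which this sketch leaves out.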
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 8, a block diagram of an interface display apparatus according to an embodiment of the present application is shown. The apparatus has the function of implementing the above interface display method examples; the function may be implemented by hardware, or by hardware executing corresponding software. The apparatus may be the terminal described above, or may be provided on the terminal. The apparatus 800 may include: an interface display module 810, a signal receiving module 820, a rocker movement module 830, and an element display module 840.
The interface display module 810 is configured to display a user interface, where the user interface includes a virtual joystick.
The signal receiving module 820 is configured to receive a first operation signal applied to the virtual joystick.
The rocker movement module 830 is configured to move the virtual rocker from an initial position to a target position in response to the first operation signal moving from the initial touch point to the target touch point; wherein the distance between the center position of the virtual rocker at the target position and the target touch point is smaller than the distance between the center position of the virtual rocker at the initial position and the target touch point; the target touch point is a touch point triggering display of an operation prompt element.
The element display module 840 is configured to display the operation prompt element in the user interface according to a position relationship between a center position of the virtual rocker at the target position and a real-time touch point of the first operation signal, where the operation prompt element is used to prompt a candidate operation execution area.
In summary, in the technical solution provided by the embodiments of this application, a virtual rocker is displayed in the user interface and a first operation signal acting on the virtual rocker is received. When the first operation signal moves from an initial touch point to a target touch point, the virtual rocker is moved so that its center position after the move is closer to the target touch point, and the initial display position of the operation prompt element in the user interface is closer to the center of the displayable area range of the operation prompt element. When the virtual rocker is used to perform an aiming operation, this reduces the probability of reverse movement of the first operation signal and improves the convenience of performing aiming operations with the virtual rocker.
In some embodiments, the rocker movement module 830 is configured to: moving the virtual rocker from the initial position to the target position according to the position of the target touch point; wherein the center position of the virtual rocker at the target position is the position of the target touch point.
In some embodiments, as shown in fig. 9, the rocker movement module 830 comprises: a position determination submodule 831 and a rocker movement submodule 832.
The position determination submodule 831 is configured to determine a target center position in an area between a center position of the virtual stick at the initial position and a position of the target touch point.
The rocker movement sub-module 832 is configured to move the virtual rocker from the initial position to the target position according to the target center position; wherein a center position of the virtual rocker at the target position is the target center position.
In some embodiments, as shown in fig. 9, the position determination submodule 831 is configured to:
determining the moving direction of the virtual rocker according to the direction of the central position of the virtual rocker at the initial position pointing to the target touch point;
determining the moving distance of the virtual rocker according to the distance between the central position of the virtual rocker at the initial position and the target touch point;
and determining the target central position according to the central position of the virtual rocker at the initial position, the moving distance and the moving direction.
In some embodiments, as shown in fig. 9, the apparatus 800 further comprises: options display module 850, element selection module 860, and coefficient save module 870.
The option display module 850 is further configured to display a setting interface, where the setting interface includes more than one distance option, and different distance options correspond to different moving distance coefficients.
The element selection module 860 is configured to receive a selection signal for a target distance option element.
The coefficient storage module 870 is configured to store a moving distance coefficient corresponding to the target distance option; the moving distance of the virtual rocker is determined based on the distance between the center position of the virtual rocker at the initial position and the target touch point and the moving distance coefficient corresponding to the target distance option.
In some embodiments, the rocker movement module 830 is configured to:
acquiring more than one candidate position;
selecting a target candidate position closest to the target touch point from the candidate positions;
moving the virtual rocker from the initial position to the target position according to the target candidate position; wherein a center position of the virtual rocker at the target position is the target candidate position.
In some embodiments, the element display module 840 is configured to:
determining the direction of the operation prompt element relative to a virtual object according to the direction of the real-time touch point relative to the center position of the virtual rocker at the target position;
determining the distance between the operation prompting element and the virtual object according to the distance between the real-time touch point and the center position of the virtual rocker at the target position;
and displaying the operation prompt element in the user interface according to the direction of the operation prompt element relative to the virtual object and the distance between the operation prompt element and the virtual object.
In some embodiments, as shown in fig. 9, the apparatus 800 further comprises: a rocker return module 880.
The rocker return module 880 is configured to move the virtual rocker from the target position back to the initial position in response to disappearance of the first operation signal.
In some embodiments, the virtual rocker includes a first region and a second region located at a periphery of the first region, and the target touch point is located at an outer edge of the first region.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to fig. 10, a block diagram of a terminal 1000 according to an embodiment of the present application is shown. The terminal 1000 can be an electronic device such as a mobile phone, a tablet computer, a game console, an electronic book reader, a multimedia player, a wearable device, a PC, etc. The terminal is used for implementing the interface display method provided in the above embodiment. Specifically, the method comprises the following steps:
in general, terminal 1000 can include: a processor 1001 and a memory 1002.
Processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing a computing operation related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1002 is used to store at least one instruction, at least one program, a set of codes, or a set of instructions, and is configured to be executed by one or more processors to implement the above-described interface display method.
In some embodiments, terminal 1000 can also optionally include: a peripheral interface 1003 and at least one peripheral. The processor 1001, memory 1002 and peripheral interface 1003 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, touch screen display 1005, camera assembly 1006, audio circuitry 1007, positioning assembly 1008, and power supply 1009.
Those skilled in the art will appreciate that the configuration shown in FIG. 10 is not intended to be limiting and that terminal 1000 can include more or fewer components than shown, or some components can be combined, or a different arrangement of components can be employed.
In some embodiments, there is also provided a computer-readable storage medium having at least one instruction, at least one program, a set of codes, or a set of instructions stored therein, which is loaded and executed by a processor to implement the above-described interface display method.
In some embodiments, there is also provided a computer program product which, when executed by a processor, implements the above interface display method.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (12)

1. An interface display method, characterized in that the method comprises:
displaying a user interface, wherein the user interface comprises a virtual rocker;
receiving a first operation signal acting on the virtual rocker;
moving the virtual rocker from an initial position to a target position in response to the first operation signal moving from a start touch point to a target touch point; wherein the distance between the center position of the virtual rocker at the target position and the target touch point is smaller than the distance between the center position of the virtual rocker at the initial position and the target touch point; the target touch point is a touch point for triggering and displaying an operation prompt element;
and displaying the operation prompting element in the user interface according to the position relation between the central position of the virtual rocker at the target position and the real-time touch point of the first operation signal, wherein the operation prompting element is used for prompting a candidate operation execution area.
2. The method of claim 1, wherein moving the virtual rocker from an initial position to a target position comprises:
moving the virtual rocker from the initial position to the target position according to the position of the target touch point; wherein the center position of the virtual rocker at the target position is the position of the target touch point.
3. The method of claim 1, wherein moving the virtual rocker from an initial position to a target position comprises:
determining a target center position in an area between a center position of the virtual joystick at the initial position and a position of the target touch point;
moving the virtual rocker from the initial position to the target position according to the target center position; wherein a center position of the virtual rocker at the target position is the target center position.
4. The method of claim 3, wherein determining a target center position in an area between a center position of the virtual rocker at the initial position and the position of the target touch point comprises:
determining the moving direction of the virtual rocker according to the direction of the central position of the virtual rocker at the initial position pointing to the target touch point;
determining the moving distance of the virtual rocker according to the distance between the central position of the virtual rocker at the initial position and the target touch point;
and determining the target central position according to the central position of the virtual rocker at the initial position, the moving distance and the moving direction.
5. The method of claim 3, wherein prior to receiving the first operating signal acting on the virtual rocker, further comprising:
displaying a setting interface, wherein the setting interface comprises more than one distance option, and different distance options correspond to different moving distance coefficients;
receiving a selection signal for a target distance option element;
saving a moving distance coefficient corresponding to the target distance option; the moving distance of the virtual rocker is determined based on the distance between the center position of the virtual rocker at the initial position and the target touch point and the moving distance coefficient corresponding to the target distance option.
6. The method of claim 1, wherein moving the virtual rocker from an initial position to a target position comprises:
acquiring more than one candidate position;
selecting a target candidate position closest to the target touch point from the candidate positions;
moving the virtual rocker from the initial position to the target position according to the target candidate position; wherein a center position of the virtual rocker at the target position is the target candidate position.
7. The method of claim 1, wherein the displaying the operation prompt element in the user interface according to a positional relationship between a center position of the virtual joystick at the target position and a real-time touch point of the first operation signal comprises:
determining the direction of the operation prompt element relative to a virtual object according to the direction of the real-time touch point relative to the center position of the virtual rocker at the target position;
determining the distance between the operation prompting element and the virtual object according to the distance between the real-time touch point and the center position of the virtual rocker at the target position;
and displaying the operation prompt element in the user interface according to the direction of the operation prompt element relative to the virtual object and the distance between the operation prompt element and the virtual object.
8. The method of claim 1, wherein the displaying the operation prompt element in the user interface according to a positional relationship between a center position of the virtual joystick at the target position and a real-time touch point of the first operation signal further comprises:
moving the virtual rocker from the target position back to the initial position in response to the first operating signal disappearing.
9. The method of any one of claims 1 to 8, wherein the virtual rocker comprises a first region and a second region located at a periphery of the first region, and the target touch point is located at an outer edge of the first region.
10. An interface display apparatus, the apparatus comprising:
the interface display module is used for displaying a user interface, and the user interface comprises a virtual rocker;
the signal receiving module is used for receiving a first operating signal acting on the virtual rocker;
the rocker moving module is used for responding to the first operation signal to move from the initial touch point to the target touch point and moving the virtual rocker from the initial position to the target position; wherein the distance between the center position of the virtual rocker at the target position and the target touch point is smaller than the distance between the center position of the virtual rocker at the initial position and the target touch point; the target touch point is a touch point for triggering and displaying an operation prompt element;
and the element display module is used for displaying the operation prompt element in the user interface according to the position relation between the central position of the virtual rocker at the target position and the real-time touch point of the first operation signal, wherein the operation prompt element is used for prompting a candidate operation execution area.
11. A terminal, characterized in that it comprises a processor and a memory, in which at least one instruction, at least one program, set of codes or set of instructions is stored, which is loaded and executed by the processor to implement the interface display method according to any one of claims 1 to 9.
12. A computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the interface display method according to any one of claims 1 to 9.
CN202010332228.6A 2020-04-24 2020-04-24 Interface display method, device, terminal and storage medium Active CN111589112B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010332228.6A CN111589112B (en) 2020-04-24 2020-04-24 Interface display method, device, terminal and storage medium


Publications (2)

Publication Number Publication Date
CN111589112A CN111589112A (en) 2020-08-28
CN111589112B true CN111589112B (en) 2021-10-22

Family

ID=72189035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010332228.6A Active CN111589112B (en) 2020-04-24 2020-04-24 Interface display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111589112B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112306351B (en) * 2020-10-30 2022-05-13 腾讯科技(深圳)有限公司 Virtual key position adjusting method, device, equipment and storage medium
CN113476822B (en) * 2021-06-11 2022-06-10 荣耀终端有限公司 Touch method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9262073B2 (en) * 2010-05-20 2016-02-16 John W. Howard Touch screen with virtual joystick and methods for use therewith
CN105194873B (en) * 2015-10-10 2019-01-04 腾讯科技(成都)有限公司 A kind of information processing method, terminal and computer storage medium
CN106582018B (en) * 2016-12-22 2019-04-23 腾讯科技(深圳)有限公司 A kind of method and terminal for being inserted into virtual resource object in the application
CN109999493B (en) * 2019-04-03 2022-04-15 网易(杭州)网络有限公司 Information processing method and device in game, mobile terminal and readable storage medium
CN110096214B (en) * 2019-06-05 2021-08-06 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for controlling movement of virtual object

Also Published As

Publication number Publication date
CN111589112A (en) 2020-08-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40027962

Country of ref document: HK

GR01 Patent grant