CN111475089B - Task display method, device, terminal and storage medium - Google Patents

Task display method, device, terminal and storage medium

Info

Publication number
CN111475089B
CN111475089B (application CN202010220556.7A)
Authority
CN
China
Prior art keywords
task
displaying
game
area
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010220556.7A
Other languages
Chinese (zh)
Other versions
CN111475089A (en)
Inventor
雷坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010220556.7A
Publication of CN111475089A
Application granted
Publication of CN111475089B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A63F13/47 Controlling the progress of the video game involving branching, e.g. choosing one of several possible scenarios at a given point in time
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application provides a task display method, device, terminal and storage medium, relating to the technical field of application program development. The method includes: displaying a user interface of a game application; displaying a first area of a task length map in the user interface, where the task length map is a picture for showing the game tasks provided by the game application and is formed by overlapping a plurality of layers; receiving a movement instruction for the task length map; and, in response to the movement instruction, displaying a second area of the task length map in the user interface. In this technical scheme, game tasks are displayed in the form of a task length map, so that a user can flexibly switch between and view game tasks simply by moving the map. This simplifies the operation of viewing game tasks and improves the efficiency and flexibility of doing so.

Description

Task display method, device, terminal and storage medium
Technical Field
Embodiments of the present application relate to the technical field of application program development, and in particular to a task display method, device, terminal, and storage medium.
Background
In a game application, a user can learn about the storyline and the task information included in the game.
In some game applications, each time a game progress node is reached, the application pops up a window displaying text related to the storyline and the game task of that node. The user closes the pop-up window after reading the text, and must reopen it to view the information again.
In the related art, because the task information of the game is dispersed across separate pop-up windows, the windows have to be opened one by one when reviewing completed game tasks, which is inconvenient and inefficient.
Disclosure of Invention
The embodiment of the application provides a task display method, device, terminal and storage medium, which simplify the operation of viewing game tasks and improve the efficiency of doing so. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a task display method, where the method includes:
displaying a user interface of a game application;
displaying a first area of a task length map in the user interface, where the task length map is a picture for showing the game tasks provided by the game application and is formed by overlapping a plurality of layers;
receiving a movement instruction for the task length map;
displaying a second area of the task length map in the user interface in response to the movement instruction.
In another aspect, an embodiment of the present application provides a task display device, where the task display device includes:
the interface display module is used for displaying a user interface of the game application program;
the area display module is used for displaying a first area of a task length map in the user interface, where the task length map is a picture for showing the game tasks provided by the game application and is formed by overlapping a plurality of layers;
the instruction receiving module is used for receiving a movement instruction for the task length map;
the area display module is further configured to display a second area of the task length map in the user interface in response to the movement instruction.
In another aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the task display method.
In yet another aspect, an embodiment of the present application provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or an instruction set is stored, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the task display method.
In a further aspect, an embodiment of the present application provides a computer program product which, when executed by a processor, implements the task display method.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
by displaying a task length map in the user interface of a game application, where the task length map is a picture for showing the game tasks provided by the application, different areas of the map can be displayed by moving it, so that different game tasks can be viewed. Because the game tasks are displayed in the form of a task length map, a user can flexibly switch between and view game tasks simply by moving the map, which simplifies the operation of viewing game tasks and improves the efficiency and flexibility of doing so.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of a task display method provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of a task length map provided by one embodiment of the present application;
FIG. 3 is a schematic diagram of a user interface provided by one embodiment of the present application;
FIG. 4 is a schematic diagram of moving the task length map provided by one embodiment of the present application;
FIG. 5 is a flowchart of a task display method provided by another embodiment of the present application;
FIG. 6 is a schematic illustration of the presentation of associated information provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a task length map provided by another embodiment of the present application;
FIG. 8 is a schematic view of a user interface provided by another embodiment of the present application;
FIG. 9 is a flowchart of a task display method provided by another embodiment of the present application;
FIG. 10 is a schematic view of a user interface provided by another embodiment of the present application;
FIG. 11 is a block diagram of a task display device provided by one embodiment of the present application;
FIG. 12 is a block diagram of a task display device provided by another embodiment of the present application;
FIG. 13 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of methods consistent with aspects of the present application, as detailed in the appended claims.
The embodiment of the application provides a terminal, which is an electronic device with data calculation, processing and storage capabilities, and a game application program can be installed and run in the terminal. The terminal can be an electronic device such as a smart phone, a tablet Computer, a PC (Personal Computer), a game console, a smart television, and a wearable device. The terminal is provided with a display screen, and a user can realize human-computer interaction through the display screen. In the embodiment of the present application, the type of the game application is not limited, such as a MOBA (Multiplayer Online Battle Arena) type game application, a shooting type game application, an LBS (Location Based Service) type game application, and the like.
In the method of the embodiment of the present application, the execution subject of each step may be the terminal, such as a client of a game application running in the terminal.
The technical solution of the present application will be described below by means of several embodiments.
Referring to fig. 1, a flowchart of a task display method according to an embodiment of the present application is shown. As shown in FIG. 1, the method may include the following steps (101-104):
step 101, displaying a user interface of a game application.
The user interface of the game application is the display interface provided by the game application, through which the user can interact with it. The user interface may include different interfaces such as a login interface, a task interface, an in-match interface, and a prop display interface, each with different display content. In addition, the user interface may include operable controls, such as buttons, sliders, and virtual joysticks, for human-computer interaction.
Step 102, displaying a first area of the task length map in the user interface.
Optionally, the task length map is a picture for showing the game tasks provided by the game application. A game task is a preset game goal that the user is required to complete in the game. Game tasks include, but are not limited to, at least one of: reaching a specified account level, obtaining a specified virtual prop (e.g., a virtual weapon, virtual gemstone, virtual skin decoration, etc.), or defeating a specified virtual object (e.g., a virtual monster preset by the game application, a virtual object controlled by another user, etc.). In some embodiments, a game task unfolds against a particular social/natural environment or a storyline; accordingly, the task length map may also be used to show the social/natural environment or storyline associated with the game task.
The size of the task length map in at least one direction may exceed the displayable area of the user interface; therefore, the user interface may not be able to display the complete task length map. The task length map may be a long strip: its aspect ratio (i.e., the ratio of its length to its width) may be greater than that of the user interface, for example 1.5, 2, 5, 10, or 20. The specific value may be set by the relevant technician according to the actual situation, which is not limited in this embodiment of the present application. The aspect ratio of the task length map may also change as the game application is updated; for example, the map may be lengthened to show game tasks, social/natural environment information, or storyline content added in an update.
Optionally, the task length map may be formed by overlapping a plurality of layers. A lower layer may or may not be covered by an upper layer, which is not limited in this embodiment of the present application. During development of the game application, the layers can be produced simultaneously by different technicians, or in batches by the same technician, improving the production efficiency of the task length map.
Referring to fig. 2, a schematic diagram of a task length map provided by an embodiment of the present application is shown. In some embodiments, the task length map 20 is elongated; as shown in fig. 2, the length 21 of the task length map 20 is substantially greater than its width 22. The task length map 20 may include a background layer 23 and a foreground layer 24. The background layer 23 may be used to present the social/natural environment of the game task, and the foreground layer 24 may be used to present the main content of the game task.
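The layered structure described above can be sketched in a few lines. This is an illustrative sketch only (the cell-based representation, the layer names, and the transparency convention are assumptions, not the patent's implementation): a non-transparent cell in an upper layer covers the cell below it, matching the note that a lower layer may or may not be covered.

```python
# Minimal sketch of a task length map composed of overlapping layers.
# Each layer is a list of cells along the map's length; None marks a
# transparent cell (an assumption for illustration).

def composite(layers):
    """Overlay layers bottom-to-top; a non-transparent upper cell covers the cell below."""
    length = max(len(layer) for layer in layers)
    result = [None] * length
    for layer in layers:                      # bottom layer first
        for i, cell in enumerate(layer):
            if cell is not None:              # upper layer covers lower
                result[i] = cell
    return result

# Hypothetical content: the background shows the environment, the
# foreground shows the main content of the game tasks (cf. fig. 2).
background = ["sky", "sky", "forest", "forest", "castle"]
foreground = [None, "task:hunt", None, "task:gather", None]

task_length_map = composite([background, foreground])
```

Because each layer is independent, different technicians can produce the background and foreground separately and composite them at the end, which is the production-efficiency point made above.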
In some embodiments, the task length map includes a plurality of areas, each of which can be used to present at least one game task. The first area of the task length map is one of these areas and presents a portion of the game tasks. The game tasks presented in different areas may differ: for example, task difficulty may increase progressively, with areas farther along the map corresponding to harder tasks; or the tasks may be arranged according to the progress of the storyline, with the areas covered by different storylines corresponding to different tasks.
Referring to fig. 3, a schematic diagram of a user interface provided by an embodiment of the present application is shown. As shown in fig. 3, a first area 32 of the task length map is displayed in the user interface 31, and an area 33 other than the first area 32 in the task length map is not displayed in the user interface 31.
Step 103, receiving a movement instruction for the task length map.
Optionally, the movement instruction is an instruction for changing which area of the task length map is displayed in the user interface.
In some possible embodiments, receiving the movement instruction for the task length map means receiving a sliding operation signal for the task length map, and the map moves in the direction indicated by the sliding operation. The sliding operation may be performed with a finger or with a mouse-controlled cursor. The sliding distance of the operation and the resulting movement distance of the task length map may be the same or different. In some embodiments they differ, and the ratio of sliding distance to movement distance is a positive number k, for example 5, 2, 0.8, 0.5, or 0.2; the specific value of k may be set by the relevant technician according to the actual situation, which is not limited in this embodiment of the present application. In other embodiments, the movement speed of the task length map is determined by the speed of the sliding operation. For example, when the sliding operation is at a constant speed, the task length map also moves at a constant speed; when the sliding operation accelerates, the task length map also accelerates, and its acceleration may be determined by that of the sliding operation.
Referring to fig. 4, a schematic diagram of moving the task length map according to an embodiment of the present application is shown. As shown in fig. 4, an area 41 of the task length map is displayed in the user interface; when a sliding operation signal 42 for moving the task length map to the left is received, area 43 can be moved into the user interface for display.
In other possible embodiments, receiving the movement instruction means receiving a click operation signal for a movement control corresponding to the task length map. The movement control may be a physical control or a virtual control, and may comprise multiple controls that move the task length map in different directions. For example, when a forward control is clicked, the task length map moves forward; when a backward control is clicked, it moves backward.
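Both input modes above reduce to updating a viewport offset into the long map. The following is a minimal sketch under stated assumptions (a horizontal strip map, pixel units, and the function name are all illustrative); it applies the slide-to-movement ratio k from the text and clamps the offset so the viewport never leaves the map:

```python
# Sketch of mapping a sliding gesture (or a movement-control click) to
# a viewport offset into a horizontal strip map. The slide-to-movement
# ratio k comes from the text; pixel units and names are illustrative.

def move_map(offset, slide_distance, k, map_length, viewport_width):
    """Return the new horizontal offset of the viewport into the map."""
    new_offset = offset + slide_distance / k     # k = slide distance / movement distance
    max_offset = map_length - viewport_width     # cannot scroll past the end
    return max(0.0, min(new_offset, max_offset))

# A 4000-px-long map viewed through an 800-px-wide interface, k = 2:
offset = move_map(0.0, 300.0, 2.0, 4000.0, 800.0)   # a 300 px slide moves the map 150 px
```

A click on a forward or backward control can reuse the same function by passing a fixed positive or negative `slide_distance`.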
Step 104, in response to the movement instruction, displaying a second area of the task length map in the user interface.
The second area is a display area of the task length map different from the first area. The second area may or may not be adjacent to the first area, and may or may not overlap with it; neither is limited in this embodiment of the present application. After receiving the movement instruction, the game application may display the second area corresponding to that instruction in the user interface. For example, if the movement instruction is a forward instruction, triggered by a forward sliding operation or by clicking the forward control described above, the second area is an area located before the first area in the task length map; if it is a backward instruction, triggered by a backward sliding operation or by clicking the backward control, the second area is an area located after the first area.
To sum up, in the technical solution provided in this embodiment of the present application, a task length map is displayed in the user interface of a game application. The task length map is a picture for showing the game tasks provided by the application, and different areas of it can be displayed by moving it, allowing different game tasks to be viewed. Because the game tasks are displayed in the form of a task length map, a user can flexibly switch between and view game tasks simply by moving the map, which simplifies the operation of viewing game tasks and improves the efficiency and flexibility of doing so.
Referring to fig. 5, a flowchart of a task display method according to another embodiment of the present application is shown. As shown in FIG. 5, the method may include the following steps (501-507):
step 501, a user interface of a game application is displayed.
This step 501 is the same as or similar to the above step 101, and is not described herein again.
Step 502, a first area of a task length map is displayed in a user interface.
This step 502 is the same as or similar to the above step 102, and is not repeated here.
Step 503, acquiring a trigger signal for the target picture element in the task length map.
The task length map may include a plurality of picture elements, which are the display elements of the map and may represent elements included in a game task. Picture elements may be 2D and/or 3D. A 2D picture element, also called a two-dimensional picture element or planar figure, shows content in only two dimensions; in this embodiment of the present application it may refer to art material produced with 2D technology. A 3D picture element, also called a three-dimensional picture element or stereoscopic graphic, shows content in three dimensions; it may refer to art material produced with 3D technology.
For a touch display screen, the trigger signal may be a touch signal generated by operations such as tapping or sliding; for a non-touch display screen, it may be generated by a cursor click or slide.
In some embodiments, before step 503, the following steps may be further included:
1. acquiring the position at which an operation body stays on the user interface;
2. determining, according to that stay position and the display position of each picture element in the task length map, whether the operation body stays at the display position of the target picture element;
3. highlighting the target picture element in response to the operation body staying at the display position of the target picture element.
In the game application, a trigger area corresponding to the target picture element may be provided at the element's position in the task length map, and the shape of the trigger area may be generated from the outline of the element. When the operation body moves into the trigger area, the corresponding target picture element may be highlighted. Ways of highlighting the target picture element include: changing its color, highlighting its outline, changing its size, or changing its brightness. They may also include changing the color (e.g., graying out) or brightness (dimming or brightening) of the display region outside the target picture element, or of the picture elements other than the target picture element.
In some embodiments, the target picture element may be highlighted as soon as the operation body is at its display position; alternatively, it may be highlighted only after the operation body has stayed at the display position for a duration greater than or equal to a preset duration. The preset duration may be 0.2, 0.5, 1, or 2 seconds; its specific value may be set by the relevant technician according to the actual situation, which is not limited in this embodiment of the present application.
Optionally, the operation body is a mouse cursor. When the cursor stays at the display position of the target picture element, the target picture element is highlighted. The user can control where the cursor stays by moving the mouse or clicking its buttons, or by sliding and tapping on a touch pad.
Alternatively, the operation body is one capable of touch operation, such as a finger or a stylus. In some examples, the terminal recognizes through the touch screen that a finger is touching the display area of the target picture element, and highlights the element. In other examples, the finger need not touch the display screen: the terminal recognizes, through a screen-integrated camera, that the finger is hovering above the display area of the target picture element, and highlights the element.
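The dwell-and-highlight behaviour described above can be sketched as follows. Rectangular trigger areas and the 0.5 s threshold are simplifying assumptions for illustration; the text allows outline-shaped trigger areas and lists several example durations.

```python
# Sketch of the dwell-to-highlight logic: the operation body must stay
# inside a picture element's trigger area for a preset duration before
# the element is highlighted.

PRESET_DURATION = 0.5  # seconds; one of the example values in the text

def element_at(position, trigger_areas):
    """Return the element whose trigger area contains the stay position, if any."""
    x, y = position
    for name, (left, top, right, bottom) in trigger_areas.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

def should_highlight(position, dwell_time, trigger_areas):
    """Highlight only after the operation body has stayed long enough."""
    name = element_at(position, trigger_areas)
    if name is not None and dwell_time >= PRESET_DURATION:
        return name
    return None

areas = {"monster": (100, 40, 180, 120)}   # hypothetical target picture element
```

The same check works for a mouse cursor, a finger on a touch screen, or a hovering finger detected by a camera; only the source of `position` differs.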
Step 504, in response to the trigger signal, displaying the associated information of the target picture element.
When a trigger signal for the target picture element is received, the associated information of the target picture element may be presented. This associated information may be used to present information about a game task related to the target picture element. Ways of presenting it include, but are not limited to, at least one of the following: text, sound, animation, or live-action video. The sound may be voice, music, or dubbing, which is not limited in this embodiment of the present application.
The associated information may be displayed in a predetermined area of the user interface, such as the bottom, the top, or the side areas of the user interface. Please refer to fig. 6, which shows a schematic illustration of the presentation of associated information provided in an embodiment of the present application. As shown in fig. 6, the associated information 52 of the target picture element 51 may also be displayed near the target picture element 51.
Step 505, receiving a move instruction for the task length map.
This step 505 is the same as or similar to the content of step 103 in the embodiment of fig. 1, and is not repeated here.
Step 506, responding to the moving instruction, and detecting whether the second area has the display condition.
After the move instruction is received, whether the second area has the display condition can be detected according to the design logic of the game application program. If the second area has the display condition, it can be displayed; if the second area does not have the display condition, it is not displayed.
In some embodiments, the task length map includes n regions, the regions being provided with corresponding game tasks, n being a positive integer. Detecting whether the second area has the display condition may include:
1. detecting whether the game task corresponding to the second area is finished;
2. and responding to the completion of the game task corresponding to the second area, and executing the step of displaying the second area in the user interface.
If the game task corresponding to the second area is completed, the second area can be displayed; if not, the second area is not displayed. Completion of the game task corresponding to the second area includes, but is not limited to, at least one of the following: the user account has reached a specified level; the user has acquired a specified virtual prop (e.g., a virtual weapon, a virtual gemstone, a virtual skin adornment, etc.); the user has defeated a specified virtual object (e.g., a virtual monster preset by the game application, a virtual object controlled by another user, etc.).
Referring to fig. 7, a schematic diagram of a task length map provided in another embodiment of the present application is shown. As shown in fig. 7, the task length map may include a region 71, a region 72, and a region 73. Story line A and game task A are the story line corresponding to region 71 and the game task the user needs to complete there; story line B and game task B correspond in the same way to region 72, and story line C and game task C to region 73. The display condition of region 72 is met only when game task A corresponding to region 71 is completed; similarly, the display condition of region 73 is met only when game task B corresponding to region 72 is completed.
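The chained unlock illustrated by fig. 7 can be sketched as follows; the ordered (region, prerequisite task) layout is an assumption made for illustration:

```python
def displayable_regions(completed_tasks: set, chain: list) -> list:
    """chain is an ordered list of (region, prerequisite_task) pairs; a region
    is displayable only if the game task of the previous region is completed.
    The first region has no prerequisite (None)."""
    shown = []
    for region, prereq in chain:
        if prereq is not None and prereq not in completed_tasks:
            break  # later regions stay hidden until earlier tasks finish
        shown.append(region)
    return shown

chain = [("region 71", None), ("region 72", "game task A"), ("region 73", "game task B")]
```

With only game task A completed, regions 71 and 72 are displayable, while region 73 stays hidden until game task B is also completed.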
In some embodiments, the display condition of the second region may also be that the current time falls within a specified time interval. For example, if the specified time interval starts on March 2, 2020, the second region does not have the display condition before March 2, 2020 (e.g., on March 1, 2020), and has the display condition on March 2, 2020.
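A minimal sketch of this time-window condition, assuming an arbitrary window end date:

```python
from datetime import date

def has_display_condition(today: date, start: date, end: date) -> bool:
    """The second region is displayable only while today falls in [start, end]."""
    return start <= today <= end

# Window start from the example above; the end date is an assumption.
window_start, window_end = date(2020, 3, 2), date(2020, 3, 31)
```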
Referring to fig. 8, a schematic diagram of a user interface provided in another embodiment of the present application is shown. As shown in fig. 8, when the second area has the display condition, a prompt message 81 may be displayed in the user interface to prompt the user that the second area may be displayed.
In step 507, in response to the second area having the display condition, the second area is displayed in the user interface.
For the description of step 507, reference may be made to step 104 in the embodiment of fig. 1, which is not described herein again.
Referring to fig. 9, a flowchart of a task presenting method according to another embodiment of the present application is shown. As shown in fig. 9, the method may include the following steps (901-907):
step 901, dividing the task length map into a plurality of areas;
step 902, setting a trigger area at a corresponding position of a part of picture elements in the task length graph;
step 903, the user clicks a target picture element;
step 904, the associated information is presented in the vicinity of the target picture element.
Step 905, setting corresponding storyline and game tasks for different areas of the task length chart;
step 906, the user performs a sliding operation on the task length map;
step 907, detecting whether the second area has the display condition, if so, displaying the second area; and if not, not displaying the second area.
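The flow of steps 901-907 can be condensed into the following sketch; the UI stub and all handler names are hypothetical stand-ins, not part of the patent:

```python
class UIStub:
    """Minimal stand-in for the user interface; records what gets shown."""
    def __init__(self):
        self.shown = []

    def show(self, item):
        self.shown.append(item)

def handle_click(ui, target, associated_info):
    """Steps 903-904: show the associated information near the clicked element."""
    ui.show((target, associated_info[target]))

def handle_slide(ui, second_region, required_task, completed_tasks):
    """Steps 906-907: display the second region only if its game task is done."""
    if required_task in completed_tasks:
        ui.show(second_region)
        return True
    return False
```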
In summary, in the technical solution provided by the embodiments of the present application, by arranging picture elements in the task length map, a user viewing the task length map can intuitively learn the relevant information of the game tasks through the picture elements, saving the time needed to view the game tasks.
In addition, in the embodiments of the present application, when the operating body is recognized as staying at the position of the target picture element, the target picture element is highlighted, thereby prompting the user that the associated information of the target picture element can be triggered.
Referring to fig. 10, a schematic diagram of a user interface provided in another embodiment of the present application is shown. As shown in fig. 10, in some embodiments, the method may further include the steps of:
1. displaying a thumbnail 11 corresponding to the task length map in the user interface;
2. receiving a selection instruction for a target position 12 in the thumbnail 11;
3. in response to the selection instruction, displaying, in the user interface, the area 13 corresponding to the target position 12 in the task length map.
The thumbnail 11 corresponding to the task length map may be displayed at the top, the bottom, or the sides of the user interface. A selection instruction is generated by recognizing a user operation such as a click or a long press on the target position 12 in the thumbnail 11; according to the selection instruction, the area 13 corresponding to the target position 12 in the task length map is displayed in the user interface, which improves the efficiency of locating the target position 12 in the task length map.
In some embodiments, after the user selects the target position 12 in the thumbnail 11, the area 13 corresponding to the target position 12 in the task length map may first be displayed as a preview. If an undo instruction for the target position 12 is received, the previewed area 13 is closed; if a confirmation instruction for the target position 12 is received, the displayed area 13 is retained in the user interface.
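Locating the area 13 from the selected target position 12 amounts to scaling coordinates from thumbnail space to long-image space; a minimal sketch, with sizes and names assumed for illustration:

```python
def thumbnail_to_full(pos, thumb_size, full_size):
    """Scale a position selected in the thumbnail to coordinates in the
    full task length map, so the corresponding area can be displayed."""
    tx, ty = pos
    tw, th = thumb_size
    fw, fh = full_size
    return (tx * fw // tw, ty * fh // th)
```

The returned point would serve as the anchor of the area to display (or preview) in the user interface.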
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 11, a block diagram of a task presenting device according to an embodiment of the present application is shown. The apparatus 1100 has a function of implementing the example of the task displaying method, and the function may be implemented by hardware or by hardware executing corresponding software. The apparatus 1100 may be the terminal described above, or may be provided on the terminal. The apparatus 1100 may include: an interface display module 1110, a long map display module 1120, an instruction receiving module 1130, and a long map moving module 1140.
The interface display module 1110 is configured to display a user interface of a game application.
The long map display module 1120 is configured to display a first area of a task length map in the user interface, where the task length map is a picture used for displaying the game tasks provided by the game application, and is formed by overlapping a plurality of layers.
The instruction receiving module 1130 is configured to receive a move instruction for the task length map.
The long map moving module 1140 is configured to display a second area of the task long map in the user interface in response to the moving instruction.
To sum up, in the technical solution provided in the embodiments of the present application, a task length map is displayed in the user interface of a game application, the task length map being a picture for showing the game tasks provided by the game application. Different areas of the task length map can be displayed by moving it, so that different game tasks can be viewed. Displaying game tasks in the form of a task length map allows them to be flexibly switched and viewed simply by moving the map, which simplifies the operation of viewing game tasks and improves the efficiency and flexibility of that operation.
In some embodiments, as shown in fig. 12, the long map display module 1120 further includes a condition detection sub-module 1121 and a region display sub-module 1122.
The condition detection submodule 1121 is configured to detect whether the second region has a display condition.
The region display sub-module 1122 is configured to display the second region in the user interface in response to the second region being provided with the display condition.
In some embodiments, the task length map includes n regions, the regions are provided with corresponding game tasks, and n is a positive integer. The condition detection submodule 1121 configured to:
and detecting whether the game task corresponding to the second area is finished.
And responding to the completion of the game task corresponding to the second area, and executing the step of displaying the second area in the user interface.
In some embodiments, the instruction receiving module 1130 is configured to:
receiving a sliding operation signal aiming at the task length map;
or receiving a click operation signal aiming at the mobile control corresponding to the task long graph.
In some embodiments, as shown in fig. 12, the apparatus 1100 further comprises: a signal acquisition module 1150 and an information presentation module 1160.
The signal obtaining module 1150 is configured to obtain a trigger signal for a target picture element in the task length map.
The information displaying module 1160 is configured to display the associated information of the target picture element in response to the trigger signal.
In some embodiments, the information presentation module 1160 is to:
displaying the associated information of the target picture element in a text form;
and/or displaying the associated information of the target picture element in a sound form;
and/or displaying the associated information of the target picture element in an animation mode.
In some embodiments, as shown in fig. 12, the apparatus 1100 further comprises: a location acquisition module 1170, a location determination module 1180, and an element display module 1190.
The position obtaining module 1170 is configured to obtain a stop position of the operation body on the user interface.
The position determining module 1180 is configured to determine whether the operation body stays at the display position of the target picture element according to the staying position and the display position of each picture element in the task length map.
The element display module 1190 is configured to highlight the target picture element in response to the operation body staying at the display position of the target picture element.
In some embodiments, the picture elements in the task length map comprise 2D picture elements and/or 3D picture elements.
In some embodiments, the apparatus 1100 further comprises: thumbnail display module 1195.
The thumbnail display module 1195 is configured to display a thumbnail corresponding to the task length map in the user interface.
The instruction receiving module 1130 is further configured to receive a selection instruction for a target position in the thumbnail.
The long map display module 1120 is further configured to display, in response to the selection instruction, an area corresponding to the target position in the task long map in the user interface.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to fig. 13, a block diagram of a terminal according to an embodiment of the present application is shown. The terminal 1300 may be an electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia player, a wearable device, or a PC. The terminal is used for implementing the task display method provided in the embodiments above. The terminal may be terminal 1300 in the implementation environment shown in fig. 1. Specifically:
in general, terminal 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. The memory 1302 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1302 is used to store at least one instruction, at least one program, code set, or instruction set, configured to be executed by one or more processors to implement the task display method described above.
In some embodiments, terminal 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Processor 1301, memory 1302, and peripheral interface 1303 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1303 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, touch display 1305, camera 1306, audio circuitry 1307, positioning component 1308, and power supply 1309.
Those skilled in the art will appreciate that the configuration shown in fig. 13 is not intended to be limiting with respect to terminal 1300 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
In an exemplary embodiment, a computer-readable storage medium is also provided, in which at least one instruction, at least one program, code set, or instruction set is stored, which, when executed by a processor, implements the task display method described above.
In an exemplary embodiment, a computer program product is also provided, which, when executed by a processor, implements the task display method described above.
It should be understood that reference to "a plurality" herein means two or more. Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (12)

1. A task presentation method, characterized in that the method comprises:
displaying a user interface of a game application;
displaying a first area of a task length map in the user interface, wherein the task length map is a picture used for displaying a plurality of game tasks provided by the game application program, the task length map is formed by overlapping a plurality of layers, and the first area is used for displaying some of the plurality of game tasks;
receiving a movement instruction for the task long graph;
displaying a second area of the task length map in the user interface in response to the movement instruction.
2. The method of claim 1, wherein displaying the second region of the task length map in the user interface comprises:
detecting whether the second area has a display condition;
displaying the second area in the user interface in response to the second area being provided with the display condition.
3. The method according to claim 2, wherein the task length map includes n regions including the second region, the n regions are respectively provided with corresponding game tasks, and n is a positive integer;
the detecting whether the second area has a display condition includes:
detecting whether a game task corresponding to the second area is finished;
and responding to the completion of the game task corresponding to the second area, and executing the step of displaying the second area in the user interface.
4. The method of claim 1, wherein receiving a move instruction for the task length map comprises:
receiving a sliding operation signal aiming at the task length map;
or,
and receiving a click operation signal aiming at the mobile control corresponding to the task long graph.
5. The method of claim 1, wherein after displaying the first region of the task length map in the user interface, further comprising:
acquiring a trigger signal aiming at a target picture element in the task length chart;
and responding to the trigger signal, and displaying the associated information of the target picture element.
6. The method according to claim 5, wherein said presenting the associated information of the target picture element comprises:
displaying the associated information of the target picture element in a text form;
and/or,
displaying the associated information of the target picture element in a sound form;
and/or,
and displaying the associated information of the target picture element in an animation mode.
7. The method of claim 5, wherein after displaying the first region of the task length map in the user interface, further comprising:
acquiring a corresponding stop position of an operation body on the user interface;
determining whether the operation body stays at the display position of the target picture element or not according to the stay position and the display position of each picture element in the task length chart;
highlighting the target picture element in response to the operation body staying at the display position of the target picture element.
8. The method according to claim 5, wherein the picture elements in the task length map comprise 2D picture elements and/or 3D picture elements.
9. The method according to any one of claims 1 to 8, further comprising:
displaying a thumbnail corresponding to the task length map in the user interface;
receiving a selection instruction for a target position in the thumbnail;
and responding to the selection instruction, and displaying an area corresponding to the target position in the task length map in the user interface.
10. A task demonstration apparatus, the apparatus comprising:
the interface display module is used for displaying a user interface of the game application program;
the long image display module is used for displaying a first area of a task length map in the user interface, the task length map is a picture used for displaying a plurality of game tasks provided by the game application program, the task length map is formed by overlapping a plurality of layers, and the first area is used for displaying some of the plurality of game tasks;
the instruction receiving module is used for receiving a movement instruction aiming at the task long graph;
and the long image moving module is used for responding to the moving instruction and displaying a second area of the task long image in the user interface.
11. A terminal, characterized in that it comprises a processor and a memory, in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, which is loaded and executed by the processor to implement the task display method according to any one of claims 1 to 9.
12. A computer-readable storage medium, having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the task display method according to any one of claims 1 to 9.
CN202010220556.7A 2020-03-25 2020-03-25 Task display method, device, terminal and storage medium Active CN111475089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010220556.7A CN111475089B (en) 2020-03-25 2020-03-25 Task display method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010220556.7A CN111475089B (en) 2020-03-25 2020-03-25 Task display method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111475089A CN111475089A (en) 2020-07-31
CN111475089B true CN111475089B (en) 2021-06-01

Family

ID=71748429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010220556.7A Active CN111475089B (en) 2020-03-25 2020-03-25 Task display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111475089B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930287B (en) * 2020-08-12 2022-02-18 广州酷狗计算机科技有限公司 Interaction method and device based on virtual object, electronic equipment and storage medium
CN113082708B (en) * 2021-04-14 2022-05-03 网易(杭州)网络有限公司 Task guiding method and device in game
CN113476834B (en) * 2021-07-06 2024-06-25 网易(杭州)网络有限公司 Method and device for executing task in game, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104307174A (en) * 2014-10-14 2015-01-28 腾讯科技(深圳)有限公司 Method and device of showing user game data
CN108404411A (en) * 2018-01-05 2018-08-17 阿里巴巴集团控股有限公司 A kind of game information methods of exhibiting, device and equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5891183B2 (en) * 2013-01-16 2016-03-22 株式会社スクウェア・エニックス Video game processing apparatus and video game processing program

Also Published As

Publication number Publication date
CN111475089A (en) 2020-07-31

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40025724

Country of ref document: HK

GR01 Patent grant