CN113138780B - Terminal performance evaluation method, device, equipment and storage medium - Google Patents

Terminal performance evaluation method, device, equipment and storage medium

Info

Publication number
CN113138780B
Authority
CN
China
Prior art keywords
evaluation
performance
performance evaluation
target
terminal
Prior art date
Legal status
Active
Application number
CN202110546948.7A
Other languages
Chinese (zh)
Other versions
CN113138780A (en)
Inventor
徐婧
张国乾
韩榕
傅镝文
李慧珊
王梓瑞
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110546948.7A
Publication of CN113138780A
Application granted
Publication of CN113138780B
Status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/60: Software deployment
    • G06F 8/61: Installation
    • G06F 8/65: Updates
    • G06F 8/70: Software maintenance or management
    • G06F 8/71: Version control; Configuration management
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time or of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation of computer or user activity for performance assessment
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/368: Test management for test version control, e.g. updating test cases to a new software version
    • G06F 11/3692: Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Computer Security & Cryptography (AREA)
  • Debugging And Monitoring (AREA)
  • Stored Programmes (AREA)

Abstract

The application discloses a terminal performance evaluation method, device, equipment and storage medium. According to an evaluation instruction of a user, a performance evaluation process in which the terminal runs a target application is entered. During the performance evaluation, the evaluation resources and configuration information of the target application are loaded, the evaluation resources are run in the execution order given by the configuration information, and the corresponding evaluation tasks are executed to obtain performance evaluation data of the target performance evaluation items. While an evaluation task is executed, the application running animation corresponding to that task is played, which enriches the on-screen presentation that users pay close attention to and improves the user experience. After the performance evaluation is finished, the evaluation result of the terminal running the target application is displayed. Because the evaluation result is determined from the performance evaluation data, it reflects the performance of the terminal when running the target application, so performance evaluation in a more subdivided field is achieved: the user can learn from the evaluation result the performance advantages of the terminal when running the target application and is thereby guided to select a terminal with better cost performance for the target application.

Description

Terminal performance evaluation method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for evaluating terminal performance.
Background
With the rapid development of science and technology, terminals of all kinds keep growing richer and make everyday life more convenient. Because terminals process information conveniently and quickly and save resources, they have become an indispensable part of people's life and work.
Against this background, users have also become enthusiastic about terminal benchmarking ("running scores"): dedicated evaluation software is run on the terminal, the complex performance configuration is quantified through a set of measurements, and an exact score is calculated. The score is then used as the conclusion that characterizes the performance of the terminal, which is simple and makes it easy for users to understand and compare.
In practice, however, different users focus on different uses of their terminals, and their performance requirements also differ; for example, users who play games on the terminal have relatively high requirements on the performance of the terminal when running game applications. Most current evaluation software, though, is built around the overall performance of the terminal itself and is not used as a way of evaluating the performance of the terminal running a specific application in a subdivided field; the influence of the actual performance on the specific application is neither presented nor expressed. As a result, it cannot help users better understand the performance advantages of a terminal running different applications, nor guide them to select a terminal better suited to their use.
Disclosure of Invention
In order to solve the above technical problems, the application provides a terminal performance evaluation method, device, equipment and storage medium, which achieve performance testing in more subdivided fields; for example, the performance of the terminal running a target application can be tested, so that the user can learn from the evaluation result the performance advantages of the terminal when running the target application and is guided to select a terminal with better cost performance for the target application. In addition, while an evaluation task is executed, the application running animation corresponding to the task is played, which enriches the on-screen presentation that users pay close attention to and improves the user experience.
The embodiment of the application discloses the following technical scheme:
in a first aspect, an embodiment of the present application provides a method for evaluating terminal performance, where the method includes:
receiving an evaluation instruction of a user, and entering a performance evaluation process in which the terminal runs a target application;
during the performance evaluation, loading evaluation resources and configuration information of the target application, wherein the configuration information is used for configuring the execution order of the evaluation tasks corresponding to target performance evaluation items;
running the evaluation resources of the target application in the execution order to execute the corresponding evaluation tasks, and, while an evaluation task is executed, playing the application running animation corresponding to the task in an evaluation interface according to the evaluation resources, so as to obtain performance evaluation data of the target performance evaluation items;
and after the performance evaluation is finished, displaying a performance evaluation result of the terminal running the target application, wherein the performance evaluation result is determined according to the performance evaluation data of the target performance evaluation items.
In a second aspect, an embodiment of the present application provides a terminal performance evaluation device, where the device includes a receiving unit, a loading unit, an executing unit, a playing unit, and a display unit:
the receiving unit is configured to receive an evaluation instruction of a user and enter a performance evaluation process in which the terminal runs a target application;
the loading unit is configured to load evaluation resources and configuration information of the target application during the performance evaluation, the configuration information being used to configure the execution order of the evaluation tasks corresponding to the target performance evaluation items;
the execution unit is configured to run the evaluation resources of the target application in the execution order and execute the corresponding evaluation tasks, so as to obtain performance evaluation data of the target performance evaluation items;
the playing unit is configured to play, while an evaluation task is executed, the application running animation corresponding to the task in an evaluation interface according to the evaluation resources;
and the display unit is configured to display, after the performance evaluation is finished, a performance evaluation result of the terminal running the target application, the performance evaluation result being determined according to the performance evaluation data of the target performance evaluation items.
In a third aspect, an embodiment of the present application provides an apparatus for terminal performance evaluation, where the apparatus includes a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of the first aspect according to instructions in the program code.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing program code for performing the method of the first aspect.
According to the technical scheme above, after an evaluation instruction of the user is received, a performance evaluation process in which the terminal runs the target application is entered. During the performance evaluation, the evaluation resources and configuration information of the target application are loaded, the configuration information being used to configure the execution order of the evaluation tasks corresponding to the target performance evaluation items. The evaluation resources of the target application are then run in that execution order to execute the corresponding evaluation tasks, and the application running animation corresponding to each task is played in the evaluation interface according to the evaluation resources, so as to obtain performance evaluation data of the target performance evaluation items. Because the evaluation uses the target application's own evaluation resources, the performance evaluation data is obtained by evaluating the performance of the terminal while it runs the target application and can reflect that performance, which achieves performance evaluation in a more subdivided field. Accordingly, after the performance evaluation is finished, the evaluation result of the terminal running the target application is displayed; since the result is determined from the performance evaluation data of the target performance evaluation items, the user can learn from it the performance advantages of the terminal when running the target application and is guided to select a terminal with better cost performance for the target application. Meanwhile, the application running animation corresponding to each evaluation task is played while the task is executed, which enriches the on-screen presentation that users pay close attention to and improves the user experience.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the application, and a person skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an interface for evaluating performance of a terminal according to the related art;
FIG. 2 is a schematic diagram of an interface for evaluating performance of a terminal according to the related art;
fig. 3 is a schematic diagram of a system architecture of a terminal performance evaluation method according to an embodiment of the present application;
fig. 4 is a flowchart of a method for evaluating terminal performance according to an embodiment of the present application;
FIG. 5 is a schematic interface diagram of the evaluation software before evaluation according to an embodiment of the present application;
FIG. 6 is a schematic interface diagram of the evaluation software during evaluation according to an embodiment of the present application;
FIG. 7 is a schematic interface diagram of the evaluation software during evaluation according to an embodiment of the present application;
FIG. 8 is a schematic interface diagram of the evaluation software during evaluation according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a display interface of an evaluation result according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a display interface of an evaluation result according to an embodiment of the present application;
FIG. 11 is a flow chart of a method for evaluating terminal performance according to an embodiment of the present application;
FIG. 12 is a signaling interaction diagram for evaluating performance of a mobile phone running a game application according to an embodiment of the present application;
FIG. 13 is a block diagram of a terminal performance evaluation device according to an embodiment of the present application;
fig. 14 is a block diagram of a terminal according to an embodiment of the present application;
fig. 15 is a block diagram of a server according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings.
A terminal performance test method is provided in the related art, in which a score is used as the conclusion characterizing the performance of the terminal. Referring to fig. 1 and fig. 2, which show schematic diagrams of evaluation result interfaces for terminal performance evaluation: the evaluation result interface shown in fig. 1 shows only the score representing the evaluation result, 46078 points; the evaluation result interface shown in fig. 2 shows, in addition to the score representing the evaluation result, 651082 points, information about several terminal capabilities, namely the graphics processor (graphics processing unit, GPU), the central processing unit (central processing unit, CPU), memory (MEM) and user experience (UX).
However, this method can only test the overall performance of the terminal itself; it is not used as a way of evaluating the performance of the terminal running a specific application in a subdivided field, and the influence of the actual performance on the specific application is neither presented nor expressed.
In practice, different users focus on different uses of their terminals, and their performance requirements also differ. For example, users who play games on the terminal have relatively high requirements on the performance of the terminal when running game applications, while users who do mechanical design on the terminal have high requirements on the performance of the terminal when running mechanical design software. The current evaluation method, however, only reveals the performance of the terminal itself and cannot evaluate the performance of the terminal running a specific application, so users cannot learn the performance advantages of the terminal when running different applications and cannot be guided to select a terminal better suited to their use.
In addition, the evaluation results shown in fig. 1 and fig. 2 only present a score; from the score alone, the user cannot tell whether the performance it represents is good or bad, or to what extent, and therefore cannot intuitively and clearly understand the terminal's performance.
In order to solve the above technical problems, an embodiment of the application provides a terminal performance evaluation method that evaluates the performance of the terminal running a target application. During the performance evaluation, the evaluation resources and configuration information of the target application are loaded, so the evaluation is performed with the target application's own evaluation resources; the resulting performance evaluation data is obtained by evaluating the performance of the terminal while it runs the target application and can reflect that performance, which achieves performance evaluation in a more subdivided field. Accordingly, after the performance evaluation is finished, the displayed evaluation result, determined from the performance evaluation data of the target performance evaluation items, reflects the performance of the terminal running the target application, so the user can learn from it the performance advantages of the terminal when running the target application and is guided to select a terminal with better cost performance for the target application.
It should be noted that the terminal performance evaluation provided by the embodiment of the present application can be used to evaluate the performance of the terminal running various applications, for example the performance of the terminal running a game application (terminal game performance), the performance of the terminal running a mechanical design application, the performance of the terminal running an office application, and so on.
The method provided by the embodiment of the application relates to cloud technology, for example the field of cloud computing: the terminal performance test method provided by the embodiment can be implemented on the cloud, with the performance evaluation data and the final performance evaluation result obtained through cloud computing.
Cloud computing refers to the delivery and usage model of the Internet Technology (IT) infrastructure, meaning that required resources are obtained over a network in an on-demand, easily scalable manner; cloud computing in the broad sense refers to the delivery and usage model of services, meaning that required services are obtained over a network in an on-demand, easily scalable manner. Such services may be IT, software or internet related, or other services. Cloud computing is a product of the fusion of traditional computing and network technologies such as grid computing, distributed computing, parallel computing, utility computing, network storage, virtualization and load balancing.
The method may also relate to the field of cloud storage. Cloud storage is a concept that extends and develops from cloud computing. A distributed cloud storage system (hereinafter referred to simply as a storage system) is a storage system that, through functions such as cluster application, grid technology and a distributed storage file system, aggregates a large number of storage devices (also called storage nodes) of different types in a network so that they work cooperatively via application software or application interfaces and jointly provide data storage and service access functions. For example, the evaluation resources and configuration information required for the performance evaluation can be stored in cloud storage.
Next, the system architecture of the terminal performance evaluation method is described. Referring to fig. 3, fig. 3 is a schematic system architecture diagram of the terminal performance evaluation method according to an embodiment of the present application. The system architecture comprises a terminal 301 and an evaluation server 302. Evaluation software can be installed and run on the terminal 301, so that a user can trigger, through the evaluation software, the performance evaluation of a target application run by the terminal 301. The target application may be, for example, a separately downloaded and installed application such as a game application, a mechanical design application or an office application.
To evaluate the performance of the terminal 301 running the target application, the user may install the evaluation software on the terminal 301, open it and start the evaluation. The terminal 301 receives the user's evaluation instruction and enters the performance evaluation process in which the terminal 301 runs the target application.
During the performance evaluation, the terminal 301 loads the evaluation resources and configuration information of the target application, so that the performance of the terminal 301 running the target application can be evaluated. The configuration information is used to configure the execution order of the evaluation tasks corresponding to the target performance evaluation items; an evaluation task is a task created for acquiring the performance evaluation data of a target performance evaluation item, and that data is acquired by executing the task corresponding to the item.
An evaluation resource may be scene material of the target application required for the performance evaluation; taking a game application as an example, the evaluation resources are the game scene materials required for the performance evaluation. The target application may correspond to a plurality of evaluation resources, and the resources required for executing the evaluation tasks of different target performance evaluation items may be the same or different, as determined by actual requirements. For example, if the target performance evaluation item is special effects, the evaluation resource used is typically scene material in which game special effects need to be rendered.
The terminal 301 then runs the evaluation resources of the target application in the execution order to execute the corresponding evaluation tasks and, while each task is executed, plays the application running animation corresponding to the task according to the evaluation resources, so as to obtain the performance evaluation data of the target performance evaluation items. The terminal 301 may synchronize the performance evaluation data to the evaluation server 302, and after the performance evaluation is finished, the evaluation server 302 may determine the performance evaluation result of the terminal 301 running the target application according to the performance evaluation data. The terminal 301 may obtain the performance evaluation result and display it to the user.
Because the evaluation result is determined according to the performance evaluation data of the target performance evaluation items, and the performance evaluation data is obtained by evaluating with the target application's own evaluation resources, the result can reflect the performance of the terminal running the target application. The user can therefore learn from the result the performance advantages of the terminal when running the target application and is guided to select a terminal with better cost performance for the target application.
It should be noted that, in the embodiment of the present application, the evaluation server 302 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server that provides a cloud computing service. Terminal 301 may be, but is not limited to, a smart phone, tablet, notebook, desktop, smart box, smart watch, etc. The terminal 301 and the evaluation server 302 may be directly or indirectly connected through wired or wireless communication, and the present application is not limited herein.
The method provided by the embodiment of the application may be executed by the terminal alone, or by the terminal and the evaluation server in cooperation; the embodiment of the application is not limited in this respect.
Next, a detailed description will be given of a terminal performance evaluation method provided by the embodiment of the present application with reference to the accompanying drawings.
Referring to fig. 4, fig. 4 shows a flowchart of a terminal performance evaluation method, which includes:
s401, the terminal receives an evaluation instruction of a user and enters a performance evaluation process of the terminal running the target application.
The terminal performance evaluation method can be implemented by evaluation software; in the embodiment of the application the evaluation software is referred to as the GMark evaluation software. In order to test the performance of the terminal running the target application, the user installs the evaluation software on the terminal and then opens it to start the evaluation; the terminal receives the user's evaluation instruction and enters the performance evaluation process in which it runs the target application.
Referring to fig. 5, fig. 5 shows a schematic interface diagram of the evaluation software before evaluation. The user can click the "start evaluation" button in the interface to trigger the evaluation instruction, start the evaluation, and enter the performance evaluation process in which the terminal runs the target application.
It should be noted that, compared with the related art, the embodiment of the present application not only achieves evaluation in more subdivided fields, for example performance evaluation of the terminal running a target application, but also can evaluate separately the target performance evaluation items of different dimensions that users care about, obtain performance evaluation data for those items, and then derive the performance evaluation result of the terminal running the target application from that data.
Taking a game application as an example, for the performance of the terminal running the game application, users generally focus on game-side aspects of terminal performance such as fluency, game start time, endurance, special effects and graphics processor performance. In this case, the target performance evaluation items may include one or a combination of several of fluency, game start time, endurance, special effects and graphics processor performance, and the corresponding evaluation tasks are one or a combination of several of fluency evaluation, game start time evaluation, endurance evaluation, special effects evaluation and graphics processor performance evaluation.
In addition, the target performance evaluation items may also include performance evaluation items such as response latency, sound quality, screen, network and temperature.
It can be appreciated that the target performance evaluation items to be evaluated later can be preset, user-defined, or determined by a combination of preset and user-defined modes. For example, the performance evaluation items which are relatively common can be directly preset, and the performance evaluation items which are relatively unusual can be selected by the user according to the evaluation requirements of the user.
Thus, in one possible implementation, the terminal may present a performance evaluation item selection interface that includes a plurality of candidate performance evaluation items, and the candidate items differ for different target applications. If the target performance evaluation items are preset, the candidate performance evaluation items shown in the selection interface can be used directly as the target performance evaluation items. Referring to 501 in fig. 5, 501 illustrates the performance evaluation item selection interface when the target application is a game application; through this interface the user can learn that the core performance evaluation items of the current evaluation software include fluency, game start time, endurance, special effects and graphics processor performance (not all of which are shown at 501 in fig. 5). These performance evaluation items may be referred to as basic performance evaluation items.
If the target performance evaluation items are user-defined, the user can perform a selection operation on the performance evaluation item selection interface to choose the items to be evaluated, and the terminal determines the target performance evaluation items from the candidate items in response to the user's selection.
It should be noted that, besides the basic performance evaluation items shown at 501 in fig. 5, the candidate performance evaluation items may also include extended performance evaluation items, which may include, for example, a stress test, a network test and a game feature special effects test. The basic performance evaluation items are generally displayed with priority, i.e. they are shown as soon as the evaluation software is opened, while the user can browse the extended performance evaluation items by sliding the interface.
In some cases, the basic performance evaluation items may be regarded as fixed items, i.e. the target performance evaluation items must include them, while the extended performance evaluation items are selected by the user. Of course, the target performance evaluation items may also be selected by the user from both the basic and the extended performance evaluation items.
S402, the terminal loads evaluation resources and configuration information of the target application during the performance evaluation, wherein the configuration information is used for configuring the execution order of the evaluation tasks corresponding to the target performance evaluation items.
During the performance evaluation, the terminal can load the evaluation resources and configuration information of the target application in order to evaluate the performance of the terminal running the target application. The configuration information configures the execution order of the evaluation tasks corresponding to the target performance evaluation items; an evaluation task is a task created for acquiring the performance evaluation data of a target performance evaluation item, and that data is acquired by executing the corresponding task. For example, when the target application is a game application and the target performance evaluation items include fluency, game start time, endurance, special effects and graphics processor performance, the execution order of the corresponding evaluation tasks can be set through the configuration information, for example to execute fluency evaluation, game start time evaluation, endurance evaluation, special effects evaluation and graphics processor performance evaluation in sequence.
In addition, the configuration information may carry the algorithms for computing the performance evaluation data and the performance evaluation result, so that how to acquire the performance evaluation data and how to calculate the performance evaluation result can be known from the configuration information.
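For illustration only, the following Kotlin sketch models what such configuration information might look like; every field name (item, resourceId, engine, executionOrder, scoreWeights) is an assumption made for this sketch and is not taken from the patent's actual format.

```kotlin
// Illustrative model of the evaluation configuration described above; all names
// here are assumptions for the sketch, not the patent's actual configuration format.
data class EvaluationTask(
    val item: String,           // target performance evaluation item, e.g. "fluency"
    val resourceId: String,     // which evaluation resource (scene material) to run
    val engine: String? = null  // optional game engine, e.g. "UE4" or "Unity"
)

data class EvaluationConfig(
    val version: Int,                          // configuration version number
    val executionOrder: List<EvaluationTask>,  // evaluation tasks in execution order
    val scoreWeights: Map<String, Double>      // per-item weights used by the scoring model
)

// A hypothetical configuration matching the ordering example given in the text above.
val sampleConfig = EvaluationConfig(
    version = 3,
    executionOrder = listOf(
        EvaluationTask("fluency", "scene_city", engine = "UE4"),
        EvaluationTask("game start time", "scene_city", engine = "UE4"),
        EvaluationTask("endurance", "scene_forest", engine = "UE4"),
        EvaluationTask("special effects", "scene_effects", engine = "UE4"),
        EvaluationTask("graphics processor performance", "scene_gpu", engine = "UE4")
    ),
    scoreWeights = mapOf("fluency" to 1.0, "special effects" to 1.0)
)
```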
In the embodiment of the application, the evaluation resources used are the target application's own, real evaluation resources. Taking a game application as the target application, the evaluation resources used for the evaluation are real game scene materials. The game scene materials may be preset, but they can also be updated as the game application is updated or as new game applications appear, i.e. the evaluation resources are updated.
On this basis, before the terminal loads the evaluation resources and configuration information of the target application, it can obtain the version numbers of the local evaluation resources and configuration information; if those version numbers do not meet a preset condition, the terminal sends a target evaluation resource request and a target configuration information request to a file storage server, which may be a server in a content delivery network (Content Delivery Network, CDN). The terminal receives the target evaluation resources and target configuration information returned by the file storage server and uses them to update the local evaluation resources and configuration information, which then serve as the evaluation resources and configuration information of the target application.
It can be understood that the preset condition may be whether the local evaluation resources and configuration information are of the latest version. The latest version number can be obtained from the operation platform, so the terminal can request the version numbers of the evaluation resources and configuration information from the operation platform through the evaluation server, and the operation platform returns them to the terminal through the evaluation server. The terminal can then determine, from the local version numbers and the version numbers obtained from the operation platform, whether the preset condition is met. If the local version numbers are lower than the version numbers obtained from the operation platform, the local evaluation resources and configuration information are considered not to be the latest version and not to meet the preset condition; the terminal then assembles the target evaluation resource request and the target configuration information request, sends them to the file storage server, and downloads the target evaluation resources and target configuration information returned by the file storage server.
While the target evaluation resources and target configuration information returned by the file storage server are being downloaded, the user interface (UI) on the terminal can be updated in real time so that it shows the download status until the download is complete.
After the target evaluation resources and target configuration information are downloaded, the terminal can display an installation prompt asking the user whether to install them, i.e. whether to use them to update the local evaluation resources and configuration information. If the user chooses to install, the target evaluation resources and target configuration information are installed and the local evaluation resources and configuration information are updated, yielding the evaluation resources and configuration information of the target application, after which the evaluation tasks begin to execute.
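As a hedged illustration of the version check and update flow just described, the sketch below assumes simple integer version numbers and abstracts the operation platform, evaluation server and CDN file storage server behind callbacks; none of these names correspond to real interfaces from the patent.

```kotlin
// Sketch of the resource/configuration update check, under the assumptions above.
data class AssetVersions(val resourceVersion: Int, val configVersion: Int)

fun ensureLatestAssets(
    local: AssetVersions,
    fetchLatest: () -> AssetVersions,        // latest versions, via evaluation server / operation platform
    download: (kind: String) -> ByteArray,   // file download from the CDN file storage server
    askUserToInstall: () -> Boolean,         // the installation prompt shown to the user
    install: (kind: String, data: ByteArray) -> Unit
): Boolean {
    val latest = fetchLatest()
    val stale = local.resourceVersion < latest.resourceVersion ||
                local.configVersion < latest.configVersion
    if (!stale) return true                  // local copies already meet the preset condition
    val resource = download("evaluation-resource")
    val config = download("configuration")
    if (!askUserToInstall()) return false    // user declined the installation prompt
    install("evaluation-resource", resource)
    install("configuration", config)
    return true
}
```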
S403, the terminal runs the evaluation resources of the target application in the execution order to execute the corresponding evaluation tasks, and plays the application running animation corresponding to each evaluation task in the evaluation interface according to the evaluation resources, so as to obtain the performance evaluation data of the target performance evaluation items.
The evaluation resources required for executing the evaluation tasks of different target performance evaluation items may be the same or different, so the terminal runs the evaluation resources of the target application one by one in the execution order configured by the configuration information, executes the corresponding evaluation tasks, and plays the application running animation corresponding to each task in the evaluation interface according to the evaluation resources, thereby obtaining the performance evaluation data of the target performance evaluation items.
Taking a game application as an example, when the special effects evaluation is executed, a number of scene materials in which special effects are rendered are used as the evaluation resources for the special effects evaluation. While the special effects evaluation is executed, the game animation in which the special effects are rendered, i.e. the application running animation (for example, the game animation shown in fig. 8), can be played, so that the user can directly perceive the rendering effect, which improves the user experience.
It should be noted that if the target application is a game application, the running of the game application depends on a game engine, and different game engines may lead to different terminal game performance. Therefore, when evaluating the performance of the terminal running the game application, S403 may be implemented by running, for each game engine in turn, the evaluation resources corresponding to the target application under that engine in the execution order and executing the corresponding evaluation tasks for each engine.
For example, if the game engines include the UE4 engine and the Unity engine, performance evaluation data for the target performance evaluation items must be obtained for the UE4 engine and the Unity engine separately. If the target performance evaluation items include fluency, game start time, endurance, special effects and graphics processor performance, then for the game application run by the UE4 engine, fluency evaluation, game start time evaluation, endurance evaluation, special effects evaluation and graphics processor performance evaluation are performed in sequence to obtain the corresponding performance evaluation data; the same sequence of evaluations is then performed for the game application run by the Unity engine to obtain the corresponding performance evaluation data.
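The per-engine execution just described can be pictured as a nested loop. The following Kotlin sketch is only an illustration: the engine names, the item list and the runTask callback (which would load the scene material, play the running animation and sample metrics) are assumptions, not the patent's actual implementation.

```kotlin
// Illustrative main evaluation loop: for each game engine, run the configured
// items in order and collect the per-item performance evaluation data.
data class TaskResult(val item: String, val engine: String, val data: Map<String, Double>)

fun runEvaluation(
    engines: List<String>,                  // e.g. listOf("UE4", "Unity")
    orderedItems: List<String>,             // from the configuration's execution order
    runTask: (engine: String, item: String) -> Map<String, Double>
): List<TaskResult> =
    engines.flatMap { engine ->
        orderedItems.map { item ->
            // runTask is assumed to run the evaluation resource for this item under
            // this engine and return sampled metrics such as the real-time frame rate
            TaskResult(item, engine, runTask(engine, item))
        }
    }
```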
While an evaluation task is executed, the terminal can display the evaluation interface in real time, and the evaluation interface contains the picture of the evaluation resource of the target application being run by the terminal; for example, when the target application is a game application, the terminal can show the game picture in the evaluation interface in real time until the evaluation ends.
It should be noted that the embodiment of the application can evaluate target performance evaluation items of different dimensions, and completing all the evaluations takes a certain amount of time. To improve the user experience, the user should be able to follow the progress of the performance evaluation while waiting for it to complete; in one possible implementation, the terminal displays the execution progress of the evaluation tasks during the performance evaluation. The execution progress may reflect which of the evaluation tasks is currently being executed, or the evaluation time still required by the task currently being executed.
In the embodiment of the application, the execution progress of the evaluation tasks can be displayed in various ways. The first way is to show the progress directly through a text description, for example showing in the evaluation interface the text "fluency evaluation in progress, 1 minute remaining".
The second way is by means of multi-task nodes. Specifically, the terminal may create multi-task nodes, each task node representing one evaluation task, and present them in the evaluation interface. When a target evaluation task among the plurality of evaluation tasks is being executed, the target evaluation task is marked and its evaluation time is displayed.
Taking a game application as an example, if the evaluation tasks include fluency evaluation, special effects evaluation and graphics processor performance evaluation, multi-task nodes with these three evaluations as task nodes can be created and displayed as a rectangular progress bar in which each task node is represented by a rectangle, as shown at the bottom of the evaluation interface in fig. 6. If the fluency evaluation is currently being executed, i.e. it is the target evaluation task, it can be marked. The marking may be a color marking of the target evaluation task, a symbol marking, or the like. For example, the rectangle for fluency evaluation is shown in a color different from the rectangles for the other two evaluation tasks: in fig. 6 the fluency rectangle is marked in light gray, while the rectangles for special effects evaluation and graphics processor performance evaluation share another color, so the user knows that the fluency evaluation is currently being executed.
Because the embodiment of the application can perform performance evaluation for different game engines, when the evaluation targets a game application run by a game engine such as the UE4 engine, a progress bar for loading the game engine can be shown in the evaluation interface (as in fig. 7). After loading is complete, the target evaluation task starts to execute: the application running animation corresponding to the target evaluation task, such as the game animation shown in fig. 8, is played in the evaluation interface according to the task's evaluation resources, and the evaluation time of the target evaluation task is shown in the evaluation interface; for example, in fig. 8 the fluency evaluation requires about 5 minutes. The evaluation time is updated as the evaluation proceeds, reflecting the progress of the target evaluation task and improving the user experience.
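A minimal sketch of the task-node progress model described above is given below, assuming that a node carries only a name, a current-task flag and an optional remaining time; the data shape is an assumption for illustration, not the patent's UI model.

```kotlin
// One node per evaluation task; the currently executing task is marked and shows
// its remaining evaluation time, the others show nothing.
data class TaskNode(val name: String, val isCurrent: Boolean, val remainingSeconds: Int?)

fun buildProgressNodes(tasks: List<String>, currentIndex: Int, remainingSeconds: Int): List<TaskNode> =
    tasks.mapIndexed { i, name ->
        TaskNode(
            name = name,
            isCurrent = i == currentIndex,
            remainingSeconds = if (i == currentIndex) remainingSeconds else null
        )
    }
```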
In some cases there is performance evaluation data, such as the data corresponding to the state of the terminal's system, that does not itself correspond to a target performance evaluation item but is intermediate data generated during the evaluation: it changes in real time, reflects the real-time performance of the terminal running the target application, and can be used to calculate the performance evaluation data of the target performance evaluation items. Therefore, during the performance evaluation, the system state of the terminal can be monitored in real time and the performance evaluation data corresponding to the system state displayed.
The performance evaluation data corresponding to the system state may include the real-time frame rate, temperature, CPU occupancy, GPU occupancy, power consumption, memory and so on; referring to fig. 8, fig. 8 shows the real-time frame rate, temperature, CPU occupancy and GPU occupancy, and also indicates that the game engine targeted by the current evaluation is the UE4 engine.
By monitoring and displaying the performance evaluation data corresponding to the system state in real time during the evaluation, the user can follow the real-time performance of the terminal running the target application, which makes the experience clearer and more transparent.
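For illustration, the polling-style monitor below samples the system-state metrics listed above at a fixed interval and hands each sample to the UI and the scoring logic. The SystemProbe interface, the field names and the one-second interval are assumptions; real frame-rate, temperature and occupancy sources are platform specific and are not specified by the patent.

```kotlin
// Assumed shape of one system-state sample during an evaluation task.
data class SystemSample(
    val frameRate: Double, val temperatureC: Double,
    val cpuUsage: Double, val gpuUsage: Double,
    val powerMw: Double, val memoryMb: Double
)

interface SystemProbe { fun sample(): SystemSample }

// Poll the probe once per interval for the duration of a task and pass every
// sample to the caller (interface display and score calculation).
fun monitorSystemState(
    probe: SystemProbe,
    durationMs: Long,
    intervalMs: Long = 1_000,
    onSample: (SystemSample) -> Unit
) {
    val deadline = System.currentTimeMillis() + durationMs
    while (System.currentTimeMillis() < deadline) {
        onSample(probe.sample())
        Thread.sleep(intervalMs)
    }
}
```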
S404, after the performance evaluation is finished, the terminal displays the performance evaluation result of the terminal running the target application, the performance evaluation result being determined according to the performance evaluation data of the target performance evaluation items.
The performance evaluation result may be determined in several ways. In the first way, the terminal calculates it from the performance evaluation data of the target performance evaluation items. In the second way, the terminal synchronizes the performance evaluation data of the target performance evaluation items to the evaluation server, the evaluation server determines the performance evaluation result from that data, and when the terminal sends a performance evaluation result request, the server returns the result to the terminal. In the third way, the terminal calculates a performance evaluation result from the performance evaluation data, the server also calculates one as in the second way, and the result calculated by the terminal is then verified against the result calculated by the server: if the verification shows that the terminal's result is accurate, it is used as the final performance evaluation result; otherwise the server's result replaces it.
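A one-line sketch of the third determination mode is given below, assuming the verification is a simple numeric comparison with a tolerance; the tolerance value is an illustrative assumption, not part of the patent.

```kotlin
// Use the terminal's locally computed score when it matches the server's score,
// otherwise fall back to the server-computed score.
fun resolveResult(localScore: Double, serverScore: Double, tolerance: Double = 1e-6): Double =
    if (kotlin.math.abs(localScore - serverScore) <= tolerance) localScore else serverScore
```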
In one possible implementation, the performance evaluation result includes one or a combination of several of: a total performance score, a ranking list, the performance evaluation data corresponding to each target performance evaluation item, and the grade corresponding to each target performance evaluation item, where the total performance score is calculated from the performance evaluation data of the target performance evaluation items. The ranking list reflects where the performance of this terminal running the target application ranks among various terminals, so the user can see from it what level that performance reaches and can more easily choose a terminal with better performance.
The performance evaluation data corresponding to the target performance evaluation items reflects how good each item the user cares about is, so the user can see the quality of each item clearly. Because different users emphasize different target performance evaluation items (for example, some users emphasize fluency and others emphasize special effects), displaying the performance evaluation data of each item makes it easier for users to choose, based on that data, a terminal suited to their own use.
The performance evaluation data of a target performance evaluation item includes an evaluation score and sub-item content, where the sub-item content is the actual sub-item data from which the evaluation score is calculated, which makes the evaluation more objective.
The grade corresponding to each target performance evaluation item reflects how good that item is, so the user can intuitively and clearly see how well the terminal running the target application performs on each item and, according to these grades, select the terminal with the best cost performance for their own use.
Referring to fig. 9 and fig. 10: fig. 9 shows a total performance score (GMark score: 666666), a ranking such as "5th" and "surpasses 92% of handset game performance monsters", and the performance evaluation data corresponding to each target performance evaluation item, such as "fluency: 55 FPS", "endurance: 10 hours", "game start time: 10 s" and "stress evaluation: excellent".
Fig. 10 further shows scoring details, including the evaluation score of each target performance evaluation item, the sub-item content, and the grade corresponding to each item. The evaluation scores are, for example, "fluency 12256 points", "game start time 12256 points", endurance expressed as "power consumption 9536 points", and "special effects 44874 points". As shown in fig. 10, the fluency evaluation includes the sub-items "shooting evaluation 58 FPS", "MOBA evaluation 58 FPS" and "RPG evaluation 58 FPS", where FPS is frames per second (Frames Per Second), MOBA is a multiplayer online battle arena game (Multiplayer Online Battle Arena), and RPG is a role-playing game; the game start time evaluation includes the sub-items "shooting/58 FPS: 32 s", "MOBA/58 FPS: 16 s" and "RPG/58 FPS: 8 s". The grades corresponding to the target performance evaluation items are also shown in fig. 10; the grades may include, from high to low, SSS, SS, S, A and D, with a higher grade indicating a greater performance advantage. For example, the grade for fluency evaluation is SSS, the grade for game start time evaluation is SS, the grade for endurance evaluation is A, and the grade for special effects evaluation is D.
It should be noted that the total performance score and the evaluation score of each target performance evaluation item may be obtained through a preset scoring model, which computes them with the calculation algorithm configured in the configuration information.
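For illustration, the sketch below combines per-item evaluation scores into a total score using weights of the kind the configuration information could carry, and maps a score to one of the grades mentioned above. The weights, score ranges and grade thresholds are assumptions made for this sketch, not the patent's actual scoring model.

```kotlin
// Weighted total score over the per-item evaluation scores; missing weights default to 1.0.
fun totalScore(itemScores: Map<String, Double>, weights: Map<String, Double>): Double =
    itemScores.entries.sumOf { (item, score) -> score * (weights[item] ?: 1.0) }

// Map an item score to a grade from SSS (best) down to D, using assumed thresholds.
fun grade(score: Double, maxScore: Double): String {
    val ratio = score / maxScore
    return when {
        ratio >= 0.90 -> "SSS"
        ratio >= 0.80 -> "SS"
        ratio >= 0.70 -> "S"
        ratio >= 0.50 -> "A"
        else -> "D"
    }
}
```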
The evaluation result display interface shown in fig. 10 further includes a "re-evaluate" button; clicking it performs the performance evaluation again using the method provided by the embodiment of the present application.
According to the technical scheme above, after an evaluation instruction of the user is received, a performance evaluation process in which the terminal runs the target application is entered. During the performance evaluation, the evaluation resources and configuration information of the target application are loaded, the configuration information being used to configure the execution order of the evaluation tasks corresponding to the target performance evaluation items. The evaluation resources of the target application are then run in that execution order to execute the corresponding evaluation tasks, and the application running animation corresponding to each task is played in the evaluation interface according to the evaluation resources, so as to obtain performance evaluation data of the target performance evaluation items. Because the evaluation uses the target application's own evaluation resources, the performance evaluation data is obtained by evaluating the performance of the terminal while it runs the target application and can reflect that performance, which achieves performance evaluation in a more subdivided field. Accordingly, after the performance evaluation is finished, the evaluation result of the terminal running the target application is displayed; since the result is determined from the performance evaluation data of the target performance evaluation items, the user can learn from it the performance advantages of the terminal when running the target application and is guided to select a terminal with better cost performance for the target application. Meanwhile, the application running animation corresponding to each evaluation task is played while the task is executed, which enriches the on-screen presentation that users pay close attention to and improves the user experience.
It should be noted that the method provided by the embodiment of the application can obtain not only the performance of the terminal running the target application but also the performance of the terminal itself, for example its CPU performance. In this embodiment, the CPU performance of the terminal can be evaluated by running floating point calculation, compression algorithms, analysis algorithms, encryption algorithms, matrix operations, physical simulation and the like, and judging by the calculation speed and result.
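As a hedged illustration of this idea, the sketch below times two pure computation workloads (floating point and matrix multiplication) and derives a score from the elapsed time. The workload sizes and the score formula are assumptions for the sketch and do not reproduce the patent's actual CPU evaluation.

```kotlin
import kotlin.math.sqrt
import kotlin.system.measureNanoTime

// A simple floating point workload: accumulate square roots over many iterations.
fun floatingPointWorkload(iterations: Int = 5_000_000): Double {
    var acc = 0.0
    for (i in 1..iterations) acc += sqrt(i.toDouble()) * 1.0000001
    return acc
}

// A dense matrix multiplication workload of assumed size n x n.
fun matrixWorkload(n: Int = 200): Double {
    val a = Array(n) { DoubleArray(n) { 1.0 } }
    val b = Array(n) { DoubleArray(n) { 2.0 } }
    val c = Array(n) { DoubleArray(n) }
    for (i in 0 until n) for (k in 0 until n) for (j in 0 until n) c[i][j] += a[i][k] * b[k][j]
    return c[0][0]
}

// Faster execution yields a higher score; the constant only sets the scale.
fun cpuScore(): Double {
    val t1 = measureNanoTime { floatingPointWorkload() }
    val t2 = measureNanoTime { matrixWorkload() }
    return 1e12 / (t1 + t2)
}
```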
Based on the foregoing description, taking the target application being a game application and the evaluation software being GMark evaluation software as an example, a flowchart of the terminal performance evaluation method provided by the embodiment of the present application may be as shown in fig. 11. The flowchart mainly involves the target performance evaluation items, the evaluation main logic, and the system state of the terminal. The target performance evaluation items include target performance evaluation items of different game engines (e.g., the UE4 engine and the Unity engine) as well as CPU evaluation. The evaluation main logic of the GMark evaluation software starts the evaluation resources of the target application according to the configuration information to execute the evaluation tasks corresponding to the target performance evaluation items.
In terms of performance evaluation of a terminal running a game application, the evaluation is divided into UE4 evaluation and Unity evaluation and comprises fluency evaluation, special effect evaluation, and GPU performance evaluation, where the fluency evaluation and the special effect evaluation may also include off-screen rendering. The game performance evaluation takes the real-time frame rate as the core calculation index, and the performance evaluation data corresponding to the target performance evaluation items are calculated in the scoring model according to characteristic values of the real-time frame rate. In terms of CPU evaluation, complex calculations are run, including floating point calculation, compression algorithms, analysis algorithms, encryption algorithms, matrix operations, physical simulation, and the like, and the CPU performance of the game terminal is evaluated through the calculation speed and results. In the performance evaluation process, the GMark evaluation main logic can simultaneously monitor the system state of the terminal, including CPU occupancy, GPU occupancy, temperature, power consumption, memory, and the like, and the corresponding performance evaluation data are obtained for interface display and for participation in score calculation. For example, CPU occupancy, GPU occupancy, temperature, etc. may be displayed in the interface in real time, and the power consumption may be used to calculate the endurance time, etc. The final performance evaluation result obtained by the scoring model is used for ranking list calculation and is displayed to the user.
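As a concrete picture of how characteristic values of the real-time frame rate might be extracted, the following sketch derives an average FPS, a 1% low FPS, and a jank count from a sequence of frame timestamps; these particular features and the 50 ms jank threshold are illustrative assumptions rather than the feature set of the scoring model described here.

```python
def frame_rate_features(timestamps, jank_threshold_s=0.050):
    """Characteristic values of the real-time frame rate derived from a list of
    frame timestamps (in seconds). The feature set and jank threshold are
    illustrative assumptions."""
    frame_times = [later - earlier for earlier, later in zip(timestamps, timestamps[1:])]
    if not frame_times:
        return {}
    fps_samples = sorted(1.0 / ft for ft in frame_times if ft > 0)
    low_index = max(0, int(len(fps_samples) * 0.01) - 1)
    return {
        "avg_fps": len(frame_times) / sum(frame_times),
        "low_1pct_fps": fps_samples[low_index],
        "jank_count": sum(ft > jank_threshold_s for ft in frame_times),
    }

if __name__ == "__main__":
    # A steady trace of 58 frames per second lasting about 10 seconds.
    trace = [i / 58.0 for i in range(581)]
    print(frame_rate_features(trace))
```

Features of this kind could then feed a scoring model such as the one sketched earlier.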
Next, the terminal performance evaluation method provided by the embodiment of the present application will be described in connection with an actual application scenario. Suppose the terminal is a mobile phone and the target application is a game application. A user who frequently plays games hopes, when buying a mobile phone, that the phone runs the game application well, for example with high fluency and very good special effects. Therefore, to help the user understand how well a mobile phone runs the game application, and thereby guide the user to select a mobile phone with better cost performance for games, performance evaluation of the mobile phone running the game application can be realized by installing the GMark evaluation software on the terminal. Referring to fig. 12, fig. 12 shows a signaling interaction diagram for evaluating the performance of a mobile phone running a game application, and the method includes:
S1201, the user opens the GMark evaluation software.
S1202, the user starts the evaluation.
S1203, the mobile phone acquires the version numbers of the local evaluation resources and configuration information.
S1204, the mobile phone requests the version numbers of the evaluation resources and configuration information from the evaluation server.
S1205, the evaluation server requests the version numbers of the evaluation resources and the configuration information from the operation platform.
S1206, the operation platform returns the version numbers of the evaluation resources and configuration information to the evaluation server.
S1207, the evaluation server returns the version numbers of the evaluation resources and the configuration information to the mobile phone.
S1208, the mobile phone updates the version numbers of the local evaluation resources and configuration information.
The mobile phone obtains the latest version numbers from the operation platform, so that it can determine, according to the version numbers of the local evaluation resources and configuration information and the version numbers obtained from the operation platform, whether the local version numbers meet the preset condition. If the version numbers of the local evaluation resources and configuration information are smaller than the version numbers obtained from the operation platform, the local evaluation resources and configuration information can be considered not to be the latest version, the preset condition is not met, and the latest evaluation resources and configuration information need to be obtained.
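A minimal sketch of this version check is given below, assuming dotted numeric version strings and treating "local version smaller than the version obtained from the operation platform" as the preset condition; the request assembly and network calls of steps S1209 to S1211 are only indicated in comments.

```python
def parse_version(version):
    """Turn a dotted version string such as '1.4.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def needs_update(local_version, platform_version):
    """Assumed preset condition: the local evaluation resources / configuration
    information are out of date if their version number is smaller than the
    version number obtained from the operation platform."""
    return parse_version(local_version) < parse_version(platform_version)

if __name__ == "__main__":
    if needs_update("1.3.0", "1.4.0"):
        # Corresponds to steps S1209-S1211: assemble the target evaluation
        # resource request and target configuration information request and
        # send them to the file storage server (network calls not shown).
        print("request the latest evaluation resources and configuration information")
```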
S1209, the mobile phone assembles a target evaluation resource request and a target configuration information request.
The latest evaluation resources and configuration information are requested from the file storage server through the target evaluation resource request and the target configuration information request.
S1210, the mobile phone sends a target evaluation resource request and a target configuration information request to the file storage server.
S1211, the mobile phone receives target evaluation resources and target configuration information returned by the file storage server.
The target evaluation resource and the target configuration information are the latest evaluation resource and configuration information of the target application.
S1212, the mobile phone updates the UI to show the download state in real time.
During this process, the mobile phone can update the UI and display the download state in real time to reflect the download progress of the target evaluation resources and the target configuration information.
S1213, the mobile phone stores the target evaluation resource and the target configuration information.
S1214, the mobile phone asks the user whether to install the target evaluation resources and the target configuration information.
S1215, the user selects installation.
S1216, the mobile phone installs the target evaluation resources and the target configuration information, and starts the evaluation.
S1217, the mobile phone sequentially runs the evaluation resources of the target application according to the configuration information of the target application, and executes the evaluation task.
S1218, the mobile phone displays the execution progress of the evaluation task in the evaluation interface.
That is, the mobile phone updates the UI to present the performance evaluation process in real time, reflecting the execution progress of the evaluation tasks. This may include, for example, displaying the game animation corresponding to the evaluation task currently being executed by the game application, displaying the plurality of evaluation tasks while marking which evaluation task is currently being executed, and displaying the execution progress of the task being executed.
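The configuration-driven execution and progress reporting described above can be pictured with the following sketch, in which evaluation tasks run in the order given by the configuration information and a progress callback lets the interface mark the task currently being executed; the task names, configuration shape, and callback signature are illustrative assumptions.

```python
import time

# Illustrative configuration information: the execution order of the
# evaluation tasks (task names are hypothetical).
CONFIG = {"execution_order": ["UE4 fluency", "Unity fluency", "game start time"]}

def run_evaluation(config, tasks, on_progress):
    """Run the evaluation tasks in the configured order and report progress so
    the evaluation interface can mark the task currently being executed."""
    order = config["execution_order"]
    results = {}
    for index, name in enumerate(order):
        on_progress(name, index, len(order))
        started = time.perf_counter()
        results[name] = tasks[name]()  # execute the evaluation task
        results[name]["elapsed_s"] = time.perf_counter() - started
    return results

if __name__ == "__main__":
    # Dummy tasks standing in for running the evaluation resources.
    tasks = {name: (lambda n=name: {"task": n}) for name in CONFIG["execution_order"]}
    report = run_evaluation(
        CONFIG, tasks,
        on_progress=lambda name, i, total: print(f"[{i + 1}/{total}] running {name}"),
    )
    print(report)
```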
S1219, the mobile phone monitors the performance evaluation data corresponding to the system state in real time.
S1220, the mobile phone synchronizes performance evaluation data to the evaluation server.
S1221, finishing performance evaluation.
S1222, the mobile phone requests performance evaluation results from the evaluation server.
S1223, the mobile phone obtains a performance evaluation result returned by the evaluation server.
S1224, the mobile phone displays the performance evaluation result to the user.
Based on the terminal performance evaluation method provided in the embodiment corresponding to fig. 4, an embodiment of the present application further provides a terminal performance evaluation device. Referring to fig. 13, the device includes a receiving unit 1301, a loading unit 1302, an executing unit 1303, a playing unit 1304, and a displaying unit 1305:
the receiving unit 1301 is configured to receive an evaluation instruction of a user, and enter a performance evaluation process of a terminal running target application;
the loading unit 1302 is configured to load an evaluation resource and configuration information of the target application during the performance evaluation process, where the configuration information is used to configure an execution sequence of an evaluation task corresponding to a target performance evaluation item;
the executing unit 1303 is configured to run the evaluation resources of the target application in the execution sequence to execute the corresponding evaluation tasks, so as to obtain performance evaluation data of the target performance evaluation items;
The playing unit 1304 is configured to play, when an evaluation task is executed, an application running animation corresponding to the evaluation task in an evaluation interface according to the evaluation resource;
and the display unit 1305 is configured to display a performance evaluation result of the terminal running the target application after performance evaluation is completed, where the performance evaluation result is obtained according to performance evaluation data of the target performance evaluation item.
In one possible implementation, the display unit 1305 is further configured to:
and in the performance evaluation process, displaying the execution progress of the evaluation task in the evaluation interface.
In one possible implementation, the display unit 1305 is specifically configured to:
creating multiple task nodes, wherein each task node represents one evaluation task;
displaying the multiple task nodes in the evaluation interface;
when a target evaluation task in a plurality of evaluation tasks is executed, marking the target evaluation task, and displaying the evaluation time of the target evaluation task.
In a possible implementation manner, the device further includes a monitoring unit:
the monitoring unit is used for monitoring the system state of the terminal in real time in the performance evaluation process;
The display unit 1305 is further configured to display performance evaluation data corresponding to the system state.
In one possible implementation manner, the performance evaluation result includes one or more combinations of a performance total score, a ranking list, performance evaluation data corresponding to each target performance evaluation item, and a level corresponding to each target performance evaluation item, where the performance total score is calculated according to the performance evaluation data corresponding to the target performance evaluation item.
In one possible implementation manner, the performance evaluation data corresponding to the target performance evaluation item includes an evaluation score and sub item content, where the sub item content is sub item actual data according to which the evaluation score is calculated.
In one possible implementation manner, if the target application is a game application, the executing unit 1303 is configured to:
and sequentially running evaluation resources corresponding to the target application under different game engines according to the execution sequence, and executing corresponding evaluation tasks aiming at different game engines.
In one possible implementation, the target performance evaluation items include one or more of smoothness, game start time, endurance time, special effects, and graphics processor performance.
In one possible implementation, the display unit 1305 is further configured to:
displaying a performance evaluation item selection interface, wherein the performance evaluation item selection interface comprises a plurality of candidate performance evaluation items;
the apparatus further includes a determining unit that determines the target performance evaluation item from the plurality of candidate performance evaluation items in response to a selection operation by the user.
In one possible implementation manner, the apparatus further includes an acquisition unit and a transmission unit:
the acquisition unit is used for acquiring the version numbers of the local evaluation resources and the configuration information;
the sending unit is used for sending a target evaluation resource request and a target configuration information request to the file storage server if the version numbers of the local evaluation resource and the configuration information do not accord with preset conditions;
the receiving unit 1301 is further configured to receive a target evaluation resource and target configuration information returned by the file storage server, and update local evaluation resource and configuration information with the target evaluation resource and the target configuration information, as evaluation resource and configuration information of the target application.
In a possible implementation manner, the apparatus further includes a synchronization unit and a transmission unit:
The synchronization unit is used for synchronizing the performance evaluation data of the target performance evaluation item to an evaluation server so that the evaluation server can determine the performance evaluation result according to the performance evaluation data of the target performance evaluation item;
and the sending unit is used for sending a performance evaluation result request to the evaluation server.
According to the technical scheme, after the evaluation instruction of the user is received, the performance evaluation process of the terminal running the target application is entered. In the performance evaluation process, the evaluation resources and configuration information of the target application are loaded, and the configuration information is used to configure the execution sequence of the evaluation tasks corresponding to the target performance evaluation items. Then, the evaluation resources of the target application are run in the execution sequence and the corresponding evaluation tasks are executed to obtain the performance evaluation data of the target performance evaluation items. Because the evaluation process uses the evaluation resources of the target application, the performance evaluation data are obtained by evaluating the performance of the terminal running the target application and can reflect that performance, thereby realizing performance evaluation in a more subdivided field. Accordingly, after the performance evaluation is completed, the evaluation result of the terminal running the target application is displayed, and since the evaluation result is determined according to the performance evaluation data of the target performance evaluation items, the user can learn from it how well the terminal runs the target application, which in turn guides the user to select a terminal with better cost performance for the target application. Meanwhile, because the application running animation corresponding to each evaluation task is played while the task is executed, the on-screen presentation that users pay close attention to is enriched, improving the user experience.
The embodiment of the present application further provides a device for terminal performance evaluation, which may be the terminal itself. Taking the terminal being a smart phone as an example:
fig. 14 is a block diagram showing a part of a structure of a smart phone related to a terminal provided by an embodiment of the present application. Referring to fig. 14, the smart phone includes: Radio Frequency (RF) circuit 1410, memory 1420, input unit 1430, display unit 1440, sensor 1450, audio circuit 1460, wireless fidelity (WiFi) module 1470, processor 1480, and power supply 1490. The input unit 1430 may include a touch panel 1431 and other input devices 1432, the display unit 1440 may include a display panel 1441, and the audio circuit 1460 may include a speaker 1461 and a microphone 1462. Those skilled in the art will appreciate that the smartphone structure shown in fig. 14 is not limiting of the smartphone and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
The memory 1420 may be used to store software programs and modules, and the processor 1480 performs various functional applications and data processing of the smartphone by running the software programs and modules stored in the memory 1420. The memory 1420 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the data storage area may store data (such as audio data, phonebooks, etc.) created according to the use of the smart phone. In addition, the memory 1420 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 1480 is a control center of the smart phone, connects various parts of the entire smart phone using various interfaces and lines, and performs various functions of the smart phone and processes data by running or executing software programs and/or modules stored in the memory 1420, and calling data stored in the memory 1420, thereby performing overall monitoring of the smart phone. In the alternative, processor 1480 may include one or more processing units; preferably, the processor 1480 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 1480.
In this embodiment, the processor 1480 in the terminal may perform the following steps:
receiving an evaluation instruction of a user, and entering a performance evaluation process of a terminal running target application;
in the performance evaluation process, loading evaluation resources and configuration information of the target application, wherein the configuration information is used for configuring the execution sequence of evaluation tasks corresponding to target performance evaluation items;
running the evaluation resources of the target application in the execution sequence to execute the corresponding evaluation tasks, and playing application running animations corresponding to the evaluation tasks in an evaluation interface according to the evaluation resources, so as to obtain performance evaluation data of the target performance evaluation items;
and after the performance evaluation is finished, displaying a performance evaluation result of the terminal running the target application, wherein the performance evaluation result is determined according to the performance evaluation data of the target performance evaluation item.
Referring to fig. 15, fig. 15 is a schematic diagram of a server 1500 according to an embodiment of the present application. The server 1500 may vary considerably depending on configuration or performance, and may include one or more central processing units (CPUs) 1522 (e.g., one or more processors), a memory 1532, and one or more storage media 1530 (e.g., one or more mass storage devices) storing application programs 1542 or data 1544. The memory 1532 and the storage medium 1530 may be transitory or persistent storage. The program stored on the storage medium 1530 may include one or more modules (not shown), each of which may include a series of instruction operations on the server. Still further, the central processing unit 1522 may be configured to communicate with the storage medium 1530 and execute, on the server 1500, the series of instruction operations stored on the storage medium 1530.
The server 1500 may also include one or more power supplies 1526, one or more wired or wireless network interfaces 1550, one or more input/output interfaces 1558, and/or one or more operating systems 1541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
In the present embodiment, the steps performed by the server in the above-described method may be implemented based on the structure of the server 1500 shown in fig. 15.
According to an aspect of the present application, there is provided a computer-readable storage medium for storing a program code for executing the terminal performance evaluation method according to the foregoing embodiments.
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the methods provided in the various alternative implementations of the above embodiments.
The terms "first," "second," "third," "fourth," and the like in the description of the application and in the above figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied, in essence or in whole or in part, in the form of a software product stored in a storage medium, which includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (15)

1. A terminal performance evaluation method, characterized by comprising the following steps:
receiving an evaluation instruction of a user, and entering a performance evaluation process of a terminal running target application;
in the performance evaluation process, loading evaluation resources and configuration information of the target application, wherein the configuration information is used for configuring the execution sequence of evaluation tasks corresponding to target performance evaluation items;
running the evaluation resources of the target application in the execution sequence to execute corresponding evaluation tasks, and playing application running animations corresponding to the evaluation tasks in an evaluation interface according to the evaluation resources to obtain performance evaluation data of the target performance evaluation items;
And after the performance evaluation is finished, displaying a performance evaluation result of the terminal running the target application, wherein the performance evaluation result is determined according to the performance evaluation data of the target performance evaluation item.
2. The method according to claim 1, wherein the method further comprises:
and in the performance evaluation process, displaying the execution progress of the evaluation task in the evaluation interface.
3. The method of claim 2, wherein the presenting the progress of execution of the assessment task comprises:
creating multiple task nodes, wherein each task node represents one evaluation task;
displaying the multiple task nodes in the evaluation interface;
when a target evaluation task in a plurality of evaluation tasks is executed, marking the target evaluation task, and displaying the evaluation time of the target evaluation task.
4. The method according to claim 1, wherein the method further comprises:
in the performance evaluation process, monitoring the system state of the terminal in real time;
and displaying the performance evaluation data corresponding to the system state.
5. The method according to any one of claims 1 to 4, wherein the performance evaluation result includes one or more combinations of a performance score, a ranking list, performance evaluation data corresponding to each target performance evaluation item, and a ranking corresponding to each target performance evaluation item, and the performance score is calculated according to the performance evaluation data corresponding to the target performance evaluation item.
6. The method of claim 5, wherein the performance evaluation data corresponding to the target performance evaluation item includes an evaluation score and sub-item content, the sub-item content being sub-item actual data on which the evaluation score is calculated.
7. The method according to claim 1, wherein if the target application is a game application, the running the evaluation resources of the target application in the execution sequence to execute corresponding evaluation tasks comprises:
and sequentially running evaluation resources corresponding to the target application under different game engines according to the execution sequence, and executing corresponding evaluation tasks aiming at different game engines.
8. The method of claim 7, wherein the target performance evaluation items include one or more of fluency, game start-up time, endurance, special effects, graphics processor performance, and combinations thereof.
9. The method according to any one of claims 1-4, further comprising:
displaying a performance evaluation item selection interface, wherein the performance evaluation item selection interface comprises a plurality of candidate performance evaluation items;
the target performance evaluation item is determined from the plurality of candidate performance evaluation items in response to a selection operation by the user.
10. The method according to any of claims 1-4, wherein prior to loading the evaluation resources and configuration information of the target application, the method further comprises:
acquiring a local evaluation resource and a version number of configuration information;
if the version numbers of the local evaluation resources and the configuration information do not accord with the preset conditions, sending a target evaluation resource request and a target configuration information request to a file storage server;
and receiving target evaluation resources and target configuration information returned by the file storage server, and updating local evaluation resources and configuration information by utilizing the target evaluation resources and the target configuration information to serve as evaluation resources and configuration information of the target application.
11. The method according to any one of claims 1 to 4, wherein before the presenting the performance evaluation result of the terminal running the target application, the method further includes:
synchronizing performance evaluation data of the target performance evaluation items to an evaluation server so that the evaluation server can determine the performance evaluation result according to the performance evaluation data of the target performance evaluation items;
and sending a performance evaluation result request to the evaluation server.
12. A terminal performance evaluation device, characterized by comprising a receiving unit, a loading unit, an executing unit, a playing unit, and a display unit:
the receiving unit is used for receiving an evaluation instruction of a user and entering a performance evaluation process of a terminal operation target application;
the loading unit is used for loading the evaluation resource and the configuration information of the target application in the performance evaluation process, and the configuration information is used for configuring the execution sequence of the evaluation task corresponding to the target performance evaluation item;
the execution unit is used for running the evaluation resources of the target application according to the execution sequence, executing the corresponding evaluation tasks and obtaining the performance evaluation data of the target performance evaluation items;
the playing unit is used for playing the application running animation corresponding to the evaluation task in the evaluation interface according to the evaluation resource when the evaluation task is executed;
and the display unit is used for displaying the performance evaluation result of the terminal running the target application after the performance evaluation is completed, wherein the performance evaluation result is obtained according to the performance evaluation data of the target performance evaluation item.
13. The apparatus of claim 12, wherein the display unit is further configured to:
and in the performance evaluation process, displaying the execution progress of the evaluation task in the evaluation interface.
14. An apparatus for terminal performance evaluation, the apparatus comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of any of claims 1-11 according to instructions in the program code.
15. A computer readable storage medium, characterized in that the computer readable storage medium is for storing a program code for performing the method of any one of claims 1-11.
CN202110546948.7A 2021-05-19 2021-05-19 Terminal performance evaluation method, device, equipment and storage medium Active CN113138780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110546948.7A CN113138780B (en) 2021-05-19 2021-05-19 Terminal performance evaluation method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110546948.7A CN113138780B (en) 2021-05-19 2021-05-19 Terminal performance evaluation method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113138780A CN113138780A (en) 2021-07-20
CN113138780B true CN113138780B (en) 2023-09-15

Family

ID=76817378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110546948.7A Active CN113138780B (en) 2021-05-19 2021-05-19 Terminal performance evaluation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113138780B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113656276B (en) * 2021-08-26 2024-02-09 深圳市腾讯网络信息技术有限公司 Equipment performance detection method for game program and related device

Citations (3)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104112003A (en) * 2014-07-14 2014-10-22 广州华多网络科技有限公司 Method and system for detecting performance of game terminals
CN104268047A (en) * 2014-09-18 2015-01-07 北京安兔兔科技有限公司 Electronic equipment performance testing method and device
CN107544805A (en) * 2017-09-04 2018-01-05 同济大学 Mobile phone games system architecture based on android system reusable framework

Also Published As

Publication number Publication date
CN113138780A (en) 2021-07-20

Similar Documents

Publication Publication Date Title
JP6670714B2 (en) Information processing system, server, information processing program, and object assignment method
JP6573397B2 (en) Information processing system, server, information processing program, and object granting method
CN111708948B (en) Content item recommendation method, device, server and computer readable storage medium
CN111701241B (en) Form switching method and device, storage medium and computer equipment
CN112657186B (en) Game interaction method and device
US20210065421A1 (en) Moving image distribution system, moving image distribution method, and moving image distribution program
JP6337051B2 (en) Program and system
CN113138780B (en) Terminal performance evaluation method, device, equipment and storage medium
CN111729291A (en) Interaction method, interaction device, electronic equipment and storage medium
CN114191822A (en) Test method, test device, computer equipment, storage medium and product
CN105935492A (en) Game device and interaction system
CN110461429A (en) For managing the system and method that experience is added in dynamic select in virtual environment
JP2019025067A (en) Program, information processor, and control method
KR101508766B1 (en) Game system and operation method thereof
JP6890641B2 (en) Information processing system, server, information processing program, and object assignment method
US10943319B2 (en) Information processing system, information processing apparatus, server, storage medium having stored therein information processing program, and information processing method
KR101183731B1 (en) Method and server for providing service of using item
CN112138389A (en) Game role recommendation display method, system and equipment
CN110298702B (en) Information display method and device, intelligent robot, storage medium and electronic equipment
JP6366224B2 (en) Game server device
JP6495522B2 (en) Information processing system, server, information processing program, and object granting method
CN113750540B (en) Game matching method, game matching device, storage medium and computer program product
US20230041552A1 (en) Relevancy-based video help in a video game
JP6640270B2 (en) Video game processing program and video game processing system
CN113926200A (en) Task completion method and related device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40047480

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant