CN113886208B - Data processing method, device, equipment and storage medium - Google Patents

Data processing method, device, equipment and storage medium

Info

Publication number
CN113886208B
CN113886208B
Authority
CN
China
Prior art keywords
evaluation
game
interface
client
picture frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111191647.3A
Other languages
Chinese (zh)
Other versions
CN113886208A
Inventor
王金桂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Tencent Domain Computer Network Co Ltd
Original Assignee
Shenzhen Tencent Domain Computer Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Tencent Domain Computer Network Co Ltd filed Critical Shenzhen Tencent Domain Computer Network Co Ltd
Priority to CN202111191647.3A priority Critical patent/CN113886208B/en
Publication of CN113886208A publication Critical patent/CN113886208A/en
Application granted granted Critical
Publication of CN113886208B publication Critical patent/CN113886208B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 Shooting of targets

Abstract

An embodiment of the application discloses a data processing method, apparatus, device and storage medium. The method includes: displaying N evaluation controls on an evaluation main interface of an evaluation client; in response to a first trigger operation on a target evaluation control among the N evaluation controls, displaying an evaluation game interface of the game sub-client corresponding to the target evaluation control, and displaying a first game picture frame on the evaluation game interface; in response to a second trigger operation, displaying a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from first evaluation prompt information to second evaluation prompt information in the second game picture frame; and displaying an evaluation result associated with the evaluation object on the evaluation game interface. By adopting the embodiment of the application, the attractiveness and interest of the performance evaluation process can be enhanced.

Description

Data processing method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a data processing method, apparatus, device, and storage medium.
Background
At present, when the terminal performance parameters of a user terminal are evaluated, the evaluation results are typically obtained by automatically running an evaluation program of a third-party evaluation client installed on the user terminal. The user of the terminal can hardly participate in the evaluation while the evaluation program runs automatically, which means that, for different users of the same user terminal, the results obtained by automatic execution are nearly identical. The final evaluation result is therefore monotonous, which reduces the attractiveness and interest of the performance evaluation process.
Disclosure of Invention
The embodiments of the application provide a data processing method, apparatus, device, and storage medium, which can enhance the attractiveness and interest of the performance evaluation process.
An aspect of an embodiment of the present application provides a data processing method, including:
displaying N evaluation controls on an evaluation main interface of an evaluation client; one evaluation control corresponds to one game sub-client; n is a positive integer;
responding to a first triggering operation of a target evaluation control in the N evaluation controls, displaying an evaluation game interface of a game sub-client corresponding to the target evaluation control, and displaying a first game picture frame on the evaluation game interface; the first game picture frame comprises first evaluation prompt information for indicating an evaluation object corresponding to the evaluation client to execute a second trigger operation;
responding to the second triggering operation, displaying a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from the first evaluation prompt information to a second evaluation prompt information in the second game picture frame;
and displaying the evaluation result associated with the evaluation object on the evaluation game interface.
An aspect of an embodiment of the present application provides a data processing method, including:
Displaying N evaluation controls on an evaluation main interface of an evaluation client; one evaluation control corresponds to one game sub-client; n is a positive integer;
responding to the evaluation operation aiming at the target evaluation control in the N evaluation controls, and outputting an evaluation result display interface associated with the target evaluation control in an evaluation client; the evaluation result display interface comprises evaluation scores determined by terminal performance parameters corresponding to target evaluation controls;
when N evaluation scores are obtained, determining an evaluation total score corresponding to the user terminal running with the evaluation client based on the N evaluation scores;
and acquiring a terminal list associated with the evaluation total score, displaying a performance display interface of the evaluation client, and displaying the terminal list on the performance display interface.
An aspect of an embodiment of the present application provides a data processing apparatus, including:
the main interface display module is used for displaying N evaluation controls on an evaluation main interface of the evaluation client; one evaluation control corresponds to one game sub-client; n is a positive integer;
the first picture frame display module is used for responding to a first trigger operation of a target evaluation control in the N evaluation controls, displaying an evaluation game interface of a game sub-client corresponding to the target evaluation control, and displaying a first game picture frame on the evaluation game interface; the first game picture frame comprises first evaluation prompt information for indicating an evaluation object corresponding to the evaluation client to execute a second trigger operation;
The second picture frame display module is used for responding to a second trigger operation, displaying a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from the first evaluation prompt information to a second evaluation prompt information in the second game picture frame;
and the evaluation result display module is used for displaying the evaluation result related to the evaluation object on the evaluation game interface.
The first game picture frame comprises an object to be processed with a fixed display position, and the service state of the object to be processed is a first state; the first evaluation prompt information comprises state prompt information and a first evaluation auxiliary parameter with a first initial value;
the second picture frame display module includes:
the prompt information hiding unit is used for hiding the state prompt information when the display time of the state prompt information reaches a state display time threshold value, and changing the service state of the object to be processed from a first state to a second state on the evaluating game interface;
the hit animation display unit is used for responding to a second trigger operation aiming at the evaluation game interface, displaying hit animations of the objects to be processed on the evaluation game interface, and determining second game picture frames corresponding to the first game picture frames based on the game picture frames corresponding to the hit animations;
The decremental processing unit is used for carrying out decremental processing on the first initial value in the second game picture frame, and taking the first evaluation auxiliary parameter corresponding to the decremented first initial value as the second evaluation prompt information.
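For readability, the following is a hedged sketch of how the timing this module describes could play out in the fixed-target mode; every name, the 5-second prompt threshold, and the blocking interface calls are assumptions, not taken from the embodiment.

```python
# Hypothetical round flow (illustrative only): the status prompt is hidden once its display
# duration reaches the threshold, the target's service state flips from the first to the
# second state, and each shot decrements the first evaluation auxiliary parameter (remaining
# bullets), whose decremented value becomes the second evaluation prompt information.
import time

def run_fixed_target_round(ui, prompt_threshold_s: float = 5.0, bullets: int = 4) -> None:
    ui.show_status_prompt("shoot after the target flips")
    time.sleep(prompt_threshold_s)            # prompt display duration reaches the threshold
    ui.hide_status_prompt()                   # hide the status prompt information
    ui.flip_target()                          # first state -> second state
    state_change_ts = time.monotonic()
    while bullets > 0:
        hit_ts = ui.wait_for_shot()           # blocks until the second trigger operation,
                                              # returning time.monotonic() at the hit
        ui.play_hit_animation()
        bullets -= 1                          # decrement the first initial value
        ui.update_prompt(f"remaining bullets: {bullets}")
        ui.show_result(f"hit time of this shot: {hit_ts - state_change_ts:.1f} s")
```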
Wherein the object to be processed is a shooting object; the game sub-clients corresponding to the target evaluation control comprise first game sub-clients; the first game sub-client is used for evaluating the object behavior attribute of the evaluation object;
the hit animation display unit includes:
a touch event capturing subunit, configured to respond to a second trigger operation for the evaluating game interface by using a touch chip of the user terminal running with the evaluating client, and capture a first touch event associated with the second trigger operation;
the picture frame generation subunit is used for sending the first touch event to a system driver of the user terminal and transmitting the first touch event to the first game sub-client through the system driver; the first game sub-client is used for processing the first touch event and generating a game picture frame corresponding to the hit animation of the object to be processed;
the first display subunit is used for determining a second game picture frame corresponding to the first game picture frame based on the game picture frame corresponding to the hit animation, and displaying the second game picture frame on the evaluating game interface.
The touch event capturing subunit is further configured to:
receiving a second trigger operation aiming at the evaluating game interface through a touch control chip of the user terminal running with the evaluating client, and scanning the screen level of the user terminal based on the second trigger operation;
recording operation parameters related to a second triggering operation when the touch chip scans that the screen level changes;
a first touch event associated with a second trigger operation is captured based on the operating parameter.
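A minimal sketch, purely illustrative, of the touch-event capture this subunit describes: the touch chip scans the screen level and, once a change is detected, records the operation parameters (position, touch force, touch travel) and packages them as the first touch event. The field names and the chip interface are assumptions.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float            # position information associated with the trigger operation
    y: float
    pressure: float     # touch force associated with the trigger operation
    travel: float       # touch travel associated with the trigger operation
    timestamp_ms: int

def capture_touch_event(touch_chip, clock) -> TouchEvent | None:
    """Return a touch event once the scanned screen level changes, else None.

    touch_chip and clock are hypothetical interfaces standing in for the hardware.
    """
    if not touch_chip.screen_level_changed():          # scan the screen level
        return None
    params = touch_chip.read_operation_parameters()    # record the operation parameters
    return TouchEvent(x=params["x"], y=params["y"],
                      pressure=params["pressure"], travel=params["travel"],
                      timestamp_ms=clock.now_ms())
```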
The first evaluation prompt information comprises a second evaluation auxiliary parameter with a second initial value; the evaluating game interface comprises a position operation control, a business operation control and an auxiliary aiming area at a first display position;
the second picture frame display module includes:
the aiming area display position switching unit is used for responding to the screen drawing operation aiming at the position operation control when X objects to be processed with random display positions are displayed on the evaluation game interface, and switching the display position of the auxiliary aiming area from a first display position to a second display position; the second display position is determined based on the screen-scribing operation; x is a positive integer;
the picture frame display unit is used for responding to the second triggering operation for the business operation control and displaying a second game picture frame corresponding to the first game picture frame on the evaluating game interface;
And the change processing unit is used for carrying out change processing on the second initial value in the second game picture frame, and taking a second evaluation auxiliary parameter corresponding to the changed second initial value as second evaluation prompt information.
The game sub-client corresponding to the target evaluation control comprises a second game sub-client; the second game sub-client is used for evaluating the object hit attribute of the evaluation object;
the picture frame display unit includes:
the touch event acquisition subunit is used for responding to a second triggering operation for the business operation control and acquiring a second touch event associated with the second triggering operation through a second game sub-client; the second touch event comprises an auxiliary aiming area with a second display position;
the display position matching subunit is used for respectively matching the second display position with the display position of each object to be processed in the X objects to be processed to obtain a matching result; the display duration of each object to be processed in the X objects to be processed is the same;
and the second display subunit is used for generating a second game picture frame corresponding to the first game picture frame based on the matching result, and displaying the second game picture frame on the evaluating game interface.
Wherein the second display subunit is further configured to:
if the matching result indicates that the second display position is matched with the display position of the target processing object in the X objects to be processed, determining a hit time stamp associated with a second trigger operation through the second game sub-client;
determining a display cut-off time stamp of the target processing object based on the display duration of the target processing object and the display start time stamp of the target processing object;
generating a first type game screen frame for representing successful hit when the display cut-off timestamp is greater than or equal to the hit timestamp;
and determining a second game picture frame corresponding to the first game picture frame based on the first type game picture frame, and displaying the second game picture frame in the evaluating game interface.
Wherein the second display subunit is further configured to:
if the matching result indicates that the second display position is not matched with the display position of each object to be processed in the X objects to be processed, generating a second type game picture frame for representing hit failure through the second game sub-client;
and determining a second game picture frame corresponding to the first game picture frame based on the second type game picture frame, and displaying the second game picture frame in the evaluating game interface.
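To make the hit decision in the random-target mode concrete, here is a hedged sketch covering both the position matching and the timestamp check described by the two configurations above; the square hit box and all names are illustrative assumptions.

```python
# Illustrative only: match the second display position of the auxiliary aiming area against
# each displayed object, then accept the hit only if the hit timestamp does not exceed the
# object's display cut-off timestamp (display start timestamp + display duration).
def judge_random_target_hit(aim_x: float, aim_y: float,
                            objects: list[dict], hit_ts_ms: int):
    for obj in objects:                       # the X objects with random display positions
        inside = (abs(aim_x - obj["x"]) <= obj["half_size"] and
                  abs(aim_y - obj["y"]) <= obj["half_size"])
        if not inside:
            continue
        cutoff_ts = obj["start_ts_ms"] + obj["display_duration_ms"]
        if cutoff_ts >= hit_ts_ms:            # target still on screen when the shot landed
            return "hit", obj                 # -> first-type game picture frame
    return "miss", None                       # -> second-type game picture frame
```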
Wherein the first game picture frame comprises an object to be processed having a first state; the second game picture frame comprises an object to be processed with a second state; the second state is a service state obtained after the state of the first state is changed;
the evaluation result display module comprises:
the state resetting unit is used for carrying out state resetting on the object to be processed with the second state and displaying the object to be processed after the state resetting on the evaluating game interface; the business state of the object to be processed after the state reset is a first state;
and the evaluation result display unit is used for displaying a target display area with a result display duration threshold on the evaluation game interface and displaying an evaluation result on the target display area when the evaluation result related to the evaluation object is acquired.
Wherein, this evaluation result display element includes:
a change timestamp obtaining subunit, configured to obtain a timestamp for performing a state change on the object to be processed in the first game frame, and use the obtained timestamp as a state change timestamp;
a hit time stamp recording subunit configured to record a generation time stamp of the second game screen frame, and use the generation time stamp as a hit time stamp associated with the second trigger operation;
The object interaction duration determining subunit is used for determining a time difference value between the hit time stamp and the state change time stamp, taking the time difference value as the object interaction duration of the evaluation object on the user terminal running the evaluation client, and determining the object behavior attribute associated with the evaluation object based on the object interaction duration;
and the evaluation result display subunit is used for determining a target display area with a result display duration threshold on the evaluation game interface and displaying an evaluation result comprising the object behavior attribute on the target display area.
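The object interaction duration described above reduces to a timestamp subtraction; a brief sketch for illustration (names are assumptions):

```python
def object_interaction_duration_ms(state_change_ts_ms: int, hit_ts_ms: int) -> int:
    """Time from the target's state change to the hit associated with the second trigger."""
    return hit_ts_ms - state_change_ts_ms

# e.g. the target flipped at 12_000 ms and the shot landed at 13_000 ms, giving an
# object interaction duration of 1000 ms ("hit time of this shot: 1 second" in fig. 2).
```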
Wherein the apparatus further comprises:
the updating value acquisition module is used for acquiring updating values corresponding to the evaluation auxiliary parameters in the second evaluation prompt information; the updated value is determined after the initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt message is changed;
the evaluation cutoff condition acquisition module is used for acquiring an evaluation cutoff condition associated with the target evaluation control; the evaluation cutoff condition comprises an evaluation threshold;
the result display interface display module is used for determining that the second evaluation prompt information meets the evaluation cut-off condition if the updated value is matched with the evaluation threshold value and displaying an evaluation result display interface of the game sub-client corresponding to the target evaluation control;
And the evaluation score display module is used for displaying the evaluation score determined by the terminal performance parameters corresponding to the target evaluation control on the evaluation result display interface.
The terminal performance parameters corresponding to the target evaluation control comprise object behavior attributes; the evaluation result display interface comprises a first evaluation result display interface for displaying the behavior attribute of the object; the object behavior attribute comprises an object interaction time length used for representing an evaluation object on a user terminal running with an evaluation client;
the evaluation score display module comprises:
the interaction time length acquisition unit is used for acquiring X object interaction time lengths associated with the evaluation object; x is an initial value corresponding to an evaluation auxiliary parameter in the first evaluation prompt message; x is a positive integer;
the first evaluation score determining unit is used for acquiring object interaction time length with a minimum value from the X object interaction time lengths, taking the acquired object interaction time length as the minimum interaction time length, and determining a first evaluation score corresponding to the evaluation object based on the minimum interaction time length;
the first evaluation score display unit is used for displaying a first evaluation score on the first evaluation result display interface, and the first evaluation score is used as an evaluation score determined by the terminal performance parameters corresponding to the target evaluation control.
Wherein the first evaluation score determining unit includes:
the minimum duration determination subunit is configured to obtain an object interaction duration with a minimum value from the X object interaction durations, and take the obtained object interaction duration as a minimum interaction duration;
the reference interaction time length obtaining subunit is used for obtaining a first evaluation reference score associated with the object behavior attribute and a reference interaction time length corresponding to the first evaluation reference score; the reference interaction time length is an average interaction time length obtained after the Y sample objects are evaluated for the same user terminal; y is a positive integer;
a policy acquisition subunit, configured to acquire a first score mapping policy associated with the object behavior attribute;
the evaluation score determining subunit is configured to obtain a first difference between the reference interaction duration and the minimum interaction duration, and determine a first evaluation score corresponding to the evaluation object based on the first difference, the first score mapping policy, and the first evaluation reference score.
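The embodiment only states that the first evaluation score is determined from the first difference, the first score mapping policy and the first evaluation reference score; the linear mapping below is one possible policy, written purely for illustration.

```python
# Assumed linear first score mapping policy (the actual policy is not specified).
def first_evaluation_score(min_interaction_ms: float,
                           reference_interaction_ms: float,
                           reference_score: float,
                           points_per_ms: float = 0.1) -> float:
    first_difference = reference_interaction_ms - min_interaction_ms  # > 0 if faster than the sample average
    return reference_score + first_difference * points_per_ms
```

Under this assumption, an evaluation object whose fastest reaction beats the average interaction duration of the Y sample objects scores above the first evaluation reference score, and one slower than it scores below.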
The terminal performance parameters corresponding to the target evaluation control comprise object hit attributes; the evaluation result display interface comprises a second evaluation result display interface for displaying object hit attributes; the object hit attribute comprises object hit times used for representing the evaluation object on a user terminal running with an evaluation client;
The evaluation score display module comprises:
the object hit number acquisition unit is used for acquiring the object hit number associated with the evaluation object from the evaluation game interface;
a reference hit number acquisition unit configured to acquire a second evaluation reference score associated with the object hit attribute and a reference hit number corresponding to the second evaluation reference score; the reference hit number is the average hit number obtained after the Y sample objects are evaluated for the same user terminal; y is a positive integer;
a score policy acquisition unit for acquiring a second score mapping policy associated with the object hit attribute;
the second evaluation score determining unit is used for obtaining a second difference value between the reference hit times and the object hit times and determining a second evaluation score corresponding to the evaluation object based on the second difference value, a second score mapping strategy and the second evaluation reference score;
the second evaluation score display unit is used for displaying a second evaluation score on a second evaluation result display interface, and the second evaluation score is used as an evaluation score determined by the terminal performance parameters corresponding to the target evaluation control.
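Analogously, a hedged sketch of a second score mapping policy for the object hit attribute; the sign convention and the points-per-hit factor are assumptions.

```python
# Assumed linear second score mapping policy over the hit-count difference.
def second_evaluation_score(object_hit_count: int,
                            reference_hit_count: float,
                            reference_score: float,
                            points_per_hit: float = 5.0) -> float:
    second_difference = object_hit_count - reference_hit_count  # relative to the Y-sample average
    return reference_score + second_difference * points_per_hit
```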
The evaluation result display interface comprises a result sharing control and evaluation data associated with an evaluation score;
The apparatus further comprises:
the sharing sub-interface display module is used for responding to a third triggering operation aiming at the result sharing control and displaying a sharing sub-interface independent of the evaluating result display interface; the sharing sub-interface comprises Z sharing selection controls; z is a positive integer; one sharing selection control corresponds to one sharing public platform; the sharing sub-interface is an interface overlapped on the evaluating result display interface, and the size of the sharing sub-interface is smaller than that of the evaluating result display interface;
the interface to be distributed display module is used for responding to a fourth triggering operation of a target sharing selection control in the Z sharing selection controls, displaying an interface to be distributed of the sharing public platform corresponding to the target sharing selection control, and displaying evaluation data on the interface to be distributed; the interface to be published comprises a publishing control;
and the evaluation data release module is used for responding to the fifth trigger operation for the release control so as to release the evaluation data on the sharing public platform corresponding to the target sharing control.
An aspect of an embodiment of the present application provides a data processing apparatus, including:
the evaluation control display module is used for displaying N evaluation controls on an evaluation main interface of the evaluation client; one evaluation control corresponds to one game sub-client; n is a positive integer;
The result display interface display module is used for responding to the evaluation operation of the target evaluation control in the N evaluation controls and outputting an evaluation result display interface associated with the target evaluation control in the evaluation client; the evaluation result display interface comprises evaluation scores determined by terminal performance parameters corresponding to target evaluation controls;
the evaluation total score determining module is used for determining evaluation total scores corresponding to the user terminals running with the evaluation clients based on the N evaluation scores when the N evaluation scores are obtained;
the performance display interface display module is used for acquiring a terminal list associated with the evaluation total score, displaying a performance display interface of the evaluation client, and displaying the terminal list on the performance display interface.
In one aspect, an embodiment of the present application provides a computer device, including: a processor and a memory;
the processor is connected to the memory, wherein the memory is configured to store a computer program, and when the computer program is executed by the processor, the computer device is caused to execute the method provided by the embodiment of the application.
In one aspect, the present application provides a computer readable storage medium storing a computer program adapted to be loaded and executed by a processor, so that a computer device having the processor performs the method provided by the embodiment of the present application.
In one aspect, embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method provided by the embodiment of the present application.
In the embodiment of the application, the evaluation main interface of the evaluation client run by the user terminal may include N evaluation controls for evaluating terminal performance parameters, and each evaluation control corresponds to one game sub-client. During the evaluation of the user terminal, the evaluation client is therefore not run automatically by the user terminal; instead, the evaluation user is required to interact with it. The evaluation user can flexibly select the game sub-client corresponding to any one of the N evaluation controls (i.e., the target evaluation control) and then perform trigger operations in that game sub-client, so that the user terminal determines an evaluation result with user characteristics from those trigger operations. This means that the evaluation results obtained by different users of the same user terminal may differ considerably, which enhances the attractiveness and interest of the performance evaluation process.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application;
FIG. 2 is a schematic diagram of interface switching for evaluating terminal performance parameters according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a data processing method according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a shooting scene according to an embodiment of the present application;
FIG. 5 is a diagram of an interface display of a target display area according to an embodiment of the present application;
FIG. 6 is an interface switching schematic diagram of a first evaluation result display interface according to an embodiment of the present application;
FIG. 7 is a schematic flow chart of a data processing method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of interface switching for evaluating object hit attributes according to an embodiment of the present application;
FIG. 9 is an interface switching schematic diagram of a second evaluation result display interface according to an embodiment of the present application;
FIG. 10 is an interface diagram of a performance display interface provided by an embodiment of the present application;
FIG. 11 is a schematic view of a scenario for sharing evaluation data according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a data processing apparatus according to an embodiment of the present application;
FIG. 13 is a schematic diagram of a data processing apparatus according to an embodiment of the present application;
fig. 14 is a schematic diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application. As shown in fig. 1, the network architecture may include a server 10W and a cluster of user terminals. The cluster of user terminals may comprise one or more user terminals, the number of which will not be limited here. As shown in fig. 1, the user terminals 100a, 100b, 100c, …, and 100n may be specifically included. As shown in fig. 1, the user terminals 100a, 100b, 100c, …, 100n may respectively perform network connection with the server 10W, so that each user terminal may perform data interaction with the server 10W through the network connection.
Wherein each user terminal in the user terminal cluster may include: smart terminals for evaluating terminal performance parameters, such as smart phones, tablet computers, notebook computers, desktop computers, smart speakers, smart watches, vehicle-mounted terminals, smart televisions, and the like. It should be appreciated that each user terminal in the user terminal cluster shown in fig. 1 may be provided with an application client (e.g. an evaluation client), which may interact with the server 10W shown in fig. 1, respectively, when the application client is running in each user terminal.
As shown in fig. 1, the server 10W in the embodiment of the present application may be a server corresponding to the application client. The server 10W may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud computing services.
For easy understanding, the embodiment of the present application may select one user terminal from the plurality of user terminals shown in fig. 1 as the target user terminal. For example, the embodiment of the present application may use the user terminal 100a shown in fig. 1 as a target user terminal, where an evaluation client for evaluating a terminal performance parameter may be integrated. At this time, the target user terminal may implement data interaction between the service data platform corresponding to the evaluation client and the server 10W.
The evaluation client can design N game sub-clients for evaluating the performance parameters of the terminal by using a game engine, wherein N is a positive integer. The terminal performance parameters herein may include object behavior attributes (e.g., object interaction duration and object interaction speed, etc.) and object hit attributes (e.g., object hit number, object travel distance, object remaining life, etc.). It should be understood that the game sub-clients herein may include shooting-type game clients, agile-type game clients, and cutting-type game clients. For example, the N game sub-clients may include a first game sub-client and a second game sub-client.
Wherein the first game sub-client herein may be used to evaluate object behavior attributes of an evaluation object (e.g., an evaluation user). It is understood that the object interaction time (also known as user reaction time) may be used to characterize the interaction time of the evaluation object on the user terminal running the evaluation client. The object interaction speed may characterize the interaction speed of the evaluation object on the user terminal.
Wherein the second game sub-client can be used for evaluating the object hit attribute of the evaluation object. It will be appreciated that the number of hits may be used to characterize the number of hits of the evaluation object on the user terminal running the evaluation client, e.g. the number of hits may refer to the number of hits the evaluation object hits the object to be processed (e.g. targets in a shooting game client, balloons in an agile game client, fruit to be cut in a cutting game client or spatial model to be cut) in the second game sub-client. The object driving distance may refer to a distance traveled by an object to be processed (e.g., a racing car in a racing game client) controlled by the evaluation object in the second game sub-client; the object remaining life may indicate a life value remaining to evaluate a pending object controlled by the object in the second game sub-client (e.g., a game character in the athletic-class game client).
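For orientation, the two kinds of terminal performance parameters described above can be pictured as simple records; the class and field names below are hypothetical and only gather the attributes the text lists.

```python
from dataclasses import dataclass

@dataclass
class ObjectBehaviorAttribute:        # evaluated by the first game sub-client
    interaction_duration_ms: int      # object interaction duration (user reaction time)
    interaction_speed: float          # object interaction speed

@dataclass
class ObjectHitAttribute:             # evaluated by the second game sub-client
    hit_count: int                    # object hit number (targets, balloons, fruit, ...)
    travel_distance_m: float          # object driving distance, e.g. in a racing sub-client
    remaining_life: int               # remaining life of the controlled object
```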
It may be appreciated that, when the user terminal running the evaluation client (for example, the user terminal 100a) obtains service evaluation data information associated with the evaluation client, it may perform service processing on that information to obtain the corresponding service evaluation response information. For example, when the service evaluation data information obtained by the user terminal 100a is evaluation data determined by the evaluation client, the user terminal 100a determines the evaluation score corresponding to that evaluation data. For another example, when the service evaluation data information obtained by the user terminal 100a is an evaluation total score determined from N evaluation scores, the user terminal 100a may request from the server 10W the evaluation total scores of other user terminals (i.e., the user terminals that have already participated in the evaluation, for example, the user terminal 100b and the user terminal 100c), and may then generate, based on those returned evaluation total scores and the evaluation total score of the user terminal 100a, a terminal list indicating the rank to which the user terminal 100a belongs. In the embodiment of the application, the evaluation score and the terminal list may be collectively referred to as the service evaluation response information obtained after service processing of the service evaluation data information.
Optionally, in order to reduce the consumption of computing resources of the user terminal 100a, when the user terminal 100a obtains the service evaluation data information, it may instead send that information to the server 10W shown in fig. 1, so that the server 10W performs the service processing on the received service evaluation data information, obtains the corresponding service evaluation response information, and returns the service evaluation response information to the user terminal 100a.
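For illustration only, a minimal sketch of how the N evaluation scores might be aggregated into an evaluation total score and how the terminal list could be ranked. The embodiment does not specify the aggregation rule or the ordering, so the plain sum, the descending ranking, and all names below are assumptions.

```python
# Hedged sketch: aggregation rule (plain sum) and descending ranking are assumptions.
def evaluation_total_score(scores: list[float]) -> float:
    """Combine the N per-control evaluation scores into one evaluation total score."""
    return sum(scores)

def build_terminal_list(own_terminal: str, own_total: float,
                        other_totals: dict[str, float]) -> list[tuple[int, str, float]]:
    """Rank the current terminal among terminals that have already participated.

    other_totals: evaluation total scores returned by the server for other terminals.
    Returns (rank, terminal, total) tuples, best total first.
    """
    entries = dict(other_totals)
    entries[own_terminal] = own_total
    ranked = sorted(entries.items(), key=lambda kv: kv[1], reverse=True)
    return [(rank, name, total) for rank, (name, total) in enumerate(ranked, start=1)]
```

For example, build_terminal_list("terminal 100a", evaluation_total_score([80, 72]), {"terminal 100b": 160, "terminal 100c": 140}) would place the user terminal 100a second in the list under these assumptions.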
In the embodiment of the application, the evaluation main interface of the evaluation client run by the user terminal may include N evaluation controls for evaluating terminal performance parameters, each corresponding to one game sub-client. During the evaluation of the user terminal, the evaluation client is not run automatically by the user terminal; the evaluation user is required to interact with it and can flexibly select the game sub-client corresponding to any one of the N evaluation controls (i.e., the target evaluation control), which enhances the attractiveness and interest of the performance evaluation process. In addition, after the evaluation user has evaluated the game sub-client corresponding to each of the N evaluation controls, the evaluation total score of the user terminal can be obtained, and a terminal list based on user participation can then be derived, yielding a personalized evaluation result with user characteristics and improving the accuracy of the performance evaluation.
For ease of understanding, further, please refer to fig. 2, fig. 2 is a schematic diagram of interface switching for evaluating a terminal performance parameter according to an embodiment of the present application. As shown in fig. 2, the user terminal (i.e., the target user terminal) used by the evaluation object (e.g., the user 20A) in the embodiment of the present application may be a user terminal running with an evaluation client, and the target user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100A.
It should be appreciated that, when the user 20A in the embodiment of the present application needs to evaluate the target user terminal that the user 20A uses, a start operation may be performed on the evaluation client, so that the target user terminal responds to the start operation and displays an evaluation main interface of the evaluation client (for example, the evaluation main interface 210J shown in fig. 2) on its terminal interface. The start operation in the embodiment of the present application may include non-contact operations such as voice and gesture, and may also include contact operations such as clicking and long-pressing, which will not be limited herein.
As shown in fig. 2, the display area 2Q on the evaluation main interface 210J may display an evaluation total score associated with the target user terminal. The evaluation main interface 210J may also display evaluation controls associated with multiple evaluation types, which may include, for example, a first evaluation type (e.g., a game experience test) and a second evaluation type (e.g., an operability test). The number of evaluation controls associated with the first evaluation type may be several, for example 4, which in the embodiment of the present application may specifically include an evaluation control 1a (for example, an "FPS" control, Frames Per Second, i.e., a frame rate test), an evaluation control 2a (for example, an "RPG" control, Role-Playing Game, i.e., a role-playing game test), an evaluation control 3a (for example, a "speed" control, i.e., a network speed test), and an evaluation control 4a (for example, an "MOBA" control, Multiplayer Online Battle Arena, i.e., a multiplayer online tactical competitive game test). The number of evaluation controls associated with the second evaluation type may be N, where N is a positive integer; the embodiment of the present application takes 2 as an example, which may specifically include an evaluation control 1b (e.g., a "fixed target" control) and an evaluation control 2b (e.g., a "random target" control). One evaluation control may correspond to one game sub-client.
It can be appreciated that the user 20A can flexibly select any one evaluation control (i.e., the target evaluation control) from the N evaluation controls displayed on the evaluation main interface 210J, so as to evaluate the terminal performance parameters of the target user terminal. For example, the user 20A may perform a trigger operation (i.e., a first trigger operation, e.g., a clicking operation) on the evaluation control 1b on the evaluation main interface 210J, so that the target user terminal may, in response to the first trigger operation, display an evaluation game interface (e.g., the evaluation game interface 220J1 shown in fig. 2) of the game sub-client corresponding to the evaluation control 1b (e.g., a first game sub-client for evaluating the object behavior attribute), and display a first game picture frame (e.g., the game picture frame 21Z shown in fig. 2) on the evaluation game interface 220J1. The game picture frame displayed on the evaluation game interface after the user terminal responds to the first trigger operation may be referred to as the first game picture frame.
The object 2D to be processed in the game picture frame 21Z may be an object to be shot (for example, a target to be shot). As shown in fig. 2, the game picture frame 21Z further includes first evaluation prompt information for instructing the user 20A to perform the second trigger operation. The first evaluation prompt information may include an evaluation auxiliary parameter 2F (e.g., an evaluation auxiliary parameter such as "remaining bullets: 4" shown in fig. 2) and status prompt information 2T (e.g., text prompt information such as "shoot after the target flips" shown in fig. 2) for prompting the evaluation object to perform the second trigger operation. The status prompt information 2T may be text prompt information or voice prompt information, which will not be limited herein.
Further, when the user 20A executes a second trigger operation on the evaluation game interface (e.g., the evaluation game interface 220J2) according to the status prompt information 2T (e.g., performs a clicking operation at any position on the evaluation game interface 220J2), the target user terminal may, in response to the second trigger operation, display a second game picture frame (e.g., the game picture frame 22Z shown in fig. 2) corresponding to the game picture frame 21Z on the evaluation game interface (e.g., the evaluation game interface 220J3). In the game picture frame 22Z, the evaluation prompt information is switched from the first evaluation prompt information (e.g., the evaluation auxiliary parameter "remaining bullets: 4") to the second evaluation prompt information (e.g., the evaluation auxiliary parameter "remaining bullets: 3"). The game picture frame generated by the target user terminal when responding to the second trigger operation may be referred to as the second game picture frame.
Further, the target user terminal may obtain the evaluation result associated with the user 20A and display it on the evaluation game interface. For example, the target user terminal may display the evaluation result associated with the user 20A directly on the evaluation game interface 220J3 where the game picture frame 22Z is located. Optionally, the target user terminal may also switch the game picture frame 22Z to the game picture frame 23Z (i.e., a third game picture frame) on the evaluation game interface 220J4 and display the evaluation result associated with the user 20A there. As shown in fig. 2, the target user terminal may display, on the evaluation game interface 220J4 where the game picture frame 23Z is located, an evaluation result (e.g., the evaluation result 2P shown in fig. 2) that includes an object behavior attribute (e.g., an object interaction duration). For example, the evaluation result 2P may be text information such as "hit time of this shot: 1 second".
Therefore, N evaluation controls for evaluating terminal performance parameters can be displayed on the evaluation main interface 210J of the evaluation client run by the target user terminal, and one evaluation control corresponds to one game sub-client. During the evaluation of the target user terminal by the user 20A, the evaluation client is not run automatically by the target user terminal, nor is an evaluation result of an offline test by a third-party institution obtained directly; instead, the user 20A is required to interact with the evaluation client. That is, the user 20A may flexibly select the game sub-client corresponding to any one of the N evaluation controls (i.e., the target evaluation control) and then operate based on the prompts of the evaluation client (e.g., the status prompt information 2T), so that the evaluation client determines an evaluation result with user characteristics according to the trigger operations of the user 20A, thereby enhancing the attractiveness and interest of the performance evaluation process.
For the specific implementation in which the user terminal running the evaluation client evaluates its terminal performance parameters in response to the interaction between the evaluation object and the evaluation client, reference may be made to the embodiments corresponding to fig. 3 to fig. 11 below.
Further, referring to fig. 3, fig. 3 is a flow chart of a data processing method according to an embodiment of the application. As shown in fig. 3, the method may be performed by a computer device running with an evaluation client, where the computer device may be a user terminal (e.g., the user terminal 100a shown in fig. 1 and described above) or a server (e.g., the server 10W shown in fig. 1 and described above), which is not limited herein. For easy understanding, the embodiment of the present application is described by taking the method performed by the user terminal running with the evaluation client as an example, and the method at least may include the following steps S101 to S104:
Step S101, displaying N evaluation controls on an evaluation main interface of the evaluation client.
The evaluation object in the embodiment of the application can be an evaluation user corresponding to the user terminal running with the evaluation client. Specifically, when the evaluation object needs to evaluate the user terminal used by the evaluation object, the evaluation object may perform a start operation (for example, a click operation) with respect to the evaluation client, so that the user terminal responds to the start operation, and N evaluation controls are displayed on an evaluation main interface of the evaluation client, where N is a positive integer. One evaluation control may correspond to a game sub-client for evaluating a terminal performance parameter of an evaluation object.
As shown in fig. 2, the evaluation main interface 210J of the evaluation client may display the terminal type (e.g., the mobile phone model) to which the user terminal (i.e., the target user terminal) used by the user 20A belongs. The display area 2Q on the evaluation main interface 210J may display the evaluation total score associated with the target user terminal. It will be appreciated that, if the user 20A is not participating in the evaluation for the first time, that is, the user 20A has previously evaluated the terminal performance parameters of the target user terminal in the evaluation client, the evaluation total score displayed in the display area 2Q may be the evaluation total score from the user 20A's most recent evaluation of the terminal performance parameters of the target user terminal.
Optionally, if the user 20A is a user participating in the evaluation for the first time, that is, the user 20A has not yet evaluated the terminal performance parameter of the target user terminal in the evaluation client, the total evaluation score displayed in the display area 2Q may be a total evaluation score distributed by the server corresponding to the evaluation client for the terminal type to which the target user terminal belongs. The total evaluation score distributed by the server refers to the average value of the latest total evaluation scores evaluated by all historical evaluation terminals belonging to the terminal type (namely, the user terminals which belong to the terminal type and are participated in evaluation).
As shown in fig. 2, the N evaluation controls (2 are examples) displayed on the evaluation main interface 210J may specifically include an evaluation control 1b and an evaluation control 2b. The game sub-client corresponding to the evaluation control 1b may be referred to as a first game sub-client, and the first game sub-client may be used for evaluating the object behavior attribute of the evaluation object. For example, the first game sub-client may be a shooting-type game client (e.g., a "fixed target shooting game"). The game sub-client corresponding to the evaluation control 2b may be referred to as a second game sub-client, and the second game sub-client may be used for evaluating the object hit attribute of the evaluation object. For example, the second game sub-client may be a shooting-type game client (e.g., a "random target shooting game").
Step S102, in response to a first trigger operation for a target evaluation control in the N evaluation controls, an evaluation game interface of a game sub-client corresponding to the target evaluation control is displayed, and a first game picture frame is displayed on the evaluation game interface.
Specifically, the evaluation object can flexibly select any one evaluation control from the N evaluation controls displayed on the evaluation main interface to execute the triggering operation, so as to evaluate a certain terminal performance parameter of the user terminal. The embodiment of the application can refer to an evaluation control selected by an evaluation object from N evaluation controls as a target evaluation control, and refers to a trigger operation executed for the target evaluation control as a first trigger operation. When the user terminal responds to the first triggering operation, an evaluation game interface of the game sub-client corresponding to the target evaluation control can be displayed, and a first game picture frame is displayed on the evaluation game interface. The embodiment of the application can refer to the trigger operation executed by the evaluation object on the evaluation game interface where the first game picture frame is located as the second trigger operation. The first game picture frame can comprise first evaluation prompt information for indicating an evaluation object corresponding to the evaluation client to execute the second trigger operation.
It should be understood that, if the game sub-client corresponding to the target evaluation control selected by the evaluation object is a first game sub-client, and the first game sub-client is used for evaluating the object behavior attribute of the evaluation object, a first game picture frame associated with the first game sub-client may be displayed on the evaluation game interface of the game sub-client corresponding to the target evaluation control. The first game frame may include an object to be processed having a fixed display position, and the service status of the object to be processed is a first status, and in addition, the first evaluation prompt information in the first game frame may include a first evaluation auxiliary parameter (for example, an evaluation auxiliary parameter such as "remaining bullet: 4" shown in fig. 2) having a first initial value and a status prompt information.
As shown in fig. 2, the user 20A may perform a first trigger operation on the evaluation control 1b among the N evaluation controls displayed on the evaluation main interface 210J. Since the game sub-client (i.e., the first game sub-client) corresponding to the evaluation control 1b may be a shooting game client (e.g., a "fixed target shooting game"), the user terminal used by the user 20A may display the evaluation game interface 220J1 shown in fig. 2 when responding to the first trigger operation. The game picture frame 21Z on the evaluation game interface 220J1 may include an object to be processed 2D (e.g., a target to be shot) with a fixed display position, and the service state of the object to be processed 2D is the first state (e.g., a "non-shooting state"). In addition, the first evaluation prompt information in the game picture frame 21Z may include the evaluation auxiliary parameter 2F (i.e., a first evaluation auxiliary parameter with a first initial value) and the status prompt information 2T.
Step S103, responding to the second triggering operation, displaying a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from the first evaluation prompt information to the second evaluation prompt information in the second game picture frame.
Specifically, when the display duration of the state prompt information reaches the state display duration threshold (for example, 5 seconds), the user terminal may perform hiding processing on the state prompt information, and change the service state of the object to be processed (for example, the shooting object) from the first state to the second state on the evaluating game interface. At this time, the evaluation object corresponding to the user terminal may execute a second trigger operation with respect to the evaluation game interface, and when the user terminal responds to the second trigger operation, a hit animation of the object to be processed may be displayed on the evaluation game interface, and a second game picture frame corresponding to the first game picture frame may be determined based on the game picture frame corresponding to the hit animation. The second game frame may refer to all game frames corresponding to the hit animation, or alternatively, may refer to a part of game frames in all game frames corresponding to the hit animation, for example, the last frame in all game frames or any multi-frame in all game frames (for example, a game frame composed of the first frame and the last frame), which will not be limited herein. Further, the user terminal may perform a decreasing process on the first initial value in the second game frame, and further use a first evaluation auxiliary parameter corresponding to the decreased first initial value as the second evaluation prompt information.
For ease of understanding, further, please refer to fig. 4, fig. 4 is a schematic flow chart of a shooting scene according to an embodiment of the present application. As shown in fig. 4, the user terminal in the embodiment of the present application may be a user terminal running with an evaluation client, where the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100a. The shooting scene corresponding to the game sub-client in the embodiment of the present application may be a fixed target shooting scene corresponding to the first game sub-client, or may be a random target shooting scene corresponding to the second game sub-client, which will not be limited herein.
It should be understood that, when the evaluation object performs the first trigger operation for the target evaluation control, the user terminal may perform step S401 to display a first game frame on the evaluation game interface of the evaluation client in response to the first trigger operation, and further, the user terminal needs to perform step S402 to display the object to be processed on the evaluation game interface.
Further, when the evaluation object performs a trigger operation (for example, a second trigger operation) on the evaluation game interface, the user terminal may perform step S403 to respond to the second trigger operation of the evaluation object, and may then perform step S404 to capture, through a touch chip of the user terminal, a touch event associated with the second trigger operation. It can be understood that the user terminal may respond to the second trigger operation on the evaluation game interface through the touch chip and thereby capture the touch event associated with that operation. For example, the user terminal may receive the second trigger operation through the touch chip and scan the screen level of the user terminal based on that operation. When the touch chip detects that the screen level changes, the user terminal may record the operation parameters associated with the second trigger operation and capture the touch event based on those parameters. The operation parameters here may include, among other things, the location information, the touch force, and the touch travel associated with the second trigger operation.
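As a hedged illustration of the operation parameters named above, the following Python sketch shows one way a touch-chip callback might record them when a change in the screen level is scanned; the field names and the callback signature are assumptions made for this example, not part of the described system.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    position: tuple      # location information associated with the second trigger operation
    touch_force: float   # touch force associated with the second trigger operation
    touch_travel: float  # touch travel associated with the second trigger operation
    timestamp: float     # moment at which the screen-level change was scanned

def on_screen_level_change(raw_sample: dict, now: float) -> TouchEvent:
    # Record the operation parameters and capture the touch event associated
    # with the second trigger operation based on those parameters.
    return TouchEvent(
        position=(raw_sample["x"], raw_sample["y"]),
        touch_force=raw_sample["force"],
        touch_travel=raw_sample["travel"],
        timestamp=now,
    )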
When the user terminal sends the touch event to the system driver of the user terminal through the touch chip, the user terminal may execute step S405, so that the system driver receives the touch event. At this time, the user terminal may execute step S406, so that the game sub-client corresponding to the target evaluation control receives the touch event transmitted through the system driver. Further, the user terminal may execute steps S407 to S412 to enable the game sub-client to process the touch event and generate a second game frame corresponding to the first game frame.
When the user terminal receives the touch event, step S408 may be executed to determine whether the object to be processed is hit, so as to obtain a hit result. If the hit result indicates that the user terminal successfully hits the object to be processed, the user terminal may execute step S409 to generate a first type game frame for indicating that the hit is successful, and may further execute step S410 to display the first type game frame on the evaluation game interface. Optionally, if the hit result indicates that the user terminal did not hit the object to be processed successfully, the user terminal may jump to step S411 to generate a second type game frame for indicating hit failure. Step S412 may be further performed to display the second type game frame on the evaluation game interface. The embodiment of the application can refer to the first type of game picture frame or the second type of game picture frame as the second game picture frame corresponding to the first game picture frame.
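A short Python sketch of the branch in steps S408 to S412 is given below; the hit test, the frame contents, and the display call are simplified assumptions used only to show how the first-type and second-type game picture frames could be selected.

def is_inside(point, bounds):
    # Simple rectangular hit test for the object to be processed.
    x, y = point
    left, top, right, bottom = bounds
    return left <= x <= right and top <= y <= bottom

def handle_touch_event(touch_position, target_bounds):
    if is_inside(touch_position, target_bounds):           # step S408: judge whether the object is hit
        frame = {"type": "first", "effect": "pit"}         # step S409: first-type frame (hit succeeded)
    else:
        frame = {"type": "second"}                         # step S411: second-type frame (hit failed)
    print("display on evaluation game interface:", frame)  # steps S410 / S412
    return frame                                           # treated as the second game picture frame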
For example, if the game sub-client corresponding to the target evaluation control is the first game sub-client, the user terminal may respond to the second trigger operation performed on the evaluation game interface in the first game sub-client, and may then obtain a touch event associated with the second trigger operation. In the embodiment of the present application, the touch event acquired through the first game sub-client may be referred to as a first touch event. Since the first game picture frame of the first game sub-client includes an object to be processed having a fixed display position, after the second trigger operation is performed in the first game sub-client, the second game picture frame displayed on the evaluation game interface is determined based on the game picture frames corresponding to the hit animation of the object to be processed.
As shown in fig. 2, the status prompt information 2T in the game picture frame 21Z has a status display duration threshold (e.g., 5 seconds) on the evaluation game interface, which means that the status prompt information 2T will disappear from the evaluation game interface 220J1 after being displayed for 5 seconds. After the user terminal hides the status prompt information 2T, the user terminal may change the service state (e.g., the shooting state) of the object to be processed 2D on the evaluation game interface, that is, change the shooting state of the object to be processed 2D from the "non-shooting state" to the "shooting-possible state" (i.e., the second state).
It should be appreciated that, when the shooting state of the object to be processed 2D is the second state, the user 20A may perform a second trigger operation on the evaluation game interface 220J2. It can be understood that the second trigger operation here may mean that the user 20A performs a trigger operation at any position of the evaluation game interface 220J2; optionally, if the evaluation game interface 220J2 includes a business operation control (e.g., a "shoot" control), the second trigger operation means that the user 20A performs a trigger operation on that business operation control. The trigger operation may include a touch operation such as a click or long press, or a non-touch operation such as voice or gesture, which is not limited herein.
It should be appreciated that the user terminal used by the user 20A may receive, through its touch chip, the second trigger operation for the evaluation game interface 220J2, and then scan the screen level of the user terminal based on the second trigger operation. When a change in the screen level is scanned, the user terminal may record the operation parameters associated with the second trigger operation, capture the first touch event associated with the second trigger operation based on those parameters, and send the first touch event to the system driver of the user terminal. Further, the user terminal may transmit the first touch event through the system driver to the game sub-client (e.g., the first game sub-client) corresponding to the target evaluation control. The first game sub-client may be configured to process the first touch event so as to generate the game picture frames corresponding to the hit animation of the object to be processed 2D. At this time, the user terminal may determine the game picture frame 22Z (i.e., the second game picture frame) corresponding to the game picture frame 21Z (i.e., the first game picture frame) based on the game picture frames corresponding to the hit animation, and display the game picture frame 22Z on the evaluation game interface 220J3. The game picture frame 22Z may include a target having a pit hitting effect.
Meanwhile, the user terminal may decrement the first initial value of the evaluation auxiliary parameter 2F in the game picture frame 22Z, that is, decrease the number of remaining bullets from 4 to 3, and then use the first evaluation auxiliary parameter corresponding to the decremented first initial value (for example, the evaluation auxiliary parameter "remaining bullets: 3") as the second evaluation prompt information.
And step S104, displaying the evaluation result associated with the evaluation object on the evaluation game interface.
Specifically, if the game sub-client corresponding to the target evaluation control is the first game sub-client, the user terminal may perform state reset on the object to be processed having the second state (for example, the "shooting state") in the second game frame, so as to obtain the object to be processed after the state reset, and further may display the object to be processed after the state reset on the evaluation game interface. The game picture frame where the object to be processed is located after the state is reset can be called a third game picture frame in the embodiment of the application. Wherein the business state of the object to be processed after the state reset is a first state (for example, a "non-shooting state"). When the evaluation result associated with the evaluation object is obtained, the user terminal can display a target display area with a result display duration threshold on the evaluation game interface, and display the evaluation result on the target display area.
It should be appreciated that the user terminal may also obtain an evaluation result associated with the evaluation object. For example, the first game picture frame of the first game sub-client may include an object to be processed having the first state, and the second game picture frame includes the object to be processed having the second state, where the second state may be the service state obtained after the first state is changed. The object interaction duration (denoted by D) in the embodiment of the present application may be calculated according to the following formula (1):
D = T1 - T2    (1)

where T1 is the hit timestamp associated with the second trigger operation, and T2 is the state change timestamp associated with the object to be processed.
Based on the above, when determining the evaluation result associated with the evaluation object, the user terminal needs to acquire a time stamp for performing state change on the object to be processed in the first game picture frame, and then the acquired time stamp can be used as a state change time stamp. At the same time, the user terminal also needs to record the generation timestamp of the second game picture frame, and uses the generation timestamp as the hit timestamp associated with the second trigger operation.
For example, if the second game picture frame consists of all the game picture frames corresponding to the hit animation, the generation timestamp of the second game picture frame may be a generation timestamp determined from those frames. Alternatively, if the second game picture frame is one particular frame among the game picture frames corresponding to the hit animation, the generation timestamp of the second game picture frame may be the generation timestamp of that frame, which is not limited herein.
Further, the user terminal may determine a time difference between the hit timestamp and the state change timestamp according to the above formula (1), take the time difference as an object interaction duration of the evaluation object on the user terminal running the evaluation client, and determine an object behavior attribute associated with the evaluation object based on the object interaction duration. As shown in fig. 2, the hit time stamp obtained by the user terminal may be 2021, 5, 12, 12:30:25, and the state change time stamp may be 2021, 5, 12, 12:30:24, so that the time difference determined by the user terminal is 1 second, and the time difference of 1 second is further taken as the object interaction duration of the user 20A on the user terminal. At this time, the user terminal may determine the object interaction time period as an object behavior attribute associated with the user 20A, and may further generate an evaluation result (for example, text information such as "this hit time: 1 second") for display on the evaluation game interface based on the object behavior attribute.
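The arithmetic of this example can be restated as a short Python sketch; the timestamps simply mirror the values given above and are illustrative.

from datetime import datetime

t1 = datetime(2021, 5, 12, 12, 30, 25)   # hit timestamp associated with the second trigger operation
t2 = datetime(2021, 5, 12, 12, 30, 24)   # state change timestamp of the object to be processed

d = (t1 - t2).total_seconds()            # formula (1): D = T1 - T2, here 1.0 second
evaluation_result = f"this hit time: {d:.0f} second"
print(evaluation_result)                 # displayed on the evaluation game interface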
It may be appreciated that the user terminal may determine a target display area for displaying the evaluation result including the object behavior attribute on the evaluation game interface, and may further display the evaluation result on the target display area. The target display area in the embodiment of the present application may be a display area with a threshold value of a result display duration (for example, 5 seconds), which means that when the display duration of the target display area reaches 5 seconds, the user terminal will hide the target display area, so that new evaluation may be continuously performed in the evaluation game interface. Optionally, the target display area in the embodiment of the present application may also be a display area specifically used for displaying the evaluation result on the evaluation game interface, for example, the target display area may be used for displaying all the evaluation results for evaluating the object behavior attribute on the evaluation game interface, and the target display area may also be used for displaying the current optimal evaluation result associated with the object behavior attribute on the evaluation game interface, which will not be limited herein.
For ease of understanding, further, please refer to fig. 5, fig. 5 is a diagram illustrating an interface display of a target display area according to an embodiment of the present application. As shown in fig. 5, the evaluation game interface of the game sub-client (for example, the first game sub-client) corresponding to the target evaluation control in the embodiment of the present application may be the evaluation game interface 520J shown in fig. 5. The display area 5Q (i.e., target display area) in the evaluation game interface 520J may be dedicated to displaying all the evaluation results of evaluating the object behavior attribute.
It should be understood that, in the evaluation game interface 520J shown in fig. 5, the evaluation object corresponding to the user terminal may evaluate the object behavior attribute multiple times based on the evaluation auxiliary parameter 5F displayed in the evaluation game interface 520J, so as to obtain a more accurate evaluation result. For example, when the remaining bullets indicated by the evaluation auxiliary parameter 5F are 4, the evaluation user may perform evaluation on the evaluation game interface 520J for 4 times, so as to obtain a corresponding evaluation result.
For example, if the evaluation object corresponding to the user terminal performs the first evaluation, when the evaluation object observes that the object 5D to be processed in the evaluation game interface 520J is in the second state, the evaluation object may perform the second triggering operation with respect to the evaluation game interface 520J. After the user terminal responds to the second trigger operation, the user terminal may display a second game screen frame associated with the current second trigger operation on the evaluation game interface 520J. Further, the user terminal may obtain an evaluation result associated with the evaluation object. Wherein, the evaluation result can be "the first hit time is 1 second". At this time, the user terminal may display the current evaluation result on the display area 5Q of the evaluation game interface 520J. Then, the user terminal may change the service status of the object to be processed 5D again in the evaluation game interface 520J, so that the evaluation object performs a second evaluation on the object behavior attribute, and so on.
It should be understood that after the evaluation game interface displays the evaluation result associated with the evaluation object, the user terminal may obtain, from the current evaluation game interface, an updated value corresponding to the evaluation auxiliary parameter in the second evaluation prompt message. The updated value may be determined after the initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt message is changed. Further, the user terminal may obtain an evaluation cutoff condition associated with the target evaluation control. Wherein the evaluation cutoff condition may include an evaluation threshold.
If the updated value does not match the evaluation threshold, the user terminal may determine that the second evaluation prompt information does not meet the evaluation cutoff condition, which means that the evaluation object corresponding to the user terminal still needs to continue to perform new evaluation on the evaluation game interface, and at this time, the user terminal may repeatedly execute the steps S103 to S104. For example, the user terminal may determine a third game frame (i.e., a game frame in which the object to be processed is located after the state is reset) displayed on the current evaluation game interface as a new first game frame, and determine the second evaluation prompt information in the third game frame as a new first evaluation prompt information. Further, when the evaluation object executes a new second trigger operation for the evaluation game interface, the user terminal can respond to the new second trigger operation, display a new second game picture frame corresponding to the new first game picture frame on the evaluation game interface, and switch the evaluation prompt information from the new first evaluation prompt information to the new second evaluation prompt information in the new second game picture frame, so that a new evaluation result associated with the evaluation object is displayed on the evaluation game interface until the new second evaluation prompt information meets the evaluation cut-off condition.
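A minimal Python sketch of this cutoff check, under the assumption that each round simply decrements the auxiliary value by one, is shown below; the function names are illustrative rather than part of the described method.

FIRST_EVALUATION_THRESHOLD = 0            # evaluation threshold in the evaluation cutoff condition

def run_evaluation_round(current_value):
    # Placeholder for steps S103 to S104: respond to a new second trigger operation,
    # display a new second game picture frame and a new evaluation result, then
    # decrement the evaluation auxiliary parameter (e.g. the remaining bullets).
    return current_value - 1

def evaluate_until_cutoff(initial_value=4):
    updated_value = initial_value
    while updated_value != FIRST_EVALUATION_THRESHOLD:     # cutoff condition not yet met
        updated_value = run_evaluation_round(updated_value)
    return updated_value                                   # cutoff met: show the evaluation result display interface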
As shown in fig. 5, the updated value (i.e., the first updated value, for example, 3) corresponding to the evaluation auxiliary parameter 5F in the evaluation game interface 520J may be obtained by performing a decrementing process on the initial value (i.e., the first initial value, for example, 4) corresponding to the evaluation auxiliary parameter 5F in the first evaluation prompt message after the user terminal responds to the second trigger operation. Further, the user terminal may obtain an evaluation cutoff condition associated with the target evaluation control, e.g., the evaluation cutoff condition may be a first evaluation cutoff condition associated with the object behavior attribute, and the first evaluation cutoff condition may include a first evaluation threshold (e.g., 0). This means that when the update value corresponding to the evaluation auxiliary parameter 5F in the evaluation game interface 520J is 0, the end of the evaluation can be indicated.
Because the first updated value corresponding to the evaluation auxiliary parameter 5F in the evaluation game interface 520J is 3, that is, the first updated value is not matched with the first evaluation threshold, the user terminal may determine that the second evaluation prompt information does not meet the evaluation cutoff condition. When the user terminal changes the service state of the object to be processed in the evaluation game interface 520J from the first state to the second state again, at this time, the user corresponding to the user terminal needs to perform new evaluation on the evaluation game interface 520J to obtain a new evaluation result, and further displays the new evaluation result in the display area 5Q shown in fig. 5, until the update value corresponding to the evaluation auxiliary parameter 5F in the evaluation game interface 520J is 0, so that the evaluation on the object behavior attribute can be ended.
Optionally, if the updated value is matched with the evaluation threshold, the user terminal may determine that the second evaluation prompt information meets the evaluation cutoff condition, and may further display an evaluation result display interface of the game sub-client corresponding to the target evaluation control. Further, the user terminal can display the evaluation score determined by the terminal performance parameter corresponding to the target evaluation control on the evaluation result display interface.
For the terminal performance parameters of the user terminal, the terminal performance parameters may include a first evaluation type (for example, a game experience test) and a second evaluation type (for example, an operation performance test). According to the degree to which each evaluation type influences the terminal performance parameters, the embodiment of the application may assign a first weight parameter (for example, 80%) to the first evaluation type and a second weight parameter (for example, 20%) to the second evaluation type. Further, based on the total reference score (for example, 10000 points) corresponding to the terminal performance parameters, the embodiment of the application may determine, according to the respective weight parameters, the reference score corresponding to the first evaluation type (for example, 8000 points) and the reference score corresponding to the second evaluation type (for example, 2000 points). If the second evaluation type in the terminal performance parameters includes two attributes, namely the object behavior attribute and the object hit attribute, the embodiment of the application may take the 2 associated evaluation controls as an example, so that the first evaluation benchmark score associated with the object behavior attribute may be determined as 1000 points and the second evaluation benchmark score associated with the object hit attribute as 1000 points.
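The allocation described above can be restated as a short arithmetic sketch; the weights, total score, and attribute names are the example values from this paragraph, and the dictionary keys are illustrative.

TOTAL_REFERENCE_SCORE = 10000

weights = {"game_experience_test": 0.80, "operation_performance_test": 0.20}
type_reference_scores = {name: TOTAL_REFERENCE_SCORE * w for name, w in weights.items()}
# {"game_experience_test": 8000.0, "operation_performance_test": 2000.0}

operation_attributes = ["object_behavior_attribute", "object_hit_attribute"]
per_attribute_benchmark = type_reference_scores["operation_performance_test"] / len(operation_attributes)
# 1000.0 points each for the first and second evaluation benchmark scores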
It may be appreciated that the terminal performance parameters corresponding to the target evaluation control may include an object behavior attribute, where the evaluation result display interface includes a first evaluation result display interface for displaying the object behavior attribute, and the object behavior attribute may include an object interaction duration for characterizing an evaluation object on a user terminal running an evaluation client. Based on the above, the user terminal may obtain X object interaction durations associated with the evaluation object, where X may be an initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt information, and X is a positive integer. Further, the user terminal can acquire the object interaction duration with the minimum value from the X object interaction durations, and further can determine a first evaluation score corresponding to the evaluation object based on the minimum interaction duration by taking the acquired object interaction duration as the minimum interaction duration. At this time, the user terminal may display a first evaluation score on the first evaluation result display interface, and use the first evaluation score as an evaluation score determined by the terminal performance parameter corresponding to the target evaluation control.
It should be understood that, when the user terminal determines the first evaluation score corresponding to the evaluation object, the user terminal may obtain the object interaction duration with the minimum value from the X object interaction durations, and may further use the obtained object interaction duration as the minimum interaction duration. Further, the user terminal may obtain a first evaluation benchmark score associated with the object behavior attribute and a benchmark interaction duration corresponding to the first evaluation benchmark score. The reference interaction duration is an average interaction duration obtained after the evaluation of Y (e.g. 20) sample objects for the same user terminal, where Y is a positive integer. In other words, in the embodiment of the present application, 20 users may be selected to perform performance test on the same user terminal, and the object interaction durations (i.e., user reaction durations) of the 20 users may be averaged, so that the averaged object interaction duration is used as a reference interaction duration (for example, 1 second), and further a first evaluation reference score (for example, 1000 minutes) may be assigned to the reference interaction duration.
Further, the user terminal may obtain a first score mapping policy associated with the object behavior attribute. The first score mapping strategy can be used for indicating the user terminal to determine a first evaluation score associated with the object behavior attribute according to the first evaluation reference score and the reference interaction duration. For example, the user terminal may obtain a difference value (i.e., a first difference value) between the reference interaction duration and the minimum interaction duration, and further determine a first evaluation score corresponding to the evaluation object based on the first difference value, the first score mapping policy, and the first evaluation reference score.
For example, the first score mapping policy indicates: if the minimum interaction time length of the evaluation object is reduced by 10 milliseconds (namely 0.01 second) on the basis of the reference interaction time length, the first evaluation score corresponding to the evaluation object is increased by 100 minutes on the basis of the first evaluation reference score. For example, when the minimum interaction time of the evaluation object is 0.98 seconds, the user terminal may increase by 200 minutes on the basis of the first evaluation benchmark score (for example, 1000 minutes), and then determine the increased first evaluation benchmark score as the first evaluation score (for example, 1200 minutes) corresponding to the evaluation object.
For another example, the first score mapping policy indicates: if the minimum interaction time length of the evaluation object is increased by 10 milliseconds (namely 0.01 second) on the basis of the reference interaction time length, the first evaluation score corresponding to the evaluation object is reduced by 100 minutes on the basis of the first evaluation reference score. For example, when the minimum interaction time of the evaluation object is 1.01 seconds, the user terminal may decrease by 100 minutes on the basis of the first evaluation benchmark score (for example, 1000 minutes), and then determine the decreased first evaluation benchmark score as the first evaluation score (for example, 900 minutes) corresponding to the evaluation object.
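The two examples above can be combined into one illustrative Python function; the benchmark values and step sizes are the ones quoted in this description, and the function itself is only a sketch of the first score mapping policy, not a definitive implementation.

FIRST_EVALUATION_BENCHMARK_SCORE = 1000   # points
BENCHMARK_INTERACTION_DURATION = 1.00     # seconds, averaged over Y sample objects
STEP_SECONDS = 0.01                       # 10 milliseconds
POINTS_PER_STEP = 100

def first_evaluation_score(minimum_interaction_duration):
    # First difference between the benchmark interaction duration and the minimum
    # interaction duration, mapped to points at 100 points per 10 ms.
    first_difference = BENCHMARK_INTERACTION_DURATION - minimum_interaction_duration
    return FIRST_EVALUATION_BENCHMARK_SCORE + round(first_difference / STEP_SECONDS) * POINTS_PER_STEP

print(first_evaluation_score(0.98))   # 1200, as in the first example
print(first_evaluation_score(1.01))   # 900, as in the second example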
For ease of understanding, further, please refer to fig. 6, fig. 6 is an interface switching schematic diagram of a first evaluation result display interface according to an embodiment of the present application. As shown in fig. 6, the evaluation game interface of the game sub-client (e.g., the first game sub-client) corresponding to the target evaluation control in the embodiment of the present application may be an evaluation game interface 620J.
It should be appreciated that, when determining the evaluation score of the terminal performance parameter (e.g., the object behavior attribute) corresponding to the target evaluation control, the user terminal running the evaluation client may obtain all evaluation results matching the first initial value (e.g., 4) corresponding to the evaluation auxiliary parameter in the first evaluation prompt information. The 4 evaluation results may specifically include an evaluation result 1, an evaluation result 2, an evaluation result 3, and an evaluation result 4. The evaluation result 1 may indicate the object interaction duration 1 (e.g., 1 second) determined after the evaluation object performs the first evaluation on the user terminal running the evaluation client; the evaluation result 2 may indicate the object interaction duration 2 (e.g., 1.01 seconds) determined after the evaluation object performs the second evaluation on the user terminal; the evaluation result 3 may indicate the object interaction duration 3 (e.g., 0.98 seconds) determined after the evaluation object performs the third evaluation on the user terminal; and the evaluation result 4 may indicate the object interaction duration 4 (e.g., 1.02 seconds) determined after the evaluation object performs the fourth evaluation on the user terminal. Each of the 4 evaluation results may be displayed in the display area 6Q1 of the evaluation game interface 620J shown in fig. 6.
It is understood that the evaluation cutoff condition associated with the target evaluation control obtained by the user terminal may be a first evaluation cutoff condition associated with the object behavior attribute, and the first evaluation cutoff condition may include a first evaluation threshold (e.g., 0). Because the first updated value corresponding to the evaluation auxiliary parameter 6F in the second evaluation prompt information displayed on the evaluation game interface 620J is 0, the user terminal may determine that the first updated value matches the first evaluation threshold, and the user terminal may determine that the second evaluation prompt information meets the evaluation cutoff condition. At this time, the user terminal may switch the terminal interface of the user terminal from the evaluation game interface 620J to the evaluation result display interface of the first game sub-client (i.e., the first evaluation result display interface, for example, the evaluation result display interface 630J shown in fig. 6).
Further, the user terminal may obtain an evaluation score (e.g., a first evaluation score) determined by the object behavior attribute. Wherein, it can be understood that the user terminal can acquire 4 object interaction durations associated with the evaluation object. Specifically, object interaction duration 1 (e.g., 1 second), object interaction duration 2 (e.g., 1.01 seconds), object interaction duration 3 (e.g., 0.98 seconds), and object interaction duration 4 (e.g., 1.02 seconds) may be included. At this time, the user terminal may obtain the object interaction duration with the minimum value from the 4 object interaction durations, and further may determine the first evaluation score corresponding to the evaluation object based on the minimum interaction duration by using the obtained object interaction duration as the minimum interaction duration (for example, object interaction duration 3).
Wherein the user terminal may obtain a first benchmark score (e.g., 1000 points) associated with the object behavior attribute and a benchmark interaction time period (e.g., 1 second) corresponding to the first benchmark score. At the same time, the user terminal may also obtain a first score mapping policy associated with the object behavior attribute. Further, the user terminal may obtain a first difference (e.g., 0.02 seconds) between the reference interaction duration and the minimum interaction duration, as indicated by the first score mapping policy: if the minimum interaction time length of the evaluation object is reduced by 10 milliseconds (namely 0.01 second) on the basis of the reference interaction time length, the first evaluation score corresponding to the evaluation object is increased by 100 minutes on the basis of the first evaluation reference score. Based on this, the user terminal may determine a first evaluation score (e.g., 1200 points) corresponding to the evaluation object based on the first difference, the first score mapping policy, and the first evaluation benchmark score.
At this time, the user terminal may display, in the display area 6Q2 of the evaluation result display interface 630J shown in fig. 6, the first evaluation score determined by the object behavior attribute, and use the first evaluation score as the evaluation score determined by the terminal performance parameter corresponding to the target evaluation control. For example, the user terminal may generate text information corresponding to the evaluation result based on the first evaluation score, and may then display the text information on the evaluation result display interface 630J shown in fig. 6, for example the text information 60S shown in fig. 6: "I achieved 1200 points in the fixed target test, defeating 98.5% of the players". The evaluation result display interface 630J may further include the control 61K (e.g., a page close control) and the control 62K (e.g., a result sharing control) shown in fig. 6. The control 61K may be used to instruct the evaluation object to return to the evaluation main interface of the evaluation client, so that the evaluation object can continue to select another evaluation control and thereby evaluate other terminal performance parameters of the user terminal. The control 62K may be used to instruct the evaluation object to share the evaluation data in the evaluation result display interface 630J to other public sharing platforms.
In the embodiment of the application, the evaluation main interface of the evaluation client run by the user terminal may include N evaluation controls for evaluating the terminal performance parameters, and one evaluation control corresponds to one game sub-client. Therefore, in the process of evaluating the user terminal, the evaluation user is required to interact with the evaluation client instead of the user terminal operating the evaluation client automatically. That is, the evaluation user may flexibly select, from the N evaluation controls, the game sub-client corresponding to any one evaluation control (i.e., the target evaluation control), and may then perform trigger operations in the game sub-client corresponding to the target evaluation control, so that the user terminal responds to those trigger operations and determines an evaluation result with user characteristics. This means that evaluation results obtained by different users using the same user terminal may also differ considerably, thereby enhancing the attractiveness and interest of the performance evaluation process.
Further, referring to fig. 7, fig. 7 is a flow chart of a data processing method according to an embodiment of the application. As shown in fig. 7, the method may be performed by a computer device running with an evaluation client, where the computer device may be a user terminal (e.g., the user terminal 100a shown in fig. 1 and described above) or a server (e.g., the server 10W shown in fig. 1 and described above), which is not limited herein. For easy understanding, the embodiment of the present application is described by taking the method performed by the user terminal running with the evaluation client as an example, and the method at least may include the following steps S201 to S204:
Step S201, N evaluation controls are displayed on an evaluation main interface of the evaluation client.
Specifically, when the evaluation object needs to evaluate the user terminal used by the evaluation object, the evaluation object may perform a start operation (e.g., a click operation) with respect to the evaluation client, so that the user terminal responds to the start operation, thereby displaying N evaluation controls on an evaluation main interface (e.g., the evaluation main interface 210J shown in fig. 2) of the evaluation client, where N is a positive integer. One evaluation control may correspond to a game sub-client for evaluating a terminal performance parameter of an evaluation object.
Step S202, in response to the evaluation operation of the target evaluation control in the N evaluation controls, outputting an evaluation result display interface associated with the target evaluation control in the evaluation client.
Specifically, the evaluation object may flexibly select any one evaluation control (i.e., a target evaluation control) from N evaluation controls displayed on the evaluation main interface to execute the first triggering operation, so as to evaluate a certain terminal performance parameter of the user terminal. When the user terminal responds to the first triggering operation, an evaluation game interface of the game sub-client corresponding to the target evaluation control can be displayed, and a first game picture frame is displayed on the evaluation game interface. The first game picture frame can comprise first evaluation prompt information for indicating an evaluation object corresponding to the evaluation client to execute a second trigger operation. Further, the user terminal can respond to the second triggering operation, display a second game picture frame corresponding to the first game picture frame on the evaluation game interface, switch the evaluation prompt information from the first evaluation prompt information to the second evaluation prompt information in the second game picture frame, and display an evaluation result associated with the evaluation object on the evaluation game interface. It should be understood that the user terminal may also obtain an updated value corresponding to the evaluation auxiliary parameter in the second evaluation prompt information; the updated value may be determined after the initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt message is changed. Further, the user terminal may obtain an evaluation cutoff condition associated with the target evaluation control, where the evaluation cutoff condition includes an evaluation threshold. It can be understood that if the updated value does not match the evaluation threshold, the user terminal may determine that the second evaluation prompt information does not meet the evaluation cutoff condition, which means that the evaluation object corresponding to the user terminal still needs to continue to perform new evaluation on the evaluation game interface. Optionally, if the updated value matches the evaluation threshold, the user terminal may determine that the second evaluation prompt information meets the evaluation cutoff condition, and then may display an evaluation result display interface of the game sub-client corresponding to the target evaluation control, and display an evaluation score determined by the terminal performance parameter corresponding to the target evaluation control on the evaluation result display interface.
It should be understood that, if the game sub-client corresponding to the target evaluation control selected by the evaluation object is the first game sub-client, and the first game sub-client is used for evaluating the object behavior attribute of the evaluation object, the user terminal may respond to the evaluation operation for the target evaluation control in the N evaluation controls, and a specific implementation manner of outputting, in the evaluation client, an evaluation result display interface (for example, the first evaluation result display interface) associated with the target evaluation control may be referred to the description of step S101 to step S104 in the embodiment corresponding to fig. 3, which will not be repeated herein.
Optionally, if the game sub-client corresponding to the target evaluation control selected by the evaluation object is a second game sub-client, and the second game sub-client is used for evaluating the object hit attribute of the evaluation object, when responding to the first trigger operation for the target evaluation control, the user terminal may display an evaluation game interface of the second game sub-client, and display a first game picture frame on the evaluation game interface of the second game sub-client. It will be appreciated that the first evaluation cue information in the first game frame may include a second evaluation assistance parameter (e.g., an evaluation assistance parameter of "10 seconds remaining") having a second initial value. The evaluating game interface of the second game sub-client can comprise a position operation control, a business operation control and an auxiliary aiming area at the first display position.
It can be understood that when the terminal interface of the user terminal is switched from the evaluation main interface to the evaluation game interface, the user terminal temporarily does not display the object to be processed in the evaluation game interface in the initial state. It should be appreciated that the object to be processed associated with the second game sub-client may be randomly displayed in the evaluation game interface. For example, the user terminal may randomly display all objects to be processed (e.g., shooting class objects) at one time in the evaluation game interface. Optionally, the user terminal may also display a plurality of objects to be processed with random display positions in the evaluation game interface at the same time, and the batch of objects to be processed has a display duration threshold, that is, when the display duration of the batch of objects to be processed reaches the display duration threshold, the user terminal may perform hiding processing on the batch of objects to be processed, and further may display a plurality of objects to be processed with random display positions again at the same time. Optionally, the user terminal may further display a plurality of objects to be processed on the evaluation game interface, and when the evaluation object successfully hits a certain object to be processed, a new object to be processed may be updated randomly. The display manner of the plurality of objects to be processed having the random display positions in the evaluation game interface will not be limited here.
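As one possible illustration of the random display manners described above (and only as an assumption about how such a manner might be realized), the following Python sketch spawns a batch of randomly positioned objects to be processed and replaces it once its display duration threshold is reached; the screen size, batch size, and threshold are assumed values.

import random

SCREEN_WIDTH, SCREEN_HEIGHT = 1080, 1920
BATCH_SIZE = 2                        # number of objects to be processed shown at the same time
DISPLAY_DURATION_THRESHOLD = 2.0      # seconds a batch stays visible (assumed value)

def spawn_batch():
    # Display several objects to be processed at random positions on the
    # evaluation game interface of the second game sub-client.
    return [(random.randint(0, SCREEN_WIDTH), random.randint(0, SCREEN_HEIGHT))
            for _ in range(BATCH_SIZE)]

def refresh_targets(elapsed_since_spawn, current_batch):
    # Hide the current batch once its display duration reaches the threshold,
    # then display a new batch of randomly positioned objects.
    if elapsed_since_spawn >= DISPLAY_DURATION_THRESHOLD:
        return spawn_batch()
    return current_batch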
It should be understood that, when X objects to be processed having random display positions are displayed on the evaluation game interface, the evaluation object corresponding to the user terminal may perform a swipe operation on the position operation control in the evaluation game interface, where X is a positive integer. When the user terminal responds to the swipe operation, the user terminal may switch the display position of the auxiliary aiming area from the first display position to the second display position, where the second display position is determined based on the swipe operation. Further, the evaluation object may perform a second trigger operation on a business operation control (e.g., a "shoot" control) in the evaluation game interface, so that the user terminal displays a second game picture frame corresponding to the first game picture frame on the evaluation game interface when responding to the second trigger operation. Further, the user terminal may change the second initial value (for example, "10 seconds") in the second game picture frame, and may then use the second evaluation auxiliary parameter corresponding to the changed second initial value as the second evaluation prompt information.
It can be understood that, when the user terminal responds to the second triggering operation for the service operation control, the touch event associated with the second triggering operation may be acquired through the second game sub-client, and specifically, refer to the description of step S401 to step S406 in the embodiment corresponding to fig. 4. In this embodiment of the present application, the touch event acquired through the second game sub-client may be referred to as a second touch event, where the second touch event may include an auxiliary aiming area having a second display position. At this time, the user terminal may process the second touch event through the second game sub-client, that is, the second display position is respectively matched with the display position of each object to be processed in the X objects to be processed, so as to obtain a matching result. The display duration of each of the X objects to be processed herein may be the same, that is, the X objects to be processed are displayed simultaneously and disappear simultaneously. Further, the user terminal may generate a second game picture frame corresponding to the first game picture frame based on the matching result, and display the second game picture frame on the evaluation game interface.
It should be appreciated that if the matching result indicates that the second display position matches the display position of the target processing object of the X objects to be processed, the user terminal may determine, through the second game sub-client, a hit timestamp associated with the second trigger operation. Further, the user terminal may determine a display cut-off time stamp of the target processing object based on the display duration of the target processing object and the display start time stamp of the target processing object, and may further determine a hit result of the target processing object based on the display cut-off time stamp and the hit time stamp. When the display cut-off time stamp is greater than or equal to the hit time stamp, that is, the hit result indicates that the user terminal successfully hits the target processing object, at this time, the user terminal may execute steps S409-S410 corresponding to fig. 4, generate, by using the second game sub-client, a first type of game picture frame for indicating that the hit is successful, and further determine, based on the first type of game picture frame, a second game picture frame corresponding to the first game picture frame, and display the second game picture frame in the evaluation game interface. When the display cut-off time stamp is smaller than the hit time stamp, that is, the hit result indicates that the user terminal does not hit the target processing object successfully, the user terminal may execute steps S411 to S412 corresponding to fig. 4, generate, by the second game sub-client, a second type game picture frame for indicating that the hit fails, and further determine, based on the second type game picture frame, a second game picture frame corresponding to the first game picture frame, and display the second game picture frame in the evaluation game interface.
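A hedged Python sketch of this matching step is given below: the second display position of the auxiliary aiming area is compared with the display position of each of the X objects to be processed, and a successful hit additionally requires that the hit timestamp does not exceed the object's display cutoff timestamp. The dictionary fields and the hit radius are assumptions for this example.

def match_hit(aim_position, objects_to_process, hit_timestamp, hit_radius=50):
    ax, ay = aim_position                              # second display position of the auxiliary aiming area
    for obj in objects_to_process:                     # obj: {"pos": (x, y), "start": t, "duration": d}
        ox, oy = obj["pos"]
        if (ax - ox) ** 2 + (ay - oy) ** 2 <= hit_radius ** 2:
            display_cutoff = obj["start"] + obj["duration"]     # display cutoff timestamp
            return display_cutoff >= hit_timestamp              # True -> first-type frame, False -> second-type frame
    return False                                       # no display position matches -> second-type frame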
Optionally, if the matching result indicates that the second display position is not matched with the display position of each object to be processed in the X objects to be processed, the user terminal may execute steps S411 to S412 corresponding to fig. 4, generate, by the second game sub-client, a second type game frame for indicating a hit failure, and further determine, based on the second type game frame, a second game frame corresponding to the first game frame, and display the second game frame in the evaluation game interface.
Meanwhile, in the second game picture frame, the user terminal may perform a change process on the second initial value (for example, 10 seconds), that is, change the evaluation duration of the remaining time from 10 to 8, and further may use a second evaluation auxiliary parameter corresponding to the second initial value after the change process as a second evaluation prompt message. Further, the user terminal may display the evaluation result associated with the evaluation object on the evaluation game interface.
For ease of understanding, further, please refer to fig. 8, fig. 8 is a schematic diagram of interface switching for evaluating object hit attributes according to an embodiment of the present application. As shown in fig. 8, a user terminal used by an evaluation object (e.g., user 80A) in an embodiment of the present application may operate with an evaluation client, where the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, user terminal 100A.
It should be appreciated that the evaluation game interface 820J1 shown in fig. 8 may be the evaluation game interface displayed on the terminal interface of the user terminal after the user 80A performs the first trigger operation on the target evaluation control in the evaluation main interface (e.g., the evaluation control 2b in the evaluation main interface 210J shown in fig. 2). The second game sub-client corresponding to the target evaluation control is configured to evaluate an object hit attribute (e.g., the number of hits) of the user 80A.
A first game picture frame associated with the second game sub-client may be displayed on the evaluation game interface 820J1. The first evaluation prompt information in the first game picture frame may include the evaluation auxiliary parameter 8F shown in fig. 8 (i.e., a second evaluation auxiliary parameter having a second initial value); for example, the evaluation auxiliary parameter 8F may be an evaluation auxiliary parameter such as "remaining time: 10 seconds". The evaluation auxiliary parameter 8F may be updated automatically as the duration of the evaluation performed by the user 80A on the evaluation game interface changes. In addition, the evaluation game interface 820J1 may also include a control 81K (e.g., a position operation control), a control 82K (e.g., a business operation control), and an auxiliary aiming area 8Q (e.g., an auxiliary aiming area in the first display position). In the initial state, the user terminal does not display any object to be processed on the evaluation game interface 820J1.
When the user terminal displays X objects to be processed (for example, X=2) having random display positions on the evaluation game interface of the second game sub-client, the terminal interface of the user terminal may change from the evaluation game interface 820J1 to the evaluation game interface 820J2. The 2 objects to be processed may be the object to be processed 8D1 and the object to be processed 8D2 displayed on the evaluation game interface 820J2. At this time, the user 80A corresponding to the user terminal may perform a swipe operation on the control 81K in the evaluation game interface 820J2. When the user terminal responds to the swipe operation, it may switch the display position of the auxiliary aiming area 8Q in the evaluation game interface 820J2 from the first display position to the second display position, where the second display position is determined based on the swipe operation. At this time, the terminal interface of the user terminal may change from the evaluation game interface 820J2 to the evaluation game interface 820J3 shown in fig. 8, and the second display position may be the display position of the auxiliary aiming area 8Q in the evaluation game interface 820J3.
Further, the user 80A may perform the second trigger operation on the control 82K in the evaluation game interface 820J3, so that, when responding to the second trigger operation, the user terminal displays a second game picture frame corresponding to the first game picture frame on the evaluation game interface 820J3. Meanwhile, in the second game picture frame, the user terminal may change the second initial value (for example, 10 seconds), that is, change the remaining evaluation duration from 10 to 8, and may then use the evaluation auxiliary parameter 8F corresponding to the changed second initial value as the second evaluation prompt information. Further, the user terminal may display the evaluation result associated with the user 80A on the evaluation game interface. The user terminal may directly display the evaluation result associated with the user 80A on the evaluation game interface 820J3 where the second game picture frame is located. Optionally, when the second game picture frame changes to a third game picture frame, the user terminal may also display the evaluation result associated with the user 80A (for example, the evaluation result 8P shown in fig. 8) on the evaluation game interface 820J4 where the third game picture frame is located.
It can be understood that, if the second game picture frame displayed by the user terminal after responding to the second trigger operation on the control 82K belongs to the first type of game picture frame (i.e., a game picture frame used to indicate a successful hit), the evaluation result 8P displayed by the user terminal on the evaluation game interface 820J4 may be text information such as "hit: 1". Optionally, if the second game picture frame displayed by the user terminal after responding to the second trigger operation on the control 82K belongs to the second type of game picture frame (i.e., a game picture frame used to indicate a hit failure), the evaluation result 8P displayed by the user terminal on the evaluation game interface 820J4 may be text information such as "hit: 0". The evaluation result 8P here may be used to indicate the total number of times the user 80A successfully hits objects to be processed within a fixed time (e.g., 10 seconds).
It should be understood that after the evaluation game interface displays the evaluation result associated with the evaluation object, the user terminal may obtain the updated value corresponding to the evaluation auxiliary parameter in the second evaluation prompt message. The updated value may be determined after the initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt message is changed. Further, the user terminal may obtain an evaluation cutoff condition associated with the target evaluation control. Wherein the evaluation cutoff condition may include an evaluation threshold.
If the updated value does not match the evaluation threshold, the user terminal may determine that the second evaluation prompt information does not meet the evaluation cutoff condition, which means that the evaluation object corresponding to the user terminal still needs to continue to perform new evaluation on the evaluation game interface, and at this time, the user terminal may repeatedly execute the steps S103 to S104.
As shown in fig. 8, the updated value (i.e., the second updated value, for example, 7) corresponding to the evaluation auxiliary parameter 8F in the evaluation game interface 820J4 may be obtained by changing the initial value (i.e., the second initial value, for example, 10) corresponding to the evaluation auxiliary parameter in the first evaluation prompt information after the user terminal responds to the second trigger operation. Further, the user terminal may obtain the evaluation cutoff condition associated with the target evaluation control; for example, the evaluation cutoff condition may be a second evaluation cutoff condition associated with the object hit attribute, and the second evaluation cutoff condition may include a second evaluation threshold (e.g., 0). This means that when the updated value corresponding to the evaluation auxiliary parameter 8F in the evaluation game interface 820J4 is 0, the end of the evaluation can be indicated.
Because the second updated value corresponding to the evaluation auxiliary parameter 8F in the evaluation game interface 820J4 is 7, that is, the second updated value does not match the second evaluation threshold, the user terminal may determine that the second evaluation prompt information does not meet the evaluation cutoff condition. At this time, the user terminal may determine, based on the display duration of the 2 objects to be processed in the evaluation game interface 820J4, whether a new batch of objects to be processed needs to be randomly refreshed, and may then perform a new evaluation on the objects to be processed displayed on the evaluation game interface 820J4 to obtain a new evaluation result. The new evaluation result may be displayed in the display area to which the evaluation result 8P shown in fig. 8 belongs, until the updated value corresponding to the evaluation auxiliary parameter 8F in the evaluation game interface 820J4 is 0, at which point the evaluation of the object hit attribute can be ended.
Optionally, if the updated value is matched with the evaluation threshold, the user terminal may determine that the second evaluation prompt information meets the evaluation cutoff condition, and may further display an evaluation result display interface of the game sub-client corresponding to the target evaluation control. Further, the user terminal can display the evaluation score determined by the terminal performance parameter corresponding to the target evaluation control on the evaluation result display interface.
It may be appreciated that the terminal performance parameter corresponding to the target evaluation control may include an object hit attribute, and the evaluation result display interface may include a second evaluation result display interface for displaying the object hit attribute. The object hit attribute may include an object hit number used for characterizing the evaluation object on the user terminal running the evaluation client. Based on this, the user terminal may obtain the object hit number associated with the evaluation object from the evaluation game interface of the second game sub-client. Further, the user terminal may obtain a second evaluation benchmark score (e.g., 1000 points) associated with the object hit attribute and a benchmark hit number (e.g., 5) corresponding to the second evaluation benchmark score. The benchmark hit number here is the average hit number obtained after Y (e.g., 20) sample objects are evaluated on the same user terminal, where Y is a positive integer.
Further, the user terminal may obtain a second score mapping policy associated with the object hit attribute. The second score mapping policy may be used to instruct the user terminal to determine a second evaluation score associated with the object hit attribute according to the second evaluation benchmark score and the benchmark hit number. For example, the user terminal may obtain the difference between the benchmark hit number and the object hit number (i.e., the second difference), and may further determine the second evaluation score corresponding to the evaluation object based on the second difference, the second score mapping policy, and the second evaluation benchmark score. Further, the user terminal can display the second evaluation score on the second evaluation result display interface, and take the second evaluation score as the evaluation score determined by the terminal performance parameter corresponding to the target evaluation control.
For example, the second score mapping policy indicates: if the object hit number of the evaluation object is increased once on the basis of the benchmark hit number, the second evaluation score corresponding to the evaluation object is increased by 100 points on the basis of the second evaluation benchmark score. For example, when the object hit number of the evaluation object is 6, the user terminal may increase the second evaluation benchmark score (for example, 1000 points) by 100 points, and then determine the increased score as the second evaluation score (for example, 1100 points) corresponding to the evaluation object.
For another example, the second score mapping policy also indicates: if the object hit number of the evaluation object is reduced once on the basis of the benchmark hit number, the second evaluation score corresponding to the evaluation object is reduced by 100 points on the basis of the second evaluation benchmark score. For example, when the object hit number of the evaluation object is 3, the user terminal may decrease the second evaluation benchmark score (for example, 1000 points) by 200 points, and then determine the decreased score as the second evaluation score (for example, 800 points) corresponding to the evaluation object.
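A hedged sketch of the second score mapping policy described in the preceding two paragraphs is given below; the benchmark values (1000 points, 5 hits) and the 100-point step are taken from the examples above, while the function and parameter names are assumptions.

```python
# Illustrative sketch; the default values follow the examples in the text.
def second_evaluation_score(object_hit_number: int,
                            second_evaluation_benchmark_score: int = 1000,
                            benchmark_hit_number: int = 5,
                            points_per_hit: int = 100) -> int:
    """Second score mapping policy: add (or subtract) points_per_hit for every hit
    above (or below) the benchmark hit number."""
    second_difference = object_hit_number - benchmark_hit_number
    return second_evaluation_benchmark_score + second_difference * points_per_hit

assert second_evaluation_score(6) == 1100   # one hit above the benchmark
assert second_evaluation_score(3) == 800    # two hits below the benchmark
```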
For ease of understanding, further, please refer to fig. 9, fig. 9 is an interface switching schematic diagram of a second evaluation result display interface according to an embodiment of the present application. As shown in fig. 9, in the embodiment of the present application, the evaluation game interface of the game sub-client (for example, the second game sub-client) corresponding to the target evaluation control may be an evaluation game interface 920J.
It should be appreciated that, in the process of determining the evaluation score of the terminal performance parameter (for example, the object hit attribute) corresponding to the target evaluation control, the user terminal running the evaluation client may obtain an evaluation cutoff condition associated with the target evaluation control. The evaluation cutoff condition may be a second evaluation cutoff condition associated with the object hit attribute, and the second evaluation cutoff condition may include a second evaluation threshold (e.g., 0). Because the second updated value corresponding to the evaluation auxiliary parameter 9F displayed on the evaluation game interface 920J is 0, the user terminal may determine that the second updated value matches the second evaluation threshold, and may then determine that the second evaluation prompt information meets the second evaluation cutoff condition. At this time, the user terminal may switch the terminal interface of the user terminal from the evaluation game interface 920J to the evaluation result display interface of the second game sub-client (i.e., the second evaluation result display interface, for example, the evaluation result display interface 930J shown in fig. 9).
Further, the user terminal may obtain the evaluation score (e.g., the second evaluation score) determined by the object hit attribute. It will be appreciated that the user terminal may obtain the object hit number (e.g., 6) associated with the evaluation object from the display area 9Q1 in the evaluation game interface 920J, and may determine the second evaluation score corresponding to the evaluation object based on the object hit number.
The user terminal may obtain the second evaluation benchmark score (e.g., 1000 points) associated with the object hit attribute and the benchmark hit number (e.g., 5) corresponding to the second evaluation benchmark score. At the same time, the user terminal may also obtain the second score mapping policy associated with the object hit attribute. Further, the user terminal may obtain a second difference (e.g., 1) between the benchmark hit number and the object hit number. Since the second score mapping policy indicates that, if the object hit number of the evaluation object is increased once on the basis of the benchmark hit number, the second evaluation score corresponding to the evaluation object is increased by 100 points on the basis of the second evaluation benchmark score, the user terminal may determine the second evaluation score (e.g., 1100 points) corresponding to the evaluation object based on the second difference, the second score mapping policy, and the second evaluation benchmark score.
At this time, the user terminal may display, in the display area 9Q2 of the evaluation result display interface 930J shown in fig. 9, the second evaluation score determined by the object hit attribute, and take the second evaluation score as the evaluation score determined by the terminal performance parameter corresponding to the target evaluation control. For example, the user terminal may generate text information corresponding to the evaluation result based on the second evaluation score, and may further display the text information on the evaluation result display interface 930J shown in fig. 9. For example, the text information 90S shown in fig. 9 may be: "I achieved 1100 points in the random target test, defeating 98% of the players". The evaluation result display interface 930J may further include a control 91K (e.g., a page close control) and a control 92K (e.g., a result sharing control) shown in fig. 9. The control 91K may be used to indicate that the evaluation object returns to the evaluation main interface of the evaluation client. The control 92K may be used to instruct the evaluation object to share the evaluation data in the evaluation result display interface 930J to other sharing public platforms.
Step S203, when N evaluation scores are obtained, the total evaluation score corresponding to the user terminal running with the evaluation client is determined based on the N evaluation scores.
Specifically, when the user terminal obtains N evaluation scores, the user terminal may determine the N evaluation scores as evaluation scores associated with a second evaluation type (for example, an operation performance test), and may further perform summation processing on the N evaluation scores, and determine a score obtained after the summation processing as a second-type evaluation total score corresponding to the second evaluation type. Meanwhile, the user terminal may further obtain an evaluation score associated with a first evaluation type (e.g., a game experience test), and further determine the obtained evaluation score as a first-type evaluation total score corresponding to the first evaluation type. At this time, the user terminal may determine the total evaluation score corresponding to the user terminal based on the total evaluation score of the first type and the total evaluation score of the second type.
For example, if the total score of the first type of evaluation obtained by the user terminal is 8900, and the total score of the second type of evaluation determined based on the N evaluation scores is 2300 (e.g., determined by the 1200 points in the display area 6Q2 shown in fig. 6 and the 1100 points in the display area 9Q2 shown in fig. 9), the total evaluation score corresponding to the user terminal may be the score (e.g., 11200) obtained by summing the total score of the first type of evaluation and the total score of the second type of evaluation.
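The summation in step S203 can be illustrated as follows; the numeric values are the examples from the preceding paragraph and the variable names are assumptions.

```python
# Illustrative values taken from the example above.
second_type_scores = [1200, 1100]                # per-control scores of the operation performance test
second_type_total = sum(second_type_scores)      # 2300

first_type_total = 8900                          # first evaluation type (game experience test)

evaluation_total_score = first_type_total + second_type_total
print(evaluation_total_score)                    # 11200
```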
Step S204, a terminal list associated with the evaluation total is obtained, a performance display interface of the evaluation client is displayed, and the terminal list is displayed on the performance display interface.
It should be understood that if the user terminal (e.g., the user terminal 1) has the function of generating a terminal list, the user terminal 1 may send a service acquisition request (e.g., an evaluated terminal score acquisition request) to the server (e.g., the server 10W shown in fig. 1) corresponding to the evaluation client. The evaluated terminal score acquisition request may be used to request the server 10W to return the total evaluation scores corresponding to other evaluated user terminals. It may be appreciated that the evaluated terminal score acquisition request may include the terminal identifier of the user terminal 1 and the total score of the current evaluation corresponding to the user terminal 1. The terminal identifier here may include the device identifier (Unique Device Identifier, abbreviated as device ID) of the user terminal 1 and the internet protocol address (Internet Protocol Address, abbreviated as IP address) of the user terminal 1.
The server corresponding to the evaluation client may be configured to store the evaluation score sent by each user terminal. The server can acquire the evaluation screening condition when receiving an evaluated terminal score acquisition request. The evaluation screening condition here may be used to indicate an evaluation number threshold (e.g., 2) for each user terminal during an evaluation period (e.g., one day), which means that the number of evaluations of each user terminal does not exceed 2 times per day. Further, the server can perform authentication processing on the user terminal 1 based on the terminal identifier of the user terminal 1 and the evaluation screening condition to obtain an authentication result, so that malicious score brushing by an evaluation user on a certain user terminal can be effectively avoided.
For example, the server may search for the number of evaluations performed by the user terminal 1 in the current evaluation period based on the terminal identifier of the user terminal 1. If the server finds that the number of evaluations of the user terminal 1 is greater than or equal to the evaluation number threshold in the evaluation screening condition, the server may determine that the user terminal 1 does not meet the evaluation screening condition, and may further determine that the user terminal 1 is an illegally evaluated user terminal, thereby obtaining an authentication result indicating authentication failure. Based on this, the server may not store the evaluation score sent by the user terminal 1 this time; at the same time, the server may generate, based on the authentication result, an abnormal evaluation prompt message for returning to the user terminal 1. The abnormal evaluation prompt message may be "You have exceeded the number of evaluations defined by the current evaluation period, and the current evaluation is invalid." When the user terminal 1 receives the abnormal evaluation prompt information returned by the server, the abnormal evaluation prompt information can be displayed on the performance display interface.
Optionally, if the server finds that the number of evaluations of the user terminal 1 is smaller than the evaluation number threshold in the evaluation screening condition, the server may determine that the user terminal 1 meets the evaluation screening condition, and may further determine that the user terminal 1 is a legally evaluated user terminal, thereby obtaining an authentication result indicating that the authentication is successful. At this time, the server may store the evaluation score sent by the user terminal 1 this time; at the same time, the server may query, based on the authentication result, the total evaluation scores corresponding to other evaluated user terminals, and may further return the found total evaluation scores to the user terminal 1. At this time, the user terminal 1 may generate a terminal list associated with the evaluation total based on the received total evaluation scores corresponding to the other evaluated user terminals and the total evaluation score corresponding to the current evaluation of the user terminal 1. Further, the user terminal 1 may display a performance display interface of the evaluation client on the terminal interface of the user terminal 1, and display the terminal list on the performance display interface.
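A minimal, non-authoritative sketch of the server-side screening described above is given below. The per-day threshold of 2 and the prompt text follow the example in the preceding paragraphs; the storage structures and function names are illustrative assumptions.

```python
from collections import defaultdict

EVALUATION_NUMBER_THRESHOLD = 2            # evaluations allowed per terminal in one evaluation period

evaluations_in_period = defaultdict(int)   # terminal identifier -> evaluations this period (assumed storage)
stored_total_scores = {}                   # terminal identifier -> latest total evaluation score (assumed storage)

def handle_evaluated_terminal_score_request(terminal_id: str, total_score: int) -> dict:
    """Authenticate the requesting terminal; store its score and return the other
    terminals' scores on success, or an abnormal evaluation prompt on failure."""
    if evaluations_in_period[terminal_id] >= EVALUATION_NUMBER_THRESHOLD:
        return {"authenticated": False,
                "prompt": "You have exceeded the number of evaluations defined by the "
                          "current evaluation period, and the current evaluation is invalid."}
    evaluations_in_period[terminal_id] += 1
    stored_total_scores[terminal_id] = total_score
    other_scores = {tid: s for tid, s in stored_total_scores.items() if tid != terminal_id}
    return {"authenticated": True, "other_total_scores": other_scores}
```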
It should be understood that if the user terminal (e.g., the user terminal 2) does not have the function of generating the terminal list, the user terminal 2 may send a service acquisition request (e.g., a terminal list acquisition request) to a server (e.g., the server 10W shown in fig. 1) corresponding to the evaluation client. The terminal list obtaining request may be for requesting the server 10W to return a terminal list associated with the current evaluation. It may be appreciated that the terminal list obtaining request may include the terminal identifier of the user terminal 2 and the total score of the current evaluation corresponding to the user terminal 2. The terminal identification here may include, among other things, the device identification of the user terminal 2 and the internet protocol address of the user terminal 2.
The server corresponding to the evaluation client may be configured to store the evaluation score sent by each user terminal. The server can acquire the evaluation screening condition when receiving the terminal list acquisition request. The evaluation screening condition here may be used to indicate an evaluation number threshold (e.g., 2) for each user terminal during an evaluation period (e.g., one day), which means that the number of evaluations of each user terminal does not exceed 2 times per day. Further, the server may perform authentication processing on the user terminal 2 based on the terminal identifier of the user terminal 2 and the evaluation screening condition, to obtain an authentication result.
When the authentication result indicates authentication failure, the server may generate an abnormal evaluation prompt message for returning to the user terminal 2. The abnormal evaluation prompt message may be "You have exceeded the number of evaluations defined by the current evaluation period, and the current evaluation is invalid." When the user terminal 2 receives the abnormal evaluation prompt information returned by the server, the abnormal evaluation prompt information can be displayed on the performance display interface. Optionally, when the authentication result indicates that the authentication is successful, the server may generate a terminal list associated with the evaluation total based on the total evaluation scores corresponding to the other evaluated user terminals and the total evaluation score corresponding to the current evaluation of the user terminal 2, and may then directly return the terminal list to the user terminal 2, so that the user terminal 2 displays the performance display interface of the evaluation client on its terminal interface and displays the terminal list on the performance display interface.
For ease of understanding, further, please refer to fig. 10, fig. 10 is an interface schematic diagram of a performance display interface according to an embodiment of the present application. As shown in fig. 10, the performance display interface in the embodiment of the present application may include a performance display interface 1040J1 and a performance display interface 1040J2. The performance display interface 1040J1 may be used to display a first terminal list associated with each terminal type, and the performance display interface 1040J2 may be used to display a second terminal list associated with each user terminal under the terminal type to which the user terminal belongs.
Here, the terminal identifier of the user terminal used by the evaluation object (for example, the user 100A) corresponding to the evaluation client may be terminal identifier 1, the total evaluation score obtained by the current evaluation of the user 100A may be 11000 points, and the terminal type to which the user terminal belongs may be terminal type 1.
It should be appreciated that the performance display interface 1040J1 may be directly switched to from the evaluation result display interface of the game sub-client corresponding to the Nth evaluation control (for example, the evaluation result display interface 630J shown in the above fig. 6 or the evaluation result display interface 930J shown in the above fig. 9) after the evaluation object (for example, the user 100A) corresponding to the evaluation client completes the evaluation of the game sub-client corresponding to the Nth evaluation control. Optionally, the performance display interface 1040J1 may also be directly switched to from the evaluation main interface (e.g., the evaluation main interface 210J shown in fig. 2) after the user 100A performs a trigger operation on the display area (e.g., the display area 2Q shown in fig. 2) used for displaying the total evaluation score in the evaluation main interface of the evaluation client. This will not be limited here.
The first terminal list displayed on the performance display interface 1040J1 may be arranged according to the average total evaluation score corresponding to each terminal type, and the first terminal list may include the ranking, the terminal type and the average total evaluation score. The average total evaluation score corresponding to each terminal type is determined according to the average value of the total evaluation scores of the evaluated terminals under the respective terminal type. It should be appreciated that, to enable the user 100A to quickly learn the ranking of the terminal type to which the user terminal used by the user belongs, the user terminal may highlight, in the display area 10Q1 of the performance display interface 1040J1, the ranking of the terminal type to which the user terminal used by the user belongs (e.g., ranked 1st, with the average total evaluation score of terminal type 1 being 12900 points). The performance display interface 1040J1 may include a control 1010K (e.g., a page return control) and a control 1020K (e.g., a result sharing control).
When the user 100A performs a trigger operation (e.g., a click operation) on the display area 10Q1, the user terminal may, in response to the trigger operation, switch the terminal interface of the user terminal from the performance display interface 1040J1 to the performance display interface 1040J2, and may further display the second terminal list on the performance display interface 1040J2. The second terminal list may be arranged according to the total evaluation scores corresponding to the evaluated terminals under terminal type 1, and the second terminal list may include the ranking, the terminal identifier and the total evaluation score. It should be appreciated that, to enable the user 100A to quickly learn the ranking of the user's own user terminal in the corresponding terminal type (e.g., terminal type 1), the user terminal may highlight, in the display area 10Q2 of the performance display interface 1040J2, the ranking of the user terminal used by the user (e.g., ranked 99th, with the total evaluation score corresponding to the user terminal being 11000 points). The performance display interface 1040J2 may include a control 1030K (e.g., a result sharing control).
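The two terminal lists described above can be sketched as follows, assuming a flat table of (terminal identifier, terminal type, total evaluation score) records; the ranking rules follow the description above, and the records and variable names are illustrative.

```python
from statistics import mean

# Illustrative evaluated-terminal records: (terminal identifier, terminal type, total score).
records = [("terminal identifier 1", "terminal type 1", 11000),
           ("terminal identifier 7", "terminal type 1", 14800),
           ("terminal identifier 9", "terminal type 2", 9800)]

# First terminal list: terminal types ranked by average total evaluation score.
scores_by_type = {}
for _, terminal_type, score in records:
    scores_by_type.setdefault(terminal_type, []).append(score)
first_terminal_list = sorted(((t, mean(s)) for t, s in scores_by_type.items()),
                             key=lambda item: item[1], reverse=True)

# Second terminal list: terminals of one terminal type ranked by their own total score.
second_terminal_list = sorted((r for r in records if r[1] == "terminal type 1"),
                              key=lambda r: r[2], reverse=True)
```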
Further, in the process of evaluating the user terminal running with the evaluation client, the evaluation object is required to participate in the evaluation so as to obtain an evaluation result with more user characteristics. Therefore, the embodiment of the application can obviously enhance the sense of participation and interest in the evaluation process, make the evaluation result better reflect the game experience of the evaluation object, and further improve the social sharing desire of the evaluation object.
The to-be-shared interface of the game sub-client corresponding to the target evaluation control can comprise a result sharing control and evaluation data associated with the evaluation score, and the to-be-shared interface can comprise an evaluation result display interface and a performance display interface. When the evaluation object executes a trigger operation (i.e., a third trigger operation) for the result sharing control in the interface to be shared, the user terminal may respond to the third trigger operation, and a sharing sub-interface independent of the interface to be shared is displayed on a terminal interface of the user terminal, where the sharing sub-interface may include Z sharing selection controls, and Z is a positive integer. The sharing sub-interface can be an interface overlapped on the interface to be shared, and the size of the sharing sub-interface is smaller than that of the interface to be shared.
Further, the evaluation object can execute a fourth trigger operation on a target sharing selection control in the Z sharing selection controls, so that the interface to be published of the sharing public platform corresponding to the target sharing selection control can be displayed, and the evaluation data on the interface to be shared is displayed on the interface to be published; the interface to be published may include a publishing control. It should be understood that, when the evaluation object executes a fifth trigger operation on the publishing control, the user terminal may respond to the fifth trigger operation, so that the sharing public platform corresponding to the target sharing selection control publishes the evaluation data.
Further, referring to fig. 11, fig. 11 is a schematic view of a scenario for sharing evaluation data according to an embodiment of the present application. As shown in fig. 11, the interface to be shared of the evaluation client in the embodiment of the present application may include an evaluation result display interface and a performance display interface. The evaluation result display interface may be the evaluation result display interface 630J shown in fig. 6 or the evaluation result display interface 930J shown in fig. 9, which is not limited herein. The performance display interface may be the performance display interface 1040J1 shown in fig. 10 or the performance display interface 1040J2 shown in fig. 10. For convenience of explanation, the interface to be shared in the embodiment of the present application is described by taking the evaluation result display interface (for example, the evaluation result display interface 1140J shown in fig. 11) as an example.
As shown in fig. 11, the evaluation result display interface 1140J may include a result sharing control (e.g., the result sharing control 111K) and evaluation data (e.g., the evaluation data 11S) associated with the evaluation score. When the evaluation object (e.g., the user 11A) performs a trigger operation (i.e., a third trigger operation) on the result sharing control 111K in the evaluation result display interface 1140J, the user terminal may, in response to the third trigger operation, display a sharing sub-interface (e.g., the sharing sub-interface 1150J shown in fig. 11) independent of the evaluation result display interface 1140J on the terminal interface of the user terminal. The sharing sub-interface 1150J may include Z sharing selection controls, where Z is a positive integer. Here, Z may be 3 as an example, and the 3 sharing selection controls may include a sharing selection control 11K1, a sharing selection control 11K2 and a sharing selection control 11K3. One sharing selection control corresponds to one sharing public platform, the sharing sub-interface 1150J may be an interface superimposed on the evaluation result display interface 1140J, and the size of the sharing sub-interface 1150J is smaller than the size of the evaluation result display interface 1140J.
Further, the evaluation object may perform a fourth trigger operation on a target sharing selection control (e.g., the sharing selection control 11K3) of the Z sharing selection controls, so as to display the interface to be published (for example, the interface to be published 1160J shown in fig. 11) of the sharing public platform corresponding to the sharing selection control 11K3, and the evaluation data 11S on the evaluation result display interface 1140J is displayed on the interface to be published 1160J. The interface to be published 1160J may include a publishing control (e.g., the publishing control 112K, i.e., the "publish" control shown in fig. 11). It should be appreciated that the user terminal may respond to a fifth trigger operation performed by the evaluation object on the publishing control 112K, so that the sharing public platform corresponding to the sharing selection control 11K3 publishes the evaluation data 11S. At this point, the user terminal may switch the interface to be published 1160J to the evaluation main interface (e.g., the evaluation main interface 110J) of the evaluation client. Optionally, the user terminal may also switch the interface to be published 1160J to the display interface of the sharing public platform corresponding to the sharing selection control 11K3; the terminal interface displayed after the evaluation data 11S is published will not be limited here.
In the embodiment of the application, the evaluation main interface of the evaluation client run by the user terminal can include N evaluation controls for evaluating terminal performance parameters, and one evaluation control corresponds to one game sub-client. Therefore, in the process of evaluating the user terminal, the evaluation client is not run automatically by the user terminal; instead, the evaluation user is required to interact with the evaluation client, that is, the evaluation user can flexibly select the game sub-client corresponding to any one evaluation control (i.e., the target evaluation control) from the N evaluation controls, so that the attractiveness and interest of the performance evaluation process are enhanced. In addition, after the evaluation user evaluates the game sub-client corresponding to each of the N evaluation controls, the total evaluation score corresponding to the user terminal can be obtained, and a terminal list based on user participation can further be obtained, so that a personalized evaluation result with user characteristics is obtained and the accuracy of performance evaluation is improved.
Further, referring to fig. 12, fig. 12 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The data processing apparatus 1 may be a computer program (comprising program code) running in a computer device, for example, the data processing apparatus 1 is an application software; the data processing device 1 may be adapted to perform the respective steps of the method provided by the embodiments of the application. As shown in fig. 12, the data processing apparatus 1 may be implemented in a computer device having an evaluation client, where the computer device may be a user terminal that is implemented with an evaluation client, for example, the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100a. The data processing apparatus 1 may include: the system comprises a main interface display module 11, a first picture frame display module 12, a second picture frame display module 13, an evaluation result display module 14, an update value acquisition module 15, an evaluation cut-off condition acquisition module 16, a result display interface display module 17, an evaluation score display module 18, a sharing sub-interface display module 19, an interface to be distributed display module 20 and an evaluation data distribution module 21.
The main interface display module 11 is used for displaying N evaluation controls on an evaluation main interface of the evaluation client; one evaluation control corresponds to one game sub-client; n is a positive integer;
the first frame display module 12 is configured to respond to a first trigger operation for a target evaluation control in the N evaluation controls, display an evaluation game interface of a game sub-client corresponding to the target evaluation control, and display a first game frame on the evaluation game interface; the first game picture frame comprises first evaluation prompt information for indicating an evaluation object corresponding to the evaluation client to execute a second trigger operation;
the second frame display module 13 is configured to display a second game frame corresponding to the first game frame on the evaluation game interface in response to the second trigger operation, and switch the evaluation prompt information from the first evaluation prompt information to the second evaluation prompt information in the second game frame.
The first game picture frame comprises an object to be processed with a fixed display position, and the service state of the object to be processed is a first state; the first evaluation prompt information comprises state prompt information and a first evaluation auxiliary parameter with a first initial value;
The second picture frame display module 13 includes: a prompt information hiding unit 131, a hit animation display unit 132, a decrement processing unit 133, an aiming area display position switching unit 134, a picture frame display unit 135, and a change processing unit 136.
The prompt information hiding unit 131 is configured to hide the state prompt information when the display duration of the state prompt information reaches a state display duration threshold, and change a service state of an object to be processed from a first state to a second state on the evaluation game interface;
the hit animation display unit 132 is configured to display a hit animation of the object to be processed on the evaluation game interface in response to a second trigger operation for the evaluation game interface, and determine a second game frame corresponding to the first game frame based on the game frame corresponding to the hit animation.
Wherein the object to be processed is a shooting object; the game sub-clients corresponding to the target evaluation control comprise first game sub-clients; the first game sub-client is used for evaluating the object behavior attribute of the evaluation object;
the hit animation display unit 132 includes: a touch event capture sub-unit 1321, a picture frame generation sub-unit 1322, and a first display sub-unit 1323.
The touch event capturing subunit 1321 is configured to respond to a second trigger operation for the evaluating game interface by running a touch chip of the user terminal with the evaluating client, and capture a first touch event associated with the second trigger operation;
the touch event capturing subunit 1321 is further configured to:
receiving a second trigger operation aiming at the evaluating game interface through a touch control chip of the user terminal running with the evaluating client, and scanning the screen level of the user terminal based on the second trigger operation;
recording operation parameters related to a second triggering operation when the touch chip scans that the screen level changes;
a first touch event associated with a second trigger operation is captured based on the operating parameter.
The frame generation subunit 1322 is configured to send the first touch event to a system driver of the user terminal, and transmit the first touch event to the first game sub-client through the system driver; the first game sub-client is used for processing the first touch event and generating a game picture frame corresponding to the hit animation of the object to be processed;
the first display subunit 1323 is configured to determine, based on the game frame corresponding to the hit animation, a second game frame corresponding to the first game frame, and display the second game frame on the evaluation game interface.
The specific implementation manner of the touch event capturing subunit 1321, the frame generating subunit 1322 and the first display subunit 1323 may be referred to the description of step S401 to step S412 in the embodiment corresponding to fig. 4, and will not be further described herein.
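For ease of understanding, the capture-and-forward flow performed by the above subunits can be sketched as follows; the class and method names are illustrative assumptions used only to make the data path from the touch chip to the first game sub-client explicit.

```python
class TouchChip:
    """Assumed abstraction of the touch chip: scans the screen level and records the
    operation parameters of the second trigger operation."""
    def capture_first_touch_event(self, trigger_operation: dict) -> dict:
        operation_parameters = {"x": trigger_operation["x"],
                                "y": trigger_operation["y"],
                                "timestamp": trigger_operation["timestamp"]}
        return {"type": "first_touch_event", "parameters": operation_parameters}

class FirstGameSubClient:
    """Assumed first game sub-client: processes the first touch event and generates
    the game picture frame corresponding to the hit animation."""
    def process(self, touch_event: dict) -> dict:
        return {"hit_animation_frame": True, "from_event": touch_event["parameters"]}

class SystemDriver:
    """Assumed system driver: transmits the first touch event to the first game sub-client."""
    def __init__(self, game_sub_client: FirstGameSubClient):
        self.game_sub_client = game_sub_client
    def transmit(self, touch_event: dict) -> dict:
        return self.game_sub_client.process(touch_event)

# Illustrative usage of the assumed data path.
frame = SystemDriver(FirstGameSubClient()).transmit(
    TouchChip().capture_first_touch_event({"x": 320, "y": 480, "timestamp": 12.78}))
```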
The decremental processing unit 133 is configured to perform decremental processing on the first initial value in the second game frame, and use a first evaluation auxiliary parameter corresponding to the decremented first initial value as the second evaluation prompt information.
The first evaluation prompt information comprises a second evaluation auxiliary parameter with a second initial value; the evaluating game interface comprises a position operation control, a business operation control and an auxiliary aiming area at a first display position;
the aiming area display position switching unit 134 is configured to switch, when X objects to be processed having random display positions are displayed on the evaluation game interface, the display position of the auxiliary aiming area from the first display position to the second display position in response to a screen-scribing operation for the position operation control; the second display position is determined based on the screen-scribing operation; x is a positive integer;
the frame display unit 135 is configured to display a second game frame corresponding to the first game frame on the evaluation game interface in response to a second trigger operation for the service operation control;
The game sub-client corresponding to the target evaluation control comprises a second game sub-client; the second game sub-client is used for evaluating the object hit attribute of the evaluation object;
the picture frame display unit 135 includes: a touch event acquisition subunit 1351, a display location matching subunit 1352, and a second display subunit 1353.
The touch event obtaining subunit 1351 is configured to obtain, in response to a second trigger operation for the service operation control, a second touch event associated with the second trigger operation through the second game sub-client; the second touch event comprises an auxiliary aiming area with a second display position;
the display position matching subunit 1352 is configured to match the second display position with the display position of each object to be processed in the X objects to obtain a matching result; the display duration of each object to be processed in the X objects to be processed is the same;
the second display subunit 1353 is configured to generate a second game frame corresponding to the first game frame based on the matching result, and display the second game frame on the evaluation game interface.
Wherein the second display subunit 1353 is further configured to:
If the matching result indicates that the second display position is matched with the display position of the target processing object in the X objects to be processed, determining a hit time stamp associated with a second trigger operation through the second game sub-client;
determining a display cut-off time stamp of the target processing object based on the display duration of the target processing object and the display start time stamp of the target processing object;
generating a first type game picture frame for representing a successful hit when the display cut-off timestamp is greater than or equal to the hit timestamp;
and determining a second game picture frame corresponding to the first game picture frame based on the first type game picture frame, and displaying the second game picture frame in the evaluating game interface.
Wherein the second display subunit 1353 is further configured to:
if the matching result indicates that the second display position is not matched with the display position of each object to be processed in the X objects to be processed, generating a second type game picture frame for representing hit failure through the second game sub-client;
and determining a second game picture frame corresponding to the first game picture frame based on the second type game picture frame, and displaying the second game picture frame in the evaluating game interface.
The specific implementation manner of the touch event obtaining subunit 1351, the display position matching subunit 1352, and the second display subunit 1353 may refer to the description of the second game frame in the embodiment corresponding to fig. 3, and will not be further described herein.
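The hit or miss decision made by the second display subunit can be illustrated with the following sketch; the dictionary layout, the time units and the handling of an object whose display has already ended are assumptions.

```python
def determine_frame_type(second_display_position, objects_to_process, hit_timestamp):
    """objects_to_process: list of dicts with 'position', 'display_start' and
    'display_duration' (timestamps assumed to be in seconds).
    Returns which type of game picture frame to generate."""
    for obj in objects_to_process:
        if second_display_position == obj["position"]:
            display_cutoff_timestamp = obj["display_start"] + obj["display_duration"]
            if display_cutoff_timestamp >= hit_timestamp:
                return "first_type_frame"    # hit success
            return "second_type_frame"       # position matched but display already ended (assumed miss)
    return "second_type_frame"               # no display position matched: hit failure
```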
The change processing unit 136 is configured to perform a change process on the second initial value in the second game frame, and use a second evaluation auxiliary parameter corresponding to the changed second initial value as a second evaluation prompt message.
The specific implementation manner of the prompt message hiding unit 131, the hit animation display unit 132, the decrement processing unit 133, the aiming area display position switching unit 134, the frame display unit 135 and the change processing unit 136 may be referred to the description of step S103 in the embodiment corresponding to fig. 3, and the detailed description will not be repeated here.
The evaluation result display module 14 is configured to display an evaluation result associated with the evaluation object on the evaluation game interface.
Wherein the first game picture frame comprises an object to be processed having a first state; the second game picture frame comprises an object to be processed with a second state; the second state is a service state obtained after the state of the first state is changed;
The evaluation result display module 14 includes: the state resetting unit 141 and the evaluation result display unit 142.
The state resetting unit 141 is configured to perform state resetting on the object to be processed having the second state, and display the object to be processed after the state resetting on the evaluating game interface; the business state of the object to be processed after the state reset is a first state;
the evaluation result display unit 142 is configured to, when an evaluation result associated with an evaluation target is obtained, display a target display area having a result display duration threshold on the evaluation game interface, and display an evaluation result on the target display area.
Wherein the evaluation result display unit 142 includes: a change timestamp acquisition subunit 1421, a hit timestamp recording subunit 1422, an object interaction duration determination subunit 1423, and an evaluation result display subunit 1424.
The change timestamp obtaining subunit 1421 is configured to obtain a timestamp for performing a state change on the object to be processed in the first game frame, and use the obtained timestamp as a state change timestamp;
the hit timestamp recording subunit 1422 is configured to record a generation timestamp of the second game screen frame, and use the generation timestamp as a hit timestamp associated with the second trigger operation;
The object interaction duration determining subunit 1423 is configured to determine a time difference between the hit timestamp and the state change timestamp, take the time difference as an object interaction duration of the evaluation object on the user terminal running the evaluation client, and determine an object behavior attribute associated with the evaluation object based on the object interaction duration;
the evaluation result display subunit 1424 is configured to determine a target display area with a result display duration threshold on the evaluation game interface, and display an evaluation result including the object behavior attribute on the target display area.
The specific implementation manner of the change timestamp obtaining subunit 1421, the hit timestamp recording subunit 1422, the object interaction duration determining subunit 1423, and the evaluation result displaying subunit 1424 may be referred to the description of the evaluation result in the embodiment corresponding to fig. 3, and will not be further described herein.
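The object interaction duration determined by the above subunits is simply the time difference between the two recorded timestamps; a minimal sketch is given below, where the units and names are assumptions.

```python
def object_interaction_duration(state_change_timestamp: float, hit_timestamp: float) -> float:
    """Reaction time of the evaluation object: time from the state change of the object
    to be processed (first state to second state) to the hit recorded for the second
    trigger operation. Timestamps are assumed to be in seconds."""
    return hit_timestamp - state_change_timestamp

# e.g. state change at 12.40 s and hit at 12.78 s gives an interaction duration of 0.38 s
print(object_interaction_duration(12.40, 12.78))
```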
The specific implementation manner of the state resetting unit 141 and the evaluation result displaying unit 142 may refer to the description of step S104 in the embodiment corresponding to fig. 3, and the detailed description will not be repeated here.
The update value obtaining module 15 is configured to obtain an update value corresponding to the evaluation auxiliary parameter in the second evaluation prompt message; the updated value is determined after the initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt message is changed;
The evaluation cut-off condition acquisition module 16 is used for acquiring an evaluation cut-off condition associated with a target evaluation control; the evaluation cutoff condition comprises an evaluation threshold;
the result display interface display module 17 is configured to determine that the second evaluation prompt information meets the evaluation cutoff condition if the update value matches the evaluation threshold value, and display an evaluation result display interface of the game sub-client corresponding to the target evaluation control;
the evaluation score display module 18 is configured to display, on an evaluation result display interface, an evaluation score determined by a terminal performance parameter corresponding to the target evaluation control.
The terminal performance parameters corresponding to the target evaluation control comprise object behavior attributes; the evaluation result display interface comprises a first evaluation result display interface for displaying the behavior attribute of the object; the object behavior attribute comprises an object interaction time length used for representing an evaluation object on a user terminal running with an evaluation client;
the evaluation score display module 18 includes: an interaction time length acquisition unit 181, a first evaluation score determination unit 182, a first evaluation score display unit 183, an object hit number acquisition unit 184, a reference hit number acquisition unit 185, a score policy acquisition unit 186, a second evaluation score determination unit 187, and a second evaluation score display unit 188.
The interaction duration obtaining unit 181 is configured to obtain X object interaction durations associated with an evaluation object; x is an initial value corresponding to an evaluation auxiliary parameter in the first evaluation prompt message; x is a positive integer;
the first evaluation score determining unit 182 is configured to obtain an object interaction duration with a minimum value from the X object interaction durations, and determine a first evaluation score corresponding to the evaluation object based on the minimum interaction duration by using the obtained object interaction duration as the minimum interaction duration.
Wherein the first evaluation score determining unit 182 includes: a minimum duration determination subunit 1821, a benchmark interaction duration acquisition subunit 1822, a policy acquisition subunit 1823, and an evaluation score determination subunit 1824.
The minimum duration determination subunit 1821 is configured to obtain an object interaction duration with a minimum value from the X object interaction durations, and take the obtained object interaction duration as a minimum interaction duration;
the reference interaction time length obtaining subunit 1822 is configured to obtain a first evaluation reference score associated with the object behavior attribute and a reference interaction time length corresponding to the first evaluation reference score; the reference interaction time length is an average interaction time length obtained after the Y sample objects are evaluated for the same user terminal; y is a positive integer;
The policy acquisition subunit 1823 is configured to acquire a first score mapping policy associated with the object behavior attribute;
the evaluation score determining subunit 1824 is configured to obtain a first difference between the reference interaction duration and the minimum interaction duration, and determine a first evaluation score corresponding to the evaluation object based on the first difference, the first score mapping policy, and the first evaluation reference score.
The specific implementation manner of the minimum duration determining subunit 1821, the reference interaction duration obtaining subunit 1822, the policy obtaining subunit 1823, and the evaluation score determining subunit 1824 may be referred to the description of the first evaluation score in the embodiment corresponding to fig. 6, and will not be further described herein.
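Mirroring the second score mapping policy, the first evaluation score can be sketched from the minimum interaction duration as follows. The benchmark duration, the step size, the direction of the mapping and the points per step are illustrative assumptions, since this section does not fix concrete values for the object behavior attribute.

```python
def first_evaluation_score(minimum_interaction_duration: float,
                           first_evaluation_benchmark_score: int = 1000,
                           benchmark_interaction_duration: float = 0.5,
                           points_per_step: int = 100,
                           step_seconds: float = 0.1) -> int:
    """First score mapping policy (assumed direction and values): the shorter the minimum
    interaction duration relative to the benchmark interaction duration, the higher the
    first evaluation score."""
    first_difference = benchmark_interaction_duration - minimum_interaction_duration
    return first_evaluation_benchmark_score + round(first_difference / step_seconds) * points_per_step

print(first_evaluation_score(0.38))   # roughly one 0.1 s step faster than the benchmark -> 1100 points
```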
The first evaluation score display unit 183 is configured to display a first evaluation score on a first evaluation result display interface, and use the first evaluation score as an evaluation score determined by a terminal performance parameter corresponding to a target evaluation control.
The terminal performance parameters corresponding to the target evaluation control comprise object hit attributes; the evaluation result display interface comprises a second evaluation result display interface for displaying object hit attributes; the object hit attribute comprises object hit times used for representing the evaluation object on a user terminal running with an evaluation client;
The object hit number obtaining unit 184 is configured to obtain, from the evaluation game interface, the object hit number associated with the evaluation object;
the reference hit number acquisition unit 185 is configured to acquire a second evaluation reference score associated with the object hit attribute and a reference hit number corresponding to the second evaluation reference score; the reference hit number is the average hit number obtained after the Y sample objects are evaluated for the same user terminal; y is a positive integer;
the score policy obtaining unit 186 is configured to obtain a second score mapping policy associated with the object hit attribute;
the second evaluation score determining unit 187 is configured to obtain a second difference between the reference hit number and the object hit number, and determine a second evaluation score corresponding to the evaluation object based on the second difference, the second score mapping policy, and the second evaluation reference score;
the second evaluation score display unit 188 is configured to display a second evaluation score on a second evaluation result display interface, and use the second evaluation score as an evaluation score determined by a terminal performance parameter corresponding to the target evaluation control.
The specific implementation manners of the interaction time length obtaining unit 181, the first evaluation score determining unit 182, the first evaluation score displaying unit 183, the object hit number obtaining unit 184, the reference hit number obtaining unit 185, the score policy obtaining unit 186, the second evaluation score determining unit 187 and the second evaluation score displaying unit 188 may be referred to the description of the second evaluation score in the embodiment corresponding to fig. 9, and will not be described further herein.
The evaluation result display interface comprises a result sharing control and evaluation data associated with an evaluation score;
the sharing sub-interface display module 19 is configured to display a sharing sub-interface independent of the evaluation result display interface in response to a third trigger operation for the result sharing control; the sharing sub-interface comprises Z sharing selection controls; z is a positive integer; one sharing selection control corresponds to one sharing public platform; the sharing sub-interface is an interface overlapped on the evaluating result display interface, and the size of the sharing sub-interface is smaller than that of the evaluating result display interface;
the interface to be distributed display module 20 is configured to respond to a fourth triggering operation for a target sharing selection control in the Z sharing selection controls, display an interface to be distributed of a sharing public platform corresponding to the target sharing selection control, and display evaluation data on the interface to be distributed; the interface to be published comprises a publishing control;
the profile publishing module 21 is configured to respond to a fifth trigger operation for a publishing control, so that the profile is published by a sharing public platform corresponding to the target sharing control.
The specific implementation manner of the main interface display module 11, the first frame display module 12, the second frame display module 13, the evaluation result display module 14, the updated value acquisition module 15, the evaluation cutoff condition acquisition module 16, the result display interface display module 17, the evaluation score display module 18, the sharing sub-interface display module 19, the interface display module to be distributed 20 and the evaluation data distribution module 21 may be referred to the description of the steps S201 to S204 in the embodiment corresponding to fig. 7, and will not be described further herein. In addition, the description of the beneficial effects of the same method is omitted.
Further, referring to fig. 13, fig. 13 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The data processing means 2 may be a computer program (comprising program code) running in a computer device, for example the data processing means 2 is an application software; the data processing device 2 may be adapted to perform the respective steps of the method provided by the embodiments of the application. As shown in fig. 13, the data processing apparatus 2 may be implemented in a computer device having an evaluation client, where the computer device may be a user terminal that is implemented with an evaluation client, for example, the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100a. The data processing apparatus 2 may include: the system comprises an evaluation control display module 100, a result display interface display module 200, an evaluation total score determining module 300 and a performance display interface display module 400.
The evaluation control display module 100 is configured to display N evaluation controls on an evaluation main interface of an evaluation client; one evaluation control corresponds to one game sub-client; n is a positive integer;
the result display interface display module 200 is configured to output, in response to an evaluation operation for a target evaluation control of the N evaluation controls, an evaluation result display interface associated with the target evaluation control in an evaluation client; the evaluation result display interface comprises evaluation scores determined by terminal performance parameters corresponding to target evaluation controls;
The evaluation total score determining module 300 is configured to determine, when N evaluation scores are obtained, an evaluation total score corresponding to a user terminal running with an evaluation client based on the N evaluation scores;
the performance display interface display module 400 is configured to obtain a terminal list associated with the evaluation total, display a performance display interface of the evaluation client, and display the terminal list on the performance display interface.
The specific implementation manner of the evaluation control display module 100, the result display interface display module 200, the evaluation total score determining module 300 and the performance display interface display module 400 may be referred to the description of step S201 to step S204 in the embodiment corresponding to fig. 7, and will not be described in detail here. In addition, the description of the beneficial effects of the same method is omitted.
Further, referring to fig. 14, fig. 14 is a schematic diagram of a computer device according to an embodiment of the application. As shown in fig. 14, the computer device 3000 may be a computer device having an evaluation client, and the computer device may be a user terminal running with the evaluation client, for example, the user terminal may be any one of the user terminals in the user terminal cluster shown in fig. 1, for example, the user terminal 100a. The computer device 3000 may include: at least one processor 3001, e.g., a CPU, at least one network interface 3004, a user interface 3003, memory 3005, at least one communication bus 3002. Wherein the communication bus 3002 is used to enable connected communications between these components. The user interface 3003 may include a Display screen (Display), a Keyboard (Keyboard), and the network interface 3004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others. The memory 3005 may be a high-speed RAM memory or a non-volatile memory (non-volatile memory), such as at least one disk memory. The memory 3005 may also optionally be at least one memory device located remotely from the aforementioned processor 3001. As shown in fig. 14, the memory 3005, which is one type of computer storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the computer device 3000 shown in fig. 14, the network interface 3004 is mainly used for network communication with a server corresponding to an evaluation client (for example, a server 10W shown in fig. 1); while the user interface 3003 is primarily used as an interface for providing input to a user; and the processor 3001 may be used to invoke device control applications stored in the memory 3005.
It should be understood that the computer device 3000 described in the embodiment of the present application may perform the description of the data processing method in the embodiment corresponding to fig. 3 or fig. 7, and may also perform the description of the data processing apparatus 1 in the embodiment corresponding to fig. 12 or the description of the data processing apparatus 2 in the embodiment corresponding to fig. 13, which are not repeated herein. In addition, the description of the beneficial effects of the same method is omitted.
Furthermore, it should be noted here that: the embodiment of the present application further provides a computer readable storage medium, in which the aforementioned computer program executed by the data processing apparatus 1 or the data processing apparatus 2 is stored, and the computer program comprises program instructions; when the program instructions are executed by a processor, the description of the data processing method in the embodiment corresponding to fig. 3 or fig. 7 can be executed, and therefore details will not be repeated here. In addition, the description of the beneficial effects of the same method is omitted. For technical details not disclosed in the embodiments of the computer-readable storage medium according to the present application, please refer to the description of the method embodiments of the present application. As an example, the program instructions may be deployed to be executed on one computing device, on multiple computing devices at one site, or across multiple computing devices distributed across multiple sites and interconnected by a communication network, where the multiple computing devices distributed across multiple sites and interconnected by a communication network may constitute a blockchain system.
In one aspect, the application provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the computer device may execute the description of the data processing method in the embodiment corresponding to fig. 3 or fig. 7, which is not described herein. In addition, the description of the beneficial effects of the same method is omitted.
Those skilled in the art will appreciate that all or part of the flows in the above-described method embodiments may be implemented by a computer program instructing relevant hardware; the computer program may be stored in a computer-readable storage medium, and when executed, may include the flows of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The foregoing disclosure is illustrative of the present application and is not to be construed as limiting the scope of the application, which is defined by the appended claims.

Claims (20)

1. A method of data processing, comprising:
displaying N evaluation controls on an evaluation main interface of an evaluation client; one evaluation control corresponds to one game sub-client; N is a positive integer;
in response to a first trigger operation on a target evaluation control among the N evaluation controls, displaying an evaluation game interface of a game sub-client corresponding to the target evaluation control, and displaying a first game picture frame on the evaluation game interface; the first game picture frame comprises first evaluation prompt information for indicating an evaluation object corresponding to the evaluation client to execute a second trigger operation;
in response to the second trigger operation, displaying a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from the first evaluation prompt information to second evaluation prompt information in the second game picture frame;
and displaying the evaluation result associated with the evaluation object on the evaluation game interface.
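For readers who prefer code, the following Kotlin sketch illustrates one possible shape of the flow recited in claim 1. The names PromptInfo, GameSubClient and EvaluationClient, the prompt strings and the two-sub-client example are editorial assumptions and are not part of the claimed method.

```kotlin
// Minimal, non-authoritative sketch of the claim 1 flow; all names are hypothetical.
data class PromptInfo(val text: String)

class GameSubClient(val name: String) {
    var currentPrompt = PromptInfo("first evaluation prompt: please perform the second trigger operation")

    // First trigger operation: the evaluation game interface shows the first game
    // picture frame together with the first evaluation prompt information.
    fun showFirstFrame(): PromptInfo = currentPrompt

    // Second trigger operation: the second game picture frame replaces the first one
    // and the prompt switches to the second evaluation prompt information.
    fun onSecondTrigger(): PromptInfo {
        currentPrompt = PromptInfo("second evaluation prompt")
        return currentPrompt
    }
}

class EvaluationClient(subClientNames: List<String>) {
    // One evaluation control per game sub-client (N controls in total).
    val controls: Map<String, GameSubClient> = subClientNames.associateWith { GameSubClient(it) }

    fun onFirstTrigger(target: String): PromptInfo = controls.getValue(target).showFirstFrame()
    fun onSecondTrigger(target: String): PromptInfo = controls.getValue(target).onSecondTrigger()
}

fun main() {
    val client = EvaluationClient(listOf("reaction-test", "aim-test"))
    println(client.onFirstTrigger("reaction-test").text)
    println(client.onSecondTrigger("reaction-test").text)
}
```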
2. The method of claim 1, wherein the first game picture frame includes an object to be processed having a fixed display position, and wherein a service state of the object to be processed is a first state; the first evaluation prompt information comprises state prompt information and a first evaluation auxiliary parameter with a first initial value;
The displaying, in response to the second trigger operation, a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from the first evaluation prompt information to second evaluation prompt information in the second game picture frame comprises:
when the display time of the state prompt information reaches a state display time threshold, hiding the state prompt information, and changing the service state of the object to be processed from the first state to a second state on the evaluation game interface;
in response to the second trigger operation for the evaluation game interface, displaying a hit animation of the object to be processed on the evaluation game interface, and determining a second game picture frame corresponding to the first game picture frame based on the game picture frame corresponding to the hit animation;
and in the second game picture frame, decrementing the first initial value, and using the first evaluation auxiliary parameter corresponding to the decremented first initial value as the second evaluation prompt information.
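A minimal sketch of the claim 2 mechanics, assuming a 3-second state display time threshold and an initial counter value of 5 (both values, and all class and member names, are assumptions): once the threshold elapses the state prompt is hidden and the object's service state changes; a hit decrements the counter that is then shown as the second evaluation prompt information.

```kotlin
class ReactionRound(
    private val statePromptThresholdMs: Long = 3_000, // assumed state display time threshold
    initialAttempts: Int = 5                          // assumed first initial value
) {
    var statePromptVisible = true
    var objectState = "first state"
    var remainingAttempts = initialAttempts

    // Called with the elapsed display time of the state prompt information.
    fun onTick(elapsedMs: Long) {
        if (statePromptVisible && elapsedMs >= statePromptThresholdMs) {
            statePromptVisible = false   // hide the state prompt information
            objectState = "second state" // change the service state of the object to be processed
        }
    }

    // Called when the hit animation frame is generated; the decremented value is
    // displayed as the second evaluation prompt information.
    fun onHit(): Int {
        if (objectState == "second state" && remainingAttempts > 0) remainingAttempts--
        return remainingAttempts
    }
}

fun main() {
    val round = ReactionRound()
    round.onTick(3_200)
    println(round.onHit()) // 4
}
```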
3. The method according to claim 2, wherein the object to be processed is a shooting-type object; the game sub-client corresponding to the target evaluation control comprises a first game sub-client; the first game sub-client is used for evaluating the object behavior attribute of the evaluation object;
The displaying, in response to the second trigger operation for the evaluation game interface, a hit animation of the object to be processed on the evaluation game interface, and determining a second game picture frame corresponding to the first game picture frame based on the game picture frame corresponding to the hit animation comprises:
in response to the second trigger operation for the evaluation game interface, capturing, by a touch chip of a user terminal running the evaluation client, a first touch event associated with the second trigger operation;
sending the first touch event to a system driver of the user terminal, and transmitting the first touch event to the first game sub-client through the system driver; the first game sub-client is used for processing the first touch event and generating a game picture frame corresponding to the hit animation of the object to be processed;
and determining a second game picture frame corresponding to the first game picture frame based on the game picture frame corresponding to the hit animation, and displaying the second game picture frame on the evaluation game interface.
4. The method of claim 3, wherein the capturing, by the touch chip of the user terminal running the evaluation client, the first touch event associated with the second trigger operation in response to the second trigger operation for the evaluation game interface comprises:
receiving the second trigger operation for the evaluation game interface through the touch chip of the user terminal running the evaluation client, and scanning the screen level of the user terminal based on the second trigger operation;
recording operation parameters associated with the second triggering operation when the touch chip scans that the screen level changes;
a first touch event associated with the second trigger operation is captured based on the operating parameter.
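The touch-event delivery chain in claims 3 and 4 (the touch chip detects a screen-level change, records the operation parameters, captures the first touch event and hands it to the system driver, which forwards it to the first game sub-client) could look roughly like the following sketch. Every class and function name here is hypothetical.

```kotlin
data class TouchEvent(val x: Float, val y: Float, val timestampMs: Long)

class FirstGameSubClient {
    // Processes the first touch event and produces the game picture frame of the hit animation.
    fun handleTouch(event: TouchEvent) {
        println("hit animation frame generated for touch at (${event.x}, ${event.y}) at ${event.timestampMs} ms")
    }
}

class SystemDriver(private val subClient: FirstGameSubClient) {
    fun dispatch(event: TouchEvent) = subClient.handleTouch(event)
}

class TouchChip(private val driver: SystemDriver) {
    // When scanning detects a screen-level change, the operation parameters are recorded,
    // a first touch event is captured, and the event is handed to the system driver.
    fun onScreenLevelChange(x: Float, y: Float) {
        val event = TouchEvent(x, y, System.currentTimeMillis())
        driver.dispatch(event)
    }
}

fun main() {
    TouchChip(SystemDriver(FirstGameSubClient())).onScreenLevelChange(120f, 480f)
}
```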
5. The method of claim 1, wherein the first evaluation cue information comprises a second evaluation assistance parameter having a second initial value; the assessment game interface comprises a position operation control, a business operation control and an auxiliary aiming area at a first display position;
the displaying, in response to the second trigger operation, a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from the first evaluation prompt information to second evaluation prompt information in the second game picture frame comprises:
when X objects to be processed with random display positions are displayed on the evaluation game interface, in response to a screen-swiping operation for the position operation control, switching the display position of the auxiliary aiming area from the first display position to a second display position; the second display position is determined based on the screen-swiping operation; X is a positive integer;
in response to a second trigger operation for the business operation control, displaying a second game picture frame corresponding to the first game picture frame on the evaluation game interface;
and in the second game picture frame, changing the second initial value, and using the second evaluation auxiliary parameter corresponding to the changed second initial value as the second evaluation prompt information.
6. The method of claim 5, wherein the game sub-client corresponding to the target evaluation control comprises a second game sub-client; the second game sub-client is used for evaluating the object hit attribute of the evaluation object;
the displaying, in response to the second trigger operation for the business operation control, a second game picture frame corresponding to the first game picture frame on the evaluation game interface comprises:
in response to the second trigger operation for the business operation control, acquiring a second touch event associated with the second trigger operation through the second game sub-client; the second touch event includes the auxiliary aiming area having the second display position;
respectively matching the second display position with the display position of each object to be processed in the X objects to be processed to obtain a matching result; the display duration of each object to be processed in the X objects to be processed is the same;
and generating a second game picture frame corresponding to the first game picture frame based on the matching result, and displaying the second game picture frame on the evaluation game interface.
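The position-matching step of claim 6 amounts to comparing the auxiliary aiming area's second display position with the display position of each of the X objects to be processed. A hedged sketch follows, assuming a simple distance test with an arbitrary hit radius; the claim does not prescribe any particular matching rule, and all names are assumptions.

```kotlin
import kotlin.math.hypot

data class Position(val x: Float, val y: Float)

// Returns the index of the first object whose display position matches the aiming
// area's second display position, or null when no object matches.
fun matchAimingArea(
    aimingCenter: Position,
    objects: List<Position>,
    hitRadius: Float = 48f  // assumed tolerance, not specified in the claim
): Int? =
    objects.indexOfFirst { hypot(it.x - aimingCenter.x, it.y - aimingCenter.y) <= hitRadius }
        .takeIf { it >= 0 }

fun main() {
    val targets = listOf(Position(100f, 100f), Position(300f, 240f))
    println(matchAimingArea(Position(310f, 250f), targets)) // 1 (second object matched)
    println(matchAimingArea(Position(500f, 500f), targets)) // null (no match)
}
```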
7. The method of claim 6, wherein the generating a second game picture frame corresponding to the first game picture frame based on the matching result and displaying the second game picture frame on the evaluation game interface comprises:
if the matching result indicates that the second display position is matched with the display position of the target processing object in the X objects to be processed, determining, by the second game sub-client, a hit timestamp associated with the second trigger operation;
determining a display cut-off time stamp of the target processing object based on the display duration of the target processing object and the display start time stamp of the target processing object;
generating, when the display cut-off time stamp is greater than or equal to the hit time stamp, a first type game picture frame used for representing a successful hit;
and determining a second game picture frame corresponding to the first game picture frame based on the first type game picture frame, and displaying the second game picture frame in the evaluation game interface.
8. The method of claim 7, wherein the method further comprises:
if the matching result indicates that the second display position is not matched with the display position of each object to be processed in the X objects to be processed, generating a second type game picture frame for representing hit failure through the second game sub-client;
and determining a second game picture frame corresponding to the first game picture frame based on the second type game picture frame, and displaying the second game picture frame on the evaluation game interface.
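Claims 7 and 8 reduce to a timestamp comparison: a matched position yields a first type (hit success) frame only if the display cut-off time stamp (display start plus display duration) is greater than or equal to the hit time stamp; otherwise a second type (hit failure) frame is produced. A sketch under those assumptions, with hypothetical names:

```kotlin
enum class FrameType { HIT_SUCCESS, HIT_FAILURE }

fun classifyFrame(
    positionMatched: Boolean,
    displayStartMs: Long,
    displayDurationMs: Long,
    hitTimestampMs: Long
): FrameType {
    if (!positionMatched) return FrameType.HIT_FAILURE
    // Display cut-off time stamp = display start time stamp + display duration.
    val displayCutoffMs = displayStartMs + displayDurationMs
    // A hit counts only while the target is still being displayed.
    return if (displayCutoffMs >= hitTimestampMs) FrameType.HIT_SUCCESS else FrameType.HIT_FAILURE
}

fun main() {
    println(classifyFrame(true, displayStartMs = 1_000, displayDurationMs = 800, hitTimestampMs = 1_700)) // HIT_SUCCESS
    println(classifyFrame(true, displayStartMs = 1_000, displayDurationMs = 800, hitTimestampMs = 1_900)) // HIT_FAILURE
}
```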
9. The method of claim 1, wherein the first game picture frame includes an object to be processed having a first state; the second game picture frame comprises an object to be processed with a second state; the second state is a service state obtained after the first state is changed;
the step of displaying the evaluation result associated with the evaluation object on the evaluation game interface comprises the following steps:
performing a state reset on the object to be processed with the second state, and displaying the object to be processed after the state reset on the evaluation game interface; the service state of the object to be processed after the state reset is the first state;
And when the evaluation result associated with the evaluation object is acquired, displaying a target display area with a result display duration threshold on the evaluation game interface, and displaying the evaluation result on the target display area.
10. The method of claim 9, wherein upon obtaining an evaluation result associated with the evaluation object, displaying a target display area having a result display duration threshold on the evaluation game interface and displaying the evaluation result on the target display area comprises:
acquiring a time stamp for carrying out state change on an object to be processed in the first game picture frame, and taking the acquired time stamp as a state change time stamp;
recording a generation time stamp of the second game picture frame, and taking the generation time stamp as a hit time stamp associated with the second trigger operation;
determining a time difference value between the hit time stamp and the state change time stamp, taking the time difference value as an object interaction duration of the evaluation object on a user terminal running the evaluation client, and determining an object behavior attribute associated with the evaluation object based on the object interaction duration;
And determining a target display area with a result display duration threshold on the evaluation game interface, and displaying an evaluation result comprising the object behavior attribute on the target display area.
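In claim 10 the object interaction duration is simply the difference between the hit time stamp and the state change time stamp, i.e. the measured reaction time on the user terminal. A trivial sketch (function name assumed):

```kotlin
// Object interaction duration = hit time stamp - state change time stamp.
fun interactionDurationMs(stateChangeTimestampMs: Long, hitTimestampMs: Long): Long =
    hitTimestampMs - stateChangeTimestampMs

fun main() {
    // The target changes state at t = 2,000 ms and is hit at t = 2,230 ms,
    // giving an object interaction duration (reaction time) of 230 ms.
    println(interactionDurationMs(2_000, 2_230)) // 230
}
```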
11. The method according to claim 1, wherein the method further comprises:
acquiring an updated value corresponding to the evaluation auxiliary parameter in the second evaluation prompt information; the updated value is determined after changing an initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt information;
acquiring an evaluation cutoff condition associated with the target evaluation control; the evaluation cut-off condition comprises an evaluation threshold;
if the updated value is matched with the evaluation threshold value, determining that the second evaluation prompt information meets the evaluation cut-off condition, and displaying an evaluation result display interface of the game sub-client corresponding to the target evaluation control;
and displaying the evaluation score determined by the terminal performance parameters corresponding to the target evaluation control on the evaluation result display interface.
12. The method of claim 11, wherein the terminal performance parameters corresponding to the target evaluation control include an object behavior attribute; the evaluation result display interface comprises a first evaluation result display interface for displaying the object behavior attribute; the object behavior attribute comprises an object interaction duration used for characterizing the evaluation object on a user terminal running the evaluation client;
The displaying, on the evaluation result display interface, the evaluation score determined by the terminal performance parameter corresponding to the target evaluation control comprises:
acquiring X object interaction durations associated with the evaluation object; X is the initial value corresponding to the evaluation auxiliary parameter in the first evaluation prompt information, and X is a positive integer;
acquiring an object interaction duration with a minimum value from the X object interaction durations, taking the acquired object interaction duration as the minimum interaction duration, and determining a first evaluation score corresponding to the evaluation object based on the minimum interaction duration;
and displaying the first evaluation score on the first evaluation result display interface, and taking the first evaluation score as an evaluation score determined by the terminal performance parameters corresponding to the target evaluation control.
13. The method according to claim 12, wherein the obtaining the object interaction duration with the minimum value from the X object interaction durations, taking the obtained object interaction duration as the minimum interaction duration, and determining the first evaluation score corresponding to the evaluation object based on the minimum interaction duration, includes:
Acquiring an object interaction duration with a minimum value from the X object interaction durations, and taking the acquired object interaction duration as the minimum interaction duration;
acquiring a first evaluation benchmark score associated with the object behavior attribute and a benchmark interaction duration corresponding to the first evaluation benchmark score; the benchmark interaction duration is an average interaction duration obtained after Y sample objects are evaluated for the same user terminal; Y is a positive integer;
obtaining a first score mapping policy associated with the object behavior attribute;
and acquiring a first difference value between the benchmark interaction duration and the minimum interaction duration, and determining a first evaluation score corresponding to the evaluation object based on the first difference value, the first score mapping policy and the first evaluation benchmark score.
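Claim 13 leaves the first score mapping policy open; one plausible, purely illustrative choice is a linear mapping in which each millisecond by which the minimum interaction duration beats the benchmark interaction duration adds a fixed number of points to the first evaluation benchmark score. The rate used below, and the names, are assumptions:

```kotlin
fun firstEvaluationScore(
    interactionDurationsMs: List<Long>, // the X measured object interaction durations
    benchmarkDurationMs: Long,          // benchmark interaction duration (average over Y sample objects)
    benchmarkScore: Double,             // first evaluation benchmark score
    pointsPerMs: Double = 0.1           // assumed linear score mapping policy
): Double {
    val minDurationMs = requireNotNull(interactionDurationsMs.minOrNull()) { "no durations recorded" }
    val firstDifference = benchmarkDurationMs - minDurationMs
    // A reaction faster than the benchmark yields a positive difference and a score above the benchmark score.
    return benchmarkScore + firstDifference * pointsPerMs
}

fun main() {
    println(firstEvaluationScore(listOf(260L, 230L, 245L), benchmarkDurationMs = 250L, benchmarkScore = 80.0)) // 82.0
}
```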
14. The method of claim 11, wherein the terminal performance parameters corresponding to the target evaluation control include an object hit attribute; the evaluation result display interface comprises a second evaluation result display interface for displaying the object hit attribute; the object hit attribute comprises the object hit times of the evaluation object on a user terminal running the evaluation client;
The displaying, on the evaluation result display interface, the evaluation score determined by the terminal performance parameter corresponding to the target evaluation control comprises:
acquiring the object hit times associated with the evaluation object from the evaluation game interface;
acquiring a second evaluation benchmark score associated with the object hit attribute and benchmark hit times corresponding to the second evaluation benchmark score; the benchmark hit times are average hit times obtained after the Y sample objects are evaluated for the same user terminal; Y is a positive integer;
acquiring a second score mapping policy associated with the object hit attribute;
acquiring a second difference value between the benchmark hit times and the object hit times, and determining a second evaluation score corresponding to the evaluation object based on the second difference value, the second score mapping policy and the second evaluation benchmark score;
and displaying the second evaluation score on the second evaluation result display interface, and taking the second evaluation score as an evaluation score determined by the terminal performance parameters corresponding to the target evaluation control.
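Claim 14 mirrors claim 13 with hit counts instead of durations; a sketch with an assumed points-per-hit mapping (the claim only requires some second score mapping policy):

```kotlin
fun secondEvaluationScore(
    objectHitTimes: Int,        // hit times of the evaluation object on the user terminal
    benchmarkHitTimes: Int,     // benchmark hit times (average over Y sample objects)
    benchmarkScore: Double,     // second evaluation benchmark score
    pointsPerHit: Double = 2.0  // assumed linear score mapping policy
): Double {
    val secondDifference = benchmarkHitTimes - objectHitTimes
    // More hits than the benchmark gives a negative difference, raising the score above the benchmark score.
    return benchmarkScore - secondDifference * pointsPerHit
}

fun main() {
    println(secondEvaluationScore(objectHitTimes = 14, benchmarkHitTimes = 12, benchmarkScore = 80.0)) // 84.0
}
```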
15. The method of claim 11, wherein the evaluation result display interface includes a result sharing control and evaluation data associated with the evaluation score;
The method further comprises the steps of:
in response to a third trigger operation for the result sharing control, displaying a sharing sub-interface independent of the evaluation result display interface; the sharing sub-interface comprises Z sharing selection controls; Z is a positive integer; one sharing selection control corresponds to one sharing public platform; the sharing sub-interface is an interface overlaid on the evaluation result display interface, and the size of the sharing sub-interface is smaller than that of the evaluation result display interface;
in response to a fourth trigger operation on a target sharing selection control among the Z sharing selection controls, displaying an interface to be published of the sharing public platform corresponding to the target sharing selection control, and displaying the evaluation data on the interface to be published; the interface to be published comprises a publish control;
and in response to a fifth trigger operation for the publish control, causing the sharing public platform corresponding to the target sharing selection control to publish the evaluation data.
16. A method of data processing, comprising:
displaying N evaluation controls on an evaluation main interface of an evaluation client; one evaluation control corresponds to one game sub-client; N is a positive integer;
in response to an evaluation operation for a target evaluation control among the N evaluation controls, outputting, in the evaluation client, an evaluation result display interface associated with the target evaluation control; the evaluation result display interface comprises an evaluation score determined by the terminal performance parameter corresponding to the target evaluation control;
when N evaluation scores are obtained, determining, based on the N evaluation scores, an evaluation total score corresponding to a user terminal running the evaluation client;
and acquiring a terminal list associated with the evaluation total score, displaying a performance display interface of the evaluation client, and displaying the terminal list on the performance display interface.
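How the N evaluation scores are combined into an evaluation total score, and how the terminal list is ordered, is not fixed by claim 16; the sketch below assumes a plain average and a descending sort by total score, with illustrative names throughout.

```kotlin
data class TerminalEntry(val model: String, val totalScore: Double)

// Assumed aggregation: arithmetic mean of the N evaluation scores.
fun evaluationTotalScore(scores: List<Double>): Double = scores.average()

// Assumed ordering: the terminal list is sorted by total score, highest first.
fun terminalList(current: TerminalEntry, known: List<TerminalEntry>): List<TerminalEntry> =
    (known + current).sortedByDescending { it.totalScore }

fun main() {
    val total = evaluationTotalScore(listOf(82.0, 84.0, 78.5))
    terminalList(
        TerminalEntry("this terminal", total),
        listOf(TerminalEntry("terminal A", 90.1), TerminalEntry("terminal B", 75.4))
    ).forEach { println("${it.model}: ${it.totalScore}") }
}
```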
17. A data processing apparatus, comprising:
the main interface display module is used for displaying N evaluation controls on an evaluation main interface of an evaluation client; one evaluation control corresponds to one game sub-client; N is a positive integer;
the first picture frame display module is used for responding to a first trigger operation of a target evaluation control in the N evaluation controls, displaying an evaluation game interface of a game sub-client corresponding to the target evaluation control, and displaying a first game picture frame on the evaluation game interface; the first game picture frame comprises first evaluation prompt information for indicating an evaluation object corresponding to the evaluation client to execute a second trigger operation;
The second picture frame display module is used for responding to the second trigger operation, displaying a second game picture frame corresponding to the first game picture frame on the evaluation game interface, and switching the evaluation prompt information from the first evaluation prompt information to a second evaluation prompt information in the second game picture frame;
and the evaluation result display module is used for displaying the evaluation result associated with the evaluation object on the evaluation game interface.
18. A computer device, comprising: a processor and a memory;
the processor is connected to the memory, wherein the memory is configured to store a computer program, and the processor is configured to invoke the computer program to cause the computer device to perform the method of any one of claims 1-16.
19. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program adapted to be loaded and executed by a processor to cause a computer device having the processor to perform the method of any of claims 1-16.
20. A computer program product or computer program, characterized in that it comprises computer instructions stored in a computer-readable storage medium, which are adapted to be read and executed by a processor to cause a computer device with the processor to perform the method of any of claims 1-16.
CN202111191647.3A 2021-10-13 2021-10-13 Data processing method, device, equipment and storage medium Active CN113886208B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111191647.3A CN113886208B (en) 2021-10-13 2021-10-13 Data processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113886208A CN113886208A (en) 2022-01-04
CN113886208B true CN113886208B (en) 2023-11-03

Family

ID=79002644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111191647.3A Active CN113886208B (en) 2021-10-13 2021-10-13 Data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113886208B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400519A (en) * 2013-08-08 2013-11-20 广东小天才科技有限公司 Method and device for evaluating user's operation
CN104503877A (en) * 2014-12-30 2015-04-08 北京奇虎科技有限公司 Method and device for evaluating intelligent terminal
CN106815130A (en) * 2016-12-26 2017-06-09 珠海金山网络游戏科技有限公司 A kind of method and system of the game quality grading based on mobile terminal hardware
CN109189667A (en) * 2018-08-02 2019-01-11 惠州Tcl移动通信有限公司 Fluency evaluating method and evaluating apparatus, evaluating tool, the storage device of terminal
WO2019091420A1 (en) * 2017-11-09 2019-05-16 腾讯科技(深圳)有限公司 Data display method and device, storage medium, and electronic device
CN109908574A (en) * 2019-02-22 2019-06-21 网易(杭州)网络有限公司 Game role control method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN113886208A (en) 2022-01-04

Similar Documents

Publication Publication Date Title
KR102257801B1 (en) Notification method in virtual scene, related device and computer storage medium
CN109756787B (en) Virtual gift generation method and device and virtual gift presentation system
CN110337319B (en) User analysis system and method
CN106075904A (en) The method and device of cross-platform game fighting, terminal, system
JP6492198B2 (en) Information processing method, terminal, and computer storage medium
US20170312629A1 (en) Video game control server, video game control apparatus, and video game control program product
US10874937B2 (en) Intelligent hardware interaction method and system
CN107911374B (en) Data synchronization method and device, storage medium and electronic device
CN111298430A (en) Virtual item control method and device, storage medium and electronic device
KR101264624B1 (en) Server and the method for matching an opponent using game ranking in real time
CN112337090A (en) Event message broadcasting method and device, storage medium and electronic device
US20190217203A1 (en) User analysis system and method
CN111950670A (en) Virtual interaction task execution method and device, storage medium and electronic device
CN113648650B (en) Interaction method and related device
CN105577641B (en) System and method for inviting users to participate in an activity based on interactive recording
CN114470792A (en) Team matching method, storage medium and electronic device
CN112138379B (en) Interaction method and device among different application modes and storage medium
CN113886208B (en) Data processing method, device, equipment and storage medium
CN113730921B (en) Recommendation method and device for virtual organization, storage medium and electronic equipment
CN108452528B (en) Data display method and device and computer readable storage medium
KR101565473B1 (en) Method and system for providing game
KR101183731B1 (en) Method and server for providing service of using item
CN111111183A (en) Battle watching method and device for casual game objects and server
CN113713379B (en) Object matching method and device, storage medium and electronic equipment
CN109481938B (en) Image generation device and image generation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant