CN117762762A - Automatic test method, device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN117762762A
Authority
CN
China
Prior art keywords: target, test, user, result, failure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311833939.1A
Other languages
Chinese (zh)
Inventor
王博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xintong Semiconductor Technology Co ltd
Original Assignee
Shenzhen Xintong Semiconductor Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xintong Semiconductor Technology Co ltd filed Critical Shenzhen Xintong Semiconductor Technology Co ltd
Priority to CN202311833939.1A priority Critical patent/CN117762762A/en
Publication of CN117762762A publication Critical patent/CN117762762A/en
Pending legal-status Critical Current

Landscapes

  • Debugging And Monitoring (AREA)

Abstract

The present disclosure provides an automated testing method and apparatus, an electronic device, and a storage medium, belonging to the technical field of automated testing. The method includes: receiving target input information entered by a user in a visual interface of a test platform, where the target input information includes a target test environment, a target graphics card driver, a target test tool, and at least one target test case, each target test case being used to test the performance of a graphics processing unit (GPU); screening out a target test machine matching the target test environment from a plurality of test machines connected to a server; controlling the target test machine to install the target graphics card driver and the target test tool; controlling the target test machine to execute the at least one target test case through the target test tool; and outputting at least one running result corresponding to the at least one target test case. The method simplifies the configuration work that a user would otherwise need to perform manually before testing.

Description

Automatic test method, device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the technical field of automated testing, and in particular to an automated testing method and apparatus, an electronic device, and a storage medium.
Background
A graphics processing unit (GPU) is a microprocessor that performs drawing operations on an electronic device. Before a GPU leaves the factory, its performance usually needs to be tested.
Before the performance of the GPU can be tested, the user must manually configure the test machine. This process is tedious, requires the user to spend a long time on pre-test preparation, and makes the test machine configuration prone to errors.
Disclosure of Invention
The present disclosure provides an automated testing method and apparatus, an electronic device, and a storage medium, which can simplify the configuration work that a user would otherwise need to perform manually before testing.
The technical scheme of the present disclosure is realized as follows:
In a first aspect, the present disclosure provides an automated testing method applied to a server, the method including: receiving target input information entered by a user in a visual interface of a test platform, where the target input information includes a target test environment, a target graphics card driver, a target test tool, and at least one target test case, each target test case being used to test the performance of a graphics processing unit (GPU); screening out a target test machine matching the target test environment from a plurality of test machines connected to the server; controlling the target test machine to install the target graphics card driver and the target test tool; controlling the target test machine to execute the at least one target test case through the target test tool; and outputting at least one running result corresponding to the at least one target test case.
In a second aspect, the present disclosure provides an automated testing apparatus including a receiving part, a screening part, a control part, and an output part. The receiving part is configured to receive target input information entered by a user in a visual interface of a test platform, where the target input information includes a target test environment, a target graphics card driver, a target test tool, and at least one target test case, each target test case being used to test the performance of a graphics processing unit (GPU). The screening part is configured to screen out a target test machine matching the target test environment from a plurality of test machines connected to a server. The control part is configured to control the target test machine to install the target graphics card driver and the target test tool, and to control the target test machine to execute the at least one target test case through the target test tool. The output part is configured to output at least one running result corresponding to the at least one target test case.
In a third aspect, the present disclosure provides an electronic device including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the automated testing method according to the first aspect.
In a fourth aspect, the present disclosure provides a computer-readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the automated testing method according to the first aspect.
In a fifth aspect, the present disclosure provides a computer program product including a computer program or instructions which, when run on a processor, cause the processor to carry out the steps of the automated testing method according to the first aspect.
In a sixth aspect, the present disclosure provides a chip including a processor and a communication interface coupled to the processor, the communication interface being configured to run programs or instructions implementing the automated testing method according to the first aspect.
The present disclosure provides an automated testing method including: receiving target input information entered by a user in a visual interface of a test platform, where the target input information includes a target test environment, a target graphics card driver, a target test tool, and at least one target test case; screening out a target test machine matching the target test environment from a plurality of test machines connected to a server; controlling the target test machine to install the target graphics card driver and the target test tool; controlling the target test machine to execute the at least one target test case through the target test tool; and outputting at least one running result corresponding to the at least one target test case. The test platform provided by the embodiments of the present disclosure includes a visual interface; for the configuration work other than the test itself, the user only needs to select the required configuration in the visual interface, and the server controls the test machine to complete the configuration and execute the test cases. This reduces the user's workload, avoids errors caused by manual operation, and standardizes and simplifies pre-test configuration, thereby improving test efficiency.
Drawings
FIG. 1 is a schematic view of an application scenario provided by the present disclosure;
FIG. 2 is the first flow chart of the automated testing method provided by the present disclosure;
FIG. 3a is a schematic diagram of a test environment configuration interface provided by the present disclosure;
FIG. 3b is a schematic diagram of a driver and test tool configuration interface provided by the present disclosure;
FIG. 3c is a schematic diagram of a test case selection interface provided by the present disclosure;
FIG. 4 is the second flow chart of the automated testing method provided by the present disclosure;
FIG. 5 is the third flow chart of the automated testing method provided by the present disclosure;
FIG. 6 is the fourth flow chart of the automated testing method provided by the present disclosure;
FIG. 7 is the fifth flow chart of the automated testing method provided by the present disclosure;
FIG. 8 is the sixth flow chart of the automated testing method provided by the present disclosure;
FIG. 9 is a block diagram of an automated testing apparatus provided by the present disclosure;
FIG. 10 is a schematic diagram of the hardware structure of an electronic device provided by the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly described below with reference to the drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments fall within the scope of protection of the present disclosure.
The terms "first," "second," and the like in the description are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It will be appreciated that terms so used may be interchanged where appropriate, so that the present disclosure may be practiced in sequences other than those illustrated or described herein. Objects identified by "first," "second," etc. are generally of one type, and the number of such objects is not limited; for example, the first object may be one or more objects. In addition, "and/or" in the specification denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
First, FIG. 1 is a schematic view of an application scenario of the present disclosure. As shown in FIG. 1, a server 10 is connected to a plurality of test machines; test machines 20, 21, and 22 are shown in FIG. 1. The server 10 and any test machine are communicatively connected through a network such as a local area network (LAN) or a wireless local area network (WLAN). The server 10 may be one cluster or a plurality of clusters and may include one or more types of servers. The server 10 provides an operable visual user interface to receive user input and controls the test machines in accordance with that input.
The automated testing method provided by the present disclosure is described in detail below with reference to the accompanying drawings by means of specific embodiments and application scenarios thereof.
As shown in FIG. 2, the present disclosure provides an automated testing method, described below by way of example with a server as the execution body. The method may include steps S201 to S205 described below.
S201, receiving target input information input by a user in a visual interface of the test platform.
The target input information includes: a target test environment, a target graphics card driver, a target test tool, and at least one target test case, each target test case being used to test the performance of a graphics processing unit (GPU).
In this example, the test platform provided by the server includes a visual interface in which a user can enter the configuration data required before testing. As shown in FIGS. 3a to 3c, an exemplary visual interface includes three tabs.
The first tab is the test environment configuration interface. As shown in FIG. 3a, the test environment configuration 30 page includes: a CPU architecture 301 option, a GPU vendor 302 option, a memory size 303 option, an operating system 304 option, a motherboard vendor 305 option, and an other 306 option. Each option can be expanded to show a list of alternatives; for example, the list of alternatives for the operating system 304 option includes: the Windows operating system, the Mac operating system, the Linux operating system, the Chrome operating system, the Android operating system, and the iOS operating system. The user may select the desired test environment from the option lists as needed. Typically the test environment configuration includes options 301 to 305 described above, but in some special cases other configurations may be required, which the user can add through the other 306 option. After the selection is completed, the user can save the currently selected configuration by clicking the save 307 button.
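The environment fields collected by the first tab can be modeled as a small record. The sketch below is illustrative only: the field names and types are assumptions, not the platform's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class TestEnvironment:
    """One saved configuration from the test-environment tab (hypothetical schema)."""
    cpu_arch: str            # e.g. "x86_64" or "arm64"
    gpu_vendor: str
    memory_gb: int
    operating_system: str    # e.g. "Linux", "Windows"
    motherboard_vendor: str
    # Extra configurations added via the "other" option, kept open-ended
    other: dict = field(default_factory=dict)

env = TestEnvironment("x86_64", "VendorA", 16, "Linux", "BoardCo")
```

Saving a configuration would then amount to serializing such a record; the open `other` dict mirrors the interface's provision for special-case options.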
The second tab is the driver and test tool configuration interface. As shown in FIG. 3b, the driver and test tool configuration 31 includes: a graphics driver 311 option and a test tool 312 option. Each option can be expanded to show a list of alternatives; for example, the list available for the test tool 312 includes glmark2, glxgears, FurMark, etc. The graphics driver 311 option includes various versions of graphics drivers that become selectable after being deployed. After the selection is completed, the user can save the currently selected configuration by clicking the save 313 button.
The third tab is the test case selection interface. As shown in FIG. 3c, the test case selection 32 includes: a test case 321 option and an add test case 322 button. The test case 321 option can be expanded to show a selectable list of test cases, such as test cases for testing the conventional functions of the GPU and test cases for stress testing, which the user may select as needed. If a required test case cannot be found in the list expanded from the test case 321 option, the user may add one or more test cases by clicking add test case 322; specifically, the user may edit a single test case and add it, or directly import a file containing a plurality of test cases. After the selection and addition are completed, the user can save the currently selected configuration by clicking the save 323 button. Once the configuration of all three tabs is completed and the user clicks the start 324 button, the server executes the subsequent operations according to the data received from the first to third tabs.
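Clicking start effectively submits the three tabs' selections as one request. A minimal sketch of such a payload builder follows; the function name and key names are hypothetical, chosen only to mirror the fields described above.

```python
def build_target_input(environment, graphics_driver, test_tool, test_cases,
                       contacts=None):
    """Assemble the target input information submitted when start is clicked.

    The dict keys are illustrative, not the platform's real schema. The method
    requires at least one target test case, so an empty selection is rejected.
    """
    if not test_cases:
        raise ValueError("at least one target test case is required")
    return {
        "environment": environment,          # selections from the first tab
        "graphics_driver": graphics_driver,  # second tab
        "test_tool": test_tool,              # second tab
        "test_cases": list(test_cases),      # third tab
        "contacts": list(contacts or []),    # optional extra recipients
    }

payload = build_target_input({"os": "Linux", "cpu_arch": "x86_64"},
                             "gfx-driver-1.2.0", "glmark2", ["draw_basic"])
```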
S202, screening out target test machines matched with a target test environment from a plurality of test machines connected with a server.
In this example, the server is connected to a plurality of test machines. The test environment of each test machine is either stored in the server in advance, or the server issues a test environment request message and each test machine reports its test environment to the server. The server then screens out, from the plurality of test machines, a target test machine that satisfies the target test environment input by the user.
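The screening in S202 is essentially a subset match between the user's target environment and each machine's reported environment. A minimal sketch, assuming environments are plain key-value dicts:

```python
def screen_test_machines(machines, target_env):
    """Return the test machines whose reported environment satisfies every
    field of the user's target environment (subset match). Schema is assumed."""
    return [m for m in machines
            if all(m.get("env", {}).get(k) == v for k, v in target_env.items())]

machines = [
    {"id": "tm-20", "env": {"os": "Linux", "gpu_vendor": "VendorA"}},
    {"id": "tm-21", "env": {"os": "Windows", "gpu_vendor": "VendorA"}},
]
matched = screen_test_machines(machines, {"os": "Linux"})
```

Fields the user did not constrain are simply ignored, which matches the interface allowing only some options to be selected.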
S203, controlling the target test machine to install the target graphics card driver and the target test tool.
The server, or a database corresponding to the server, stores various graphics card drivers and test tools in advance. According to the user's selection, the server sends the target graphics card driver and the target test tool to the target test machine and controls the target test machine to install them. A graphics card driver is software that enables the hardware GPU to deliver its full performance.
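The push-then-install flow of S203 can be sketched as below. The `transfer` and `run` callables are stand-ins for the server's remote-control channel (the patent does not specify its transport), and the `install` command string is purely illustrative.

```python
def install_on_target(machine_id, packages, transfer, run):
    """Send each package (driver first, then tool) to the target machine and
    run its installer; stop and report failure on the first non-zero exit code.

    `transfer(machine_id, pkg)` copies the package; `run(machine_id, cmd)`
    executes a command remotely and returns its exit code. Both are assumed.
    """
    for pkg in packages:
        transfer(machine_id, pkg)
        if run(machine_id, f"install {pkg}") != 0:
            return False
    return True

sent = []
ok = install_on_target("tm-20",
                       ["gfx-driver-1.2.0.pkg", "glmark2.pkg"],
                       transfer=lambda m, p: sent.append(p),
                       run=lambda m, cmd: 0)
```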
S204, controlling the target test machine to execute the at least one target test case through the target test tool.
In this example, according to the user's selection and input, the server sends the at least one target test case to the target test machine and controls the target test machine to execute each target test case through the target test tool.
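Executing the selected cases through the chosen tool and collecting one running result per case might look like the following. The command-line form `tool --case name` is an assumption; real tools such as glmark2 have their own flags.

```python
def run_test_cases(machine_id, tool, cases, run):
    """Execute each target test case through the selected test tool and map
    each case name to a "pass"/"fail" running result based on the exit code.

    `run(machine_id, cmd)` is a stand-in for remote execution, as in S203.
    """
    results = {}
    for case in cases:
        exit_code = run(machine_id, f"{tool} --case {case}")  # invocation form assumed
        results[case] = "pass" if exit_code == 0 else "fail"
    return results

results = run_test_cases("tm-20", "glmark2", ["draw_basic", "stress_mem"],
                         run=lambda m, cmd: 0 if "draw" in cmd else 1)
```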
S205, outputting at least one running result corresponding to at least one target test case.
The at least one running result may be output in log form to a default path, or may be displayed in the visual interface of the test platform, as determined by actual needs; this is not specifically limited in the present disclosure.
The test platform provided by the embodiments of the present disclosure includes a visual interface; for the configuration work other than the test itself, the user only needs to select the required configuration in the visual interface, and the server controls the test machine to complete the configuration and execute the test cases. This reduces the user's workload, avoids errors caused by manual operation, and standardizes and simplifies pre-test configuration, thereby improving test efficiency.
For a test case whose running result indicates failure, if the result only indicates the failure itself, a tester must spend additional time searching for the reason the test case failed. In some embodiments of the present disclosure, as shown in FIG. 4 in conjunction with FIG. 2, after step S205 described above, the method further includes step S206 described below.
S206, in a case where the at least one running result includes a failure result indicating a running failure, performing defect tracking on the failure result and generating a problem description corresponding to the failure result.
In this example, for a test case that failed to execute, a defect tracking tool (e.g., Jira) may track the defect and generate a problem description corresponding to the failure result (e.g., the Jira tool generates a ticket). The problem description generally includes: a problem identifier to facilitate quick searching; the title of the problem and its detailed information; the urgency of the problem; and so on. The generated problem description can be output to the visual interface, and a tester can determine the reason for the test failure directly from it. In this way, for test cases that failed to execute, the tester can determine the cause of the failure from the problem description, so that defects are tracked automatically.
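The fields listed above (identifier, title, details, urgency) can be assembled into a problem description record as sketched below. This is not the Jira REST API; the function and field names are illustrative stand-ins for whatever the tracker would generate.

```python
import itertools

_ids = itertools.count(1)  # simple sequential identifier, illustrative only

def make_problem_description(case_name, details, urgency="medium"):
    """Build a problem description for a failed test case, mirroring the
    fields the generated ticket would carry (hypothetical schema)."""
    return {
        "id": f"BUG-{next(_ids)}",                    # identifier for quick searching
        "title": f"Test case '{case_name}' failed",   # title of the problem
        "details": details,                           # detailed information
        "urgency": urgency,                           # urgency of the problem
    }

desc = make_problem_description("stress_mem", "exit code 1; see run log",
                                urgency="high")
```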
In some embodiments of the present disclosure, when there are many test cases to be executed and a long execution period is required, the tester need not watch the visual interface continuously; after execution of the test cases is completed, if any test case failed, the problem description of that test case is sent to the tester. Specifically, with reference to FIG. 4, as shown in FIG. 5, after step S206, the method further includes the following steps S207 and S208.
S207, determining a first notification mode corresponding to a first user identifier from a pre-stored correspondence between user identifiers and notification modes.
The first user identifier is used for indicating a first user, and the first user is a user currently logged in the test platform.
And S208, based on the first notification mode, sending the problem description to the first user.
In this example, after a user logs in to the test platform, the user identifier of the currently logged-in user may be recorded. When execution of the test cases is completed, the problem description generated for a failed test case may be sent to the first user according to the first notification mode (such as a short message notification or a mail notification) corresponding to the first user identifier. Thus, the user does not need to watch the test platform continuously; the problem description corresponding to a failed test result is sent to the user immediately after case execution completes, so that the user can check the problem at once.
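The lookup-and-dispatch of S207/S208 reduces to a mapping from user identifier to notification mode, and from mode to a delivery channel. A minimal sketch, with fake channels standing in for real SMS/mail gateways:

```python
# Stand-ins for real delivery channels; a production system would call an
# SMS gateway or mail server here.
NOTIFIERS = {
    "sms":  lambda user, msg: f"SMS to {user}: {msg}",
    "mail": lambda user, msg: f"Mail to {user}: {msg}",
}

def notify(user_id, message, user_modes):
    """Look up the pre-stored notification mode for the user identifier and
    dispatch the problem description through the matching channel.

    Falling back to mail when no mode is stored is an assumption of this
    sketch, not something the method specifies.
    """
    mode = user_modes.get(user_id, "mail")
    return NOTIFIERS[mode](user_id, message)

out = notify("tester01", "Test case 'stress_mem' failed", {"tester01": "sms"})
```

Notifying a second user identifier (steps S209/S210) is the same lookup applied to each configured contact.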
In some embodiments of the present disclosure, the problem description may need to be sent to users other than the one logged in to the test platform (e.g., developers or supervisors); therefore, the target input information further includes a second user identifier. Referring to FIG. 4, after step S206 described above, the method further includes the following steps S209 and S210, as shown in FIG. 6.
S209, determining a second notification mode corresponding to the second user identification from the corresponding relation between the pre-stored user identification and the notification mode.
And S210, based on a second notification mode, sending a problem description to a second user.
In this example, the user is allowed to select, in the visual interface, at least one user identifier that should receive the problem description. The second user identifier may indicate one or more second users. If it indicates one user, that user is notified through the notification mode corresponding to that user in the pre-stored correspondence between user identifiers and notification modes; if it indicates a plurality of users, each user is notified through the notification mode corresponding to that user. This allows the user to set contacts, and the set contacts can conveniently receive, in real time, the problem descriptions of test cases that failed to execute.
In some embodiments of the present disclosure, when there is a need to view the running results of previous test cases, the running results of executed test cases can be retrieved. Specifically, with reference to FIG. 2, as shown in FIG. 7, after step S205, the method further includes the following step S211.
S211, storing at least one operation result into a database.
In this example, the running result of each test case is persisted to the database, which facilitates subsequent viewing, summarizing, comparison, and the like.
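Persisting per-case results could look like the following SQLite sketch. The table and column names are assumptions; the patent does not specify the database or schema.

```python
import sqlite3

def save_results(db, run_id, results):
    """Persist one run's per-case running results so they can be viewed,
    summarized, and compared later (illustrative schema)."""
    db.execute("""CREATE TABLE IF NOT EXISTS run_results (
                      run_id    TEXT,
                      case_name TEXT,
                      outcome   TEXT)""")
    db.executemany("INSERT INTO run_results VALUES (?, ?, ?)",
                   [(run_id, case, outcome) for case, outcome in results.items()])
    db.commit()

db = sqlite3.connect(":memory:")  # in-memory for the sketch; a server would use a real DB
save_results(db, "run-001", {"draw_basic": "pass", "stress_mem": "fail"})
```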
In some embodiments of the present disclosure, as shown in fig. 8 in conjunction with fig. 7, after the step S211, the method further includes the following steps S212 to S215.
S212, receiving the test cases to be compared selected by the user in the visual interface.
S213, acquiring a plurality of operation results corresponding to the test cases to be compared from the database.
S214, comparing the operation results to obtain a comparison result.
S215, outputting a comparison result.
In this example, if the user needs to compare the running results of test cases (for example, a test case has been run on multiple versions of the graphics card driver and the results need to be compared), the user may select the test cases to be compared. The server obtains all historically stored running results of those test cases from the database, compares them, and outputs the comparison result (which may include, for example, the environment configuration, graphics card driver, test tool, time, and running result). Thus, the user can obtain, on demand, a comparison of the historical running results of a test case.
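For the driver-version comparison mentioned above, the per-run records can be grouped by driver version and failing versions flagged. The record schema below is an assumption consistent with the stored fields just listed.

```python
def compare_results(history):
    """Compare stored runs of one test case across graphics-driver versions
    and list the driver versions under which the case failed.

    `history` is a list of records like {"driver": ..., "outcome": ...};
    the schema is illustrative, not the patent's actual comparison format.
    """
    by_driver = {rec["driver"]: rec["outcome"] for rec in history}
    failed_on = sorted(d for d, o in by_driver.items() if o == "fail")
    return {"by_driver": by_driver, "failed_on": failed_on}

report = compare_results([
    {"driver": "1.2.0", "outcome": "pass"},
    {"driver": "1.3.0", "outcome": "fail"},
])
```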
FIG. 9 is a block diagram of an automated testing apparatus of the present disclosure. As shown in FIG. 9, the apparatus includes: a receiving part 901, a screening part 902, a control part 903, and an output part 904. The receiving part 901 is configured to receive target input information entered by a user in a visual interface of a test platform, where the target input information includes: a target test environment, a target graphics card driver, a target test tool, and at least one target test case, each target test case being used to test the performance of a graphics processing unit (GPU). The screening part 902 is configured to screen out a target test machine matching the target test environment from a plurality of test machines connected to a server. The control part 903 is configured to control the target test machine to install the target graphics card driver and the target test tool, and to control the target test machine to execute the at least one target test case through the target test tool. The output part 904 is configured to output at least one running result corresponding to the at least one target test case.
In some embodiments of the present disclosure, the apparatus further includes a defect tracking and result generating part, configured to, after the at least one running result corresponding to the at least one test case is output, and in a case where the at least one running result includes a failure result indicating a running failure, perform defect tracking on the failure result and generate a problem description corresponding to the failure result, so that a tester can determine the reason for the test failure from the problem description.
In some embodiments of the present disclosure, the apparatus further includes a determining part and a transmitting part. The determining part is configured to, after defect tracking is performed on the failure result and the corresponding problem description is generated, determine a first notification mode corresponding to a first user identifier from a pre-stored correspondence between user identifiers and notification modes, where the first user identifier indicates the first user, i.e., the user currently logged in to the test platform. The transmitting part is configured to send the problem description to the first user based on the first notification mode.
In some embodiments of the present disclosure, the target input information further includes a second user identifier. The determining part is further configured to, after defect tracking is performed on the failure result and the corresponding problem description is generated, determine a second notification mode corresponding to the second user identifier from the pre-stored correspondence between user identifiers and notification modes. The transmitting part is further configured to send the problem description to the second user based on the second notification mode.
In some embodiments of the present disclosure, the apparatus further includes a storage part, configured to save the at least one running result to a database after the at least one running result corresponding to the at least one test case is output.
In some embodiments of the present disclosure, the apparatus further includes an acquisition part and a comparison part. The receiving part 901 is further configured to receive, in the visual interface, the test cases to be compared selected by the user after the at least one running result is saved to the database. The acquisition part is configured to obtain, from the database, a plurality of running results corresponding to the test cases to be compared. The comparison part is configured to compare the plurality of running results to obtain a comparison result. The output part 904 is further configured to output the comparison result.
It should be noted that the automated testing apparatus may be the electronic device in the foregoing method embodiments, or may be a functional module and/or functional entity in the electronic device capable of implementing the functions of the apparatus embodiments; this is not limited in the present disclosure.
In the embodiments of the present disclosure, each module may implement the automated testing method provided in the foregoing method embodiments and achieve the same technical effects; to avoid repetition, details are not described here again.
Referring to FIG. 10, a block diagram of an electronic device according to an exemplary embodiment of the present disclosure is shown. In some examples, the electronic device may be at least one of a smart phone, a smart watch, a desktop computer, a laptop computer, a virtual reality terminal, an augmented reality terminal, and a wireless terminal. The electronic device has a communication function and can access a wired or wireless network. An electronic device may refer broadly to one of a plurality of terminals; those skilled in the art will recognize that the number of terminals may be greater or smaller. The electronic device performs the computing and processing operations of the technical solutions of the present disclosure, which is not limited by the present disclosure.
As shown in fig. 10, the electronic device in the present disclosure may include one or more of the following components: a processor 1010 and a memory 1020.
Optionally, the processor 1010 uses various interfaces and lines to connect the various parts of the electronic device, and performs the various functions of the electronic device and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 1020 and by invoking data stored in the memory 1020. Alternatively, the processor 1010 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA). The processor 1010 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a neural-network processing unit (NPU), a baseband chip, and the like. The CPU mainly handles the operating system, user interface, application programs, etc.; the GPU is used for rendering and drawing the content to be displayed on the touch display screen; the NPU is used to implement artificial intelligence (AI) functionality; and the baseband chip is used for wireless communication. It will be appreciated that the baseband chip may not be integrated into the processor 1010 and may instead be implemented by a separate chip.
The Memory 1020 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (ROM). Optionally, the memory 1020 includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). Memory 1020 may be used to store instructions, programs, code, sets of codes, or instruction sets. The memory 1020 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above respective method embodiments, etc.; the storage data area may store data created according to the use of the electronic device, etc.
In addition, those skilled in the art will appreciate that the configuration of the electronic device shown in the above figures does not constitute a limitation on the electronic device: the electronic device may include more or fewer components than illustrated, may combine certain components, or may have a different arrangement of components. For example, the electronic device may further include a display screen, a camera assembly, a microphone, a speaker, a radio frequency circuit, an input unit, sensors (such as an acceleration sensor, an angular velocity sensor, or a light sensor), an audio circuit, a WiFi module, a power supply, a Bluetooth module, and the like, which are not described here again.
The present disclosure also provides a computer readable storage medium storing at least one instruction for execution by a processor to implement the automated test method described in the various embodiments above.
The present disclosure also provides a computer program product comprising computer instructions stored in a computer-readable storage medium. A processor of an electronic device reads the computer instructions from the computer-readable storage medium and executes them, so that the electronic device implements the automated test method described in the above embodiments.
An embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement each process of the above automated test method embodiments and achieve the same technical effects. To avoid repetition, details are not described here again.
It should be understood that the chip referred to in the embodiments of the present application may also be called a system-on-chip, a chip system, or a system-on-a-chip, etc.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, apparatuses, servers, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division into units is merely a logical functional division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections via some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
Those of skill in the art will appreciate that, in one or more of the examples described above, the functions described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
It should be noted that the embodiments described in the present disclosure may be combined arbitrarily, provided no conflict arises.
The foregoing is merely a specific implementation of the present invention, and the protection scope of the present invention is not limited thereto; any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. An automated test method, applied to a server, the method comprising:
receiving target input information input by a user in a visual interface of a test platform, wherein the target input information comprises: a target test environment, a target graphics card driver, a target test tool, and at least one target test case, each target test case being used for testing performance of a graphics processing unit (GPU);
screening out, from a plurality of test machines connected to the server, a target test machine matching the target test environment;
controlling the target test machine to install the target graphics card driver and the target test tool;
controlling the target test machine to execute the at least one target test case through the target test tool; and
outputting at least one running result corresponding to the at least one target test case.
2. The method of claim 1, wherein after outputting the at least one running result corresponding to the at least one target test case, the method further comprises:
in a case where the at least one running result includes a failure result indicating a run failure, performing defect tracking on the failure result and generating a problem description corresponding to the failure result, so that a tester can determine the cause of the test failure according to the problem description.
3. The method of claim 2, wherein, in the case where the at least one running result includes a failure result indicating a run failure, after performing defect tracking on the failure result and generating the problem description corresponding to the failure result, the method further comprises:
determining a first notification mode corresponding to a first user identifier from a pre-stored correspondence between user identifiers and notification modes, wherein the first user identifier indicates a first user, and the first user is the user currently logged in to the test platform; and
sending the problem description to the first user based on the first notification mode.
4. The method of claim 2, wherein the target input information further comprises a second user identifier; and, in the case where the at least one running result includes a failure result indicating a run failure, after performing defect tracking on the failure result and generating the problem description corresponding to the failure result, the method further comprises:
determining a second notification mode corresponding to the second user identifier from the pre-stored correspondence between user identifiers and notification modes; and
sending the problem description to a second user indicated by the second user identifier based on the second notification mode.
5. The method according to any one of claims 1 to 4, wherein after outputting the at least one running result corresponding to the at least one target test case, the method further comprises:
saving the at least one running result to a database.
6. The method of claim 5, wherein after saving the at least one running result to the database, the method further comprises:
receiving a to-be-compared test case selected by the user in the visual interface;
acquiring, from the database, a plurality of running results corresponding to the to-be-compared test case;
comparing the plurality of running results to obtain a comparison result; and
outputting the comparison result.
7. An automated test apparatus, the apparatus comprising: a receiving part, a screening part, a control part, and an output part; wherein
the receiving part is configured to receive target input information input by a user in a visual interface of a test platform, the target input information comprising: a target test environment, a target graphics card driver, a target test tool, and at least one target test case, each target test case being used for testing performance of a graphics processing unit (GPU);
the screening part is configured to screen out, from a plurality of test machines connected to a server, a target test machine matching the target test environment;
the control part is configured to control the target test machine to install the target graphics card driver and the target test tool;
the control part is further configured to control the target test machine to execute the at least one target test case through the target test tool; and
the output part is configured to output at least one running result corresponding to the at least one target test case.
8. The apparatus of claim 7, wherein the apparatus further comprises a defect tracking and result generating part; and
the defect tracking and result generating part is configured to, after the at least one running result corresponding to the at least one target test case is output, and in a case where the at least one running result includes a failure result indicating a run failure, perform defect tracking on the failure result and generate a problem description corresponding to the failure result, so that a tester can determine the cause of the test failure according to the problem description.
9. An electronic device, comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the automated test method according to any one of claims 1 to 6.
10. A computer-readable storage medium, having stored thereon a program or instructions which, when executed by a processor, implement the steps of the automated test method according to any one of claims 1 to 6.
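For illustration only, and not as part of the claimed subject matter, the server-side flow recited in claim 1 (screen a matching test machine, install the driver and test tool, execute the test cases, output the running results) can be sketched roughly as follows. All names here (`TestMachine`, `run_automated_tests`, the `install`/`execute` callbacks) are hypothetical stand-ins, not an actual implementation from the disclosure; real installation and execution would be remote operations on the selected test machine.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TestMachine:
    name: str
    environment: str  # e.g. an OS/hardware description used for matching

@dataclass
class TargetInput:
    test_environment: str   # target test environment selected in the visual interface
    gpu_driver: str         # target graphics card driver
    test_tool: str          # target test tool
    test_cases: List[str]   # at least one target test case

def run_automated_tests(target: TargetInput,
                        machines: List[TestMachine],
                        install: Callable[[TestMachine, str], None],
                        execute: Callable[[TestMachine, str, str], str]) -> List[str]:
    # Step 1: screen out a test machine matching the target test environment.
    candidates = [m for m in machines if m.environment == target.test_environment]
    if not candidates:
        raise RuntimeError("no connected test machine matches the target test environment")
    machine = candidates[0]
    # Step 2: install the target graphics card driver and the target test tool.
    install(machine, target.gpu_driver)
    install(machine, target.test_tool)
    # Step 3: execute each target test case through the test tool and collect results.
    results = [execute(machine, target.test_tool, case) for case in target.test_cases]
    # Step 4: output one running result per target test case.
    return results
```

A caller would supply `install`/`execute` callbacks that perform the real remote operations; in the claims, a failure result among the returned list would then trigger defect tracking and notification (claims 2–4) or be saved to a database for later comparison (claims 5–6).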
CN202311833939.1A 2023-12-27 2023-12-27 Automatic test method, device, electronic equipment and storage medium Pending CN117762762A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311833939.1A CN117762762A (en) 2023-12-27 2023-12-27 Automatic test method, device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117762762A true CN117762762A (en) 2024-03-26

Family

ID=90312391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311833939.1A Pending CN117762762A (en) 2023-12-27 2023-12-27 Automatic test method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117762762A (en)

Similar Documents

Publication Publication Date Title
EP3575975B1 (en) Method and apparatus for operating smart network interface card
CN110928770B (en) Software testing method, device, system, storage medium and electronic equipment
CN112015654A (en) Method and apparatus for testing
CN112527397A (en) Method and device for modifying basic input output system options and computer equipment
CN114546738A (en) Server general test method, system, terminal and storage medium
CN108052449B (en) Operating system running state detection method and device
CN113138886A (en) Method and device for testing embedded equipment and testing equipment
CN111858364A (en) Parameter configuration method, device and system of test terminal
CN112148607A (en) Interface testing method and device for service scene
WO2022100075A1 (en) Method and apparatus for performance test, electronic device and computer-readable medium
CN113448730A (en) Service processing method and device, computer equipment and storage medium
CN112817869A (en) Test method, test device, test medium, and electronic apparatus
CN111611124B (en) Monitoring equipment analysis method, device, computer device and storage medium
CN112416700A (en) Analyzing initiated predictive failures and SMART logs
CN115913913B (en) Network card pre-starting execution environment function fault positioning method and device
CN117762762A (en) Automatic test method, device, electronic equipment and storage medium
CN110048940A (en) Sending method, device, server and the readable storage medium storing program for executing of instant communication message
CN113495843B (en) Method and apparatus for testing play-up performance of video player
CN114064510A (en) Function testing method and device, electronic equipment and storage medium
CN113849356A (en) Equipment testing method and device, electronic equipment and storage medium
CN109684525B (en) Document display method and device, storage medium and test equipment
JP2023504956A (en) Performance detection method, device, electronic device and computer readable medium
CN112230924A (en) Popup frame prompting method and device, computer equipment and storage medium
CN112650557A (en) Command execution method and device
CN110968519A (en) Game testing method, device, server and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination