CN111258913A - Automatic algorithm testing method and device, computer system and readable storage medium - Google Patents


Info

Publication number
CN111258913A
CN111258913A (application CN202010102333.0A)
Authority
CN
China
Prior art keywords
test
algorithm
target
server
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010102333.0A
Other languages
Chinese (zh)
Inventor
刘璐
臧磊
单以磊
匡原
彭涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OneConnect Smart Technology Co Ltd
OneConnect Financial Technology Co Ltd Shanghai
Original Assignee
OneConnect Financial Technology Co Ltd Shanghai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OneConnect Financial Technology Co Ltd Shanghai filed Critical OneConnect Financial Technology Co Ltd Shanghai
Priority to CN202010102333.0A priority Critical patent/CN111258913A/en
Publication of CN111258913A publication Critical patent/CN111258913A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites

Abstract

The invention discloses an automatic algorithm testing method, a device, a computer system and a readable storage medium in the technical field of testing, comprising the following steps: storing production data obtained from a production server in a data server and converting it into test data, deploying a test algorithm in the test server, and storing a test logic data packet associated with a test requirement; receiving target information output by a user side, determining a target test algorithm in the test server according to the target information, extracting target test data from the data server, and determining a target test logic in the test server according to the test requirement; and summarizing and packaging the target test logic and the target test data to form a test script, controlling the target test algorithm to run the test script to obtain a test result, calling a callback interface to generate a callback signal, and outputting the callback signal and the test result to the user side. The invention achieves the effect of automatically testing an algorithm and avoids the low working efficiency and high randomness of current testing.

Description

Automatic algorithm testing method and device, computer system and readable storage medium
Technical Field
The invention relates to the technical field of computers, in particular to an automatic algorithm testing method, an automatic algorithm testing device, a computer system and a readable storage medium.
Background
With the increasingly widespread application of artificial intelligence, in which deep learning is the mainstream, the scale and complexity of engineered deep-learning algorithms keep growing. For example, when an existing deep-learning algorithm interface undergoes function expansion, performance optimization, model iteration and the like, a large number of tests and a large amount of test data are needed to guarantee the performance of the algorithm.
The current common approach is to test the algorithm manually. Because a large number of manual test operations must be repeated, this causes low working efficiency, high randomness, a high chance of error and an extended development period, so the consistency and accuracy of requirements, test cases and test results cannot be guaranteed. Moreover, because the test resources of manual testing, and of today's rather rudimentary automated testing, are not adequately monitored, coverage testing of the algorithm's functional and performance requirements cannot be achieved.
Disclosure of Invention
The invention aims to provide an automatic algorithm testing method, an automatic algorithm testing device, a computer system and a readable storage medium that solve the prior-art problems of low working efficiency, high randomness, a high chance of error and extended development periods caused by repeating a large number of manual test operations, as well as the inability to guarantee the consistency and accuracy of requirements, test cases and test results.
In order to achieve the above object, the present invention provides an automatic algorithm testing method, which comprises:
storing production data acquired from a production server to a data server and converting the production data into test data, deploying a test algorithm in the test server, and storing a test logic data packet associated with a test requirement;
receiving target information output by a user side, determining a target test algorithm in a test server according to the target information, extracting target test data from a data server, and determining a target test logic in the test server according to a test requirement selected by the user side;
and summarizing and packaging the target test logic and the target test data to form a test script, controlling a target test algorithm to run the test script through a test server to obtain a test result, calling a callback interface of the test server to generate a callback signal, and outputting the callback signal and the test result to the user side.
In the above solution, the deploying of the test algorithm in the test server includes:
storing a test algorithm with test information in the test server;
and deploying the test algorithm in the test server, and loading the deployment path of the test algorithm in the test server into the test information.
In the above solution, the storing the test logic data packet associated with the test requirement includes:
storing the pre-generated test logic data packet in a test server, and obtaining the storage path of the test logic data packet in the test server; the test logic data packet is a data packet recording the method used to test the test data;
and setting a logic table in the test server, and writing the storage path and the test requirement into the logic table to associate the test logic data packet with the test requirement.
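The logic table described above, which associates each test requirement with the storage path of its test logic data packet, can be sketched as follows. This is an illustrative model only; the class and path names are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the logic table kept in the test server: each row
# maps a test requirement to the storage path of its pre-generated test
# logic data packet, so the packet can be extracted directly later.

class LogicTable:
    def __init__(self):
        self._rows = {}

    def associate(self, requirement: str, storage_path: str) -> None:
        # Write the storage path and the test requirement into the table.
        self._rows[requirement] = storage_path

    def packet_path(self, requirement: str) -> str:
        # Look up the test logic data packet associated with a requirement.
        return self._rows[requirement]

table = LogicTable()
table.associate("smoke_test", "/test/packets/smoke.pkl")
table.associate("performance_test", "/test/packets/perf.pkl")
```

With the table in place, determining a target test requirement is enough to locate the associated packet without searching the server.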
In the above solution, the receiving of the target information output by the user side includes:
creating a front-end node and a back-end node which stores a test algorithm and a test logic data packet in the test server;
outputting an algorithm dialog box with a test algorithm ID to a user side through the front end node, and setting the test algorithm ID selected and sent by the user side in the algorithm dialog box as a target algorithm ID;
outputting a test version ID matched with a target algorithm ID to a front-end node through the back-end node according to the target algorithm ID sent by the front-end node; outputting a version dialog box with the test version ID to the user side through the front end node, and setting the version ID selected and sent by the user side in the version dialog box as a target version ID;
and summarizing the target algorithm ID and the target version ID to form target information.
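The two-stage dialog flow above (pick an algorithm ID, then pick one of the version IDs matched to it, then summarize both into target information) can be simulated with plain functions. All IDs and the catalog here are invented for illustration; the real front-end and back-end nodes are not specified this way in the patent.

```python
# Illustrative sketch of assembling target information from the user side's
# two dialog-box selections: first the target algorithm ID, then a target
# version ID that the back-end node offered for that algorithm.

def versions_for(algorithm_id: str) -> list:
    # Stand-in for the back-end node: version IDs matched to an algorithm ID.
    catalog = {"algo-001": ["v1", "v2"], "algo-002": ["v1"]}
    return catalog[algorithm_id]

def build_target_info(chosen_algorithm: str, chosen_version: str) -> dict:
    # Summarize the target algorithm ID and target version ID into target info.
    if chosen_version not in versions_for(chosen_algorithm):
        raise ValueError("version not offered for this algorithm")
    return {"target_algorithm_id": chosen_algorithm,
            "target_version_id": chosen_version}

info = build_target_info("algo-001", "v2")
```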
In the above scheme, the determining a target test algorithm in the test server according to the target information and extracting target test data from the data server includes:
determining, by the front-end node, a target test algorithm in the back-end node according to the target information;
extracting test data matched with the target information from a data server through the back-end node, and sending the serial number of the test data to the user side through the front-end node in a data dialog box mode;
and the front-end node forwards the serial number selected and sent by the user side in the data dialog box to the back-end node, and the back-end node extracts the test data corresponding to the serial number and sets it as the target test data.
In the foregoing solution, the determining a target test logic in the test server according to the selected test requirement of the user side includes:
outputting a requirement dialog box with a test requirement to the user side through the front end node, and setting the test requirement selected and sent by the user side in the requirement dialog box as a target test requirement;
setting, by the front-end node, a test logic packet associated with the target test requirement as target test logic in a back-end node.
In the above scheme, the controlling, by the test server, the test algorithm to run the test script to obtain the test result includes:
extracting a target test logic in the test script through the test server, and judging and acquiring a test requirement of the target test logic;
controlling, by the test server according to the type of the test requirement, the test algorithm to run the test script to obtain a test result: if the type of the test requirement is a smoke test, running the test script through the test algorithm to obtain a smoke test result; if the type of the test requirement is an algorithm response time test, running the test script through the test algorithm to obtain a response test result; and if the type of the test requirement is a performance test, running the test script through the test algorithm and starting an acquisition program, so that the acquisition program collects the machine resource occupation of the test server while the script runs and generates a performance test result.
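The three-way dispatch on the requirement type can be sketched as below. The resource-sampling acquisition program is stubbed out, and every name is an assumption for illustration rather than the patent's actual code.

```python
# Minimal sketch of the S3 dispatch: the test server inspects the type of
# test requirement carried by the target test logic and runs the test
# script accordingly (smoke / response time / performance).

def run_script(script: str) -> str:
    # Stand-in for the target test algorithm executing the test script.
    return f"ran {script}"

def execute(requirement_type: str, script: str) -> dict:
    if requirement_type == "smoke_test":
        return {"kind": "smoke", "result": run_script(script)}
    if requirement_type == "response_time_test":
        return {"kind": "response", "result": run_script(script)}
    if requirement_type == "performance_test":
        # Start the acquisition program, run the script, report resource use.
        usage = {"cpu": "stubbed", "mem": "stubbed"}
        return {"kind": "performance",
                "result": run_script(script),
                "usage": usage}
    raise ValueError(f"unknown requirement type: {requirement_type}")
```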
In order to achieve the above object, the present invention further provides an automatic algorithm testing device, including:
the test preparation module is used for storing the production data acquired from the production server to the data server and converting the production data into test data, deploying a test algorithm in the test server and storing a test logic data packet associated with a test requirement;
the requirement identification module is used for receiving target information output by a user terminal, determining a target test algorithm in a test server according to the target information, extracting target test data from a data server, and determining a target test logic in the test server according to the test requirement selected by the user terminal;
and the test execution module is used for summarizing and packaging the target test logic and the target test data to form a test script, controlling a target test algorithm through a test server to run the test script to obtain a test result, calling a callback interface of the test server to generate a callback signal, and outputting the callback signal and the test result to the user side.
The invention also provides a computer system, which comprises a plurality of computer devices, wherein each computer device comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, and the processors of the plurality of computer devices jointly realize the steps of the automatic testing method of the algorithm when executing the computer program.
In order to achieve the above object, the present invention further provides a computer-readable storage medium, which includes a plurality of storage media, each storage medium having a computer program stored thereon, wherein the computer programs stored in the storage media, when executed by a processor, collectively implement the steps of the automatic algorithm testing method.
According to the automatic algorithm testing method, device, computer system and readable storage medium, production data is automatically obtained from the production server, and the production data and the algorithm information of its production algorithm are stored in the data server and associated with each other to form test data. Testers therefore no longer need to compile or manually obtain test data, which reduces their fatigue, improves data-acquisition efficiency, guarantees that the test data fully meets the requirements of the production environment, and in turn guarantees the accuracy and reliability of the test and the monitoring coverage of test resources. By deploying the test algorithm used to run the test data in the test server and setting the test algorithm matched with the target information as the target test algorithm, a user obtains a pre-deployed target algorithm simply by making a selection on the user side; this makes it convenient to select and execute a test algorithm and avoids wasting a tester's time on loading it. A logic table is set in the test server, in which the test logic data packet is pre-stored, to associate the packet with a test requirement; by determining the target test requirement, the user side can directly extract the associated test logic data packet, which improves the generation efficiency of the test script. Setting the test data matched with the target information as the target test data in the data server ensures the consistency of the test requirement, the test data, and the test script built from the test data and the test algorithm. Finally, by obtaining the deployment path in the test information of the target test algorithm, the test server is controlled to make the target test algorithm run the test script and obtain the test result, and the user side receives various test results. Coverage testing of the algorithm's functions is thus realized, the technical effect of automatically testing the algorithm is achieved, and the problems of low working efficiency, high randomness, a high chance of error and extended development periods caused by manually producing test scripts and testing algorithms are solved.
Drawings
FIG. 1 is a schematic diagram of an environment application of a first embodiment of an automatic algorithm testing method according to the present invention;
FIG. 2 is a flowchart of a first embodiment of an automatic algorithm testing method according to the present invention;
FIG. 3 is a flowchart of obtaining the production data and converting it into test data in S1 of the first embodiment of the automatic algorithm testing method of the present invention;
FIG. 4 is a flowchart of deploying the test algorithm in the test server in S1 of the first embodiment;
FIG. 5 is a flowchart of storing the test logic data packet associated with the test requirement in S1 of the first embodiment;
FIG. 6 is a flowchart of receiving the target information output by the user side in S2 of the first embodiment;
FIG. 7 is a flowchart of determining the target test algorithm and extracting the target test data in S2 of the first embodiment;
FIG. 8 is a flowchart of determining the target test logic in S2 of the first embodiment;
FIG. 9 is a flowchart of obtaining the test result in S3 of the first embodiment;
FIG. 10 is a flowchart of running the test script according to the type of the test requirement to obtain the test result in S3 of the first embodiment;
FIG. 11 is a schematic diagram of program modules of an automatic algorithm testing device according to a second embodiment of the present invention;
fig. 12 is a schematic diagram of a hardware structure of a computer device in the third embodiment of the computer system according to the present invention.
Reference numerals:
1. automatic algorithm testing device; 2. test server; 3. production server; 4. data server; 5. user terminal; 6. network; 7. computer device; 11. test preparation module; 12. requirement identification module; 13. test execution module; 14. result summarizing module; 61. network a; 62. network b; 63. network c; 71. memory; 72. processor
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides an automatic algorithm testing method, an automatic algorithm testing device, a computer system and a readable storage medium suitable for the computer field, built on a test preparation module, a requirement identification module, a test execution module and a result summarizing module. Production data and the algorithm information of its production algorithm are obtained from the production server, stored in the data server and associated with each other to form test data. A test algorithm for running the test data is deployed in the test server; the user side sends target information, and the test algorithm matched with that information is set as the target test algorithm. A logic table is set in the test server, in which the test logic data packet is pre-stored, to associate the packet with a test requirement, so that by determining the target test requirement the user side can directly extract the associated test logic data packet. The test data matched with the target information is set as the target test data and packaged with the test logic data packet associated with the test requirement to form a test script. Finally, by obtaining the deployment path in the test information of the target test algorithm, the test server is controlled to make the target test algorithm run the test script and obtain a test result, and a callback signal generated by calling the callback interface is output to the user side together with the test result.
Fig. 1 schematically shows an environment application diagram of an automatic algorithm testing method according to a first embodiment of the present application.
In an exemplary embodiment, the automatic algorithm testing method runs in a test server 2; the test server 2 is connected to a data server 4 and a plurality of user terminals 5 through a network 6, and the data server 4 is connected to a production server 3 through the network 6. The network 6 comprises a network a61, a network b62 and a network c63: the test server 2 is connected to the data server 4 through the network a61, the test server 2 is connected to the user terminals 5 through the network b62, and the data server 4 is connected to the production server 3 through the network c63. The test server 2 obtains the production data and the algorithm information from the production server 3 through the data server 4, stores them in the data server 4 and associates them to obtain test data.
A user inputs target information through a user side 5; the test server 2 determines a target test algorithm in the test server 2 according to the target information and extracts target test data from the data server 4. The test server 2 collects and packages the target test logic and the target test data to form a test script, controls the target test algorithm to run the test script to obtain a test result, calls a callback interface of the test server 2 to generate a callback signal, and outputs the callback signal and the test result to the user side 5.
The test server 2 may provide services through one or more networks 6, and the networks 6 may include various network devices, such as routers, switches, multiplexers, hubs, modems, bridges, repeaters, firewalls, proxy devices, and/or the like. The network 6 may include physical links, such as coaxial cable links, twisted pair cable links, fiber optic links, combinations thereof, and/or the like. The network 6 may include wireless links such as cellular links, satellite links, Wi-Fi links, and/or the like.
The test server 2 may be comprised of a single or multiple computer devices (e.g., servers). The single or multiple computing devices may include virtualized compute instances. The virtualized computing instance may include a virtual machine, such as an emulation of a computer system, an operating system, a server, and so forth. The computing device may load the virtual machine based on a virtual image and/or other data that defines the particular software (e.g., operating system, dedicated application, server) used for emulation. As the demand for different types of processing services changes, different virtual machines may be loaded and/or terminated on one or more computing devices. A hypervisor may be implemented to manage the use of different virtual machines on the same computing device.
Example one
Referring to fig. 2, the automatic algorithm testing method of this embodiment includes:
s1: storing production data acquired from a production server to a data server and converting the production data into test data, deploying a test algorithm in the test server, and storing a test logic data packet associated with a test requirement;
s2: receiving target information output by a user side, determining a target test algorithm in a test server according to the target information, extracting target test data from a data server, and determining a target test logic in the test server according to a test requirement selected by the user side;
s3: and summarizing and packaging the target test logic and the target test data to form a test script, controlling a target test algorithm to run the test script through a test server to obtain a test result, calling a callback interface of the test server to generate a callback signal, and outputting the callback signal and the test result to the user side.
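The three steps S1 to S3 can be sketched as one pipeline. Every helper below is a stand-in for a subsystem described later in the embodiment; none of the function or field names come from the patent itself.

```python
# Hedged end-to-end sketch of the method: prepare test data (S1), select
# target logic and data (S2), then package and run the script (S3).

def prepare(production_data: list) -> list:             # S1
    # Convert production data pulled from the production server into test data.
    return [{"converted": d} for d in production_data]

def select(test_data: list, target_info: dict) -> dict:  # S2
    # Determine target test logic and target test data from the target info.
    return {"logic": target_info["requirement"], "data": test_data}

def run(selection: dict) -> dict:                        # S3
    # Summarize and package logic plus data into a script, run it, and
    # attach the callback signal sent back to the user side.
    script = {"script": (selection["logic"], tuple(selection["data"]))}
    return {"result": "pass", "callback": "test_finished", **script}

outcome = run(select(prepare(["req_a"]), {"requirement": "smoke_test"}))
```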
The automatic algorithm testing method provided by this embodiment automatically obtains production data from the production server, stores the production data and the algorithm information of its production algorithm in the data server, and associates them with each other to form test data. Testers therefore no longer need to compile or manually obtain test data, which reduces their fatigue and improves data-acquisition efficiency; because the test data is obtained from the production environment, it fully meets that environment's requirements, which in turn guarantees the accuracy and reliability of the test. And because the production data is obtained automatically from the production server, the monitoring coverage of test resources is guaranteed.
A test algorithm for running the test data is deployed in the test server. Through the target information sent by the user side, that is, the test algorithm ID and test version ID selected or entered in a dialog box, the test algorithm matched with the target information is set as the target test algorithm, so that the target algorithm is determined automatically and the target test data is extracted from the data server. A user obtains the pre-deployed target algorithm simply by making a selection on the user side, which makes it convenient to select and execute a test algorithm and avoids wasting a tester's time on loading it.
A logic table is set in the test server, in which the test logic data packet is pre-stored, to associate the packet with a test requirement; by determining the target test requirement, the user side can directly extract the associated test logic data packet, which improves the generation efficiency of the test script. In the present application, the test requirements include at least a smoke test option, an algorithm response time test option, or a performance test option. The test data matched with the target information in the data server is set as the target test data and packaged with the test logic data packet associated with the test requirement to form a test script, which ensures the consistency of the test requirement, the test data, and the test script built from them and the test algorithm.
By obtaining the deployment path in the test information of the target test algorithm, the test server is controlled to make the target test algorithm run the test script and obtain a test result, and the test result is output to the user side together with a callback signal generated by calling the callback interface. Because the test script is generated automatically and run by the target test algorithm, the user side obtains a smoke test result, a response test result or a performance test result, realizing coverage testing of the algorithm's function (smoke test), response (response test) and performance (performance test). This achieves the technical effect of automatically testing the algorithm and solves the problems of low working efficiency, high randomness, a high chance of error and extended development periods caused by manually producing test scripts and testing algorithms.
It should be noted that the test script is a computer program, built from the test logic data packet and the test data, that simulates multiple users accessing the test algorithm. Since methods for generating test scripts are common general knowledge in the art, and the technical problem this application mainly solves is how to accurately obtain the test logic data packet and test data from which the script is made, the generation principle of the test script is not described in detail here. The callback interface is a computer program pre-stored by the test server that outputs a callback signal to the user side according to a calling rule. For example, the system background in the test server calls the callback interface to generate a callback signal, and the test server outputs the test result and the callback signal to the user side, notifying the user and displaying the test result so that the user obtains it in time.
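The callback flow described above can be sketched as follows. The user side is simulated by a list collecting outgoing messages, and all names are invented for illustration; the patent does not define the callback interface at this level of detail.

```python
# Hedged sketch of the callback flow: after the run finishes, the test
# server calls a callback interface to generate a callback signal, then
# outputs the signal together with the test result to the user side.

outbox = []  # stand-in for the channel to the user side

def callback_interface(test_id: str) -> dict:
    # Generate a callback signal according to the server's calling rule.
    return {"signal": "test_finished", "test_id": test_id}

def notify_user(test_id: str, result: dict) -> None:
    # Output both the callback signal and the test result to the user side.
    signal = callback_interface(test_id)
    outbox.append({"callback": signal, "result": result})

notify_user("t-42", {"status": "pass"})
```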
The automatic algorithm testing method provided by the embodiment is executed by the server computer equipment.
In a preferred embodiment, referring to fig. 3, the storing the production data obtained from the production server to the data server and converting the production data into the test data includes:
s101: acquiring production data and algorithm information of a production algorithm in a production server from the production server;
In this step, the test server controls the data server to periodically obtain the production data and the algorithm information of the production algorithm from the production server.
S102: and storing the production data and the algorithm information to a data server and correlating the production data and the algorithm information with each other to form test data.
In this step, the test server controls the data server to store the production data and the algorithm information and to associate them with each other to form test data. The algorithm information of the production algorithm in this step includes the production algorithm ID, the production algorithm name and the production version ID. The data server is provided with an algorithm manifest; the production data and the algorithm information are associated to form test data by recording in the manifest the production algorithm ID, the production algorithm name, the production version ID, and a storage path expressing where the production data is stored in the data server. For example, the algorithm information of a piece of production data is:
[Table: example algorithm manifest entry listing the production algorithm ID, name, version ID and storage path; shown as an image in the original]
The production algorithm is thus associated with its production data through the algorithm manifest. This guarantees that production algorithms and production data correspond to each other; if the test server needs to test several production algorithms, a test will not fail because an algorithm was paired with the wrong production data.
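A manifest row of the kind described above can be sketched as a small record. The field names and example values are illustrative assumptions; only the set of recorded fields (algorithm ID, name, version ID, storage path) comes from the text.

```python
# Sketch of an algorithm-manifest row in the data server: it associates
# production data with its production algorithm so that the two are
# never mismatched when tests are assembled.
import dataclasses

@dataclasses.dataclass
class ManifestRow:
    production_algorithm_id: str
    production_algorithm_name: str
    production_version_id: str
    data_storage_path: str

manifest = [
    ManifestRow("prod-001", "ocr", "v7", "/data/ocr/batch_0001"),
]

def data_for_algorithm(algorithm_id: str) -> list:
    # Resolve the storage paths of all data associated with one algorithm.
    return [r.data_storage_path for r in manifest
            if r.production_algorithm_id == algorithm_id]
```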
It should be noted that the production server has a production environment, the production environment refers to a general name of computer hardware, software, network equipment and historical data necessary for completing external service work; the production data refers to information of demand instructions generated or proposed by a user based on a production environment, and can be embodied in a data set mode, such as an application instruction data set, a picture data set and the like; production algorithms are computer programs deployed in a production environment for processing production data.
In a preferred embodiment, referring to fig. 4, the deploying of the test algorithm in the test server includes:
s111: storing a test algorithm with test information in the test server;
In this step, the test information includes a test algorithm name, a test algorithm ID, a test version ID and a version number. The storage path of the test algorithm in the test server is loaded into the test information and set as the version path; a given test algorithm ID has at least one version ID with its version number.
S112: deploying the test algorithm in the test server, and loading a deployment path of the test algorithm in the test server into the test information;
in this step, the test algorithm is extracted from the test server according to the version path in the test information and deployed in the test server; the storage path of the deployed test algorithm is then obtained, loaded into the test information and set as the deployment path, so that the deployed test algorithm can be pulled efficiently.
In this embodiment, the test algorithm name is the common name of the test algorithm; the test algorithm ID is an automatically generated unique identifier for identifying the test algorithm; the test version ID is an automatically generated unique identifier representing the test algorithm version; the version number is the version information of the test algorithm. The version path is the storage path of the test algorithm and is used for automatic delivery of the algorithm; the version number is matched with the version path and describes the version of the test algorithm stored under that path. The version path is an SVN (Subversion, an open-source version control system) path, i.e. the storage path of this version of the test algorithm in the test server, while the deployment path is the storage path of the deployed test algorithm in the test server, so that the test algorithm can be pulled from the test server through the deployment path.
For example — test algorithm ID: e8600ced4f83458abb1c406d591a5778; test algorithm name: monalisa; test version ID: 9b748a337ddd40a78ad514c9af; algorithm version number: 3367; version path: /monalisa/3367; deployment path: /test/monalisa/.
The test information is established so that a user can quickly and accurately obtain the required test information from the test server; for example, when the test algorithm needs to be used, it is accessed and run through the deployment path.
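The test-information record built in steps S111 and S112 can be sketched as follows; the key names, path layout and helper functions are assumptions for illustration:

```python
# Sketch of the test-information record: S111 stores the algorithm with its
# version path, S112 deploys it and records the deployment path.
def make_test_info(name, algo_id, version_id, version_no):
    """S111: store the test algorithm with its version path."""
    return {
        "test_algorithm_name": name,
        "test_algorithm_id": algo_id,
        "test_version_id": version_id,
        "version_number": version_no,
        "version_path": f"/{name}/{version_no}",  # SVN storage path of this version
    }

def deploy(test_info, deploy_root="/test"):
    """S112: deploy the algorithm and load the deployment path into the record."""
    test_info["deployment_path"] = f"{deploy_root}/{test_info['test_algorithm_name']}/"
    return test_info

info = deploy(make_test_info("monalisa", "e8600ced4f83458abb1c406d591a5778",
                             "9b748a337ddd40a78ad514c9af", "3367"))
print(info["version_path"], info["deployment_path"])  # /monalisa/3367 /test/monalisa/
```

With both paths in one record, delivery uses the version path and later pulls use the deployment path, matching the example values given above.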
In this embodiment, the test server provides the test environment. A test environment (testing environment) refers to the description of the software and hardware environment in which a test runs, together with any other software that interacts with the software under test, including drivers and stubs; it is the collective term for the computer hardware, software, network equipment and historical data necessary for software testing, and is generally obtained by copying the configuration of the production environment. The test environment in this embodiment is a Python virtual environment, an isolated runtime environment created for the deep learning algorithm; such a virtual environment can be created with tools such as virtualenv or pyenv. Furthermore, the test data in this embodiment is made from production data obtained from the production environment of the production server, so the business responses of the production environment are unaffected while the authenticity and validity of the test data are preserved; in this embodiment, a goreplay tool can be used to obtain production data from the production server. The test algorithm is an algorithm deployed in the test environment and tested through a test script to obtain test results, such as the reliability, efficiency and stability of the algorithm; if the test algorithm passes the test, it is deployed to the production environment for use.
In a preferred embodiment, referring to fig. 5, the storing the test logic data packet associated with the test request comprises:
S121: storing the pre-generated test logic data packet to a test server, and obtaining a storage path of the test logic data packet in the test server; the test logic data packet is a data packet describing a method for testing test data.
S122: setting a logic table in a test server, writing the storage path and the test requirement into the logic table to associate the test logic data packet with the test requirement;
the test requirement in this step refers to a test item sent by the user to the test server, which may be a smoke test, an algorithm response time test, or a performance test.
It should be noted that a test logic data packet is formed by packaging test logic written in advance by testers. The smoke test is a method for judging whether the input test data passes the test. The algorithm response time test feeds data of different sizes to the algorithm and measures the inference time; for example, if pictures are used as test data, pictures of different sizes such as 50 kb, 100 kb and 200 kb are input and the algorithm response time is recorded. The performance test simulates multiple users accessing the algorithm service, monitors the occupation of machine resources (such as CPU utilization, GPU utilization, video memory usage and memory usage), watches for abnormal events such as excessive occupation, and, while the number of concurrent users is increased, finds the optimal concurrency in terms of transactions per second (TPS).
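The algorithm response time test just described can be sketched as follows; `response_time_test` and the byte-string payloads standing in for pictures are illustrative assumptions, not the patent's actual test logic:

```python
# Sketch of the algorithm-response-time test: feed inputs of increasing size
# to the algorithm and record the inference time for each.
import time

def response_time_test(algorithm, payload_sizes_kb=(50, 100, 200)):
    """Run the algorithm on payloads of different sizes; return {size_kb: seconds}."""
    results = {}
    for size in payload_sizes_kb:
        payload = b"x" * (size * 1024)      # stand-in for a picture of `size` kb
        start = time.perf_counter()
        algorithm(payload)
        results[size] = time.perf_counter() - start
    return results

# Usage with a trivial stand-in algorithm:
timings = response_time_test(lambda data: len(data))
print(sorted(timings))  # [50, 100, 200]
```

Recording one timing per payload size yields exactly the response-time table the test requirement asks for.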
In a preferred embodiment, referring to fig. 6, the target information output by the receiving user end includes:
S201: creating a front-end node and a back-end node which stores a test algorithm and a test logic data packet in the test server;
in this step, the front-end node is configured to implement communication between the user side and the back-end node. The test server is divided into a front-end node and a back-end node: the front-end node sends dialog boxes to the user side to obtain the user's requirements, and information is extracted from the back-end node according to those requirements. Because the front-end node is responsible for sending the dialog boxes and interpreting the user's responses (clicks or inputs in the dialog boxes), the back-end node, which stores a large number of test algorithms, does not need to bear heavy data processing and transmission work; it only needs to return the corresponding information for each request sent by the front-end node, which avoids the back-end node freezing under a heavy data-processing load.
S202: outputting an algorithm dialog box with a test algorithm ID to a user side through the front end node, and setting the test algorithm ID selected and sent by the user side in the algorithm dialog box as a target algorithm ID;
the algorithm dialog box in the step comprises an algorithm ID input item, wherein the algorithm ID input item is in any one form of an input box, a radio box and a check box; the user side can select the test algorithm ID by clicking or inputting in the algorithm ID entry of the algorithm dialog box.
S203: outputting a test version ID matched with a target algorithm ID to a front-end node through the back-end node according to the target algorithm ID sent by the front-end node; outputting a version dialog box with the test version ID to the user side through the front end node, and setting the version ID selected and sent by the user side in the version dialog box as a target version ID;
the version dialog box in the step comprises a version ID input item, wherein the version ID input item is any one of an input box, a radio box and a check box; the user end can select the test version ID by clicking or inputting in the version ID entry of the version dialog.
S204: and summarizing the target algorithm ID and the target version ID to form target information.
In this step, the target information is stored in the backend node.
In a preferred embodiment, referring to fig. 7, the determining a target test algorithm in the test server according to the target information and extracting target test data from the data server includes:
S211: determining, by the front-end node, a target test algorithm in the back-end node according to the target information;
in this step, the front-end node extracts from the back-end node the test information matched with the target information, that is, the test information matching both the target algorithm ID and the target version ID; the deployment path in the test information is extracted; the test algorithm in the test server is obtained according to the deployment path and set as the target test algorithm. For example, the user needs to test version 3367 of the monalisa test algorithm, where the target algorithm ID is e8600ced4f83458abb1c406d591a5778 and the target version ID is 9b748a337ddd40a78ad514c9af; the front-end node then obtains in the back-end node the test information with test algorithm ID e8600ced4f83458abb1c406d591a5778, test version ID 9b748a337ddd40a78ad514c9af, test algorithm name monalisa and version number 3367, and sets it as the target test algorithm.
S212: extracting test data matched with the target information from a data server through the back-end node, and sending the serial number of the test data to the user side through the front-end node in a data dialog box mode;
illustratively, the target algorithm ID and the target version ID are compared in turn with the algorithm information of each piece of test data; if the production algorithm ID and the production version ID in the algorithm information are consistent with the target algorithm ID and the target version ID respectively, the test data is judged to match. In this way, the extracted test data is guaranteed to match the algorithm the user needs to test. The data dialog box includes an input item for the test data, in any one of the forms of an input box, radio box or check box; the user side can select or input test data through this input item, making the selection of test data convenient.
S213: and the front-end node forwards the serial number selected and sent by the user side in the data dialog box to the rear-end node, and extracts the test data corresponding to the serial number through the rear-end node and sets the test data as target test data.
The test data may have a unique number, and the number may be sent to the user terminal in a dialog box to facilitate the user terminal in selecting the test data.
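The matching rule of step S212 can be sketched as a simple filter; the field names are assumed for illustration:

```python
# Sketch of S212: keep only the test data whose production algorithm ID and
# production version ID both match the target information.
def match_test_data(test_data_list, target_algo_id, target_version_id):
    return [d for d in test_data_list
            if d["production_algorithm_id"] == target_algo_id
            and d["production_version_id"] == target_version_id]

test_data_list = [
    {"number": 1, "production_algorithm_id": "a1", "production_version_id": "v1"},
    {"number": 2, "production_algorithm_id": "a1", "production_version_id": "v2"},
    {"number": 3, "production_algorithm_id": "a2", "production_version_id": "v1"},
]
matched = match_test_data(test_data_list, "a1", "v1")
print([d["number"] for d in matched])  # [1]
```

The numbers of the matched entries are what the front-end node forwards to the user side in the data dialog box.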
In a preferred embodiment, referring to fig. 8, the determining a target test logic in the test server according to the selected test requirement of the user side includes:
S221: outputting a requirement dialog box with a test requirement to the user side through the front end node, and setting the test requirement selected and sent by the user side in the requirement dialog box as a target test requirement;
the requirement dialog box in this step includes a test requirement input item in any one of the forms of an input box, radio box or check box; the user side selects or inputs in the test requirement input item to choose the test requirement. Illustratively, the test requirement input items include at least: a smoke test item, an algorithm response time test item, and a performance test item.
S222: setting, by the front-end node, a test logic data packet associated with the target test requirement as target test logic in a back-end node;
in this step, a logic table is extracted from the test server, the test logic data packet matching the target test requirement is obtained from the logic table, and that test logic data packet is set as the target test logic.
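The logic-table lookup of step S222 can be sketched as a dictionary mapping requirements to packet storage paths; the table contents and names are illustrative assumptions:

```python
# Sketch of the logic table set in the test server (S122) and its lookup (S222):
# each test requirement maps to the storage path of its test logic data packet.
logic_table = {
    "smoke_test": "/logic/smoke.pkg",
    "response_time_test": "/logic/response_time.pkg",
    "performance_test": "/logic/performance.pkg",
}

def target_test_logic(target_requirement):
    """Return the test logic packet path associated with the requirement."""
    try:
        return logic_table[target_requirement]
    except KeyError:
        raise ValueError(f"no test logic registered for {target_requirement!r}")

print(target_test_logic("smoke_test"))  # /logic/smoke.pkg
```

Writing the storage path and the requirement into one table is what associates each packet with its requirement, as step S122 describes.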
In a preferred embodiment, referring to fig. 9, controlling the test algorithm to run the test script by the test server to obtain the test result includes:
S301: extracting the target test logic from the test script through the test server, and acquiring the test requirement of the target test logic;
S302: controlling the test algorithm to run the test script through the test server according to the type of the test requirement so as to obtain a test result.
Specifically, referring to fig. 10, the step S302 includes the following steps:
S302-1: judging the type of the test requirement; if the type is a smoke test, proceed to S302-2; if the type is an algorithm response time test, proceed to S302-3; if the type is a performance test, proceed to S302-4;
S302-2: if the type of the test requirement is a smoke test, controlling the test algorithm through the test server to run the test script to obtain a smoke test result;
S302-3: if the type of the test requirement is an algorithm response time test, controlling the test algorithm through the test server to run the test script to obtain a response test result;
S302-4: if the type of the test requirement is a performance test, controlling the test algorithm through the test server to run the test script and starting an acquisition program, so that the acquisition program acquires the machine-resource occupation of the test server while the test script runs and generates a performance test result.
In this embodiment, the smoke test result is information recording whether the smoke test passed or failed; the response test result is information recording the time taken to run the test script. For example, with pictures as the target test data of the test script, the target test data is a picture data set containing pictures of different sizes such as 50 kb, 100 kb and 200 kb, and the response time of the target test algorithm running the test script is recorded. The performance test result is information describing the occupation of machine resources; in this embodiment, the machine-resource occupation at least includes the CPU occupancy rate, the memory usage rate, the video memory usage rate and the GPU occupancy rate.
It should be noted that the acquisition program is a computer program pre-stored in the test server that collects the machine-resource occupation of the test server according to a set rule. In this embodiment, the sampling interval of the acquisition program is set to 30 seconds; the interval expresses how often the resources occupied by the process of the algorithm instance under test are sampled, so a 30-second interval means the machine-resource occupation of the target test algorithm's process is collected every 30 seconds.
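The acquisition program's sampling loop can be sketched as below; the `sample_resources` probe is a hypothetical stand-in for real CPU/GPU/memory measurements (e.g. via psutil or NVML), and the injectable `sleep` lets the example run without actually waiting:

```python
# Sketch of the acquisition program: take one resource snapshot per interval
# while the test script runs.
import time

def collect(sample_resources, interval_s=30, samples=3, sleep=time.sleep):
    """Collect `samples` resource snapshots, one every `interval_s` seconds."""
    readings = []
    for _ in range(samples):
        readings.append(sample_resources())  # one snapshot of the process's resources
        sleep(interval_s)
    return readings

# Usage with a dummy probe and no real sleeping:
fake_probe = lambda: {"cpu": 0.42, "mem": 0.61}
print(len(collect(fake_probe, interval_s=30, samples=3, sleep=lambda s: None)))  # 3
```

A 30-second interval, as in the embodiment, simply means `interval_s=30` between snapshots of the target test algorithm's process.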
In an exemplary embodiment, after obtaining the test requirements of the target test logic, the method may further include:
S301-1: comparing the test script with a preset service threshold in the test server;
S301-2: if the test data in the test script exceeds the service threshold, copying the test algorithm in the test server and deploying it in a preset standby server, decomposing the test script into a plurality of sub-test scripts, and distributing the sub-test scripts evenly between the test server and the standby server so that they run in parallel.
In this way, the situation is avoided in which an oversized test script overloads the test server and causes it to freeze or crash.
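The threshold check and even split of steps S301-1/S301-2 can be sketched as follows; the round-robin policy, server names and sizes are illustrative assumptions:

```python
# Sketch of S301-1/S301-2: if the test workload exceeds the service threshold,
# split it evenly between the test server and a standby server.
def distribute(test_items, service_threshold, servers=("test", "standby")):
    """Return {server: items}: everything on the test server if under threshold,
    otherwise an even round-robin split across test and standby servers."""
    if len(test_items) <= service_threshold:
        return {"test": list(test_items)}
    shares = {s: [] for s in servers}
    for i, item in enumerate(test_items):
        shares[servers[i % len(servers)]].append(item)  # round-robin assignment
    return shares

plan = distribute(list(range(10)), service_threshold=4)
print(sorted(len(v) for v in plan.values()))  # [5, 5]
```

Each server then runs its own sub-test script, so no single server bears the whole oversized workload.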
In an exemplary embodiment, after obtaining the performance test result, the method may further include:
S302-1: setting a GPU threshold, a CPU threshold, a video memory threshold and a memory threshold in the test server;
S302-2: comparing the CPU occupancy rate of the machine resources with the CPU threshold, and if the CPU occupancy rate is greater than the CPU threshold, generating CPU warning information and loading it into the test result;
comparing the memory utilization rate of the machine resource with a memory threshold, and if the memory utilization rate is greater than the memory threshold, generating memory warning information and loading the memory warning information into the test result;
comparing the video memory utilization rate of the machine resource with a video memory threshold, and if the video memory utilization rate is greater than the video memory threshold, generating video memory warning information and loading the video memory warning information into the test result;
and comparing the GPU occupancy rate of the machine resource with a GPU threshold value, and if the GPU occupancy rate is greater than the GPU threshold value, generating GPU warning information and loading the GPU warning information into the performance test result. In this way, visual data is provided for the user, so that the data acquisition efficiency and convenience of the user are improved.
In this embodiment, the CPU occupancy, the memory usage, the video memory usage, and the GPU occupancy of the machine resource occupation in the test result, and the GPU threshold, the CPU threshold, the video memory threshold, and the memory threshold may be displayed in the test result in a manner of a number, a pie chart, a line chart, or a bar chart, so that a user can visually know the data recorded in the test result.
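The four threshold comparisons above follow one pattern, which can be sketched generically; the threshold values and metric names are illustrative assumptions:

```python
# Sketch of the threshold checks: compare each measured occupancy against its
# threshold and emit a warning for every metric that exceeds it.
def check_thresholds(usage, thresholds):
    """Return warning strings for every metric exceeding its threshold."""
    warnings = []
    for metric, value in usage.items():
        limit = thresholds.get(metric)
        if limit is not None and value > limit:
            warnings.append(f"{metric} warning: {value:.0%} > {limit:.0%}")
    return warnings

usage = {"cpu": 0.92, "memory": 0.40, "gpu": 0.75, "video_memory": 0.88}
thresholds = {"cpu": 0.85, "memory": 0.80, "gpu": 0.90, "video_memory": 0.80}
print(len(check_thresholds(usage, thresholds)))  # 2 (cpu and video_memory exceed)
```

The resulting warning strings are what gets loaded into the performance test result for display to the user.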
In an exemplary embodiment, the step S30 may be followed by:
s40: and sequentially obtaining the test results of all the version IDs of any test algorithm in the test server according to the step S20 and the step S30, summarizing the test results of all the version IDs in the test algorithm to form an algorithm result set, and outputting the algorithm result set to the user side through a callback interface.
In step S40, the test results generated after the test algorithm of each version ID runs the test script are summarized, so that the user can compare the test algorithms of each version ID and judge the quality of each version of the test algorithm.
Example two
Referring to fig. 11, an automatic algorithm testing device 1 of the present embodiment includes:
the test preparation module 11 is configured to store the production data acquired from the production server to the data server and convert the production data into test data, deploy a test algorithm in the test server, and store a test logic data packet associated with a test requirement;
the requirement identification module 12 is configured to receive target information output by a user, determine a target test algorithm in a test server according to the target information, extract target test data from a data server, and determine a target test logic in the test server according to a test requirement selected by the user;
and the test execution module 13 is configured to summarize and package the target test logic and the target test data to form a test script, control a target test algorithm through a test server to run the test script to obtain a test result, call a callback interface of the test server to generate a callback signal, and output the callback signal and the test result to the user side.
In an exemplary embodiment, the automatic algorithmic test means may further comprise:
and the result summarizing module 14 is used for calling the requirement identification module 12 and the test execution module 13 to sequentially obtain the test results of all the version IDs of any test algorithm in the test server, summarizing the test results of the version IDs of the test algorithm to form an algorithm result set, and outputting the algorithm result set to the user side through the callback interface.
The technical scheme is based on a test technology in the research and development management field, namely, an automatic test tool is provided to acquire production data from a production server and operate a production algorithm of the production data, and the production data and the production algorithm are mutually associated to form test data; deploying a test algorithm in a test server for running the test data, and setting the test algorithm matched with the target information in the test server as a target test algorithm; setting a logic table in a test server pre-stored with a test logic data packet for associating the test logic data packet with a test requirement; setting the test data matched with the target information in a data server as target test data, and packaging the target test data and a test logic data packet associated with the test requirement to form a test script; and controlling the test server to enable the target test algorithm to run a test script to obtain a test result by acquiring a deployment path in the test information of the target test algorithm, and outputting the test result and a callback signal to a user side by calling the callback signal generated by the callback interface.
Example three:
in order to achieve the above object, the present invention further provides a computer system. The computer system includes a plurality of computer devices 7, and the components of the automatic algorithm testing device 1 of the second embodiment may be distributed among different computer devices 7. The computer devices 7 may be smartphones, tablet computers, notebook computers, desktop computers, rack servers, blade servers, tower servers, or cabinet servers (including an independent server or a server cluster composed of multiple servers) that execute programs, and the like. The computer device 7 of this embodiment includes at least, but is not limited to: a memory 71 and a processor 72, which may be communicatively coupled to each other via a system bus, as shown in fig. 12. It is noted that fig. 12 only shows the computer device 7 with components, but it should be understood that not all of the shown components need be implemented, and more or fewer components may be implemented instead.
In this embodiment, the memory 71 (i.e., a readable storage medium) includes a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the storage 71 may be an internal storage unit of the computer device 7, such as a hard disk or a memory of the computer device 7. In other embodiments, the memory 71 may also be an external storage device of the computer device 7, such as a plug-in hard disk, a Smart Memory Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the computer device 7. Of course, the memory 71 may also comprise both an internal storage unit of the computer device 7 and an external storage device thereof. In this embodiment, the memory 71 is generally used for storing an operating system installed in the computer device 7 and various application software, such as a program code of the automatic algorithm testing device in the first embodiment. Further, the memory 71 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 72 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 72 is typically used to control the overall operation of the computer device 7. In this embodiment, the processor 72 is configured to run the program codes stored in the memory 71 or process data, for example, run an automatic algorithm testing device, so as to implement the automatic algorithm testing method according to the first embodiment.
Example four:
to achieve the above objects, the present invention also provides a computer-readable storage medium, which includes a plurality of storage media, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, an App application store, and the like, on which a computer program is stored which, when executed by a processor 72, implements the corresponding functions. The computer-readable storage medium of this embodiment is used for storing the automatic algorithm testing device, and when executed by the processor 72, implements the automatic algorithm testing method of the first embodiment.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An automatic testing method for an algorithm is characterized by comprising the following steps:
storing production data acquired from a production server to a data server and converting the production data into test data, deploying a test algorithm in the test server, and storing a test logic data packet associated with a test requirement;
receiving target information output by a user side, determining a target test algorithm in a test server according to the target information, extracting target test data from a data server, and determining a target test logic in the test server according to a test requirement selected by the user side;
and summarizing and packaging the target test logic and the target test data to form a test script, controlling a target test algorithm to run the test script through a test server to obtain a test result, calling a callback interface of the test server to generate a callback signal, and outputting the callback signal and the test result to the user side.
2. The method for automated testing of algorithms according to claim 1, wherein said deploying testing algorithms in a testing server comprises:
storing a test algorithm with test information in the test server;
and deploying the test algorithm in the test server, and loading the deployment path of the test algorithm in the test server into the test information.
3. The algorithmic automatic test method of claim 1, wherein storing test logic data packets associated with test requirements comprises:
storing the pre-generated test logic data packet to a test server, and obtaining a storage path of the test logic data packet in the test server; the test logic data packet is a data packet which is recorded with a method for testing test data;
and setting a logic table in the test server, and writing the storage path and the test requirement into the logic table to associate the test logic data packet with the test requirement.
4. The method for automatic testing of algorithms according to claim 1, wherein the receiving of the target information output by the user end comprises:
creating a front-end node and a back-end node which stores a test algorithm and a test logic data packet in the test server;
outputting an algorithm dialog box with a test algorithm ID to a user side through the front end node, and setting the test algorithm ID selected and sent by the user side in the algorithm dialog box as a target algorithm ID;
outputting a test version ID matched with a target algorithm ID to a front-end node through the back-end node according to the target algorithm ID sent by the front-end node; outputting a version dialog box with the test version ID to the user side through the front end node, and setting the version ID selected and sent by the user side in the version dialog box as a target version ID;
and summarizing the target algorithm ID and the target version ID to form target information.
5. The method for automatic testing of algorithms according to claim 4, wherein said determining target test algorithms in a test server and extracting target test data from a data server according to said target information comprises:
determining, by the front-end node, a target test algorithm in the back-end node according to the target information;
extracting test data matched with the target information from a data server through the back-end node, and sending the serial number of the test data to the user side through the front-end node in a data dialog box mode;
and the front-end node forwards the serial number selected and sent by the user side in the data dialog box to the rear-end node, and extracts the test data corresponding to the serial number through the rear-end node and sets the test data as target test data.
6. The method according to claim 4, wherein the determining a target test logic in the test server according to the test requirement selected by the user terminal comprises:
outputting a requirement dialog box with a test requirement to the user side through the front end node, and setting the test requirement selected and sent by the user side in the requirement dialog box as a target test requirement;
setting, by the front-end node, a test logic packet associated with the target test requirement as target test logic in a back-end node.
7. The method for automatic testing of algorithms according to claim 1, wherein said controlling, by said testing server, said testing algorithm to run said testing script to obtain a testing result comprises:
extracting a target test logic in the test script through the test server, and judging and acquiring a test requirement of the target test logic;
controlling a test algorithm to run the test script through a test server according to the type of the test requirement to obtain a test result; if the type of the test requirement is a smoke test, running the test script through a test algorithm to obtain a smoke test result; if the type of the test requirement is an algorithm response time test, running the test script through a test algorithm to obtain a response test result; and if the type of the test requirement is a performance test, running a test script through a test algorithm and starting an acquisition program, so that the acquisition program acquires the machine resource occupation condition of the test server when running the test script and generates a performance test result.
8. An automatic algorithmic test device, comprising:
the test preparation module is used for storing the production data acquired from the production server to the data server and converting the production data into test data, deploying a test algorithm in the test server and storing a test logic data packet associated with a test requirement;
the requirement identification module is used for receiving target information output by a user terminal, determining a target test algorithm in a test server according to the target information, extracting target test data from a data server, and determining a target test logic in the test server according to the test requirement selected by the user terminal;
and the test execution module is used for summarizing and packaging the target test logic and the target test data to form a test script, controlling a target test algorithm through a test server to run the test script to obtain a test result, calling a callback interface of the test server to generate a callback signal, and outputting the callback signal and the test result to the user side.
9. A computer system comprising a plurality of computer devices, each computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processors of the plurality of computer devices, when executing the computer programs, collectively implement the steps of the automatic algorithm testing method of any one of claims 1 to 7.
10. A computer-readable storage medium comprising a plurality of storage media, each storage medium having a computer program stored thereon, wherein the computer programs stored in the storage media, when executed by a processor, collectively implement the steps of the automatic algorithm testing method of any one of claims 1 to 7.
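The dispatch by test-requirement type described in the claims above (smoke test, algorithm response-time test, and performance test with a resource-collection step) can be sketched roughly as follows. This is an illustrative outline only, not the patented implementation: the function name `run_test`, the requirement-type strings, and the use of the standard-library `tracemalloc` module as a stand-in for the collection program are all assumptions made for the example.

```python
import time
import tracemalloc


def run_test(test_script, requirement_type):
    """Run a packaged test script according to the test-requirement type.

    test_script: a zero-argument callable standing in for the packaged
    script of target test logic plus target test data.
    """
    if requirement_type == "smoke":
        # Smoke test: check that the script runs and reports success.
        return {"type": "smoke", "passed": bool(test_script())}
    if requirement_type == "response_time":
        # Response-time test: time a single run of the script.
        start = time.perf_counter()
        test_script()
        return {"type": "response_time",
                "seconds": time.perf_counter() - start}
    if requirement_type == "performance":
        # Performance test: start a collection program alongside the run.
        # Here tracemalloc's memory tracing stands in for collecting the
        # machine resource occupancy of the test server.
        tracemalloc.start()
        test_script()
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        return {"type": "performance", "peak_bytes": peak}
    raise ValueError(f"unknown test requirement type: {requirement_type!r}")
```

In a real deployment the collection program would sample CPU, memory, and I/O of the whole test server rather than one process, but the branching structure mirrors the claim: one script, three requirement types, three result shapes.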
CN202010102333.0A 2020-02-19 2020-02-19 Automatic algorithm testing method and device, computer system and readable storage medium Pending CN111258913A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010102333.0A CN111258913A (en) 2020-02-19 2020-02-19 Automatic algorithm testing method and device, computer system and readable storage medium


Publications (1)

Publication Number Publication Date
CN111258913A true CN111258913A (en) 2020-06-09

Family

ID=70949605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010102333.0A Pending CN111258913A (en) 2020-02-19 2020-02-19 Automatic algorithm testing method and device, computer system and readable storage medium

Country Status (1)

Country Link
CN (1) CN111258913A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111767222A (en) * 2020-06-28 2020-10-13 杭州数梦工场科技有限公司 Data model verification method and device, electronic equipment and storage medium
CN111897732A (en) * 2020-08-04 2020-11-06 北京师范大学 Embedded fatigue detection platform and fatigue detection method
CN111897732B (en) * 2020-08-04 2021-11-12 北京师范大学 Embedded fatigue detection platform and fatigue detection method
CN112255965A (en) * 2020-10-22 2021-01-22 中山市华盛家具制造有限公司 Method for acquiring NC program in equipment machining
CN113434374A (en) * 2021-06-17 2021-09-24 华东师范大学 Multithreading POSIX standard-based real-time operating system memory management algorithm performance test method and system
CN113377664A (en) * 2021-06-25 2021-09-10 上海商汤科技开发有限公司 Model testing method and device, electronic device and storage medium
CN114186697A (en) * 2021-12-10 2022-03-15 北京百度网讯科技有限公司 Method and device for generating and applying deep learning model based on deep learning framework
CN114186697B (en) * 2021-12-10 2023-03-14 北京百度网讯科技有限公司 Method and device for generating and applying deep learning model based on deep learning framework

Similar Documents

Publication Publication Date Title
CN111258913A (en) Automatic algorithm testing method and device, computer system and readable storage medium
CN108108297B (en) Method and device for automatic testing
CN110046101B (en) Page automatic testing method and device and computer storage medium
CN107733708B (en) Equipment parameter configuration method and device, computer equipment and storage medium
CN105787364B (en) Automatic testing method, device and system for tasks
US10042744B2 (en) Adopting an existing automation script to a new framework
CN104765678A (en) Method and device for testing applications on mobile terminal
CN107241315B (en) Access method and device of bank gateway interface and computer readable storage medium
CN111159049A (en) Automatic interface testing method and system
CN107045475B (en) Test method and device
CN108984179B (en) Linux compiling processing method and device
CN109901985B (en) Distributed test apparatus and method, storage medium, and electronic device
CN110119350A (en) Software Development Kit test method, device and equipment and computer storage medium
CN114064208A (en) Method and device for detecting application service state, electronic equipment and storage medium
CN107168844B (en) Performance monitoring method and device
CN112650676A (en) Software testing method, device, equipment and storage medium
CN111831542A (en) API application debugging method and device and storage medium
CN114546738A (en) Server general test method, system, terminal and storage medium
CN113377667A (en) Scene-based testing method and device, computer equipment and storage medium
CN111611086A (en) Information processing method, information processing apparatus, electronic device, and medium
CN109388420A (en) Application upgrade test method, device, computer equipment and storage medium
CN110750453A (en) HTML 5-based intelligent mobile terminal testing method, system, server and storage medium
CN115237441A (en) Upgrade test method, device and medium based on cloud platform
CN115599438A (en) Method, device, equipment and medium for constructing application program publishing package
CN110515834B (en) Interface testing method and device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination