CN112860546A - Service testing method, service testing device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN112860546A
CN112860546A (application CN202110104276.4A)
Authority
CN
China
Prior art keywords
service
test
services
task
sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110104276.4A
Other languages
Chinese (zh)
Inventor
林文珍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WeBank Co Ltd filed Critical WeBank Co Ltd
Priority to CN202110104276.4A priority Critical patent/CN112860546A/en
Publication of CN112860546A publication Critical patent/CN112860546A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application provides a service testing method, a service testing device, an electronic device, a computer-readable storage medium, and a computer program product. The method includes the following steps: receiving a service test request, where the service test request is used to request testing of at least two services; analyzing the service test request to obtain a test sample and the service type of each service; constructing a test task corresponding to each service based on the test sample; respectively determining a service platform corresponding to the service type of each service; and sending the test task corresponding to each service to the service platform corresponding to the respective service type, the test task being used by the service platform to test the corresponding service. Through the application, multiple types of services can be tested in batches with high testing efficiency.

Description

Service testing method, service testing device, electronic equipment and computer readable storage medium
Technical Field
The present disclosure relates to testing technologies, and in particular, to a service testing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
In a service system, multiple types of services are often provided for use. To ensure that the provided services can be put into use smoothly, the services generally need to be tested to eliminate vulnerabilities in the service interfaces and to help judge whether the quality of a service interface is suitable for production.
In the related art, each service is usually tested individually, and mixed batch testing of multiple types of services cannot be realized, so multiple test runs are required to cover all services and testing efficiency is low.
Disclosure of Invention
The embodiment of the application provides a service testing method, a service testing device, electronic equipment, a computer readable storage medium and a computer program product, which can test multiple services in batch and have high testing efficiency.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a service testing method, which comprises the following steps:
receiving a service test request, wherein the service test request is used for requesting to test at least two services;
analyzing the service test request to obtain a test sample and the service type of each service;
constructing a test task corresponding to each service based on the test sample;
respectively determining a service platform corresponding to the service type of each service;
sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types;
and the test task is used for the service platform to test the corresponding service.
An embodiment of the present application provides a service testing apparatus, including:
the system comprises a receiving module, a processing module and a processing module, wherein the receiving module is used for receiving a service test request which is used for requesting to test at least two services;
the analysis module is used for analyzing the service test request to obtain a test sample and the service type of each service;
the construction module is used for constructing a test task corresponding to each service based on the test sample;
the determining module is used for respectively determining the service platforms corresponding to the service types of the services;
the sending module is used for sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types;
and the test task is used for the service platform to test the corresponding service.
In the above solution, the service testing apparatus further includes: the buffer module is used for buffering the test task to a task queue; correspondingly, the determining the service platform corresponding to the service type of each service respectively includes: based on the caching sequence of each test task in the task queue, sequentially taking out the test tasks from the task queue; and determining a corresponding service platform based on the service type of the service corresponding to the taken test task.
In the above scheme, the building module is further configured to create a first thread, and build, based on the test sample, a test task corresponding to each service through the first thread; correspondingly, the sending module is further configured to create a second thread parallel to the first thread, and send the test task corresponding to each service to the service platform corresponding to the corresponding service type through the second thread.
In the above scheme, the analysis module is further configured to analyze the service test request to obtain a test sample carried by the service test request, or analyze the service test request to obtain a sample identifier of the test sample, and obtain the test sample from a target storage system based on the sample identifier.
In the above solution, the service testing apparatus further includes: the statistical module is used for receiving test results returned by each service platform, and the test results comprise test sub-results aiming at least two statistical indexes; respectively counting corresponding test sub-results of each service platform according to each statistical index to obtain corresponding statistical results; and generating a corresponding statistical report based on the statistical result.
In the above scheme, the service type includes a federal service type, and the service platform corresponding to the federal service type includes: a first participant platform and a second participant platform; the sending module is further configured to execute the following processing for each test task: when the service type corresponding to the test task is a federal service type, sending the test task to the first participant platform; and the first participant platform is used for combining with the second participant platform and testing the corresponding service based on the test task.
The embodiment of the application provides a service testing method, which comprises the following steps:
presenting a test interface and presenting a plurality of services on the test interface;
in response to a selection operation for at least two services of the plurality of services, treating the selected at least two services as services to be tested;
sending a service test request to a management server in response to the test instruction for the at least two services;
and the service test request is used for the management server to construct a test task corresponding to each service and send the test task to a service platform corresponding to each service for testing.
An embodiment of the present application provides a service testing apparatus, including:
the presentation module is used for presenting a test interface and presenting a plurality of services on the test interface;
a selection module configured to take at least two selected services as services to be tested in response to a selection operation for at least two services of the plurality of services;
the sending module is used for responding to the test instruction aiming at the at least two services and sending a service test request to the management server;
and the service test request is used for the management server to construct a test task corresponding to each service and send the test task to a service platform corresponding to each service for testing.
In the above solution, the service testing apparatus further includes: a sample selection module for presenting a sample selection function item; presenting at least one candidate test sample for selection in response to a triggering operation for the sample selection function item; responding to a sample selection operation triggered based on the candidate test samples, and taking the selected candidate test samples as test samples of the at least two services; the test sample is used for the management server to construct a test task corresponding to each service.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the service testing method provided by the embodiment of the application when the executable instructions stored in the memory are executed.
The embodiment of the present application provides a computer-readable storage medium, which stores executable instructions for causing a processor to execute the method for testing a service provided by the embodiment of the present application.
The embodiment of the present application provides a computer program product, which includes a computer program, and the computer program, when executed by a processor, implements the service testing method provided by the embodiment of the present application.
The embodiment of the application has the following beneficial effects:
compared with the prior art in which separate testing is performed on each service, the service testing requests for at least two services are received, the service testing requests are analyzed, the testing tasks corresponding to the services are constructed based on the analyzed testing samples and are respectively sent to the service platforms corresponding to the service types for testing, batch testing of a plurality of services through one service testing request is achieved, the defect of low testing efficiency in the related technology is overcome, and the service testing efficiency is improved.
Drawings
FIG. 1 is a schematic diagram of an alternative structure of a service test system provided in an embodiment of the present application;
fig. 2 is an alternative structural schematic diagram of an electronic device provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart of an alternative service testing method provided by the embodiment of the present application;
FIG. 4 is a schematic flow chart of an alternative service testing method provided by the embodiment of the present application;
FIG. 5 is a schematic flow chart of an alternative service testing method provided by the embodiment of the present application;
FIG. 6 is a schematic flow chart of an alternative service testing method provided by the embodiment of the present application;
FIG. 7 is an alternative schematic diagram of a test interface provided by embodiments of the present application;
FIG. 8 is an alternative flow chart of a service testing method provided by an embodiment of the present application;
FIG. 9 is an alternative schematic diagram of a test interface provided by embodiments of the present application;
FIG. 10 is an alternative schematic diagram of a test interface provided by embodiments of the present application;
FIG. 11 is an alternative schematic diagram of a test interface provided by embodiments of the present application;
FIG. 12 is a schematic flow chart of an alternative service testing method provided by the embodiment of the present application;
FIG. 13 is an alternative structural diagram of a service test system provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of an alternative configuration of a service test apparatus according to an embodiment of the present application;
fig. 15 is an alternative structural schematic diagram of a service testing apparatus provided in an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application will be described in further detail with reference to the attached drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first \ second \ third" are only used to distinguish similar objects and do not denote a particular order. It is understood that "first \ second \ third" may be interchanged in specific order or sequence where appropriate, so that the embodiments of the application described herein can be practiced in an order other than that shown or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) Offline backtracking: a method used to test the profitability, risk attributes, and other relevant indicators of a service. Its working principle is to feed a sample covering a certain period of time to the service under test, let the service run automatically in a simulated backtracking environment, and finally evaluate the service according to its calculation results.
2) Federated machine learning (federated learning), also known as joint learning or alliance learning. Federated machine learning is a machine learning framework that can effectively help multiple organizations use data and build machine learning models while meeting requirements of user privacy protection, data security, and government regulation. As a distributed machine learning paradigm, federated learning can effectively solve the data-silo problem: participants can jointly build models without sharing data, technically breaking data silos and achieving collaborative intelligence.
Federated learning can be divided into three categories: horizontal federated learning, vertical federated learning, and federated transfer learning.
Horizontal federated learning is also called feature-aligned federated learning, meaning that the data features of the participants are aligned; it is suitable for cases where the participants' data features overlap heavily while their sample identifiers (IDs) overlap little. Vertical federated learning is also called sample-aligned federated learning, meaning that the training samples of the participants are aligned; it is suitable for cases where the participants' training-sample IDs overlap heavily while their data features overlap little.
Based on this, embodiments of the present application provide a service testing method, apparatus, electronic device, computer-readable storage medium, and computer program product, which can test multiple services in batch and have high testing efficiency.
First, the service testing system provided in an embodiment of the present application is described. Referring to fig. 1, fig. 1 is an alternative architecture schematic diagram of a service test system 100 provided in an embodiment of the present application, in which a terminal 400, a management server 200, and service platforms 500 (500-1, 500-2, …, 500-n) are communicatively connected through a network 300. The network 300 may be a wide area network or a local area network, or a combination of both. In some embodiments, the terminal 400 may be, but is not limited to, a laptop, a tablet, a desktop computer, a smart phone, a dedicated messaging device, a portable gaming device, a smart speaker, a smart watch, and the like. The management server 200 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN, and big data and artificial intelligence platforms. The service platform 500 may be implemented as a terminal or as a server. The terminal 400, the management server 200, and the service platform 500 may be directly or indirectly connected through wired or wireless communication, which is not limited in the embodiment of the present application.
The terminal 400 is configured to present a test interface, present a plurality of services on the test interface, regard at least two selected services as services to be tested in response to a selection operation for at least two services of the plurality of services, and send a service test request to the management server 200 in response to a test instruction for the at least two services.
A management server 200, configured to receive a service test request, where the service test request is used to request to test at least two services; analyzing the service test request to obtain a test sample and the service type of each service; constructing a test task corresponding to each service based on the test sample; determining service platforms 500 corresponding to the service types of the services respectively; and sending the test tasks corresponding to the services to the service platform 500 corresponding to the corresponding service types.
The service platform 500 is configured to receive the test task, call the corresponding service interface to execute the test task, test the corresponding service, and return the test result to the management server 200.
The management server 200 is further configured to receive the test results of each service platform 500, generate a statistical report based on each test result, and send the statistical report to the terminal 400.
The terminal 400 is further configured to receive and output the statistics report.
Referring to fig. 2, fig. 2 is an optional schematic structural diagram of an electronic device 500 provided in the embodiment of the present application, in practical application, the electronic device 500 may be implemented as the terminal 400 or the management server 200 in fig. 1, and the electronic device implementing the service testing method in the embodiment of the present application is described by taking the electronic device as the management server 200 shown in fig. 1 as an example. The electronic device 500 shown in fig. 2 includes: at least one processor 510, memory 550, at least one network interface 520, and a user interface 530. The various components in the electronic device 500 are coupled together by a bus system 540. It will be appreciated that the bus system 540 is used to enable communications among the components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 540 in fig. 2.
The processor 510 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 530 includes one or more output devices 531 enabling presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 530 also includes one or more input devices 532, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 550 optionally includes one or more storage devices physically located remote from processor 510.
The memory 550 may comprise volatile memory or nonvolatile memory, and may also comprise both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a Random Access Memory (RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 can store data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a network communication module 552 for communicating to other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: bluetooth, wireless compatibility authentication (WiFi), and Universal Serial Bus (USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
an input processing module 554 to detect one or more user inputs or interactions from one of the one or more input devices 532 and to translate the detected inputs or interactions.
In some embodiments, the service test apparatus provided in the embodiments of the present application may be implemented in software, and fig. 2 shows a service test apparatus 555 stored in a memory 550, which may be software in the form of programs and plug-ins, and includes the following software modules: the receiving module 5551, the parsing module 5552, the constructing module 5553, the determining module 5554 and the sending module 5555 are logical and thus can be arbitrarily combined or further split according to the implemented functions. The functions of the respective modules will be explained below.
In other embodiments, the service testing device provided in the embodiments of the present application may be implemented in hardware. For example, the service testing device may be a processor in the form of a hardware decoding processor programmed to execute the service testing method provided in the embodiments of the present application; for instance, the processor in the form of a hardware decoding processor may be implemented by one or more Application-Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate Arrays (FPGAs), or other electronic components.
The service testing method provided by the embodiment of the present application will be described in conjunction with exemplary applications and implementations of the management server provided by the embodiment of the present application. Referring to fig. 3, fig. 3 is an alternative flow chart of a service testing method provided in the embodiment of the present application, which will be described with reference to the steps shown in fig. 3.
Step 101, a management server receives a service test request, where the service test request is used to request to test at least two services.
Here, the service test request is sent to the management server by the terminal; a user can issue the corresponding service test request through the human-computer interaction interface of the terminal. The service test request is packaged with service identifiers corresponding to at least two services. In the embodiment of the application, the service types include a federal service and a non-federal service. A non-federal service is a common API service that provides a non-federal-model service interface for use. In some embodiments, the service test request carries the service information of a plurality of services, the task type of the test task, and the selected test sample.
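The patent does not fix a concrete wire format for the service test request; the following Python sketch is only one possible way to model the fields mentioned above (service information, task type, test sample or sample identifier, and the report-generation flag used in later embodiments). All field names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical model of a service test request; field names are assumptions.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ServiceInfo:
    service_id: str      # identifier of the service to be tested
    service_type: str    # e.g. "federal" or "non_federal" (generic API service)

@dataclass
class ServiceTestRequest:
    services: List[ServiceInfo]             # at least two services
    task_type: str                          # e.g. "online_batch" or "offline_backtrack_batch"
    sample_id: Optional[str] = None         # sample identifier stored in a target storage system
    sample_payload: Optional[bytes] = None  # or the test sample carried in the request itself
    generate_report: bool = False           # whether a statistical report is requested
```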
And 102, analyzing the service test request to obtain a test sample and the service type of each service.
In the embodiment of the application, the management server analyzes the service test request to obtain the service information and the test samples of the multiple services carried in the service test request. The service information includes a service identifier and a service type of the service. Here, the test sample may be uploaded when the service test request is sent, or may be selected when the service test request is sent, and the selected sample identifier is encapsulated in the service test request.
In some embodiments, based on fig. 3, step 102 may also be implemented by: the management server analyzes the service test request to obtain a test sample carried by the service test request, or analyzes the service test request to obtain a sample identifier of the test sample, and based on the sample identifier, the test sample is obtained from a target storage system.
In actual implementation, when the service test request carries the test sample, the management server directly analyzes the service test request to obtain the test sample carried by it. When the service test request carries the sample identifier of the test sample, the management server analyzes the service test request to obtain the sample identifier, queries the storage system based on the sample identifier, and retrieves the test sample corresponding to the sample identifier from the storage system.
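A minimal sketch of the two branches just described, reusing the hypothetical ServiceTestRequest model above; the storage client is an assumed interface with a get method, not an API prescribed by the patent.

```python
# Sketch only: use the carried sample if present, otherwise fetch it from the
# target storage system by its sample identifier.
def resolve_test_sample(request: "ServiceTestRequest", storage) -> bytes:
    if request.sample_payload is not None:
        return request.sample_payload              # sample carried by the request
    if request.sample_id is not None:
        return storage.get(request.sample_id)      # look the sample up by identifier
    raise ValueError("request carries neither a test sample nor a sample identifier")
```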
And 103, constructing a test task corresponding to each service based on the test sample.
In actual implementation, the management server constructs a corresponding test task based on the test sample and generates a corresponding task ID for each service. In some embodiments, the management server may also perform construction of the corresponding test task based on the sample identification of the test sample.
Illustratively, for service 1, suppose the service type is the federal service type, the test sample requested by the service test request is sample 1, the task type is online batch test, and the selection result of whether to generate a statistical report is yes; the management server then generates a test task corresponding to service 1, and a task ID for that test task, such as job1, based on this information.
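For illustration only, a sketch of how the management server might assemble one test task per service, each with a generated task ID; the field set mirrors the information listed above, and the uuid-based identifier is an assumption rather than something specified by the patent.

```python
# Sketch: build one test task per requested service and give it a task ID.
import uuid
from dataclasses import dataclass
from typing import List

@dataclass
class TestTask:
    task_id: str
    service_id: str
    service_type: str
    task_type: str
    sample_id: str
    generate_report: bool

def build_test_tasks(request: "ServiceTestRequest", sample_id: str) -> List[TestTask]:
    tasks = []
    for svc in request.services:
        tasks.append(TestTask(
            task_id=f"job-{uuid.uuid4().hex[:8]}",   # cf. "job1" in the example above
            service_id=svc.service_id,
            service_type=svc.service_type,
            task_type=request.task_type,
            sample_id=sample_id,
            generate_report=request.generate_report,
        ))
    return tasks
```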
In some embodiments, referring to fig. 4, fig. 4 is an optional flowchart of a service testing method provided in an embodiment of the present application, and based on fig. 3, after step 103, the method further includes:
step 201, the management server buffers the test task to a task queue.
During actual implementation, the management server caches the task ID corresponding to the constructed test task to the task queue. The task queue may be provided in the management server, or may be provided in an external storage system communicatively connected to the management server.
Accordingly, step 104 can also be implemented as follows:
step 1041, the management server sequentially takes out the test tasks from the task queue based on the caching order of each test task in the task queue.
And 1042, determining a corresponding service platform based on the service type of the service corresponding to the extracted test task.
In practical implementation, the server sequentially takes the test tasks out of the task queue according to the First-In First-Out (FIFO) principle, determines, for each test task taken out, the service type of the service it targets, and determines the access platform of the corresponding service type based on that service type. It should be noted that the construction of test tasks and the sending of test tasks by the management server are two parallel processing procedures connected through the task queue, which increases processing speed and prevents test tasks from being missed.
In some embodiments, based on fig. 3, step 103 may also be implemented by: and the management server creates a first thread, and constructs a test task corresponding to each service through the first thread based on the test sample. Accordingly, step 105 may also be implemented as follows: the management server creates a second thread parallel to the first thread; and sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types through the second thread.
In actual implementation, the management server creates two parallel threads, a first thread and a second thread, for constructing test tasks and for sending test tasks, respectively. The first thread constructs the test tasks and the second thread sends them, so that test tasks are constructed and sent efficiently and in order, improving processing speed.
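Under the same assumptions as the earlier sketches, the two parallel threads and the FIFO task queue described above could be wired together roughly as follows; dispatch is a placeholder for sending a task to the service platform of its service type, and build_test_tasks is the hypothetical helper sketched earlier.

```python
# Sketch only: a first (builder) thread caches test tasks in a FIFO queue in
# construction order; a second (sender) thread takes them out in the same
# order and dispatches them.
import queue
import threading

_SENTINEL = object()   # marks the end of task construction

def run_build_and_send(request, sample_id, dispatch):
    task_queue: "queue.Queue" = queue.Queue()      # FIFO task queue

    def builder():                                 # first thread
        for task in build_test_tasks(request, sample_id):
            task_queue.put(task)                   # cache the task in the queue
        task_queue.put(_SENTINEL)

    def sender():                                  # second thread, parallel to the first
        while True:
            task = task_queue.get()                # oldest cached task first (FIFO)
            if task is _SENTINEL:
                break
            dispatch(task)                         # send to the matching service platform

    t1, t2 = threading.Thread(target=builder), threading.Thread(target=sender)
    t1.start(); t2.start()
    t1.join(); t2.join()
```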
And 104, respectively determining the service platforms corresponding to the service types of the services.
In actual implementation, for each test task the management server determines the access address of the service platform of the corresponding service type based on the service type of the test task. It should be noted that in the embodiment of the present application, test tasks are distributed only according to service type; that is, once the service type is determined, the test task is sent to the service platform of that service type. In some embodiments, test tasks are differentiated not only by service type but also by individual service. For example, the service test request is analyzed to obtain three services, namely service 1, service 2, and service 3; these services may all be deployed on the same service platform or on different service platforms, and when they are deployed on different service platforms, the management server further needs to determine the respective platform according to each service.
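As a small illustration of this routing step, a lookup from service type to a platform access address might look as follows; the addresses and type labels are placeholders. In the embodiment where services of the same type are deployed on different platforms, the lookup key would also include the service identifier.

```python
# Sketch: map a test task to the access address of the service platform for
# its service type; addresses here are hypothetical placeholders.
PLATFORM_BY_SERVICE_TYPE = {
    "federal": "https://federal-platform.example/test",
    "non_federal": "https://api-platform.example/test",
}

def resolve_platform(task: "TestTask") -> str:
    try:
        return PLATFORM_BY_SERVICE_TYPE[task.service_type]
    except KeyError:
        raise ValueError(f"no service platform registered for type {task.service_type!r}")
```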
And 105, sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types. And the test task is used for the service platform to test the corresponding service.
In actual implementation, the management server sends the test task to the corresponding service platform according to the service type of the test task, so that the service platform calls the corresponding service interface to execute the test task, and the service is tested. Here, the service test request requests a plurality of services, and the service platform executes test tasks corresponding to the plurality of services in parallel.
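On the service-platform side, the parallel execution of the test tasks for several services could look roughly like the sketch below; call_service_interface is a stand-in for invoking the actual service interface under test, not an API named by the patent.

```python
# Sketch: a service platform executes the received test tasks in parallel and
# collects one test result per task.
from concurrent.futures import ThreadPoolExecutor
from typing import List

def call_service_interface(task: "TestTask") -> dict:
    # Placeholder: invoke the service identified by task.service_id with the
    # test sample and return its raw output as the test result.
    raise NotImplementedError

def execute_tasks_in_parallel(tasks: List["TestTask"], max_workers: int = 8) -> List[dict]:
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(call_service_interface, tasks))
```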
According to the service testing method and the service testing device, the service testing requests for at least two services are received, the service testing requests are analyzed, the testing tasks corresponding to the services are constructed based on the testing samples obtained through analysis and are respectively sent to the service platforms corresponding to the service types to be tested, batch testing of a plurality of services through one service testing request is achieved, the defect of low testing efficiency in the related technology is overcome, and the service testing efficiency is improved.
In some embodiments, the service type includes a federal service type, and the service platform corresponding to the federal service type includes: a first participant platform and a second participant platform. Based on fig. 3, step 105 can also be implemented as follows: the management server respectively executes the following processing aiming at each test task: when the service type corresponding to the test task is a federal service type, sending the test task to the first participant platform; and the first participant platform is used for combining with the second participant platform and testing the corresponding service based on the test task.
In some embodiments, referring to fig. 5, fig. 5 is an optional flowchart schematic diagram of a service testing method provided in an embodiment of the present application, and based on fig. 3, the following may also be performed:
step 301, the management server receives test results returned by each service platform, where the test results include test sub-results for at least two statistical indexes.
Step 302, respectively aiming at each statistical index, performing statistics on the corresponding test sub-results of each service platform to obtain corresponding statistical results.
Step 303, generating a corresponding statistical report based on the statistical result.
In actual implementation, when the service test request carries a selection result for generating a statistical report, the management server performs statistics on the test result after receiving the test result returned by each service platform. In the application, the management server performs statistics of the test results for a plurality of statistical indexes. For each statistical index, the test result corresponding to the service platform includes a test sub-result corresponding to the statistical index. Here, the statistics of the test result with respect to the statistical index may be performed by the management server, or the management server may send a statistics request to the statistics platform, and the statistics platform performs statistics of the test result with respect to each statistical index, and then returns the statistics result to the management server.
In actual implementation, the management server generates a corresponding statistical report according to the statistical result obtained by statistics, where the statistical report may be in the form of a table, a document, or the like. The management server can store the generated statistical report locally, and can also send the statistical report to a storage system for storage. And after the management server obtains the statistical report, generating a message for completing the test and sending the message to the terminal so that the terminal outputs the message for completing the test to prompt the user that the test is completed. In addition, the terminal can also present the download function items of the statistical report for the user to download and browse, and the terminal can also directly present the statistical report for the user to browse directly.
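A rough sketch of the aggregation described in steps 301 to 303: grouping each platform's test sub-results by statistical index and emitting a simple tabular report. The input and output layouts are assumptions, not formats specified by the patent.

```python
# Sketch: group test sub-results by statistical index and render a plain
# tab-separated report.
from collections import defaultdict
from typing import Dict, List

def aggregate_by_index(results: List[dict]) -> Dict[str, list]:
    """results: [{"platform": str, "sub_results": {index_name: value}}, ...] (assumed layout)."""
    by_index: Dict[str, list] = defaultdict(list)
    for result in results:
        for index_name, value in result["sub_results"].items():
            by_index[index_name].append((result["platform"], value))
    return dict(by_index)

def build_report(by_index: Dict[str, list]) -> str:
    lines = ["index\tplatform\tvalue"]
    for index_name, entries in sorted(by_index.items()):
        for platform, value in entries:
            lines.append(f"{index_name}\t{platform}\t{value}")
    return "\n".join(lines)
```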
Continuing to introduce the service testing method provided by the embodiment of the present application, referring to fig. 6, fig. 6 is an optional flowchart schematic diagram of the service testing method provided by the embodiment of the present application, and the service testing method provided by the embodiment of the present application includes:
step 401, the terminal presents a test interface and presents a plurality of services on the test interface.
Referring to fig. 7, fig. 7 is an alternative schematic diagram of a test interface provided in the embodiments of the present application. The terminal responds to a start instruction for the test client, presents the test interface of the test client, and presents a plurality of services in the test interface. Here, each service type corresponds to a plurality of services, and the terminal presents the services corresponding to multiple service types on the test interface for the user to select. The user may select services in batches or individually, and may submit them in batches.
And 402, responding to the selection operation of at least two services in the plurality of services, and taking the selected at least two services as the service to be tested.
Step 403, in response to the test instruction for the at least two services, sending a service test request to the management server. And the service test request is used for the management server to construct a test task corresponding to each service and send the test task to a service platform corresponding to each service for testing.
In actual implementation, the test interface also presents a confirmation function item, and the terminal generates a test instruction for the at least two services in response to a trigger operation on the confirmation function item. In some embodiments, the terminal may also generate the corresponding test instruction while obtaining the selected services in response to the selection operation for the at least two services, and send a service test request to the management server in response to the test instruction. In some embodiments, the test instruction may be issued to the terminal by another device, or may be triggered based on a certain trigger condition, for example, when a test period arrives.
In some embodiments, referring to fig. 8, fig. 8 is an optional flowchart of the service testing method provided in the embodiment of the present application, and before step 403, the following may be further performed:
step 501, the terminal presents a sample selection function item.
Step 502, in response to a trigger operation for the sample selection function item, presenting at least one candidate test sample for selection.
Step 503, in response to the sample selection operation triggered based on the candidate test sample, taking the selected candidate test sample as a test sample of the at least two services. The test sample is used for the management server to construct a test task corresponding to each service.
Illustratively, referring to fig. 9, fig. 9 is an alternative schematic diagram of a test interface provided in an embodiment of the present application. In actual implementation, the terminal also presents a sample selection function item on the test interface. In response to a trigger operation for the sample selection function item, the terminal presents a selection list of test samples and presents at least two sub-selection function items in the list, each sub-selection function item corresponding to one candidate test sample. In response to a selection operation for a sub-selection function item, the terminal obtains the sample identifier of the selected candidate test sample, e.g., sample 1. Then, in response to a trigger operation for the confirmation function item, the terminal generates a corresponding test instruction, generates a corresponding service test request based on the selected at least two services and the sample identifier of the test sample, and sends the service test request to the management server. It should be noted that the test sample is stored in the storage system; after receiving the service test request, the management server parses it to obtain the sample identifier, accesses the corresponding storage system based on the sample identifier, and obtains the test sample corresponding to the sample identifier from the storage system.
In some embodiments, referring to fig. 10, fig. 10 is an alternative schematic view of a test interface provided by embodiments of the present application. In actual implementation, the terminal presents a sample selection interface in response to a trigger operation for the sample selection function item, for example, clicking the "browse" key shown in fig. 10, and presents a plurality of test samples in the sample selection interface, such as "sample 1" to "sample 6" shown in fig. 10, where the test samples are stored locally on the terminal. In response to a selection operation for a test sample, the terminal uploads the selected test sample to the test client, so that the test client generates a corresponding service test request based on the uploaded test sample and the selected at least two services.
In some embodiments, referring to fig. 11, fig. 11 is an alternative schematic view of a test interface provided by embodiments of the present application. The test interface also presents a selection function item for the task type of the test task, and the terminal obtains the selected task type, such as "online batch test" shown in fig. 11, in response to a selection operation for this function item. A selection function item for whether to generate a statistical report is also presented in the test interface, and the terminal obtains the selection result of whether to generate a statistical report, shown as "yes" in fig. 11, in response to a trigger operation for this function item. Then, in response to a trigger operation for the confirmation function item, the terminal generates a corresponding service test request based on the selected services, the task type, the test sample, and the selection result of whether to generate a statistical report.
In the embodiment of the application, the terminal presents multiple services on the test interface, responds to the selection operation aiming at least two services in the multiple services, takes the selected at least two services as the services for testing, responds to the test instruction aiming at the at least two services, and sends the service test request to the management server, so that the management server constructs the test tasks corresponding to the services and sends the test tasks to the service platforms corresponding to the services for testing, thereby realizing the batch test of the multiple services, enabling a user to automatically perform the batch test of the multiple services by only submitting the test request once, saving the manual workload and improving the test efficiency.
Continuing to describe the service testing method provided in the embodiment of the present application, the service testing method is cooperatively implemented by the terminal, the management server, and the service platform, fig. 12 is an optional flowchart of the service testing method provided in the embodiment of the present application, and referring to fig. 12, the service testing method provided in the embodiment of the present application includes:
step 601, the terminal presents a test interface and presents a plurality of services and sample selection function items on the test interface.
Step 602, the terminal responds to the selection operation for at least two services in the plurality of services, and takes the selected at least two services as the service to be tested.
And step 603, the terminal presents at least one candidate test sample for selection in response to the trigger operation aiming at the sample selection function item.
In step 604, the terminal responds to the sample selection operation triggered based on the candidate test sample, and takes the selected candidate test sample as the test sample of the at least two services.
In some embodiments, referring to fig. 10, the terminal further presents a task type selection function item and a selection function item whether to generate a statistical report on the test interface, the terminal obtains the selected task type in response to a selection operation for the task type selection function item, and the terminal obtains a selection result whether to generate the statistical report in response to a selection operation for the selection function item whether to generate the statistical report, where the selection result includes two results, i.e., "yes" and "no".
It should be noted that the order of the selection operation of the service triggered by the user, the sample selection operation of the test sample, the selection operation of the task type, and the selection operation of the selection function item for whether to generate the statistical report may be any order.
Step 605, the terminal sends a service test request to the management server in response to the test instruction for the at least two services.
In actual implementation, the terminal also presents a confirmation function item on the test interface, and the terminal responds to the confirmation operation aiming at the confirmation function item to obtain a corresponding test instruction. In some embodiments, the test instruction may also be issued to the terminal by other devices, or may also be generated by the terminal based on a certain trigger condition, such as the arrival of a test cycle. And the terminal responds to the test instruction, generates a corresponding service test request based on the selected service, the test sample, the task type and whether the selection result of the statistical report is generated or not, and sends the service test request to the management server.
Step 606, the management server analyzes the service test request to obtain a test sample and a service type of each service.
It should be noted that the service types of the service include a federal service type and a general API service type, that is, a federal service type and a non-federal service type.
Step 607, the management server constructs a test task corresponding to each service based on the test sample.
Here, the management server constructs a corresponding test task for each service. In some embodiments, if the test sample covers multiple backtracking times, the management server further constructs one test task per backtracking time for each service; that is, if the test sample covers a backtracking times, then a test tasks are correspondingly constructed for each service, where a is a positive integer greater than or equal to 1.
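A tiny sketch of that expansion, for illustration only: one test task per (service, backtracking time) pair; make_task stands in for the task construction shown earlier and is a hypothetical helper.

```python
# Sketch: a sample covering `a` backtracking times yields `a` tasks per service.
def expand_backtrack_tasks(services, backtrack_times, make_task):
    tasks = []
    for svc in services:
        for t in backtrack_times:            # len(backtrack_times) == a
            tasks.append(make_task(svc, t))
    return tasks
```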
In step 608, the management server determines the service platforms corresponding to the service types of the services.
Step 609, the management server sends the test task corresponding to each service to the service platform corresponding to the corresponding service type.
And step 610, the service platform calls a service interface to execute the test task to obtain a test result.
Step 611, the service platform sends the test result to the management server.
Step 612, the management server generates a statistical request based on the test result.
Step 613, the management server performs statistics on the test result according to the statistical index to obtain a corresponding statistical result.
In actual implementation, the management server may generate a statistical request, and send the statistical request to the statistical platform, so that the statistical platform responds to the statistical request, performs statistics on the test result for at least two statistical indexes, obtains a corresponding statistical result, and returns the statistical result to the management server. It should be noted that the management server performs statistics on the test result only when the selection result of whether to generate the statistics report carried in the service test request is yes, otherwise the management server does not perform statistics on the test result.
Step 614, the management server sends the statistical result to the terminal.
And step 615, the terminal outputs a statistical result.
In actual implementation, the management server can directly send the statistical result to the terminal, and the terminal directly presents the statistical result in the test interface for the user to browse. In some embodiments, the management server may further send a message that the statistical result has been generated to the terminal, and the terminal receives the message and presents the download function item of the statistical report, so that the user may trigger a download operation on the statistical report for the download function item, and the terminal presents the corresponding statistical report in response to the download operation on the download function item.
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
Referring to fig. 13, fig. 13 is an alternative structural schematic diagram of a service testing system provided in the embodiment of the present application. And the terminal presents the test interface and presents the services of the multiple service types in the test interface. Here, the service types include a federal service type and a non-federal service type, and the number of services corresponding to each service type is plural. The non-federated services are generic API services that provide a generic service interface that can be directly invoked. The federated services are federated model services under a federated learning scenario, and a service platform corresponding to a federated service type includes a plurality of participant platforms.
In actual implementation, the user may select the service to be tested based on the services of the plurality of service types presented in the test interface. Specifically, the terminal takes at least two selected services as the service to be tested in response to the selection operation for at least two services among the plurality of services.
It should be noted that the test interface is further presented with a task type selection function item, and the task type selection function item is used for selecting at least two test task types. Here, the test task types include online single test, online batch test, and offline backtracking batch test. In addition, a sample selection function item is also presented in the test interface, and a user can select a test sample by triggering the sample selection function item. Specifically, the terminal presents a plurality of candidate test samples for selection in response to a trigger operation for a sample selection function item, obtains a sample identifier of the selected candidate test sample in response to a sample selection operation triggered based on the candidate test sample, obtains a corresponding candidate test sample from the storage system based on the sample identifier, and takes the selected candidate test sample as a test sample of the selected at least two services. In some embodiments, the user may also perform uploading of the test sample by triggering the triggering operation of the sample selection function item. Specifically, the terminal responds to a trigger operation for selecting a function item for a sample, presents a sample uploading interface, presents a plurality of candidate test samples for selection on the sample uploading interface, responds to a sample selection operation triggered based on the candidate test samples, uploads the selected candidate test samples, and obtains the selected candidate test samples after successful uploading.
In actual implementation, after the terminal obtains the at least two selected services, task types and test samples, the terminal responds to the test instructions for the at least two services, packages the at least two services, task types and test samples, generates corresponding service test requests and sends the service test requests to the management server. In some embodiments, the terminal further presents a selection function item whether to generate the statistical report, obtains a selection result whether to generate the statistical report in response to a trigger operation for the selection function item, and packages the selection result into the service test request to be sent to the management server.
After receiving the service test request, the management server analyzes the service test request to obtain service identifiers of a plurality of services, task types of test tasks, selected test samples and a selection result of whether to generate a statistical report. The management server then performs the assembly of test subtasks for each service. Specifically, the management server generates a corresponding test subtask for a single service based on the service type of the service, the task type of the test task, the sample identifier of the selected test sample, and a selection result of whether to generate a statistical report.
In actual implementation, the management server adds a task ID to each test subtask, and adds the task ID to the task queue according to the order of assembling the test subtasks. And the management server takes out the task ID from the task queue according to the FIFO principle and sends the test subtask corresponding to the task ID to the corresponding service platform. Here, the management server may perform the assembly of the test subtasks and the distribution of the test subtasks in parallel. It should be noted that, if the task type is an online single test, the management server directly sends the service test request to the corresponding service platform, and if the task type is an online batch test or an offline backtracking batch test, the management server needs to analyze the service test request and assemble the corresponding test subtasks.
In actual implementation, after taking out the task ID from the task queue, the management server sends the corresponding test subtask to the service platform of the corresponding service type. And after receiving the test subtask, the service platform calls a corresponding service interface to execute the test subtask so as to test the service. When receiving the plurality of test subtasks, the service platform performs the execution of the plurality of test subtasks in parallel in a batch manner. Specifically, the service platform calls corresponding service interfaces respectively to execute corresponding test subtasks, so as to test a plurality of services in batches.
It should be noted that a service platform of the federal service type includes a first participant platform and a second participant platform. The first participant platform receives the test task of the corresponding service type sent by the management server, calls the corresponding federal service interface to execute the test task and obtain its own unilateral calculation result, and forwards a calculation request to the second participant platform; the second participant platform performs its calculation to obtain its unilateral calculation result and sends it back to the first participant platform; the first participant platform then obtains the corresponding test result based on the local unilateral calculation result and the unilateral calculation result sent by the second participant platform. It should be understood that the first participant platform and the second participant platform are service platforms based on federal learning, and the interactions between them are encrypted. In some embodiments, if the task type requested by the service test request is online batch test or offline backtracking batch test, the first participant platform and the second participant platform each execute their unilateral calculations in batches, and the first participant platform aggregates the batch of unilateral calculation results to obtain the final test results.
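A much-simplified, unencrypted sketch of that two-party flow is given below; every function and class name here is an assumption, and a real federated-learning platform would exchange the unilateral results under encryption rather than as plain values.

```python
class SecondParticipant:
    # Stand-in for the second participant platform; in practice this is a
    # remote, encrypted interaction rather than a local object.
    def compute(self, test_task):
        return {"score_part": 0.0}          # second party's unilateral result

def unilateral_compute(test_task):
    return {"score_part": 0.0}              # first party's unilateral result

def merge_unilateral_results(local_part, remote_part):
    # Placeholder merge: a real federated service combines the two shares
    # under its own protocol.
    return {"score": local_part["score_part"] + remote_part["score_part"]}

def first_participant_run(test_task, second_participant):
    local_part = unilateral_compute(test_task)
    remote_part = second_participant.compute(test_task)   # calculation request forwarded to the second party
    return merge_unilateral_results(local_part, remote_part)

def first_participant_run_batch(test_tasks, second_participant):
    # For online batch or offline backtracking batch tasks, both sides run
    # their unilateral calculations per task and the first participant
    # aggregates the batch into the final test results.
    return [first_participant_run(t, second_participant) for t in test_tasks]
```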
In actual implementation, each service platform returns its test result to the management server, and the management server sends the test results to the statistical platform, so that the statistical platform performs statistics on the test results according to a plurality of statistical indexes and obtains a statistical result for each statistical index. The statistical indexes include the search rate, the zero value rate, the Information Value (IV), the Kolmogorov-Smirnov (KS) statistic, the ratio of good samples to bad samples, and the like. After obtaining a statistical result for each statistical index, the statistical platform returns the statistical results to the management server; the management server generates a corresponding statistical report based on the test results and the statistical results, stores the statistical report to the storage system, generates a test completion message, and sends the message to the terminal, which outputs it to prompt the user that the service test is completed. In addition, the terminal can also present a download function item for the test result or the statistical report so that the user can download and view it. Here, the download function items presented by the terminal further include a batch download function item, allowing the user to download in batches.
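For concreteness, two of the statistical indexes mentioned above can be computed as in the sketch below from per-sample scores, good/bad labels, and score buckets; the function names, the binning, and the handling of empty buckets are assumptions of this example rather than the statistical platform's actual procedure.

```python
import math

def ks_statistic(scores, labels):
    """KS statistic: maximum gap between the cumulative distributions of good
    (label 1) and bad (label 0) samples along the score axis."""
    pairs = sorted(zip(scores, labels))
    total_good = sum(labels)
    total_bad = len(labels) - total_good
    good_seen = bad_seen = 0
    ks = 0.0
    for _, label in pairs:
        if label == 1:
            good_seen += 1
        else:
            bad_seen += 1
        ks = max(ks, abs(good_seen / total_good - bad_seen / total_bad))
    return ks

def information_value(bins):
    """Information Value (IV) from (good_count, bad_count) pairs, one per score bucket."""
    total_good = sum(g for g, _ in bins)
    total_bad = sum(b for _, b in bins)
    iv = 0.0
    for good, bad in bins:
        if good == 0 or bad == 0:
            continue                        # skip one-sided buckets in this simplified version
        pg, pb = good / total_good, bad / total_bad
        iv += (pg - pb) * math.log(pg / pb)
    return iv
```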
Continuing with the exemplary structure in which the service testing apparatus 555 provided in the embodiment of the present application is implemented as software modules, in some embodiments, referring to fig. 14, which is an alternative structural schematic diagram of the service testing apparatus provided in the embodiment of the present application, the software modules of the service testing apparatus 555 stored in the memory 550 may include:
a receiving module 5551, configured to receive a service test request, where the service test request is used to request to test at least two services;
an analysis module 5552, configured to analyze the service test request to obtain a test sample and a service type of each service;
a construction module 5553, configured to construct a test task corresponding to each of the services based on the test sample;
a determining module 5554, configured to determine service platforms corresponding to service types of the services, respectively;
a sending module 5555, configured to send the test task corresponding to each service to the service platform corresponding to the corresponding service type;
and the test task is used for the service platform to test the corresponding service.
In some embodiments, the service testing apparatus further comprises: a caching module, configured to cache the test tasks to a task queue; correspondingly, the determining of the service platform corresponding to the service type of each service includes: taking the test tasks out of the task queue in sequence based on the caching order of the test tasks in the task queue, and determining the corresponding service platform based on the service type of the service corresponding to the taken-out test task.
In some embodiments, the building module 5553 is further configured to create a first thread, and build a test task corresponding to each service through the first thread based on the test sample; correspondingly, the sending module 5555 is further configured to create a second thread parallel to the first thread, and send the test task corresponding to each service to the service platform corresponding to the corresponding service type through the second thread.
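A minimal sketch of this two-thread arrangement is given below: the first thread constructs the test tasks, the second thread, running in parallel, sends them, and a queue decouples the two. The helper names are illustrative assumptions.

```python
import queue
import threading

def run_build_and_send(services, build_task, send_task):
    q = queue.Queue()
    done = object()                        # sentinel marking the end of assembly

    def builder():                         # first thread: construct tasks
        for service in services:
            q.put(build_task(service))
        q.put(done)

    def sender():                          # second thread: send tasks as they arrive
        while (task := q.get()) is not done:
            send_task(task)

    t1 = threading.Thread(target=builder)
    t2 = threading.Thread(target=sender)
    t1.start(); t2.start()
    t1.join(); t2.join()
```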
In some embodiments, the analysis module 5552 is further configured to parse the service test request to obtain a test sample carried by the service test request, or to parse the service test request to obtain a sample identifier of the test sample and acquire the test sample from a target storage system based on the sample identifier.
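In code, the two ways of obtaining the test sample might be distinguished as in the small sketch below; the request field names and the fetch_from_storage callable are assumptions.

```python
def resolve_test_sample(request: dict, fetch_from_storage):
    # Case 1: the test sample is carried inside the service test request.
    if "sample" in request:
        return request["sample"]
    # Case 2: only a sample identifier is carried; fetch the sample from the
    # target storage system by that identifier.
    return fetch_from_storage(request["sample_id"])
```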
In some embodiments, the service testing apparatus further comprises: the statistical module is used for receiving test results returned by each service platform, and the test results comprise test sub-results aiming at least two statistical indexes; respectively counting corresponding test sub-results of each service platform according to each statistical index to obtain corresponding statistical results; and generating a corresponding statistical report based on the statistical result.
In some embodiments, the service type includes a federal service type, and the service platform corresponding to the federal service type includes: a first participant platform and a second participant platform; the sending module 5555 is further configured to perform the following processing for each test task: when the service type corresponding to the test task is a federal service type, sending the test task to the first participant platform; and the first participant platform is used for combining with the second participant platform and testing the corresponding service based on the test task.
Continuing with the exemplary structure in which the service testing apparatus provided in the embodiment of the present application is implemented as software modules, referring to fig. 15, which is an alternative structural schematic diagram of the service testing apparatus provided in the embodiment of the present application, the service testing apparatus 15 includes:
a presentation module 151, configured to present a test interface and present a plurality of services on the test interface;
a selection module 152 configured to, in response to a selection operation for at least two services of the plurality of services, take the selected at least two services as services to be tested;
a sending module 153, configured to send a service test request to the management server in response to the test instruction for the at least two services;
and the service test request is used for the management server to construct a test task corresponding to each service and send the test task to a service platform corresponding to each service for testing.
In some embodiments, the service testing apparatus further comprises: a sample selection module for presenting a sample selection function item; presenting at least one candidate test sample for selection in response to a triggering operation for the sample selection function item; responding to a sample selection operation triggered based on the candidate test samples, and taking the selected candidate test samples as test samples of the at least two services; the test sample is used for the management server to construct a test task corresponding to each service.
It should be noted that the description of the apparatus in the embodiment of the present application is similar to the description of the method embodiment, and has similar beneficial effects to the method embodiment, and therefore, the description is not repeated.
An embodiment of the present application provides a computer program product including a computer program which, when executed by a processor, implements the service testing method provided by the embodiments of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions, which when executed by a processor, will cause the processor to perform a method provided by embodiments of the present application, for example, a service testing method as shown in fig. 3.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In conclusion, through the embodiments of the present application, multiple types of services can be tested in batches, and the test efficiency is high.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (13)

1. A method for service testing, comprising:
receiving a service test request, wherein the service test request is used for requesting to test at least two services;
analyzing the service test request to obtain a test sample and the service type of each service;
constructing a test task corresponding to each service based on the test sample;
respectively determining a service platform corresponding to the service type of each service;
sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types;
and the test task is used for the service platform to test the corresponding service.
2. The method of claim 1, wherein after the building of the test tasks for each of the services, the method further comprises:
caching the test task to a task queue;
correspondingly, the determining the service platform corresponding to the service type of each service respectively includes:
based on the caching sequence of each test task in the task queue, sequentially taking out the test tasks from the task queue;
and determining a corresponding service platform based on the service type of the service corresponding to the taken test task.
3. The method of claim 1, wherein constructing a test task for each of the services based on the test sample comprises:
creating a first thread;
constructing a test task corresponding to each service through the first thread based on the test sample;
correspondingly, the sending the test task corresponding to each service to the service platform corresponding to the corresponding service type includes:
creating a second thread in parallel with the first thread;
and sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types through the second thread.
4. The method of claim 1, wherein parsing the service test request to obtain a test sample comprises:
analyzing the service test request to obtain a test sample carried by the service test request; or
analyzing the service test request to obtain a sample identifier of the test sample, and acquiring the test sample from a target storage system based on the sample identifier.
5. The method of claim 1, further comprising:
receiving test results returned by each service platform, wherein the test results comprise test sub-results aiming at least two statistical indexes;
respectively counting corresponding test sub-results of each service platform according to each statistical index to obtain corresponding statistical results;
and generating a corresponding statistical report based on the statistical result.
6. The method of claim 1, wherein the service type comprises a federated service type, and wherein the service platform corresponding to the federated service type comprises: a first participant platform and a second participant platform;
the sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types comprises:
executing the following processing respectively aiming at each test task:
when the service type corresponding to the test task is a federal service type, sending the test task to the first participant platform;
and the first participant platform is used for combining with the second participant platform and testing the corresponding service based on the test task.
7. A method for service testing, comprising:
presenting a test interface and presenting a plurality of services on the test interface;
in response to a selection operation for at least two services of the plurality of services, treating the selected at least two services as services to be tested;
sending a service test request to a management server in response to the test instruction for the at least two services;
and the service test request is used for the management server to construct a test task corresponding to each service and send the test task to a service platform corresponding to each service for testing.
8. The method of claim 7, wherein before sending the service test request to the management server, the method further comprises:
presenting a sample selection function item;
presenting at least one candidate test sample for selection in response to a triggering operation for the sample selection function item;
responding to a sample selection operation triggered based on the candidate test samples, and taking the selected candidate test samples as test samples of the at least two services;
the test sample is used for the management server to construct a test task corresponding to each service.
9. A service testing device, comprising:
the system comprises a receiving module, a processing module and a processing module, wherein the receiving module is used for receiving a service test request which is used for requesting to test at least two services;
the analysis module is used for analyzing the service test request to obtain a test sample and the service type of each service;
the construction module is used for constructing a test task corresponding to each service based on the test sample;
the determining module is used for respectively determining the service platforms corresponding to the service types of the services;
the sending module is used for sending the test tasks corresponding to the services to the service platforms corresponding to the corresponding service types;
and the test task is used for the service platform to test the corresponding service.
10. A service testing device, comprising:
the presentation module is used for presenting a test interface and presenting a plurality of services on the test interface;
a selection module configured to take at least two selected services as services to be tested in response to a selection operation for at least two services of the plurality of services;
the sending module is used for responding to the test instruction aiming at the at least two services and sending a service test request to the management server;
and the service test request is used for the management server to construct a test task corresponding to each service and send the test task to a service platform corresponding to each service for testing.
11. An electronic device, comprising:
a memory for storing executable instructions;
a processor for implementing the service testing method of any one of claims 1 to 8 when executing executable instructions stored in the memory.
12. A computer-readable storage medium having stored thereon executable instructions for, when executed by a processor, implementing the service testing method of any one of claims 1 to 8.
13. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, carries out the service testing method of any one of claims 1 to 8.
CN202110104276.4A 2021-01-26 2021-01-26 Service testing method, service testing device, electronic equipment and computer readable storage medium Pending CN112860546A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110104276.4A CN112860546A (en) 2021-01-26 2021-01-26 Service testing method, service testing device, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN112860546A (en) 2021-05-28

Family

ID=76009319



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination