CN116643980A - Automatic testing method, system and storage medium based on priority execution - Google Patents


Info

Publication number
CN116643980A
CN116643980A (application CN202310583694.5A)
Authority
CN
China
Prior art keywords
test
case
script
tested
priority
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310583694.5A
Other languages
Chinese (zh)
Inventor
尹莎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inspur Cisco Networking Technology Co Ltd
Original Assignee
Inspur Cisco Networking Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inspur Cisco Networking Technology Co Ltd filed Critical Inspur Cisco Networking Technology Co Ltd
Priority to CN202310583694.5A priority Critical patent/CN116643980A/en
Publication of CN116643980A publication Critical patent/CN116643980A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses an automated testing method, system, and storage medium based on priority execution, belonging to the technical field of automated testing. The method comprises the following steps: when an automated test is required, matching the case priority in the case database of the test website based on a test instruction in the automation script, so as to determine whether a corresponding test case exists; if such a test case exists, looking up the corresponding test script in the test script database based on the index number of that test case; and executing the found test script to obtain a test result, generating a test report based on the test result, and sending the test report to a monitoring platform. The method solves the technical problem that, when the priority of a case on the test website is modified, manual negligence easily leaves the priority of the corresponding test script inconsistent with that of the test website.

Description

Automatic testing method, system and storage medium based on priority execution
Technical Field
The present application relates to the field of automated testing technologies, and in particular, to an automated testing method, system, and storage medium based on priority execution.
Background
In automated test execution, different test phases often treat test cases of different priorities differently, so selecting and executing the test scripts of the corresponding priority is a routine need in automated testing. The common approach is to tag each automated test script with a label for its priority: for example, if a case has priority 1 on the test website, its script is tagged with a priority-1 label. When scripts of a given priority are to be executed, they are filtered by these labels.
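The label-filtering approach described above can be sketched as follows (a minimal illustration; the script names and the `priority_label` field are assumptions, since the patent names no particular framework):

```python
# Hypothetical tag-based selection: every script carries a hard-coded
# priority label that must be kept in sync with the test website by hand.
SCRIPTS = [
    {"name": "test_login", "priority_label": 1},
    {"name": "test_export_report", "priority_label": 2},
]

def select_by_label(scripts, priority):
    """Filter scripts whose hand-maintained label equals the requested priority."""
    return [s["name"] for s in scripts if s["priority_label"] == priority]
```

If the website later demotes `test_login` to priority 2 but the label in the script is not updated, `select_by_label(SCRIPTS, 1)` still returns it, which is exactly the inconsistency the application targets.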
However, once the priority of a case on the test website changes, the corresponding script label must be updated in time; through manual error the label may be missed, so that the wrong cases are executed. In addition, the existing priority-based execution method must maintain two sets of priorities, those of the test website and those of the corresponding test scripts, which increases maintenance cost. Whenever a priority changes, it must be changed in both places; if the case priority on the test website is modified but the script priority is forgotten, the two sides become inconsistent.
Disclosure of Invention
The embodiments of the present application provide an automated testing method, system, and storage medium based on priority execution, which solve the technical problem that, when the priority of a case on the test website is modified, manual negligence easily leaves the priority of the corresponding test script inconsistent with that of the test website.
In a first aspect, an embodiment of the present application provides an automated testing method based on priority execution, the method comprising: when an automated test is required, matching the case priority in the case database of the test website based on a test instruction in the automation script, so as to determine whether a corresponding test case exists; if such a test case exists, looking up the corresponding test script in the test script database based on the index number of that test case; and executing the found test script to obtain a test result, generating a test report based on the test result, and sending the test report to a monitoring platform.
In one implementation of the present application, before the case priority is matched in the case database of the test website based on the test instruction in the automation script, the method further includes: determining the case priority of each preset test case, and labeling each preset test case according to its priority; and storing each labeled preset test case in the case database of the test website.
In one implementation of the present application, before the case priority is matched in the case database of the test website based on the test instruction in the automation script, the method further includes: constructing a test case index according to the case priority of each preset test case, and determining the case index number of each preset test case; and constructing, according to the test case index and the case index numbers, a test script index and script index numbers for the preset test scripts in the test script database, where each script index number is the same as the index number of its test case.
In one implementation of the present application, the method further comprises: if no corresponding test case exists, ending the automation script, generating first alarm information, and sending it to the monitoring platform; and if no corresponding test script is found in the test script database based on the index number of the test case, ending the automation script, generating second alarm information, and sending it to the monitoring platform.
In one implementation of the present application, generating a test report based on the test result specifically includes: determining the test result items corresponding to each report item of the test report; and inputting the result data of those test result items into the corresponding report item model to obtain the corresponding report item result.
In a second aspect, an embodiment of the present application further provides an automated testing system based on priority execution, the system comprising a matching module, a searching module, and an execution module. The matching module is configured to, when an automated test is required, match the case priority in the case database of the test website based on the test instruction in the automation script, so as to determine whether a corresponding test case exists. The searching module is configured to, when the corresponding test case exists, look up the corresponding test script in the test script database based on the index number of that test case. The execution module is configured to execute the found test script to obtain a test result, generate a test report based on the test result, and send the test report to the monitoring platform.
In one implementation of the present application, the system further comprises a configuration module. The configuration module is configured to determine the case priority of each preset test case, label each preset test case according to its priority, and store each labeled preset test case in the case database of the test website. The configuration module is further configured to construct a test case index according to the case priority of each preset test case, determine the case index number of each preset test case, and construct, according to the test case index and the case index numbers, a test script index and script index numbers for the preset test scripts in the test script database, where each script index number is the same as the index number of its test case.
In one implementation of the present application, the system further comprises an alarm module. The alarm module is configured to, when it is determined that no corresponding test case exists, end the automation script, generate first alarm information, and send it to the monitoring platform; and, if no corresponding test script is found in the test script database based on the index number of the test case, end the automation script, generate second alarm information, and send it to the monitoring platform.
In one implementation of the present application, generating a test report based on the test result specifically includes: determining the test result items corresponding to each report item of the test report; and inputting the result data of those test result items into the corresponding report item model to obtain the corresponding report item result.
In a third aspect, embodiments of the present application further provide a non-volatile computer storage medium storing computer-executable instructions for priority-based automated testing, the computer-executable instructions being configured to: when an automated test is required, match the case priority in the case database of the test website based on a test instruction in the automation script, so as to determine whether a corresponding test case exists; if such a test case exists, look up the corresponding test script in the test script database based on the index number of that test case; and execute the found test script to obtain a test result, generate a test report based on the test result, and send the test report to the monitoring platform.
With the automated testing method, system, and storage medium based on priority execution described above, the test script of the corresponding priority can be executed, and correct execution is fully guaranteed, avoiding the situation in which manual negligence leaves the priority of a test case mismatched with that of the actual script. If a priority is to be modified, it is modified on the test website and stored in the database; the priority does not need to be changed again on the test script. The priority of the executed test script therefore cannot become inconsistent with the priority of its test case, which greatly reduces maintenance cost.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of an automated testing method based on priority execution provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of an internal structure of an automated test system based on priority execution according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The embodiment of the application provides an automatic test method, an automatic test system and a storage medium based on priority execution, which solve the technical problem that when the priority of the existing test website is modified, the priority of the corresponding test script is easy to be inconsistent with the priority of the test website due to manual negligence.
The following describes the technical scheme provided by the embodiment of the application in detail through the attached drawings.
Fig. 1 is a flowchart of an automated testing method based on priority execution according to an embodiment of the present application. As shown in fig. 1, the method for automatically testing based on priority execution provided by the embodiment of the application specifically includes the following steps:
step 101, under the condition that automatic test is required, based on a test instruction in an automatic script to be tested, matching the use case priority in a use case database of a test website to determine whether a corresponding to-be-tested use case exists.
In one embodiment of the present application, in order to solve the technical problem that manual negligence easily leaves the priority of the corresponding test script inconsistent with that of the test website when the case priority on the website is modified, a test case index is first constructed according to the case priority of each preset test case, and the case index number of each preset test case is determined.
Further, according to the test case index and the case index numbers, a test script index and script index numbers are constructed for the preset test scripts in the test script database, where each script index number is the same as the index number of its test case.
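A minimal sketch of this index construction, assuming a simple in-memory data model (the dictionaries below stand in for the case database and test script database of the application; field names are illustrative):

```python
def build_case_index(preset_cases):
    """preset_cases: list of {"case": name, "priority": int}.
    Assign each preset test case an index number and group numbers by priority."""
    index_by_priority = {}   # priority -> [case index numbers]
    case_index_no = {}       # case name -> case index number
    for n, c in enumerate(sorted(preset_cases, key=lambda c: c["priority"]), start=1):
        index_by_priority.setdefault(c["priority"], []).append(n)
        case_index_no[c["case"]] = n
    return index_by_priority, case_index_no

def build_script_index(case_index_no, scripts_by_case):
    """Give every preset test script the SAME index number as its test case,
    so the script database carries no priority information of its own."""
    return {case_index_no[case]: path for case, path in scripts_by_case.items()}
```

Because the script index reuses the case index numbers, changing a priority touches only the case database; the script side is found through the shared number and never needs re-labeling.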
Further, to implement this scheme, the case priority of each preset test case is determined and each preset test case is labeled according to its priority; the labeled preset test cases are then stored in the case database of the test website.
In one embodiment of the present application, after the index structure has been built and the labeled preset test cases have been stored in the case database of the test website, when an automated test is required, the case priority is matched in the case database of the test website based on the test instruction in the automation script, so as to determine whether a corresponding test case exists.
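The matching step can be sketched as a single query against the case database (a dictionary of index number to case record with a `priority` field is an assumed layout, not one specified by the application):

```python
def match_cases(case_db, requested_priority):
    """Return the index numbers of all cases whose priority matches the test
    instruction; an empty result means no corresponding test case exists."""
    return sorted(idx for idx, rec in case_db.items()
                  if rec["priority"] == requested_priority)
```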
Step 102, if the corresponding test case exists, look up the corresponding test script in the test script database based on the index number of that test case.
In one embodiment of the present application, if no corresponding test case exists, the automation script is ended, and first alarm information is generated and sent to the monitoring platform.
Further, if the corresponding test case exists, the corresponding test script is looked up in the test script database based on the index number of that test case.
In one embodiment of the present application, if no corresponding test script is found in the test script database based on the index number of the test case, the automation script is ended, and second alarm information is generated and sent to the monitoring platform.
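Steps 101 and 102, including both alarm branches, might be combined as follows (the `alert` callback standing in for the monitoring platform, and the dictionary databases, are assumptions for illustration):

```python
def resolve_scripts(case_db, script_db, priority, alert):
    """Match cases by priority, then look up each script by the shared index
    number; end with an alarm if either lookup comes back empty."""
    matched = [i for i, rec in case_db.items() if rec["priority"] == priority]
    if not matched:
        alert("first alarm: no test case with priority %d" % priority)
        return None  # end the automation script
    scripts = []
    for idx in matched:
        path = script_db.get(idx)
        if path is None:
            alert("second alarm: no script for case index %d" % idx)
            return None  # end the automation script
        scripts.append(path)
    return scripts
```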
Step 103, execute the found test script to obtain a test result, generate a test report based on the test result, and send the test report to the monitoring platform.
In one embodiment of the present application, after the corresponding test script has been found in the test script database, it is executed to obtain a test result.
Further, a test report is generated based on the test results.
Specifically, the test result items corresponding to each report item of the test report are determined, and the result data of those test result items is input into the corresponding report item model to obtain the corresponding report item result.
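The report-generation step can be sketched as mapping each report item to the test-result items it consumes plus a small model function (the pass-rate and summary items below are illustrative examples, not items named by the application):

```python
def generate_report(test_result, report_items):
    """report_items: report item name -> (list of result item names, model function).
    Feed the matching result data into each report item model."""
    report = {}
    for item, (fields, model) in report_items.items():
        report[item] = model(*[test_result[f] for f in fields])
    return report
```

For example, a pass-rate report item would consume the `passed` and `total` result items and divide one by the other; the monitoring platform then receives the assembled report.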
Further, the test report is sent to the monitoring platform.
The above is a method embodiment of the present application. Based on the same inventive concept, the embodiment of the application also provides an automatic test system based on priority execution, and the structure of the automatic test system is shown in fig. 2.
Fig. 2 is a schematic diagram of an internal structure of an automated test system based on priority execution according to an embodiment of the present application. As shown in fig. 2, the system 200 includes: a matching module 201, a searching module 202, an executing module 203, a configuring module 204 and an alarming module 205.
In one embodiment of the present application, the matching module 201 is configured to, when an automated test is required, match the case priority in the case database of the test website based on the test instruction in the automation script, so as to determine whether a corresponding test case exists; the searching module 202 is configured to, when it is determined that the corresponding test case exists, look up the corresponding test script in the test script database based on the index number of that test case; and the execution module 203 is configured to execute the found test script to obtain a test result, generate a test report based on the test result, and send the test report to the monitoring platform.
In one embodiment of the present application, the system further comprises a configuration module 204. The configuration module 204 is configured to determine the case priority of each preset test case, label each preset test case according to its priority, and store each labeled preset test case in the case database of the test website. The configuration module 204 is further configured to construct a test case index according to the case priority of each preset test case, determine the case index number of each preset test case, and construct, according to the test case index and the case index numbers, a test script index and script index numbers for the preset test scripts in the test script database, where each script index number is the same as the index number of its test case.
In one embodiment of the present application, the system further comprises an alarm module 205. The alarm module 205 is configured to, when it is determined that no corresponding test case exists, end the automation script, generate first alarm information, and send it to the monitoring platform; and, if no corresponding test script is found in the test script database based on the index number of the test case, end the automation script, generate second alarm information, and send it to the monitoring platform.
In one embodiment of the present application, generating a test report based on the test result specifically includes: determining the test result items corresponding to each report item of the test report; and inputting the result data of those test result items into the corresponding report item model to obtain the corresponding report item result.
Corresponding to the method of fig. 1, some embodiments of the present application provide a non-volatile computer storage medium storing computer-executable instructions, the computer-executable instructions being configured to:
when an automated test is required, match the case priority in the case database of the test website based on a test instruction in the automation script, so as to determine whether a corresponding test case exists;
if such a test case exists, look up the corresponding test script in the test script database based on the index number of that test case; and
execute the found test script to obtain a test result, generate a test report based on the test result, and send the test report to the monitoring platform.
The embodiments of the present application are described in a progressive manner; identical and similar parts of the embodiments may be referred to one another, and each embodiment mainly describes its differences from the others. In particular, since the system and medium embodiments are substantially similar to the method embodiment, their description is relatively brief; for relevant points, refer to the description of the method embodiment.
The system, the medium and the method provided by the embodiment of the application are in one-to-one correspondence, so that the system and the medium also have similar beneficial technical effects to the corresponding method, and the beneficial technical effects of the method are explained in detail above, so that the beneficial technical effects of the system and the medium are not repeated here.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.

Claims (10)

1. An automated testing method based on priority execution, the method comprising:
under the condition that automatic test is required, based on a test instruction in an automatic script to be tested, matching the use case priority in a use case database of a test website to determine whether a corresponding to-be-tested use case exists;
under the condition that the corresponding test case exists, searching the corresponding test script in a test script database based on the case index number of the test case;
executing the script to be tested to obtain a test result, generating a test report based on the test result, and sending the test report to a monitoring platform.
2. The automated priority-based test method of claim 1, wherein prior to matching priorities based on test instructions in the automated script to be tested, the method further comprises:
determining the case priority of each preset test case, and marking each preset test case according to the case priority;
and storing the marked preset test cases in a case database of the test website.
3. The automated priority-based test method of claim 2, wherein prior to matching priorities based on test instructions in the automated script to be tested, the method further comprises:
constructing a test case index according to the case priority of each preset test case, and determining the case index number of each preset test case;
constructing a test script index and a script index number for each preset test script in the test script database according to the test case index and the case index number; wherein, the script index number is the same as the use case script number.
4. The automated priority-based test method of claim 1, further comprising:
ending the automated script to be tested under the condition that no corresponding test case exists, generating first alarm information and sending the first alarm information to the monitoring platform; and,
and if the corresponding script to be tested is not found in the test script database based on the use case index number of the test case to be tested, ending the automation script to be tested, generating second alarm information and sending the second alarm information to the monitoring platform.
5. An automated testing method based on priority execution according to claim 1, wherein generating a test report based on the test results comprises:
determining a plurality of test result items corresponding to each report item of the test report;
and inputting the result data corresponding to the plurality of test result items in the test result into a corresponding report item model to obtain a corresponding report item result.
6. An automated testing system based on priority execution, the system comprising: the device comprises a matching module, a searching module and an executing module;
the matching module is used for, in the case that automated testing is required, matching the case priority in the case database of the test website based on the test instruction in the automated script to be tested, so as to determine whether a corresponding test case to be tested exists;
the searching module is used for, in the case that it is determined that the corresponding test case to be tested exists, searching for a corresponding script to be tested in the test script database based on the case index number of the test case to be tested;
the execution module is used for executing the script to be tested to obtain a test result, generating a test report based on the test result, and sending the test report to the monitoring platform.
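The three cooperating modules of this claim can be sketched as plain classes wired together; the class names and data shapes are assumptions for illustration, not the patent's implementation.

```python
class MatchingModule:
    """Matches the test instruction against the case database."""
    def __init__(self, case_db):
        self.case_db = case_db
    def match(self, instruction):
        return self.case_db.get(instruction)  # None if no case to be tested exists

class SearchingModule:
    """Looks up the script to be tested by the case index number."""
    def __init__(self, script_db):
        self.script_db = script_db
    def search(self, case):
        return self.script_db.get(case["index_number"])

class ExecutionModule:
    """Runs the script, builds a report, and sends it to the monitor."""
    def __init__(self, monitor):
        self.monitor = monitor
    def execute(self, script):
        result = script()               # execute the script to be tested
        report = {"result": result}     # generate a test report from the result
        self.monitor.append(report)     # send the report to the monitoring platform
        return report

monitor = []
case_db = {"run-smoke": {"index_number": 1}}
script_db = {1: lambda: "pass"}
case = MatchingModule(case_db).match("run-smoke")
script = SearchingModule(script_db).search(case)
report = ExecutionModule(monitor).execute(script)
print(report)  # {'result': 'pass'}
```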
7. The automated priority-based test system as recited in claim 6, wherein the system further comprises: a configuration module;
the configuration module is used for determining the case priority of each preset test case and labeling each preset test case according to the case priority;
storing the marked preset test cases in a case database of the test website;
the configuration module is further used for constructing a test case index according to the case priority of each preset test case and determining the case index number of each preset test case;
constructing a test script index and a script index number for each preset test script in the test script database according to the test case index and the case index number; wherein the script index number is the same as the case index number.
8. The automated priority-based test system as recited in claim 6, wherein the system further comprises: an alarm module;
the alarm module is used for, in the case that it is determined that no corresponding test case to be tested exists, ending the automated script to be tested, generating first alarm information, and sending the first alarm information to the monitoring platform; and
in the case that no corresponding script to be tested is found in the test script database based on the case index number of the test case to be tested, ending the automated script to be tested, generating second alarm information, and sending the second alarm information to the monitoring platform.
9. The automated priority-based test system as recited in claim 6, wherein generating a test report based on the test result comprises:
determining a plurality of test result items corresponding to each report item of the test report;
and inputting the result data corresponding to the plurality of test result items in the test result into a corresponding report item model to obtain a corresponding report item result.
10. A non-volatile computer storage medium storing computer-executable instructions for priority-based automated testing, the computer-executable instructions configured to:
in the case that automated testing is required, matching the case priority in a case database of a test website based on a test instruction in an automated script to be tested, so as to determine whether a corresponding test case to be tested exists;
in the case that it is determined that the corresponding test case to be tested exists, searching for a corresponding script to be tested in a test script database based on the case index number of the test case to be tested;
executing the script to be tested to obtain a test result, generating a test report based on the test result, and sending the test report to a monitoring platform.
CN202310583694.5A 2023-05-22 2023-05-22 Automatic testing method, system and storage medium based on priority execution Pending CN116643980A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310583694.5A CN116643980A (en) 2023-05-22 2023-05-22 Automatic testing method, system and storage medium based on priority execution


Publications (1)

Publication Number Publication Date
CN116643980A (en) 2023-08-25

Family

ID=87642850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310583694.5A Pending CN116643980A (en) 2023-05-22 2023-05-22 Automatic testing method, system and storage medium based on priority execution

Country Status (1)

Country Link
CN (1) CN116643980A (en)

Similar Documents

Publication Publication Date Title
CN112487083B (en) Data verification method and device
CN110222936B (en) Root cause positioning method and system of business scene and electronic equipment
CN111124871A (en) Interface test method and device
CN103810099A (en) Code tracing method and code tracing system
CN108874379B (en) Page processing method and device
CN116866242A (en) Switch regression testing method, device and medium
CN112149038A (en) Browser development method and device, computer equipment and readable storage medium
CN116643980A (en) Automatic testing method, system and storage medium based on priority execution
CN111125087A (en) Data storage method and device
CN110968754B (en) Detection method and device for crawler page turning strategy
CN112559444A (en) SQL (structured query language) file migration method and device, storage medium and equipment
CN111078574A (en) Method and device for generating influence analysis report
CN116594917B (en) UI testing method and device, electronic equipment and machine-readable storage medium
CN113071541B (en) Method and device for generating trackside configuration file
CN109446091B (en) Business entity object testing method and device
CN110968758B (en) Webpage data crawling method and device
CN114090061B (en) Front-end file self-adapting packaging method, device and storage medium
CN114928475B (en) Industrial equipment authentication method, equipment and medium based on identification analysis
CN115629759A (en) Application checking method and device, electronic equipment and computer storage medium
CN109241066B (en) Request processing method and device
CN115658461A (en) Test case management method and device, electronic equipment and computer storage medium
CN114911709A (en) Data processing method and device, electronic equipment and computer storage medium
CN117370449A (en) Data processing method and device, storage medium and electronic device
CN116756158A (en) Data exchange method, device and medium
CN115048139A (en) Service package deployment method, system, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination