CN111881056A - Automatic test framework and test method - Google Patents

Automatic test framework and test method

Info

Publication number
CN111881056A
CN111881056A
Authority
CN
China
Prior art keywords
test
layer
script
data
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010783298.3A
Other languages
Chinese (zh)
Inventor
崔海东
朱海军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taishan Information Technology Co ltd
Original Assignee
Taishan Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taishan Information Technology Co ltd filed Critical Taishan Information Technology Co ltd
Priority to CN202010783298.3A priority Critical patent/CN111881056A/en
Publication of CN111881056A publication Critical patent/CN111881056A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3696 Methods or tools to render software testable

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses an automated testing framework, which comprises an Office data definition layer, an API function encapsulation layer, an application script layer, a flow scheduling layer and a test result data management layer. The Office data definition layer is used for defining and encapsulating the data required by each test; the API function encapsulation layer is used for encapsulating method functions; the application script layer is used for storing test scripts; the flow scheduling layer is used for coordinating and controlling the test flow; and the test result data management layer is used for generating a test result. The automated testing framework is suitable for a CS (client/server) architecture system, and testers can flexibly customize testing methods under this framework. The application also discloses a testing method, which has the same technical effects.

Description

Automatic test framework and test method
Technical Field
The application relates to the technical field of system testing, and in particular to an automated testing framework; a testing method is also disclosed.
Background
Testing a desktop Office product is an enormous task. A typical desktop Office product has roughly four thousand basic functions, and once the workflow combinations of those basic functions are included, a complete test round requires at least thousands, and often tens of thousands, of basic test case items to achieve coverage. After each version iteration, regression testing is required in addition to testing the baseline functions. A purely manual test therefore consumes a great deal of time and labor. In addition, manual testing is subject to chance and uncertainty, so coverage cannot be guaranteed. This has created demand for automating the testing of desktop Office products with test tools.
Desktop Office products are built on a CS (client/server) architecture. The script-recording approach adopted in the early days could automate the testing of some functions of a desktop Office product to a certain extent, but because its script code is generated by the test tool, it suffers from poor readability and high maintenance cost and has gradually been abandoned. Currently, most automated testing relies on commercial test tools or open-source test frameworks. Commercial test tools are functionally mature but expensive and inflexible: the testing method cannot be customized freely, the written test scripts are tightly coupled to the operation methods, and once the Office interface is redrawn or modified, the script rework is substantial and maintenance costs are high. Open-source test frameworks are relatively flexible and easy to extend, but the existing ones are only suitable for Web applications and BS (browser/server) architecture systems and cannot test CS (client/server) architecture software systems, especially large desktop Office products.
In view of the above, providing an automated testing framework that is applicable to a CS architecture system and allows the testing method to be flexibly customized is an urgent technical problem to be solved by those skilled in the art.
Disclosure of Invention
The application aims to provide an automated testing framework which is suitable for a CS (client/server) architecture system and allows the testing method to be flexibly customized. Another object of the present application is to provide a testing method, which has the same technical effects.
In order to solve the above technical problem, the present application provides an automated testing framework, including:
the system comprises an Office data definition layer, an API function packaging layer, an application script layer, a flow scheduling layer and a test result data management layer;
the Office data definition layer is used for defining and packaging data required by each test;
the API function encapsulation layer is used for defining and encapsulating the method function;
the application script layer is used for storing a test script;
the flow scheduling layer is used for coordinating and controlling the test flow;
and the test result data management layer is used for generating a test result.
Optionally, the data required by each test includes test object data, operation data, configuration data, and document data.
Optionally, the API function encapsulation layer includes:
the public function library is used for packaging a method function of interaction between the Office system and the operating system;
the interface function library is used for packaging method functions of interaction between the Office system and other application systems;
the Office basic module packaging library is used for storing the method functions of the basic functions of the Office system;
and the test verification function method library is used for storing verification method functions.
Optionally, the test script includes an Office component object, an operation method, an attribute value, and a verification point.
Optionally, the test result data management layer is specifically configured to record log information and abnormal information, manage the log information and the abnormal information in a classified manner, and generate a test report according to result data after the test script is finished running.
Optionally, the application script layer further encapsulates a running entry method function, a loop control method function, an abnormal output control method function, and an initialized running environment method function.
Optionally, the test script has an identifier for characterizing the priority and an identifier for the module function.
In order to solve the above technical problem, the present application further provides a testing method applied to the above automated testing framework, including:
running a test script in the application script layer;
calling, in the process of running the test script, corresponding method functions in the API function encapsulation layer and corresponding data encapsulated in the Office data definition layer, so as to obtain running result data of the test script;
and generating a test result according to the running result data.
Optionally, the test script includes: office component objects, methods of operation, attribute values, and verification points.
Optionally, the running of the test script in the application script layer includes:
and running the test script according to the task type and the priority of the test script.
The application provides an automated testing framework, comprising: an Office data definition layer, an API function encapsulation layer, an application script layer, a flow scheduling layer and a test result data management layer; the Office data definition layer is used for defining and encapsulating the data required by each test; the API function encapsulation layer is used for defining and encapsulating method functions; the application script layer is used for storing test scripts; the flow scheduling layer is used for coordinating and controlling the test flow; and the test result data management layer is used for generating a test result.
Therefore, the automated testing framework provided by the application encapsulates and separates the test data, the operation methods and the test scripts, and organically combines, controls and manages them through the flow scheduling layer and the test result data management layer, achieving high cohesion and low coupling. The framework is suitable for a CS (client/server) architecture system, allows the testing method to be flexibly customized, and supports developing test systems for different desktop Office versions under the framework. Moreover, the framework can organically integrate data-driven, keyword-driven and other approaches, improving the development efficiency of automated scripts for desktop Office and reducing development and maintenance costs.
The test method provided by the application also has the technical effects.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for the prior art and the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an automated testing framework provided in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a test script according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a testing method according to an embodiment of the present disclosure.
Detailed Description
The core of the application is to provide an automated testing framework which is suitable for a CS (client/server) architecture system and allows the testing method to be flexibly customized. Another core of the application is to provide a testing method, which has the same technical effects.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic diagram of an automated testing framework according to an embodiment of the present application, and referring to fig. 1, the testing framework includes:
an Office data definition layer 10, an API function packaging layer 20, an application script layer 30, a flow scheduling layer 40 and a test result data management layer 50;
an Office data definition layer 10 for defining and encapsulating the data required by each test;
an API function encapsulation layer 20 for defining and encapsulating method functions;
an application script layer 30 for storing a test script;
a flow scheduling layer 40 for coordinating and controlling the test flow;
and the test result data management layer 50 is used for generating a test result.
Specifically, the automated testing framework provided by the application is oriented to desktop Office products and mainly comprises an Office data definition layer 10, an API function encapsulation layer 20, an application script layer 30, a flow scheduling layer 40 and a test result data management layer 50.
The Office data definition layer 10 is used to define and encapsulate all data required for testing. The data required for testing includes test object data, operation data, configuration data, document data, and the like. The test object data stores all component objects of desktop Office, each with a unique identification attribute, and mainly comprises word-processing application component data, spreadsheet application component data, presentation application component data and COM component data, which are stored respectively in a word-processing application component database, a spreadsheet application component database, a presentation application component database and a COM component database. The operation data comprises verification-point data used for script verification and parameter data table data. The configuration data refers to system information data used by the public methods, file path information data, parameters of each function call relation, and the like. The document data stores the documents that the scripts call during the test, together with document list data.
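By way of illustration only, the following Python sketch shows one possible way of organizing the data of the Office data definition layer; the names (ComponentObject, OBJECT_DB, PARAM_TABLE, CONFIG, DOCUMENTS) and the sample values are assumptions for the sketch and are not part of the disclosure.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ComponentObject:
        object_id: str      # unique identification attribute of the component object
        application: str    # "word", "spreadsheet", "presentation" or "com"
        control_type: str   # e.g. button, menu item, dialog field
        locator: str        # attribute string used to find the control at run time

    # Test object data: desktop Office component objects keyed by their unique identifiers.
    OBJECT_DB = {
        "wp.file.save_btn": ComponentObject("wp.file.save_btn", "word", "button", "name=Save"),
        "ss.cell.a1":       ComponentObject("ss.cell.a1", "spreadsheet", "cell", "ref=A1"),
    }

    # Operation data: verification-point data and parameter tables used by the scripts.
    PARAM_TABLE = {"save_as_name": "regression_case_001.docx"}

    # Configuration data: system information, file paths, call-relation parameters.
    CONFIG = {"office_install_dir": r"C:\Program Files\Office", "timeout_s": 30}

    # Document data: documents that the scripts open during a test run.
    DOCUMENTS = {"baseline_doc": r"testdata\docs\baseline.docx"}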
The API function encapsulation layer 20 is used to define and encapsulate method functions. It is independent of any particular testing technique or testing tool, and once encapsulated it can be reused in other testing systems and test structures. The API function encapsulation layer mainly encapsulates a public function library, an interface function library, an Office basic module encapsulation library and a test verification function method library. The public function library encapsulates method functions for interaction between the Office system and the operating system, for example method functions for reading and writing files, acquiring operating-system information, acquiring version information, restarting the system, killing a process, initializing the test environment, and mouse clicking or double-clicking. The interface function library encapsulates method functions for interaction between the Office system and other application systems, for example method functions that call parameter files and method functions for Office interface testing. The Office basic module encapsulation library stores method functions for the basic functions of the Office system, for example method functions for creating or opening a file, saving a file, and starting or closing the software. The test verification function method library provides verification method functions for the test object, including standard attribute verification method functions, image verification method functions, text character comparison method functions, and the like.
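For illustration, a minimal Python sketch of the kind of method functions the public function library and the test verification function method library might encapsulate is given below; the function names are hypothetical, and the use of the Windows taskkill command assumes a Windows test host.

    import platform
    import subprocess
    from pathlib import Path

    def read_text_file(path: str) -> str:
        """Public function: read a file used by a test script."""
        return Path(path).read_text(encoding="utf-8")

    def get_operating_system_info() -> str:
        """Public function: operating-system information for the test run."""
        return platform.platform()

    def kill_process(image_name: str) -> None:
        """Public function: force-kill a hung Office process (Windows assumed)."""
        subprocess.run(["taskkill", "/F", "/IM", image_name], check=False)

    def verify_text_equal(actual: str, expected: str) -> bool:
        """Verification function: text character comparison against a verification point."""
        return actual.strip() == expected.strip()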
The application script layer 30 serves as the repository for test cases, test scripts and the master control scheduling script. The test script is the uppermost application layer, and its step flow is implemented according to the design of the test case. During testing, the corresponding operation methods, application component data, parameter data, verification data and the like are called according to the internal relations of the test script. Test scripts can be designed and implemented using methods such as data driving and keyword driving. The test scripts in the script library are labeled with identifiers of priority, identifiers of module function, and the like, and the master control scheduling script manages the scripts according to these identifiers, as sketched below. In addition, a running entry method, a loop control method, an abnormal output control method, a method for initializing the running environment, and the like are encapsulated in the script library.
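As a sketch only, the priority and module-function identifiers of the scripts in the script library could be attached through a simple registry such as the following; the decorator and registry names are assumptions rather than the framework's concrete implementation.

    SCRIPT_REGISTRY = []

    def test_script(priority: int, module: str):
        """Attach priority and module-function identifiers to a test script and register it."""
        def register(func):
            SCRIPT_REGISTRY.append({
                "name": func.__name__,
                "priority": priority,
                "module": module,
                "run": func,
            })
            return func
        return register

    @test_script(priority=1, module="word.file")
    def save_document_case():
        """P1 script for the word-processing 'save file' basic function."""
        ...  # steps follow the corresponding test case

    @test_script(priority=2, module="spreadsheet.edit")
    def copy_cell_case():
        ...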
In one particular embodiment, the test script includes an Office component object, an operation method, an attribute value, and a verification point.
Specifically, referring to fig. 2, the present embodiment provides a structure of a test script comprising an Office component object, an operation method, an attribute value and a verification point. The Office component object mainly comprises application component data, file data and the like. The operation method refers to an operation on the tested object, for example common methods such as clicking, inputting, moving and copying, as well as interface return-value comparison methods; these operation methods are obtained from the API function encapsulation layer. The attribute value refers to the data and values entered when operating on the tested object, such as parameter data and configuration data. The verification point is the assertion executed by the script; it is compared with preset verification data so as to verify whether the test achieves its purpose and meets the requirements.
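A minimal data-driven sketch of this four-part structure (component object, operation method, attribute value, verification point) might look as follows in Python; the step layout and the identifiers are illustrative assumptions.

    TEST_STEPS = [
        # (component object id,  operation, attribute value,            verification point)
        ("wp.app",               "start",   None,                       "window_visible"),
        ("wp.file.new_btn",      "click",   None,                       "blank_document_open"),
        ("wp.editor.body",       "input",   "automated test sample",    "text=automated test sample"),
        ("wp.file.save_btn",     "click",   "regression_case_001.docx", "file_exists"),
    ]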
Each test script is run by the master control scheduling script: the script identifies the attribute of the tested object, searches for the tested object, calls the corresponding operation method to simulate the operation, and verifies the operation result.
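Assuming step tuples like those sketched above, the per-step execution loop can be illustrated as follows; the callable parameters are hypothetical stand-ins for the encapsulated framework methods.

    def run_step(step, find_object, operations, verify):
        object_id, operation, attribute_value, verification_point = step
        target = find_object(object_id)                  # search for the tested object
        operations[operation](target, attribute_value)   # simulate the operation
        return verify(target, verification_point)        # verify the operation result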
The flow scheduling layer 40 acts as a controller throughout the test process, coordinating and scheduling the test scripts, test data, operation methods, verification methods and the test flow. Everything from the execution of each test script to the control of the whole system's test flow is controlled and executed by the flow scheduling layer. The flow scheduling layer is designed according to the test requirements, and the master control scheduling script is part of it.
The test result data management layer 50 generates a test result from the test result data throughout the test process. Specifically, the test result data management layer encapsulates methods for managing log generation and level processing, methods for formatting the content of the test result output, and methods for formatting the test report data. While a test script runs, the test result data management layer records log information, exception information and the like, and classifies and manages this information by date, priority, error type and so on, to facilitate locating and analyzing problems. After the run finishes, the test result data management layer classifies and summarizes the returned result data and generates a test report.
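For illustration, a minimal Python sketch of log recording classified by date and level, together with a simple summary of the returned result data, is shown below; the log format and the report fields are assumptions.

    import logging
    from collections import Counter
    from datetime import date

    logging.basicConfig(
        filename=f"office_test_{date.today():%Y%m%d}.log",  # one log file per date
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",     # timestamp and level support classification
    )

    def generate_report(results):
        """Classify and summarize the returned result data after the run finishes."""
        counts = Counter(r["status"] for r in results)       # e.g. passed / failed / error totals
        failures = [r for r in results if r["status"] != "passed"]
        return {"total": len(results), "summary": dict(counts), "failures": failures}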
Under this automated testing framework, when an automated test needs to be carried out, a tester first writes a test case according to the corresponding test requirement and functional flow. The test case records in detail the steps of the test, the objects, components and operation methods involved in each step, and explicitly records the verification attribute value of each step. A test script is then written on the basis of the test case, strictly following the steps and flow of the test case. Specifically, writing a test script under this framework means finding the data of the tested object, adding the operation method and the attribute value, and finally adding the verification point. After the test scripts for each application function have been written, the master control scheduling script runs all scripts serially according to task type, priority and so on, and monitors them in real time. When a test script behaves abnormally, its process is killed and, after the test environment has been re-initialized, the next test script continues to run. During the test, the running result and exception information of each test script are monitored and a corresponding log is generated; the log information and exception information are then extracted and summarized to generate a test report. The tester locates the corresponding script running problem from the exception information: if it is confirmed to be a BUG, a defect is reported; if it is confirmed to be an abnormal interruption of the test script, the script is modified and improved.
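A minimal sketch of this serial scheduling flow, assuming the script registry and the kill-process and environment-initialization functions sketched earlier, is shown below; it illustrates the flow rather than reproducing the framework's actual master control scheduling script.

    import logging

    def run_all(scripts, kill_process, initialize_test_environment):
        results = []
        # Serial order: group by task type (the module identifier here, an assumption) and then by priority.
        for script in sorted(scripts, key=lambda s: (s["module"], s["priority"])):
            try:
                script["run"]()
                results.append({"name": script["name"], "status": "passed"})
            except Exception as exc:                 # abnormal script: log, clean up, continue with the next one
                logging.exception("test script %s aborted", script["name"])
                kill_process("office.exe")           # process image name is an assumption
                initialize_test_environment()
                results.append({"name": script["name"], "status": "error", "detail": str(exc)})
        return results

The returned result list is the kind of result data that the test result data management layer can then summarize into a test report.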
In summary, the automated testing framework provided by the present application includes an Office data definition layer, an API function encapsulation layer, an application script layer, a flow scheduling layer and a test result data management layer. The Office data definition layer defines and encapsulates the data required by each test; the API function encapsulation layer defines and encapsulates method functions; the application script layer stores the test scripts; the flow scheduling layer coordinates and controls the test flow; and the test result data management layer generates the test result. This automated testing framework encapsulates and separates test data, operation methods and test scripts, and organically combines, controls and manages them through the flow scheduling layer and the test result data management layer, achieving high cohesion and low coupling. It is suitable for a CS (client/server) architecture system, allows the testing method to be flexibly customized, and supports developing test systems for different desktop Office versions under the framework. The framework can also organically integrate data-driven, keyword-driven and other approaches, improving the development efficiency of automated scripts for desktop Office and reducing development and maintenance costs. The framework does not depend on any particular test tool or testing technique, and the methods encapsulated by the API function encapsulation layer are general and public, with the characteristics of reusability and customizability. In actual operation it can be realized with existing automated testing technology, can meet different service requirements, and offers strong flexibility and extensibility.
The present application further provides a testing method applied to the automated testing framework described in the above embodiment, and referring to fig. 3, the testing method includes:
S101: running a test script in the application script layer;
S102: calling, in the process of running the test script, corresponding method functions in the API function encapsulation layer and corresponding data encapsulated in the Office data definition layer, so as to obtain running result data of the test script;
S103: and generating a test result according to the running result data.
In a specific embodiment, the test script includes: Office component objects, operation methods, attribute values, and verification points. Running the test script in the application script layer comprises: running the test script according to the task type and the priority of the test script.
Specifically, when an automated test needs to be carried out, a tester writes a test case according to the corresponding test requirement and functional flow. The test case records in detail the steps of the test, the objects, components and operation methods involved in each step, and explicitly records the verification attribute value of each step. A test script is then written on the basis of the test case, strictly following the steps and flow of the test case. After the test scripts of all application functions have been written, the master control scheduling script runs all scripts serially according to task type, priority and so on, calling the corresponding method functions in the API function encapsulation layer and the corresponding data encapsulated in the Office data definition layer during the run. The scripts are monitored in real time; when a test script behaves abnormally, its process is killed and, after the test environment has been re-initialized, the next test script continues to run. During the test, the running result and exception information of each test script are monitored and a corresponding log is generated; the log information and exception information are then extracted and summarized to generate a test report. The tester locates the corresponding script running problem from the exception information: if it is confirmed to be a BUG, a defect is reported; if it is confirmed to be an abnormal interruption of the test script, the script is modified and improved.
The embodiments in the specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts among the embodiments may be referred to one another. The testing framework disclosed in the embodiments corresponds to the testing method disclosed in the embodiments, so its description is relatively brief; for relevant details, refer to the description of the method.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The automated testing framework and the testing method provided by the application are described in detail above. The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.

Claims (10)

1. An automated test framework, comprising:
the system comprises an Office data definition layer, an API function packaging layer, an application script layer, a flow scheduling layer and a test result data management layer;
the Office data definition layer is used for defining and packaging data required by each test;
the API function encapsulation layer is used for defining and encapsulating the method function;
the application script layer is used for storing a test script;
the flow scheduling layer is used for coordinating and controlling the test flow;
and the test result data management layer is used for generating a test result.
2. The automated testing framework of claim 1, wherein the data required for each test includes test object data, operational data, configuration data, and document data.
3. The automated test framework of claim 2, wherein the API function encapsulation layer comprises:
the public function library is used for packaging a method function of interaction between the Office system and the operating system;
the interface function library is used for packaging method functions of interaction between the Office system and other application systems;
the Office basic module packaging library is used for storing the method functions of the basic functions of the Office system;
and the test verification function method library is used for storing verification method functions.
4. The automated testing framework of claim 3, wherein the test script comprises Office component objects, operation methods, attribute values, and verification points.
5. The automated testing framework of claim 4, wherein the test result data management layer is specifically configured to record log information and exception information, manage the log information and the exception information in a classified manner, and generate a test report according to result data after the test script is run.
6. The automated testing framework of claim 5, wherein the application script layer further encapsulates a running entry method function, a loop control method function, an abnormal output control method function, and an initialized running environment method function.
7. The automated testing framework of claim 6, wherein the test script has an identifier characterizing priority and an identifier of module functionality.
8. A testing method applied to the automated testing framework of any one of claims 1 to 7, comprising:
running a test script in the application script layer;
calling, in the process of running the test script, corresponding method functions in the API function encapsulation layer and corresponding data encapsulated in the Office data definition layer, so as to obtain running result data of the test script;
and generating a test result according to the running result data.
9. The testing method of claim 8, wherein the test script comprises: office component objects, methods of operation, attribute values, and verification points.
10. The method according to claim 9, wherein the running of the test script in the application script layer comprises:
and running the test script according to the task type and the priority of the test script.
CN202010783298.3A 2020-08-06 2020-08-06 Automatic test framework and test method Pending CN111881056A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010783298.3A CN111881056A (en) 2020-08-06 2020-08-06 Automatic test framework and test method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010783298.3A CN111881056A (en) 2020-08-06 2020-08-06 Automatic test framework and test method

Publications (1)

Publication Number Publication Date
CN111881056A true CN111881056A (en) 2020-11-03

Family

ID=73211866

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010783298.3A Pending CN111881056A (en) 2020-08-06 2020-08-06 Automatic test framework and test method

Country Status (1)

Country Link
CN (1) CN111881056A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190034320A1 (en) * 2017-07-25 2019-01-31 Belay Technologies, Inc. System And Method For Rapid And Repeatable Provisioning And Regression Testing Plans
CN108427636A (en) * 2018-01-09 2018-08-21 阿里巴巴集团控股有限公司 Test method, system and the electronic equipment of application

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112699055A (en) * 2021-01-19 2021-04-23 航天恒星科技有限公司 Software automation test method and system with low maintenance cost
CN113609015A (en) * 2021-08-05 2021-11-05 先进操作系统创新中心(天津)有限公司 Automatic test framework based on Bash Shell

Similar Documents

Publication Publication Date Title
JP4950454B2 (en) Stack hierarchy for test automation
US7895565B1 (en) Integrated system and method for validating the functionality and performance of software applications
US6941546B2 (en) Method and apparatus for testing a software component using an abstraction matrix
US9792203B2 (en) Isolated testing of distributed development projects
US8832125B2 (en) Extensible event-driven log analysis framework
US10146672B2 (en) Method and system for automated user interface (UI) testing through model driven techniques
US8549483B1 (en) Engine for scalable software testing
US20050262482A1 (en) System and method for efficiently analyzing and building interdependent resources in a software project
CN108845940B (en) Enterprise-level information system automatic function testing method and system
US20180285247A1 (en) Systems, methods, and apparatus for automated code testing
US7730452B1 (en) Testing a component of a distributed system
US20050223360A1 (en) System and method for providing a generic user interface testing framework
US20050229161A1 (en) Generic user interface testing framework with load-time libraries
CN104899132A (en) Application software test method, apparatus and system
CN103186463B (en) Determine the method and system of the test specification of software
CN115658529A (en) Automatic testing method for user page and related equipment
CN111881056A (en) Automatic test framework and test method
Geiger et al. On the evolution of BPMN 2.0 support and implementation
US20050228644A1 (en) Generic user interface testing framework with rules-based wizard
CN112699055A (en) Software automation test method and system with low maintenance cost
EP2913757A1 (en) Method, system, and computer software product for test automation
Thooriqoh et al. Selenium Framework for Web Automation Testing: A Systematic Literature Review
US20130111431A1 (en) Validation of a system model including an activity diagram
KR20090099977A (en) A reserved component container based software development method and apparatus
CN114996039A (en) Cloud native system joint debugging method, system and medium based on third-party system docking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201103