CN112162914A - Method and device for automatically generating test case - Google Patents


Info

Publication number
CN112162914A
CN112162914A (application CN202010730587.7A)
Authority
CN
China
Prior art keywords
test
case
log
test case
management system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010730587.7A
Other languages
Chinese (zh)
Other versions
CN112162914B (en)
Inventor
陈冬严
王海滨
戴鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Financial Futures Information Technology Co ltd
Original Assignee
Shanghai Financial Futures Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Financial Futures Information Technology Co ltd filed Critical Shanghai Financial Futures Information Technology Co ltd
Priority to CN202010730587.7A priority Critical patent/CN112162914B/en
Publication of CN112162914A publication Critical patent/CN112162914A/en
Application granted granted Critical
Publication of CN112162914B publication Critical patent/CN112162914B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F11/3696 Methods or tools to render software testable

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention discloses a method and a device for automatically generating test cases, which reduce manual effort and improve the efficiency of automated case generation. The technical scheme is as follows: the device comprises a case management system, a plurality of test management agent components running in a local test environment, and a local case set component. The case management system exchanges data with each test management agent component, including upload and download operations, and is configured to let a user view and manage test cases and test results. The test management agent component is configured to capture and parse the logs of a micro-service application to generate test cases, execute the generated test cases, produce a test result report, and upload the report to the case management system.

Description

Method and device for automatically generating test case
Technical Field
The invention relates to software testing technology, and in particular to a method and device for automatically generating test cases.
Background
In automated interface testing of software, an automated case is usually implemented by writing code or scripts according to a test case designed in advance, and generally comprises a call to the target interface, a set of test inputs, a test fixture, and an expected result.
The cost of writing and subsequently maintaining automated case code or scripts accounts for a large share of the human-resource and time investment in the whole automated testing activity. As software products iterate faster and release intervals shrink, automated testing can be sustained only by increasing manual effort, or by automating only part of the functionality, which raises product cost or degrades product quality.
Disclosure of Invention
The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.
The invention aims to solve the above problems by providing a method and a device for automatically generating test cases, which reduce manual effort and improve the efficiency of automated case generation.
The technical scheme of the invention is as follows. The invention discloses a device for automatically generating test cases, comprising a case management system, a plurality of test management agent components running in a local test environment, and local case set components, the case management system exchanging data with each test management agent component, including upload and download operations, wherein:
the case management system is configured to let a user view and manage test cases and test results;
and the test management agent component is configured to capture and parse the logs of the micro-service application to generate test cases, execute the generated test cases, produce a test result report, and upload it to the case management system.
According to an embodiment of the apparatus for automatically generating test cases, the case management system is further configured to allow a user to edit and manage a policy for automatically generating test cases.
According to an embodiment of the apparatus for automatically generating test cases, the test management agent component parses test cases according to an agreed pattern, covering the called interface name or URL, the input parameters and their types, and the interface return result.
According to an embodiment of the apparatus for automatically generating test cases of the present invention, the test management agent component further comprises a log analysis unit, a case uploading and downloading unit, a case execution unit, and a case execution and report unit, wherein:
the log analysis unit is configured to parse, according to the parsing policy for test cases, the logs the system generates in the agreed pattern, and to automatically generate test cases;
the case uploading and downloading unit is configured to download the parsing policy for test cases from the case management system for the log analysis unit, and to upload the parsing results to the case management system;
the case execution unit is configured to execute the test cases generated by the log analysis unit and to generate a test result report;
the case execution and report unit is configured to run a given test case and generate a test result report.
The invention also discloses a method for automatically generating test cases, characterized in that the method is implemented on the above device and comprises the following steps:
step 1: the user modifies the test environment configuration to enable the test mode;
step 2: after the user executes a manual test case, a recorded log is generated automatically;
step 3: the user uploads the automatically generated log;
step 4: the automatically generated log is parsed, and test cases are generated and uploaded to the case management system;
step 5: other users download the generated set of test cases to their corresponding local test environments;
step 6: the user executes the downloaded test cases in the local test environment and uploads the test results to the case management system.
According to an embodiment of the method for automatically generating test cases, in step 2 the log is a log file generated in the background by the calls made to each target micro-service during inter-service invocation, the data of the inter-service interface calls being recorded in the log.
According to an embodiment of the method for automatically generating test cases, in step 4, during test case generation, the key information of the interface call is obtained through pattern matching, comprising the called interface name, the application's input parameters and their types, and the interface return result; the key information serves as the basic data of the test case, which is automatically generated and saved as a file in a specified format.
According to an embodiment of the method for automatically generating test cases, in step 6 a specified directory in the local test environment is scanned, and the test fixture executes the test cases in the discovered case-file list and generates a test result report.
Compared with the prior art, the invention has the following beneficial effects: by parsing application logs, the method and device automatically generate automated interface cases, reducing manual effort and improving the efficiency of automated case generation.
Drawings
The above features and advantages of the present disclosure will be better understood upon reading the detailed description of embodiments of the disclosure in conjunction with the following drawings. In the drawings, components are not necessarily drawn to scale, and components having similar relative characteristics or features may have the same or similar reference numerals.
FIG. 1 is a schematic diagram of an embodiment of an apparatus for automatic test case generation according to the present invention.
FIG. 2 is a flow diagram illustrating an embodiment of a method for automatic test case generation of the present invention.
Fig. 3 shows a detailed schematic diagram of the test management agent component in the device embodiment of fig. 1.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. It is noted that the aspects described below in connection with the figures and the specific embodiments are only exemplary and should not be construed as imposing any limitation on the scope of the present invention.
FIG. 1 illustrates the principle of an embodiment of the apparatus for automatic test case generation of the present invention. Referring to fig. 1, the apparatus of this embodiment comprises a case management system, together with a plurality of test management agent components and local case set components located in a local test environment.
The case management system exchanges data with each test management agent component, including upload and download operations.
The case management system is configured to let a user view and manage test cases and test results through the UI. It also supports editing and managing the policy for automatic case generation.
The test management agent component runs in the local test environment and serves as a bridge between it and the case management system. It is configured to capture and parse the logs of the micro-service application to generate test case files; the parsing of a test case follows a predetermined pattern and covers the called interface name or URL, the input parameters and their types, the interface return result, and so on. The component also executes the generated test cases, produces a test result report, and uploads it to the case management system.
As shown in fig. 3, the test management agent component comprises the following elements: a log analysis unit, a case uploading and downloading unit, an automatic test case execution unit, and a given test case execution unit.
The log analysis unit is configured to parse, according to the parsing policy for test cases, the logs the system generates in the agreed pattern, and to automatically generate test cases.
Specifically, the log analysis unit performs the following processing.
First, the log of one interface call is located and parsed. Because the service is multi-threaded, log entries from different requests may be interleaved in the log files. Within the application, each interface call carries a traceID that uniquely identifies its log content; by filtering on the traceID, the complete content of a given interface call can be retrieved and the order of the calls determined.
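The traceID filtering step above can be sketched as follows. The `traceID=<value>` tag format and the sample lines are assumptions for illustration only; the patent does not fix an exact log layout.

```python
import re

def extract_call_log(log_lines, trace_id):
    """Collect, in order, every log line belonging to one interface call.

    The 'traceID=<value>' tag format is a hypothetical convention,
    not one specified by the patent.
    """
    tag = re.compile(r"traceID=(\S+)")
    selected = []
    for line in log_lines:
        m = tag.search(line)
        if m and m.group(1) == trace_id:
            selected.append(line)
    return selected

# interleaved log lines from two concurrent requests
lines = [
    "traceID=abc123 call interface: /api/order/create",
    "traceID=zzz999 call interface: /api/user/login",
    "traceID=abc123 parameters are: ['A500', 2]",
]
print(extract_call_log(lines, "abc123"))
```

Because the selected lines keep their original order, the sequence of calls within one traceID is preserved as well.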
After the complete record of an interface call has been obtained, its key information is extracted by pattern matching: the called interface name, the application's input parameters and their types, and the interface return result. This key information serves as the basic data of the test case and is saved as a file in a specified format.
In the log, by convention, after the application container receives the interface call content it prints a line of the form "call interface: " + URL, from which the called interface can be parsed. It also prints "parameters are: " + parameter list and "parameter types are: " + parameter type list; together these two items determine the names, types, and concrete values of the interface's input parameters.
At the end of the interface call, a line "interface call success, return value is " + return value list is printed, so that the call result can be taken as the expected result of the test case. In addition, data such as the user's personal privacy information and account passwords is replaced.
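A minimal sketch of the pattern-matching step, assuming hypothetical log-line wordings that mirror the conventions described above (the framework's exact phrasing is not given by the patent):

```python
import re

# Hypothetical line formats; the real framework's wording is an assumption.
PATTERNS = {
    "interface": re.compile(r"call interface: (\S+)"),
    "params": re.compile(r"parameters are: (.+)"),
    "param_types": re.compile(r"parameter types are: (.+)"),
    "result": re.compile(r"interface call success, return value is (.+)"),
}

def parse_call(call_lines):
    """Build the basic data of a test case from one interface call's log lines."""
    case = {}
    for line in call_lines:
        for key, pat in PATTERNS.items():
            m = pat.search(line)
            if m:
                case[key] = m.group(1).strip()
    return case

call = [
    "call interface: /api/order/create",
    "parameters are: ['A500', 2]",
    "parameter types are: [str, int]",
    "interface call success, return value is {'orderId': 7}",
]
print(parse_call(call))
```

The resulting dictionary holds exactly the four pieces of key information named above, ready to be saved as a case file.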
If the service itself calls other micro-services, the requests to those micro-services and their returned results are also captured, to be used as mock service responses during automated testing.
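The record-and-replay idea for downstream calls might be sketched like this; the `DownstreamRecorder` and `MockDownstream` names and interfaces are hypothetical, not from the patent, and real systems would typically wrap the HTTP client instead:

```python
class DownstreamRecorder:
    """Records each downstream micro-service call and its response."""
    def __init__(self, real_call):
        self.real_call = real_call      # function (service, request) -> response
        self.recorded = {}              # (service, request) -> response

    def call(self, service, request):
        response = self.real_call(service, request)
        self.recorded[(service, request)] = response
        return response

class MockDownstream:
    """Replays recorded responses instead of calling the real services."""
    def __init__(self, recorded):
        self.recorded = recorded

    def call(self, service, request):
        return self.recorded[(service, request)]

# record once against a stand-in "real" service, then replay as a mock
rec = DownstreamRecorder(lambda svc, req: f"{svc}-ok:{req}")
rec.call("inventory", "check A500")
mock = MockDownstream(rec.recorded)
print(mock.call("inventory", "check A500"))
```

During an automated run, the service under test would be wired to `MockDownstream`, so the test no longer depends on the downstream services being available.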
Because the application framework unifies the service call entry points, interface developers do not need to write these logs for each interface; the framework generates them itself when an interface is called.
A business user or tester completes a business operation through the business system and confirms whether the returned result meets expectations; if not, a system defect is reported. Meanwhile, the inter-service calls produce a call to each target micro-service, and a log is generated in the background. For a background micro-service that needs to be tested, a designated test switch is enabled at startup so that it enters test mode, records the data of the inter-service interface calls, and stores the recorded information in the log.
The log is parsed using the traceID parameter, which uniquely distinguishes and filters out the log content of one interface call. Pattern matching extracts the key information of the call (the called interface name, the application's input parameters and their types, the interface return result, and so on), which serves as the basic data of the test case and is saved as a file in a specified format (for example, a csv file) to act as the test case. In addition, data such as the user's personal privacy information and account passwords is replaced.
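Saving the extracted key information as a csv test case, with privacy fields replaced, could look like the following sketch; the field names in `SENSITIVE_KEYS` and the csv column layout are assumptions:

```python
import csv
import io

SENSITIVE_KEYS = {"password", "id_card"}   # assumed names of sensitive fields

def mask(params):
    """Replace the values of sensitive fields before persisting them."""
    return {k: ("***" if k in SENSITIVE_KEYS else v) for k, v in params.items()}

def save_case_csv(case, fh):
    """Write one test case (interface, params, types, expected) as a csv row."""
    writer = csv.writer(fh)
    writer.writerow(["interface", "params", "param_types", "expected"])
    writer.writerow([case["interface"], str(mask(case["params"])),
                     case["param_types"], case["expected"]])

buf = io.StringIO()
save_case_csv({"interface": "/api/user/login",
               "params": {"name": "alice", "password": "s3cret"},
               "param_types": "[str, str]",
               "expected": "{'ok': True}"}, buf)
print(buf.getvalue())
```

Note that the masked value, not the original password, is what lands in the case file.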
The log analysis unit then scans, via the test fixture, the csv files under the specified directory according to the interface call pattern of the micro-service under test, generates test cases at run time from the scan results, and executes and asserts them.
The case uploading and downloading unit is configured to download the parsing policy for test cases from the case management system for the log analysis unit, and to upload the parsing results to the case management system.
The automatic test case execution unit is configured to execute the test cases automatically generated by the log analysis unit and to generate a test result report. It scans the specified directory and, via the test fixture, executes the cases in the discovered csv test file list and generates a test report.
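A sketch of the scan-and-execute loop, under the assumption that each case is a csv row with `interface`, `params`, and `expected` columns, and that `invoke` stands in for the test fixture that actually calls the service under test:

```python
import csv
import tempfile
from pathlib import Path

def run_cases(case_dir, invoke):
    """Scan a directory for csv case files, run each case through the
    fixture callable, and return a pass/fail report.

    invoke(interface, params) stands in for the real test fixture;
    its signature is an assumption.
    """
    report = []
    for path in sorted(Path(case_dir).glob("*.csv")):
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                actual = invoke(row["interface"], row["params"])
                report.append({"case": path.name,
                               "interface": row["interface"],
                               "passed": str(actual) == row["expected"]})
    return report

# demo: one generated case file and a trivial fixture that always returns "pong"
with tempfile.TemporaryDirectory() as d:
    with open(Path(d) / "cases.csv", "w", newline="") as fh:
        w = csv.DictWriter(fh, fieldnames=["interface", "params", "expected"])
        w.writeheader()
        w.writerow({"interface": "/api/ping", "params": "[]", "expected": "pong"})
    result = run_cases(d, lambda interface, params: "pong")
    print(result)
```

Comparing `str(actual)` against the stored `expected` column is the assertion step; a real fixture would normalize both sides before comparing.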
The given test case execution unit is configured to run a given test case and generate a test result report.
FIG. 2 shows the flow of an embodiment of the method for automatic test case generation of the present invention. Referring to fig. 2, the method of this embodiment is implemented on the apparatus shown in fig. 1, and its steps are detailed as follows.
Step 1: the user modifies the test environment configuration to enable the test mode.
Step 2: after the user executes a manual test case, a recorded log is generated automatically.
The user completes a business operation through the business system and confirms whether the returned result meets expectations; if not, a system defect is reported. Meanwhile, the inter-service calls produce a call to each target micro-service, and a log is generated in the background. For a background micro-service that needs to be tested, a designated test switch is enabled at startup so that it enters test mode, records the data of the inter-service interface calls, and stores the recorded information in the log.
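The test switch could be realized, for example, as an environment-variable flag that turns on call recording at startup; the `TEST_SWITCH` name and the log wording here are assumptions, not part of the patent:

```python
import functools
import os

# Assumed switch name; when not set, the service runs normally without recording.
TEST_MODE = os.environ.get("TEST_SWITCH", "off") == "on"

def record_call(func):
    """When the test switch is on, print the call in the agreed log format."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        if TEST_MODE:
            print(f"call interface: {func.__name__}")
            print(f"parameters are: {list(args)}")
            print(f"interface call success, return value is {result}")
        return result
    return wrapper

@record_call
def create_order(product, qty):
    # stand-in for a real interface handler
    return {"orderId": 7}

print(create_order("A500", 2))
```

In a framework with a unified call entry point, the decorator would be applied once at that entry point rather than on each interface.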
Step 3: the user uploads the automatically generated logs to the system.
Step 4: the system parses the automatically generated log, generates test cases, and uploads them to the case management system.
The specific process of this step is as follows.
First, the log of one interface call is located and parsed. Because the service is multi-threaded, log entries from different requests may be interleaved in the log files. Within the application, each interface call carries a traceID that uniquely identifies its log content; by filtering on the traceID, the complete content of a given interface call can be retrieved and the order of the calls determined.
After the complete record of an interface call has been obtained, its key information is extracted by pattern matching: the called interface name, the application's input parameters and their types, and the interface return result. This key information serves as the basic data of the test case and is saved as a file in a specified format.
In the log, by convention, after the application container receives the interface call content it prints a line of the form "call interface: " + URL, from which the called interface can be parsed. It also prints "parameters are: " + parameter list and "parameter types are: " + parameter type list; together these two items determine the names, types, and concrete values of the interface's input parameters.
At the end of the interface call, a line "interface call success, return value is " + return value list is printed, so that the call result can be taken as the expected result of the test case. In addition, data such as the user's personal privacy information and account passwords is replaced.
If the service itself calls other micro-services, the requests to those micro-services and their returned results are also captured, to be used as mock service responses during automated testing.
Because the application framework unifies the service call entry points, interface developers do not need to write these logs for each interface; the framework generates them itself when an interface is called.
The log is parsed using the traceID parameter, which uniquely distinguishes and filters out the log content of one interface call. Pattern matching extracts the key information of the call (the called interface name, the application's input parameters and their types, the interface return result, and so on), which serves as the basic data of the test case and is saved as a file in a specified format (for example, a csv file) to act as the test case. In addition, data such as the user's personal privacy information and account passwords is replaced.
According to the interface call pattern of the micro-service under test, the csv files under the specified directory are scanned via the test fixture, test cases are generated at run time from the scan results, and they are executed and asserted.
Step 5: other users download the generated set of test cases to their corresponding local test environments.
Step 6: the user executes the downloaded test cases in the local test environment and uploads the test results to the case management system.
A specified directory in the local test environment is scanned, and the test fixture executes the cases in the discovered csv test file list and generates a test result report.
While, for purposes of simplicity of explanation, the methodologies are shown and described as a series of acts, it is to be understood and appreciated that the methodologies are not limited by the order of acts, as some acts may, in accordance with one or more embodiments, occur in different orders and/or concurrently with other acts from that shown and described herein or not shown and described herein, as would be understood by one skilled in the art.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk (disk) and disc (disc), as used herein, includes Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk and blu-ray disc where disks (disks) usually reproduce data magnetically, while discs (discs) reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Thus, the disclosure is not intended to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A device for automatically generating test cases, characterized by comprising a case management system, a plurality of test management agent components running in a local test environment, and local case set components, the case management system being in data communication with each test management agent component, including upload and download operations, wherein:
the case management system is configured for a user to check and manage the test cases and the test results;
and the test management agent component is configured to capture and analyze the log of the micro-service application to generate a test case, execute the generated test case, generate a test result report and upload the test result report to the case management system.
2. The apparatus of claim 1, wherein the use case management system is further configured to allow a user to edit and manage policies for automatic generation of use cases.
3. The apparatus of claim 1, wherein the test management agent component parses test cases according to an agreed pattern, covering the called interface name or URL, the input parameters and their types, and the interface return result.
4. The apparatus of claim 1, wherein the test management agent component further comprises a log analysis unit, a case uploading and downloading unit, a case execution unit, and a case execution and report unit, and wherein:
the log analysis unit is configured to parse, according to the parsing policy for test cases, the logs the system generates in the agreed pattern, and to automatically generate test cases;
the case uploading and downloading unit is configured to download the parsing policy for test cases from the case management system for the log analysis unit and to upload the parsing results to the case management system;
the case execution unit is configured to execute the test cases generated by the log analysis unit and to generate a test result report;
the case execution and report unit is configured to run a given test case and generate a test result report.
5. A method for automatically generating test cases, wherein the method is implemented on the apparatus according to any one of claims 1 to 4, the method comprising:
step 1: the user modifies the test environment configuration to enable the test mode;
step 2: after the user executes a manual test case, a recorded log is generated automatically;
step 3: the user uploads the automatically generated log;
step 4: the automatically generated log is parsed, and test cases are generated and uploaded to the case management system;
step 5: other users download the generated set of test cases to their corresponding local test environments;
step 6: the user executes the downloaded test cases in the local test environment and uploads the test results to the case management system.
6. The method according to claim 5, wherein in step 2 the log is a log file generated in the background by the calls made to each target micro-service during inter-service invocation, the data of the inter-service interface calls being recorded in the log.
7. The method according to claim 5, wherein in step 4, during test case generation, the key information of the interface call is obtained through pattern matching, comprising the called interface name, the application's input parameters and their types, and the interface return result; the key information is used as the basic data of the test case, which is automatically generated and stored as a file in a specified format.
8. The method according to claim 5, wherein in step 6, a specified directory in the local test environment is scanned, and a test fixture executes the test cases according to the discovered list of test case files and generates a test result report.
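The directory scan and execution of this claim might look like the following sketch; the JSON case format, the file naming, and the stubbed interface call are all assumptions, since the claim fixes neither the file format nor the fixture:

```python
import json
import tempfile
from pathlib import Path

def run_cases(case_dir, call_interface):
    """Scan a specified directory for test-case files and execute each case,
    producing a simple pass/fail report (illustrative only)."""
    report = []
    for case_file in sorted(Path(case_dir).glob("*.json")):
        case = json.loads(case_file.read_text(encoding="utf-8"))
        actual = call_interface(case["interface"], case["input"])
        report.append({"case": case_file.name,
                       "passed": actual == case["expected"]})
    return report

# Demo with one case file and a stubbed interface call.
def stub(iface, params):
    return {"status": "ok"}

with tempfile.TemporaryDirectory() as d:
    Path(d, "case_001.json").write_text(json.dumps({
        "interface": "orderService.placeOrder",
        "input": {"qty": 1},
        "expected": {"status": "ok"},
    }), encoding="utf-8")
    report = run_cases(d, stub)

print(report)  # [{'case': 'case_001.json', 'passed': True}]
```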
CN202010730587.7A 2020-07-27 2020-07-27 Method and device for automatically generating test cases Active CN112162914B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010730587.7A CN112162914B (en) 2020-07-27 2020-07-27 Method and device for automatically generating test cases

Publications (2)

Publication Number Publication Date
CN112162914A true CN112162914A (en) 2021-01-01
CN112162914B CN112162914B (en) 2024-06-04

Family

ID=73859863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010730587.7A Active CN112162914B (en) 2020-07-27 2020-07-27 Method and device for automatically generating test cases

Country Status (1)

Country Link
CN (1) CN112162914B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113051155A (en) * 2021-03-11 2021-06-29 中国信息通信研究院 Control system and control method of automatic test platform
CN115379209A (en) * 2022-09-14 2022-11-22 北京睿芯高通量科技有限公司 Automatic test system for video coder-decoder

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090083621A (en) * 2008-01-30 2009-08-04 주식회사 국민은행 Test automating system
CN104601403A (en) * 2015-01-07 2015-05-06 上海瀚之友信息技术服务有限公司 Automatic test system
CN105117345A (en) * 2015-09-23 2015-12-02 网易(杭州)网络有限公司 Interface testing method and device for application program
CN105335278A (en) * 2014-06-16 2016-02-17 阿里巴巴集团控股有限公司 Testing method and device
CN108319547A (en) * 2017-01-17 2018-07-24 阿里巴巴集团控股有限公司 Method for generating test case, device and system
CN109460349A (en) * 2018-09-19 2019-03-12 武汉达梦数据库有限公司 A kind of method for generating test case and device based on log
CN109684209A (en) * 2018-12-17 2019-04-26 北京奇虎科技有限公司 A kind of method for generating test case, device and electronic equipment
CN110362497A (en) * 2019-07-23 2019-10-22 上海金融期货信息技术有限公司 Cover the automation api interface test method and system of full unusual character
CN110888804A (en) * 2019-11-11 2020-03-17 网联清算有限公司 Interface test method and interface test platform



Also Published As

Publication number Publication date
CN112162914B (en) 2024-06-04

Similar Documents

Publication Publication Date Title
US8713526B2 (en) Assigning runtime artifacts to software components
US10176079B2 (en) Identification of elements of currently-executing component script
EP3975482B1 (en) Quantitative network testing framework for 5g and subsequent generation networks
CN113422794B (en) Flow recording and playback processing method and device and electronic equipment
CN112162914A (en) Method and device for automatically generating test case
CN101043543A (en) Automatized test tool and method for program controlled exchanger
CN111858330A (en) Test script deployment method, device, equipment and readable medium
CN111159520B (en) Sample identification method, device and safety emergency response system
CN111309590A (en) Automatic testing method and simulator for financial transaction platform
US8670441B2 (en) System and method for migrating a large scale batch of customer accounts from one VoIP system to another VoIP system
CN114138633A (en) Method, device and equipment for testing software based on data driving and readable medium
CN101634965A (en) Method for testing Linux kernel-grade unit
CN112988600A (en) Service scene testing method and device, electronic equipment and storage medium
CN113934642B (en) Software compatibility testing method based on dynamic and static combination
CN110597733A (en) Method and device for testing stability of automatic front end and storage medium
CN113157590B (en) Test case generation method and device
CN114880239A (en) Interface automation testing framework and method based on data driving
CN104424096A (en) Automatic testing system and method for Android platform based device
CN114500348A (en) CDN gateway test method and system
CN113163014A (en) Data transmission method, server and data transmission system
CN115022327B (en) Cloud server control method and device and computer readable storage medium
CN114253867B (en) Automatic testing method, device and system based on neural network model
CN111459833B (en) Method for realizing multi-terminal multi-platform automatic test and monitoring of mobile terminal of government and enterprise
CN116627489A (en) Method and device for analyzing report based on script
CN113886159A (en) Method, device and equipment for server centralized test and readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant