CN113076249A - Automatic test application platform and test method based on enterprise demand development - Google Patents
Automatic test application platform and test method based on enterprise demand development
- Publication number
- CN113076249A CN113076249A CN202110396933.7A CN202110396933A CN113076249A CN 113076249 A CN113076249 A CN 113076249A CN 202110396933 A CN202110396933 A CN 202110396933A CN 113076249 A CN113076249 A CN 113076249A
- Authority
- CN
- China
- Prior art keywords
- management
- carried out
- layer
- case
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000012360 testing method Methods 0.000 title claims abstract description 55
- 238000011161 development Methods 0.000 title claims abstract description 18
- 238000010998 test method Methods 0.000 title abstract description 4
- 238000007726 management method Methods 0.000 claims abstract description 66
- 238000000034 method Methods 0.000 claims abstract description 9
- 238000012544 monitoring process Methods 0.000 claims abstract description 9
- 230000003993 interaction Effects 0.000 claims abstract description 8
- 238000013523 data management Methods 0.000 claims abstract description 7
- 238000012423 maintenance Methods 0.000 claims abstract description 7
- 230000002688 persistence Effects 0.000 claims abstract description 4
- 238000012986 modification Methods 0.000 claims description 10
- 230000004048 modification Effects 0.000 claims description 10
- 238000012217 deletion Methods 0.000 claims description 9
- 230000037430 deletion Effects 0.000 claims description 9
- 230000002085 persistent effect Effects 0.000 claims description 9
- 238000007792 addition Methods 0.000 claims description 6
- 238000013461 design Methods 0.000 claims description 6
- 239000000047 product Substances 0.000 claims description 6
- 230000005540 biological transmission Effects 0.000 claims description 5
- 238000012795 verification Methods 0.000 claims description 5
- 238000004458 analytical method Methods 0.000 claims description 3
- 238000007405 data analysis Methods 0.000 claims description 3
- 238000013439 planning Methods 0.000 claims description 3
- 239000013589 supplement Substances 0.000 claims description 3
- 238000013522 software testing Methods 0.000 abstract description 2
- 238000010586 diagram Methods 0.000 description 11
- 230000008878 coupling Effects 0.000 description 2
- 238000010168 coupling process Methods 0.000 description 2
- 238000005859 coupling reaction Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000006872 improvement Effects 0.000 description 2
- 238000011056 performance test Methods 0.000 description 2
- 238000003326 Quality management system Methods 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 238000000354 decomposition reaction Methods 0.000 description 1
- 238000011981 development test Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000012827 research and development Methods 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000033764 rhythmic process Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Debugging And Monitoring (AREA)
Abstract
The invention provides an automatic test application platform and a test method developed from enterprise requirements, wherein the automatic test application platform comprises a business layer, a presentation layer, a service layer, a persistence layer and a data layer. Compared with conventional software testing solutions, this technical scheme deepens the conventional industry approach to automated test platforms and tailors it to the characteristics of enterprise management: it adopts an automated application platform with a B/S (browser/server) architecture and uses the python + Django development framework to refine the web UI interaction method, lowering the technical threshold and improving the working efficiency of an enterprise-level automated testing platform. The platform greatly shortens the time needed to develop a use case, and has grown into an integrated application platform combining interface testing, performance monitoring, data management, and operation and maintenance.
Description
Technical Field
The invention relates to the technical field of software development, and in particular to an automated testing application platform and testing method developed from enterprise requirements.
Background
Software quality management is a critical part of any science and technology R&D company. In software quality management, if testing is performed entirely by hand, it is difficult to keep up with the pace of development. Introducing automation technology to replace repetitive manual operations and improve testing efficiency is therefore necessary. In recent years, automated testing has evolved from pure code editing to GUI applications, and under the vigorous development of GUI test software, two branches have emerged: the C/S architecture and the B/S architecture. This invention selects an automated application platform with a B/S architecture and uses the python + Django development framework to refine the web UI interaction method, lowering the technical threshold and improving working efficiency.
At present, conventional solutions for automated web test platforms cover only the most central process and do not define the finer details required for enterprise-level application, so they cannot truly meet the needs of enterprise-level development and testing.
Therefore, further improvements are needed in the art.
Disclosure of Invention
To address the problems in the prior art, the invention provides an automatic test application platform and a test method developed from enterprise requirements.
In order to achieve the purpose, the invention adopts the following specific scheme:
An automatic test application platform developed from enterprise requirements is provided, wherein the automatic test application platform comprises a business layer, a presentation layer, a service layer, a persistence layer and a data layer;
the business layer and the presentation layer directly face the user, and exchange business function data and return results through interfaces with the service layer; the service layer stores, supplements and transmits server-side business data to the persistence layer; the persistence layer performs create, delete, update and query operations on the data layer through an ORM framework;
the business layer comprises HTTP interface automation, tool sets, data analysis, UI automation and automated operation and maintenance;
the presentation layer comprises projects, modules, use cases, test plans, mocks, reports, system settings, a data center and a user center;
the service layer comprises use-case loading and scheduling, dynamic data management, rapid scheduling, web services, report generation, mail services, response analysis and verification, scheduled tasks, version management and use-case retry;
the persistence layer comprises platform business data management and product database interaction;
and the data layer comprises a test platform database and a product database.
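As a hedged illustration of the persistence layer's role, the sketch below reduces the ORM-style create/delete/update/query ("add, delete, change and check") operations against the data layer to a plain-Python stand-in; the class and field names are hypothetical, not taken from the patent's actual implementation.

```python
# Minimal in-memory stand-in for the persistence layer's ORM-style
# CRUD operations (all names here are illustrative assumptions).
import itertools

class TestCaseTable:
    """Toy 'model manager' mimicking add/delete/modify/query semantics."""
    def __init__(self):
        self._rows = {}
        self._ids = itertools.count(1)

    def create(self, **fields):                       # add
        pk = next(self._ids)
        self._rows[pk] = dict(id=pk, **fields)
        return pk

    def update(self, pk, **fields):                   # modify
        self._rows[pk].update(fields)

    def delete(self, pk):                             # delete
        del self._rows[pk]

    def filter(self, **criteria):                     # query
        return [r for r in self._rows.values()
                if all(r.get(k) == v for k, v in criteria.items())]

cases = TestCaseTable()
pk = cases.create(name="login ok", module="auth")
cases.update(pk, name="login succeeds")
print(cases.filter(module="auth")[0]["name"])  # -> login succeeds
```

In a real Django deployment these four methods would map onto a model manager's `create`, `update`, `delete` and `filter` calls against MySQL.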
Further, the project component performs project management at the system level;
the module component performs module management: business attribution management within the system, refined module management and customized execution plans;
the use-case component performs use-case management: development, editing, modification and deletion of use cases, management of a unit use-case combination design mode and management of a component combination design mode;
the test plan component manages the task execution plan and planned execution tasks;
the mock component performs mock management, simulating third-party callback responses for testing;
the report component performs report management, managing normative report data of run results at the task level;
the system settings comprise environment settings, task monitoring, global variable settings and global function management: management of different environments, monitoring of timed task progress, and management of variable and function data.
Further, project management comprises project addition, deletion, modification and search, executed per project, together with project-embedded function management; project-private functions are defined through the system's use-case management.
Furthermore, module management comprises module addition, deletion, modification and query, executed per module; use cases are managed by module.
Furthermore, use-case management comprises use-case addition, deletion, modification and retrieval, use-case debugging and execution, use-case coverage statistics, use-case sets and use-case combinations; use cases support editing, statistics, parameterization, verification and free combination.
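The parameterization and verification of a use case described above can be sketched as follows; the template syntax, field names and check format are assumptions for illustration, not the platform's actual API.

```python
# Illustrative sketch of use-case parameterization (variable
# substitution) and verification (assertions on the response).
import string

def parameterize(template: str, variables: dict) -> str:
    """Substitute ${var} placeholders in a use-case field."""
    return string.Template(template).safe_substitute(variables)

def verify(response: dict, checks: list) -> bool:
    """Each check is a (json_key, expected_value) pair."""
    return all(response.get(k) == v for k, v in checks)

case = {
    "url": "/api/login?user=${user}",
    "checks": [("code", 0), ("msg", "ok")],
}
url = parameterize(case["url"], {"user": "alice"})
result = verify({"code": 0, "msg": "ok"}, case["checks"])
print(url, result)  # -> /api/login?user=alice True
```

Free combination then amounts to chaining such parameterized steps, feeding values extracted from one response into the variables of the next.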
Further, task execution plan management supports free sets of use cases and the timed execution of those sets; cross-project use-case combinations are executed at idle time through timed tasks.
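One way to picture the timed execution of a cross-project case set is the stdlib-only sketch below (the real platform reportedly uses Celery; the scheduling policy and names here are illustrative assumptions).

```python
# Sketch of queuing a cross-project use-case set to run after an
# idle-time delay, using the stdlib sched module as a stand-in.
import sched
import time

executed = []

def run_case_set(name, cases):
    # Stand-in for dispatching the set to the execution engine.
    executed.append((name, len(cases)))

def schedule_idle(scheduler, delay_s, name, cases):
    """Queue a case-set run after delay_s seconds (e.g. an idle window)."""
    scheduler.enter(delay_s, 1, run_case_set, argument=(name, cases))

s = sched.scheduler(time.monotonic, time.sleep)
schedule_idle(s, 0, "cross-project-nightly", ["caseA", "caseB", "caseC"])
s.run()
print(executed)  # -> [('cross-project-nightly', 3)]
```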
Further, report management supports report viewing and message sending: report generation, problem inspection and e-mail notification.
Further, mock management comprises adding, deleting, modifying, querying, enabling and disabling stubs; it simulates third-party return data so that a use-case flow can proceed end to end.
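A minimal sketch of such stub management might look like the following: a registry of canned third-party responses with an enable/disable switch, falling through to the real call when disabled. The structure and names are assumptions, not the patent's code.

```python
# Illustrative mock (stub) registry: canned third-party responses with
# enable/disable, standing in for the platform's mock management.
class MockRegistry:
    def __init__(self):
        self._stubs = {}          # path -> [payload, enabled]

    def add(self, path, payload):
        self._stubs[path] = [payload, True]

    def set_enabled(self, path, enabled):
        self._stubs[path][1] = enabled

    def handle(self, path, real_call):
        """Return the stub if enabled, else fall through to the real call."""
        stub = self._stubs.get(path)
        if stub and stub[1]:
            return stub[0]
        return real_call(path)

mocks = MockRegistry()
mocks.add("/pay/notify", {"status": "SUCCESS"})
resp = mocks.handle("/pay/notify", real_call=lambda p: {"status": "TIMEOUT"})
print(resp)  # -> {'status': 'SUCCESS'}
```

With the stub enabled, a payment use case can proceed past the third-party callback even when the external service is unreachable.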
A method for testing with the automatic test application platform developed from enterprise requirements comprises the following steps:
S1, adding a new project;
S2, adding a new module;
S3, adding a new use case;
S4, adding a new execution;
S5, viewing the result;
wherein S3 further includes the steps of:
S31, creating a new use case;
S32, judging whether parameterization is needed;
S33, judging whether verification is needed;
S34, judging whether assembly is needed;
S35, debugging the use case;
S36, judging whether debugging passes;
S37, saving the use case;
wherein S4 further includes the steps of:
S41, selecting a use case;
S42, selecting an environment;
S43, setting a task;
S44, judging whether to execute immediately;
S45, viewing the report.
Further:
in step S32, if parameterization is judged necessary, parameterization is performed; otherwise the method proceeds directly to the next step;
in step S33, if verification is judged necessary, verification is performed; otherwise the method proceeds directly to the next step;
in step S34, if assembly is judged necessary, assembly is performed; otherwise the method proceeds directly to the next step;
in step S36, if debugging passes, the method proceeds directly to the next step; if debugging does not pass, the use case is edited and debugged again;
in step S44, if immediate execution is chosen, the use case is executed immediately and the method proceeds to the next step; otherwise a timed task is set and executed as configured, after which the method proceeds to the next step.
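The branch logic of steps S31-S37 above can be sketched as a small driver; the step names are taken from the method, while the hook functions and return format are hypothetical placeholders for the platform's real operations.

```python
# Sketch of the S31-S37 decision flow (parameterize? verify? assemble?
# then a debug/edit loop until debugging passes, then save).
def build_case(needs_params, needs_verify, needs_assembly, debug_passes):
    steps = ["create"]                       # S31
    if needs_params:
        steps.append("parameterize")         # S32
    if needs_verify:
        steps.append("verify")               # S33
    if needs_assembly:
        steps.append("assemble")             # S34
    while True:
        steps.append("debug")                # S35
        if debug_passes():                   # S36
            break
        steps.append("edit")                 # edit the case, debug again
    steps.append("save")                     # S37
    return steps

attempts = iter([False, True])               # first debug fails, second passes
flow = build_case(True, False, True, lambda: next(attempts))
print(flow)
# -> ['create', 'parameterize', 'assemble', 'debug', 'edit', 'debug', 'save']
```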
By adopting the technical scheme of the invention, the invention has the following beneficial effects:
Compared with conventional software testing solutions, the automatic test application platform developed from enterprise requirements deepens the conventional industry approach to automated test platforms: it adopts an automated testing application platform with a B/S architecture, uses the python + Django development framework to refine the web UI interaction method, lowers the technical threshold, and completes the development of an enterprise-level automated testing platform with improved working efficiency. Developing a use case, which formerly took 5 minutes or more, now takes about 2 minutes. What began as single-interface automated testing has grown into an integrated platform application combining interface testing, performance monitoring, data management, and operation and maintenance.
Drawings
FIG. 1 is a schematic diagram of a structural framework scheme of an embodiment of the present invention;
FIG. 2 is a code framework diagram of an embodiment of the present invention;
FIG. 3 is a system exploded schematic view of an embodiment of the present invention;
FIG. 4 is a schematic diagram of a hierarchical architecture according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a data transmission interaction according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart illustrating the overall operation of an embodiment of the present invention;
fig. 7 is a schematic diagram of use case generation and use case execution according to an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the following figures and specific examples.
The specific principles and steps of the present invention are explained in conjunction with fig. 1-7:
FIG. 1 is a schematic diagram of a structural framework scheme according to an embodiment of the present invention;
the framework scheme of the automatic test platform comprises the following steps: the system comprises project management, module management, operation environment management, use case management, function expansion, variable expansion, mock management, task execution plan management, report management, semi-automatic operation and maintenance, performance test management, development-test quality report management and coupling Zen channel management system. Wherein, project management: project management at a system level; and (3) module management: service attribution management in the system; managing a refinement module, and customizing an execution plan; and (3) managing the running environment: managing an execution environment of the test case; the system operates in multiple environments and is easily suitable for different testing stages; case management: development, editing, modification, deletion and the like of use cases; a unit case combined design mode; a component combination design mode; function expansion: expanding a platform level function; expanding a project level function; expanding variables: platform-based global variables; a global variable of the operating environment; a global variable of a scene; local variables of the use case; mock management: simulating third-party information callback test; managing a task execution plan: performing tasks programmatically; report management: normative report data of the operation result under the task level; semi-automatic operation and maintenance: deploying a code environment; and (3) performance test management: monitoring the response time of the interface test; monitoring performance and early warning performance; development-management of quality of test report: managing version quality data; quality management; coupling the Zen channel management system: and the existing quality management system of the company is coupled for data intercommunication.
FIG. 2 is a code framework diagram of an embodiment of the present invention;
the code framework comprises a Web, a service layer and a data layer, wherein the Web comprises: chrome, IE, Safari; the service layer comprises: nginx, UWSGI; project management, module management, environment configuration, retry/lock; com tasks, function extensions, S/T, Debug; mock, Suite, PageObject; MQ, HttpRunner; celery, Request; reporting the forms; a meditation channel; monitoring the performance; a test tool; MVT, boottrap; the data layer includes: MySQL.
FIG. 3 is a system exploded view of an embodiment of the present invention;
the system decomposition comprises the following steps: the system comprises an automation management module, a data center, an environment management module and an automation execution module.
FIG. 4 is a schematic diagram of a hierarchical architecture according to an embodiment of the present invention;
the system comprises a business layer, a presentation layer, a service layer, a persistence layer and a data layer; the service layer and the presentation layer directly face the user and carry out service function data transmission and result return through the interface and the service layer; the service layer stores, supplements and transmits the data of the server service to the persistent layer; the persistent layer adds, deletes, changes and checks data to the data layer through the ORM framework; the service layer comprises http interface automation, tool sets, data analysis, UI automation and automatic operation and maintenance; the presentation layer comprises projects, modules, use cases, test plans, mock, reports, system settings, a data center and a user center; the service layer comprises case loading and scheduling, data dynamic management, rapid scheduling, web service, report generation, mail service, corresponding analysis and verification, planning tasks, version management and case retry; the persistent layer comprises platform service data management and product database interaction; and the data layer comprises a test platform database and a product database.
FIG. 5 is a schematic diagram of data transmission interaction according to an embodiment of the present invention;
The web browser on the user's computer connects to the Linux server through an Ethernet network communication component.
FIG. 6 is a schematic diagram illustrating the overall operation flow of an embodiment of the present invention;
the method comprises the steps of S1, adding items; s2, adding modules; s3, adding new cases; s4, newly adding execution; and S5, checking the result.
Fig. 7 is a schematic diagram of use case generation and use case execution according to the embodiment of the present invention.
Wherein S3 further comprises the steps of: S31, creating a new use case; S32, judging whether parameterization is needed; S33, judging whether verification is needed; S34, judging whether assembly is needed; S35, debugging the use case; S36, judging whether debugging passes; and S37, saving the use case. S4 further comprises the steps of: S41, selecting a use case; S42, selecting an environment; S43, setting a task; S44, judging whether to execute immediately; and S45, viewing the report.
In step S32, if parameterization is judged necessary, parameterization is performed; otherwise the method proceeds directly to the next step. In step S33, if verification is judged necessary, verification is performed; otherwise the method proceeds directly to the next step. In step S34, if assembly is judged necessary, assembly is performed; otherwise the method proceeds directly to the next step. In step S36, if debugging passes, the method proceeds directly to the next step; if debugging does not pass, the use case is edited and debugged again. In step S44, if immediate execution is chosen, the use case is executed immediately and the method proceeds to the next step; otherwise a timed task is set and executed as configured, after which the method proceeds to the next step.
The above description is only a preferred embodiment of the present invention and is not intended to limit the scope of the invention; all equivalent structural or flow modifications made using the contents of this specification and the accompanying drawings, or direct/indirect applications in other related technical fields, are likewise included within the protection scope of the present invention.
Claims (10)
1. An automatic test application platform developed from enterprise requirements, characterized in that: the platform comprises a business layer, a presentation layer, a service layer, a persistence layer and a data layer;
the business layer and the presentation layer directly face the user, and exchange business function data and return results through interfaces with the service layer; the service layer stores, supplements and transmits server-side business data to the persistence layer; the persistence layer performs create, delete, update and query operations on the data layer through an ORM framework;
the business layer comprises HTTP interface automation, tool sets, data analysis, UI automation and automated operation and maintenance;
the presentation layer comprises projects, modules, use cases, test plans, mocks, reports, system settings, a data center and a user center;
the service layer comprises use-case loading and scheduling, dynamic data management, rapid scheduling, web services, report generation, mail services, response analysis and verification, scheduled tasks, version management and use-case retry;
the persistence layer comprises platform business data management and product database interaction;
and the data layer comprises a test platform database and a product database.
2. The automatic test application platform developed from enterprise requirements of claim 1, characterized in that: in the presentation layer, the project component performs project management at the system level;
the module component performs business attribution management within the system, refined module management and customized execution plans;
the use-case component performs use-case management: development, editing, modification and deletion of use cases, management of a unit use-case combination design mode and management of a component combination design mode;
the test plan component manages planned execution tasks and the task execution plan;
the mock component simulates third-party callback responses for testing and performs mock management;
the report component performs report management, managing normative report data of run results at the task level;
the system settings comprise environment settings, task monitoring, global variable settings and global function management: management of different environments, monitoring of timed task progress, and management of variable and function data.
3. The automatic test application platform developed from enterprise requirements of claim 2, characterized in that: project management comprises project addition, deletion, modification and query, executed per project, together with project-embedded function management; project-private functions are defined through the system's use-case management.
4. The automatic test application platform developed from enterprise requirements of claim 2, characterized in that: module management comprises module addition, deletion, modification and query, executed per module; use cases are managed by module.
5. The automatic test application platform developed from enterprise requirements of claim 2, characterized in that: use-case management comprises use-case addition, deletion, modification and query, use-case debugging and execution, use-case coverage statistics, use-case sets and use-case combinations; use cases support editing, statistics, parameterization, verification and free combination.
6. The automatic test application platform developed from enterprise requirements of claim 2, characterized in that: task execution plan management supports free sets of use cases and the timed execution of those sets; cross-project use-case combinations are executed at idle time through timed tasks.
7. The automatic test application platform developed from enterprise requirements of claim 2, characterized in that: report management supports report viewing and message sending: report generation, problem inspection and e-mail notification.
8. The automatic test application platform developed from enterprise requirements of claim 2, characterized in that: mock management comprises adding, deleting, modifying, querying, enabling and disabling stubs; it simulates third-party return data so that a use-case flow can proceed end to end.
9. A method for testing with the automatic test application platform developed from enterprise requirements, characterized by comprising the following steps:
S1, adding a new project;
S2, adding a new module;
S3, adding a new use case;
S4, adding a new execution;
S5, viewing the result;
wherein S3 further includes the steps of:
S31, creating a new use case;
S32, judging whether parameterization is needed;
S33, judging whether verification is needed;
S34, judging whether assembly is needed;
S35, debugging the use case;
S36, judging whether debugging passes;
S37, saving the use case;
wherein S4 further includes the steps of:
S41, selecting a use case;
S42, selecting an environment;
S43, setting a task;
S44, judging whether to execute immediately;
S45, viewing the report.
10. The method for testing with the automatic test application platform developed from enterprise requirements of claim 9, characterized in that:
in step S32, if parameterization is judged necessary, parameterization is performed; otherwise the method proceeds directly to the next step;
in step S33, if verification is judged necessary, verification is performed; otherwise the method proceeds directly to the next step;
in step S34, if assembly is judged necessary, assembly is performed; otherwise the method proceeds directly to the next step;
in step S36, if debugging passes, the method proceeds directly to the next step; if debugging does not pass, the use case is edited and debugged again;
in step S44, if immediate execution is chosen, the use case is executed immediately and the method proceeds to the next step; otherwise a timed task is set and executed as configured, after which the method proceeds to the next step.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110396933.7A CN113076249B (en) | 2021-04-13 | 2021-04-13 | Automatic test application platform and test method based on enterprise demand development |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110396933.7A CN113076249B (en) | 2021-04-13 | 2021-04-13 | Automatic test application platform and test method based on enterprise demand development |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113076249A true CN113076249A (en) | 2021-07-06 |
CN113076249B CN113076249B (en) | 2024-04-12 |
Family
ID=76617888
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110396933.7A Active CN113076249B (en) | 2021-04-13 | 2021-04-13 | Automatic test application platform and test method based on enterprise demand development |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113076249B (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140366005A1 (en) * | 2013-06-05 | 2014-12-11 | Vmware, Inc. | Abstract layer for automatic user interface testing |
CN104598376A (en) * | 2014-12-30 | 2015-05-06 | 中国科学院计算机网络信息中心 | Data driving layered automation test system and method |
US20160198045A1 (en) * | 2015-01-06 | 2016-07-07 | Cyara Solutions Pty Ltd | Interactive voice response system crawler |
CN107643981A (en) * | 2017-08-29 | 2018-01-30 | 顺丰科技有限公司 | A kind of automatic test platform and operation method of polynary operation flow |
CN107678939A (en) * | 2017-08-29 | 2018-02-09 | 苏州惠邦科信息技术有限公司 | Android terminal emulation test system |
CN110059008A (en) * | 2019-04-12 | 2019-07-26 | 广东电网有限责任公司信息中心 | A kind of test cloud platform system and test method towards power business |
CN110765009A (en) * | 2019-10-10 | 2020-02-07 | 南京创维信息技术研究院有限公司 | Automatic AI voice software testing framework of carrying out |
CN110888802A (en) * | 2019-10-24 | 2020-03-17 | 广州永融科技股份有限公司 | Method for managing test requirement |
CN111092771A (en) * | 2019-12-24 | 2020-05-01 | 浙江航天恒嘉数据科技有限公司 | Internet of things simulation test platform |
CN111162957A (en) * | 2019-11-23 | 2020-05-15 | 卡斯柯信号(郑州)有限公司 | Cloud simulation-based rail transit signal system testing method and device with state cipher algorithm |
CN112181852A (en) * | 2020-10-28 | 2021-01-05 | 深圳市万睿智能科技有限公司 | Interface automatic testing method and device, computer equipment and storage medium |
-
2021
- 2021-04-13 CN CN202110396933.7A patent/CN113076249B/en active Active
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140366005A1 (en) * | 2013-06-05 | 2014-12-11 | Vmware, Inc. | Abstract layer for automatic user interface testing |
CN104598376A (en) * | 2014-12-30 | 2015-05-06 | 中国科学院计算机网络信息中心 | Data driving layered automation test system and method |
US20160198045A1 (en) * | 2015-01-06 | 2016-07-07 | Cyara Solutions Pty Ltd | Interactive voice response system crawler |
CN107643981A (en) * | 2017-08-29 | 2018-01-30 | 顺丰科技有限公司 | A kind of automatic test platform and operation method of polynary operation flow |
CN107678939A (en) * | 2017-08-29 | 2018-02-09 | 苏州惠邦科信息技术有限公司 | Android terminal emulation test system |
CN110059008A (en) * | 2019-04-12 | 2019-07-26 | 广东电网有限责任公司信息中心 | A kind of test cloud platform system and test method towards power business |
CN110765009A (en) * | 2019-10-10 | 2020-02-07 | 南京创维信息技术研究院有限公司 | Automatic AI voice software testing framework of carrying out |
CN110888802A (en) * | 2019-10-24 | 2020-03-17 | 广州永融科技股份有限公司 | Method for managing test requirement |
CN111162957A (en) * | 2019-11-23 | 2020-05-15 | 卡斯柯信号(郑州)有限公司 | Cloud simulation-based rail transit signal system testing method and device with state cipher algorithm |
CN111092771A (en) * | 2019-12-24 | 2020-05-01 | 浙江航天恒嘉数据科技有限公司 | Internet of things simulation test platform |
CN112181852A (en) * | 2020-10-28 | 2021-01-05 | 深圳市万睿智能科技有限公司 | Interface automatic testing method and device, computer equipment and storage medium |
Non-Patent Citations (3)
Title |
---|
BRUNO LEGEARD et al.: "Smartesting CertifyIt: Model-Based Testing for Enterprise IT", 2013 IEEE Sixth International Conference on Software Testing, Verification and Validation, 29 July 2013 (2013-07-29), pages 2159-4848 *
LIU Weipeng: "Implementation and Application of a Continuous-Integration Automated Testing Platform", China Master's Theses Full-text Database, Information Science and Technology, no. 9, 15 September 2019 (2019-09-15), pages 138-334 *
ZHU Hong et al.: "Design and Testing of a Jenkins-Based Automated Test Platform for Mobile Communication Services", Science and Technology & Innovation, no. 7, 15 April 2016 (2016-04-15), pages 12-14 *
Also Published As
Publication number | Publication date |
---|---|
CN113076249B (en) | 2024-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2228726B1 (en) | A method and system for task modeling of mobile phone applications | |
CN105893593B (en) | A kind of method of data fusion | |
US20090177926A1 (en) | Incident simulation support environment | |
CN106933729A (en) | A kind of method of testing and system based on cloud platform | |
US7926024B2 (en) | Method and apparatus for managing complex processes | |
CN112163337A (en) | Avionics collaborative design method and system based on SysML | |
CN107577709B (en) | Graphical management method of information system resource model | |
CN111260251A (en) | Operation and maintenance service management platform and operation method thereof | |
Hammad et al. | Provenance as a service: A data-centric approach for real-time monitoring | |
CN112580199A (en) | Electric power system multidimensional data unified construction system based on CIM model | |
US8819619B2 (en) | Method and system for capturing user interface structure in a model based software system | |
CN115328758A (en) | Performance test method and system for large data volume of industrial software | |
KR20230062761A (en) | System hindrance integration management method | |
KR100496958B1 (en) | System hindrance integration management method | |
CN112286754A (en) | Method and system for realizing modular construction of IT (information technology) resource inspection automation | |
CN113076249B (en) | Automatic test application platform and test method based on enterprise demand development | |
CN116029648A (en) | Relationship modeling management method, device and system based on product BOM structure | |
CN114706738A (en) | Method and device for automatically burying point at client | |
CN114911773A (en) | Universal meta-model design method | |
CN113220592A (en) | Processing method and device for automated testing resources, server and storage medium | |
CN112529467A (en) | Intelligent scheduling system for new media | |
CN114004553B (en) | System, server and client for generating plans in visual mode | |
CN116560637B (en) | Method and system for developing application system in configuration form for digital transformation | |
CN114692382B (en) | Management method and device for nuclear power simulation model development data and computer equipment | |
Ulrich et al. | Reverse engineering models from traces to validate distributed systems–an industrial case study |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |