CN109753430B - Interface test method of ground data processing system - Google Patents

Interface test method of ground data processing system Download PDF

Info

Publication number
CN109753430B
CN109753430B (application CN201811556428.9A; published as CN109753430A)
Authority
CN
China
Prior art keywords
test
simulation
ipf
interface
tested object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811556428.9A
Other languages
Chinese (zh)
Other versions
CN109753430A (en
Inventor
关键
赵生林
成恩伟
崔亮
黄伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Linose Technology Beijing Co ltd
Original Assignee
Linose Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Linose Technology Beijing Co ltd filed Critical Linose Technology Beijing Co ltd
Priority to CN201811556428.9A
Publication of CN109753430A
Application granted
Publication of CN109753430B
Legal status: Active
Anticipated expiration

Links

Images

Landscapes

  • Debugging And Monitoring (AREA)

Abstract

The invention relates to an interface test method for a ground data processing system, suitable for the maintenance, upgrade, and fault diagnosis of satellite ground data processing systems, and in particular for the operation, maintenance, and management of such systems in the aerospace field. The method simulates the satellite operation and control system to command and schedule each data processing system according to the real workflow: it arranges the execution flow, configures operating parameters for each processing system, prepares input processing files, monitors the data processing run in real time, and finally derives a scientific evaluation from the parameters and performance indices observed during the process. This enables automated continuous testing and monitoring that never interrupts the output of technical indices and can continuously track the state of the system under test. Based on the observed trends, it provides a scientific basis for system operation, maintenance, and fault early warning, improving system reliability and extending system service life.

Description

Interface test method of ground data processing system
Technical Field
The invention relates to an interface test method for a ground data processing system, suitable for the maintenance, upgrade, and fault diagnosis of satellite ground data processing systems, and in particular for the operation, maintenance, and management of such systems in the aerospace field.
Background
With social and economic development, the number of commercial, military, and other satellite launches has increased. The large volume of data these satellites acquire every day is sent back to the ground, where many complex data processing systems analyze and process it. Over long periods of operation, the various subsystems and components gradually expose design defects, latent faults, performance degradation, and other problems that affect the normal operation of their functions. Conventional testing is performed only before and after a data processing system goes online; better testing methods would help guarantee the normal operation of each data processing system, extend the system's normal working time and service life, and give support personnel a scientific basis for troubleshooting and fault localization.
Disclosure of Invention
The technical problem solved by the invention is as follows: an interface test method for a ground data processing system in which a simulated satellite operation and control system commands and schedules each data processing system according to the real workflow, arranges the execution flow, configures operating parameters for each processing system, prepares input processing files, monitors the data processing run in real time, and finally derives a scientific evaluation from the parameters and performance indices observed during the process.
The technical solution of the invention is as follows. A test method for a ground data processing system interface includes the following steps:
S1, storing the test data files needed for running the tested object on the server; the test data files comprise test input data files and test output data files;
S2, creating a simulation interface template in the simulation software according to the configuration items of the tested object; the configuration items comprise the types and numbers of the tested object's test input and output data, together with the minimum computer configuration and other software dependencies required to run the tested object; the interface template is the data model that represents these configuration items in the computer;
S3, creating an interface instance of the tested object from the simulation interface template and the test data files, and generating a test task list from the interface instance;
S4, decomposing the test task list into subtasks according to the minimum configuration and other software dependencies, and distributing the subtasks and test data files to the corresponding test node computers; each test node computer executes the subtasks it receives and displays the running status to the user in real time in a browser; after the test task list has been executed, outputting a test result report for the user's reference.
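The dispatch logic of step S4 — matching each subtask against the minimum configuration and software dependencies of the available test nodes — can be sketched as follows. The patent gives no code, so every class, field, and function name here is an illustrative assumption, not part of the disclosed system:

```python
from dataclasses import dataclass

@dataclass
class InterfaceInstance:
    # Hypothetical model of one tested object's interface instance.
    name: str
    min_cpu: int             # minimum CPU cores (configuration item)
    min_mem_gb: int          # minimum memory (configuration item)
    dependencies: frozenset  # other software the tested object requires

@dataclass
class TestNode:
    host: str
    cpu: int
    mem_gb: int
    installed: frozenset     # software available on this node

def dispatch(task_list, nodes):
    """Assign each subtask to the first node that satisfies its minimum
    configuration and software dependencies (step S4)."""
    assignments = {}
    for inst in task_list:
        for node in nodes:
            if (node.cpu >= inst.min_cpu
                    and node.mem_gb >= inst.min_mem_gb
                    and inst.dependencies <= node.installed):
                assignments[inst.name] = node.host
                break
        else:
            raise RuntimeError(f"no node can run {inst.name}")
    return assignments
```

A greedy first-fit match like this is only one possible policy; the patent describes the decomposition and distribution but not the scheduling strategy.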
Furthermore, when there are two or more tested objects with business process relations between them, the interface instances of the tested objects are combined according to the business process relations to generate a flow instance; a flow test task list is generated from the flow instance and executed to start the test; the flow instance is the combination of the interface instances of the tested objects.
Further, if a tested object in the flow instance is unavailable, the user decides whether to simulate that object's input and output so that execution of the flow test task list is not interrupted, and sends a simulation instruction containing all the information about the tested object to be simulated.
Further, the method for simulating the input of a tested object is as follows: receive the simulation instruction and, according to it, create a simulation stub for the tested object in the simulation software; package the simulation stub and upload it to the server; after receiving the simulation instruction, the test node computer downloads the simulation stub from the corresponding address on the server and executes it.
Further, the method for simulating the execution of a tested object is as follows: calculate the simulated processing duration from the configuration items; carry out simulated processing for that duration; and output the simulated process log and progress in real time.
Further, the method for simulating the output of a tested object is as follows: when the simulated processing duration elapses, output an output file to a specified location; the specified location is the input of the next process downstream of the simulated tested object.
Furthermore, multiple interface instances of the tested object may be created from the interface template, with different test task lists correspondingly generated and executed.
Further, from the start to the end of the test, a test log is generated in real time so that testers can monitor and review the execution of the test task list.
Compared with the prior art, the invention has the following advantages:
(1) The invention achieves automated continuous testing and monitoring, never interrupts the output of technical indices, and can continuously track the state of the system under test. Based on the observed trends, it provides a scientific basis for system operation, maintenance, and fault early warning, improving system reliability and extending system service life.
(2) By recording, tracking, and managing the test status of every test configuration item, the method ensures result correctness, interface compatibility, full coverage of test items, and completeness of each configuration item's operating environment, forming a standardized test acceptance flow. It provides a means of verifying the quality of every piece of software in the system, strictly gates quality, and prevents production accidents caused by faulty software entering the production system.
(3) The method can simulate the input and output interfaces and running time of a faulty ground data processing system, so that the upstream and downstream data processing systems connect seamlessly and continue to operate normally.
(4) The management function uses a B/S (browser/server) architecture that is easy to deploy and maintain, while the test nodes use a C/S (client/server) architecture that supports flexible expansion; each node can run a different operating system to suit systems under test with different requirements. This makes it easier to generate more test cases for tested objects, broadens test coverage, and allows single-node tests to be conveniently combined into more complex composite multi-system test cases.
Drawings
FIG. 1 is a block diagram of the process flow of the present invention;
FIG. 2 is a flowchart of the test task scheduling of the present invention;
FIG. 3 is a diagram of a Gitlab directory structure according to an embodiment of the present invention.
Detailed Description
A test method for a ground data processing system interface comprises simulation test management (SMS), interface control and simulation (ISS), service process simulation (OSS), test data management (TDS), test task management (TMS), and simulation test container management (STC).
Simulation test management (SMS) consists of user management, basic information management, simulation test environment management, simulation test operation management, simulation test process monitoring and information collection, and simulation test report management modules.
Interface control and simulation (ISS) consists of interface template management, interface case management, simulation stub management, and log management modules.
Service process simulation (OSS) consists of process template management, process case management, and log management modules.
Test data management (TDS) consists of external data management, scatterometer scientific simulation data management, data synchronization, and log management modules.
Test task management (TMS) consists of test task list management, test task deployment scheme construction, engine driving, fault recovery, and log management modules.
Simulation test container management (STC) consists of tested-system installation and deployment, test data acquisition, cluster node profile acquisition, and log management.
As shown in fig. 1 and 2, the specific process is as follows:
user management phase
1) The administrator logs in to the system and creates and maintains users, roles, permission points, and the correspondences among the three.
System setup phase
1) The administrator creates and maintains system configuration information and test environment (simulation test cluster) information;
2) the tester creates and maintains the tested objects' basic information, the test data, and the correspondence between the two.
Test preparation phase
1) The administrator can maintain the built-in interface templates, simulation stubs, and flow templates;
2) the tester can create and maintain test data entities, interface templates, interface cases, simulation stubs, flow templates, and flow cases.
Test planning phase
1) The tester creates and maintains a test task list, wherein the test task list is a one-time test task plan formed by combining a plurality of interface cases and process cases.
Test execution phase
1) A deployment scheme for the tested object is formed automatically according to the running resources the tested object requires and the resource usage of the simulation test cluster;
2) the user can adjust the deployment scheme the system forms autonomously; the simulation test container management software executes the final deployment scheme and deploys the software under test to the nodes designated in the simulation test cluster;
3) the test task list is executed: the interface cases and flow cases in the test task list are driven to execute automatically;
4) the test task execution process is monitored and controlled: testers can check the test task execution state, measured data, the working status of the simulation test cluster, and other information in real time, and can also issue specific operational controls (including start, stop, and so on) to the comprehensive simulation test subsystem as the test requires;
5) after the test finishes, the software automatically stores the test results in the database.
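Steps 3) through 5) of the execution phase — driving each case in the task list on its assigned node and persisting the results — reduce to a simple loop. `run_case` and `db_save` below are injected stand-ins for the test engine and the database, not names taken from the patent:

```python
def execute_task_list(cases, assignments, run_case, db_save):
    """Drive every interface/flow case in the test task list on its
    assigned node, collect the results, and store them when the run ends."""
    results = []
    for case in cases:
        node = assignments[case]       # node chosen by the deployment scheme
        status = run_case(case, node)  # execute the case, return its status
        results.append({"case": case, "node": node, "status": status})
    db_save(results)                   # step 5: persist results automatically
    return results
```

Injecting the engine and storage callbacks keeps the driver testable in isolation, in the same spirit as the stub-based simulation the patent describes.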
Test evaluation phase
1) An administrator can view the system log;
2) the tester can view the test results, compare the results of several similar test tasks, formulate a document template according to the specification, and output a simulation test report in a uniformly formatted report template.
If a tested object is unavailable, the tester must decide whether the unavailable object needs to be simulated so that the whole flow is not interrupted. The simulation method is as follows:
Create an IPF simulation stub project in a Linux environment, import the dependency library (PIK) required for IPF development, define an IPF processing subclass according to the IPF development specification, and implement the relevant interfaces in that subclass, simulating functions such as run-log output, input-to-output file conversion, and simulated delay.
After the IPF simulation stub is developed, package it together with its dependency library and upload them to a git server. The test engine sends the IPF component download address configured in the use case to the container management program (note: if the use case is configured to use the IPF simulation stub, the download address points to the git address of the stub; otherwise it points to the git address of the real IPF component).
When it receives a request to test an IPF component, the container management program downloads the component and its dependency library from the git server address indicated by the test engine, deploys them to the designated directory on the computing node, and starts the component. If the component is an IPF simulation stub, it outputs a simulated run log (including a progress percentage) while it runs. Based on the input and output file information, the stub copies the input file and renames the copy as the output file (note: the input file should be prepared in advance). If a simulated delay is set, the stub waits for the configured delay time.
One embodiment of the present invention is as follows.
First, open and log in to the simulation test system.
Second, add the tested object. The tested object is created in System Configuration Management - Tested Object Management. Click the add button; a new-tested-object window pops up. Enter the tested object's name; select the system it belongs to, its component code, and its version number; select the system under test; enter the delivery unit's name, contact person, telephone number, and code; and save the data.
Third, add test data. In the System Configuration Management - External Data Management function, the imported data is scanned automatically; select it, click add, and the next operation interface pops up.
Fourth, create an IPF interface template in Test Preparation - IPF Interface Template Management. Click the add icon; a new interface template window pops up. Enter the tested object's name, the interface template name, the version number, remarks, the Job_Order template, and the environment requirement monitoring process.
Fifth, create an IPF interface case in Test Preparation - IPF Interface Case Management. Click the add icon; a new interface case window pops up. Enter the interface template name, the IPF interface case name, the version number, remarks, the Job_Order template, and the path.
Sixth, create an IPF flow template in Test Preparation - IPF Flow Template Management. Draw the corresponding flowchart according to the upstream-downstream relations of the tested objects in the actual flow, and set a name for the new flow template.
Seventh, create an IPF flow instance in Test Preparation - IPF Flow Instance Management. Select a flow template in the interface, then assign an interface case or time simulation stub and other information to each node of the template.
Eighth, create a test task list in Test Execution - Task List Management. Click the add icon; a new task list window pops up. Enter the task list name, test purpose, and remarks; select the flow instance, interface instance, and simulation stub instance; and save the data.
Ninth, execute the task list. Select and execute the task list in the Test Execution - Task List process and start monitoring the test process.
Test evaluation: download the test report in the Test Evaluation - Test Report view. After the test succeeds, open and log in to Gitlab, and upload the IPF program and input data files that passed the test to the Gitlab archive. The internal directory structure after the project is created is shown in FIG. 3.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (8)

1. A test method for a ground data processing system interface, characterized by comprising the following steps:
S1, storing the test data files needed for running the tested object on the server; the test data files comprise test input data files and test output data files;
S2, creating a simulation interface template in the simulation software according to the configuration items of the tested object; the configuration items comprise the types and numbers of the tested object's test input and output data, together with the minimum computer configuration and other software dependencies required to run the tested object; the interface template is the data model that represents these configuration items in the computer;
S3, creating an interface instance of the tested object from the simulation interface template and the test data files, and generating a test task list from the interface instance;
S4, decomposing the test task list into subtasks according to the minimum configuration and other software dependencies, and distributing the subtasks and test data files to the corresponding test node computers; each test node computer executes the subtasks it receives and displays the running status to the user in real time in a browser; after the test task list has been executed, outputting a test result report for the user's reference;
the test evaluation phase comprises:
1) the administrator checks the system log;
2) the tester checks the test results, compares the results of several similar test tasks, formulates a document template according to the specification, and outputs a simulation test report in a uniformly formatted report template;
if a tested object is unavailable, the tester decides whether the unavailable object needs to be simulated; the simulation method is as follows:
creating an IPF simulation stub project in a Linux environment, importing the dependency library required for IPF development, defining an IPF processing subclass according to the IPF development specification, implementing the relevant interfaces in that subclass, and simulating run-log output, input-to-output file conversion, and simulated delay;
after the IPF simulation stub is developed, packaging it together with its dependency library and uploading them to a git server; the test engine sends the IPF component download address configured in the use case to the container management program;
when it receives a request to test an IPF component, the container management program downloads the component and its dependency library from the git server address indicated by the test engine, deploys them to the designated directory on the computing node, and starts the component; if the component is an IPF simulation stub, it outputs a simulated run log while it runs; the stub copies the input file and renames the copy as the output file according to the input and output file information; if a simulated delay is set, the stub waits for the configured delay time;
S3 and S4 specifically include:
creating an IPF interface template in Test Preparation - IPF Interface Template Management;
creating an IPF interface case in Test Preparation - IPF Interface Case Management;
creating an IPF flow template in Test Preparation - IPF Flow Template Management;
creating an IPF flow instance in Test Preparation - IPF Flow Instance Management;
creating a test task list in Test Execution - Task List Management;
and executing the task list.
2. The method of claim 1, characterized in that: when there are two or more tested objects with business process relations between them, the interface instances of the tested objects are combined according to the business process relations to generate a flow instance; a flow test task list is generated from the flow instance and executed to start the test; the flow instance is the combination of the interface instances of the tested objects.
3. The method of claim 2, characterized in that: if a tested object in the flow instance is unavailable, the user decides whether to simulate that object's input and output so that execution of the flow test task list is not interrupted, and sends a simulation instruction containing all the information about the tested object to be simulated.
4. The method for testing a ground data processing system interface of claim 3, wherein the method for simulating the input of a tested object is: receiving the simulation instruction and, according to it, creating a simulation stub for the tested object in the simulation software; packaging the simulation stub and uploading it to the server; after receiving the simulation instruction, the test node computer downloads the simulation stub from the corresponding address on the server and executes it.
5. The method of claim 4, wherein the method for simulating the execution of a tested object is: calculating the simulated processing duration from the configuration items; carrying out simulated processing for that duration; and outputting the simulated process log and progress in real time.
6. The method for testing a ground data processing system interface of claim 5, wherein the method for simulating the output of a tested object is: when the simulated processing duration elapses, outputting an output file to a specified location; the specified location is the input of the next process downstream of the simulated tested object.
7. A method for testing an interface of a ground data processing system according to any one of claims 1 to 6, characterized in that: multiple interface instances of the tested object may be created from the simulation interface template and the test data files, with different test task lists correspondingly generated and executed.
8. A method for testing an interface of a ground data processing system according to any one of claims 1 to 6, characterized in that: from the start to the end of the test, a test log is generated in real time so that testers can monitor and review the execution of the test task list.
CN201811556428.9A 2018-12-19 2018-12-19 Interface test method of ground data processing system Active CN109753430B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811556428.9A CN109753430B (en) 2018-12-19 2018-12-19 Interface test method of ground data processing system


Publications (2)

Publication Number Publication Date
CN109753430A CN109753430A (en) 2019-05-14
CN109753430B true CN109753430B (en) 2022-07-29

Family

ID=66402824

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811556428.9A Active CN109753430B (en) 2018-12-19 2018-12-19 Interface test method of ground data processing system

Country Status (1)

Country Link
CN (1) CN109753430B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110321288B (en) * 2019-06-21 2023-05-12 北京机电工程研究所 Simulation test method for information processing system on aircraft
CN110716817B (en) * 2019-09-10 2024-06-25 中国平安财产保险股份有限公司 System operation fault processing method and device, storage medium and electronic equipment
CN111611444B (en) * 2020-04-22 2023-05-23 国家卫星气象中心(国家空间天气监测预警中心) Universal fault diagnosis system for polar orbit meteorological satellite
CN112834966B (en) * 2020-12-31 2022-03-15 中国科学院微小卫星创新研究院 Automatic test system for satellite electrical interface
CN114297084B (en) * 2021-12-31 2022-08-19 北京航天驭星科技有限公司 Method and device for testing satellite test, operation and control data interface, electronic equipment and medium
CN114817063A (en) * 2022-05-17 2022-07-29 中国联合网络通信集团有限公司 Simulation test method, device and storage medium
CN116627849B (en) * 2023-07-24 2024-01-26 中邮消费金融有限公司 System test method, device, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102541725A (en) * 2010-12-09 2012-07-04 中国科学院沈阳计算技术研究所有限公司 Simulation test method of numerical control system functional module
CN104461854A (en) * 2013-09-12 2015-03-25 中国船舶工业综合技术经济研究院 General simulation testing platform for software of ship equipment and construction method of general simulation testing platform
CN105487977A (en) * 2015-11-30 2016-04-13 北京锐安科技有限公司 Agility-oriented automatic test management system and method
CN108121657A (en) * 2017-11-29 2018-06-05 北京京航计算通讯研究所 Programmable logic device software simulation verification system based on system model

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9389989B2 (en) * 2014-03-19 2016-07-12 International Business Machines Corporation Self verifying device driver for multi-version compatible data manipulation devices


Also Published As

Publication number Publication date
CN109753430A (en) 2019-05-14

Similar Documents

Publication Publication Date Title
CN109753430B (en) Interface test method of ground data processing system
CN105359102B (en) Advanced customer support service-advanced support cloud portal
US10338550B2 (en) Multisite version and upgrade management system
US20200183896A1 (en) Upgrade of heterogeneous multi-instance database clusters
US8984489B2 (en) Quality on submit process
US9350623B2 (en) System and method for automated deployment of multi-component computer environment
US7895565B1 (en) Integrated system and method for validating the functionality and performance of software applications
US7761851B2 (en) Computer method and system for integrating software development and deployment
US20150100829A1 (en) Method and system for selecting and executing test scripts
US20150100832A1 (en) Method and system for selecting and executing test scripts
CN111080257A (en) DevOps-based end-to-end online research and development management system and method
CN104407971A (en) Method for automatically testing embedded software
US20150100830A1 (en) Method and system for selecting and executing test scripts
CN107463362A (en) The method and system of lasting deployment based on multiple Jenkins
CN106933729A (en) A kind of method of testing and system based on cloud platform
US20150100831A1 (en) Method and system for selecting and executing test scripts
CN102799709B (en) System simulation test environment building and configuring system and method based on extensive markup language (XML)
CN107480050B (en) Test method for automatically testing update package
US20220261240A1 (en) Agile, automotive spice, dev ops software development and release management system
Berton et al. The ESOC End-to-End Ground Segment Reference Facility
CN111078524A (en) Continuous integration test method based on electric power 6+1 system
Hill Jr et al. Sequence-of-events-driven automation of the deep space network
CN111382082A (en) Continuous integration test method and device
Kanchana et al. Automated Development and Testing of ECUs in Automotive Industry with Jenkins
Bocchino et al. Industry Best Practices in Robotics Software Engineering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant