CN117061377A - Equipment testing method, device, equipment and storage medium - Google Patents

Equipment testing method, device, equipment and storage medium

Info

Publication number
CN117061377A
CN117061377A (application CN202311091596.6A)
Authority
CN
China
Prior art keywords
test
failed
determining
cases
case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311091596.6A
Other languages
Chinese (zh)
Inventor
段冲磊
马伯祥
聂泽宇
李志宁
吴承泽
陈嘉慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Jiefang Automotive Co Ltd
Original Assignee
FAW Jiefang Automotive Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Jiefang Automotive Co Ltd filed Critical FAW Jiefang Automotive Co Ltd
Priority to CN202311091596.6A priority Critical patent/CN117061377A/en
Publication of CN117061377A publication Critical patent/CN117061377A/en
Pending legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 43/00 - Arrangements for monitoring or testing data switching networks
    • H04L 43/06 - Generation of reports
    • H04L 43/065 - Generation of reports related to network devices
    • H04L 43/50 - Testing arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

The invention discloses a device testing method, apparatus, device and storage medium, relating to the technical field of vehicles. The method comprises the following steps: testing each test item of the device to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed; performing cause analysis on the test cases whose test results are failed according to a preconfigured analysis system, and determining the abnormal cause corresponding to each failed test case; and determining a test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases. According to this technical scheme, the efficiency of automated vehicle testing can be improved, the test result report can be issued automatically without manual work, and labor cost and time cost are reduced.

Description

Equipment testing method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of vehicles, in particular to a device testing method, a device, equipment and a storage medium.
Background
Testing of the vehicle-mounted Ethernet data link layer is mainly directed at the network switching equipment in the vehicle-mounted network: different types of tests are performed on the network switching equipment to judge whether it can operate normally.
At present, in the automated testing of the vehicle-mounted Ethernet, a tester is required to connect the controller to be tested to the test equipment as required, enter the relevant information of the controller to be tested into the upper-computer system of the test equipment, and execute the test after importing the relevant files. After the test is completed, the tester analyses the output test results and the causes behind them and compiles a test report.
However, because the test report has to be compiled and issued by a tester, the test efficiency is low and a large amount of labor cost and time cost is consumed.
Disclosure of Invention
The invention provides a device testing method, apparatus, device and storage medium, which are used to automatically generate the corresponding test result report, improve testing efficiency, and reduce labor cost and time cost.
In a first aspect, an embodiment of the present invention provides a device testing method, including:
testing each test item of the device to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed;
performing cause analysis on the test cases whose test results are failed according to a preconfigured analysis system, and determining the abnormal cause corresponding to each failed test case;
and determining a test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases.
The technical scheme of the embodiment of the invention provides a device testing method comprising: testing each test item of the device to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed; performing cause analysis on the test cases whose test results are failed according to a preconfigured analysis system, and determining the abnormal cause corresponding to each failed test case; and determining a test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases. In this scheme, after the test system receives the test task, it tests each test item of the device to be tested through the test cases and determines whether the test result corresponding to each test case is passed or failed; it then performs cause analysis on the failed test cases according to the preconfigured analysis system and determines the abnormal cause corresponding to each failed test case; finally it determines the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes of the failed test cases. Compared with the vehicle-mounted Ethernet automated testing methods in the prior art, this method can use the preconfigured analysis system to analyse the causes of the failed test cases and determine the abnormal cause corresponding to each of them, so that the test report can be generated automatically without testers manually analysing the failure causes, which improves test efficiency and reduces labor cost and time cost.
Further, before testing each test item of the device to be tested based on the test cases contained in the test task, the method further includes:
acquiring a history test report obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task, wherein the history test report comprises the device information of each historical device to be tested, the test result corresponding to each test case, and the initial abnormal cause corresponding to each failed test case;
clustering the initial abnormal causes corresponding to the test cases whose test results are failed, and determining the standard abnormal cause corresponding to each failed test case;
and configuring the standard abnormal causes corresponding to the failed test cases into an initial system to obtain the analysis system.
Further, after acquiring the history test report obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task, the method further includes:
performing association analysis on the test cases whose test results are failed, determining the association relations among the failed test cases, and determining the associated test case corresponding to each failed test case;
and configuring the associated test cases corresponding to the failed test cases into the initial system to obtain the analysis system.
Further, testing each test item of the device to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed, includes:
for each test case contained in the test task, testing each test item of the device to be tested based on the current test case, and determining whether the test result of the current test case is passed or failed;
when the test result of the current test case is failed, determining that the test result of the associated test case corresponding to the current test case is failed.
Further, performing cause analysis on the test cases whose test results are failed according to the preconfigured analysis system, and determining the abnormal cause corresponding to each failed test case, includes:
querying the preconfigured analysis system based on the test cases whose test results are failed, to obtain the standard abnormal cause corresponding to each failed test case;
and determining the standard abnormal cause corresponding to each failed test case as the abnormal cause corresponding to that failed test case.
Further, determining the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases includes:
determining that the state of the test item corresponding to a failed test case is abnormal, and determining the abnormal cause corresponding to the failed test case as the abnormal cause corresponding to that test item;
determining that the state of the test item corresponding to a passed test case is normal;
and determining the abnormal causes corresponding to the test items whose states are abnormal as the test report of the device to be tested.
Further, after determining the standard abnormal cause corresponding to each failed test case, the method further comprises:
determining the real abnormal cause corresponding to each failed test case;
and if the standard abnormal causes do not include a real abnormal cause, adding that real abnormal cause to the standard abnormal causes.
In a second aspect, an embodiment of the present invention further provides an apparatus testing device, where the apparatus includes:
The determining module is used for testing each test item of the device to be tested based on the test cases contained in the test task and determining whether the test result corresponding to each test case is passed or failed;
the analysis module is used for performing cause analysis on the test cases whose test results are failed according to a preconfigured analysis system and determining the abnormal cause corresponding to each failed test case;
and the execution module is used for determining the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases.
In a third aspect, an embodiment of the present invention further provides a computer apparatus, including:
at least one processor; and a memory communicatively coupled to the at least one processor;
wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the device testing method of any one of the first aspects.
In a fourth aspect, embodiments of the present invention also provide a storage medium containing computer-executable instructions for performing the device testing method of any of the first aspects when executed by a computer processor.
In a fifth aspect, the present invention provides a computer program product comprising computer instructions which, when run on a computer, cause the computer to perform the device testing method as provided in the first aspect.
It should be noted that the above-mentioned computer instructions may be stored in whole or in part on a computer-readable storage medium. The computer readable storage medium may be packaged together with the processor of the device testing apparatus or may be packaged separately from the processor of the device testing apparatus, which is not limited in the present invention.
The description of the second, third, fourth and fifth aspects of the present invention may refer to the detailed description of the first aspect; also, the advantageous effects described in the second aspect, the third aspect, the fourth aspect, and the fifth aspect may refer to the advantageous effect analysis of the first aspect, and are not described herein.
In the present invention, the names of the above-described device testing apparatuses do not constitute limitations on the devices or function modules themselves, and in actual implementations, these devices or function modules may appear under other names. Insofar as the function of each device or function module is similar to that of the present invention, it falls within the scope of the claims of the present invention and the equivalents thereof.
These and other aspects of the invention will be more readily apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a device testing method provided by an embodiment of the present invention;
FIG. 2 is a flow chart of another device testing method provided by an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a device testing apparatus according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms "first" and "second" and the like in the description and in the drawings are used for distinguishing between different objects or between different processes of the same object and not for describing a particular order of objects.
Furthermore, references to the terms "comprising" and "having" and any variations thereof in the description of the present application are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed but may optionally include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Before discussing exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like. Furthermore, embodiments of the application and features of the embodiments may be combined with each other without conflict.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g." in an embodiment should not be taken as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more.
During the automated testing of the vehicle-mounted Ethernet data link layer, a tester is required to connect the controller to be tested to the test equipment as required, enter the related information of the controller into the upper-computer system of the test equipment, and execute the test after importing the related files. After the test is completed, the tester analyses the output test results and the causes behind them and compiles a test report. Because different controllers have different functional strategies and their manufacturers are at different levels, the test scripts need to be changed frequently to adapt to different controllers, which consumes a great deal of labor cost and time cost.
Therefore, the invention provides a device testing method that automatically generates the corresponding test result report, improves testing efficiency, and reduces labor cost and time cost.
The device testing method provided by the invention will be described in detail with reference to the drawings and examples.
Fig. 1 is a flowchart of a device testing method provided in an embodiment of the present invention, where the embodiment may be adapted to a case of automatically generating test reports corresponding to performance tests and functional tests of a network switching device in a vehicle-mounted network, where the method may be performed by a device testing apparatus, and the device testing apparatus may be implemented in a hardware/software manner. The apparatus may be integrated into an electronic device, for example, may be installed in a platform computer, a vehicle-mounted controller, etc., which is not limited in this embodiment of the invention. As shown in fig. 1, the method specifically comprises the following steps:
step 110: and testing each test item of the equipment to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed.
In this embodiment, the test case is a description of a test task performed on a device to be tested, which embodies a test scheme, a method, a technology and a policy. The content of the method comprises a test target, a test environment, input data, a test step, an expected result, a test script and the like, and finally a document is formed. Briefly, a test case may be considered as a set of test inputs, execution conditions, and expected results tailored for a particular goal to verify whether a particular requirement is met.
The device to be tested is a device to be tested, and optionally, the device to be tested may be a vehicle-mounted controller or a vehicle-mounted switch, which is not limited in the embodiment of the present invention.
Specifically, when a test task is executed, the test system tests each test item of the device to be tested through the test case, and determines whether the test result corresponding to each test case passes or fails. The test system is a system loaded with the equipment test method.
In the embodiment of the invention, when a test task is executed, the test system tests each test item of the device to be tested through the test cases, determines whether the test result corresponding to each test case is passed or failed and gives the corresponding result, thereby realizing automatic testing of the device to be tested and obtaining the corresponding test results.
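To make the flow of step 110 concrete, the following is a minimal sketch in Python of how a test task might drive its test cases and record a passed or failed result per case; the TestCase structure and the run callback are illustrative assumptions of this sketch and are not prescribed by the embodiment.

```python
# Minimal sketch of step 110 (assumed names; not the actual test-system API).
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class TestCase:
    case_id: str
    test_item: str            # the test item this case belongs to
    run: Callable[[], bool]   # drives the device under test; True means passed

def execute_task(cases: List[TestCase]) -> Dict[str, str]:
    """Run every test case in the task and record whether it passed or failed."""
    results: Dict[str, str] = {}
    for case in cases:
        results[case.case_id] = "passed" if case.run() else "failed"
    return results
```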
Step 120: performing cause analysis on the test cases whose test results are failed according to a preconfigured analysis system, and determining the abnormal cause corresponding to each failed test case.
In this embodiment, the preconfigured analysis system is obtained by configuring into the initial system the standard abnormal causes corresponding to the test cases that failed in the history test reports of the earlier test period. The standard abnormal causes corresponding to the test cases are obtained by performing cluster analysis on the initial abnormal causes corresponding to those test cases.
Specifically, after the test results are obtained, the causes of the test cases whose test results are failed are analysed through the preconfigured analysis system: the failed test cases are compared with the test cases in the history test reports, and the abnormal causes corresponding to the matching failed test cases in the history test reports are determined as the abnormal causes of the current failed test cases.
In the embodiment of the invention, cause analysis is performed on the failed test cases through the preconfigured analysis system and the abnormal cause corresponding to each failed test case is determined, so that the test results are analysed automatically and the corresponding analysis results are given without manual work, which improves test efficiency and saves labor cost.
Step 130: determining a test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases.
In this embodiment, the abnormal cause refers to the abnormal cause corresponding to a test case whose test result is failed.
Specifically, the test system obtains from the preconfigured analysis system the abnormal causes corresponding to the failed test cases and combines them with the test item and test result corresponding to each test case, thereby obtaining the report of the device to be tested, which covers both the passed and the failed test cases.
In the embodiment of the invention, after the test system obtains the abnormal causes corresponding to the failed test cases, it combines them with the test items and test results corresponding to the test cases, thereby obtaining the report of the device to be tested. The corresponding test report of the device to be tested can thus be generated automatically without manual work, freeing up manpower.
The device testing method provided by the embodiment of the invention comprises: testing each test item of the device to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed; performing cause analysis on the test cases whose test results are failed according to a preconfigured analysis system, and determining the abnormal cause corresponding to each failed test case; and determining a test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases. According to this technical scheme, after the test system receives the test task, it tests each test item of the device to be tested through the test cases and determines whether the test result corresponding to each test case is passed or failed; it then performs cause analysis on the failed test cases according to the preconfigured analysis system and determines the abnormal cause corresponding to each failed test case; finally it determines the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases.
Fig. 2 is a flowchart of another method for testing equipment according to an embodiment of the present invention, which is embodied based on the above embodiment. In this embodiment, the method may further include:
step 210: and acquiring a history test report obtained by testing a plurality of history devices to be tested by the test cases contained in the test task.
Specifically, before a test task is performed, a test system obtains a history test report obtained after a test case of the test task tests a plurality of history devices to be tested. The history test report is obtained by testing the history equipment to be tested through the prior art and analyzing the corresponding test results by combining the testers, and comprises equipment information of each history equipment, the test results corresponding to each test case and initial abnormal reasons corresponding to failed test cases.
In the embodiment of the invention, the obtained historical test report can help a test system to better learn the operation of the test case, accumulate test experience and expand the database of the test system, so that the test result can be judged more accurately and the reasons of failure items can be given.
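As an illustration only, a single history test report with the fields listed above might be modelled as a simple record; the field names below are assumptions made for this sketch rather than a format defined by the embodiment.

```python
# Hypothetical structure of one history test report (field names are assumed).
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class HistoryReport:
    device_info: Dict[str, str]            # information of the historical device under test
    results: Dict[str, str]                # test case id -> "passed" / "failed"
    initial_causes: Dict[str, List[str]]   # failed case id -> recorded initial abnormal causes
```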
Further, after acquiring the history test report obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task, the method further includes:
performing association analysis on the test cases whose test results are failed, determining the association relations among the failed test cases, and determining the associated test case corresponding to each failed test case.
In this embodiment, an associated test case is a test case obtained by performing association analysis on the failed test cases and determining the association relations among them.
For example, whenever the test result of test case A is failed, the test result of test case B is also failed, so test case B is determined to be an associated test case of test case A. In practical application, when a test case passes, its associated test cases are still executed; if the test result of the test case is failed, its associated test cases are skipped.
Specifically, after the history test reports obtained by testing a plurality of historical devices to be tested with the test cases of the test task are acquired, association analysis can further be performed on the test cases whose test results are failed. Association analysis looks for association relations among the test cases; according to the mined association relations, the information of one attribute can be deduced from the information of another attribute. The association analysis may be performed with a corresponding association analysis algorithm; optionally, the algorithm may be Apriori, FP-Tree, Eclat or the grey relational method, which is not limited in the embodiment of the present invention. After the association analysis, the associated test case corresponding to each failed test case is determined. Optionally, the determined associated test case may be contained within the scope of the present test case, may have conditions equal to those of the present test case, and the like, which is not limited in the embodiment of the present invention.
For example, when a test case passes, its associated test cases are executed; if the test result of the test case is failed, its associated test cases are skipped, so that the test time cost is saved as much as possible. For instance, the communication voltage of test case A is 10 to 20 volts and its test result is failed; the communication voltage of its associated test case B is 10 volts, so its test result is certain to be failed, and test case B can be skipped without being executed, which saves test time.
In the embodiment of the invention, by performing association analysis on the failed test cases, it is determined which test cases are affected by a given test case, so that the corresponding associated test cases are obtained, test time is saved and test cost is reduced.
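The sketch below illustrates one very simplified way such associations could be mined from the failed-case sets of the history reports: instead of a full Apriori or FP-Tree run, it links case B to case A when B failed in every report in which A failed. This co-failure counting is an assumption made for illustration, not the algorithm mandated by the embodiment.

```python
# Simplified co-failure mining (illustrative stand-in for Apriori/FP-Tree/Eclat).
from collections import defaultdict
from typing import Dict, Iterable, Set

def mine_associated_cases(history_failures: Iterable[Set[str]],
                          threshold: float = 1.0) -> Dict[str, Set[str]]:
    """history_failures: one set of failed test case ids per history report."""
    fail_count: Dict[str, int] = defaultdict(int)
    co_fail: Dict[str, Dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for failed in history_failures:
        for a in failed:
            fail_count[a] += 1
            for b in failed:
                if b != a:
                    co_fail[a][b] += 1
    associated: Dict[str, Set[str]] = defaultdict(set)
    for a, partners in co_fail.items():
        for b, n in partners.items():
            if n / fail_count[a] >= threshold:   # B failed whenever A failed
                associated[a].add(b)
    return dict(associated)
```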
Step 220: clustering the initial abnormal causes corresponding to the test cases whose test results are failed, and determining the standard abnormal cause corresponding to each failed test case.
In this embodiment, an initial abnormal cause is the cause recorded for a test case whenever its test result was failed during the tests performed in the historical period. Clustering divides a data set into different classes or clusters according to a specific criterion, so that data objects within the same cluster are as similar as possible while data objects in different clusters differ as much as possible; that is, after clustering, data of the same class are gathered together and data of different classes are separated as far as possible. A standard abnormal cause is obtained by clustering the initial abnormal causes corresponding to the failed test cases: similar initial abnormal causes are grouped together to form the standard abnormal cause corresponding to each failed test case.
Specifically, after the history test reports obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task are acquired, the initial abnormal causes corresponding to the failed test cases are clustered: the similarity among the initial abnormal causes is studied to choose a clustering method and confirm the number of classes to be formed, and the result is then evaluated and output, so that the standard abnormal cause corresponding to each failed test case is obtained. Optionally, the clustering method may be a partitioning method, a hierarchical method, a density-based method, a grid-based method or a model-based method, which is not limited in this embodiment of the present invention.
In the embodiment of the invention, clustering the initial abnormal causes reveals the similarities and differences among them, which makes it convenient to determine the high-probability causes of the failed test cases.
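As a hedged illustration of the clustering step, the sketch below groups the initial abnormal-cause strings with TF-IDF features and k-means and keeps the most frequent wording of each cluster as its standard cause; the choice of scikit-learn, of k-means and of the cluster count is an assumption, since the embodiment leaves the clustering method open.

```python
# Illustrative clustering of initial abnormal causes (method and library are assumed).
from typing import Dict, List
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

def derive_standard_causes(initial_causes: List[str], n_clusters: int = 5) -> Dict[int, str]:
    """Group similar cause strings and pick one representative wording per cluster."""
    matrix = TfidfVectorizer().fit_transform(initial_causes)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(matrix)
    standard: Dict[int, str] = {}
    for label in set(labels):
        members = [c for c, l in zip(initial_causes, labels) if l == label]
        standard[int(label)] = max(set(members), key=members.count)  # most frequent wording
    return standard
```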
In one embodiment, step 220 further comprises:
determining the real abnormal cause corresponding to each failed test case; and if the standard abnormal causes do not include a real abnormal cause, adding that real abnormal cause to the standard abnormal causes.
Specifically, after the test task is executed, the real abnormal cause corresponding to each failed test case can be determined; optionally, the real abnormal causes are analysed by the testers and/or obtained by the test system after training. The real abnormal causes are compared one by one with the existing standard abnormal causes, and if the existing standard abnormal causes do not include a real abnormal cause, that real abnormal cause is added to the standard abnormal causes.
In the embodiment of the invention, after the real abnormal cause corresponding to each failed test case is determined, if the standard abnormal causes do not include a real abnormal cause, that cause is added to the standard abnormal causes. This helps the test system improve itself continuously, enriches its experience and improves its working efficiency.
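A minimal sketch of this self-improvement step, assuming the standard abnormal causes are kept as a simple list of strings:

```python
# Add any newly observed real abnormal cause that is not yet a standard cause.
from typing import List

def update_standard_causes(standard_causes: List[str], real_causes: List[str]) -> List[str]:
    for cause in real_causes:
        if cause not in standard_causes:
            standard_causes.append(cause)
    return standard_causes
```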
Step 230: configuring the standard abnormal causes corresponding to the failed test cases into the initial system to obtain the analysis system.
In this embodiment, the initial system is a test system for which an analysis system has not been configured.
Specifically, before the test task is executed, the standard abnormal causes corresponding to the failed test cases are configured into the original test system to obtain the analysis system, which has the function of analysing the test results obtained after the device to be tested is tested.
In the embodiment of the invention, the standard abnormal causes corresponding to the failed test cases are configured into the initial system to obtain the analysis system; when a test task is executed, cause analysis is then performed on the failed test cases according to the configured analysis system, so that the abnormal cause corresponding to each failed test case can be obtained. This helps the test system learn the behaviour of the test cases, accumulate test experience and expand its database, so that test results can be judged more accurately, test time is saved, the working efficiency of the test system is improved and the test cost is reduced.
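For illustration, the configuration step might amount to attaching the derived standard causes (and, per the related embodiment below, the associated-case mapping) to an otherwise empty analysis object; this in-memory representation is an assumption of the sketch, not the patented system.

```python
# Hypothetical in-memory representation of the configured analysis system.
from dataclasses import dataclass, field
from typing import Dict, Set

@dataclass
class AnalysisSystem:
    standard_causes: Dict[str, str] = field(default_factory=dict)     # failed case id -> standard cause
    associated_cases: Dict[str, Set[str]] = field(default_factory=dict)

def configure_analysis_system(standard_causes: Dict[str, str],
                              associated_cases: Dict[str, Set[str]]) -> AnalysisSystem:
    return AnalysisSystem(standard_causes=dict(standard_causes),
                          associated_cases={k: set(v) for k, v in associated_cases.items()})
```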
In one embodiment, step 230 further comprises:
and configuring the associated test cases corresponding to the failed test cases into the initial system to obtain the analysis system.
Specifically, after the history test reports obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task are acquired, association analysis is performed on the failed test cases to determine the corresponding associated test cases, and the associated test cases of the failed test cases are then configured into the initial test system to obtain the analysis system.
In the embodiment of the invention, by configuring the associated test cases corresponding to the failed test cases into the initial system, the corresponding analysis system can be obtained, so that the test system has the corresponding test experience, judges test results more accurately, saves test time and reduces time cost.
Step 240: testing each test item of the device to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed.
In one embodiment, step 240 further comprises:
for each test case contained in the test task, testing each test item of the device to be tested based on the current test case, and determining whether the test result of the current test case is passed or failed; when the test result of the current test case is failed, determining that the test result of the associated test case corresponding to the current test case is failed.
Specifically, for each test case contained in the test task, when each test item of the device to be tested is tested with the current test case, the test result of the current test case is determined. Of course, to increase the test speed, when the test result of the current test case is determined to be failed, the test result of the associated test case corresponding to the current test case may also be determined to be failed.
In the embodiment of the invention, the test result of the associated test case corresponding to a failed test case is also determined to be failed, which shortens the test cycle and improves the test efficiency.
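The skip behaviour described above could look roughly like the following sketch, where run_case and the associated-case mapping are assumed inputs of the example rather than parts of the patented system:

```python
# Sketch of step 240: mark the associated cases of a failed case as failed without running them.
from typing import Callable, Dict, Iterable, Mapping, Set

def execute_with_skipping(case_ids: Iterable[str],
                          associated_cases: Mapping[str, Set[str]],
                          run_case: Callable[[str], bool]) -> Dict[str, str]:
    results: Dict[str, str] = {}
    for case_id in case_ids:
        if results.get(case_id) == "failed":       # already failed through an association
            continue
        passed = run_case(case_id)
        results[case_id] = "passed" if passed else "failed"
        if not passed:
            for other in associated_cases.get(case_id, set()):
                results[other] = "failed"          # associated cases are skipped
    return results
```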
step 250: and carrying out cause analysis on the test cases with failed test results according to a preconfigured analysis system, and determining that the test results are abnormal causes corresponding to the failed test cases.
In one embodiment, step 250 further comprises:
querying the preconfigured analysis system based on the test cases whose test results are failed, to obtain the standard abnormal cause corresponding to each failed test case; and determining the standard abnormal cause corresponding to each failed test case as the abnormal cause corresponding to that failed test case.
Specifically, after the test results corresponding to the test cases are obtained, the failed test cases are compared one by one with the original test cases in the preconfigured analysis system, the standard abnormal cause recorded for each failed test case is queried, and that standard abnormal cause is then determined as the abnormal cause corresponding to the failed test case.
In the embodiment of the invention, the standard abnormal cause of a failed test case is queried in the preconfigured analysis system and then determined as the abnormal cause corresponding to the failed test case, which saves test time and reduces test cost.
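Assuming the analysis system exposes the standard causes as a lookup table keyed by test case id, the query of step 250 reduces to the following sketch; the names are illustrative assumptions.

```python
# Look up the standard abnormal cause for every failed test case (names assumed).
from typing import Dict, Mapping

def lookup_exception_causes(results: Mapping[str, str],
                            standard_causes: Mapping[str, str]) -> Dict[str, str]:
    return {case_id: standard_causes.get(case_id, "cause not yet recorded")
            for case_id, outcome in results.items() if outcome == "failed"}
```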
Step 260: determining the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases.
In one embodiment, step 260 further comprises:
determining that the state of the test item corresponding to a failed test case is abnormal, and determining the abnormal cause corresponding to the failed test case as the abnormal cause corresponding to that test item; determining that the state of the test item corresponding to a passed test case is normal; and determining the abnormal causes corresponding to the test items whose states are abnormal as the test report of the device to be tested.
Specifically, the state of the test item corresponding to a test case is determined from the test result of that test case: if the test result of the test case is passed, the state of the corresponding test item is marked as normal; if the test result is failed, the state of the corresponding test item is marked as abnormal, and the abnormal cause corresponding to that test item is determined from the abnormal cause corresponding to the failed test case. Finally, the test report of the device to be tested is obtained from the states of the test items, the test results of the test cases and the abnormal causes corresponding to the test items whose states are abnormal.
In the embodiment of the invention, the state of each test item is determined from the test result of the corresponding test case, the abnormal cause corresponding to the test item is determined from the abnormal cause of the failed test case, and the test report of the device to be tested is then generated from the states of the test items, the test results of the test cases and the abnormal causes corresponding to the test items. This realises the automatic provision of the test report, improves test efficiency, saves manpower and reduces labor cost.
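Putting the pieces together, a report for step 260 might be assembled as below; the mapping from test case to test item and the report layout are assumptions of this sketch, not a prescribed format.

```python
# Sketch of step 260: derive per-item status and attach the cause of each abnormal item.
from typing import Dict, Mapping

def build_report(device_info: Mapping[str, str],
                 case_to_item: Mapping[str, str],
                 results: Mapping[str, str],
                 causes: Mapping[str, str]) -> Dict:
    report: Dict = {"device": dict(device_info), "items": {}}
    for case_id, outcome in results.items():
        item = case_to_item[case_id]
        entry = report["items"].setdefault(item, {"status": "normal", "cause": None})
        if outcome == "failed":
            entry["status"] = "abnormal"
            entry["cause"] = causes.get(case_id)
    return report
```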
The device testing method provided by the embodiment of the invention comprises: acquiring history test reports obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task; clustering the initial abnormal causes corresponding to the test cases whose test results are failed, and determining the standard abnormal cause corresponding to each failed test case; configuring the standard abnormal causes corresponding to the failed test cases into an initial system to obtain an analysis system; testing each test item of the device to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed; performing cause analysis on the failed test cases according to the preconfigured analysis system, and determining the abnormal cause corresponding to each failed test case; and determining the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases. According to this technical scheme, the history test reports are acquired, the initial abnormal causes corresponding to the failed test cases are clustered, and the standard abnormal cause corresponding to each failed test case is determined; the standard abnormal causes are then configured into the initial system to obtain the analysis system. After the test system receives the test task, it tests each test item of the device to be tested through the test cases, determines whether the test result corresponding to each test case is passed or failed, performs cause analysis on the failed test cases according to the preconfigured analysis system, determines the abnormal cause corresponding to each failed test case, and finally determines the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases. Compared with the prior art, this helps the test system learn the behaviour of the test cases, accumulate test experience, judge test results more accurately and give the causes of failed items.
Clustering the initial abnormal causes corresponding to the failed test cases and determining the standard abnormal cause corresponding to each failed test case reveals the similarities and differences among the initial abnormal causes, so that testers can better understand the abnormal causes and extract useful information from them. When a test task is executed, the standard abnormal causes corresponding to the failed test cases are configured into the initial system to obtain the analysis system, so that cause analysis can be performed on the failed test cases according to the configured analysis system and the abnormal cause corresponding to each failed test case can be obtained, which improves the working efficiency of the test system. Furthermore, the test report of the device to be tested can be determined according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases, which solves the problem that, for every test task, the test items must be checked manually, the causes of failure analysed manually and the test report issued manually. The failed test results can thus be analysed and the test result report obtained without manual work, which improves the continuity and efficiency of automated testing and reduces labor cost and time cost.
On the other hand, learning from the history test reports helps the test system accumulate test experience, judge test results more accurately and give the standard abnormal causes; and if the standard abnormal causes do not include a real abnormal cause, that real abnormal cause is added to the standard abnormal causes, so that the test system continuously improves itself, enriches its experience, gives increasingly accurate abnormal causes and adapts better and better to controllers with different strategies. Meanwhile, by determining the associated test cases, after one test case fails, the other test cases affected by it can be skipped automatically, which shortens the test cycle.
Fig. 3 is a block diagram of a device testing apparatus according to an embodiment of the present invention, where the device may be adapted to automatically generate a test report corresponding to a performance test and a functional test of a network switching device in a vehicle-mounted network, so as to improve efficiency of an automated test. The apparatus may be implemented in software and/or hardware and is typically integrated in an electronic device, such as a tablet computer.
As shown in fig. 3, the apparatus includes:
a determining module 310, configured to test each test item of the device to be tested based on the test cases contained in the test task and determine whether the test result corresponding to each test case is passed or failed;
an analysis module 320, configured to perform cause analysis on the test cases whose test results are failed according to a preconfigured analysis system and determine the abnormal cause corresponding to each failed test case;
and an execution module 330, configured to determine the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases.
The device testing apparatus provided by this embodiment tests each test item of the device to be tested based on the test cases contained in the test task, determines whether the test result corresponding to each test case is passed or failed, performs cause analysis on the failed test cases according to a preconfigured analysis system, determines the abnormal cause corresponding to each failed test case, and determines the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases. According to this technical scheme, after the test system receives the test task, it tests each test item of the device to be tested through the test cases, determines whether the test result corresponding to each test case is passed or failed, then performs cause analysis on the failed test cases according to the preconfigured analysis system, determines the abnormal cause corresponding to each failed test case, and finally determines the test report of the device to be tested according to the test item and test result corresponding to each test case and the abnormal causes corresponding to the failed test cases.
On the basis of the above embodiment, the determining module 310 is further configured to:
before each test item of the device to be tested is tested based on the test cases contained in the test task, acquire a history test report obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task, wherein the history test report comprises the device information of each historical device, the test result corresponding to each test case, and the initial abnormal cause corresponding to each failed test case; cluster the initial abnormal causes corresponding to the failed test cases, and determine the standard abnormal cause corresponding to each failed test case; and configure the standard abnormal causes corresponding to the failed test cases into an initial system to obtain the analysis system.
On the basis of the above embodiment, the determining module 310 is further configured to:
after the history test report obtained by testing a plurality of historical devices to be tested with the test cases contained in the test task is acquired, perform association analysis on the test cases whose test results are failed, determine the association relations among the failed test cases, and determine the associated test case corresponding to each failed test case; and configure the associated test cases corresponding to the failed test cases into the initial system to obtain the analysis system.
Based on the above embodiment, the determining module 310 is specifically configured to:
for each test case contained in the test task, test each test item of the device to be tested based on the current test case, and determine whether the test result of the current test case is passed or failed; and when the test result of the current test case is failed, determine that the test result of the associated test case corresponding to the current test case is failed.
Based on the above embodiment, the analysis module 320 is specifically configured to:
query the preconfigured analysis system based on the test cases whose test results are failed, to obtain the standard abnormal cause corresponding to each failed test case; and determine the standard abnormal cause corresponding to each failed test case as the abnormal cause corresponding to that failed test case.
Based on the above embodiment, the execution module 330 is specifically configured to:
determine that the state of the test item corresponding to a failed test case is abnormal, and determine the abnormal cause corresponding to the failed test case as the abnormal cause corresponding to that test item; determine that the state of the test item corresponding to a passed test case is normal; and determine the abnormal causes corresponding to the test items whose states are abnormal as the test report of the device to be tested.
The device testing device provided by the embodiment of the invention can execute the device testing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the device testing method.
It should be noted that, in the embodiment of the device testing apparatus described above, each unit and module included are only divided according to the functional logic, but not limited to the above-described division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for distinguishing from each other, and are not used to limit the protection scope of the present invention.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. Fig. 4 shows a block diagram of an exemplary electronic device 4 suitable for use in implementing embodiments of the invention. The electronic device 4 shown in fig. 4 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the invention.
As shown in fig. 4, the electronic device 4 is in the form of a general purpose computing electronic device. The components of the electronic device 4 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, a bus 18 that connects the various system components, including the system memory 28 and the processing units 16.
Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic device 4 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by electronic device 4 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. Electronic device 4 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard disk drive"). Although not shown in fig. 4, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. The system memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
Electronic device 4 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), one or more devices that enable a user to interact with electronic device 4, and/or any devices (e.g., network card, modem, etc.) that enable electronic device 4 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, the electronic device 4 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through the network adapter 20. As shown in fig. 4, the network adapter 20 communicates with other modules of the electronic device 4 via the bus 18. It should be appreciated that although not shown in fig. 4, other hardware and/or software modules may be used in connection with electronic device 4, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and page display by running programs stored in the system memory 28, for example, implementing the device testing method provided by the embodiments of the present invention. The method includes:
testing each test item of the equipment to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed;
according to a preconfigured analysis system, performing cause analysis on the test cases whose test results are failed, and determining the abnormality cause corresponding to each failed test case;
and determining a test report of the equipment to be tested according to the test item and the test result corresponding to each test case, and the abnormality cause corresponding to each failed test case.
Of course, those skilled in the art will understand that the processor may also implement the technical solution of the device testing method provided in any embodiment of the present invention.
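By way of illustration only, the overall flow of testing, cause analysis, and report generation described above can be expressed as the minimal Python sketch below. It is not the claimed implementation: the tuple-based test task, the callable test cases, and the dictionary-shaped analysis system are assumptions introduced purely for readability.

def run_test_task(test_task, analysis_system):
    results = {}             # case_id -> "pass" / "fail"
    abnormality_causes = {}  # case_id -> cause, for failed cases only

    # Step 1: run every test case against its test item and record pass/fail.
    for test_item, case_id, test_callable in test_task:
        results[case_id] = "pass" if test_callable() else "fail"

    # Step 2: for each failed case, look up the cause in the preconfigured analysis system.
    for case_id, outcome in results.items():
        if outcome == "fail":
            abnormality_causes[case_id] = analysis_system.get(case_id, "unknown cause")

    # Step 3: assemble the test report from test items, results, and abnormality causes.
    return {
        "items": {case_id: test_item for test_item, case_id, _ in test_task},
        "results": results,
        "abnormality_causes": abnormality_causes,
    }

A caller might invoke it as run_test_task([("CAN communication", "TC_001", check_can_link)], {"TC_001": "bus-off detected"}); the item name, case identifier, test function, and cause text here are invented solely for the example.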
An embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the device testing method provided by the embodiments of the present invention. The method includes:
testing each test item of the equipment to be tested based on the test cases contained in the test task, and determining whether the test result corresponding to each test case is passed or failed;
according to a preconfigured analysis system, performing cause analysis on the test cases whose test results are failed, and determining the abnormality cause corresponding to each failed test case;
and determining a test report of the equipment to be tested according to the test item and the test result corresponding to each test case, and the abnormality cause corresponding to each failed test case.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium may be, for example, but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
It will be appreciated by those of ordinary skill in the art that the modules or steps of the invention described above may be implemented on a general-purpose computing device; they may be centralized on a single computing device or distributed over a network of computing devices. Alternatively, they may be implemented in program code executable by a computing device, so that they are stored in a memory device and executed by the computing device, or they may be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
In addition, the acquisition, storage, use, and processing of data in the technical solution of the present invention comply with the relevant provisions of national laws and regulations.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions can be made by those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to those embodiments, and may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (10)

1. A device testing method, comprising:
testing each test item of equipment to be tested based on test cases contained in a test task, and determining whether a test result corresponding to each test case is passed or failed;
according to a preconfigured analysis system, performing cause analysis on the test cases whose test results are failed, and determining an abnormality cause corresponding to each test case whose test result is failed;
and determining a test report of the equipment to be tested according to the test item and the test result corresponding to each test case, and the abnormality cause corresponding to each test case whose test result is failed.
2. The device testing method according to claim 1, further comprising, prior to testing each test item of the equipment to be tested based on the test cases contained in the test task:
acquiring historical test reports obtained by testing a plurality of pieces of historical equipment to be tested with the test cases contained in the test task, wherein the historical test reports comprise device information of each piece of historical equipment to be tested, the test result corresponding to each test case, and an initial abnormality reason corresponding to each failed test case;
clustering the initial abnormality reasons corresponding to the test cases whose test results are failed, and determining a standard abnormality reason corresponding to each failed test case;
and configuring the standard abnormality reasons corresponding to the failed test cases into an initial system to obtain the analysis system.
3. The device testing method according to claim 2, further comprising, after acquiring the historical test reports obtained by testing the plurality of pieces of historical equipment to be tested with the test cases contained in the test task:
performing association analysis on the test cases whose test results are failed, determining association relationships among the failed test cases, and determining the associated test cases corresponding to each failed test case;
and configuring the associated test cases corresponding to the failed test cases into the initial system to obtain the analysis system.
4. The device testing method according to claim 1, wherein testing each test item of the equipment to be tested based on the test cases contained in the test task and determining whether the test result corresponding to each test case is passed or failed comprises:
for each test case contained in the test task, testing each test item of the equipment to be tested based on the current test case, and determining whether the test result of the current test case is passed or failed;
and when the test result of the current test case is failed, determining that the test result of the associated test case corresponding to the current test case is failed.
5. The device testing method according to claim 1, wherein performing cause analysis on the test cases whose test results are failed according to the preconfigured analysis system, and determining the abnormality cause corresponding to each failed test case, comprises:
querying the preconfigured analysis system based on the test cases whose test results are failed, to obtain a standard abnormality reason corresponding to each failed test case;
and determining the standard abnormality reason corresponding to each failed test case as the abnormality cause corresponding to that failed test case.
6. The device testing method according to claim 1, wherein determining the test report of the equipment to be tested according to the test item and the test result corresponding to each test case, and the abnormality cause corresponding to each failed test case, further comprises:
determining that the state of the test item corresponding to each failed test case is abnormal, and determining the abnormality cause corresponding to the failed test case as the abnormality cause corresponding to that test item;
determining that the state of the test item corresponding to each passed test case is normal;
and determining the abnormality causes corresponding to the test items whose states are abnormal as the test report of the equipment to be tested.
7. The device testing method according to claim 2, further comprising, after determining the standard abnormality reason corresponding to each failed test case:
determining a real abnormality cause corresponding to each failed test case;
and if the standard abnormality reasons do not include the real abnormality cause, adding the real abnormality cause to the standard abnormality reasons.
8. A device testing apparatus, comprising:
a determining module, configured to test each test item of equipment to be tested based on test cases contained in a test task, and determine whether a test result corresponding to each test case is passed or failed;
an analysis module, configured to perform cause analysis on the test cases whose test results are failed according to a preconfigured analysis system, and determine an abnormality cause corresponding to each failed test case;
and an execution module, configured to determine a test report of the equipment to be tested according to the test item and the test result corresponding to each test case, and the abnormality cause corresponding to each failed test case.
9. A computer device, the computer device comprising:
at least one processor; and a memory communicatively coupled to the at least one processor;
wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the device testing method of any one of claims 1-7.
10. A storage medium containing computer executable instructions which, when executed by a computer processor, are for performing the device testing method of any of claims 1-7.
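As an illustrative note (not part of the claims), the clustering step recited in claim 2 can be pictured with the following Python sketch. The record format, the text-normalization grouping key, and the dictionary-shaped analysis system are assumptions; the claim does not prescribe any particular clustering algorithm or data structure.

from collections import defaultdict

def build_analysis_system(historical_failed_records):
    # Group the free-text initial abnormality reasons; a normalized string is used
    # as the grouping key here only as a stand-in for a real clustering method.
    clusters = defaultdict(list)
    for record in historical_failed_records:
        key = " ".join(record["initial_reason"].lower().split())
        clusters[key].append(record)

    # Configure each failed case with the standard reason of its cluster,
    # yielding the lookup table that acts as the analysis system.
    analysis_system = {}
    for standard_reason, records in clusters.items():
        for record in records:
            analysis_system[record["case_id"]] = standard_reason
    return analysis_system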
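Along the same lines, the association handling of claims 3 and 4 and the lookup of claim 5 might be sketched as follows, again under assumed data structures: an associations mapping mined from historical reports and the dictionary-based analysis system from the previous sketch.

def apply_associations_and_lookup(results, associations, analysis_system):
    # Claim 4: when a case fails, mark its associated test cases as failed too
    # (a single propagation hop, for brevity).
    for case_id, outcome in list(results.items()):
        if outcome == "fail":
            for associated_id in associations.get(case_id, []):
                results[associated_id] = "fail"

    # Claim 5: query the preconfigured analysis system for every failed case.
    abnormality_causes = {
        case_id: analysis_system.get(case_id, "unknown cause")
        for case_id, outcome in results.items()
        if outcome == "fail"
    }
    return results, abnormality_causes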
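Finally, the maintenance step of claim 7, supplementing the standard abnormality reasons with newly confirmed real causes, could look like the sketch below, with standard_reasons (a set) and real_causes (a mapping from case identifier to confirmed cause) as assumed inputs.

def update_standard_reasons(standard_reasons, real_causes):
    # Claim 7: any confirmed real cause not yet among the standard reasons is added.
    for case_id, real_cause in real_causes.items():
        if real_cause not in standard_reasons:
            standard_reasons.add(real_cause)
    return standard_reasons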
CN202311091596.6A 2023-08-28 2023-08-28 Equipment testing method, device, equipment and storage medium Pending CN117061377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311091596.6A CN117061377A (en) 2023-08-28 2023-08-28 Equipment testing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311091596.6A CN117061377A (en) 2023-08-28 2023-08-28 Equipment testing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117061377A true CN117061377A (en) 2023-11-14

Family

ID=88658823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311091596.6A Pending CN117061377A (en) 2023-08-28 2023-08-28 Equipment testing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117061377A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination