CN114124769A - Base station testing method and device, electronic equipment and storage medium - Google Patents

Base station testing method and device, electronic equipment and storage medium

Info

Publication number
CN114124769A
Authority
CN
China
Prior art keywords
test
task
newly added
test task
testing
Prior art date
Legal status
Granted
Application number
CN202010797450.3A
Other languages
Chinese (zh)
Other versions
CN114124769B (en)
Inventor
王岩 (Wang Yan)
Current Assignee
Datang Mobile Communications Equipment Co Ltd
Original Assignee
Datang Mobile Communications Equipment Co Ltd
Priority date
Filing date
Publication date
Application filed by Datang Mobile Communications Equipment Co Ltd filed Critical Datang Mobile Communications Equipment Co Ltd
Priority to CN202010797450.3A priority Critical patent/CN114124769B/en
Publication of CN114124769A publication Critical patent/CN114124769A/en
Application granted granted Critical
Publication of CN114124769B publication Critical patent/CN114124769B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 43/00: Arrangements for monitoring or testing data switching networks
    • H04L 43/50: Testing arrangements

Abstract

The invention provides a base station testing method and device, an electronic device and a storage medium, and relates to the field of communications technologies. The method comprises the following steps: determining a test task; if the test task is a newly added test task, obtaining description information of the test task, wherein the description information comprises an importance level and a test specification of the test task; under the condition that the importance level meets a preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task, and the test step comprises an automation identifier; if the value of the automation identifier is a target value, acquiring a keyword corresponding to the test step and a configuration parameter of the keyword; and calling a corresponding keyword function according to the configuration parameter of the keyword corresponding to the test step for testing. The invention meets the function expansion requirement of the base station, improves the test efficiency and standardizes the test flow.

Description

Base station testing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method and an apparatus for testing a base station, an electronic device, and a storage medium.
Background
In a communication network, a base station is the bridge between user terminals and the core network, and factors such as its operating performance determine the communication quality of the whole network. Testing base stations to ensure the stability of their operating performance is therefore essential to guaranteeing communication quality.
Existing operation and maintenance (OM) testing is performed by base station maintainers, who must manually test every base station in the communication network through the OM center.
However, as base station versions are continuously upgraded, base station functions keep expanding and OM testing of base stations becomes increasingly complex. Manual testing can hardly meet the function expansion requirement of the base station, its efficiency is low, and the testing is not standardized.
Disclosure of Invention
The invention provides a base station testing method and device, electronic equipment and a storage medium, and aims to solve the problems in the prior art that manual testing can hardly meet the function expansion requirement of the base station, that testing efficiency is low, and that testing is not standardized.
According to a first aspect of the present invention, there is provided a base station testing method, the method comprising:
determining a test task;
if the test task is a newly added test task, obtaining description information of the test task, wherein the description information comprises an importance level and a test specification of the test task;
under the condition that the importance level meets a preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task, and the test step comprises an automation identifier;
if the value of the automation identifier is a target value, acquiring a keyword corresponding to the testing step and a configuration parameter of the keyword;
and calling a corresponding keyword function according to the configuration parameters of the keywords corresponding to the testing step for testing.
According to a second aspect of the present invention, there is provided a base station test apparatus, the apparatus comprising:
the determining module is used for determining a testing task;
the first obtaining module is used for obtaining description information of the test task if the test task is a newly added test task, wherein the description information comprises the importance level and the test specification of the test task;
the analysis module is used for analyzing a test case of the test task according to the test specification under the condition that the importance level meets a preset condition, wherein the test case comprises a test step of the test task, and the test step comprises an automation identifier;
a second obtaining module, configured to obtain a keyword corresponding to the testing step and a configuration parameter of the keyword if the value of the automation identifier is a target value;
and the first testing module is used for calling the corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the testing step.
According to a third aspect of the present invention, there is provided an electronic apparatus comprising:
a processor, a memory and a computer program stored on the memory and executable on the processor, the processor implementing the aforementioned method when executing the program.
According to a fourth aspect of the invention, there is provided a readable storage medium having instructions which, when executed by a processor of an electronic device, enable the electronic device to perform the aforementioned method.
The invention provides a base station testing method, a base station testing device, electronic equipment and a storage medium, which can automatically supplement a test case and a keyword corresponding to a newly added test task in the function development process of a base station, and automatically test by calling a keyword function, thereby meeting the function expansion requirement of the base station, improving the testing efficiency and standardizing the testing process.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
Fig. 1 is a flowchart illustrating specific steps of a method for testing a base station according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating specific steps of a base station testing method according to a second embodiment of the present invention;
FIG. 3 is a block diagram of a Robot Framework according to a second embodiment of the present invention;
FIG. 4 is a diagram of an organizational hierarchy of a Robot Framework test library according to a second embodiment of the present invention;
fig. 5 is a structural diagram of a base station testing apparatus according to a third embodiment of the present invention;
fig. 6 is a structural diagram of a base station testing apparatus according to a fourth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Referring to fig. 1, a flowchart illustrating specific steps of a base station testing method according to an embodiment of the present invention is shown.
Step 101, determining a test task.
In the embodiment of the invention, the test task indicates the base station function to be tested. The base station functions include: base station connection, Management Information Base (MIB) loading, addition, deletion, modification and query via the Simple Network Management Protocol (SNMP), event checking, alarm checking, log uploading, base station function invocation, external wire engineering (OSP) connection and the like, as well as the functions added during the function development of the base station.
Optionally, the tester is prompted to input a test instruction, and the test task is determined according to the test instruction.
Step 102, if the test task is a newly added test task, obtaining description information of the test task, wherein the description information comprises an importance level and a test specification of the test task.
And if the test task is a newly added test task, indicating that the function to be tested by the test task is a newly added function.
Optionally, the test task includes a task identifier, and if the task identifier is a preset identifier, the test task is determined to be a newly added test task. Specifically, when the tester inputs a test instruction, the test instruction may carry the task identifier of the test task, so that the computer can determine from the task identifier whether the test task is a newly added test task. For example, two task identifiers, "N" and "O", are defined for test tasks, and "N" is used as the preset identifier; when the task identifier of a test task is "N", the test task is determined to be a newly added test task. In a specific application, those skilled in the art may set the task identifier and the preset identifier according to actual requirements, and the present invention is not specifically limited in this respect.
Optionally, the test task includes a task number, and the test case corresponding to the test task is searched for in a test library according to the task number; the test library is used for storing the test cases corresponding to the test tasks, and a test case corresponds to a task number. If at least one test case corresponding to the task number exists in the test library, the test task is a non-newly added task; if no test case corresponding to the task number exists in the test library, the test task is a newly added task.
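As an illustration of the two optional checks above, the following minimal Python sketch decides whether a task is newly added either by its task identifier or by looking up its task number in the test library; the names TestTask and TEST_LIBRARY and the sample task numbers are hypothetical and not part of the patent:

from dataclasses import dataclass

PRESET_IDENTIFIER = "N"   # "N" marks a newly added task, "O" an existing one (example from the description)

# Hypothetical in-memory test library: task number -> stored test cases.
TEST_LIBRARY = {
    "TASK-0001": ["tc_cell_activate", "tc_cell_deactivate"],
}

@dataclass
class TestTask:
    task_number: str
    task_identifier: str   # "N" or "O"

def is_new_by_identifier(task: TestTask) -> bool:
    # A task whose identifier equals the preset identifier is treated as newly added.
    return task.task_identifier == PRESET_IDENTIFIER

def is_new_by_lookup(task: TestTask) -> bool:
    # A task with no test case stored under its task number is treated as newly added.
    return len(TEST_LIBRARY.get(task.task_number, [])) == 0

task = TestTask(task_number="TASK-0002", task_identifier="N")
print(is_new_by_identifier(task), is_new_by_lookup(task))   # True True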
In the embodiment of the invention, if the test task is a newly added test task, a tester is prompted to submit the description information of the test task, wherein the description information comprises the importance level and the test specification of the test task. The importance level of the test task is determined according to the function to be tested, and if the function to be tested is the main function of the base station, the importance level of the test task is high. In practical application, a tester can determine the importance level of a newly added function according to a service requirement corresponding to the newly added function, and further determine the importance level of a test task. The test specification of a test task is used to indicate a specific implementation of the test task.
Step 103, under the condition that the importance level meets a preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task, and the test step comprises an automation identifier.
In the embodiment of the present invention, the importance level satisfying a preset condition specifically includes the following. In one approach, the importance level of the test task is divided into a plurality of levels, each level corresponds to a numerical value, and a larger numerical value indicates a higher importance level; if the numerical value corresponding to the importance level is larger than a preset threshold, the importance level meets the preset condition. In another approach, an importance level label is set for the test task and the importance level of the test task is distinguished by the color of the label; if the color of the importance level label is a preset color, the importance level meets the preset condition. For example, if red is set as the preset color and the importance level label of a certain test task is green, the test task is not important and its importance level does not meet the preset condition.
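A minimal Python sketch of these two checks follows; the threshold, the level values and the colors are illustrative assumptions rather than values fixed by the invention:

PRESET_THRESHOLD = 3      # assumed threshold for the numeric scheme
PRESET_COLOR = "red"      # assumed preset color for the label scheme

def meets_condition_by_value(level_value: int) -> bool:
    # A larger value means a higher importance level; pass when it exceeds the threshold.
    return level_value > PRESET_THRESHOLD

def meets_condition_by_label(label_color: str) -> bool:
    # Pass only when the importance level label has the preset color.
    return label_color.lower() == PRESET_COLOR

print(meets_condition_by_value(4))        # True: level 4 exceeds the threshold
print(meets_condition_by_label("green"))  # False: a green label does not meet the condition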
In the embodiment of the invention, the test case is a specific embodiment of the test task, and the test case comprises a test step, an input parameter and an expected result corresponding to the execution of the test task. One test task corresponds to at least one test case, one test case comprises at least one test step, one input parameter and one expected result, and the test task can be executed according to the test case.
Step 104, if the value of the automation identifier is a target value, acquiring a keyword corresponding to the testing step and a configuration parameter of the keyword.
If the value of the automation identifier is the target value, the corresponding testing step can be executed by a machine, and the keyword corresponding to the testing step and the configuration parameter of the keyword are obtained, so that the computer calls the corresponding keyword function according to the configuration parameter of the keyword to perform automated testing. Specifically, a preset parameter may be selected as the automation identifier. The preset parameter has at least two parameter values, one of which is used as the target value; if the parameter value of the preset parameter equals the target value, the testing step corresponding to the preset parameter can be performed by a machine. For example, the automation identifier is set as a parameter named Automate whose value is either YES or NO, and YES is taken as the target value; if the value of Automate is YES, the corresponding testing step can be executed by the machine. Alternatively, a color label is used as the automation identifier and one color is selected as the target value; for example, if green is set as the target value, then whenever the color label of the automation identifier is green, the corresponding testing step can be performed by the machine. The automation identifier and its target value may be set by those skilled in the art according to the actual application, and the present invention is not specifically limited thereto.
In the embodiment of the present invention, each test step corresponds to one keyword. A keyword is usually a descriptive phrase for the test step, and may be a keyword in a test library or a user-defined keyword. A keyword may have zero or more configuration parameters; usually, a keyword has at least one configuration parameter, and the configuration parameters include the input parameters and the expected output parameters of the keyword.
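The following Python sketch illustrates one possible representation of a test step carrying an automation identifier, a keyword and the configuration parameters of the keyword; the field names, the YES target value and the sample keyword follow the examples in the description, while the structure itself is an assumption for illustration:

from dataclasses import dataclass, field
from typing import Dict

TARGET_VALUE = "YES"   # target value of the Automate flag, as in the example above

@dataclass
class TestStep:
    description: str
    automate: str                                           # automation identifier: "YES" or "NO"
    keyword: str = ""                                       # descriptive keyword, e.g. "Load MIB"
    config: Dict[str, str] = field(default_factory=dict)    # input and expected output parameters

    def is_automatable(self) -> bool:
        return self.automate == TARGET_VALUE

step = TestStep(
    description="Load the MIB file onto the base station",
    automate="YES",
    keyword="Load MIB",
    config={"mib_path": "/opt/om/base_station.mib", "expected_result": "SUCCESS"},
)
print(step.is_automatable())   # True: fetch the keyword and its configuration, then call the keyword function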
Optionally, if the testing step includes at least two substeps, the step of obtaining the keyword corresponding to the testing step includes:
acquiring basic keywords corresponding to the substeps;
and combining the basic keywords of at least two sub-steps into the corresponding keywords of the testing step.
In the embodiment of the present invention, a testing step corresponds to exactly one keyword. Therefore, if a testing step includes at least two sub-steps, the keyword of the testing step is not simply the set of basic keywords corresponding to those sub-steps; instead, the basic keywords of the sub-steps are combined into a new keyword, which is used as the keyword of the testing step. For example, if testing step A comprises sub-step B and sub-step C, where the basic keyword of sub-step B is "receive registration message" and the basic keyword of sub-step C is "UE authentication", then the keyword of testing step A may be "user registration". Generally, a keyword is a descriptive phrase indicating the testing function of the corresponding testing step; therefore, when a testing step includes multiple sub-steps, its keyword summarizes the testing functions of the sub-steps into a new description rather than simply concatenating the basic keywords of the sub-steps. The basic keywords of the sub-steps may be keywords in a test library or user-defined keywords, and the keyword corresponding to a testing step that includes multiple sub-steps is generally user-defined.
Step 105, calling a corresponding keyword function according to the configuration parameters of the keywords corresponding to the testing step for testing.
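A minimal Python sketch of combining the basic keywords of two sub-steps into the user-defined keyword "user registration" from the example above, and then performing step 105 by calling the keyword function with the configuration parameters; the function bodies and parameter names are illustrative assumptions:

# Hypothetical basic keyword functions that a test library might expose.
def receive_registration_message(ue_id: str) -> bool:
    print(f"receiving registration message from {ue_id}")
    return True

def ue_authentication(ue_id: str, key: str) -> bool:
    print(f"authenticating {ue_id}")
    return key == "secret"

# User-defined keyword summarizing the two sub-steps of testing step A.
def user_registration(ue_id: str, key: str) -> bool:
    return receive_registration_message(ue_id) and ue_authentication(ue_id, key)

KEYWORDS = {"user registration": user_registration}

def run_step(keyword: str, config: dict) -> bool:
    # Step 105: call the keyword function corresponding to the testing step
    # with the configuration parameters of the keyword.
    return KEYWORDS[keyword](**config)

print(run_step("user registration", {"ue_id": "UE-01", "key": "secret"}))   # True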
The Robot Framework automated testing framework provides a test template, through which a keyword-driven test case can be converted into a data-driven test case.
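As a sketch of this mechanism, the Python snippet below writes a small data-driven suite that uses Robot Framework's Test Template setting, so that each argument row drives one call of a template keyword, and runs it through the robot package; the keyword "Set SNMP Parameter", the OIDs and the values are hypothetical:

import pathlib
import robot   # pip install robotframework

# A data-driven suite: the Test Template setting turns every argument row into one
# call of the (hypothetical) user keyword "Set SNMP Parameter".
SUITE = """\
*** Settings ***
Test Template    Set SNMP Parameter

*** Test Cases ***         OID                VALUE
Add Cell Parameter         1.3.6.1.4.1.1.1    enabled
Modify Cell Parameter      1.3.6.1.4.1.1.2    42

*** Keywords ***
Set SNMP Parameter
    [Arguments]    ${oid}    ${value}
    Log    Would set ${oid} to ${value}
"""

suite_file = pathlib.Path("om_template_suite.robot")
suite_file.write_text(SUITE)
robot.run(str(suite_file))   # produces log.html and report.html in the working directory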
The invention provides a base station testing method in which the test case of a test task is analyzed according to the description information of the test task, the keyword corresponding to each test step in the test case is obtained, and the keyword function is called to test the test task automatically. The test case and the keywords corresponding to a newly added test task can therefore be supplemented automatically during the function development of the base station, and automated testing is performed by calling the keyword functions, thereby meeting the function expansion requirement of the base station, improving test efficiency and standardizing the test flow.
Example two
Referring to fig. 2, a flowchart illustrating specific steps of a base station testing method according to a second embodiment of the present invention is shown.
Step 201, determining a test task.
In the embodiment of the invention, the test task indicates the base station function to be tested. The base station functions include: base station connection, Management Information Base (MIB) loading, addition, deletion, modification and query via the Simple Network Management Protocol (SNMP), event checking, alarm checking, log uploading, base station function invocation, external wire engineering (OSP) connection and the like, as well as the functions added during the function development of the base station.
Optionally, the tester is prompted to input a test instruction, and the test task is determined according to the test instruction.
The embodiment of the invention builds a base station test system based on the Robot Framework architecture and realizes automated testing of the base station system. Referring to fig. 3, a modular structure diagram of the Robot Framework in an embodiment of the present invention is shown: Test Data stores the test data, Robot Framework is the core framework, Test Libraries are the test libraries, Test Tools comprises various test tools, and System Under Test is the system under test, here the base station system. When the Robot Framework is started, it processes the test data, executes the test cases, and generates logs and reports. The core framework knows no details of the base station system; the interaction is performed by the test libraries, and the core framework obtains relevant information of the base station system, such as a test task, by reading the test libraries.
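As an illustration of the test library layer described above, the following minimal Python sketch shows a custom Robot Framework test library; Robot Framework exposes the public methods of such a class as keywords, and the class name, method names and connection logic here are assumptions for illustration rather than the patent's implementation:

class BaseStationLibrary:
    """Hypothetical test library wrapping base station operation-and-maintenance actions."""

    ROBOT_LIBRARY_SCOPE = "SUITE"   # one library instance per test suite

    def __init__(self, host: str = "192.0.2.10", port: int = 161):
        self._host = host
        self._port = port
        self._connected = False

    def connect_base_station(self):
        # A real library would open an OM/SNMP session to the base station here.
        self._connected = True

    def load_mib(self, mib_path: str) -> str:
        if not self._connected:
            raise RuntimeError("base station is not connected")
        # Placeholder for loading the Management Information Base file.
        return f"loaded {mib_path}"

    def check_alarm(self, alarm_id: str) -> bool:
        # Placeholder for the alarm checking function mentioned in the description.
        return self._connected and alarm_id.isdigit()

Importing this file in a suite (for example with "Library    BaseStationLibrary.py") would make keywords such as "Connect Base Station", "Load MIB" and "Check Alarm" available to test cases, while the core framework itself stays unaware of base station details.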
Step 202, if the test task is a newly added test task, obtaining description information of the test task, where the description information includes an importance level and a test specification of the test task.
And if the test task is a newly added test task, indicating that the function to be tested by the test task is a newly added function.
Optionally, the test task includes a task identifier, and if the task identifier is a preset identifier, the test task is determined to be a newly added test task. Specifically, when the tester inputs a test instruction, the test instruction may carry the task identifier of the test task, so that the computer can determine from the task identifier whether the test task is a newly added test task. For example, two task identifiers, "N" and "O", are defined for test tasks, and "N" is used as the preset identifier; when the task identifier of a test task is "N", the test task is determined to be a newly added test task. In a specific application, those skilled in the art may set the task identifier and the preset identifier according to actual requirements, and the present invention is not specifically limited in this respect.
Optionally, the test task includes a task number, and the test case corresponding to the test task is searched for in a test library according to the task number; the test library is used for storing the test cases corresponding to the test tasks, and a test case corresponds to a task number. If at least one test case corresponding to the task number exists in the test library, the test task is a non-newly added task; if no test case corresponding to the task number exists in the test library, the test task is a newly added task.
Referring to fig. 4, an organization hierarchy diagram of the Robot Framework test library in an embodiment of the present invention is shown. For the base station system, each sub-module constructs test cases in test case files (Test Case File) based on its module functions; each sub-module corresponds to at least one test case, a plurality of test cases of the same sub-module form the test set of that sub-module (Module Test Suites), and the test sets of the sub-modules together form the test set of the higher-level module (Om Test Suites). The module resource files (Om Resources) contain the keywords of the module and the configuration parameters of the keywords, and are used to call each test case in the test set of the module. The common resource files (Common Resources) contain higher-level user-defined keywords built on those keywords and their configuration parameters, so that test cases of different base station subsystems can be called. The implementation of the keywords in the common resource files depends on lower-level standard and extension libraries, such as a Python library, an SNMP library and a Test Case Lib library, where the Python library contains Python files storing the specific test methods, the SNMP library is related to signaling interaction, and the Test Case Lib library contains operations related to OSP.
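A sketch of how such a hierarchy could be laid out on disk and executed through Robot Framework's Python API is shown below; the directory and file names mirror fig. 4 but are illustrative assumptions:

# om_test_suites/                   <- higher-level test set (Om Test Suites)
#     cell_module/                  <- Module Test Suites of the cell sub-module
#         activate_cell.robot       <- test case files built from module functions
#         deactivate_cell.robot
#     software_module/
#         upgrade_package.robot
#     resources/
#         om_resources.resource     <- module keywords and their configuration parameters
#         common_resources.resource <- higher-level user-defined keywords
#     libraries/
#         base_station_library.py   <- Python library holding the concrete test methods

from robot.api import TestSuiteBuilder

suite = TestSuiteBuilder().build("om_test_suites")   # load the whole Om test set
result = suite.run(output="om_output.xml")           # run every module test suite
print("return code:", result.return_code)            # non-zero when any test case failed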
In combination with the foregoing, in the embodiment of the present invention, the test task corresponds to the function of the base station, and the test case is constructed according to the functions of each module in the base station system, so that a corresponding relationship exists between the test task and the test case, and according to the task number of the test task, whether the test case corresponding to the test task exists can be queried in the test library, so as to determine whether the test task is a new test task.
The initial test library contains the admission test cases of the base station system, corresponding to the common functions of each module of the base station system, for example, test cases for conventional functions such as activation and deactivation in the cell module, processor and board loading in the equipment module, and package pulling and upgrading in the software module. A newly added test task corresponds to a function newly added during base station system development, and may also correspond to backtracking of a newly added BUG.
Optionally, newly added functions and/or newly added BUGs of the base station system are counted according to a preset period; a newly added test task is determined according to the newly added function and/or the newly added BUG; and the description information of the test task is obtained. By counting the newly added functions and newly added BUGs of the base station system according to the preset period to determine newly added test tasks, the embodiment of the invention realizes automatic updating of the test tasks and can meet the function expansion requirement of the base station.
In the embodiment of the invention, if the test task is a newly added test task, the tester is prompted to submit the description information of the test task, wherein the description information comprises the importance level and the test specification of the test task. The importance level of the test task is determined according to the function to be tested; if the function to be tested is a main function of the base station, the importance level of the test task is high. In practical application, the tester may determine the importance level of a newly added function according to the service requirement corresponding to that function, and thereby determine the importance level of the test task. The test specification of a test task is used to indicate a specific implementation of the test task.
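A minimal Python sketch of deriving newly added test tasks from the functions and BUGs recorded within one counting period; the period length, the record format and the task naming are illustrative assumptions:

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

PRESET_PERIOD = timedelta(days=7)   # assumed weekly counting period

@dataclass
class ChangeRecord:
    kind: str            # "function" or "bug"
    name: str
    created_at: datetime

def collect_new_test_tasks(records: List[ChangeRecord], now: datetime) -> List[str]:
    # One newly added test task per function or BUG recorded within the preset period.
    window_start = now - PRESET_PERIOD
    return [f"test:{r.kind}:{r.name}" for r in records if r.created_at >= window_start]

records = [
    ChangeRecord("function", "carrier_aggregation", datetime(2020, 8, 9)),
    ChangeRecord("bug", "BUG-1024-alarm-lost", datetime(2020, 8, 8)),
]
print(collect_new_test_tasks(records, now=datetime(2020, 8, 10)))
# ['test:function:carrier_aggregation', 'test:bug:BUG-1024-alarm-lost']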
Step 203, under the condition that the importance level meets a preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task, and the test step comprises an automation identifier.
In the embodiment of the present invention, the importance level satisfying a preset condition specifically includes the following. In one approach, the importance level of the test task is divided into a plurality of levels, each level corresponds to a numerical value, and a larger numerical value indicates a higher importance level; if the numerical value corresponding to the importance level is larger than a preset threshold, the importance level meets the preset condition. In another approach, an importance level label is set for the test task and the importance level of the test task is distinguished by the color of the label; if the color of the importance level label is a preset color, the importance level meets the preset condition. For example, if red is set as the preset color and the importance level label of a certain test task is green, the test task is not important and its importance level does not meet the preset condition.
In the embodiment of the invention, the test case is a specific embodiment of the test task, and the test case comprises a test step, an input parameter and an expected result corresponding to the execution of the test task. One test task corresponds to at least one test case, one test case comprises at least one test step, one input parameter and one expected result, and the test task can be executed according to the test case.
Step 204, if the value of the automation identifier is the target value, generating a test step configuration page.
The automation identifier indicates whether the testing step can be executed by a machine. If the value of the automation identifier is the target value, the corresponding testing step can be executed by a machine, and a test step configuration page is generated so that the user inputs the keyword corresponding to the testing step and the configuration parameters of the keyword on the page; the computer then calls the corresponding keyword function according to the configuration parameters of the keyword to realize automated testing. Specifically, a preset parameter may be selected as the automation identifier. The preset parameter has at least two parameter values, one of which is used as the target value; if the parameter value of the preset parameter equals the target value, the testing step corresponding to the preset parameter can be performed by a machine. For example, the automation identifier is set as a parameter named Automate whose value is either YES or NO, and YES is taken as the target value; if the value of Automate is YES, the corresponding testing step can be executed by the machine. Alternatively, a color label is used as the automation identifier and one color is selected as the target value; for example, if green is set as the target value, then whenever the color label of the automation identifier is green, the corresponding testing step can be performed by the machine. The automation identifier and its target value may be set by those skilled in the art according to the actual application, and the present invention is not specifically limited thereto.
Step 205, obtaining keywords corresponding to the testing step and configuration parameters of the keywords, which are input by the testing personnel on the testing step configuration page.
Step 206, calling a corresponding keyword function according to the configuration parameters of the keywords corresponding to the testing step for testing.
If the value of the automation identifier is the target value, the testing step corresponding to the automation identifier can be executed by a machine. The Robot Framework automated testing framework includes a test template, through which a keyword-driven test case can be converted into a data-driven test case. The base station testing method and test system in the embodiment of the invention realize automated testing of the test task; therefore, in the actual testing process, only the configuration parameters of the keywords need to be defined, and the keyword function can be called through the test template to realize automated testing. When the value of the automation identifier is the target value, the test step configuration page is generated so that the tester defines, on this page, the keyword corresponding to the testing step and the configuration parameters of the keyword; the corresponding keyword function is then called for automated testing according to the keywords and keyword parameters entered by the tester on the configuration page.
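The following Python sketch shows one way the keyword and parameters entered on such a configuration page could be dispatched to the matching keyword function; the registry, the keyword name and the page payload are illustrative assumptions:

from typing import Callable, Dict

KEYWORD_REGISTRY: Dict[str, Callable[..., bool]] = {}

def keyword(name: str):
    # Register a keyword function under the descriptive name used on the configuration page.
    def wrap(func: Callable[..., bool]) -> Callable[..., bool]:
        KEYWORD_REGISTRY[name] = func
        return func
    return wrap

@keyword("Check Alarm")
def check_alarm(alarm_id: str, expected_state: str) -> bool:
    # Placeholder for the real alarm checking logic.
    return expected_state in ("active", "cleared")

def run_configured_step(page_input: dict) -> bool:
    # page_input mimics what the tester submits on the test step configuration page.
    func = KEYWORD_REGISTRY[page_input["keyword"]]
    return func(**page_input["config"])

print(run_configured_step({"keyword": "Check Alarm",
                           "config": {"alarm_id": "1001", "expected_state": "cleared"}}))   # True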
Optionally, if the test task is a non-newly added test task, acquiring identification information of the non-newly added test task; searching the test cases of the non-newly added test tasks in a preset file according to the identification information; and executing the test case of the non-newly added test task for testing.
As can be seen from the foregoing step 202, in the embodiment of the present invention, the test case of each module of the base station system is stored in the test library, and if the test task is a non-added task, the test case corresponding to the test task may be directly obtained in the test library, so as to execute the obtained test case for testing.
Whether the test task is a non-newly added test task may be determined by adopting the method for determining whether the test task is a newly added test task in step 202, which is not described herein again.
Step 207, if the test task corresponds to at least two test cases and at least one test case fails the test, continuing to execute the next test case.
Each test case comprises one or more test steps, and if all the test steps pass, the execution of the test case is indicated to pass. Each testing step has corresponding preset input and expected output, and if the actual testing result obtained by executing the testing step is inconsistent with the expected output, the testing step is indicated to fail to be executed.
In conventional practice, when a test case fails, the whole test task is stopped, which reduces test efficiency. Therefore, in the embodiment of the invention, if a certain test case fails, the test task continues to be executed and the next test case is tested until all test cases have been executed; the test results of all test cases corresponding to the test task are then exported in batch, which improves test efficiency.
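A minimal Python sketch of this continue-on-failure policy with a single batch export of the results; the test case callables and the CSV report are illustrative assumptions:

import csv
from typing import Callable, List, Tuple

def run_task(test_cases: List[Tuple[str, Callable[[], None]]], report_path: str) -> None:
    # Run every test case even if some fail, then export all results in one batch.
    results = []
    for name, case in test_cases:
        try:
            case()
            results.append({"case": name, "result": "PASS", "detail": ""})
        except Exception as exc:   # a failed test case does not stop the test task
            results.append({"case": name, "result": "FAIL", "detail": str(exc)})
    with open(report_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["case", "result", "detail"])
        writer.writeheader()
        writer.writerows(results)

def case_activate_cell() -> None:
    pass                                                                  # passes

def case_load_mib() -> None:
    raise AssertionError("actual result differs from expected output")   # fails

def case_check_alarm() -> None:
    pass                                                                  # still executed after the failure

run_task(
    [("activate_cell", case_activate_cell),
     ("load_mib", case_load_mib),
     ("check_alarm", case_check_alarm)],
    report_path="task_results.csv",
)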
The invention provides a base station testing method in which the test case of a test task is analyzed according to the description information of the test task, the keyword corresponding to each test step in the test case is obtained, and the keyword function is called to test the test task automatically. The test case and the keywords corresponding to a newly added test task can therefore be supplemented automatically during the function development of the base station, and automated testing is performed by calling the keyword functions, thereby meeting the function expansion requirement of the base station, improving test efficiency and standardizing the test flow. In addition, in the invention, if a certain test case fails, the next test case continues to be executed until all test cases corresponding to the test task have been executed, and the test results are then output, which improves test efficiency.
Example three
Referring to fig. 5, a structural diagram of a base station testing apparatus according to a third embodiment of the present invention is shown, which specifically includes the following modules:
a determining module 301, configured to determine a test task;
a first obtaining module 302, configured to obtain description information of the test task if the test task is a newly added test task, where the description information includes an importance level and a test specification of the test task;
the analysis module 303 is configured to analyze a test case of the test task according to the test specification when the importance level meets a preset condition, where the test case includes a test step of the test task, and the test step includes an automation identifier;
a second obtaining module 304, configured to obtain a keyword corresponding to the testing step and a configuration parameter of the keyword if the value of the automation identifier is a target value;
the first testing module 305 is configured to invoke a corresponding keyword function according to the configuration parameter of the keyword corresponding to the testing step to perform testing.
The invention provides a base station testing device that analyzes the test case of a test task according to the description information of the test task, obtains the keyword corresponding to each testing step in the test case, and calls the keyword function to test the test task automatically. The test case and the keywords corresponding to a newly added test task can therefore be supplemented automatically during the function development of the base station, and automated testing is performed by calling the keyword functions, thereby meeting the function expansion requirement of the base station, improving test efficiency and standardizing the test flow.
The third embodiment is a corresponding apparatus embodiment to the first embodiment, and the detailed information may refer to the detailed description of the first embodiment, which is not described herein again.
Example four
Referring to fig. 6, a structural diagram of a base station testing apparatus according to a fourth embodiment of the present invention is shown, which specifically includes the following modules:
a determining module 401, configured to determine a test task.
A first obtaining module 402, configured to obtain description information of the test task if the test task is a newly added test task, where the description information includes an importance level and a test specification of the test task.
Optionally, the first obtaining module 402 includes:
the counting submodule is used for counting the newly added functions and/or newly added BUGs of the base station system according to a preset period;
the task determination submodule is used for determining a newly added test task according to the newly added function and/or the newly added BUG;
and the description information acquisition submodule is used for acquiring the description information of the test task.
An analyzing module 403, configured to analyze a test case of the test task according to the test specification when the importance level meets a preset condition, where the test case includes a test step of the test task, and the test step includes an automation identifier.
A second obtaining module 404, configured to obtain a keyword corresponding to the testing step and a configuration parameter of the keyword if the value of the automation identifier is a target value.
The second obtaining module 404 includes:
the page generation submodule 4041 is used for generating a test step configuration page;
the first obtaining sub-module 4042 is configured to obtain the keywords corresponding to the testing step and the configuration parameters of the keywords, which are input by the tester on the testing step configuration page.
Optionally, if the testing step includes at least two sub-steps, the second obtaining module 404 includes:
a second obtaining submodule, configured to obtain a basic keyword corresponding to the substep and a configuration parameter of the basic keyword;
and the combination sub-module is used for combining the basic keywords of at least two sub-steps into the keywords corresponding to the testing step.
And a testing module 405, configured to call a corresponding keyword function according to the configuration parameter of the keyword corresponding to the testing step to perform testing.
Optionally, the apparatus further comprises:
a third obtaining module, configured to obtain identification information of the non-newly added test task if the test task is the non-newly added test task;
the searching module is used for searching the test cases of the non-newly added test tasks in a preset file according to the identification information;
and the non-newly added test task test module is used for executing the test cases of the non-newly added test tasks to test.
And the test control module 406 is configured to continue executing the next test case if the test task corresponds to at least two test cases and at least one test case fails to test.
The invention provides a base station testing device that analyzes the test case of a test task according to the description information of the test task, obtains the keyword corresponding to each testing step in the test case, and calls the keyword function to test the test task automatically. The test case and the keywords corresponding to a newly added test task can therefore be supplemented automatically during the function development of the base station, and automated testing is performed by calling the keyword functions, thereby meeting the function expansion requirement of the base station, improving test efficiency and standardizing the test flow. In addition, in the invention, if a certain test case fails, the next test case continues to be executed until all test cases corresponding to the test task have been executed, and the test results are then output, which improves test efficiency.
The fourth embodiment is a device embodiment corresponding to the second embodiment, and the detailed information may refer to the detailed description of the second embodiment, which is not repeated herein.
An embodiment of the present invention further provides an electronic device, including: a processor, a memory and a computer program stored on the memory and executable on the processor, the processor implementing the aforementioned method when executing the program.
Embodiments of the present invention also provide a readable storage medium, and when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to execute the foregoing method.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (14)

1. A method for testing a base station, the method comprising:
determining a test task;
if the test task is a newly added test task, obtaining description information of the test task, wherein the description information comprises an importance level and a test specification of the test task;
under the condition that the importance level meets a preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task, and the test step comprises an automation identifier;
if the value of the automation identifier is a target value, acquiring a keyword corresponding to the testing step and a configuration parameter of the keyword;
and calling a corresponding keyword function according to the configuration parameters of the keywords corresponding to the testing step for testing.
2. The method according to claim 1, wherein the step of obtaining the keyword corresponding to the testing step and the configuration parameter of the keyword comprises:
generating a test step configuration page;
and acquiring keywords corresponding to the testing step and configuration parameters of the keywords, which are input by a tester on the testing step configuration page.
3. The method of claim 1, wherein if the testing step comprises at least two sub-steps, the step of obtaining the keyword corresponding to the testing step comprises:
acquiring basic keywords corresponding to the substeps;
and combining the basic keywords of at least two sub-steps into the corresponding keywords of the testing step.
4. The method according to claim 1, wherein the step of obtaining the description information of the test task if the test task is a newly added test task comprises:
counting the newly added functions and/or newly added BUGs of the base station system according to a preset period;
determining a newly added test task according to the newly added function and/or the newly added BUG;
and acquiring the description information of the test task.
5. The method of claim 1, further comprising:
if the test task is a non-newly added test task, acquiring identification information of the non-newly added test task;
searching the test cases of the non-newly added test tasks in a preset file according to the identification information;
and executing the test case of the non-newly added test task for testing.
6. The method according to any one of claims 1 to 5, wherein if the test task corresponds to at least two test cases, the method further comprises:
and if at least one test case fails to test, continuing to execute the next test case.
7. A base station test apparatus, the apparatus comprising:
the determining module is used for determining a testing task;
the first obtaining module is used for obtaining description information of the test task if the test task is a newly added test task, wherein the description information comprises the importance level and the test specification of the test task;
the analysis module is used for analyzing a test case of the test task according to the test specification under the condition that the importance level meets a preset condition, wherein the test case comprises a test step of the test task, and the test step comprises an automation identifier;
a second obtaining module, configured to obtain a keyword corresponding to the testing step and a configuration parameter of the keyword if the value of the automation identifier is a target value;
and the testing module is used for calling the corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the testing step.
8. The apparatus of claim 7, wherein the second obtaining module comprises:
the page generation submodule is used for generating a test step configuration page;
and the first obtaining sub-module is used for obtaining the keywords corresponding to the testing step and the configuration parameters of the keywords, which are input by the testing personnel on the testing step configuration page.
9. The apparatus of claim 7, wherein if the testing step comprises at least two substeps, the second obtaining module comprises:
the second obtaining submodule is used for obtaining the basic keywords corresponding to the substep;
and the combination sub-module is used for combining the basic keywords of at least two sub-steps into the keywords corresponding to the testing step.
10. The apparatus of claim 7, wherein the first obtaining module comprises:
the counting submodule is used for counting the newly added functions and/or newly added BUGs of the base station system according to a preset period;
the task determination submodule is used for determining a newly added test task according to the newly added function and/or the newly added BUG;
and the description information acquisition submodule is used for acquiring the description information of the test task.
11. The apparatus of claim 7, further comprising:
a third obtaining module, configured to obtain identification information of the non-newly added test task if the test task is the non-newly added test task;
the searching module is used for searching the test cases of the non-newly added test tasks in a preset file according to the identification information;
and the non-newly added test task test module is used for executing the test cases of the non-newly added test tasks to test.
12. The apparatus according to any of claims 7 to 11, wherein if the test task corresponds to at least two test cases, the apparatus further comprises:
and the test control module is used for continuously executing the next test case if at least one test case fails to test.
13. An electronic device, comprising:
processor, memory and computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the program.
14. A readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any of claims 1 to 6.
CN202010797450.3A 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium Active CN114124769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010797450.3A CN114124769B (en) 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010797450.3A CN114124769B (en) 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114124769A true CN114124769A (en) 2022-03-01
CN114124769B CN114124769B (en) 2023-08-11

Family

ID=80373506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010797450.3A Active CN114124769B (en) 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114124769B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150324274A1 (en) * 2014-05-09 2015-11-12 Wipro Limited System and method for creating universal test script for testing variants of software application
CN106937303A (en) * 2015-12-30 2017-07-07 中国移动通信集团河南有限公司 A kind of base station test method and system, terminal, Cloud Server
CN110471839A (en) * 2019-07-11 2019-11-19 平安普惠企业管理有限公司 Fixed time test task control method, device, computer equipment and storage medium
CN110888818A (en) * 2019-12-22 2020-03-17 普信恒业科技发展(北京)有限公司 Test case configuration system and method, automatic test system and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116204446A (en) * 2023-05-06 2023-06-02 云账户技术(天津)有限公司 Automatic test flow management method and device based on JIRA platform
CN116204446B (en) * 2023-05-06 2023-08-18 云账户技术(天津)有限公司 Automatic test flow management method and device based on JIRA platform

Also Published As

Publication number Publication date
CN114124769B (en) 2023-08-11

Similar Documents

Publication Publication Date Title
CN107273286B (en) Scene automatic test platform and method for task application
CN107908541B (en) Interface testing method and device, computer equipment and storage medium
US5418793A (en) Method and apparatus for testing a complex entity
CN109344053B (en) Interface coverage test method, system, computer device and storage medium
CN101556550A (en) Analysis method for automatic test log and device
CN112363953B (en) Interface test case generation method and system based on crawler technology and rule engine
CN109977012B (en) Joint debugging test method, device, equipment and computer readable storage medium of system
CN112241360A (en) Test case generation method, device, equipment and storage medium
CN115017040A (en) Test case screening method and system, electronic equipment and storage medium
CN114124769B (en) Base station testing method and device, electronic equipment and storage medium
CN117493188A (en) Interface testing method and device, electronic equipment and storage medium
CN112835802A (en) Equipment testing method, device, equipment and storage medium
CN111767218B (en) Automatic test method, equipment and storage medium for continuous integration
CN110059002A (en) Generation method, test equipment, storage medium and the device of test data
CN113238901B (en) Multi-device automatic testing method and device, storage medium and computer device
CN113672509A (en) Automatic testing method, device, testing platform and storage medium
CN113434405A (en) Method and device for determining test file, storage medium and electronic device
CN114154169A (en) Jenkins and JMeter-based automatic test method and device
CN113052501A (en) Automatic safe operation and maintenance method and terminal based on assets
CN112100086B (en) Software automation test method, device, equipment and computer readable storage medium
CN112540359B (en) Universal test system suitable for microwave radar
CN117435510B (en) Automatic test method, terminal equipment and computer readable storage medium
CN114153719A (en) Test method and related equipment
CN116089267A (en) Test interface generation method, device, equipment and storage medium
CN114385493A (en) Performance test method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant