CN113157583B - Test method, device and equipment

Test method, device and equipment

Info

Publication number
CN113157583B
CN113157583B (application CN202110466642.0A)
Authority
CN
China
Prior art keywords
test
data
vulnerability
result
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110466642.0A
Other languages
Chinese (zh)
Other versions
CN113157583A (en)
Inventor
苏蕾 (Su Lei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202110466642.0A priority Critical patent/CN113157583B/en
Publication of CN113157583A publication Critical patent/CN113157583A/en
Application granted granted Critical
Publication of CN113157583B publication Critical patent/CN113157583B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3692 - Test management for test results analysis
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases
    • G06F 11/3688 - Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The embodiments of the specification provide a testing method, device and equipment, which can be applied to the field of big data technology. The method comprises the following steps: determining a test scenario corresponding to a test task; the test scenario is used for describing the type and/or application environment of the test task; generating at least two groups of test data according to the test scenario; testing with the test data to obtain a test result; evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; and screening valid test data from the at least two groups of test data based on the evaluation result. After generating multiple groups of test data, the method evaluates each group against its actual test effect, thereby ensuring the effectiveness of the test data, avoiding interference from invalid test data on the test results, and facilitating product development and operation.

Description

Test method, device and equipment
Technical Field
The embodiment of the specification relates to the technical field of big data, in particular to a testing method, a testing device and testing equipment.
Background
With the deepening research and development of big data technology, the HBase database, with its notable advantages under massive data volumes such as highly concurrent reads and writes, low latency, low cost, and high reliability, has been widely applied in big data processing. Online interface query services based on the HBase database likewise meet the demand for highly timely reads of massive data.
In practical applications, to guarantee the accuracy of online interface queries, accurate and comprehensive test data and test means are generally required before formal deployment to verify the correctness of massive data queries. However, because the demand for test data is large, test data is currently generated by scripts, and problems such as run failures and result data inconsistent with expectations arise during actual testing; the generated test data therefore cannot guarantee integrity and consistency, which greatly disturbs the test effect. A technical solution that can ensure the accuracy and validity of generated test data is therefore needed.
Disclosure of Invention
An objective of the embodiments of the present disclosure is to provide a testing method, device and apparatus, so as to solve the problem of how to ensure the validity of test data and thereby improve the testing effect.
In order to solve the above technical problems, an embodiment of the present disclosure provides a testing method, including: determining a test scenario corresponding to a test task; the test scenario is used for describing the type and/or application environment of the test task; generating at least two groups of test data according to the test scenario; testing with the test data to obtain a test result; evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; and screening valid test data from the at least two groups of test data based on the evaluation result.
The embodiment of the specification also provides a testing device, which comprises: a test scenario determining module, used for determining a test scenario corresponding to the test task; the test scenario is used for describing the type and/or application environment of the test task; a test data generation module, used for generating at least two groups of test data according to the test scenario; a test result acquisition module, used for testing with the test data to obtain a test result; a test result evaluation module, used for evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; and a valid test data screening module, used for screening valid test data from the at least two groups of test data based on the evaluation result.
The embodiment of the specification also provides test equipment, which comprises a memory and a processor; the memory is used for storing computer program instructions; the processor is configured to execute the computer program instructions to implement the steps of: determining a test scenario corresponding to the test task; the test scenario is used for describing the type and/or application environment of the test task; generating at least two groups of test data according to the test scenario; testing with the test data to obtain a test result; evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; and screening valid test data from the at least two groups of test data based on the evaluation result.
As can be seen from the technical solutions provided in the embodiments of the present specification, the embodiments of the present specification generate corresponding test data after obtaining a test scenario corresponding to a test task. After the test data is used for testing to obtain a test result, the test result can be evaluated to obtain an evaluation result, so that the vulnerability condition of the test data is evaluated, and finally effective test data is obtained by screening in the test data based on the evaluation result. After generating a plurality of groups of test data, the method can correspondingly evaluate the test data based on the actual test effect of the test data, thereby ensuring the test effect of the test data, avoiding the interference of invalid test data on the test result and being beneficial to the development and operation of products.
Drawings
In order to more clearly illustrate the embodiments of the present description or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present description, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a testing method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of generating test data according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of acquiring valid test data according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a test system according to an embodiment of the present disclosure;
FIG. 5 is a schematic flow chart of a testing method according to an embodiment of the present disclosure;
FIG. 6 is a block diagram of a testing device according to an embodiment of the present disclosure;
fig. 7 is a structural diagram of a test apparatus according to an embodiment of the present specification.
Detailed Description
The technical solutions of the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is apparent that the described embodiments are only some embodiments of the present specification, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are intended to be within the scope of the present disclosure.
In order to solve the above technical problems, the embodiment of the present specification proposes a testing method. The execution subject of the test method may be a test device, including but not limited to a server, an industrial personal computer, a PC, etc. As shown in fig. 1, the test method may include the following specific implementation steps.
S110: determining a test scenario corresponding to the test task; the test scenario is used for describing the type and/or application environment of the test task.
A test task may be a task that executes the corresponding test procedure at a certain time or within a certain period of time.
The test task comprises test task information. The test task information may directly specify the test scenario, the test data type, and other information corresponding to the test task; it may also be information describing the purpose of the test, so that the test scenario to be executed is determined indirectly from the test task information.
The test scenario can be used to describe the type and/or the application environment corresponding to the test task, and specific test data can then be generated according to the definition of the test scenario. Corresponding scenario templates can be preset for the test scenarios, so that the corresponding test scenario can be determined from the specific information of the test task.
In some embodiments, the test scenarios may include at least one of a technology class test scenario, a business class test scenario, and a configuration class test scenario. A technology class test scenario targets a specific technical concern, for example a particular data processing procedure or test verification procedure in the test flow. A business class test scenario may, for example, execute a specific preset business and check whether the corresponding business parameters can be obtained accurately, for instance to verify that a certain business can be executed accurately and effectively online. A configuration class test scenario tests specific business configuration information, for example verifying that particular parameters can be configured normally during business execution.
In some implementations, the test scenario corresponding to a test task may be determined based on preset rules. A rule may determine the corresponding test scenario from information such as the type of the test task or its application scenario. The rules for different test tasks can be specified in advance, before a particular test task is executed. Specifically, first layer rules, second layer rules, and third layer rules may be specified in turn according to the different types and scenarios of the test tasks. The first layer rules define the general direction of the test scenario; for example, based on the scenario classes described above, a first layer rule may map each scenario class to a general test direction.
As a specific example, the first layer rules may specify that when the test scenario is a technology class test scenario, it corresponds to by-Region distribution verification and exception test verification; when the test scenario is a configuration class test scenario, it corresponds to application permission verification and channel control verification; and when the test scenario is a business class test scenario, it corresponds to channel classification verification, marketing list verification, and primary key (Row-key) verification.
Accordingly, where first layer rules are set, second layer rules may also be set under them. The second layer rules further subdivide the first layer rules: a specific division rule is determined under each general direction, and specific test data can then be generated based on the determined rule.
Continuing the examples above: for by-Region distribution verification, the rules can be configured as single-Region skewed distribution verification and multi-Region average distribution verification; for exception test verification, as missing column data verification, column field null verification, column format exception verification, and the like; for application permission verification, as personal client marketing permission verification, legal person client marketing permission verification, self-service terminal marketing permission verification, and the like; for channel control verification, as transfer and remittance control verification, account detail control verification, financial product control verification, and balance inquiry control verification; for channel classification verification, as accurate recommendation PAD channel verification, personal detail channel verification, and the like; for marketing list verification, as no marketing list verification, single marketing list verification, and multiple marketing list verification; and for primary key Row-key verification, as customer number # channel # marketing campaign number primary key verification, customer number # channel primary key verification, and customer number # channel # priority primary key verification.
Correspondingly, after the test task is acquired, the specific test scenario can be determined according to these rules. The test scenario may be the application scenario corresponding to a first layer rule or the specific application scenario corresponding to a second layer rule. Other types of rules may be set as needed in practical applications and are not limited to the above examples, which are not repeated here.
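To make the layered rules concrete, the following is a minimal sketch of a two-layer rule table in the spirit of the examples above; the dictionary layout, key names, and helper function are illustrative assumptions, not a structure prescribed by the patent.

```python
# Illustrative two-layer rule table: scenario class (first layer) maps to
# general test directions, each of which maps to second layer rules.
LAYERED_RULES = {
    "technology": {
        "region_distribution": ["single_region_skewed", "multi_region_average"],
        "exception_test": ["missing_column", "column_field_null", "column_format_error"],
    },
    "configuration": {
        "application_permission": ["personal_marketing", "corporate_marketing",
                                   "self_service_terminal_marketing"],
        "channel_control": ["transfer", "account_detail", "financial_product",
                            "balance_inquiry"],
    },
    "business": {
        "channel_classification": ["pad_recommendation", "personal_detail"],
        "marketing_list": ["none", "single", "multiple"],
        "row_key": ["customer#channel#campaign", "customer#channel",
                    "customer#channel#priority"],
    },
}

def resolve_rules(scenario_class: str, direction: str) -> list[str]:
    """Return the second layer rules for a first layer test direction."""
    return LAYERED_RULES[scenario_class][direction]

# Example: a technology class task that verifies by-Region distribution.
print(resolve_rules("technology", "region_distribution"))
```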
S120: and generating at least two groups of test data according to the test scene.
The test data may be data that is applied to a specific test procedure. The test data may be service data generated by simulating a service in an actual application, or may be application program data corresponding to a program executing process, and the specific type is not limited. Different test processes can be executed based on the test data, and corresponding test results are obtained. The test data may include valid test data and invalid test data, where the valid test data may be data capable of successfully completing a test process, and the invalid test data may be data in which at least one vulnerability occurs in the test process.
In some embodiments, the test data may be data stored in an HBase database. The HBase database is a non-relational database; test data in it is stored in a key-value format, so the corresponding test data can be looked up quickly.
After the test scenario corresponding to the test task is determined, the corresponding test data can be determined from it. Where the test scenario defines the corresponding test requirements, test data can be generated based on the scenario. For example, templates defining the correspondence between different test scenarios and different test data may be preset, so that the test data corresponding to a scenario can be determined from the template.
Fig. 2 shows an example of generating test data based on a test scenario. After the test scenario is set in step S201, it is determined in turn whether the scenario is a configuration class, technology class, or business class test scenario. For a configuration class test scenario, step S202 is executed to derive configuration class test data according to the rules; for a technology class test scenario, step S203 is executed to derive technology class test data according to the rules; for a business class test scenario, step S204 is executed to derive business class test data according to the rules.
In some embodiments, based on the specific example in step S110, different types of test data may be configured for the technology class, business class, and configuration class test scenarios respectively. Specifically, when the test scenario is a technology class test scenario, the test data includes at least one of single-Region skewed distribution verification data, multi-Region average distribution verification data, missing column data verification data, column field null verification data, and column format exception verification data.
When the test scenario is a business class test scenario, the test data includes at least one of accurate recommendation PAD channel verification data, personal detail channel verification data, no marketing list verification data, single marketing list verification data, multiple marketing list verification data, customer number # channel # marketing campaign number primary key verification data, and customer number # channel # priority primary key verification data.
When the test scenario is a configuration class test scenario, the test data includes at least one of personal client marketing permission verification data, legal person client marketing permission verification data, self-service terminal marketing permission verification data, transfer and remittance control verification data, account detail control verification data, financial product control verification data, and balance inquiry control verification data.
The above embodiments only illustrate specific test data selected for different test scenarios; in practical applications, test data may be generated in other ways as required, which are not repeated here.
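As an illustration of deriving test data from a resolved second layer rule, the sketch below builds HBase-style key-value rows whose row keys follow the primary key patterns listed above (e.g., customer number # channel # marketing campaign number); the column family and field names are assumptions.

```python
# Hedged sketch: generate groups of HBase-style test rows for a primary key
# rule. The column family "cf" and the value layout are illustrative only.
def generate_test_rows(rule: str, n_groups: int = 2) -> list[dict]:
    rows = []
    for i in range(n_groups):
        customer, channel, campaign = f"C{i:08d}", "PAD", f"M{i:04d}"
        parts = {
            "customer#channel#campaign": [customer, channel, campaign],
            "customer#channel": [customer, channel],
            "customer#channel#priority": [customer, channel, str(i)],
        }[rule]
        rows.append({
            "row_key": "#".join(parts),   # composite HBase row key
            "cf:customer_no": customer,   # column family:qualifier pairs
            "cf:channel": channel,
        })
    return rows

rows = generate_test_rows("customer#channel#campaign")
```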
S130: and testing by using the test data to obtain a test result.
After the test data is obtained, a corresponding test process can be executed based on the test data, and a corresponding test result can be obtained based on the test process.
The test process based on the test data may be preset with a corresponding template, so that the test data is executed based on the template to obtain a corresponding test result.
In some embodiments, the testing process may construct a test message based on the test data, then execute the test message using preset test rules to obtain a response message. The response message is compared with the expected test result to obtain the test result.
The test message has the message format required by the corresponding interface test; once constructed from the test data, it allows the specific test process to be performed conveniently and effectively.
In this embodiment, when the test message is executed, the application's test service interface may be registered in a registry, and the test message is sent to the registry through an automated test script, so that the registry calls the service interface and returns a response message based on the information in the test message. During the test, the automated test script obtains the response message corresponding to each sent message and stores it in a storage file in the corresponding format.
The response message is the response result obtained for the test message during testing and reflects the corresponding information of the test process. Based on the above embodiment, the automated test script may parse the response messages in the storage file and compare each parsed response message with the expected result field by field; if the comparison is consistent, the message passes verification and the corresponding test result is a normal test result.
Accordingly, if the comparison is inconsistent, for example if at least one of field value inconsistency, field missing, field redundancy, or field inconsistency occurs during the comparison, the test result may be marked as an abnormal test result. The data corresponding to these cases can be stored as test vulnerability data for use in the subsequent analysis process.
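The field-by-field comparison described above might look like the following sketch; sending messages through the registry and persisting responses to the storage file are elided, and all names are assumptions.

```python
def compare_response(response: dict, expected: dict) -> list[str]:
    """Compare a parsed response message with the expected result field by
    field; an empty list means the message passes verification."""
    issues = []
    for field, want in expected.items():
        if field not in response:
            issues.append(f"field missing: {field}")
        elif response[field] != want:
            issues.append(f"field value inconsistent: {field}")
    for field in response.keys() - expected.keys():
        issues.append(f"field redundant: {field}")
    return issues

# One inconsistent field and one redundant field -> abnormal test result.
vulns = compare_response({"status": "01", "extra": "x"}, {"status": "00"})
test_result = "normal" if not vulns else "abnormal"
```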
In some embodiments, a test validity period may be set for the test results. The test validity period defines the time window within which test results are selected. It may be relative to the current time, for example selecting data from the last month, or relative to a specific time point, for example data within a certain period starting from the submission of the tested program.
Specifically, the test validity period may be set to three months, i.e., from the month in which the program version is submitted to three months thereafter; the test results within this period are used in the subsequent evaluation process, which ensures the timeliness of the test data.
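A small sketch of filtering results by the test validity period, approximating a month as 30 days; the function name and the day-based approximation are assumptions.

```python
from datetime import datetime, timedelta

def in_validity_period(result_time: datetime, version_submitted: datetime,
                       months: int = 3) -> bool:
    """Keep a test result only if it falls inside the validity period,
    counted from the submission of the program version."""
    return version_submitted <= result_time <= version_submitted + timedelta(days=30 * months)

fresh = in_validity_period(datetime(2021, 6, 1), datetime(2021, 4, 28))  # True
```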
S140: evaluating the test result to obtain an evaluation result; and the evaluation result is used for representing the vulnerability situation corresponding to the test result.
After the test result is obtained, it may be evaluated to obtain an evaluation result. The evaluation result describes the vulnerability situation in the test process. When a test process is executed with given test data, the test may complete successfully, may fail to complete normally, or may complete but with execution errors. Based on the specific completion situation, a corresponding evaluation can be made to reflect the vulnerability situation of the test result.
In some embodiments, the evaluation result may be obtained by first extracting test vulnerability data from the test result. The test vulnerability data is used for describing vulnerabilities occurring in the test process. Specifically, the test vulnerability data may include at least one of a vulnerability number and a vulnerability discovery time. The test vulnerability data may reflect an execution condition of a test process, so that an evaluation result may be determined based on the vulnerability data.
In particular, the test vulnerability data may correspond to different phases: for example, test vulnerability data from program development to the version delivery phase may be counted, as may test vulnerability data from version delivery to production. By collecting test vulnerability data by phase, the test situations of different phases can be analyzed separately, improving the accuracy of the analysis results.
In some implementations, after the test vulnerability data is obtained, the vulnerabilities in it may also be rated. Specifically, the vulnerability data type corresponding to the test vulnerability data may be determined first, and the vulnerability rating is then obtained according to that type. The vulnerability data type describes the specific type of the vulnerability and can be set in advance from historical test situations or testers' experience. The vulnerability rating describes the severity of a vulnerability and may include, for example, a severe level, a general level, and a minor level. Determining the rating of each vulnerability supports the evaluation in the subsequent process.
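A sketch of mapping vulnerability data types to ratings and counting vulnerabilities per rating; the concrete type-to-rating assignments are assumptions, since the text leaves the mapping to historical data or tester experience.

```python
# Hypothetical mapping from vulnerability data type to rating; these
# particular assignments are illustrative only.
RATING_BY_TYPE = {
    "field missing": "fatal",
    "field value inconsistent": "severe",
    "field redundant": "general",
    "field format": "minor",
}

def count_by_rating(vuln_types: list[str]) -> dict[str, int]:
    """Count vulnerabilities per rating; unknown types default to general."""
    counts = {"fatal": 0, "severe": 0, "general": 0, "minor": 0}
    for t in vuln_types:
        counts[RATING_BY_TYPE.get(t, "general")] += 1
    return counts
```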
S150: and screening valid test data in the at least two groups of test data based on the evaluation result.
After the evaluation result is obtained, valid test data can be screened from the test data according to it. Valid test data are test data that achieve a good test effect in the test process. Because the evaluation result reflects the quality of the test results produced by the test data, the corresponding valid test data can be selected based on it.
In some embodiments, after the evaluation result is obtained, a validity score corresponding to the test data may be calculated from it, and the screening of valid test data is completed by comparing the validity score with a validity score threshold. The validity score threshold bounds the minimum validity score of valid test data; it may be obtained by training on historical sample data or set from testers' experience, without limitation.
Specifically, a validity score corresponding to the test data can be calculated with a scoring formula in which G is the validity score; γ and λ are weight factors satisfying γ + λ = 1; W_i is the coefficient corresponding to the i-th vulnerability rating; P_i is the number of vulnerabilities of the corresponding rating; Q_1 is the number of vulnerabilities found before the test task is delivered; and Q_2 is the number of vulnerabilities found after delivery. Based on the rating of vulnerability data in step S140, n may be set to 4, with P_1 through P_4 denoting the numbers of fatal, severe, general, and minor test issues, respectively.
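Only the symbol definitions of the scoring formula survive in this text (the expression itself appears as an image in the source), so the combination below is an assumption: a rating-weighted vulnerability term plus a pre-delivery discovery ratio, mixed by the weights γ and λ.

```python
def validity_score(p: dict[str, int], w: dict[str, float],
                   q1: int, q2: int, gamma: float = 0.6, lam: float = 0.4) -> float:
    """Assumed form: G = gamma * (sum(Wi*Pi) / sum(Pi)) + lam * Q1 / (Q1 + Q2).

    G, gamma, lam, Wi, Pi, Q1, Q2 are as defined in the text; how they are
    combined here is illustrative, not the patent's exact formula.
    """
    assert abs(gamma + lam - 1.0) < 1e-9          # gamma + lam = 1 per the text
    total = sum(p.values())
    weighted = sum(w[r] * p[r] for r in p) / total if total else 0.0
    pre_delivery = q1 / (q1 + q2) if (q1 + q2) else 1.0
    return gamma * weighted + lam * pre_delivery

g = validity_score({"fatal": 1, "severe": 2, "general": 3, "minor": 4},
                   {"fatal": 0.4, "severe": 0.3, "general": 0.2, "minor": 0.1},
                   q1=8, q2=2)
```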
A specific example is described below in connection with fig. 3. As shown in fig. 3, step S301 counts the numbers of issues before and after delivery; step S302 counts the numbers of issues of each severity level; step S303 counts the total number of test issues; step S304 sets the validity threshold; step S305 calculates the validity score of the test data; and step S306 determines whether the validity score is below the validity threshold, so that the test data is updated according to the determination.
In some embodiments, test data other than the valid test data may be treated as invalid test data. Invalid test data exhibits more test problems during testing and cannot achieve a good test effect, so it can be excluded from subsequent tests or removed from the database directly, thereby protecting the test effect.
The above process of determining valid test data and clearing invalid test data may be performed on a fixed period, such as scoring the test data for validity monthly or quarterly, to ensure the validity of the data in the database.
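A sketch of the fixed-period screening job; the monthly interval, the threshold value, and the load/save callables standing in for the database layer are all assumptions.

```python
import time
import threading

VALIDITY_THRESHOLD = 0.5  # assumed value; the text leaves it to training or experience

def screen_test_data(groups: list[dict]) -> list[dict]:
    """Keep only groups whose validity score clears the threshold."""
    return [g for g in groups if g["score"] >= VALIDITY_THRESHOLD]

def start_periodic_screening(load, save, interval_s: int = 30 * 24 * 3600):
    """Run the screening on a fixed period (monthly here); load/save are
    callables standing in for the database layer."""
    def loop():
        while True:
            save(screen_test_data(load()))
            time.sleep(interval_s)
    threading.Thread(target=loop, daemon=True).start()
```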
An example scenario is described below in connection with figs. 4 and 5. Fig. 4 shows a block diagram of the test system. The HBase database cluster comprises a control node and computing nodes and stores and maintains the test data; the configuration database servers comprise a scheduling server and an Oracle database server and maintain the logic rules of the method, so that the specific test process can be executed. The application part comprises a database server for retrieving data from the database and an application server for executing the specific test process on the received test data. Specifically, the test data may be sent to the service interfaces of external applications to perform interface testing, and the response messages fed back by the interfaces are received to complete the test process.
Fig. 5 shows a flow chart of the test method. A rule base is defined in step S1; after the logic and test method of the test process are determined, step S2 is executed according to the defined rule base to generate test data. Step S3 then performs interface verification with the generated test data and obtains the corresponding verification results. Step S4 evaluates the validity of the test data, and step S5 updates the test data by determining whether it falls below the validity threshold, thereby ensuring the validity of the test data.
Based on the description of the embodiment and the scenario examples, it can be seen that, after the method obtains the test scenario corresponding to the test task, corresponding test data is generated. After the test data is used for testing to obtain a test result, the test result can be evaluated to obtain an evaluation result, so that the vulnerability condition of the test data is evaluated, and finally effective test data is obtained by screening in the test data based on the evaluation result. After generating a plurality of groups of test data, the method can correspondingly evaluate the test data based on the actual test effect of the test data, thereby ensuring the test effect of the test data, avoiding the interference of invalid test data on the test result and being beneficial to the development and operation of products.
Based on the testing method corresponding to fig. 1, a testing device according to an embodiment of the present disclosure is described. As shown in fig. 6, the test apparatus includes the following modules.
A test scenario determination module 610, configured to determine a test scenario corresponding to a test task; the test scenario is used for describing the type and/or application environment of the test task.
The test data generating module 620 is configured to generate at least two sets of test data according to the test scenario.
And the test result obtaining module 630 is configured to perform a test by using the test data to obtain a test result.
The test result evaluation module 640 is configured to evaluate the test result to obtain an evaluation result; and the evaluation result is used for representing the vulnerability situation corresponding to the test result.
And a valid test data screening module 650 for screening valid test data among the at least two sets of test data based on the evaluation result.
Based on the test method corresponding to fig. 1, the embodiment of the present disclosure provides test equipment. As shown in fig. 7, the test equipment may include a memory and a processor.
In this embodiment, the memory may be implemented in any suitable manner. For example, the memory may be a read-only memory, a mechanical hard disk, a solid state drive, or a USB disk. The memory may be used to store computer program instructions.
In this embodiment, the processor may be implemented in any suitable manner. For example, the processor may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, application specific integrated circuits (ASICs), programmable logic controllers, embedded microcontrollers, and the like. The processor may execute the computer program instructions to perform the steps of: determining a test scenario corresponding to the test task; the test scenario is used for describing the type and/or application environment of the test task; generating at least two groups of test data according to the test scenario; testing with the test data to obtain a test result; evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; and screening valid test data from the at least two groups of test data based on the evaluation result.
It should be noted that the test method, apparatus and device described in the embodiments of the present disclosure may be applied to the field of big data technology, or may be applied to any other field except the field of big data technology, which is not limited thereto.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). With the development of technology, however, many improvements to method flows today can be regarded as direct improvements to hardware circuit structures: designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (PLD) (e.g., a field programmable gate array (FPGA)) is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a single PLD, without asking a chip manufacturer to design and fabricate an application specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, this programming is now mostly implemented with "logic compiler" software, which is similar to the software compilers used in program development; the source code to be compiled must likewise be written in a specific programming language, called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It should also be clear to those skilled in the art that a hardware circuit implementing a logic method flow can easily be obtained merely by slightly logic-programming the method flow into an integrated circuit using one of the above hardware description languages.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
From the above description of the embodiments, it will be clear to those skilled in the art that the present specification may be implemented by means of software plus the necessary general hardware platform. Based on this understanding, the technical solution of the present specification, in essence or in the part contributing to the prior art, may be embodied in the form of a software product, which may be stored in a storage medium such as ROM/RAM, a magnetic disk, or an optical disk, and which includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present specification.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, as relevant to see a section of the description of method embodiments.
The specification is operational with numerous general purpose or special purpose computing system environments or configurations, for example: personal computers, server computers, hand-held or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and distributed computing environments that include any of the above systems or devices.
The description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Although the present specification has been described by way of example, it will be appreciated by those skilled in the art that there are many variations and modifications to the specification without departing from the spirit of the specification, and it is intended that the appended claims encompass such variations and modifications as do not depart from the spirit of the specification.

Claims (8)

1. A method of testing, comprising:
determining a test scenario corresponding to a test task; the test scenario is used for describing the type and/or application environment of the test task;
generating at least two groups of test data according to the test scenario;
testing by using the test data to obtain a test result;
evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; wherein this comprises: extracting test vulnerability data from the test result, the test vulnerability data comprising at least one of a vulnerability number and a vulnerability discovery time; determining the evaluation result based on the test vulnerability data; determining a vulnerability data type corresponding to the test vulnerability data; and acquiring a vulnerability rating corresponding to the test vulnerability data based on the vulnerability data type, the vulnerability ratings including a fatal level, a severe level, a general level, and a minor level;
screening valid test data from the at least two groups of test data based on the evaluation result; wherein this comprises: calculating a validity score corresponding to the test data with a scoring formula in which G is the validity score, γ and λ are weight factors satisfying γ + λ = 1, W_i is the coefficient corresponding to the vulnerability rating, P_i is the number of vulnerabilities of the corresponding rating, Q_1 is the number of vulnerabilities before the test task is delivered, and Q_2 is the number of vulnerabilities after the test task is delivered; and comparing the validity score with a validity score threshold to screen the valid test data.
2. The method of claim 1, wherein the test scenarios comprise at least one of a technology class test scenario, a business class test scenario, and a configuration class test scenario.
3. The method of claim 2, wherein, in the case where the test scenario is a technology class test scenario, the test data comprises at least one of single-Region skewed distribution verification data, multi-Region average distribution verification data, missing column data verification data, column field null verification data, and column format exception verification data;
in the case where the test scenario is a business class test scenario, the test data comprises at least one of accurate recommendation PAD channel verification data, personal detail channel verification data, no marketing list verification data, single marketing list verification data, multiple marketing list verification data, customer number # channel # marketing campaign number primary key verification data, and customer number # channel # priority primary key verification data;
in the case where the test scenario is a configuration class test scenario, the test data comprises at least one of personal client marketing permission verification data, legal person client marketing permission verification data, self-service terminal marketing permission verification data, transfer and remittance control verification data, account detail control verification data, financial product control verification data, and balance inquiry control verification data.
4. The method of claim 1, wherein the test results comprise test results selected within a test validity period.
5. The method of claim 1, wherein the testing with the test data to obtain test results comprises:
constructing a test message based on the test data;
executing the test message by using a preset test rule, and obtaining a response message;
Comparing the response message with an expected test result to obtain a test result; the test results comprise normal test results and abnormal test results.
6. The method of claim 5, wherein the exception test result comprises test vulnerability data; the test vulnerability data includes data corresponding to at least one of field value inconsistencies, field missing, field redundancy, and field inconsistencies.
7. A test device, comprising:
The test scenario determining module is used for determining a test scenario corresponding to the test task; the test scenario is used for describing the type and/or application environment of the test task;
The test data generation module is used for generating at least two groups of test data according to the test scenario;
The test result acquisition module is used for testing by utilizing the test data to acquire a test result;
The test result evaluation module is used for evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; wherein this comprises: extracting test vulnerability data from the test result, the test vulnerability data comprising at least one of a vulnerability number and a vulnerability discovery time; determining the evaluation result based on the test vulnerability data; determining a vulnerability data type corresponding to the test vulnerability data; and acquiring a vulnerability rating corresponding to the test vulnerability data based on the vulnerability data type, the vulnerability ratings including a fatal level, a severe level, a general level, and a minor level;
The valid test data screening module is used for screening valid test data from the at least two groups of test data based on the evaluation result; wherein this comprises: calculating a validity score corresponding to the test data with a scoring formula in which G is the validity score, γ and λ are weight factors satisfying γ + λ = 1, W_i is the coefficient corresponding to the vulnerability rating, P_i is the number of vulnerabilities of the corresponding rating, Q_1 is the number of vulnerabilities before the test task is delivered, and Q_2 is the number of vulnerabilities after the test task is delivered; and comparing the validity score with a validity score threshold to screen the valid test data.
8. A test apparatus comprising a memory and a processor;
the memory is used for storing computer program instructions;
The processor is configured to execute the computer program instructions to implement the steps of: determining a test scenario corresponding to the test task; the test scenario is used for describing the type and/or application environment of the test task; generating at least two groups of test data according to the test scenario; testing with the test data to obtain a test result; evaluating the test result to obtain an evaluation result; the evaluation result is used for representing the vulnerability situation corresponding to the test result; wherein this comprises: extracting test vulnerability data from the test result, the test vulnerability data comprising at least one of a vulnerability number and a vulnerability discovery time; determining the evaluation result based on the test vulnerability data; determining a vulnerability data type corresponding to the test vulnerability data; acquiring a vulnerability rating corresponding to the test vulnerability data based on the vulnerability data type, the vulnerability ratings including a fatal level, a severe level, a general level, and a minor level; and screening valid test data from the at least two groups of test data based on the evaluation result; wherein this comprises: calculating a validity score corresponding to the test data with a scoring formula in which G is the validity score, γ and λ are weight factors satisfying γ + λ = 1, W_i is the coefficient corresponding to the vulnerability rating, P_i is the number of vulnerabilities of the corresponding rating, Q_1 is the number of vulnerabilities before the test task is delivered, and Q_2 is the number of vulnerabilities after the test task is delivered; and comparing the validity score with a validity score threshold to screen the valid test data.
CN202110466642.0A 2021-04-28 2021-04-28 Test method, device and equipment Active CN113157583B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110466642.0A CN113157583B (en) 2021-04-28 2021-04-28 Test method, device and equipment

Publications (2)

Publication Number Publication Date
CN113157583A (en) 2021-07-23
CN113157583B (en) 2024-04-26

Family

ID=76871877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110466642.0A Active CN113157583B (en) 2021-04-28 2021-04-28 Test method, device and equipment

Country Status (1)

Country Link
CN (1) CN113157583B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111294345A (en) * 2020-01-20 2020-06-16 支付宝(杭州)信息技术有限公司 Vulnerability detection method, device and equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9904786B2 (en) * 2013-01-17 2018-02-27 International Business Machines Corporation Identifying stored security vulnerabilities in computer software applications

Also Published As

Publication number Publication date
CN113157583A (en) 2021-07-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant