CN114860615A - Automated rule testing method and apparatus, electronic device, and storage medium

Info

Publication number
CN114860615A
Authority
CN
China
Prior art keywords: user, result, rules, test, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210629406.0A
Other languages
Chinese (zh)
Inventor
Yu Jun (余俊)
Deng Jianwei (邓建伟)
Cen Henghui (岑恒辉)
Zhang Jinming (张晋铭)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Futuo Network Technology Shenzhen Co ltd
Original Assignee
Futuo Network Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Futuo Network Technology Shenzhen Co ltd
Priority: CN202210629406.0A
Publication: CN114860615A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 - Error detection; Error correction; Monitoring
    • G06F 11/36 - Preventing errors by testing or debugging software
    • G06F 11/3664 - Environments for testing or debugging software
    • G06F 11/3668 - Software testing
    • G06F 11/3672 - Test management
    • G06F 11/3684 - Test management for test design, e.g. generating new test cases
    • G06F 11/3692 - Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiments of this application disclose an automated rule testing method, apparatus, device, and storage medium. The method includes: synchronizing a plurality of rules on a system and automatically generating a plurality of test cases corresponding to the rules, where the test cases include the expected trigger results of the rules; constructing user data according to the test cases and sending the user data to a sandbox that simulates the system's actual trigger environment for the rules; receiving the actual trigger result returned by the sandbox under the target rule matched with the user data; and comparing the expected trigger result with the actual trigger result to generate a test result from the comparison. The embodiments can synchronize all rules on a system and automatically generate the corresponding test cases for automated verification, which reduces operation and maintenance labor, improves the validity and accuracy of automated verification, and lowers testing cost.

Description

Automated rule testing method and apparatus, electronic device, and storage medium
Technical Field
This application relates to the field of computer technology, and in particular to an automated rule testing method and apparatus, an electronic device, and a storage medium.
Background
A rule is a systematic decision procedure that applies information in a sustained and predictable way. In the actual operating environment of a system running many rules, incidents of repeated rule triggering or erroneous rule triggering often arise from mismatched operational configuration, logic errors in rule code, and the like. At present, however, rule testing relies mainly on manual testing, which is inefficient and time-consuming, and with limited resources cannot traverse all the rules running online, leaving hidden quality risks. Moreover, the manual-testing templates in the prior art have poor compatibility and a narrow scope of application, and require explicit environment configuration; automation coverage is therefore insufficient, operation and maintenance labor is wasted, and the cost of rule testing rises while the validity and accuracy of automated verification are lost.
Disclosure of Invention
To solve the above technical problem, embodiments of the present application provide an automated rule testing method and apparatus, an electronic device, and a storage medium.
According to an aspect of an embodiment of the present application, there is provided a method for automatically testing rules, including:
synchronizing a plurality of rules on a system and automatically generating a plurality of test cases corresponding to the rules, wherein the test cases comprise expected trigger results corresponding to the rules;
constructing user data according to the test case, and sending the user data to a sandbox, wherein the sandbox is used for simulating an actual trigger environment for the rules on the system;
receiving an actual trigger result returned by the sandbox under a target rule matched with the user data;
and comparing the expected trigger result with the actual trigger result, and generating a test result according to the obtained comparison result.
According to an aspect of an embodiment of the present application, there is provided an automated rule testing apparatus, including:
the first generation module is used for synchronizing a plurality of rules on a system and automatically generating a plurality of test cases corresponding to the rules, wherein the test cases comprise expected trigger results corresponding to the rules;
the user data construction module is used for constructing user data according to the test cases and sending the user data to a sandbox, and the sandbox is used for simulating the actual trigger environment of the system for the rules;
the receiving module is used for receiving an actual trigger result returned by the sandbox under a target rule matched with the user data;
and the second generation module is used for comparing the expected trigger result with the actual trigger result and generating a test result according to the obtained comparison result.
In an embodiment of the application, the user data constructing module is specifically configured to:
constructing user attribute data and user behavior data according to the test case;
simulating a user behavior event message based on the user attribute data and the user behavior data, and sending the simulated user behavior event message as the user data to the sandbox.
In an embodiment of the application, the user data constructing module is further specifically configured to:
calling a user data interface contained in a preset database with the test case as an input parameter;
and receiving user attribute data and user behavior data returned by the user data interface aiming at the test case.
In an embodiment of the application, the user data constructing module is further specifically configured to:
simulating a user identity according to the user attribute data, and simulating a user behavior triggered under the user identity according to the user behavior data;
and generating the user behavior event message based on the user behavior triggered under the user identity obtained by simulation.
In an embodiment of the application, the second generating module is specifically configured to:
determining an expected trigger result corresponding to the target rule from the plurality of test cases;
and comparing the expected trigger result corresponding to the target rule with the actual trigger result to generate the test result according to the obtained comparison result.
In one embodiment of the present application, the apparatus further comprises:
the calculation module is used for calculating an approximate value between the expected trigger result and the actual trigger result;
and the error reporting module is used for sending error reporting information containing the optimization scheme to the background if the approximate value is smaller than the preset threshold.
In one embodiment of the present application, the apparatus further comprises:
the data cleaning module is used for cleaning the user data if the test result indicates that the test passes;
and the data updating module is used for sending the user data to the sandbox again if the test result indicates that the test is not passed, so as to update the test result based on the data returned by the sandbox.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; a storage device to store one or more programs that, when executed by the one or more processors, cause the electronic device to implement the rule automation testing method as previously described.
According to an aspect of embodiments herein, there is provided a computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to perform a rule automation testing method as described above.
There is also provided, according to an aspect of an embodiment of the present application, a computer program product comprising a computer program which, when executed by a processor, implements the steps in the rule testing method as described above.
According to the technical solutions above, all rules online are synchronized to automatically generate the corresponding test cases and expected trigger results; user behavior is then simulated from the data in the test cases, the rules are triggered in the actual environment simulated by the sandbox to generate the corresponding actual trigger results, and the actual trigger results are compared with the expected trigger results to obtain the test results. Because the corresponding test cases are generated automatically from the rules on the system and verified automatically, operation and maintenance labor is reduced, the validity and accuracy of automated verification are improved, and testing cost is lowered.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 schematically illustrates an exemplary system architecture block diagram to which the subject technology applies;
FIG. 2 schematically illustrates a flow chart of a method for automated testing of rules provided by an embodiment of the present application;
FIG. 3 is a flow chart of step S220 in the embodiment shown in FIG. 2 in an exemplary embodiment;
FIG. 4 is a flow chart of step S222 in the embodiment shown in FIG. 3 in an exemplary embodiment;
FIG. 5 is a flow chart of step S240 in the embodiment shown in FIG. 2 in an exemplary embodiment;
FIG. 6 is a flow chart illustrating a method for automated testing of rules in accordance with another exemplary embodiment of the present application;
FIG. 7 is a block diagram schematically illustrating an exemplary embodiment of a rule automation testing apparatus;
FIG. 8 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Reference to "a plurality" in this application means two or more. "and/or" describe the association relationship of the associated objects, meaning that there may be three relationships, e.g., A and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments, and in the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the embodiments of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
Fig. 1 schematically shows a block diagram of an exemplary system architecture to which the solution of the present application applies.
As shown in fig. 1, system architecture 100 may include a terminal device 110, a network 120, and a server 130. Terminal device 110 may include a smart phone, a tablet computer, a notebook computer, an intelligent voice interaction device, an intelligent appliance, a vehicle-mounted terminal, and so on. The server 130 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing a cloud computing service. Network 120 may be a communication medium of various connection types capable of providing a communication link between terminal device 110 and server 130, such as a wired communication link or a wireless communication link.
This implementation environment may be embodied as an exemplary test system in which the terminal device 110 serves as a test terminal and the server 130 serves as a test server. The server 130 automatically synchronizes a plurality of rules on the system under test and automatically generates a plurality of test cases corresponding to the rules, where the test cases include the expected trigger results of the rules. It then constructs user data according to the test cases and sends the user data to a sandbox that simulates the system's actual trigger environment for the rules, receives the actual trigger result returned by the sandbox under the target rule matched with the user data, and finally compares the expected trigger result with the actual trigger result to generate a test result from the comparison. The server 130 may also send the obtained test result to the terminal device 110 for visual display, so that it can be obtained by the relevant testers.
The system architecture in the embodiments of the present application may have any number of terminal devices, networks, and servers, according to implementation needs. For example, the server 130 may be a server group composed of a plurality of server devices. In addition, the technical solution provided in the embodiment of the present application may be applied to the terminal device 110, or may be applied to the server 130, or may be implemented by both the terminal device 110 and the server 130, which is not particularly limited in this application.
The automated rule testing method provided by the present application is described in detail below with reference to specific embodiments.
FIG. 2 schematically illustrates a flow chart of a method for automated testing of rules provided by an embodiment of the present application, which may be performed by a server, such as the server 130 shown in FIG. 1; the method may also be performed by a terminal device, such as terminal device 110 shown in fig. 1, without limitation.
As shown in fig. 2, the method for automatically testing rules provided in the embodiment of the present application includes steps S210 to S240, which are described in detail as follows:
step S210, synchronizing a plurality of rules on the system and automatically generating a plurality of test cases corresponding to each rule, where the test cases include expected trigger results corresponding to each rule.
As mentioned above, a rule in this embodiment is a systematic decision procedure that applies information in a sustained and predictable manner: when external information or a trigger condition satisfies the rule, a corresponding trigger result is generated. Trigger results may include consequential events such as rewards, penalties, and discounts. For example, triggering a reward rule yields a corresponding reward result, triggering a penalty rule yields a corresponding penalty result, and triggering a discount rule yields a corresponding discount result.
In this embodiment, all rules on the system are synchronized: after a rule is changed or added, all rules on the platform are traversed automatically. A system typically carries a large number of rules, and their mutual influence and constraints form a systematic decision program. Test cases are automatically generated along the dimension of a single rule; a test case here is the precedent data needed to test the trigger conditions of the corresponding rule, i.e. the data required to run an automated test against that single rule. The rules in question may be reward rules, promotional policies, exemption rules, penalty rules, and so on introduced by each system. Taking the reward rules running on a system as an example, a bank, financial institution, or securities company may run different reward rules against the behavior of a user under a given identity. In this embodiment, the system automatically traverses all existing rules on the platform, triggers all possible events under each rule along the single-rule dimension, and generates the corresponding plurality of test cases, each of which contains the expected reward result of its event.
Illustratively, a securities company's system runs rules such as registration, account opening, deposit, and rollover. Registration means that a user submits a username and password to be approved by the computer network system. Account opening means creating a trading account for securities at the securities company; existing methods include offline account opening and online account opening, typically registering an account and then applying to open it. Deposit means transferring funds into a securities account. Rollover, typically in futures trading, means transferring a contract that is about to expire into the next or a later delivery month. In this embodiment, trigger verification along the single-rule dimension needs to be performed on the registration, account opening, deposit, and rollover rules to generate the corresponding plurality of test cases.
It should be understood that the single-rule dimension means that, for each rule among all rules configured on the system, test cases covering every possible trigger of that rule are generated automatically; that is, only the trigger possibilities under the current rule are considered, without regard to other rules.
Take the deposit ladder rule as an example. A deposit ladder here means that the user transfers funds into a securities account in batches: all of the user's deposit amounts are superimposed, trigger events are generated according to the superimposed amounts, and different reward results follow from the deposit rule. Illustratively, in this embodiment a user earns reward 1 upon a net deposit of 5000 HKD (Hong Kong dollars) within the rule's time frame, and reward 2 upon a net deposit of 30000 HKD, where reward 1 and reward 2 are different rewards provided for the different deposit amounts. Under this rule, user A receives reward 1 upon depositing 5000 HKD and reward 2 upon depositing a further 25000 HKD, while user B receives both reward 1 and reward 2 for a one-time deposit of 30000 HKD.
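To make the rule concrete, the following minimal Python sketch evaluates the deposit ladder just described; the function name, tier table, and event encoding are illustrative assumptions, not the patent's implementation:

```python
from typing import List, Set, Tuple

# Tier thresholds in HKD mapped to reward identifiers (values from the example above).
TIERS: List[Tuple[float, str]] = [(5000.0, "reward 1"), (30000.0, "reward 2")]

def evaluate_deposit_ladder(events: List[float]) -> Set[str]:
    """events: signed HKD amounts (negative = withdrawal) inside the rule's time window.

    Deposits are superimposed into a running net amount; collecting rewards
    into a set means each tier triggers at most once.
    """
    rewards: Set[str] = set()
    net = 0.0
    for amount in events:
        net += amount
        for threshold, reward in TIERS:
            if net >= threshold:
                rewards.add(reward)
    return rewards

# User A: 5000 HKD earns reward 1; a further 25000 HKD earns reward 2 as well.
assert evaluate_deposit_ladder([5000.0]) == {"reward 1"}
assert evaluate_deposit_ladder([5000.0, 25000.0]) == {"reward 1", "reward 2"}
# User B: a one-time deposit of 30000 HKD earns both rewards.
assert evaluate_deposit_ladder([30000.0]) == {"reward 1", "reward 2"}
```

The test cases that follow then amount to feeding such an evaluator different event sequences and checking the returned set against the expected trigger result.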
For the deposit ladder rule, a plurality of corresponding test cases are generated automatically, verified mainly by deposit amount and deposit time:
Test case 1: a single deposit triggers tier 1 (user A deposits 5000 HKD and obtains reward 1);
Test case 2: tier 2 is triggered cumulatively (user B deposits 5000 HKD, then deposits another 25000 HKD, and obtains rewards 1 and 2);
Test case 3: a first deposit at the maximum tier amount obtains all rewards, and repeating that amount does not trigger the rewards again (user C deposits 30000 HKD, then deposits another 30000 HKD, and obtains rewards 1 and 2 only once);
Test case 4: a single deposit below the threshold amount triggers no reward (user D deposits 4999.99 HKD and triggers no reward);
Test case 5: the net deposit is less than the gross deposit because a withdrawal record exists, and triggers no reward (user E deposits 4999.99 HKD and withdraws 0.01 HKD; the net deposit stays below the 5000 HKD threshold and triggers no reward);
Test case 6: a deposit initiated before the rule's start time but credited within the rule's time range triggers no reward;
Test case 7: a deposit initiated after the rule's end time but credited before the rule's expiration time triggers no reward;
Test case 8: a deposit initiated within the rule's time range but credited after the rule's expiration time triggers no reward.
Step S220, constructing user data according to the test case, and sending the user data to a sandbox, wherein the sandbox is used for simulating the actual trigger environment of the system for the plurality of rules.
It should be noted that a sandbox is a virtual environment: the programs running in it are isolated and cannot affect the host operating system, all operations are rolled back when the sandbox is closed, and potentially risky programs and software can therefore be tested in it; sandboxing is a virtualization technique. In this embodiment, the sandbox is an isolated environment provided for running the rules triggered by user data. The corresponding user data is constructed according to the event messages in the plurality of automatically generated test cases, where an event message is the message that triggers the expected trigger result in a test case; in test case 2, for example, user B deposits 5000 HKD and then deposits another 25000 HKD. In other words, an event message is a message sent by some object, for example the user pressing a button or a file being changed.
Illustratively, the user attribute data and user behavior data corresponding to a test case are obtained by parsing the event message in that test case. Specifically, user attribute data includes data related to the user's attributes, such as the user ID (identity), the account level, identity attributes, the IP (Internet Protocol) address, and the user's company; user behavior data refers to user-triggered behaviors such as registration, account opening, deposit, and rollover. Combining the user's attribute data and behavior data yields a simulated user behavior event message, which is sent as the user data to the sandbox environment that simulates the system's actual trigger environment for the plurality of rules, so that the target rule corresponding to the user data is triggered in the sandbox to generate the corresponding actual trigger result. Specifically, the sandbox environment listens for the user behavior event message sent by the system and constructed from the user attribute data and user behavior data in the test case; the user-related attribute and behavior data in the message are obtained through the interface provided by the user-related service in the sandbox environment and matched with the corresponding target rule, and the actual trigger result is then generated according to the target rule's logic.
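As an illustration, the construction and dispatch of such a message might look like the following sketch, assuming a JSON-over-HTTP sandbox endpoint; the field set, helper names, and schema are assumptions rather than the patent's implementation:

```python
import json
import urllib.request
from dataclasses import dataclass, asdict

@dataclass
class UserBehaviorEvent:
    """Simulated user behavior event message (the field set is an assumption)."""
    user_id: str        # user attribute data
    account_level: str
    ip_address: str
    company: str
    action: str         # user behavior data, e.g. "deposit"
    amount: float

def build_event(test_case: dict) -> UserBehaviorEvent:
    """Combine the user attribute data and user behavior data parsed from a
    test case's event message into one user behavior event message."""
    attrs = test_case["user_attribute_data"]
    behavior = test_case["user_behavior_data"]
    return UserBehaviorEvent(
        user_id=attrs["user_id"],
        account_level=attrs["account_level"],
        ip_address=attrs["ip_address"],
        company=attrs["company"],
        action=behavior["action"],
        amount=behavior["amount"],
    )

def send_to_sandbox(event: UserBehaviorEvent, sandbox_url: str) -> dict:
    """POST the message to the sandbox and return the actual trigger result."""
    req = urllib.request.Request(
        sandbox_url,
        data=json.dumps(asdict(event)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```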
And step S230, receiving an actual trigger result returned by the sandbox under the target rule matched with the user data.
It should be understood that the actual trigger result is the trigger result generated in the sandbox environment that simulates the system's actual environment for the plurality of rules: the rule among the plurality of rules that matches the user data constructed from the test case is taken as the target rule, and the rule is triggered by the user data under that target rule. That is, after receiving the user data and actually triggering the target rule, the sandbox returns to the system the actual trigger result triggered by the user data under the target rule.
Taking test case 2 as an example: the user data "user B deposits 5000 HKD, then deposits another 25000 HKD" is extracted, from which the user attribute data "user B" and the user behavior data "deposit 5000 HKD, then deposit another 25000 HKD" are obtained. The user event behavior message for this specific user is then constructed from the extracted attribute and behavior data: under the identity of user B, deposit 5000 HKD and then deposit another 25000 HKD. The message is sent to the sandbox that simulates the actual trigger environment of the rules on the system; from the monitored message, the sandbox obtains the user-related attribute and behavior data through the interface provided by the user-related service in the sandbox environment, matches them with the corresponding target rule, and generates the actual trigger result according to the target rule's logic. Continuing with test case 2: after the sandbox obtains the user-related attribute and behavior data "user B deposits 5000 HKD, then deposits another 25000 HKD" through the relevant user interface, the matched target rule is the deposit rule, and the actual trigger result obtained under the deposit rule is "obtain reward 1 and reward 2".
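The sandbox-side matching just described can be sketched as follows; the registration mechanism, event schema, and deposit logic are illustrative assumptions:

```python
from typing import Callable, List, Tuple

# Rules registered in the sandbox as (name, matcher, logic) triples.
RULES: List[Tuple[str, Callable[[dict], bool], Callable[[dict], list]]] = []

def register_rule(name, matcher, logic):
    RULES.append((name, matcher, logic))

def handle_event(event: dict) -> dict:
    """Match the incoming user behavior event message to a target rule and
    return the actual trigger result generated by that rule's logic."""
    for name, matcher, logic in RULES:
        if matcher(event):
            return {"target_rule": name, "actual_trigger_result": logic(event)}
    return {"target_rule": None, "actual_trigger_result": []}

def deposit_logic(event: dict) -> list:
    """Superimpose the deposits and award the tiers reached (see earlier sketch)."""
    net = sum(event["amounts"])
    rewards = []
    if net >= 5000:
        rewards.append("reward 1")
    if net >= 30000:
        rewards.append("reward 2")
    return rewards

register_rule("deposit", lambda e: e.get("action") == "deposit", deposit_logic)

# Test case 2: user B deposits 5000 HKD, then another 25000 HKD.
result = handle_event({"action": "deposit", "amounts": [5000.0, 25000.0]})
assert result == {"target_rule": "deposit",
                  "actual_trigger_result": ["reward 1", "reward 2"]}
```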
And S240, comparing the expected trigger result with the actual trigger result, and generating a test result according to the obtained comparison result.
The expected trigger result in the test case is compared with the actual trigger result triggered in the sandbox that simulates the actual environment; whether the two are consistent is checked, and the corresponding test result is generated from the comparison.
In the embodiment, the consistency of the actual trigger result in the sandbox and the expected trigger result in the test case is automatically compared, so that the operation and maintenance manpower can be reduced, the effectiveness and the accuracy of automatic verification are improved, and the test cost is reduced.
In another exemplary embodiment of the present application, referring to fig. 3, step S220 in the embodiment shown in fig. 2 can be further implemented as steps S221 to S222 shown as follows:
step S221, constructing user attribute data and user behavior data according to the test case;
step S222, simulating a user behavior event message based on the user attribute data and the user behavior data, and sending the simulated user behavior event message as user data to the sandbox.
Constructing the user data according to the test case includes constructing user attribute data and user behavior data from it. Specifically, the user attribute data and user behavior data corresponding to the event message in the test case are extracted. Taking a securities company's online reward rules as an example, the user attribute data comprises attribute-related information such as whether the user is a mainland or Hong Kong user, the user's company, and the account ID, while the user behavior data comprises behavior-related information such as registration, account opening, deposit, and rollover. The user behavior event message is then simulated from the user attribute data and user behavior data extracted from the test case, for example: mainland user B (account ID: 123456) deposits 40000 HKD. The simulated user behavior event message is sent to the sandbox that simulates the actual trigger environment for the plurality of rules on the system.
In this embodiment, constructing the user data from the automatically generated test case and simulating the user event message ensures that the user data triggering the target rule in the sandbox environment is consistent with the user data of the automatically generated test case, which in turn ensures the accuracy of the test result.
Based on the above embodiments, in another exemplary embodiment of the present application, referring to fig. 4, step S221 may further implement steps S2211 to S2212 as shown below:
step S2211, calling a user data interface contained in a preset database with the test case as the input parameter;
and step S2212, receiving the user attribute data and the user behavior data returned by the user data interface aiming at the test case.
An input parameter is a value that the called function needs: when a function is called, parameters are usually passed in; the code inside the function stays unchanged while different data are processed for different parameters. Common ways of passing parameters include positional parameters, keyword parameters, default-value parameters, and variadic parameters. In this embodiment, the test case is passed as the input parameter to call the user data interface in the preset database, and the user attribute data and user behavior data corresponding to the test case are then constructed in the preset database through that interface.
In this embodiment, constructing the user attribute data and user behavior data from the preset database with the test case as the input parameter ensures the correlation between the test case and the constructed user data, which in turn ensures the accuracy and validity of the test result.
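As a sketch, calling the user data interface with the test case as the input parameter might look like this, assuming a JSON-over-HTTP service; the URL and payload shape are illustrative assumptions:

```python
import json
import urllib.request
from typing import Tuple

def fetch_user_data(test_case: dict, interface_url: str) -> Tuple[dict, dict]:
    """Call the user data interface of the preset database with the test case
    as the input parameter, and receive the user attribute data and user
    behavior data it returns for that test case."""
    req = urllib.request.Request(
        interface_url,
        data=json.dumps(test_case).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return payload["user_attribute_data"], payload["user_behavior_data"]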
Further, based on the above embodiment, in another exemplary embodiment provided by the rule automation testing method of the present application, the method further includes:
simulating the user identity according to the user attribute data, and simulating the user behavior triggered under the user identity according to the user behavior data; and generating a user behavior event message based on the user behavior triggered under the user identity obtained by simulation.
The user behavior event message is simulated from the user attribute data and user behavior data further extracted from the user data of the test case. Specifically, the identity information corresponding to the user is simulated from the user's attribute data, and under that identity the user's behavior is simulated from the user behavior data, so that the user behavior event message is generated from the behavior triggered under the simulated identity. For example, the user identity is simulated from attribute data such as: user A, IP address: Guangdong, origin: mainland user, company: xxxxx, age: 35. The user behavior data is likewise obtained from the relational database, and simulated user behavior data is constructed through the corresponding communication protocols, for example: deposit 30000, transfer out 5000, transfer in 6000. The behavior event message simulating the user can then be constructed from this user data as: user A, IP address: Guangdong, origin: mainland user, company: xxxxx, age 35, deposit 30000, transfer out 5000, transfer in 6000. The target rule is matched from all the rules of the system's actual environment synchronized into the sandbox, and the corresponding actual trigger result is generated under that target rule.
In this embodiment, simulating the user event behavior message from the user data constructed in the test case and sending it to the sandbox environment ensures that the user data used to trigger the rules in the sandbox is consistent with the user data constructed from the test case, which in turn ensures the validity of the test result.
Based on the above embodiment, in another exemplary embodiment of the present application, please refer to fig. 5, step S240 may further implement steps S241 to S242 as follows:
step S241, determining an expected trigger result corresponding to the target rule from a plurality of test cases;
step S242, comparing the expected trigger result corresponding to the target rule with the actual trigger result to generate a test result according to the obtained comparison result.
Specifically, the target test case that served as the input parameter for constructing the user data from the preset database is determined by tracing the user data sent into the sandbox; the expected trigger result contained in that target test case is determined; and the expected trigger result is compared with the actual trigger result to generate the corresponding test result from the comparison.
Still taking the aforementioned deposit rule as an example: if the target test case traced from the user data is test case 3 (user C deposits 30000 HKD, then deposits another 30000 HKD, and obtains rewards 1 and 2), the expected trigger result is "obtain reward 1 and reward 2". The actual trigger result generated when the user data, constructed in the preset database with the target test case as the input parameter, triggers the rule in the sandbox environment is likewise "obtain reward 1 and reward 2". Comparing the two shows that the expected trigger result is consistent with the actual trigger result, so the test result of test case 3 under the deposit rule is a pass.
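A minimal sketch of this tracing-and-comparison step, assuming each test case records the user data it produced (the linking key and field names are illustrative):

```python
def trace_and_compare(user_data: dict, test_cases: list, actual_result: list) -> dict:
    """Trace the user data back to the target test case it was built from,
    determine that case's expected trigger result, and compare the two."""
    target = next(tc for tc in test_cases if tc["user_data"] == user_data)
    expected = target["expected_trigger_result"]
    return {
        "expected": expected,
        "actual": actual_result,
        "passed": sorted(expected) == sorted(actual_result),
    }

# Test case 3 of the deposit rule: expected and actual both contain rewards 1 and 2.
cases = [{
    "user_data": {"user": "C", "amounts": [30000.0, 30000.0]},
    "expected_trigger_result": ["reward 1", "reward 2"],
}]
outcome = trace_and_compare(cases[0]["user_data"], cases, ["reward 2", "reward 1"])
assert outcome["passed"]  # the test result for test case 3 is a pass
```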
In this embodiment, the actual trigger result triggered by the user data is obtained by sending the user data constructed from the test case into the sandbox that simulates the system's actual trigger environment for the plurality of rules, which ensures the accuracy of the test result.
Further, based on the above embodiment, in another embodiment provided by the method for automatically testing rules of the present application, the method further includes:
step S250, calculating an approximate value between the expected trigger result and the actual trigger result;
and step S260, if the approximate value is smaller than the preset threshold value, sending error reporting information containing the optimization scheme to a background.
Specifically, the test process for a rule may further include: tracing the user data sent to the sandbox environment back to the target test case from which it was generated, obtaining the corresponding expected trigger result from that test case, and calculating an approximation value between the expected trigger result and the actual trigger result produced by the user data in the sandbox environment. In this embodiment, the expected trigger result and the actual trigger result are quantized into expected trigger data and actual trigger data; a preset data correspondence may be used to map both onto a consistent type. The approximation Sim(Dat_pre, Dat_act) between the expected trigger data Dat_pre and the actual trigger data Dat_act is then calculated as:

Sim(Dat_pre, Dat_act) = α · e^(-|Dat_pre - Dat_act|) · (Dat_pre + Dat_act)

where α represents a preset approximation factor. In this embodiment, when the calculated approximation is smaller than the preset approximation threshold, the test fails, indicating a large difference between the expected trigger result and the actual trigger result; a corresponding optimization scheme is therefore generated based on the comparison of the two, and error information containing the optimization scheme is sent to the background. That is, when a rule's test result is a failure, an optimization scheme is generated from the comparison of the actual and expected trigger results, so that the system can be corrected automatically, or by the relevant personnel, according to that scheme.
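The formula and the threshold check can be sketched directly; the values of the approximation factor α and the threshold below are assumptions for illustration:

```python
import math

ALPHA = 1.0      # preset approximation factor α (value is an assumption)
THRESHOLD = 0.5  # preset approximation threshold (value is an assumption)

def approximation(dat_pre: float, dat_act: float) -> float:
    """Sim(Dat_pre, Dat_act) = α · e^(-|Dat_pre - Dat_act|) · (Dat_pre + Dat_act)."""
    return ALPHA * math.exp(-abs(dat_pre - dat_act)) * (dat_pre + dat_act)

def check_trigger_results(dat_pre: float, dat_act: float) -> bool:
    """Return True if the quantized results are close enough; otherwise report
    the failure to the background (here simply printed)."""
    sim = approximation(dat_pre, dat_act)
    if sim < THRESHOLD:
        print(f"error: Sim={sim:.4f} < {THRESHOLD}; expected={dat_pre}, "
              f"actual={dat_act}; attach optimization scheme and notify background")
        return False
    return True
```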
Further, based on the above embodiment, in another embodiment provided by the method for automatically testing rules of the present application, the method further includes:
if the test result indicates that the test is passed, cleaning user data;
and if the test result indicates that the test is not passed, sending the user data to the sandbox again so as to update the test result based on the data returned by the sandbox.
When the test result indicates that the current test passes, the user data constructed from the test case is cleaned up. If the test result indicates that the current test fails, the error field data is checked to confirm whether the failure was caused by a code logic error or an operational configuration error, and the user data is sent again to the sandbox that simulates the system's actual environment, so that the test result can be updated based on the data returned by the sandbox. Cleaning the test case data and test data that have passed reduces the server's running load, while re-triggering the failed data updates the test result and helps correct the test system.
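A minimal sketch of this clean-or-retry step; the sandbox client's method names and signatures are assumptions, not the patent's API:

```python
class SandboxClient:
    """Illustrative stand-in for the sandbox interface."""
    def clean(self, user_data: dict) -> None:
        ...  # wipe the constructed user data from the sandbox

    def send(self, user_data: dict) -> list:
        ...  # re-trigger the rules and return a fresh actual trigger result

def finalize(test_result: dict, user_data: dict, sandbox: SandboxClient) -> dict:
    """Clean up on a pass; on a failure, re-send the user data and update the
    test result from the data the sandbox returns."""
    if test_result["passed"]:
        sandbox.clean(user_data)  # reduce the server's running load
        return test_result
    new_actual = sandbox.send(user_data) or []
    test_result["actual"] = new_actual
    test_result["passed"] = sorted(new_actual) == sorted(test_result["expected"])
    return test_result
```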
Fig. 6 shows a flowchart of an automatic rule testing method according to an embodiment of the present application, which is as follows:
s1, a plurality of rules are arranged on an automatic background synchronization system;
s2, the automatic background sends the updated rules to the sandbox environment;
s3, automatically generating a plurality of test cases corresponding to each rule by the automatic background, wherein the test cases comprise expected trigger results corresponding to each rule;
s4, constructing user data according to the test case, and sending the user data to a sandbox;
s5, the sandbox triggers the matched target rule according to the user data and generates an actual trigger result under the target rule;
s6, the sandbox returns the generated actual trigger result to the automation background;
and S7, comparing the expected trigger result in the test case with the actual trigger result sent by the sandbox by the automatic background, and generating a test result according to the obtained comparison result.
The background automatically synchronizes the plurality of rules in the system environment and updates the synchronized rules into the sandbox environment; in this embodiment, the sandbox simulates the system's actual trigger environment for those rules. The background automatically generates the plurality of test cases corresponding to the rules on the system, where the test cases include the rules' expected trigger results. Using each test case as the input parameter, the background constructs the corresponding user data from the preset database and sends it to the sandbox, where the target rule is matched and the corresponding actual trigger result is generated under that rule. The sandbox returns the actual trigger result to the background, which compares it with the expected trigger result in the test case and generates the test result from the comparison. Testing the rules on the system through an automated background reduces the dependence on manual testing and improves both test efficiency and the accuracy of the test results.
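Putting steps S1-S7 together, an orchestration sketch might look like the following; all client interfaces and helper names are illustrative assumptions:

```python
def generate_test_cases(rule: dict) -> list:
    """Placeholder for automatic case generation along the single-rule
    dimension (compare the deposit ladder sketch above)."""
    return rule.get("cases", [])

def build_user_data(case: dict, database) -> dict:
    """Placeholder for constructing user data through the preset database's
    user data interface, with the case as the input parameter."""
    return database.fetch(case)

def run_rule_tests(rules: list, sandbox, database) -> list:
    """End-to-end sketch of steps S1-S7."""
    results = []
    sandbox.sync_rules(rules)                            # S1/S2: synchronize rules into the sandbox
    for rule in rules:
        for case in generate_test_cases(rule):           # S3: cases with expected results
            user_data = build_user_data(case, database)  # S4: construct user data
            actual = sandbox.trigger(user_data)          # S5/S6: match target rule, return result
            results.append({                             # S7: compare and record
                "rule": rule["name"],
                "passed": sorted(actual) == sorted(case["expected_trigger_result"]),
            })
    return results
```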
It should be noted that although the various steps of the methods in this application are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the shown steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Fig. 7 is a block diagram of an automated rule testing apparatus 300 shown in an exemplary embodiment of the present application. The apparatus may be applied to the implementation environment shown in fig. 1 and is specifically configured in the terminal device 110 or the server 130. The apparatus may also be applied to other exemplary implementation environments and be configured in other devices; this embodiment does not limit the implementation environments to which the apparatus applies.
As shown in fig. 7, the exemplary rule automated testing apparatus 300 includes:
a first generating module 310, configured to synchronize a plurality of rules on a system and automatically generate a plurality of test cases corresponding to each rule, where a test case includes an expected trigger result corresponding to each rule; the user data construction module 320 is configured to construct user data according to the test cases and send the user data to a sandbox, and the sandbox is used for simulating an actual trigger environment for a plurality of rules on the system; the receiving module 330 is configured to receive an actual trigger result returned by the sandbox under a target rule matched with the user data; the second generating module 340 is configured to compare the expected trigger result with the actual trigger result, and generate a test result according to the obtained comparison result.
In this exemplary automated rule testing apparatus, a plurality of test cases corresponding to the plurality of rules on the system, together with the expected trigger results they contain, are generated automatically; user data is constructed from the test cases and used to trigger the target rules in the sandbox that simulates the system's actual trigger environment for the plurality of rules, generating the corresponding actual trigger results; and the test results are obtained by comparing the actual trigger results with the expected trigger results. In this way, the dependence of rule testing on manual work is avoided, testing cost is reduced, and the accuracy and validity of rule testing are ensured.
In an embodiment of the present application, the user data constructing module is specifically configured to:
constructing user attribute data and user behavior data according to the test cases;
and simulating the user behavior event message based on the user attribute data and the user behavior data, and sending the simulated user behavior event message serving as the user data to the sandbox.
In an embodiment of the application, the user data constructing module is further specifically configured to:
calling a user data interface contained in a preset database with the test case as an input parameter;
and receiving user attribute data and user behavior data returned by the user data interface aiming at the test case.
In an embodiment of the application, the user data constructing module is further specifically configured to:
simulating the user identity according to the user attribute data, and simulating the user behavior triggered under the user identity according to the user behavior data;
and generating a user behavior event message based on the user behavior triggered under the user identity obtained by simulation.
In an embodiment of the application, the second generating module is specifically configured to:
determining an expected trigger result corresponding to the target rule from the plurality of test cases;
and comparing the expected trigger result corresponding to the target rule with the actual trigger result to generate a test result according to the obtained comparison result.
In one embodiment of the present application, the apparatus further comprises:
the calculation module is used for calculating an approximate value between an expected trigger result and an actual trigger result;
and the error reporting module is used for sending error reporting information containing the optimization scheme to the background if the approximate value is smaller than the preset threshold value.
In one embodiment of the present application, the apparatus further comprises:
the data cleaning module is used for cleaning the user data if the test result indicates that the test is passed;
and the data updating module is used for sending the user data to the sandbox again if the test result indicates that the test is not passed, so as to update the test result based on the data returned by the sandbox.
It should be noted that the rule automatic testing apparatus provided in the foregoing embodiment and the rule automatic testing method provided in the foregoing embodiment belong to the same concept, and specific ways of executing operations by each module and unit have been described in detail in the method embodiment, and are not described herein again. In practical applications, the rule automation testing apparatus provided in the above embodiment may distribute the functions through different function modules as needed, that is, divide the internal structure of the apparatus into different function modules to complete all or part of the functions described above, which is not limited herein.
An embodiment of the present application further provides an electronic device, including: one or more processors; the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the electronic equipment is enabled to realize the rule automatic testing method provided in the above embodiments.
FIG. 8 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application. It should be noted that the computer system 1200 of the electronic device shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 8, the computer system 1200 includes a Central Processing Unit (CPU)1201, which can perform various appropriate actions and processes, such as executing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1202 or a program loaded from a storage section 1208 into a Random Access Memory (RAM) 1203. In the RAM 1203, various programs and data necessary for system operation are also stored. The CPU 1201, ROM 1202, and RAM 1203 are connected to each other by a bus 1204. An Input/Output (I/O) interface 1205 is also connected to bus 1204.
The following components are connected to the I/O interface 1205: an input section 1206 including a keyboard, a mouse, and the like; an output section 1207 including a Display device such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 1208 including a hard disk and the like; and a communication section 1209 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 1209 performs communication processing via a network such as the internet. A driver 1210 is also connected to the I/O interface 1205 as needed. A removable medium 1211, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is mounted on the drive 1210 as necessary, so that a computer program read out therefrom is mounted into the storage section 1208 as necessary.
In particular, according to embodiments of the application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 1209, and/or installed from the removable medium 1211. The computer program executes various functions defined in the system of the present application when executed by a Central Processing Unit (CPU) 1201.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer readable storage medium may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer-readable signal medium may comprise a propagated data signal with a computer-readable computer program embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of these units do not in any way limit the units themselves.
Yet another aspect of the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the automated rule testing method described above. The computer readable storage medium may be included in the electronic device described in the above embodiments, or may exist separately without being incorporated in the electronic device.
Another aspect of the present application provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of a computer device reads the computer instructions from the computer readable storage medium and executes them, causing the computer device to perform the automated rule testing method provided in the above embodiments.
The above description is merely a preferred exemplary embodiment of the present application and is not intended to limit the embodiments of the present application. One of ordinary skill in the art can readily make various changes and modifications within the main concept and spirit of the present application, so the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An automated rule testing method, comprising:
synchronizing a plurality of rules on a system and automatically generating a plurality of test cases corresponding to the rules, wherein the test cases comprise expected trigger results corresponding to the rules;
constructing user data according to the test case, and sending the user data to a sandbox, wherein the sandbox is used for simulating an actual trigger environment for the rules on the system;
receiving an actual trigger result returned by the sandbox under a target rule matched with the user data;
and comparing the expected trigger result with the actual trigger result, and generating a test result according to the obtained comparison result.
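As an illustration of the claim 1 flow, the following minimal Python sketch runs the four steps end to end. The rule structure, the FakeSandbox class, and all helper names are hypothetical stand-ins invented for this example, not the implementation disclosed above.

def make_test_case(rule):
    # Auto-generate one test case per synchronized rule; each case carries
    # the expected trigger result for its rule.
    return {"rule_id": rule["id"], "user": rule["sample_user"],
            "expected": rule["expected_result"]}

class FakeSandbox:
    # Stand-in for the sandbox that simulates an actual trigger environment
    # for the rules on the system.
    def __init__(self, rules):
        self.rules = {r["id"]: r for r in rules}

    def trigger(self, user_data):
        # Return the actual trigger result under the rule matched by the user data.
        rule = self.rules[user_data["rule_id"]]
        return "triggered" if rule["condition"](user_data["user"]) else "not triggered"

def run_rule_tests(rules):
    sandbox = FakeSandbox(rules)
    results = []
    for case in (make_test_case(r) for r in rules):
        # Construct user data from the test case and send it to the sandbox.
        user_data = {"rule_id": case["rule_id"], "user": case["user"]}
        actual = sandbox.trigger(user_data)
        # Compare the expected trigger result with the actual one.
        results.append({"rule": case["rule_id"],
                        "passed": actual == case["expected"]})
    return results

rules = [{"id": "r1", "sample_user": {"age": 20},
          "condition": lambda u: u["age"] >= 18,
          "expected_result": "triggered"}]
print(run_rule_tests(rules))  # [{'rule': 'r1', 'passed': True}]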
2. The automated rule testing method according to claim 1, wherein constructing user data according to the test cases and sending the user data to the sandbox comprises:
constructing user attribute data and user behavior data according to the test case;
simulating a user behavior event message based on the user attribute data and the user behavior data, and sending the simulated user behavior event message as the user data to the sandbox.
3. The automated rule testing method according to claim 2, wherein constructing user attribute data and user behavior data according to the test cases comprises:
invoking a user data interface contained in a preset database, with the test case as an incoming parameter;
and receiving the user attribute data and the user behavior data returned by the user data interface for the test case.
4. The automated rule testing method according to claim 2, wherein simulating a user behavior event message based on the user attribute data and the user behavior data comprises:
simulating a user identity according to the user attribute data, and simulating a user behavior triggered under the user identity according to the user behavior data;
and generating the user behavior event message based on the user behavior triggered under the user identity obtained by simulation.
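By way of illustration only, the sketch below shows one way claims 2 through 4 could fit together: attribute and behavior data are obtained for a test case (here from an in-memory stand-in for the preset database's user data interface), and a user behavior event message is then simulated from them. Every field name here is an assumption made for the example.

import json
import time
import uuid

def user_data_interface(test_case):
    # Stand-in for the user data interface of the preset database (claim 3):
    # the test case is passed in, and attribute/behavior data are returned.
    user_attributes = {"user_id": str(uuid.uuid4()),
                       "region": test_case.get("region", "CN")}
    user_behavior = {"action": test_case.get("action", "login")}
    return user_attributes, user_behavior

def simulate_event_message(user_attributes, user_behavior):
    # Claim 4: simulate a user identity from the attribute data, simulate the
    # behavior triggered under that identity, and wrap both in an event message.
    return json.dumps({"identity": user_attributes,
                       "event": {"type": user_behavior["action"],
                                 "timestamp": int(time.time())}})

attrs, behavior = user_data_interface({"region": "HK", "action": "trade"})
message = simulate_event_message(attrs, behavior)
print(message)  # this message would be sent to the sandbox as the user data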
5. The automated rule testing method according to claim 1, wherein comparing the expected trigger result with the actual trigger result and generating a test result according to the obtained comparison result comprises:
determining an expected trigger result corresponding to the target rule from the plurality of test cases;
and comparing the expected trigger result corresponding to the target rule with the actual trigger result to generate the test result according to the obtained comparison result.
6. The automated rule testing method according to claim 1, wherein after comparing the expected trigger result with the actual trigger result and generating a test result according to the obtained comparison result, the method further comprises:
calculating an approximation between the expected trigger result and the actual trigger result;
and if the approximation is smaller than a preset threshold, sending error reporting information containing an optimization scheme to a backend.
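Claim 6 does not fix a particular approximation formula, so the sketch below uses difflib's sequence-matching ratio purely as an assumed stand-in for the similarity measure; the threshold and the fields of the error report are likewise invented for illustration.

import difflib

def check_approximation(expected, actual, threshold=0.9):
    # Compute an approximation between the expected and actual trigger results.
    approximation = difflib.SequenceMatcher(None, str(expected), str(actual)).ratio()
    if approximation < threshold:
        # Below threshold: error reporting information with an optimization
        # hint would be sent to the backend (here simply returned).
        return {"error": True, "approximation": approximation,
                "suggestion": "re-check the rule conditions against the case"}
    return {"error": False, "approximation": approximation}

# "triggered" vs "not triggered" gives a ratio of about 0.82 < 0.9,
# so this reports an error.
print(check_approximation("triggered", "not triggered"))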
7. The automated rule testing method according to claim 1, wherein after comparing the expected trigger result with the actual trigger result and generating a test result according to the obtained comparison result, the method further comprises:
if the test result indicates that the test is passed, cleaning the user data;
and if the test result indicates that the test is not passed, resending the user data to the sandbox so as to update the test result based on the data returned by the sandbox.
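The post-comparison branch of claim 7 might look like the following sketch. The in-memory store and the retry cap are assumptions added to make the example runnable, not features recited in the claim.

def handle_test_result(passed, case_id, user_data, store, resend, max_retries=3):
    if passed:
        store.pop(case_id, None)          # test passed: clean the user data
        return "passed"
    for _ in range(max_retries):
        # Test failed: resend the user data to the sandbox and update the
        # test result based on the data the sandbox returns.
        if resend(user_data) == user_data["expected"]:
            store.pop(case_id, None)
            return "passed on retry"
    return "failed"

store = {"case1": {"expected": "triggered"}}
result = handle_test_result(False, "case1", store["case1"], store,
                            resend=lambda d: "triggered")
print(result)  # passed on retry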
8. An automated rule testing device, comprising:
the first generation module is used for synchronizing a plurality of rules on a system and automatically generating a plurality of test cases corresponding to the rules, wherein the test cases comprise expected trigger results corresponding to the rules;
the user data construction module is used for constructing user data according to the test cases and sending the user data to a sandbox, wherein the sandbox is used for simulating an actual trigger environment for the rules on the system;
the receiving module is used for receiving an actual trigger result returned by the sandbox under a target rule matched with the user data;
and the second generation module is used for comparing the expected trigger result with the actual trigger result and generating a test result according to the obtained comparison result.
9. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs that, when executed by the one or more processors, cause the electronic device to implement the automated rule testing method of any one of claims 1 to 7.
10. A computer readable storage medium having stored thereon computer readable instructions which, when executed by a processor of a computer, cause the computer to perform the automated rule testing method of any one of claims 1 to 7.
CN202210629406.0A 2022-06-02 2022-06-02 Rule automatic testing method and device, electronic equipment and storage medium Pending CN114860615A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210629406.0A CN114860615A (en) 2022-06-02 2022-06-02 Rule automatic testing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114860615A (en) 2022-08-05

Family

ID=82624063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210629406.0A Pending CN114860615A (en) 2022-06-02 2022-06-02 Rule automatic testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114860615A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117873906A (en) * 2024-03-11 2024-04-12 云账户技术(天津)有限公司 Method and device for testing prize amount distribution in transaction system
CN117873906B (en) * 2024-03-11 2024-05-24 云账户技术(天津)有限公司 Method and device for testing prize amount distribution in transaction system

Similar Documents

Publication Publication Date Title
US20240080196A1 (en) Computer-implemented systems and methods for combining blockchain technology with digital twins
WO2020018921A1 (en) Blockchain transaction safety using smart contracts
CN116882994A (en) Method and system for providing authenticated, auditable and immutable input for intelligent contracts
CN111309594B (en) System testing method, device, equipment and storage medium
CN112733206A (en) Resource allocation method, device, server and medium
WO2023165271A1 (en) Knowledge graph construction and graph calculation
CN109582569A (en) Lending platforms test method, device, terminal device and readable storage medium storing program for executing
CN114860615A (en) Rule automatic testing method and device, electronic equipment and storage medium
CN111581077A (en) Intelligent contract testing method and device
CN113112358A (en) Block chain-based decision method and device and electronic equipment
CN111310242B (en) Method and device for generating device fingerprint, storage medium and electronic device
CN107277108B (en) Method, device and system for processing messages at nodes of block chain
CN117010892A (en) Payment risk detection method, device, electronic equipment and readable medium
CN112734373A (en) Information processing method, information processing apparatus, electronic device, and medium
CN113592645A (en) Data verification method and device
CN112926981A (en) Transaction information processing method, device and medium for block chain and electronic equipment
WO2020155167A1 (en) Application of cross-organizational transactions to blockchain
CN112598420B (en) Online regression verification method and device
US11687441B2 (en) Intelligent dynamic web service testing apparatus in a continuous integration and delivery environment
CN116506333B (en) Transaction system production inversion detection method and equipment
US20240161109A1 (en) Distributed evaluation platform for nonfungible tokens using virtual token cloning
CN116955381A (en) Transaction rollback method and system under distributed scene and electronic equipment
CN118113619A (en) Test method, apparatus, device, medium and program product
CN116932374A (en) Timing case convergence method, device, equipment and storage medium of distributed system
CN115470142A (en) Method and device for testing friend query plug-in, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination