CN114924982A - Data comparison test method based on artificial intelligence and related equipment - Google Patents

Data comparison test method based on artificial intelligence and related equipment

Info

Publication number
CN114924982A
CN114924982A
Authority
CN
China
Prior art keywords
test
data
service
service node
image
Prior art date
Legal status
Pending
Application number
CN202210598502.3A
Other languages
Chinese (zh)
Inventor
王成文 (Wang Chengwen)
Current Assignee
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202210598502.3A
Publication of CN114924982A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application provides a data comparison test method and apparatus based on artificial intelligence, an electronic device and a storage medium, wherein the data comparison test method based on artificial intelligence comprises the following steps: sending a test request to a test terminal according to the server of the initial application to obtain a test sequence of each service node; calling service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image; updating the initial application to obtain an updated application, and calling test data corresponding to each service node of the updated application to construct a second test image; and comparing the first test image with the second test image to obtain a comparison result. According to the method and the device, the obtained test data of different types are converted into the image domain for batch comparison testing according to the test sequence of each service node, which can effectively improve the comparison test efficiency for large batches of data.

Description

Data comparison test method based on artificial intelligence and related equipment
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a data comparison testing method and apparatus based on artificial intelligence, an electronic device, and a storage medium.
Background
A client application needs to be updated many times during its use, so the functions of the application need to be tested before each release so that faults can be found and eliminated.
In the prior art, test scripts are mainly constructed manually to carry out the test work. When large batches of data need to be compared, a large amount of manpower must be invested to check them one by one. Meanwhile, the test scripts must be continuously maintained as the application is updated; the scripts are complex to write, test scenarios are difficult to construct, the maintenance cost is high, the test period is long, and the test efficiency is low.
Disclosure of Invention
In view of the foregoing, there is a need for a data comparison test method based on artificial intelligence and related devices to solve the technical problem of how to improve the comparison test efficiency for large batches of data, where the related devices include a data comparison test apparatus based on artificial intelligence, an electronic device and a storage medium.
The application provides a data comparison test method based on artificial intelligence, which comprises the following steps:
sending a test request to a test terminal according to the server of the initial application to obtain a test sequence of each service node;
calling service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image;
updating the initial application to obtain an updated application, and calling test data corresponding to each service node of the updated application to construct a second test image;
and comparing the first test image with the second test image to obtain a comparison result.
Therefore, the method and the device can convert the obtained different types of test data into the image domain for batch comparison test according to the test sequence of each service node, and can effectively improve the comparison test efficiency of the batch data.
In some embodiments, the sending the test request to the test terminal according to the server of the initial application to obtain the test sequence of each service node includes:
constructing a test environment according to the test environment configuration information;
determining the calling relation of each service node based on the test environment and the test scene configuration information;
and acquiring the test sequence of each service node based on the calling relation.
Therefore, corresponding configuration can be completed on different test environments and test scenes according to the test request, the efficiency of establishing the test environments is improved, the corresponding call relation of each service node is determined, and the subsequent process can call corresponding test data quickly according to the call relation.
In some embodiments, the constructing a test environment according to the test environment configuration information includes:
acquiring an environment configuration file according to a preset mode;
analyzing the environment configuration file to obtain environment configuration data;
matching the environment configuration data with the test environment configuration information to obtain environment configuration parameters;
and constructing the test environment based on the environment configuration parameters.
Therefore, the test environment is established through the configuration file, time-consuming operations such as copying of files and the like for establishing the test environment in a conventional mode can be avoided, and the efficiency of establishing the test environment is effectively improved.
In some embodiments, the invoking of the service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image includes:
establishing a test database to store test data;
calling the test data in the test database based on the test sequence to acquire service test data;
converting the service test data to an image gray space according to a normalization algorithm to obtain a test image;
calculating the service test data volume corresponding to each service node to obtain the service test data proportion;
dividing the test image based on the service test data proportion and each service node to obtain a classified test image;
the locations of the classified test images are set based on the test order to construct a first test image.
Therefore, the acquired service test data are converted into the image domain and are correspondingly divided, the calling relation among all service nodes can be clearly and visually displayed, and the service test data can be conveniently and rapidly compared through an image matching algorithm in the subsequent process.
In some embodiments, said invoking the test data in said test database based on said test sequence to obtain service test data comprises:
judging whether test data corresponding to the service node exists in the test database or not;
if yes, calling test data corresponding to the service node to serve as service test data;
if the test data does not exist, a test data demand form corresponding to the service node is obtained based on the test database, corresponding original data is obtained from production data based on the test data demand form, and desensitization processing is carried out on the original data to obtain service test data.
Therefore, enough service test data can be obtained, each service node can be normally tested, and meanwhile, the reliable protection of sensitive private data can be realized by desensitizing the original data.
In some embodiments, said setting the position of said classified test image based on said test order to construct a first test image comprises:
acquiring a service node calling relation based on the test sequence;
counting the service test data volume of each service node called by other service nodes based on the service node calling relation to obtain a service test data calling proportion;
and determining the position of the classified test image based on the service test data calling proportion and the service node calling relation so as to construct a first test image.
Therefore, the calling proportion of the service test data among the service nodes can be clearly and intuitively represented in the finally obtained first test image, which further shows the specific calling relation among the service nodes and helps the subsequent process judge which service nodes are affected when a difference is found in the comparison of the service test data.
In some embodiments, the comparison result comprises a difference and no difference, and after comparing the first test image and the second test image to obtain the comparison result, the method further comprises:
recording difference content between the initial application and the updated application according to a preset mode to obtain a difference record table;
if the comparison result is different, judging whether the content corresponding to the difference exists in the difference record table;
if the content corresponding to the difference does not exist in the difference record table, indicating that the updated application program has defects and needs to be repaired;
and if the content corresponding to the difference exists in the difference record table, indicating that the program of the updated application has no defect, and ignoring the difference.
Therefore, whether the updated application program has defects can be quickly determined by establishing the difference record table, and the efficiency of checking the program defects by workers is greatly improved.
The embodiment of the present application further provides a data comparison testing device based on artificial intelligence, including:
the acquisition unit is used for sending a test request to a test terminal according to the server of the initial application so as to acquire a test sequence of each service node;
the construction unit is used for calling the service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image;
the updating unit is used for updating the initial application to obtain an updated application and calling the test data corresponding to each service node of the updated application to construct a second test image;
and the comparison unit is used for comparing the first test image with the second test image to obtain a comparison result.
An embodiment of the present application further provides an electronic device, including:
a memory storing at least one instruction;
and the processor executes the instructions stored in the memory to realize the artificial intelligence-based data comparison testing method.
The embodiment of the present application further provides a computer-readable storage medium, where at least one instruction is stored in the computer-readable storage medium, and the at least one instruction is executed by a processor in an electronic device to implement the artificial intelligence based data comparison testing method.
Drawings
FIG. 1 is a flow chart of a preferred embodiment of an artificial intelligence based data comparison test method to which the present application relates.
Fig. 2 is a flowchart of a preferred embodiment according to the present application, in which a server sends a test request to a test terminal to obtain a test sequence of service nodes.
Fig. 3 is a flowchart of a preferred embodiment of invoking service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image according to the present application.
FIG. 4 is a functional block diagram of a preferred embodiment of an artificial intelligence based data comparison testing apparatus according to the present application.
Fig. 5 is a schematic structural diagram of an electronic device according to a preferred embodiment of the artificial intelligence-based data comparison testing method.
Fig. 6 is an exemplary diagram of a first test image to which the present application relates.
Detailed Description
For a clearer understanding of the objects, features and advantages of the present application, reference is made to the following detailed description of the present application along with the accompanying drawings and specific examples. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, and the described embodiments are merely some, but not all embodiments of the present application.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" and "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The embodiment of the present application provides a data comparison testing method based on artificial intelligence, which can be applied to one or more electronic devices, where an electronic device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and the hardware of the electronic device includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The electronic device may be any electronic product capable of performing human-computer interaction with a client, for example, a Personal computer, a tablet computer, a smart phone, a Personal Digital Assistant (PDA), a game machine, an Internet Protocol Television (IPTV), an intelligent wearable device, and the like.
The electronic device may also include a network device and/or a client device. The network device includes, but is not limited to, a single network server, a server group consisting of a plurality of network servers, or a Cloud Computing (Cloud Computing) based Cloud consisting of a large number of hosts or network servers.
The Network where the electronic device is located includes, but is not limited to, the internet, a wide area Network, a metropolitan area Network, a local area Network, a Virtual Private Network (VPN), and the like.
FIG. 1 is a flow chart of the preferred embodiment of the data comparison test method based on artificial intelligence. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
And S10, sending a test request to the test terminal according to the server of the initial application to obtain the test sequence of each service node.
Referring to fig. 2, in an optional embodiment, the sending the test request to the test terminal according to the server of the initial application to obtain the test sequence of each service node includes:
s101, constructing a test environment according to the test environment configuration information.
In this optional embodiment, the test environment configuration information is used to configure the test environment of each service node, and the test scenario configuration information is used to configure the test scenario of each service node, where the test environment refers to the computer hardware, software, network equipment and historical data necessary for completing a test task, and the test scenario may be a financial transaction, such as an inter-bank transfer service, a payment service or a large-amount remittance service.
In this optional embodiment, the test environment configuration information may include a test environment attribute keyword and a key value, where the test environment attribute keyword may be a test environment name keyword, a test environment identification keyword, and the like; the test scenario configuration information may include a test scenario attribute keyword and a key value, where the test scenario attribute keyword may be a test scenario name keyword, a test scenario identification keyword, and the like, and which type of keyword is specifically used to identify a target test environment and a target test scenario may be customized as needed.
In this optional embodiment, different test tasks correspond to different test environment configuration information and test scenario configuration information. Before sending the test request, the server acquires the test environment configuration information and the test scene configuration information, and generates test task information according to the test environment configuration information and the test scene configuration information. The test environment configuration information and the test scene configuration information can be generated by the configuration of an operator in the background.
In this optional embodiment, the XML environment configuration file may be obtained, the XML environment configuration file may be analyzed to obtain the environment configuration parameter matched with the test environment configuration information, and the corresponding test environment may be established according to the environment configuration parameter.
In this optional embodiment, XML (Extensible Markup Language) is a markup language used to mark up a file so that the file has a structure. The XML environment configuration file carries the environment configuration data in XML, may include environment configuration parameters corresponding to a plurality of different test environments, and distinguishes the position of each test environment and its matched environment configuration parameters in the XML environment configuration file through a defined structure and tags.
Because XML tags are not predefined, they can be defined as needed, which makes generating the XML environment configuration file more flexible and convenient. During testing, only the XML environment configuration file needs to be parsed, and the target environment configuration parameters matched with the test environment configuration information are looked up to establish the corresponding test environment. When the XML environment configuration file is parsed, the position of the configuration parameters corresponding to each test environment can be obtained by searching for the test environment tag, and then the test environment attribute keywords and key values consistent with the test environment configuration information are searched within the environment configuration parameters of each test environment, so that the environment configuration parameters corresponding to the target test environment are determined.
In this optional embodiment, the XML environment configuration file may be obtained from the version management tool SVN in real time, so as to ensure that the XML environment configuration file is the latest configuration file, or the XML environment configuration file may be obtained from the terminal, however, the XML environment configuration file obtained from the terminal is often encrypted, and it is necessary to decrypt the XML environment configuration file first to obtain the original data.
In this optional embodiment, the environment configuration parameters have different types, such as a file type and a registry type, and different operations can be performed according to the specific type of each environment parameter to establish different test environments. Establishing the test environment through the XML environment configuration file avoids the time-consuming operations, such as restoring a virtual machine and copying files, that a conventional approach requires, which greatly improves the efficiency of establishing the test environment.
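For illustration only, the following is a minimal sketch of how such an XML environment configuration file could be parsed and matched against the test environment configuration information; the element and attribute names (environment, param, key, value) are assumptions and are not fixed by this disclosure.

```python
# Minimal sketch (assumption): parse an XML environment configuration file and
# return the environment configuration parameters that match the test environment
# configuration information. The schema (<environment>/<param key value>) is
# illustrative only; the disclosure does not fix a concrete XML layout.
import xml.etree.ElementTree as ET

def load_environment_parameters(xml_path: str, env_config: dict) -> dict:
    """Return the parameters of the test environment matching env_config."""
    root = ET.parse(xml_path).getroot()
    for env in root.findall("environment"):            # one tag per test environment
        params = {p.get("key"): p.get("value") for p in env.findall("param")}
        # keep the environment whose attribute keywords and key values all match
        if all(params.get(k) == v for k, v in env_config.items()):
            return params
    raise LookupError("no environment matches the test environment configuration")

# Hypothetical usage:
# params = load_environment_parameters("env.xml", {"env_name": "uat", "env_id": "01"})
```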
S102, determining the calling relation of each service node based on the test environment and the test scene configuration information.
In this optional embodiment, a service node is each sub-process required for completing a complete test task. For example, if the test task is bank account opening, the corresponding sub-processes are submitting the account-opening data, checking the account-opening data, registering the customer data and opening an account for the applicant; the present scheme uses each such sub-process as a service node.
In this optional embodiment, when determining the configuration information of the test environment, the call relationship of each corresponding service node may be obtained according to a preset correspondence between the scene to be tested and the configuration information of the test environment.
S103, acquiring the test sequence of each service node based on the calling relationship.
Illustratively, the calling relation among the service nodes in a bank credit scenario is as follows: credit application, credit acceptance, credit investigation, credit examination and approval, contract signing, credit issuing, credit payment, post-credit management and credit recovery; the test sequence among the service nodes should likewise be credit application, credit acceptance, credit investigation, credit examination and approval, contract signing, credit issuing, credit payment, post-credit management and credit recovery.
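The ordering itself can be derived from the calling relation; the sketch below does this with a topological sort, under the assumption that the calling relation forms a directed acyclic graph. For the linear credit scenario above it simply reproduces the listed sequence; it is illustrative only and not part of the disclosed method.

```python
# Illustrative sketch: derive the test sequence from the service-node calling
# relation with a topological sort (Kahn's algorithm), assuming the relation is a
# directed acyclic graph with edges pointing from a node to the node it precedes.
from collections import deque

def test_order(call_relation: dict) -> list:
    nodes = set(call_relation) | {n for succ in call_relation.values() for n in succ}
    indegree = {n: 0 for n in nodes}
    for succ in call_relation.values():
        for n in succ:
            indegree[n] += 1
    queue = deque(n for n in nodes if indegree[n] == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for n in call_relation.get(node, []):
            indegree[n] -= 1
            if indegree[n] == 0:
                queue.append(n)
    return order

# e.g. test_order({"credit application": ["credit acceptance"],
#                  "credit acceptance": ["credit investigation"], ...})
```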
Therefore, corresponding configuration can be completed on different test environments and test scenes according to the test request, the efficiency of establishing the test environments is improved, the corresponding call relation of each service node is determined, and the subsequent process can call corresponding test data quickly according to the call relation.
And S11, calling the service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image.
Referring to fig. 3, in an optional embodiment, invoking service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image includes:
and S111, establishing a test database to store the test data.
In this optional embodiment, a test database may be pre-established for storing the test data of each service node, and when the test database stores the test data of each service node, a uniform function tag may be set for the service node and the test data corresponding to the service node, so that the test data corresponding to the service node may be determined according to the function tag carried by each service node.
And S112, calling the test data in the test database based on the test sequence to acquire service test data.
In this optional embodiment, test data corresponding to each service node may be read from the test database according to the functional tag of that service node. If test data corresponding to the service node exist in the test database, the test data corresponding to the service node are called as the service test data. If no test data corresponding to the service node exist in the test database, a test data requirement form corresponding to the service node is obtained based on the test database, corresponding original data are obtained from production data based on the test data requirement form, and desensitization processing is performed on the original data to obtain the service test data. Data desensitization here refers to deforming certain sensitive information through desensitization rules, so as to reliably protect sensitive private data. This allows the desensitized real data set to be used securely in development, testing and other non-production environments as well as in outsourcing environments.
For example, the test database stores test data requirement lists corresponding to different service nodes and established test data, and after receiving a test request, the test database can search for the test data corresponding to each service node according to the function tag. And when the test data corresponding to the service node does not exist, searching the actual production data corresponding to the service node in the existing production data according to the test data requirement form in the test database and the corresponding relation between the service node and the production data. Then, desensitizing the obtained real production data according to rules in a data desensitization rule base pre-stored in a test database so as to be used for testing later.
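A hedged sketch of this lookup-and-desensitize flow is shown below; the database structures, the requirement-form matching and the masking rule are simplified assumptions made for illustration.

```python
# Hedged sketch: fetch prepared test data by functional tag, otherwise pull
# matching production records and mask them. The structures and the masking
# rule are assumptions and not taken from the disclosure.
import re

def mask_sensitive(record: dict) -> dict:
    """Toy desensitization rule: keep only the last 4 digits of long numbers."""
    masked = {}
    for key, value in record.items():
        if isinstance(value, str) and re.fullmatch(r"\d{8,}", value):
            masked[key] = "*" * (len(value) - 4) + value[-4:]
        else:
            masked[key] = value
    return masked

def get_service_test_data(node_tag: str, test_db: dict, production_data: dict) -> list:
    prepared = test_db.get("data", {})
    if node_tag in prepared:                          # test data already exist
        return prepared[node_tag]
    form = test_db.get("requirement_forms", {}).get(node_tag, {})
    raw = [r for r in production_data.get(node_tag, [])
           if all(r.get(k) == v for k, v in form.items())]
    return [mask_sensitive(r) for r in raw]           # desensitize before use
```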
And S113, converting the service test data into an image gray space according to a normalization algorithm to obtain a test image.
In an optional embodiment, the obtained test data of each service node are converted into the image gray space through a normalization algorithm, the image gray space being [0, 255], and the converted data are placed in the same test image, wherein the service test data of each service node correspond to different label categories in the test image.
In this optional embodiment, a linear (min-max) normalization method may be used to normalize the service test data, where the normalization formula is:

Xk_norm = (Xk - X_min) / (X_max - X_min)

wherein Xk_norm represents the normalized data value of the k-th piece of service test data, Xk represents the data value of the k-th piece of service test data, and X_max and X_min respectively represent the maximum value and the minimum value among all the service test data.
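A minimal sketch of this normalization mapped onto the [0, 255] gray range is given below; the scaling to 8-bit gray levels is an assumption implied by the gray space described above.

```python
# Minimal sketch of the linear (min-max) normalization mapped onto the [0, 255]
# image gray space; the 8-bit scaling is an assumption, not a quoted formula.
import numpy as np

def to_gray_values(values: np.ndarray) -> np.ndarray:
    """Normalize raw service test data values to 8-bit gray levels."""
    x_min, x_max = values.min(), values.max()
    if x_max == x_min:                    # constant data: avoid division by zero
        return np.zeros_like(values, dtype=np.uint8)
    norm = (values - x_min) / (x_max - x_min)       # Xk_norm in [0, 1]
    return np.round(norm * 255).astype(np.uint8)    # map into the gray space [0, 255]

# Example: to_gray_values(np.array([12.0, 30.0, 55.0])) -> array([  0, 107, 255], dtype=uint8)
```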
And S114, calculating the service test data volume corresponding to each service node to obtain the service test data proportion.
In this optional embodiment, the service test data amount and the total amount of all the service test data corresponding to each service node in the test database are counted according to the functional label, so as to calculate the service test data proportion corresponding to each service node.
For example, there are A, B, C service nodes, and the amount of service test data corresponding to each service node is 100, 200, and 300, then the total amount of all service test data is 600, and the ratio of service test data corresponding to each service node is 1/6, 1/3, and 1/2.
And S115, dividing the test image based on the service test data proportion and each service node to obtain a classified test image.
As shown in fig. 6, in this optional embodiment, the test image is divided according to the service test data proportion and the corresponding proportion of each service node, each service node corresponds to one category in the test image, and the area of the test image is divided proportionally according to the service test data amount corresponding to each service node.
Illustratively, the test image has 3000 pixels, and the ratios of service test data corresponding to A, B, C service nodes are 1/6, 1/3 and 1/2, so that the test image is divided into A, B, C classes, and the area of each service node corresponding to the test image has 500 pixels, 1000 pixels and 1500 pixels.
In this optional embodiment, the test image may be a regular image, such as a square or a rectangle, or may be an irregular pattern; the present scheme does not restrict the shape of the test image.
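For illustration, the sketch below allocates the pixels of a test image to service-node categories in proportion to their test data volumes; it fills the image row by row for simplicity and does not reproduce the adjacency-aware layout of fig. 6.

```python
# Illustrative sketch: proportional division of a test image into category regions.
import numpy as np

def divide_by_proportion(data_counts: dict, height: int, width: int) -> np.ndarray:
    total = sum(data_counts.values())
    labels = np.zeros(height * width, dtype=np.uint8)     # flattened label image
    offset = 0
    label = 0
    for label, (_node, count) in enumerate(data_counts.items(), start=1):
        n_pixels = round(height * width * count / total)  # proportional area
        labels[offset:offset + n_pixels] = label
        offset += n_pixels
    labels[offset:] = label                               # absorb rounding remainder
    return labels.reshape(height, width)

# With the example above (A/B/C holding 100/200/300 records and a 3000-pixel
# image, e.g. 50 x 60), the three regions get 500, 1000 and 1500 pixels.
```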
S116, setting the positions of the classified test images based on the test sequence to construct a first test image.
In this optional embodiment, when the service test data corresponding to each service node are converted into the test image, the pixel positions of the test data corresponding to the service nodes may be set in the test image according to the calling relation of each service node. Since one service node may have calling relations with a plurality of service nodes, the pixel positions may be allocated according to the calling relations between the current service node and the remaining service nodes, and the edge line of the graph where the current service node is located may be divided according to the amounts of service test data of the current service node called by each of the remaining service nodes.
For example, suppose there are four process nodes, so the test image is correspondingly divided into four categories 1, 2, 3 and 4; category 1 has calling relations with categories 2, 3 and 4, and the data volumes of category 1 called by categories 2, 3 and 4 are 30, 40 and 50. In the test image, the graph of category 1 is therefore connected with the graphs of categories 2, 3 and 4, and the edge line at the common connection is divided in proportion: the edge line comprises 120 pixels in total, of which 30 pixels connect with category 2, 40 pixels connect with category 3, and 50 pixels connect with category 4.
Illustratively, as shown in fig. 6, the test image is a square ABCD and there are 6 service nodes, corresponding to 6 types of service test data, which after conversion into the test image correspond to graphs 1 to 6. The service node 1 is fully invoked by the service node 2, so the edge line EF of graph 1 fully overlaps graph 2. The service node 2 needs to be invoked by the service node 3 and the service node 4, so the edge line GH of service node 2 is divided into two parts, GI and IH, where the point I is determined by the proportion of the amounts of test data of service node 2 invoked by service node 3 and service node 4. The areas of graphs 1 to 6 are determined by the test data volumes of the corresponding service nodes. Finally, the service test data corresponding to all the service nodes are converted into the test image shown in fig. 6, and the test image obtained after the conversion is used as the first test image.
In this optional embodiment, the number of pixels on the edge line of each graph may be obtained by a connected-domain analysis method, and the allocation may be performed according to the proportional relationship of the called data volumes between the graphs, so as to determine the corresponding edge-line boundary point; for example, the edge-line boundary point between graph 2 and graphs 3 and 4 is I. Connected component analysis (connected component labeling) refers to finding and labeling each connected region in an image; in the present scheme, the number of pixels on the edge line of each graph can be obtained by calling the OpenCV instruction corresponding to the connected-component analysis method.
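As one possible realization (the exact OpenCV call is not specified in the disclosure, so the choices below are assumptions), the sketch counts the pixels of one category that lie on the edge line shared with another category, and uses cv2.connectedComponents for the connected-domain labeling.

```python
# Assumed realization of the edge-line pixel counting and connected-domain step.
import cv2
import numpy as np

def shared_edge_pixels(labels: np.ndarray, cat_a: int, cat_b: int) -> int:
    """Count pixels of category cat_a adjacent (8-connectivity) to category cat_b."""
    mask_a = labels == cat_a
    mask_b = (labels == cat_b).astype(np.uint8)
    # dilate cat_b by one pixel; its overlap with cat_a marks the shared edge line
    dilated_b = cv2.dilate(mask_b, np.ones((3, 3), np.uint8)) > 0
    return int(np.count_nonzero(mask_a & dilated_b))

def region_count(labels: np.ndarray, cat: int) -> int:
    """Number of connected regions forming one category (connected-domain analysis)."""
    n, _ = cv2.connectedComponents((labels == cat).astype(np.uint8))
    return n - 1                                      # label 0 is the background
```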
Therefore, the acquired service test data are converted into the image domain and are correspondingly divided, the calling relation among all service nodes can be clearly and visually displayed, and the service test data can be conveniently and rapidly compared through an image matching algorithm in the subsequent process.
And S12, updating the initial application to obtain an updated application, and calling the test data corresponding to each service node of the updated application to construct a second test image.
In an optional embodiment, after a new version of the application code is deployed, the updated initial application is used as the updated application, the corresponding test scene is constructed again, and the test data corresponding to each service node is called again to construct the second test image corresponding to the new version, that is, the construction process of the second test image is consistent with the construction process of the first test image.
Therefore, the construction process of the second test image can be consistent with that of the first test image, and extra difference caused by inconsistent construction process is avoided, so that the real difference between the second test image and the first test image and the corresponding reason can be analyzed in the subsequent process.
And S13, comparing the first test image and the second test image to obtain a comparison result.
In an alternative embodiment, the first and second test images may be matched for comparison using a normalized cross-correlation matching algorithm to obtain a comparison result.
In this alternative embodiment, the Normalized Cross Correlation (NCC) matching algorithm is a classical statistical matching algorithm, and the degree of matching is determined by calculating the Cross Correlation value between the template image and the matching image, and in this embodiment, the first test image and the second test image are respectively used as the template image and the matching image.
In this alternative embodiment, the main steps of the normalized cross-correlation matching algorithm are:
1. carrying out average smoothing filtering on the first test image and the second test image to generate a normalized cross-correlation matrix;
2. obtaining the maximum value and corresponding index of each row and each column (one point in each image is relative to all corresponding points in the other image) according to the generated normalized cross-correlation matrix;
3. according to the result of 2, if the corresponding point indexes of the two images are consistent, the two images are a pair of initial matching point pairs;
4. repeating step 3 to obtain all the one-to-one matched point pairs.
In this alternative embodiment, the matching result obtained by the normalized cross-correlation matching algorithm has a value range of [-1, 1]; a matching result closer to 1 indicates that the first test image and the second test image are more correlated, and a matching result closer to -1 indicates that they are less correlated.
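A hedged sketch of this comparison score is shown below; it computes a single normalized cross-correlation value between two equal-size gray test images, omits the row/column point-pair matching described above, and uses an assumed decision threshold.

```python
# Hedged sketch of the comparison score between the first and second test images.
import numpy as np

def ncc(img1: np.ndarray, img2: np.ndarray) -> float:
    a = img1.astype(np.float64) - img1.mean()
    b = img2.astype(np.float64) - img2.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:                    # degenerate constant images: treat as full match
        return 1.0
    return float((a * b).sum() / denom)

# Example:
# score = ncc(first_test_image, second_test_image)
# has_difference = score < 0.999     # assumed threshold for "difference exists"
```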
In this optional embodiment, when the comparison result indicates a difference, the corresponding difference point is determined according to the pixel coordinate index value of the difference, and the currently corresponding service node is determined according to the graph category label of the difference point. It is then checked whether the part of the new-version application code corresponding to the current service node has been modified accordingly, so as to determine whether the difference point is caused by a program defect in the new-version code; if so, the code of the part corresponding to the service node is repaired.
In this optional embodiment, the newly added or changed content of the new-version code and the service nodes that will be affected by it may be analyzed and recorded in advance to obtain a difference record table. When there is a difference between the first test image and the second test image, whether the difference is a program defect is determined by checking whether the difference exists in the difference record table: if the difference lies on a service node affected by the newly added or changed content, the difference is not a program defect and may be ignored; if the difference does not lie on a service node affected by the newly added or changed content, the difference is caused by a program defect and needs to be repaired.
Therefore, whether the updated application program has defects or not can be quickly determined by establishing the difference record table, and the efficiency of checking the program defects by workers is greatly improved.
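The check against the difference record table reduces to a simple membership test; the sketch below uses sets of service-node names, which is an illustrative assumption about how the table is stored.

```python
# Sketch of consulting the difference record table: a difference on a service node
# that the new version is expected to affect is ignored, any other difference is
# reported as a program defect.
def classify_differences(differing_nodes: set, difference_record: set) -> dict:
    verdicts = {}
    for node in differing_nodes:
        if node in difference_record:
            verdicts[node] = "expected change, ignore"
        else:
            verdicts[node] = "program defect, needs repair"
    return verdicts

# Example:
# classify_differences({"credit approval", "credit payment"}, {"credit approval"})
# -> {"credit approval": "expected change, ignore",
#     "credit payment": "program defect, needs repair"}
```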
Referring to fig. 4, fig. 4 is a functional block diagram of a preferred embodiment of the data comparison testing apparatus based on artificial intelligence according to the present application. The artificial intelligence based data comparison testing device 11 comprises an acquisition unit 110, a construction unit 111, an updating unit 112 and a comparison unit 113. A module/unit as referred to herein is a series of computer-readable instruction segments which can be executed by the processor 13 to perform a fixed function and which are stored in the memory 12. In the present embodiment, the functions of the modules/units will be described in detail in the following embodiments.
In an alternative embodiment, the obtaining unit 110 is configured to send a test request to the test terminal according to the server of the initial application to obtain the test sequence of each service node.
In an optional embodiment, the sending the test request to the test terminal according to the server of the initial application to obtain the test sequence of each service node includes:
constructing a test environment according to the test environment configuration information;
determining the calling relation of each service node based on the test environment and the test scene configuration information;
and acquiring the test sequence of each service node based on the calling relation.
In this optional embodiment, the test environment configuration information is used to configure the test environment of each service node, and the test scenario configuration information is used to configure the test scenario of each service node, where the test environment refers to the computer hardware, software, network equipment and historical data necessary for completing a test task, and the test scenario may be a financial transaction, such as an inter-bank transfer service, a payment service or a large-amount remittance service.
In this optional embodiment, the test environment configuration information may include a test environment attribute keyword and a key value, where the test environment attribute keyword may be a test environment name keyword, a test environment identification keyword, and the like; the test scenario configuration information may include a test scenario attribute keyword and a key value, where the test scenario attribute keyword may be a test scenario name keyword, a test scenario identification keyword, and the like, and which type of keyword is specifically adopted to identify a target test environment and a target test scenario may be customized as required.
In this optional embodiment, different test tasks correspond to different test environment configuration information and test scenario configuration information. Before sending the test request, the server acquires the test environment configuration information and the test scene configuration information, and generates test task information according to the test environment configuration information and the test scene configuration information. The test environment configuration information and the test scenario configuration information can be generated by the configuration of operators in the background.
In this optional embodiment, the XML environment configuration file may be acquired, the XML environment configuration file may be analyzed to acquire the environment configuration parameters matched with the test environment configuration information, and the corresponding test environment may be established according to the environment configuration parameters.
In this optional embodiment, XML (Extensible Markup Language) is a markup language used to mark up a file so that the file has a structure. The XML environment configuration file carries the environment configuration data in XML, may include environment configuration parameters corresponding to a plurality of different test environments, and distinguishes the position of each test environment and its matched environment configuration parameters in the XML environment configuration file through a defined structure and tags.
Because XML tags are not predefined, they can be defined as needed, which makes generating the XML environment configuration file more flexible and convenient. During testing, only the XML environment configuration file needs to be parsed, and the target environment configuration parameters matched with the test environment configuration information are looked up to establish the corresponding test environment. When the XML environment configuration file is parsed, the position of the configuration parameters corresponding to each test environment can be obtained by searching for the test environment tag, and then the test environment attribute keywords and key values consistent with the test environment configuration information are searched within the environment configuration parameters of each test environment, so that the environment configuration parameters corresponding to the target test environment are determined.
In this optional embodiment, the XML environment configuration file may be obtained from the version management tool SVN in real time, so as to ensure that the XML environment configuration file is the latest configuration file, and the XML environment configuration file may also be obtained from the terminal.
In this optional embodiment, the environment configuration parameters have different types, such as a file type and a registry type, and different operations can be performed according to the specific type of each environment parameter to establish different test environments. Establishing the test environment through the XML environment configuration file avoids the time-consuming operations, such as restoring a virtual machine and copying files, that a conventional approach requires, which greatly improves the efficiency of establishing the test environment.
In this optional embodiment, a service node is each sub-process required for completing a complete test task. For example, if the test task is bank account opening, the corresponding sub-processes are submitting the account-opening data, checking the account-opening data, registering the customer data and opening an account for the applicant; the present scheme uses each such sub-process as a service node.
In this optional embodiment, when determining the configuration information of the test environment, the call relationship of each corresponding service node may be obtained according to a preset correspondence between the scene to be tested and the configuration information of the test environment.
Illustratively, the calling relation among the service nodes in a bank credit scenario is as follows: credit application, credit acceptance, credit investigation, credit examination and approval, contract signing, credit issuing, credit payment, post-credit management and credit recovery; the test sequence among the service nodes should likewise be credit application, credit acceptance, credit investigation, credit examination and approval, contract signing, credit issuing, credit payment, post-credit management and credit recovery.
In an optional embodiment, the constructing unit 111 is configured to invoke service test data corresponding to each service node of the initial application based on the test sequence to construct the first test image.
In an optional embodiment, the invoking, based on the test sequence, service test data corresponding to each service node of the initial application to construct a first test image includes:
establishing a test database to store test data;
calling the test data in the test database based on the test sequence to acquire service test data;
converting the service test data to an image gray space according to a normalization algorithm to obtain a test image;
calculating the service test data volume corresponding to each service node to obtain the service test data proportion;
dividing the test image based on the service test data proportion and each service node to obtain a classified test image;
the locations of the classified test images are set based on the test order to construct a first test image.
In this optional embodiment, a test database may be pre-established for storing the test data of each service node, and when the test database stores the test data of each service node, a uniform function tag may be set for the service node and the test data corresponding to the service node, so that the test data corresponding to the service node may be determined according to the function tag carried by each service node.
In this optional embodiment, test data corresponding to each service node may be read from the test database according to the functional tag of that service node. If test data corresponding to the service node exist in the test database, the test data corresponding to the service node are called as the service test data. If no test data corresponding to the service node exist in the test database, a test data requirement form corresponding to the service node is obtained based on the test database, corresponding original data are obtained from production data based on the test data requirement form, and desensitization processing is performed on the original data to obtain the service test data. Data desensitization here refers to deforming certain sensitive information through desensitization rules, so as to reliably protect sensitive private data. This allows the desensitized real data set to be used securely in development, testing and other non-production environments as well as in outsourcing environments.
For example, the test database stores test data requirement lists corresponding to different service nodes and established test data, and after receiving a test request, the test database can search for the test data corresponding to each service node according to the function tag. And when the test data corresponding to the service node does not exist, searching the actual production data corresponding to the service node in the existing production data according to the test data requirement form in the test database and the corresponding relation between the service node and the production data. Then, desensitizing the obtained real production data according to rules in a data desensitization rule base pre-stored in a test database so as to be used for testing later.
In an optional embodiment, the obtained test data of each service node are converted into the image gray space through a normalization algorithm, the image gray space being [0, 255], and the converted data are placed in the same test image, wherein the service test data of each service node correspond to different label categories in the test image.
In this optional embodiment, a linear (min-max) normalization method may be used to normalize the service test data, where the normalization formula is:

Xk_norm = (Xk - X_min) / (X_max - X_min)

wherein Xk_norm represents the normalized data value of the k-th piece of service test data, Xk represents the data value of the k-th piece of service test data, and X_max and X_min respectively represent the maximum value and the minimum value among all the service test data.
In this optional embodiment, the service test data amount and the total amount of all the service test data corresponding to each service node in the test database are counted according to the functional label, so as to calculate the service test data proportion corresponding to each service node.
For example, there are A, B, C service nodes, and the amount of service test data corresponding to each service node is 100, 200, and 300, then the total amount of all service test data is 600, and the ratio of service test data corresponding to each service node is 1/6, 1/3, and 1/2.
As shown in fig. 6, in this optional embodiment, the test image is divided according to the service test data proportion and the corresponding proportion of each service node, each service node corresponds to one category in the test image, and the area of the test image is divided proportionally according to the service test data amount corresponding to each service node.
Illustratively, the test image has 3000 pixels, and the ratios of service test data corresponding to A, B, C service nodes are 1/6, 1/3 and 1/2, so that the test image is divided into A, B, C classes, and the area of each service node corresponding to the test image has 500 pixels, 1000 pixels and 1500 pixels.
In this optional embodiment, the test image may be a regular image, such as a square or a rectangle, or may be an irregular pattern; the present scheme does not restrict the shape of the test image.
In this optional embodiment, when the service test data corresponding to each service node are converted into the test image, the pixel positions of the test data corresponding to the service nodes may be set in the test image according to the calling relation of each service node. Since one service node may have calling relations with a plurality of service nodes, the pixel positions may be allocated according to the calling relations between the current service node and the remaining service nodes, and the edge line of the graph where the current service node is located may be divided according to the amounts of service test data of the current service node called by each of the remaining service nodes.
For example, suppose there are four process nodes, so the test image is correspondingly divided into four categories 1, 2, 3 and 4; category 1 has calling relations with categories 2, 3 and 4, and the data volumes of category 1 called by categories 2, 3 and 4 are 30, 40 and 50. In the test image, the graph of category 1 is therefore connected with the graphs of categories 2, 3 and 4, and the edge line at the common connection is divided in proportion: the edge line comprises 120 pixels in total, of which 30 pixels connect with category 2, 40 pixels connect with category 3, and 50 pixels connect with category 4.
Illustratively, as shown in fig. 6, the test image is a square ABCD and there are 6 service nodes, corresponding to 6 types of service test data, which after conversion into the test image correspond to graphs 1 to 6. The service node 1 is fully invoked by the service node 2, so the edge line EF of graph 1 fully overlaps graph 2. The service node 2 needs to be invoked by the service node 3 and the service node 4, so the edge line GH of service node 2 is divided into two parts, GI and IH, where the point I is determined by the proportion of the amounts of test data of service node 2 invoked by service node 3 and service node 4. The areas of graphs 1 to 6 are determined by the test data volumes of the corresponding service nodes. Finally, the service test data corresponding to all the service nodes are converted into the test image shown in fig. 6, and the test image obtained after the conversion is used as the first test image.
In this optional embodiment, the number of pixels on the edge line of each graph may be obtained by a connected-domain analysis method, and the allocation may be performed according to the proportional relationship of the called data volumes between the graphs, so as to determine the corresponding edge-line boundary point; for example, the edge-line boundary point between graph 2 and graphs 3 and 4 is I. Connected component analysis (connected component labeling) refers to finding and labeling each connected region in an image; in the present scheme, the number of pixels on the edge line of each graph can be obtained by calling the OpenCV instruction corresponding to the connected-component analysis method.
In an alternative embodiment, the updating unit 112 is configured to update the initial application to obtain an updated application, and call the test data corresponding to each service node of the updated application to construct the second test image.
In an optional embodiment, after a new version of the application code is deployed, the updated initial application is used as the updated application, the corresponding test scene is constructed again, and the test data corresponding to each service node is called again to construct the second test image corresponding to the new version, that is, the construction process of the second test image is consistent with the construction process of the first test image.
In an alternative embodiment, the comparison unit 113 is configured to compare the first test image and the second test image to obtain a comparison result.
In an alternative embodiment, the first and second test images may be matched for comparison using a normalized cross-correlation matching algorithm to obtain a comparison result.
In this alternative embodiment, a Normalized Cross Correlation (NCC) matching algorithm is a classical statistical matching algorithm, and the degree of matching is determined by calculating a Cross Correlation value between a template image and a matching image, in which the first test image and the second test image are respectively used as the template image and the matching image.
In this alternative embodiment, the main steps of the normalized cross-correlation matching algorithm are as follows (a simplified code sketch follows the steps):
1. Perform mean smoothing filtering on the first test image and the second test image and generate the normalized cross-correlation matrix;
2. Obtain the maximum value and the corresponding index of each row and each column of the generated normalized cross-correlation matrix (each point in one image relative to all corresponding points in the other image);
3. According to the result of step 2, if the corresponding point indexes of the two images are consistent, the two points form an initial matching point pair;
4. Repeat step 3 to obtain all one-to-one matched point pairs.
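As a simplified sketch of the matching score only (a single global normalized cross-correlation value rather than the full point-pair matching of steps 1 to 4; the image contents are synthetic and purely illustrative):

```python
import numpy as np

def ncc(first_img: np.ndarray, second_img: np.ndarray) -> float:
    """Global normalized cross-correlation between two equally sized grayscale test images."""
    a = first_img.astype(np.float64) - first_img.mean()
    b = second_img.astype(np.float64) - second_img.mean()
    denom = np.sqrt((a * a).sum()) * np.sqrt((b * b).sum())
    return float((a * b).sum() / denom)

# Two small synthetic "test images": identical images score 1.0,
# slightly different images score close to 1.0.
img1 = np.array([[10, 20], [30, 40]], dtype=np.uint8)
img2 = np.array([[12, 22], [28, 44]], dtype=np.uint8)
print(ncc(img1, img1))  # 1.0
print(ncc(img1, img2))  # approximately 0.98
```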
In this alternative embodiment, the normalized cross-correlation matching algorithm produces a matching result in the range [-1, 1]: the closer the result is to 1, the more correlated the first test image and the second test image are, and the closer the result is to -1, the less correlated they are.
In this optional embodiment, when the comparison result indicates that there is a difference, the corresponding difference point is determined from the pixel coordinate index value associated with the difference, and the currently corresponding service node is determined from the graph category label of the difference point. It is then checked whether the part of the new-version application code corresponding to that service node has been modified, so as to determine whether the difference point is caused by a program defect in the new version of the code; if so, the code of the corresponding part of the service node is repaired.
In this optional embodiment, the newly added or changed content of the new version of the code, together with the service nodes that will be affected by it, may be analyzed and recorded in advance to obtain a difference record table. When there is a difference between the first test image and the second test image, whether the difference is a program defect is determined by checking whether it is covered by the difference record table: if the difference occurs at a service node affected by the newly added or changed content, it is not a program defect and may be ignored; if it does not, the difference is caused by a program defect and needs to be repaired.
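A minimal sketch of this lookup (the table contents and node names are assumptions for illustration, not part of the embodiment) might be:

```python
# Hypothetical difference record table: service nodes expected to change in the new version,
# together with the recorded change content.
difference_record_table = {
    "service_node_3": "new discount field added",
    "service_node_5": "interest calculation reworked",
}

def classify_difference(service_node: str) -> str:
    """Decide how a difference detected at the given service node should be handled."""
    if service_node in difference_record_table:
        # The change was recorded in advance, so the difference is expected and can be ignored.
        return f"expected change ({difference_record_table[service_node]}); ignore"
    # No recorded change covers this node, so the difference indicates a program defect.
    return "program defect; repair the corresponding part of the service node's code"

print(classify_difference("service_node_3"))  # expected change (...); ignore
print(classify_difference("service_node_2"))  # program defect; repair ...
```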
According to the technical scheme, the obtained different types of test data can be converted into the image domain for batch comparison test according to the test sequence of each service node, and the comparison test efficiency of the batch data can be effectively improved.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 1 comprises a memory 12 and a processor 13. The memory 12 is used for storing computer readable instructions, and the processor 13 is used for executing the computer readable instructions stored in the memory to implement the artificial intelligence based data comparison testing method according to any one of the above embodiments.
In an alternative embodiment, the electronic device 1 further comprises a bus, a computer program stored in said memory 12 and executable on said processor 13, such as an artificial intelligence based data comparison test program.
Fig. 5 shows only the electronic device 1 with the memory 12 and the processor 13, and it will be understood by those skilled in the art that the structure shown in fig. 5 does not constitute a limitation of the electronic device 1, which may include fewer or more components than shown, a combination of certain components, or a different arrangement of components.
In conjunction with fig. 1, the memory 12 in the electronic device 1 stores a plurality of computer-readable instructions to implement an artificial intelligence based data comparison test method, and the processor 13 can execute the plurality of instructions to implement:
sending a test request to a test terminal according to the initially applied server to obtain a test sequence of each service node;
calling service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image;
updating the initial application to obtain an updated application, and calling test data corresponding to each service node of the updated application to construct a second test image;
and comparing the first test image with the second test image to obtain a comparison result.
Specifically, for the specific implementation method of the above instructions by the processor 13, reference may be made to the description of the relevant steps in the embodiment corresponding to fig. 1, which is not repeated here.
It will be understood by those skilled in the art that the schematic diagram is only an example of the electronic device 1 and does not constitute a limitation of the electronic device 1; the electronic device 1 may have a bus-type structure or a star-type structure, may include more or less hardware or software than shown, or a different arrangement of components; for example, the electronic device 1 may further include an input/output device, a network access device, and the like.
It should be noted that the electronic device 1 is only an example; other existing or future electronic products that can be adapted to the present application shall also fall within the scope of protection of the present application and are incorporated herein by reference.
Memory 12 includes at least one type of readable storage medium, which may be non-volatile or volatile. The readable storage medium includes flash memory, removable hard disks, multimedia cards, card type memory (e.g., SD or DX memory, etc.), magnetic memory, magnetic disks, optical disks, etc. The memory 12 may in some embodiments be an internal storage unit of the electronic device 1, e.g. a removable hard disk of the electronic device 1. The memory 12 may also be an external storage device of the electronic device 1 in other embodiments, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the electronic device 1. The memory 12 may be used not only for storing application software installed in the electronic device 1 and various types of data, such as codes of artificial intelligence based data comparison test programs, etc., but also for temporarily storing data that has been output or is to be output.
The processor 13 may be composed of an integrated circuit in some embodiments, for example, a single packaged integrated circuit, or may be composed of a plurality of integrated circuits packaged with the same or different functions, including one or more central processing units (CPUs), microprocessors, digital processing chips, graphics processors, and combinations of various control chips. The processor 13 is the control unit of the electronic device 1, connects the various components of the whole electronic device 1 by using various interfaces and lines, and executes various functions and processes data of the electronic device 1 by running or executing the programs or modules stored in the memory 12 (for example, the artificial intelligence based data comparison test program) and calling the data stored in the memory 12.
The processor 13 executes an operating system of the electronic device 1 and various installed application programs. The processor 13 executes the application program to implement the steps in each of the above embodiments of artificial intelligence based data comparison testing methods, such as the steps shown in fig. 1 to 3.
Illustratively, the computer program may be partitioned into one or more modules/units, which are stored in the memory 12 and executed by the processor 13 to accomplish the present application. The one or more modules/units may be a series of computer readable instruction segments capable of performing certain functions, which are used to describe the execution of the computer program in the electronic device 1. For example, the computer program may be divided into an acquisition unit 110, a construction unit 111, an update unit 112, a comparison unit 113.
The integrated unit, if implemented in the form of a software functional module, may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a computer device, or a network device, etc.) or a processor to execute parts of the artificial intelligence based data comparison testing method according to the embodiments of the present application.
The integrated modules/units of the electronic device 1 may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the flow in the method of the embodiments described above may be implemented by a computer program, which may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of the embodiments of the methods described above may be implemented.
Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory, and other memories, etc.
Further, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
The blockchain referred to in this application is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database, a series of data blocks associated by using a cryptographic method; each data block contains information of a batch of network transactions, which is used to verify the validity (anti-counterfeiting) of the information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one arrow is shown in FIG. 5, but this does not indicate only one bus or one type of bus. The bus is arranged to enable connection communication between the memory 12 and at least one processor 13 etc.
The embodiment of the present application further provides a computer-readable storage medium (not shown), in which computer-readable instructions are stored, and the computer-readable instructions are executed by a processor in an electronic device to implement the artificial intelligence based data comparison testing method according to any of the above embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
Furthermore, it will be obvious that the term "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the specification may also be implemented by one unit or means through software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application and not for limiting, and although the present application is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions can be made on the technical solutions of the present application without departing from the spirit and scope of the technical solutions of the present application.

Claims (10)

1. A data comparison testing method based on artificial intelligence is characterized by comprising the following steps:
sending a test request to a test terminal according to the initially applied server to obtain a test sequence of each service node;
calling service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image;
updating the initial application to obtain an updated application, and calling test data corresponding to each service node of the updated application to construct a second test image;
and comparing the first test image with the second test image to obtain a comparison result.
2. The artificial intelligence based data comparison testing method according to claim 1, wherein the test request includes test environment configuration information and test scenario configuration information, and the sending of the test request to the test terminal according to the initially applied server to obtain the test sequence of each service node comprises:
constructing a test environment according to the test environment configuration information;
determining the calling relationship of each service node based on the test environment and the test scenario configuration information;
and acquiring the test sequence of each service node based on the calling relation.
3. The artificial intelligence based data comparison testing method of claim 2, wherein said constructing a test environment according to the test environment configuration information comprises:
acquiring an environment configuration file according to a preset mode;
analyzing the environment configuration file to obtain environment configuration data;
matching the environment configuration data with the test environment configuration information to obtain environment configuration parameters;
and constructing the test environment based on the environment configuration parameters.
4. The artificial intelligence based data comparison testing method according to claim 1, wherein the calling of the service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image comprises:
establishing a test database to store test data;
calling the test data in the test database based on the test sequence to acquire service test data;
converting the service test data to an image gray space according to a normalization algorithm to obtain a test image;
calculating the service test data volume corresponding to each service node to obtain the service test data proportion;
dividing the test image based on the service test data proportion and each service node to obtain a classified test image;
the locations of the classified test images are set based on the test order to construct a first test image.
5. The artificial intelligence based data comparison testing method of claim 4, wherein said calling the test data in the test database based on the test sequence to acquire service test data comprises:
judging whether test data corresponding to the service node exists in the test database or not;
if yes, calling test data corresponding to the service node to serve as service test data;
if the test data does not exist, a test data demand form corresponding to the service node is obtained based on the test database, corresponding original data is obtained from production data based on the test data demand form, and desensitization processing is carried out on the original data to obtain service test data.
6. The artificial intelligence based data comparison testing method of claim 5, wherein said positioning the classified test images based on the test sequence to construct a first test image comprises:
acquiring a service node calling relation based on the test sequence;
counting the service test data volume of each service node called by other service nodes based on the service node calling relation to obtain a service test data calling proportion;
and determining the position of the classified test image based on the service test data calling proportion and the service node calling relation so as to construct a first test image.
7. The artificial intelligence based data comparison testing method of claim 1, wherein the comparison result includes the presence of a difference or no difference, and after the comparing of the first test image and the second test image to obtain the comparison result, the method further comprises:
recording difference content between the initial application and the updated application according to a preset mode to obtain a difference record table;
if the comparison result is different, judging whether the content corresponding to the difference exists in the difference record table;
if the content corresponding to the difference does not exist in the difference record table, indicating that the updated application program has defects and needs to be repaired;
and if the content corresponding to the difference exists in the difference record table, indicating that the program of the updated application has no defect, and ignoring the difference.
8. An artificial intelligence based data comparison testing device, characterized by comprising:
the system comprises an acquisition unit, a test unit and a test unit, wherein the acquisition unit is used for sending a test request to a test terminal according to an initially applied server so as to acquire a test sequence of each service node;
the construction unit is used for calling the service test data corresponding to each service node of the initial application based on the test sequence to construct a first test image;
the updating unit is used for updating the initial application to obtain an updated application and calling the test data corresponding to each service node of the updated application to construct a second test image;
and the comparison unit is used for comparing the first test image with the second test image to obtain a comparison result.
9. An electronic device, comprising:
a memory storing computer readable instructions; and
a processor executing computer readable instructions stored in the memory to implement the artificial intelligence based data comparison testing method of any of claims 1-7.
10. A computer-readable storage medium having computer-readable instructions stored thereon which, when executed by a processor, implement the artificial intelligence based data comparison testing method of any of claims 1-7.
CN202210598502.3A 2022-05-30 2022-05-30 Data comparison test method based on artificial intelligence and related equipment Pending CN114924982A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210598502.3A CN114924982A (en) 2022-05-30 2022-05-30 Data comparison test method based on artificial intelligence and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210598502.3A CN114924982A (en) 2022-05-30 2022-05-30 Data comparison test method based on artificial intelligence and related equipment

Publications (1)

Publication Number Publication Date
CN114924982A true CN114924982A (en) 2022-08-19

Family

ID=82813569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210598502.3A Pending CN114924982A (en) 2022-05-30 2022-05-30 Data comparison test method based on artificial intelligence and related equipment

Country Status (1)

Country Link
CN (1) CN114924982A (en)

Similar Documents

Publication Publication Date Title
CN112669138B (en) Data processing method and related equipment
CN110427375B (en) Method and device for identifying field type
CN111666346A (en) Information merging method, transaction query method, device, computer and storage medium
CN111598850A (en) Data auditing method and system
CN110941488A (en) Task processing method, device, equipment and storage medium
CN112671609A (en) Asset census and safety detection method and device and terminal equipment
CN115049878A (en) Target detection optimization method, device, equipment and medium based on artificial intelligence
CN113918467A (en) Financial system testing method, device, equipment and storage medium
CN111680110B (en) Data processing method, data processing device, BI system and medium
CN109947797B (en) Data inspection device and method
CN108846292A (en) Desensitization process method and device
CN111639903A (en) Review processing method for architecture change and related equipment
CN111612616A (en) Block chain account evaluation method and device, terminal device and computer readable medium
CN111242779A (en) Financial data characteristic selection and prediction method, device, equipment and storage medium
CN111061733A (en) Data processing method and device, electronic equipment and computer readable storage medium
CN116150185A (en) Data standard extraction method, device, equipment and medium based on artificial intelligence
CN114924982A (en) Data comparison test method based on artificial intelligence and related equipment
CN113434397B (en) Task system testing method and device, electronic equipment and storage medium
CN113282837B (en) Event analysis method, device, computer equipment and storage medium
CN112001792B (en) Configuration information consistency detection method and device
CN113504865A (en) Work order label adding method, device, equipment and storage medium
CN111859985A (en) AI customer service model testing method, device, electronic equipment and storage medium
CN113283677A (en) Index data processing method, device, equipment and storage medium
CN111882415A (en) Training method and related device of quality detection model
CN117076546B (en) Data processing method, terminal device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination