CN115203027A - Test method and device based on data integration model - Google Patents


Info

Publication number
CN115203027A
CN115203027A (application CN202210745096.9A)
Authority
CN
China
Prior art keywords
data
database
model
file
definition file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210745096.9A
Other languages
Chinese (zh)
Inventor
梁晓珺
朱乐和
罗秉安
连煜伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202210745096.9A
Publication of CN115203027A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06F 11/3672 Test management
    • G06F 11/3684 Test management for test design, e.g. generating new test cases
    • G06F 11/3688 Test management for test execution, e.g. scheduling of test suites
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval of structured data, e.g. relational data
    • G06F 16/25 Integrating or interfacing systems involving database management systems
    • G06F 16/28 Databases characterised by their database models, e.g. relational or object models
    • G06F 16/284 Relational databases
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/455 Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F 9/45504 Abstract machines for program code execution, e.g. Java virtual machine [JVM], interpreters, emulators
    • G06F 9/45508 Runtime interpretation or emulation, e.g. emulator loops, bytecode interpretation
    • G06F 9/45512 Command shells

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Debugging And Monitoring (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The disclosure provides a test method based on a data integration model, which can be applied in the field of financial technology. The method comprises the following steps: connecting and configuring each database into a data integration model according to a database configuration file; inputting a business data processing flow chart into the data integration model to generate a target model definition file, wherein the target model definition file represents the data relationships between the input/output of the service to be tested and each database table, and the business data processing flow chart is determined from the data integration model and the business data processing flow information; generating at least one test script case according to the target model definition file and the data integration model; and executing the test script case to output a visualized test result. The present disclosure also provides a test apparatus, device, storage medium, and program product based on the data integration model.

Description

Test method and device based on data integration model
Technical Field
The present disclosure relates to the field of computer technologies, in particular to distributed service testing, and provides a testing method, apparatus, device, storage medium, and program product based on a data integration model.
Background
With the development of distributed technology, distributed platform services are widely used in various applications, and correspondingly higher requirements are placed on their testing. To improve reusability, a single service usually performs one relatively independent function, and a complex business scenario often requires combining multiple services. For example, in a multi-service integration environment, implementing a complex banking business often requires combining services from several different applications. These combined, cross-application services typically need to access and process data from multiple different databases.
In the related art, distributed services are tested manually, and clients for the multiple databases must be installed by hand in order to connect to them. Pre-service data preparation: data in multiple databases are queried manually to register, change, or delete pre-transaction data. Post-service data checking: pre-test data preparation and post-transaction data comparison are performed manually. Post-service data recovery: data are changed or deleted manually so that subsequent transactions can run correctly. This testing approach requires testers to be skilled in operating a variety of databases, which hinders test efficiency.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
In view of the foregoing, the present disclosure provides a data integration model-based testing method, apparatus, device, medium, and program product that improve testing efficiency.
According to a first aspect of the present disclosure, there is provided a test method based on a data integration model, including:
connecting and configuring each database into a data integration model according to the database configuration file;
inputting a business data processing flow chart into the data integration model to generate a target model definition file, wherein the target model definition file is used for representing the data relation between the input and the output of the service to be tested and each database table, and the business data processing flow chart is determined according to the data integration model and the business data processing flow information;
generating at least one test script case according to the target model definition file and the data integration model; and
executing the test script case to output a visualized test result.
According to an embodiment of the present disclosure, the inputting the business data processing flow diagram into the data integration model to generate a target model definition file includes:
identifying an operation identifier in the business data processing flow chart;
determining the data relation between the service input and output field to be tested and each database table field according to the operation identification; and
generating a target model definition file, in XML format, according to the data relations and the database data operation analysis in the data integration model.
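As a minimal sketch of what such an XML target model definition might contain, the following snippet builds one data-relation entry with the JDK's built-in DOM API. The element and attribute names (`targetModel`, `dataRelation`, `serviceField`, and so on) are illustrative assumptions, not the schema actually used by the disclosure:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import java.io.StringWriter;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class TargetModelDefinitionSketch {
    // Builds a tiny XML definition mapping one service field to one database table field.
    public static String buildDefinition() throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = doc.createElement("targetModel");
        doc.appendChild(root);

        Element relation = doc.createElement("dataRelation");
        relation.setAttribute("serviceField", "in.accountNo"); // service input field
        relation.setAttribute("table", "CUST_INFO");           // database table
        relation.setAttribute("tableField", "ACCT_NO");        // table field
        relation.setAttribute("operation", "QUERY");           // operation identifier from the flow chart
        root.appendChild(relation);

        // Serialize the DOM tree to an XML string.
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        StringWriter out = new StringWriter();
        t.transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }
}
```

In the described method such a file would contain one entry per operation identifier recognized in the business data processing flow chart.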
According to an embodiment of the present disclosure, the generating at least one test script case according to the object model definition file and the data integration model includes:
generating a test script Excel file and a test script execution Java file according to the target model definition file and the data integration model.
According to the embodiment of the disclosure, the data integration model comprises a data preparation sub-model, a data assertion sub-model and a data recovery sub-model, and the generating of the test script Excel file according to the target model definition file and the data integration model comprises the following steps:
generating a data preparation test script Excel file according to the target model definition file and the data preparation sub-model;
generating a data assertion test script Excel file according to the target model definition file and the data assertion sub-model;
generating a data recovery test script Excel file according to the target model definition file and the data recovery sub-model; and
generating the test script Excel file from the data preparation test script Excel file, the data assertion test script Excel file, and the data recovery test script Excel file.
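The assembly of the three sub-model scripts into one test script can be sketched as follows. This is a simplified illustration that represents script rows as string arrays rather than real Excel cells (the tool itself writes Excel via Apache POI); the phase labels and table name are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

public class TestScriptAssemblySketch {
    // One section of script rows contributed by a single sub-model.
    static List<String[]> section(String phase, String table, String action) {
        List<String[]> rows = new ArrayList<>();
        rows.add(new String[]{phase, table, action});
        return rows;
    }

    // Concatenates the three sub-model sections into one script, in the
    // order prepare -> assert -> recover described by the disclosure.
    public static List<String[]> assemble() {
        List<String[]> script = new ArrayList<>();
        script.addAll(section("PREPARE", "CUST_INFO", "INSERT")); // data preparation sub-model
        script.addAll(section("ASSERT",  "CUST_INFO", "SELECT")); // data assertion sub-model
        script.addAll(section("RECOVER", "CUST_INFO", "DELETE")); // data recovery sub-model
        return script;
    }
}
```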
According to an embodiment of the present disclosure, the database configuration file includes a database definition XML file, an attribute definition file, and a routing definition file, and configuring each database connection into the data integration model according to the database configuration file includes:
determining data source connection attributes according to the database definition XML file, wherein the data source connection attributes comprise the database name, database connection information, and database login verification information;
determining an attribute value corresponding to the data source connection attribute according to the attribute definition file; and
connecting and configuring each database into the data integration model according to the attribute values corresponding to the data source connection attributes.
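How the attribute definition file might supply concrete values for the attributes named in the database definition file can be sketched with plain `java.util.Properties`. The placeholder syntax and key names below are assumptions for illustration, not the format the disclosure actually prescribes:

```java
import java.util.Properties;

public class DataSourceConfigSketch {
    // The definition file names each connection attribute via a placeholder,
    // e.g. url=${custdb.url}; the attribute definition file maps placeholder
    // names to concrete values. resolve() substitutes the values in.
    public static Properties resolve(Properties definition, Properties values) {
        Properties resolved = new Properties();
        for (String key : definition.stringPropertyNames()) {
            String placeholder = definition.getProperty(key);      // e.g. "${custdb.url}"
            String name = placeholder.replaceAll("[${}]", "");     // -> "custdb.url"
            resolved.setProperty(key, values.getProperty(name, placeholder));
        }
        return resolved;
    }
}
```

With the resolved properties in hand, the tool could then open the JDBC connections for each configured database (Oracle, MySQL, DB2) and register them in the data integration model.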
According to an embodiment of the present disclosure, the databases include Oracle, MySQL, and DB2.
A second aspect of the present disclosure provides a test apparatus based on a data integration model, including:
the database configuration module is used for connecting and configuring each database into the data integration model according to the database configuration file;
the first generation module is used for inputting a business data processing flow chart into the data integration model so as to generate a target model definition file, the target model definition file is used for representing the data relation between the input and output of the service to be tested and each database table, and the business data processing flow chart is determined according to the data integration model and the business data processing flow information;
the second generation module is used for generating at least one test script case according to the target model definition file and the data integration model; and
the execution module is used for executing the test script case to output a visualized test result.
According to an embodiment of the present disclosure, the first generating module includes:
the identification submodule is used for identifying the operation identifier in the business data processing flow chart;
the first determining submodule is used for determining the data relation between the input and output field of the service to be tested and each database table field according to the operation identification; and
the first generation submodule is used for generating a target model definition file, in XML format, according to the data relations and the database data operation analysis in the data integration model.
According to an embodiment of the present disclosure, the second generating module includes:
the second generation submodule is used for generating a test script Excel file and a test script execution Java file according to the target model definition file and the data integration model.
According to an embodiment of the present disclosure, the second generation submodule includes:
the first generation unit is used for generating a data preparation test script Excel file according to the target model definition file and the data preparation sub-model;
the second generation unit is used for generating a data assertion test script Excel file according to the target model definition file and the data assertion sub-model;
the third generation unit is used for generating a data recovery test script Excel file according to the target model definition file and the data recovery sub-model; and
the fourth generating unit is used for generating the test script Excel file from the data preparation test script Excel file, the data assertion test script Excel file, and the data recovery test script Excel file.
According to an embodiment of the present disclosure, a database configuration module includes:
the second determining submodule is used for determining data source connection attributes according to the database definition XML file, wherein the data source connection attributes comprise the database name, database connection information, and database login verification information;
a third determining submodule, configured to determine, according to the attribute definition file, an attribute value corresponding to the data source connection attribute; and
the connection configuration submodule is used for connecting and configuring each database into the data integration model according to the attribute values corresponding to the data source connection attributes.
A third aspect of the present disclosure provides an electronic device, comprising: one or more processors; a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the above-described data integration model-based testing method.
The fourth aspect of the present disclosure also provides a computer-readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to perform the above-mentioned data integration model-based testing method.
A fifth aspect of the present disclosure also provides a computer program product comprising a computer program which, when executed by a processor, implements the above-described data integration model-based testing method.
According to the test method based on the data integration model, each database is connected and configured into the data integration model through the database configuration file; the business data processing flow chart is translated, based on the data integration model, into a target model definition file; the target model definition file is input into the data integration model to generate test script cases; and the test script cases are executed to output a visualized test result. Compared with the related art, this method does not require testers to be familiar with operating multiple databases and lets them focus on the flow of business data; the workload of manual testing is reduced, and testing efficiency is improved.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following description of embodiments of the disclosure, which proceeds with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates an application scenario diagram of a data integration model based testing method, apparatus, device, medium and program product according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow diagram of a data integration model based testing method according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow chart of a database connection configuration method according to an embodiment of the present disclosure;
FIG. 4a schematically illustrates a business data processing flow diagram according to an embodiment of the disclosure;
FIG. 4b schematically shows a flow chart of a method of generating an object model definition file according to an embodiment of the present disclosure;
FIG. 5a is a flowchart of a method for generating a test script case according to an embodiment of the present disclosure;
FIG. 5b is a second flowchart of a method for generating a test script case according to an embodiment of the disclosure;
FIG. 6a schematically illustrates a data manipulation diagram of a data preparation submodel according to an embodiment of the present disclosure;
FIG. 6b schematically illustrates a data operation diagram of a data assertion sub-model according to an embodiment of the disclosure;
FIG. 6c schematically illustrates a data operation diagram of a data recovery submodel according to an embodiment of the disclosure;
FIG. 7 is a block diagram schematically illustrating a configuration of a test apparatus based on a data integration model according to an embodiment of the present disclosure; and
FIG. 8 schematically illustrates a block diagram of an electronic device suitable for implementing a data integration model-based testing method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
The terms appearing in the present disclosure are explained first:
a data integration model: in the embodiments of the present disclosure, models are established for the test-data query, change, and deletion operations of different databases and for the data relationships among those databases. It mainly comprises three sub-models: a data preparation sub-model, a data assertion sub-model, and a data recovery sub-model.
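A rough sketch of how the three sub-models could be represented uniformly in code follows. The default operation attached to each sub-model is only the typical case described in the background (register, compare, restore) and is assigned here as an assumption:

```java
public class DataIntegrationModelSketch {
    // The model groups per-database operations into three sub-models with a
    // uniform shape, so testers need not address each database directly.
    public enum SubModel {
        PREPARATION("INSERT"), // register pre-transaction data
        ASSERTION("SELECT"),   // compare post-transaction data
        RECOVERY("DELETE");    // restore data for the next run

        final String defaultOperation;
        SubModel(String op) { this.defaultOperation = op; }
    }

    public static String describe(SubModel m) {
        return m.name() + " -> " + m.defaultOperation;
    }
}
```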
Apache POI: a Java API that provides functions for reading and writing Microsoft Office format files.
MySQL: a relational database management system (RDBMS). MySQL is one of the most popular relational database management systems, and for web applications it is among the most widely used RDBMS software.
IBM DB2: a relational database management system developed by IBM, whose main operating environments are UNIX (including IBM's own AIX), Linux, IBM i (formerly OS/400), z/OS, and Windows server versions. DB2 is mainly used in large application systems; it scales well, supports environments from mainframes to single users, and runs on all common server operating system platforms.
Oracle: a relational database management system known for efficiency and reliability and suited to high-throughput workloads.
To improve reusability, a single service usually performs one relatively independent function, and a complex business scenario often requires combining multiple services. For example, in a multi-service integration environment, implementing a complex banking business often requires combining services from several different applications. These combined, cross-application services typically need to access and process data from multiple different databases. For example, the purchase of a structured deposit product requires access to the personal finance system host (DB2 database) and platform (MySQL database) for media and funds checking, to the customer information system (MySQL database) for customer information checking, to the asset system (Oracle database) for customer asset checking, and so on. With manual testing, clients for the multiple databases must be installed by hand in order to connect to them. Pre-service data preparation: data in multiple databases are queried manually to register, change, or delete pre-transaction data. Post-service data checking: pre-test data preparation and post-transaction data comparison are performed manually. Post-service data recovery: data are changed or deleted manually so that subsequent transactions can run correctly.
The above test method has the following problems:
1. Multiple database clients need to be installed, and testing requires accessing several databases at once, which places high technical demands on testers: they must be skilled in a variety of database operations, and their technical level strongly affects the test outcome. Testers' attention is drawn away from combing the business data flow relationships.
2. The business data processing logic in a business scenario has no clear, visible view, which hinders evaluation of the test scheme and reproduction of the test scenario.
3. Operations on different databases are executed manually each time, and a large amount of repeated manual test work is spent on database operations across scenarios, which hinders test efficiency. In particular, when a change to a component forces regression testing of all functional scenarios, the repeated manual testing entails a huge workload.
Based on the above technical problem, an embodiment of the present disclosure provides a test method based on a data integration model, including: connecting and configuring each database into a data integration model according to the database configuration file; inputting a business data processing flow chart into the data integration model to generate a target model definition file, wherein the target model definition file is used for representing the data relation between the input and output of the service to be tested and each database table, and the business data processing flow chart is determined according to the data integration model and the business data processing flow information; generating at least one test script case according to the target model definition file and the data integration model; and executing the test script case to output a visual test result.
FIG. 1 schematically illustrates an application scenario diagram of a data integration model based testing method, apparatus, device, medium and program product according to an embodiment of the present disclosure.
As shown in FIG. 1, an application scenario 100 according to this embodiment may comprise a distributed microservice test scenario. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The test person may use the terminal devices 101, 102, 103 to interact with the server 105 over the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have installed thereon various communication client applications, such as shopping-like applications, web browser applications, search-like applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only).
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a test case generation server, which generates a test case according to the business data flow information input by the tester, for example, in response to a test instruction sent by the tester using the terminal devices 101, 102, and 103, the test case generation server may analyze and process the received business data flow information, input a preset data integration model, and automatically generate the test case.
It should be noted that the data integration model-based testing method provided by the embodiment of the present disclosure may be generally executed by the server 105. Accordingly, the data integration model-based test device provided by the embodiment of the present disclosure may be generally disposed in the server 105. The test method based on the data integration model provided by the embodiment of the present disclosure may also be executed by a server or a server cluster which is different from the server 105 and can communicate with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the test device based on the data integration model provided by the embodiment of the present disclosure may also be disposed in a server or a server cluster different from the server 105 and capable of communicating with the terminal devices 101, 102, 103 and/or the server 105.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation.
It should be noted that the test method and apparatus based on the data integration model determined in the embodiments of the present disclosure may be applied to the automated test field in the financial field, and may also be applied to any field other than the financial field.
The data integration model-based test method according to the embodiment of the present disclosure will be described in detail below with reference to fig. 2 to 6c based on the scenario described in fig. 1.
FIG. 2 schematically shows a flow chart of a test method based on a data integration model according to an embodiment of the present disclosure. As shown in FIG. 2, the method of this embodiment includes operations S210 to S240 and may be performed by an automated test tool: a unified-view, multi-database integration test tool developed on top of TestNG using the data integration model. With this tool, a user simulates the data processing flow of the whole service by building a data integration model; the background then translates the model into business data processing flow code and automatically generates test scripts in a unified style, greatly lowering the demands on testers' database operation skills. Second, because the tool performs automated testing with the data integration model, testers can express the various database operations, and the data relationship comparisons among databases, visually in the model, which facilitates evaluating test schemes and cases and accurately reproducing the business data processing flow. In addition, the tool provides a number of convenience functions and assertion templates: testers can simply fill in an Excel sheet, which is quick and greatly reduces the amount of code. The tool also supports filling in data preparation and data recovery data in Excel, ready for automated testing, and supports Excel-based assertions, so testers can customize assertions to their actual business data processing requirements. The specific test procedure is described below with reference to operations S210 to S240.
In operation S210, each database connection is configured into the data integration model according to the database configuration file.
In one example, so that testers can concentrate on combing the business data flows, the data relationships for accessing different databases are integrated into the data integration model; the test user does not need to care which database the data comes from and only needs to verify, against the business processing logic, that the service's data flows are correct. Before testing, database parameters are configured according to the database information required by the business data flow to generate a database configuration file, and the various types of databases are connected and configured into the data integration model according to that file. The specific process is shown in operations S211 to S213 of FIG. 3 and is not repeated here.
In operation S220, a business data processing flowchart is input to the data integration model to generate a target model definition file.
According to the embodiment of the disclosure, the target model definition file is used for representing data relations between the input and output of the service to be tested and each database table, and the business data processing flow chart is determined according to the data integration model and the business data processing flow information.
In one example, the business data processing flow chart is determined according to the data integration model and the actual business data processing flow. The flow chart is fed into the data integration model as input and translated to generate the target model definition file. The target model definition file integrates the data relationships between the service input/output and each database table, for example the data logic for querying, changing and deleting data in the databases; these data flows are related to the specific business processing logic of the service. The generation process of the target model definition file may refer to operations S221 to S223 shown in fig. 4, which are not described herein again.
In operation S230, at least one test script case is generated according to the object model definition file and the data integration model.
In one example, the automatic testing tool translates the target model definition file into executable test script cases as follows: taking the target model definition file as the input object, a test script Excel for the service and a Java file executed for the service are generated. The test script Excel is generated by operating Microsoft Excel through Apache POI, and the Java file is an executable script for the service generated through the Java input/output stream technology. The specific process may refer to operation S231 shown in fig. 5.
In operation S240, the test script case is executed to output a visual test result.
In one example, the test case script is executed, the test results are output, and the results are verified against the test scenario after the case script finishes. The tester fills in the query conditions and expected values in the database table page. The tool reads the assertion data with Apache POI and interprets the assertions, accesses the databases through JDBC to query, delete and change the database data, stores the judgment results into a color-marking object, records the script running results into Excel through Apache POI, marks a table field that passes an assertion green, and marks a table field that fails an assertion red.
Specifically, SQL statements are spliced, the processing results are stored in the color-marking object, and the script running results are recorded into Excel through Apache POI: an SQL statement that executes successfully is marked green, and one that fails is marked yellow. Part of the data in dataArray is synchronized to the Input or request object, the request object is obtained from Spring and the data is submitted to the service, and the service return result is stored in outputMap; the script running result is recorded into Excel through Apache POI, marked green if the output reaches the expected value and red otherwise. The tool then reads the assertion data with Apache POI and interprets the assertions, accesses the databases through JDBC to query, delete and change the database data, stores the judgment results into the color-marking object, records the script running results into Excel through Apache POI, marks a table field that passes an assertion green and one that fails red. Finally, the data recovery data is read with Apache POI and interpreted, the databases are accessed through JDBC to query, delete and change the database data, the running results are stored in the color-marking object, and the script running results are recorded into Excel through Apache POI.
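The pass/fail marking step above can be sketched in plain Java. This is a minimal illustration only: in the tool the actual row would come from a JDBC query and the "green"/"red" marks would become cell styles written with Apache POI; here plain maps stand in for both, and all names (AssertionMarker, mark) are hypothetical, not identifiers from the patent.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AssertionMarker {

    /**
     * Compare each expected field value against the value queried from the
     * database; return a field -> color map ("green" on match, "red" otherwise).
     */
    public static Map<String, String> mark(Map<String, String> expected,
                                           Map<String, String> actualRow) {
        Map<String, String> marks = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : expected.entrySet()) {
            String actual = actualRow.get(e.getKey());
            marks.put(e.getKey(), e.getValue().equals(actual) ? "green" : "red");
        }
        return marks;
    }

    public static void main(String[] args) {
        Map<String, String> expected = new LinkedHashMap<>();
        expected.put("field1", "A");
        expected.put("field3", "5");

        Map<String, String> actualRow = new LinkedHashMap<>();
        actualRow.put("field1", "A");
        actualRow.put("field3", "4"); // the service failed to update field3

        System.out.println(mark(expected, actualRow)); // {field1=green, field3=red}
    }
}
```

In the real tool the same comparison result also drives the result-file output described above.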
According to the testing method based on the data integration model described above, each database connection is configured into the data integration model through the database configuration file, the business data processing flow chart is input into the data integration model to generate the target model definition file, test script cases are generated from the target model definition file and the data integration model, and the test script cases are executed to output visual test results. Compared with the related art, this testing method does not require testers to be familiar with the operation of multiple databases and lets them focus on the circulation of business data, thereby reducing the workload of manual testing and improving test efficiency.
Fig. 3 schematically shows a flow chart of a database connection configuration method according to an embodiment of the present disclosure. As shown in fig. 3, operation S210 includes operations S211 to S213.
In operation S211, a data source connection attribute is determined according to the database definition XML file.
According to the embodiment of the disclosure, the data source connection attribute includes each database name, database connection information, and database login verification information.
In operation S212, an attribute value corresponding to the data source connection attribute is determined according to the attribute definition file.
In operation S213, each database connection is configured to the data integration model according to the attribute value corresponding to the data source connection attribute.
According to an embodiment of the present disclosure, the database configuration file includes a database definition XML file, an attribute definition file, and a route definition file.
According to an embodiment of the present disclosure, the databases include Oracle, MySQL, and DB2.
In one example, connecting each database involves two major components: configuring the data source connection attributes and the data source session factory. For the data source connection attributes: a large development enterprise is usually involved in multi-version testing and often needs to test services of different versions, which requires connecting to databases at different addresses, so the data source connection attributes flexibly read their values from an attribute definition file, with the variable names kept consistent with that file. For example: database driver: driverClassName defines the variable ${driverClassName}; database connection (database IP and port): url defines the variable ${PRDS_APP_ACCESS_1url}; database username: username defines the variable ${username}; database password: password defines the variable ${password}. Attributes of the database that are fixed and rarely modified may instead be configured with fixed parameter values; for example, the maximum number of active connections maxActive may be fixed at 50. The configurator sets the database connection parameters in the attribute definition file, for example: database driver driverClassName=com.mysql.jdbc.Driver, the database connection (database IP and port), database username username=appuser, database password password=app, and so on.
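The ${...} substitution described above can be sketched with the standard library. This is an illustrative sketch, not the patent's implementation: the template string and property names are assumed, and a real configuration would be an XML data-source definition resolved at load time.

```java
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderResolver {

    private static final Pattern VAR = Pattern.compile("\\$\\{([^}]+)}");

    /**
     * Replace every ${name} in the template with the matching value from the
     * attribute definition (properties) file; unresolved names are kept as-is.
     */
    public static String resolve(String template, Properties props) {
        Matcher m = VAR.matcher(template);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String value = props.getProperty(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("driverClassName", "com.mysql.jdbc.Driver");
        props.setProperty("username", "appuser");
        props.setProperty("password", "app");

        String template = "driver=${driverClassName};user=${username};pwd=${password}";
        System.out.println(resolve(template, props));
        // driver=com.mysql.jdbc.Driver;user=appuser;pwd=app
    }
}
```

Variables such as ${PRDS_APP_ACCESS_1url} that differ per test version would simply be different entries in the properties file.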
The route definition file is mainly used for the sub-SET (slice) data access mode of current distributed databases: according to the slicing rule defined by the system, access can be routed to the corresponding SET (slice) for the data operation. Current distributed databases, such as a distributed MySQL database, support splitting the data of one table into several SETs (slices) for storage, so as to mitigate the access congestion and low access efficiency that centralized access to a centralized database easily produces. For example, when a banking system stores data, information about the same customer tends to be stored in the database of the same SET (slice), so that services can be provided continuously. Customer-related services are commonly routed by the customer information number m, and the routing rule generally uses a modulo operation. If the system database is divided into 8 SETs (slices), the operation m % 8 can be used, and the result of the operation designates the target SET (slice).
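The modulo routing rule above is a one-liner; the sketch below makes it concrete. The class and method names are illustrative, and the 8-slice layout is just the example from the text.

```java
public class SetRouter {

    private static final int SET_COUNT = 8; // number of SETs (slices) in the example

    /** Route a numeric customer information number m to its target SET: m % 8. */
    public static int route(long customerNumber) {
        return (int) (customerNumber % SET_COUNT);
    }

    public static void main(String[] args) {
        // All records of the same customer land on the same slice,
        // so the service can keep serving that customer from one SET.
        System.out.println(route(6222021234567890L)); // 2
        System.out.println(route(6222021234567890L) == route(6222021234567890L)); // true
    }
}
```

A route definition file would map each computed slice number to the connection parameters of that SET's database.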
According to the configuration information in the database configuration file, the tool can automatically connect to the various types of databases, such as Oracle, MySQL, DB2 or other types.
Fig. 4a schematically shows a business data processing flow chart according to an embodiment of the present disclosure, and fig. 4b schematically shows a flow chart of a method for generating an object model definition file according to an embodiment of the present disclosure. As shown in fig. 4b, operation S220 includes operations S221 through S223.
In operation S221, an operation identifier in the service data processing flowchart is identified.
In operation S222, a data relationship between the service input and output field to be tested and each database table field is determined according to the operation identifier.
In operation S223, a target model definition file is generated according to the data relationship and the database data operation analysis in the data integration model.
According to an embodiment of the present disclosure, the object model definition file is in an XML format.
In one example, as shown in fig. 4a, the operation identifiers include "=" and "+", and they characterize the data relationships between the input and output fields of the service to be tested and each database table field: "=" characterizes a correspondence between fields of two tables, and "+" characterizes a logical operation between fields. For example, S1 in fig. 4a indicates that field 1 in the service input communication area is equal to field 1 of table 1 from the MySql database; S2 indicates that the sum of field 2 and field 3 in the service input communication area is equal to field 2 of table 2 from the Oracle database; S3 indicates that field 3 of table 1 from the MySql database is equal to field 3 of table 2 from the Oracle database; S4 indicates that field 3 in the service output communication area is equal to field 3 of table n from the DB2 database.
The database data operations defined in the data integration model (including query, change and deletion), together with the defined relationships among the database data, are taken as input, analyzed by the tool and converted into an XML model definition file for output; that is, the graph in the model is converted into definitions in XML format. For example: a whole table definition is described with resultMap, each data field in the table is described with column, a database query is represented as fetch_XXXtable_XXXfield, a database change as update_XXXtable_XXXfield, a database deletion as delete_XXXtable_XXXfield, and so on.
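As a concrete illustration, a fragment of such an XML model definition file might look as follows. This fragment is hypothetical: the text names the resultMap/column elements and the fetch_/update_/delete_ naming convention, but does not publish a full schema, so the attributes shown here are assumptions.

```xml
<!-- Hypothetical model definition fragment following the naming
     convention described above (resultMap, column, fetch_/update_/delete_). -->
<resultMap id="table1" database="MySql">
  <column name="field1"/>
  <column name="field2"/>
  <column name="field3"/>
</resultMap>

<fetch_table1_field1 condition="field1 = 'A'"/>
<update_table2_field3 condition="field1 = 'A'" value="5"/>
<delete_tablen_field1 condition="field1 = 'A' and field2 = 'B'"/>
```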
Fig. 5a is one of flowcharts of a method for generating a test script case provided according to an embodiment of the present disclosure. Fig. 5b is a second flowchart of a method for generating a test script case according to an embodiment of the disclosure. Fig. 6a schematically shows a data operation diagram of a data preparation sub-model according to an embodiment of the present disclosure, fig. 6b schematically shows a data operation diagram of a data assertion sub-model according to an embodiment of the present disclosure, and fig. 6c schematically shows a data operation diagram of a data recovery sub-model according to an embodiment of the present disclosure.
As shown in fig. 5a, operation S230 includes operation S231.
In operation S231, a test script Excel file and a test script execution Java file are generated according to the target model definition file and the data integration model.
According to an embodiment of the present disclosure, a data integration model includes a data preparation submodel, a data assertion submodel, and a data recovery submodel.
In one example, the test script Excel is generated by taking the XML model definition file as input and operating Microsoft Excel through Apache POI; a tester can edit this Excel to control whether a case is executed, executed locally, or executed automatically (run every day by the server). The test script execution Java file takes the XML model definition file as the input object and is generated through the Java input/output stream technology; it is an executable Java file that takes the test script Excel as its input, and it is also the executable Java file with which the tool implements case execution and automatic execution.
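The code-generation step can be sketched with the plain Java file I/O APIs mentioned above. The generated class body here is a trivial stand-in: the real tool would emit TestNG test methods derived from the XML model definition file, and every name in this sketch (ScriptGenerator, generateSource, TransferService) is illustrative.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ScriptGenerator {

    /** Build the source text of a generated test-execution class for a service. */
    public static String generateSource(String serviceName) {
        return "public class " + serviceName + "Test {\n"
             + "    // Generated entry point: read the test script Excel and run each case.\n"
             + "    public static void main(String[] args) {\n"
             + "        System.out.println(\"running cases for " + serviceName + "\");\n"
             + "    }\n"
             + "}\n";
    }

    public static void main(String[] args) throws IOException {
        String source = generateSource("TransferService");
        Path out = Path.of("TransferServiceTest.java");
        Files.writeString(out, source); // write the generated script to disk
        System.out.println(Files.readString(out).contains("class TransferServiceTest")); // true
    }
}
```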
Specifically, as shown in fig. 5b, the operation S231 includes operations S2311 through S2314.
In operation S2311, a data preparation test script Excel file is generated according to the object model definition file and the data preparation sub-model.
In one example, as shown in FIG. 6a, the data operations described by the data preparation sub-model are as follows: on table 1 from the MySql database, a query data operation is performed: before the service is executed, table 1 is queried according to the conditions that field 1 is equal to A and field 2 is equal to B, and the query result is stored in a databean for comparison by the subsequent assertion model. On table 2 from the Oracle database, a change data operation is performed: before the service is executed, table 2 is queried with field 1 equal to A and field 3 is updated to 5. On table n from the DB2 database, a delete data operation is performed: before the service is executed, the database records of table n are deleted according to field 1 equal to A, field 2 equal to B and field 3 equal to C.
Before the service is executed, the tool executes the data preparation test script Excel file generated by the data preparation sub-model to complete data preparation and data cleaning. Each piece of database operation information is automatically filled into the data preparation column of the test script Excel, for example: "table XXX | query condition 1 | query", indicating that table XXX is queried according to query condition 1, consistent with the description in the data preparation model. These database operations are performed before service execution. Data interpretation is performed according to the type of the data preparation request (query, update, deletion) and the type of the database (MySql, DB2, Oracle, etc.), SQL is spliced to perform the corresponding database operation, and the queried database data is stored into the databean for comparison by the later assertion model.
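The SQL-splicing step above can be sketched as follows. This is a simplified illustration under assumed names (SqlSplicer, splice): a production tool would bind values through JDBC PreparedStatement parameters rather than splicing literals into the statement.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class SqlSplicer {

    /** Join "field = 'value'" pairs with AND for the WHERE clause. */
    private static String where(Map<String, String> conditions) {
        return conditions.entrySet().stream()
                .map(e -> e.getKey() + " = '" + e.getValue() + "'")
                .collect(Collectors.joining(" and "));
    }

    /** Splice an SQL statement from the request type, table and field conditions. */
    public static String splice(String type, String table,
                                Map<String, String> conditions) {
        switch (type) {
            case "query":  return "select * from " + table + " where " + where(conditions);
            case "delete": return "delete from " + table + " where " + where(conditions);
            default: throw new IllegalArgumentException("unsupported type: " + type);
        }
    }

    public static void main(String[] args) {
        Map<String, String> cond = new LinkedHashMap<>();
        cond.put("field1", "A");
        cond.put("field2", "B");
        System.out.println(splice("query", "table1", cond));
        // select * from table1 where field1 = 'A' and field2 = 'B'
    }
}
```

The update case and any per-database dialect differences (MySql, DB2, Oracle) are omitted here for brevity.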
In operation S2312, a data assertion test script Excel file is generated according to the target model definition file and the data assertion sub-model.
As shown in fig. 6b, the data comparison relationships described by the data assertion sub-model are as follows: field 1 in the service input communication area is equal to field 1 of table 1 from the MySql database; the sum of field 2 and field 3 in the service input communication area is equal to field 2 of table 2 from the Oracle database; field 3 of table 1 from the MySql database is equal to field 3 of table 2 from the Oracle database; field 3 in the service output communication area is equal to field 3 of table n from the DB2 database. After the service is executed, the tool executes the data assertion test script Excel file generated by the data assertion sub-model to complete the automatic data check and ensure that the data updates are correct. Each piece of database operation information is automatically filled into the assertion column of the test script Excel, for example: "table XXX | query condition 1 | expected value 1", indicating that table XXX is queried according to query condition 1 and the query result is compared with the expected value described in the data assertion model; the comparison result is written to the result file as output. These database operations are performed after the service is executed. Data interpretation is performed according to the type of the request (query, update, deletion, etc.) and the type of the database (MySql, DB2, Oracle, etc.), and SQL is spliced to perform the corresponding database operation and compare the result with the target database data.
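The four comparison relationships above can be sketched as a single check, with plain maps standing in for the service communication areas and the queried database rows. The "+" identifier is evaluated as an arithmetic sum; all names are illustrative, not the patent's identifiers.

```java
import java.util.Map;

public class RelationChecker {

    /** Evaluate the four relationships S1-S4 described by the assertion sub-model. */
    public static boolean check(Map<String, Integer> serviceIn,
                                Map<String, Integer> serviceOut,
                                Map<String, Integer> mysqlTable1,
                                Map<String, Integer> oracleTable2,
                                Map<String, Integer> db2TableN) {
        boolean s1 = serviceIn.get("field1").equals(mysqlTable1.get("field1"));
        boolean s2 = serviceIn.get("field2") + serviceIn.get("field3")
                     == oracleTable2.get("field2");                  // "+" identifier
        boolean s3 = mysqlTable1.get("field3").equals(oracleTable2.get("field3"));
        boolean s4 = serviceOut.get("field3").equals(db2TableN.get("field3"));
        return s1 && s2 && s3 && s4;
    }

    public static void main(String[] args) {
        boolean ok = check(
            Map.of("field1", 1, "field2", 2, "field3", 3),   // service input area
            Map.of("field3", 9),                              // service output area
            Map.of("field1", 1, "field3", 9),                 // MySql table 1 row
            Map.of("field2", 5, "field3", 9),                 // Oracle table 2 row
            Map.of("field3", 9));                             // DB2 table n row
        System.out.println(ok); // true
    }
}
```

In the tool, each of the four booleans would become a green/red mark on the corresponding field in the result Excel rather than a single combined result.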
In operation S2313, a data recovery test script Excel file is generated according to the target model definition file and the data recovery sub-model.
In one example, as shown in FIG. 6c, the data operations described by the data recovery sub-model are as follows: on table 1 from the MySql database, a change data operation is performed: table 1 is queried according to the conditions that field 1 is equal to A and field 2 is equal to B, and field 3 is updated to 3; on table 2 from the Oracle database, a change data operation is performed: table 2 is queried according to field 1 equal to A, and field 3 is updated to 5; on table n from the DB2 database, a delete data operation is performed: the database records of table n are deleted according to field 1 equal to A, field 2 equal to B and field 3 equal to C. After the service is executed, the tool executes the data recovery script generated by the data recovery sub-model to complete automatic data recovery. Each piece of database operation information is automatically filled into the data recovery column of the test script Excel, for example: "table XXX | query condition 1 | update 1", indicating that table XXX is queried according to query condition 1 and the queried database records are updated with the values described in update 1. These database operations are performed after the service is executed. Data interpretation is performed according to the type of the request (update, deletion) and the type of the database (MySql, DB2, Oracle), and SQL is spliced to perform the corresponding database operation, realizing automatic data cleaning.
In operation S2314, a test script Excel file is generated according to the data preparation test script Excel file, the data assertion test script Excel file, and the data recovery test script Excel file.
In one example, the three Excel files are combined into a single test script Excel file according to the specific test scenario. This Excel file contains a plurality of test cases, covering the three types of test scripts: data preparation, data assertion and data recovery.
Based on the test method based on the data integration model described above, the present disclosure also provides a testing apparatus based on the data integration model. The apparatus will be described in detail below with reference to fig. 7.
Fig. 7 schematically shows a block diagram of a test apparatus based on a data integration model according to an embodiment of the present disclosure.
As shown in fig. 7, the testing apparatus 800 based on the data integration model of this embodiment includes a database configuration module 810, a first generation module 820, a second generation module 830, and an execution module 840.
The database configuration module 810 is configured to configure each database connection into the data integration model according to the database configuration file. In an embodiment, the database configuration module 810 may be configured to perform the operation S210 described above, which is not described herein again.
The first generation module 820 is used for inputting the business data processing flow chart into the data integration model to generate a target model definition file. In an embodiment, the first generation module 820 may be configured to perform the operation S220 described above, which is not described herein again.
The second generating module 830 is configured to generate at least one test script case according to the object model definition file and the data integration model. In an embodiment, the second generating module 830 may be configured to perform the operation S230 described above, and is not described herein again.
The execution module 840 is configured to execute the test script case to output a visual test result. In an embodiment, the execution module 840 may be configured to execute the operation S240 described above, which is not described herein again.
According to an embodiment of the present disclosure, the first generating module 820 includes: the device comprises an identification submodule, a first determination submodule and a first generation submodule.
And the identification submodule is used for identifying the operation identifier in the service data processing flow chart. In an embodiment, the identifying sub-module may be configured to perform the operation S221 described above, which is not described herein again.
And the first determining submodule is used for determining the data relationship between the service input and output fields to be tested and each database table field according to the operation identifier. In an embodiment, the first determining submodule may be configured to perform operation S222 described above, and details are not repeated here.
and the first generation submodule is used for generating a target model definition file according to the data relation and the database data operation analysis in the data integration model, and the target model definition file is in an XML format. In an embodiment, the first generation submodule may be configured to perform operation S223 described above, and is not described herein again.
According to an embodiment of the present disclosure, the second generating module includes: and a second generation submodule.
And the second generation sub-module is used for generating a test script Excel file and a test script execution Java file according to the target model definition file and the data integration model. In an embodiment, the second generating submodule may be configured to perform the operation S231 described above, and details are not described herein again.
According to an embodiment of the present disclosure, the second generation submodule includes: a first generation unit, a second generation unit, a third generation unit, and a fourth generation unit.
And the first generating unit is used for generating a data preparation test script Excel file according to the target model definition file and the data preparation sub-model. In an embodiment, the first generating unit may be configured to perform the operation S2311 described above, which is not described herein again.
And the second generating unit is used for generating a data assertion test script Excel file according to the target model definition file and the data assertion sub-model. In an embodiment, the second generating unit may be configured to perform the operation S2312 described above, which is not described herein again.
And the third generating unit is used for generating a data recovery test script Excel file according to the target model definition file and the data recovery sub-model. In an embodiment, the third generating unit may be configured to perform the operation S2313 described above, which is not described herein again.
And the fourth generating unit is used for generating a test script Excel file according to the data preparation test script Excel file, the data assertion test script Excel file and the data recovery test script Excel file. In an embodiment, the fourth generating unit may be configured to perform the operation S2314 described above, which is not described herein again.
According to an embodiment of the present disclosure, a database configuration module includes: a second determination submodule, a third determination submodule and a connection configuration submodule.
And the second determining submodule is used for determining a data source connection attribute according to the database definition XML file, wherein the data source connection attribute comprises each database name, database connection information and database login verification information. In an embodiment, the second determining submodule may be configured to perform operation S211 described above, which is not described herein again.
And the third determining submodule is used for determining an attribute value corresponding to the data source connection attribute according to the attribute definition file. In an embodiment, the third determining submodule may be configured to perform the operation S212 described above, and details are not repeated herein.
And the connection configuration submodule is used for connecting and configuring each database to the data integration model according to the attribute value corresponding to the data source connection attribute. In an embodiment, the connection configuration sub-module may be configured to perform the operation S213 described above, and will not be described herein again.
According to an embodiment of the present disclosure, any plurality of the database configuration module 810, the first generation module 820, the second generation module 830, and the execution module 840 may be combined into one module to be implemented, or any one of the modules may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the database configuration module 810, the first generation module 820, the second generation module 830, and the execution module 840 may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or in any one of three implementations of software, hardware, and firmware, or in any suitable combination of any of them. Alternatively, at least one of the database configuration module 810, the first generation module 820, the second generation module 830 and the execution module 840 may be at least partially implemented as a computer program module, which when executed, may perform a corresponding function.
FIG. 8 schematically illustrates a block diagram of an electronic device suitable for implementing a data integration model-based testing method according to an embodiment of the present disclosure.
As shown in fig. 8, an electronic apparatus 900 according to an embodiment of the present disclosure includes a processor 901 which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage portion 908 into a Random Access Memory (RAM) 903. Processor 901 may comprise, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 901 may also include on-board memory for caching purposes. The processor 901 may comprise a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
In the RAM 903, various programs and data necessary for the operation of the electronic apparatus 900 are stored. The processor 901, ROM 902, and RAM 903 are connected to each other by a bus 904. The processor 901 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 902 and/or the RAM 903. Note that the programs may also be stored in one or more memories other than the ROM 902 and the RAM 903. The processor 901 may also perform various operations of the method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, electronic device 900 may also include an input/output (I/O) interface 905, which is also connected to bus 904. The electronic device 900 may also include one or more of the following components connected to the I/O interface 905: an input portion 906 including a keyboard, a mouse, and the like; an output section 907 including components such as a cathode ray tube (CRT) or liquid crystal display (LCD), and a speaker; a storage portion 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card or a modem. The communication section 909 performs communication processing via a network such as the internet. A drive 910 is also connected to the I/O interface 905 as needed. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on the drive 910 as necessary, so that a computer program read out therefrom is installed into the storage section 908 as needed.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer readable storage medium carries one or more programs which, when executed, implement a data integration model-based testing method according to an embodiment of the present disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, a computer-readable storage medium may include the ROM 902 and/or the RAM 903 described above and/or one or more memories other than the ROM 902 and the RAM 903.
Embodiments of the present disclosure also include a computer program product comprising a computer program containing program code for performing the method illustrated in the flow chart. When the computer program product runs in a computer system, the program code is used for causing the computer system to realize the test method based on the data integration model provided by the embodiment of the disclosure.
The computer program performs the above-described functions defined in the system/apparatus of the embodiments of the present disclosure when executed by the processor 901. The above described systems, devices, modules, units, etc. may be implemented by computer program modules according to embodiments of the present disclosure.
In one embodiment, the computer program may be hosted on a tangible storage medium such as an optical storage device, a magnetic storage device, or the like. In another embodiment, the computer program may also be transmitted, distributed in the form of a signal on a network medium, and downloaded and installed through the communication section 909 and/or installed from the removable medium 911. The computer program containing program code may be transmitted using any suitable network medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
In accordance with embodiments of the present disclosure, program code for carrying out the computer programs provided by embodiments of the present disclosure may be written in any combination of one or more programming languages; in particular, these computer programs may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. The programming languages include, but are not limited to, Java, C++, Python, the C language, and the like. The program code may execute entirely on the user computing device, partly on the user device, partly on a remote computing device, or entirely on the remote computing device or server. In scenarios involving a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations are not expressly recited in the present disclosure. In particular, various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
The embodiments of the present disclosure are described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used in advantageous combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.

Claims (10)

1. A test method based on a data integration model is characterized by comprising the following steps:
connecting and configuring each database into a data integration model according to the database configuration file;
inputting a business data processing flow chart into the data integration model to generate a target model definition file, wherein the target model definition file is used for representing data relations between inputs and outputs of a service to be tested and fields of each database table, and the business data processing flow chart is determined according to the data integration model and business data processing flow information;
generating at least one test script case according to the target model definition file and the data integration model; and
executing the test script case to output a visualized test result.
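The four claimed steps can be sketched as a minimal, self-contained Python stand-in. All class and function names here are hypothetical (the claim does not publish an implementation), and real database connections and a real flow chart parser are replaced by plain dictionaries:

```python
class DataIntegrationModel:
    """Minimal stand-in for the claimed data integration model."""

    def __init__(self):
        self.databases = {}

    def connect(self, cfg):
        # Step 1: register a database connection described by its config.
        self.databases[cfg["name"]] = cfg

    def derive_definition(self, flow_chart):
        # Step 2: map each flow-chart operation to the table fields it touches
        # (a toy version of the target model definition file).
        return {step["op"]: step["fields"] for step in flow_chart}

    def generate_cases(self, definition):
        # Step 3: generate one executable case per operation in the definition.
        return [lambda op=op, fields=fields: (op, len(fields))
                for op, fields in definition.items()]


def run_model_based_test(model, db_configs, flow_chart):
    for cfg in db_configs:
        model.connect(cfg)
    definition = model.derive_definition(flow_chart)
    cases = model.generate_cases(definition)
    # Step 4: execute every generated case and return the raw results,
    # which a reporting layer could then render as a visualized test result.
    return [case() for case in cases]


results = run_model_based_test(
    DataIntegrationModel(),
    [{"name": "orders_db"}],
    [{"op": "insert", "fields": ["id", "amount"]}],
)
```

The sketch only preserves the claimed control flow: configure databases, derive a definition from the flow chart, generate cases, execute them.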
2. The test method of claim 1, wherein inputting the business data processing flow chart into the data integration model to generate a target model definition file comprises:
identifying an operation identifier in the business data processing flow chart;
determining data relations between input and output fields of the service to be tested and fields of each database table according to the operation identifier; and
generating the target model definition file according to the data relations and database data operation analysis in the data integration model, wherein the target model definition file is in an XML format.
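As a rough illustration of the XML-format target model definition file that claim 2 describes, the following sketch serializes hypothetical service-field-to-table-field relations with Python's standard `xml.etree.ElementTree`. The element and attribute names (`modelDefinition`, `relation`, `serviceField`) are invented for illustration only:

```python
import xml.etree.ElementTree as ET


def build_target_model_definition(relations):
    """Serialize field-to-table relations as a toy XML model definition.

    `relations` maps a service input/output field to a (table, column)
    pair; the XML vocabulary is hypothetical, not taken from the patent.
    """
    root = ET.Element("modelDefinition")
    for field, (table, column) in relations.items():
        rel = ET.SubElement(root, "relation")
        rel.set("serviceField", field)
        rel.set("table", table)
        rel.set("column", column)
    return ET.tostring(root, encoding="unicode")


xml_text = build_target_model_definition(
    {"txn_amount": ("T_ORDER", "AMOUNT")}
)
```

A downstream generator would then read such a file back to decide which tables and columns each test case must prepare and assert on.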
3. The test method of claim 1, wherein generating at least one test script case according to the target model definition file and the data integration model comprises:
generating a test script Excel file and a test script execution Java file according to the target model definition file and the data integration model.
4. The testing method of claim 3, wherein the data integration model comprises a data preparation sub-model, a data assertion sub-model, and a data restoration sub-model, and wherein generating a test script Excel file from the target model definition file and the data integration model comprises:
generating a data preparation test script Excel file according to the target model definition file and the data preparation sub-model;
generating a data assertion test script Excel file according to the target model definition file and the data assertion sub-model;
generating a data restoration test script Excel file according to the target model definition file and the data restoration sub-model; and
generating the test script Excel file according to the data preparation test script Excel file, the data assertion test script Excel file, and the data restoration test script Excel file.
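A minimal sketch of claim 4's three-sub-model flow, assuming each sub-model is a renderer over the shared definition. Plain Python dicts of row lists stand in for real Excel sheets (a real implementation might use a library such as openpyxl), and all names are hypothetical:

```python
def generate_script_sheets(definition, sub_models):
    """Build one 'sheet' of script rows per sub-model, then merge them.

    `definition` is a toy target model definition (operation -> fields);
    `sub_models` maps a sub-model name to a row-rendering function. The
    combined result imitates the merged test script Excel file.
    """
    sheets = {}
    for name, render in sub_models.items():
        # Each sub-model renders the same shared definition into its own rows.
        sheets[name] = [render(op, fields) for op, fields in definition.items()]
    # The combined "workbook" simply carries all three sheets.
    return sheets


definition = {"insert": ["id", "amount"]}
sub_models = {
    "data_preparation": lambda op, f: f"prepare {op}:{','.join(f)}",
    "data_assertion":   lambda op, f: f"assert {op}:{','.join(f)}",
    "data_restoration": lambda op, f: f"restore {op}:{','.join(f)}",
}
workbook = generate_script_sheets(definition, sub_models)
```

The point of the structure is that preparation, assertion, and restoration scripts all derive from one definition, so they stay consistent with each other.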
5. The test method of claim 1, wherein the database configuration file comprises a database definition XML file, an attribute definition file, and a routing definition file, and wherein connecting and configuring each database into the data integration model according to the database configuration file comprises:
determining data source connection attributes according to the database definition XML file, wherein the data source connection attributes comprise database names, database connection information and database login verification information;
determining an attribute value corresponding to the data source connection attribute according to the attribute definition file; and
connecting and configuring each database into the data integration model according to the attribute value corresponding to the data source connection attribute.
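Claim 5's resolution of data source connection attributes can be illustrated as follows, with a hypothetical database definition XML and an attribute-override dictionary standing in for the attribute definition file; `sqlite3` stands in for the Oracle/MySQL/DB2 drivers the claims contemplate:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical database definition XML of the kind claim 5 describes:
# each <database> carries a name, connection info, and login credentials.
DB_DEFINITION_XML = """
<databases>
  <database name="orders_db" url=":memory:" user="tester" password="secret"/>
</databases>
"""


def load_connection_attributes(xml_text, attribute_overrides):
    """Resolve each data source's attributes, applying attribute-file overrides."""
    sources = []
    for db in ET.fromstring(xml_text).iter("database"):
        attrs = dict(db.attrib)
        # Values from the attribute definition file win over the XML defaults.
        attrs.update(attribute_overrides.get(attrs["name"], {}))
        sources.append(attrs)
    return sources


def connect_all(sources):
    """Open one connection per source (sqlite3 stands in for real drivers)."""
    return {s["name"]: sqlite3.connect(s["url"]) for s in sources}


sources = load_connection_attributes(
    DB_DEFINITION_XML, {"orders_db": {"user": "ci"}}
)
connections = connect_all(sources)
```

Separating the XML defaults from the attribute file mirrors the claim's split between the database definition XML file and the attribute definition file: the same definition can be reused across environments by swapping only the attribute values.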
6. The test method according to any one of claims 1 to 5, wherein each database comprises Oracle, MySQL, and DB2.
7. A testing apparatus based on a data integration model, the testing apparatus comprising:
the database configuration module is used for connecting and configuring each database into the data integration model according to the database configuration file;
the first generation module is used for inputting a business data processing flow chart into the data integration model to generate a target model definition file, wherein the target model definition file is used for representing data relations between inputs and outputs of the service to be tested and fields of each database table, and the business data processing flow chart is determined according to the data integration model and business data processing flow information;
the second generation module is used for generating at least one test script case according to the target model definition file and the data integration model; and
the execution module is used for executing the test script case to output a visualized test result.
8. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-6.
9. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program which, when executed by a processor, carries out the method according to any one of claims 1 to 6.
CN202210745096.9A 2022-06-27 2022-06-27 Test method and device based on data integration model Pending CN115203027A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210745096.9A CN115203027A (en) 2022-06-27 2022-06-27 Test method and device based on data integration model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210745096.9A CN115203027A (en) 2022-06-27 2022-06-27 Test method and device based on data integration model

Publications (1)

Publication Number Publication Date
CN115203027A true CN115203027A (en) 2022-10-18

Family

ID=83577390

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210745096.9A Pending CN115203027A (en) 2022-06-27 2022-06-27 Test method and device based on data integration model

Country Status (1)

Country Link
CN (1) CN115203027A (en)

Similar Documents

Publication Publication Date Title
US11669503B2 (en) Building and managing data-processing attributes for modeled data sources
US8712965B2 (en) Dynamic report mapping apparatus to physical data source when creating report definitions for information technology service management reporting for peruse of report definition transparency and reuse
US8332806B2 (en) Stepwise template integration method and system
US9251222B2 (en) Abstracted dynamic report definition generation for use within information technology infrastructure
US10445675B2 (en) Confirming enforcement of business rules specified in a data access tier of a multi-tier application
US9619537B2 (en) Converting data objects from single- to multi-source database environment
CN111949543B (en) Test method and device based on distributed platform, electronic equipment and storage medium
CN111125064B (en) Method and device for generating database schema definition statement
US9330140B1 (en) Transient virtual single tenant queries in a multi-tenant shared database system
US11829814B2 (en) Resolving data location for queries in a multi-system instance landscape
US9971794B2 (en) Converting data objects from multi- to single-source database environment
CN114281803A (en) Data migration method, device, equipment, medium and program product
CN114116678A (en) Data migration method, device, equipment, medium and program product
CN116450107B (en) Method and device for secondary development of software by low-code platform and electronic equipment
US11615061B1 (en) Evaluating workload for database migration recommendations
US20220382791A1 (en) Executing services across multiple trusted domains for data analysis
CN115203027A (en) Test method and device based on data integration model
US20230195792A1 (en) Database management methods and associated apparatus
CN117009397A (en) Data query method, data query device, electronic equipment and storage medium
CN113504904A (en) User-defined function implementation method and device, computer equipment and storage medium
US11888937B2 (en) Domain specific provider contracts for core data services
CN116755684B (en) OAS Schema generation method, device, equipment and medium
US11755591B2 (en) Metadata object identifier registry
US20240143487A1 (en) Secure testing of attachment functionality of objects
US20240126759A1 (en) Converting an api into a graph api

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination