GB2404041A - System for data integration - Google Patents

System for data integration

Info

Publication number
GB2404041A
GB2404041A
Authority
GB
United Kingdom
Prior art keywords
data
source
properties
input
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0316922A
Other versions
GB0316922D0 (en)
Inventor
Nigel Russell Pickering
Christopher Alan Sawyer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OAKSOFT Ltd
Original Assignee
OAKSOFT Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OAKSOFT Ltd filed Critical OAKSOFT Ltd
Priority to GB0316922A priority Critical patent/GB2404041A/en
Publication of GB0316922D0 publication Critical patent/GB0316922D0/en
Publication of GB2404041A publication Critical patent/GB2404041A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25 Integrating or interfacing systems involving database management systems

Abstract

A system for data integration in providing a target model is described. The system comprises an adaptable user driven interface for defining an integrated input from a respective one of a plurality of sources, the model being formed directly from the data input and the properties defined for each input. The target model is "soft", formed directly as a result of the data input and the properties or mappings defined for each input.

Description

A SYSTEM FOR DATA INTEGRATION
The invention relates to a system for data integration, particularly for providing a target model.
A prior approach to data integration requires the determination of a (common) target data model fit for a specific business use and then the exercise of mapping subsets of each of the predefined sources specifically into it. This method tends to be expensive and by definition implies a semi-rigid result.
Using the known or traditional approach illustrated in Fig. 1 as a flow chart, many large companies at some point during the business lifecycle need to create a cleaned-up superset of data currently held in disparate and non-linked systems. This typically requires the following activities:
• Sources need to be defined and detailed source analysis completed
• Target requirements, and hence the target data model, need to be defined
• Quality of source data needs to be analysed and cleansed as necessary
• Transformation requirements need to be determined for each source
• Determination of which source prevails when data fields are common between sources (eg fields 'overlap') and their values 'conflict'; this is often a qualitative process based on local expertise, not an automated exercise
• Processes need to be established to filter out de-selected data
• Duplicated entries need to be identified and resolved
• Building the new database and defining the broadcast mechanisms back to the original source systems
• Decisions on what to do with 'filtered-out' data need to be resolved
Typically, reconciliation processes are necessary before the new database can directly update any individual source.
There are associated risks:
• Dealing with the number of 'variables'
• The duration of the process: it can take many weeks/months depending on the number of sources and data items involved
• Avoiding the creation of a semi-rigid design requiring ongoing redesign as needs become apparent
Where there is a requirement to be 'flexible' or there is some uncertainty in requirements, for example in the source data components shown in Fig. 2, all questions need to be determined and resolved before a traditional solution (flow diagram of Fig. 1) can be designed.
For these reasons, a project integrating 2-3 sources often requires contribution from 3 skilled FTEs and lasts for 4-6 elapsed months (a total of 12-24 man-months).
According to the invention there is provided a system for data integration in providing a target model, comprising an adaptable user driven interface for defining an integrated input from a respective one of a plurality of sources and directly to form the model from data input and properties defined for each input.
Using the invention the target model is 'soft', being formed directly as the result of the data input and the properties (or mappings) defined for each input. This approach of 'adaptable data modelling', fundamentally different from the known systems, delivers significant savings in project effort.
Moreover, the invention also provides an industrial-strength and highly scalable data integration tool, which allows business analysts to easily and efficiently manipulate any form of data into a unified and consolidated target data model or 'golden copy'. It will be understood that the term 'integration' used herein includes all activities necessary to deliver high quality data from two or more sources, eg: transformation, mapping, cleansing, de-duplication, enrichment, conflict resolution, etc. The sources each may comprise a differently named set of data originating from a source system.
Each source may comprise a relative weighting whereby to provide for conflict resolution.
Moreover each source may comprise discrete source properties, which may themselves
define a field in a source row.
There may be a plurality of additional source fields formed by concatenation and sub-division of supplied source fields.
Each source may be mappable to provide a target cluster field. Thus if two or more fields from different sources are mapped to the same target field, then comparison can occur.
Clusters of data may be characterized by source rows that have substantially the same value in a cluster key field, whereby one source row with multiple cluster key fields designated may exist in a plurality of clusters. Thus clusters may be maintained for each
source field value.
The integrated data may comprise an exported flat file, without any standardization from a source.
There may be a user interface adapted for viewing of data, manipulating properties which define each input, defining integration rules and providing the required views.
The integrated data may be adapted for viewing immediate outputs.
The immediate outputs may comprise one, some, or all of a target data model, metrics on data quality, a process tangibly to clean data discrepancies, and provision of an authoritative data base.
The tangible clean data discrepancies may comprise cleaning in bulk or one-by-one, which one-by-one step may provide update information direct to source systems.
The properties may be redefinable to accommodate actual data, which redefined properties may be perfected to suit the actual data content, and the data may be cleansed within a continuous feedback process.
Each target field within a cluster may have mapped into it a plurality of source fields.
Each source field may be weighted, whereby to automate selection of a correct value, which value can be outputted for return to an appropriate source. Corrections may be presented on-line or extractable for automatic updating of a source.
As a result of this approach the actual data quality and integrated results are presented almost immediately allowing Operational users, rather than skilled IT resources, to efficiently conduct cleansing activities to build and use the authoritative or 'golden copy' as a reference source.
A system embodying the invention can deliver significant benefits over traditional methods of integrating data. The system is described below with reference to Fig. 3.
Thus this radical methodology delivers data quality more efficiently and for far less cost than traditional alternatives. Experience and case studies indicate 50-80% savings in project cost.
There are four principal business objectives that can be identified as critical to today's financial businesses, shown as results output in Fig. 3:
1. Cost savings (with short-term payback)
2. Enhanced exposure & risk control and delivering new Regulatory Reporting objectives (Basel II)
3. 'Clean' and quality data on client and product data (KYC)
4. Operational efficiency and process quality (STP)
Using the invention it is possible to solve many of the problems in achieving these objectives. Thus a system embodying the invention is estimated to save up to 80% of the effort on any data integration project, and as such could represent the difference between success and failure for a financial business.
Referring now to systems embodying the invention, a radically new approach to the solution is provided and results are produced much quicker.
Thus to integrate data with a system of the invention, the actions necessary are:
• Export flat files, without any standardization, from each source system identified as being a contributor
• Use the user interface (GUI) to view the data, and manipulate the Properties which define each input, the integration rules and the required views
• View the immediate outputs, comprising:
  o A target data model comprising the superset of data actually input and mapped
  o Metrics on data quality (see the sketch following this list)
  o A process to tangibly clean data discrepancies, in bulk or 'one-by-one', with an option to provide update information direct to source systems
  o A 'golden copy' database which can be queried as the 'authority'
Analysts can establish the Properties in a few minutes to define each input and view the integrated results of Fig. 3. If necessary, Properties can be refined to suit the actual data, and thereafter cleansing activity can commence within a continuous feedback process shown in Fig. 4.
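By way of illustration only, the following sketch shows one way the 'metrics on data quality' output might be derived, counting, for each target field, the clusters in which all contributing sources agree against those in which they conflict. The function name, the cluster structure and the sample data are assumptions for illustration, not taken from the patent; Python is used purely as sketch notation.

    def quality_metrics(clusters, target_fields):
        # clusters: cluster key -> list of (source, mapped row) pairs.
        # For each target field, count clusters where every source agrees
        # ('reconciled') versus clusters holding conflicting values.
        counts = {f: {"reconciled": 0, "in conflict": 0} for f in target_fields}
        for rows in clusters.values():
            for f in target_fields:
                values = {row[f] for _, row in rows if f in row}
                if values:
                    key = "reconciled" if len(values) == 1 else "in conflict"
                    counts[f][key] += 1
        return counts

    metrics = quality_metrics(
        {("ClientId", "C1001"): [("CRM", {"Name": "Acme Ltd"}),
                                 ("Risk", {"Name": "ACME LIMITED"})]},
        ["Name"])
    # metrics == {'Name': {'reconciled': 0, 'in conflict': 1}}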
The user is able to define Properties which govern how the integrated data is presented and therefore is able to adapt the focus on to specific areas as necessary.
Estimates show 60-80% savings in project costs and the provision of a continuously operating consistency process vs. what is usually an inconclusive 'once-off' process that has little positive effect on the overall company efficiency.
The following technical effects can be achieved using a system embodying the invention:
1. Business will achieve the targeted objective benefits from the 'clean' golden copy datawarehouse operating 'in-line' and being continuously reconciled with operational processing systems
2. Project costs and durations will be significantly less than with traditional methods
The system provides the ability to process simple flat file data extracts and have analysts define Properties which govern how the data is transformed, mapped and matched. User maintained Properties control how the system integrates the raw data and displays the results.
Inputs are processed as soon as a file is received and as each row of data is loaded; this allows the system to provide up-to-date information.
Thus the system holds all input in source form and conducts its processing against the source data, maintaining interim information purely for performance purposes. Source Properties define how each input is subdivided into its constituent data fields and how the integration is to be performed. This provides for the flexibility of mapping and translation rules. All the fields from each different source can be linked ('mapped') to a target field (also defined as part of the mapping).
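By way of illustration only, a minimal sketch of how such source Properties might be represented follows; the class, field and sample names (SourceProperties, target_map, 'CRM', etc.) are assumptions, not taken from the patent.

    from dataclasses import dataclass

    @dataclass
    class SourceProperties:
        name: str          # name of the contributing source system
        delimiter: str     # character delineating fields in a source row
        field_names: list  # names of the constituent source fields
        target_map: dict   # source field name -> target field name
        weight: int = 1    # relative weighting, used later for conflict resolution

        def subdivide(self, raw_row: str) -> dict:
            # Subdivide one raw input row into its constituent source fields
            return dict(zip(self.field_names, raw_row.split(self.delimiter)))

        def map_to_target(self, source_fields: dict) -> dict:
            # Link ('map') each source field to its defined target field
            return {self.target_map[f]: v
                    for f, v in source_fields.items() if f in self.target_map}

    crm = SourceProperties(
        name="CRM", delimiter="|",
        field_names=["client_id", "client_name", "postcode"],
        target_map={"client_id": "ClientId", "client_name": "Name",
                    "postcode": "Postcode"})
    row = crm.map_to_target(crm.subdivide("C1001|Acme Ltd|SW1A 1AA"))
    # row == {'ClientId': 'C1001', 'Name': 'Acme Ltd', 'Postcode': 'SW1A 1AA'}

Because the Properties are plain data rather than code, an analyst can amend a delimiter or a mapping and reprocess the held source rows without IT involvement.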
The system maintains logical clusters of data for each source field that is defined as a 'key field' and as each row is input it is recorded in all relevant clusters. This provides the ability for rapid system performance and up-to-date queries.
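A sketch of that cluster bookkeeping follows, assuming each loaded row arrives as a dict of mapped target-field values; all identifiers are illustrative rather than taken from the patent.

    from collections import defaultdict

    # (key field name, key field value) -> list of (source name, mapped row)
    clusters = defaultdict(list)

    def record_row(source: str, row: dict, key_fields: list) -> None:
        # As each row is input it is recorded in all relevant clusters:
        # one cluster per designated key field value that the row carries,
        # so one row may exist in a plurality of clusters.
        for key in key_fields:
            if row.get(key):
                clusters[(key, row[key])].append((source, row))

    record_row("CRM",  {"ClientId": "C1001", "Name": "Acme Ltd"}, ["ClientId"])
    record_row("Risk", {"ClientId": "C1001", "Name": "ACME LIMITED"}, ["ClientId"])
    # Both rows now sit in cluster ('ClientId', 'C1001'), ready for comparison.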
Within each cluster any target field may therefore have many source fields mapped to it.
Where the different source fields 'agree' it represents full field-level reconciliation, but any target field which contains conflicting source field content indicates that the source fields either:
• have different definitions/formats in their source systems, or,
• are incorrectly mapped, or,
• are 'in conflict' (ie wrong).
Conflicting formats and incorrect mappings can be immediately resolved by amending their Properties, thereby quickly isolating the 'in conflict' differences that may need correction at source.
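A sketch of that field-level comparison within one cluster, using the same illustrative structures as above; distinguishing a format difference or mis-mapping from a genuine conflict remains a user judgement, which the sketch does not attempt.

    def compare_target_field(cluster_rows: list, target_field: str):
        # cluster_rows: list of (source, row) pairs sharing a cluster key value
        values = {src: row[target_field]
                  for src, row in cluster_rows if target_field in row}
        if len(set(values.values())) <= 1:
            # All mapped source fields 'agree': full field-level reconciliation
            return "reconciled", values
        # Conflicting content: differing formats, an incorrect mapping (both
        # curable by amending Properties), or an 'in conflict' value needing
        # correction at source
        return "in conflict", values

    status, vals = compare_target_field(
        [("CRM", {"Name": "Acme Ltd"}), ("Risk", {"Name": "ACME LIMITED"})],
        "Name")
    # status == 'in conflict'; vals presents each source's value side by side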
The problem of differing field sizes, formats and coding structures in source systems is mitigated by the system's adaptive design.
Weightings can be applied to each source feed to automate selection of the correct value, and corrections can be output for return to the appropriate source system(s). These corrections are presented on-line in the UI (user interface), and can also be extracted (in CSV format) for automatic source update if appropriate.
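A sketch of weighting-driven selection follows, assuming the simplest possible rule (the highest-weighted feed prevails); the patent does not prescribe the selection formula, so this is illustrative only.

    def resolve_by_weight(values_by_source: dict, weights: dict):
        # Select the value contributed by the highest-weighted source feed...
        best = max(values_by_source, key=lambda s: weights.get(s, 0))
        correct = values_by_source[best]
        # ...and emit a correction for every source that disagrees, for
        # on-line presentation or extraction back to the source system(s)
        corrections = [(src, val, correct)
                       for src, val in values_by_source.items()
                       if val != correct]
        return correct, corrections

    value, fixes = resolve_by_weight(
        {"CRM": "SW1A 1AA", "Risk": "SW1A1AA"},
        {"CRM": 10, "Risk": 5})
    # value == 'SW1A 1AA'; fixes == [('Risk', 'SW1A1AA', 'SW1A 1AA')]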
The system provides a tool which allows Analysts to handle all aspects of the data integration within the tool itself and which minimises the use of IT skills.
Often the hardest part of a data consolidation project is getting IT resources to extract standardised data from the source systems. The system simply requires (regular) non-standardised flat file extracts from the source systems. (This represents the minimum possible effort given the need to integrate source data.) Thereafter all of the integration functions can be carried out by Business Analysts rather than IT specialists.
The system's non-technical interface and approach allows business analysts to perform all the tasks to manipulate the data integration (eg: Transformation, integration, validation, aggregation, cleansing, enriching) in a manner that useably scales to hundreds of sources.
Regular inputs allow a continuous process and for the business to maintain ongoing high-quality integrated results.
With inputs from systems in a lifecycle model, the integration results are a consolidated view of each transaction across its individual lifecycle, providing a front-end control layer view of the operation.
Using the system comprises the following steps and actions:
1. Determine at a high level the specific business objective and the initially identified source systems holding the relevant data.
2. Arrange to provide 'flat' files or message 'streams' extracted from the source systems without any standardization or format change. Existing exports or file backups can be used to further reduce IT impact. One-off 'full file loads' are necessary initially and thereafter 'update only' streams can maintain the currency of the integrated data.
3. If the system is not already installed:
   a. Install the system onto the client's preferred SQL database using the SQL script provided
   b. Load the system user interface (GUI) executable onto the necessary PCs.
4. Import the source data into the system by one of the following methods:
   a. Select 'Import' on the GUI, or,
   b. Drag and drop, or FTP, the file into the nominated system input directory, or,
   c. Establish SQL which writes rows of data into the system input table (two fields: filename, data row in a single character string; illustrated in the sketch following this list), or,
   d. Establish rules for any message routing software to deliver data into the directory or direct into the system.
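A minimal sketch of import option (c) follows: a two-field input table holding the filename and each data row as a single character string. SQLite and every name here are assumptions made for illustration; the description only requires the client's preferred SQL database.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for the client's SQL database
    conn.execute("""CREATE TABLE IF NOT EXISTS input_rows (
                        filename TEXT NOT NULL,  -- which extract the row came from
                        data_row TEXT NOT NULL   -- the whole row, one string
                    )""")

    def load_extract(filename: str, rows: list) -> None:
        # SQL established by the client writes rows of data into the input table
        conn.executemany("INSERT INTO input_rows VALUES (?, ?)",
                         [(filename, r) for r in rows])
        conn.commit()

    load_extract("crm_extract.txt", ["C1001|Acme Ltd|SW1A 1AA"])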
5. Input formats can be any one of:
   a. Fixed width, or variable, delineated by any character (comma, pipe, tab, etc; see the tokenising sketch after this list)
   b. Native SWIFT format, any message type (MTnnn)
   c. Any other formats require a few days to produce the appropriate DLL, eg for XML.
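A sketch of how format (a) rows might be tokenised, driven by a format Property; the fmt dict and its keys are assumptions, and the final branch merely marks where a format-specific parser (the 'appropriate DLL' of point 5c, or a SWIFT MTnnn handler) would plug in.

    def tokenise(raw_row: str, fmt: dict) -> list:
        if fmt["kind"] == "delimited":   # delineated by any character
            return raw_row.split(fmt["delimiter"])
        if fmt["kind"] == "fixed":       # fixed-width fields
            out, pos = [], 0
            for width in fmt["widths"]:
                out.append(raw_row[pos:pos + width].strip())
                pos += width
            return out
        # Native SWIFT (MTnnn), XML, etc: handled by a format-specific plug-in
        raise NotImplementedError(fmt["kind"])

    tokenise("C1001|Acme Ltd", {"kind": "delimited", "delimiter": "|"})
    tokenise("C1001Acme Ltd  ", {"kind": "fixed", "widths": [5, 10]})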
6. For each new input, complete the Properties and trigger the system to process each input. Thereafter inputs are fully processed when they 'arrive' or are input.
7. Review the results and refine Properties as necessary. Properties cater for differing field sizes, mapping issues, different coding structures, and different values which mean the same.
8. Present via the UI the true data conflicts to operational users, to apply corrections to true value conflicts as necessary.
9. Use the 'golden copy' results, either:
   a. Export the 'golden copy' data if it is required for another database
   b. As a reference source using the UI or via SQL queries
   c. Create a user lookup to the Golden Copy for reference purposes, e.g. when creating new client records.
10. Optionally, add further inputs to further enrich the breadth and depth of data. Note: new inputs do not affect operational users until registered to do so.
Use of a system embodying the invention is hereinafter illustrated by the following further Examples.
EXAMPLE 1
A Bank needs to validate and clean the data in its large 500,000-client masterfile.
The Traditional solution is illustrated in Fig. 5.
Given the size of this masterfile, and the likely absence of any cleansing resources, the traditional solution suggests an 'automatic' data scrubbing solution is necessary.
A sample of the typical actions necessary for this project are: select and/or work with the IT department to set up analysis tools to determine the 'quality' of the data; take an extract of the masterfile and run it into the analysis tool; discuss and agree with the 'business' which fields are 'key'; for addresses, set up and run against address improvement software; look for duplicates, and decide what to do when duplicates are found; seek other sources to improve against, and build methods to merge; 'pack' missing fields with calculated values; apply/update the fixed file into the 'live' database; management (IT, business, operations etc.).
Chart I (Fig. 5) shows a sample of the typical actions necessary for a project which conducts a 'once-off' data field cleansing process, which will deliver valid address formats where records have valid postcodes, but otherwise no guarantees of 'success'.
The System solution is illustrated in Chart II (Fig. 6). This 'project' installs a continuous data quality monitor and guarantees to cleanse and keep clean at least 50,000 'headline' client records in the initial period. The actions typical for the project are: install, view and load initial inputs; apply and refine validation properties; review details and set up appropriate folders; set up regular inputs and update file outputs; clean 50,000 individual accounts (clerical); merge further inputs; apply best source rules and apply automatic updates; management.
EXAMPLE 2
Company needs to integrate two client master files and 10 assorted Compliance/Risk/MIS/CRM client files into a new company-wide data warehouse and establish this as the company-wide 'Authority'.
The Traditional solution is illustrated in Chart III (Fig. 7): design and build the target data-model and system, then populate it with 'quality' data, then satisfy the need to reconcile it with each source before establishing 'broadcast' mechanisms to each. Examples of typical actions required are: design, build and test the new datawarehouse; define interface standards; build extracts complying with standards; load data; discover results and adapt database; select 'best source' and methods to filter out data conflicts; de-duplicate; build UI access to the new warehouse; reconcile and clean source vs warehouse; establish technology and 'broadcast' mechanisms (to major apps); management. The risk of not being truly successful for the business is 'high'.
The System solution is illustrated by Chart IV (Fig. 8): take a simple export copy of each source, manipulate the data elements (using the system) into a 'net superset' (eg derive the target data model from source data), establish a continuous dispute resolution process and use the results. Examples of the steps required are: set up regular export files from sources; refine properties for each source; resolve data conflicts (de-skilled); de-duplicate using common 'keys'; management; establish new procedures to use the warehouse as the 'authority'. The risk of not being truly successful for the business is 'low'.
As previously referred to, alternative methods of bespoke design and build using traditional technology components are skilled and labour intensive, and prone to failure for a number of practical reasons highlighted in Fig. 9.
The questions that need to be answered for data integration relate to: relative priority, technology, volumes, time-to-market, fact finding, what data, where, resources, what scope, what products, detail requirements, design, cleansing, connectivity, cost.
Answering these and other questions represents significant investment and typically limits project success.
Given the core objectives of integrating data/systems, the fundamental differences between traditional approaches and a system embodying the invention are:
1. With the invention it is not necessary to spend time:
   a. to pre-design the target data-model or functionality of the integration
   b. to analyse the source data or to determine a common standard
   c. to design an architecture for message processing/transmission
   d. to build interface standards
2. With the invention data discrepancies are highlighted almost immediately, with a tangible method to resolve the problems; and
3. The invention is not rigid in its design and positively encourages greater breadth of integration, ie more consistency.
Case studies demonstrate a saving of 48% to 76% in project costs when compared to traditionally designed solutions.
The system is usually combined with an adaptable Data Integration System (ADI).
ADI is a method for capturing, storing and manipulating different forms of data and presenting it all as if it were fully integrated. Transformation and integration mapping rules can be interactively adapted by non-technicians until a suitable result is achieved. Validation rules can highlight content issues with any field and where any field is supplied in more than 1 source a multi-way comparison determines consistency.
The system can combine ADI with an adaptable user driven interface for initially setting up rules on how each input is to be integrated and which thereafter highlights and manages the resolution of all data content issues, value conflicts and missing transactional events.
With regular updates from the contributing systems, the system provides a continuous and up-to-date process to monitor and correct data issues in the contributing systems or databases, enables non-programmers to manipulate the design and modelling of data interactively until a suitable result is achieved with the data, and provides the ability to view any number of different input streams (data) by any clustering value (key) and adapt mappings until the data representation accurately reflects the desired superset target. By the simple inclusion of new inputs, and with regular update inputs, a complete and continuous process of integration is achieved across the organization.
In short, prior to the invention of the system, computer systems required databases to be physically designed, instantiated and maintained as a physical database schema specifically established for the purpose. Where a field was present in several inputs, decisions had to be made to filter out all but one value. These overheads typically translate into months of effort in analysis, development and maintenance.
The effects of a system embodying the invention are:
1. To remove the need to pre-design and code database models
2. To significantly reduce the time to establish and maintain consolidation databases and data warehouses
3. To provide a method of practically integrating and reconciling literally 100's of sources of data
4. To deliver highly adaptable results in rapid timeframes
5. To provide a method of safely transitioning from any number of disparate systems into an integrated process
When tools require skilled IT resources and require multiple inputs, it becomes increasingly difficult to manage and remain flexible in support of the business. Any approach that requires pre-determining a target data model must have limitations in adaptability.
In short, a system embodying the invention receives information from different sources and "overlays" that information so that a user can see the data from the different sources in a single view (as a kind of 3-D effect).

Claims (22)

1. A system for data integration in providing a target model, comprising an adaptable user driven interface for defining an integrated input from a respective one of a plurality of sources and directly to form the model from data input and properties defined for each input.
2. A system according to Claim 1, the sources each comprising a differently named set of data originating from a source system.
3. A system according to Claim 2, there being a source row comprising one record of data received from a source.
4. A system according to Claim 2 or Claim 3, each source comprising a relative weighting whereby to provide for conflict resolution.
5. A system according to Claim 4, each source comprising discrete source properties.
6. A system according to Claim 5, the source properties defining a field in a source row.
7. A system according to Claim 6, there being a plurality of additional source fields formed by concatenation and sub-division of supplied source fields.
8. A system according to any one of Claims 2 to 7, each source being mappable
to provide a target cluster field.
9. A system according to Claim 8, comprising clusters of data characterized by source rows that have substantially the same value in a cluster key field, whereby one source row with multiple cluster key fields designated may exist in a plurality of clusters.
10. A system according to any preceding claim, the integrated data comprising an exported flat file, without any standardization from a source.
11. A system according to Claim 10, comprising a user interface adapted for viewing of data, manipulating properties which define each input, defining integration rules and providing the required views.
12. A system according to Claim 11, the integrated data being adapted for viewing immediate outputs.
13. A system according to Claim 12, the immediate outputs comprising one, some or all of a target data model, metrics on data quality, a process tangibly to clean data discrepancies and provision of an authoritative data base.
14. A system according to Claim 13, the tangible clean data discrepancies comprising cleaning in bulk or one-by-one.
15. A system according to Claim 14, the one-by-one step providing update information direct to source systems.
16. A system according to any of Claims 11 to 15, the properties being redefinable to accommodate actual data.
17. A system according to Claim 16, redefined properties being cleansed within a continuous feedback process.
18. A system according to any of Claims 8 to 17, each target field within a cluster having mapped into it a plurality of source fields.
19. A system according to Claim 18, each source field being weighted whereby to automate selection of a correct value, which value can be outputted for return to an appropriate source.
20. A system according to Claim 19, correction being presented on-line or extractable for automatic updating of a source.
21. A system according to any preceding claim, in combination with adaptable data integration (ADI).
22. A system for providing data integration for providing a target model, substantially as hereinbefore described with reference to the Examples.
GB0316922A 2003-07-18 2003-07-18 System for data integration Withdrawn GB2404041A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0316922A GB2404041A (en) 2003-07-18 2003-07-18 System for data integration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0316922A GB2404041A (en) 2003-07-18 2003-07-18 System for data integration

Publications (2)

Publication Number Publication Date
GB0316922D0 GB0316922D0 (en) 2003-08-27
GB2404041A true GB2404041A (en) 2005-01-19

Family

ID=27772301

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0316922A Withdrawn GB2404041A (en) 2003-07-18 2003-07-18 System for data integration

Country Status (1)

Country Link
GB (1) GB2404041A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5721912A (en) * 1994-08-05 1998-02-24 Data Integration Solutions Corp. Graphical user interface for creating database integration specifications
US5819264A (en) * 1995-04-03 1998-10-06 Dtl Data Technologies Ltd. Associative search method with navigation for heterogeneous databases including an integration mechanism configured to combine schema-free data models such as a hyperbase
US5909570A (en) * 1993-12-28 1999-06-01 Webber; David R. R. Template mapping system for data translation
WO2000002144A1 (en) * 1998-07-06 2000-01-13 Beller Stephen E Flexible, modular electronic element patterning method and apparatus for compiling, processing, transmitting, and reporting data and information
US6199068B1 (en) * 1997-09-11 2001-03-06 Abb Power T&D Company Inc. Mapping interface for a distributed server to translate between dissimilar file formats
US6389429B1 (en) * 1999-07-30 2002-05-14 Aprimo, Inc. System and method for generating a target database from one or more source databases


Also Published As

Publication number Publication date
GB0316922D0 (en) 2003-08-27


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)