CN106991100B - Data import method and device - Google Patents


Info

Publication number
CN106991100B
CN106991100B (application CN201610041954.6A)
Authority
CN
China
Prior art keywords
data
file
kettle
data source
import
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610041954.6A
Other languages
Chinese (zh)
Other versions
CN106991100A (en)
Inventor
Li Fei (李飞)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Priority to CN201610041954.6A
Publication of CN106991100A
Application granted
Publication of CN106991100B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25Integrating or interfacing systems involving database management systems
    • G06F16/258Data format conversion from or to a database

Abstract

The disclosure relates to a data import method and device, and the method comprises the following steps: designating a KETTLE file which is created in advance and used as a data import template; setting configuration information according to preset information needing to be reconfigured in the KETTLE file; and dynamically analyzing the KETTLE file according to the configuration information so as to automatically import data. The disclosed technical scheme shifts changing requirements onto the use of an ETL tool; it is loosely coupled with the service, is easy to extend, does not require frequent code updates, can realize personalized requirements, and supports hot deployment.

Description

Data import method and device
Technical Field
The present disclosure relates to the field of computer application technologies, and in particular, to a data importing method and apparatus.
Background
The data import function is one of the most common functions in WEB applications, and basically, such a requirement exists in all WEB applications.
In the related art, such functions are generally implemented in program code. Taking the import of an EXCEL file through a WEB server as an example, the related art imports the EXCEL file into the WEB server by means of the open-source POI toolkit provided for JAVA (Apache POI, an open-source function library of the Apache Software Foundation that provides APIs allowing JAVA programs to read and write Microsoft Office format files), using the EXCEL-import API that the POI toolkit provides.
However, in actual development, when requirements change continuously, the above method needs frequent code modification to keep up. For more personalized requirements, such as multiple function buttons on one import function or additional interactions between systems, the verification logic is very complex, the implementation is correspondingly complex, and performance may become a bottleneck. Therefore, the method of importing data directly through an API with a data import function requires frequent code updates when requirements change, makes personalized requirements hard to realize, and is highly coupled with the service.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a data importing method and apparatus.
According to a first aspect of the embodiments of the present disclosure, there is provided a data import method, including:
designating a KETTLE file which is created in advance and used as a data import template;
setting configuration information according to preset information needing to be reconfigured in the KETTLE file;
and dynamically analyzing the KETTLE file according to the configuration information so as to automatically import data.
Further, the KETTLE file is a KETTLE file generated in advance, by using a tool for generating KETTLE files, through importing a data source of a first preset type into a database of a second preset type according to set logic.
Further, the data source of the first preset type is an EXCEL file.
Further:
the information needing to be reconfigured in the KETTLE file comprises a data source;
the preset information needing to be reconfigured comprises the data source involved in the KETTLE file.
Further, the KETTLE file includes: a data source input part, a JavaScript code part, a table input part, a stream query part, a constant adding part, a sorting and de-duplication part, a field selection part, a table output part and/or a no-operation part;
the data source input part comprises an uploaded demand file;
the JavaScript code part comprises a JavaScript script, and the JavaScript script is used for carrying out data verification and data format conversion on the cells in the uploaded file;
the table input part is used for acquiring data of a data source, and comprises query statements;
the stream query part is used for comparing the data acquired by the table input with column fields in the data source;
the constant adding part is used for assigning values to default fields of the database;
the sorting and de-duplication part is used for carrying out duplicate-record verification on the specified row records in the data source;
the field selection part is used for selecting the columns to be inserted into the database;
the table output part is used for selecting a service table, specifying the columns to be inserted, and inserting the data;
and the no-operation part is used for handling insertion exceptions.
Further, the operation of automatically importing the data includes: automatically importing the data through a WEB server.
According to a second aspect of the embodiments of the present disclosure, there is provided a data importing apparatus, including:
a KETTLE file specifying unit, which is used for specifying a KETTLE file which is created in advance and is used as a data import template;
the configuration information setting unit is used for setting configuration information according to preset information needing to be reconfigured in the KETTLE file;
and the service unit is used for dynamically analyzing the KETTLE file according to the configuration information so as to automatically import data.
Further, the KETTLE file is a KETTLE file generated in advance, by using a tool for generating KETTLE files, through importing a data source of a first preset type into a database of a second preset type according to set logic.
Further, the data source of the first preset type is an EXCEL file.
Further:
the information needing to be reconfigured in the KETTLE file comprises a data source;
the preset information needing to be reconfigured comprises the data source involved in the KETTLE file.
Further, the KETTLE file includes: a data source input part, a JavaScript code part, a table input part, a stream query part, a constant adding part, a sorting and de-duplication part, a field selection part, a table output part and/or a no-operation part;
the data source input part comprises an uploaded demand file;
the JavaScript code part comprises a JavaScript script, and the JavaScript script is used for carrying out data verification and data format conversion on the cells in the uploaded file;
the table input part is used for acquiring data of a data source, and comprises query statements;
the stream query part is used for comparing the data acquired by the table input with column fields in the data source;
the constant adding part is used for assigning values to default fields of the database;
the sorting and de-duplication part is used for carrying out duplicate-record verification on the specified row records in the data source;
the field selection part is used for selecting the columns to be inserted into the database;
the table output part is used for selecting a service table, specifying the columns to be inserted, and inserting the data;
and the no-operation part is used for handling insertion exceptions.
Further, the service unit is used for automatically importing data through a WEB server.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the method has the advantages that the changed requirements are reflected through the KTL file, the changes are placed on the use of an ETL tool, the coupling degree with the service is low, the expansion is easy, the code does not need to be updated frequently, the personalized requirements can be realized, and the hot deployment is supported.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a diagram illustrating a data import system architecture in accordance with an illustrative embodiment;
FIG. 2 is a flow diagram illustrating a method of data import in accordance with an illustrative embodiment;
FIG. 3 is a flowchart illustrating a logic implementation for generating a KETTLE file from an EXCEL file according to import logic, in accordance with an illustrative embodiment;
FIG. 4 is a flowchart illustrating a method of parsing the KETTLE file, according to an example embodiment;
fig. 5 is a block diagram illustrating a data import apparatus according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Fig. 1 illustrates an architecture diagram of a data import system. As shown in fig. 1, the implementation described in this embodiment may involve a business system 110, a KETTLE service 120, and an imported target database 130. The business system 110 is responsible for writing the KETTLE file, generating an executable KETTLE file 111, and configuring a data source 112; the KETTLE service 120 is responsible for configuring the dynamic database and parsing the KETTLE file to import the data into the database 130.
Fig. 2 is a flowchart illustrating a data import method according to an exemplary embodiment, which is applicable to the case of importing data. As shown in fig. 2, the data importing method according to this embodiment includes:
in step S210, a key file created in advance for use as a data import template is specified.
KETTLE is an abbreviation of Kettle E.T.T.L. Environment. It is an open-source ETL tool from abroad, written in pure Java, and is efficient and stable in data extraction. By providing a graphical interface, it allows data from different data sources to be managed so as to implement the functions you want to accomplish. The .ktr file generated by the KETTLE tool is referred to as a KETTLE file in this embodiment.
The KETTLE file may be a KETTLE file generated in advance, by using a tool for generating KETTLE files, through importing a data source of a first preset type into a database of a second preset type according to set logic.
For example, to import the data in an EXCEL file into the target database through a WEB server, the nodes required for implementing the import function with the KETTLE tool may include: EXCEL input, stream input, JavaScript code, value mapping, sorting and de-duplication, table output, no operation, and the like.
FIG. 3 is a flowchart illustrating the logic of generating a KETTLE file from an EXCEL file according to the import logic, according to an exemplary embodiment. The KETTLE file created to implement this function includes:
A. EXCEL input: mainly the uploaded demand file; it is used when the source data is an EXCEL file.
It should be noted that this step is a necessary step. Since the import of an EXCEL file is taken as the example here, the input is an EXCEL file, but the source data that can be imported in this embodiment includes, and is not limited to, EXCEL files; it may also be, for example, a TXT text file, an SQLServer database, an ORACLE database, a MySQL database, and the like.
B. JavaScript code: this step is optional. JS scripts can be written in the JavaScript code node, and in actual application they perform data verification and data format conversion on each cell in the uploaded file, such as numeric data verification, date format verification, string splitting, and conversion between strings and other data types. This can greatly reduce the amount of verification code in the program.
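The kind of per-cell checks the JavaScript node performs can be sketched as follows. This is a minimal illustration in Python rather than KETTLE's embedded JavaScript, and the rule names are hypothetical:

```python
from datetime import datetime

def validate_cell(value, rule):
    """Validate and convert one cell according to a rule name.

    Returns (ok, converted_value). Mirrors the checks described for the
    JavaScript step: numeric verification, date-format verification,
    and string splitting/conversion.
    """
    try:
        if rule == "number":
            return True, float(value)
        if rule == "date":          # expects YYYY-MM-DD
            return True, datetime.strptime(value, "%Y-%m-%d").date()
        if rule == "split":         # split a comma-delimited string
            return True, [p.strip() for p in value.split(",")]
        return True, str(value)     # default: keep as string
    except (ValueError, TypeError):
        return False, None          # verification failed for this cell
```

Performing these conversions inside the import template, rather than in application code, is exactly what lets the verification rules change without redeploying the program.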
C. Table input: mainly used for the stream query. When a stream query is needed, this step acquires the data of a data source; a SELECT statement is configured in the node, mainly for data verification and for matching the integrity of the imported data. In actual service requirements, a service field often needs to be checked against existing data to decide whether it can be imported, and some service fields need to be inserted by the identifier of an associated service field. This node can conveniently acquire such service field data, the data source may even be an external system, and this is simpler and clearer than other schemes.
D. Stream query: compares the data acquired by the table input with the column fields in the EXCEL file, mainly to verify the completeness and accuracy of the data. The node performs the comparison between the table input and the EXCEL data in memory.
E. Adding a constant: this step is optional; it assigns values to default fields of the database. Constants are the default values of certain fields when the data is imported, for example the status of a record, the time, and the like.
F. Sorting and removing duplicate records: this step is optional but often necessary. It performs duplicate-record verification on the specified rows in the EXCEL file. All imported records are de-duplicated, or verified for duplicates, according to the service requirements; in traditional code this verification has to be written by hand, whereas in this scheme only the service de-duplication field needs to be indicated.
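The duplicate-record verification just described can be sketched as a first-wins check over the designated service de-duplication fields (a Python sketch; the row data and field names are hypothetical):

```python
def dedupe_rows(rows, key_fields):
    """Keep only the first row for each combination of the service
    de-duplication fields; return (kept_rows, duplicate_rows)."""
    seen, kept, dups = set(), [], []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        (dups if key in seen else kept).append(row)
        seen.add(key)
    return kept, dups
```

The point of the scheme is that `key_fields` comes from the template configuration, so changing which fields define a duplicate requires no code change.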
G. Field selection: used for selecting the columns to be inserted into the database; it mainly performs the field mapping for the import.
H. Table output: this step is necessary; it selects the service table, specifies the columns to be inserted, and inserts the data. The node can specify whether to insert in batches or one by one, and whether to process within the same transaction.
I. No operation: this step is necessary; it handles insertion exceptions. The node decides whether an exception is ignored: if so, the record is skipped after an insertion exception is encountered; otherwise a rollback is performed directly. The KETTLE tool offers two rollback types to choose from: when a table record raises an exception, either the whole data import is rolled back, or the import rolls back to just before the exception record and then continues, skipping the exception record.
This step temporarily configures one or more data sources in the KETTLE file, so that when the KETTLE file is parsed, the data sources specified in the configuration information are dynamically assigned to the temporary data sources; in this way data can be imported into different databases simply by giving different configuration information each time.
For example, in the flowchart illustrated in fig. 3, in the process of creating the KETTLE file, the table input specifies temporary database A and the table output specifies temporary database B; in the configuration information, temporary database A is configured as the actual database C and temporary database B as the actual database D, so that when the KETTLE file is subsequently parsed dynamically, database A is replaced by database C and database B by database D according to the configuration information.
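Since a .ktr file is XML, the dynamic replacement of the temporary databases (A to C, B to D) can be sketched as a substitution over the connection elements of the template. This is a simplified sketch: the real .ktr schema has many more fields, and the element content here is reduced to the essentials:

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily reduced KETTLE template with two temporary connections.
KTR_TEMPLATE = """
<transformation>
  <connection><name>A</name><server>tmp-a</server><database>tmp</database></connection>
  <connection><name>B</name><server>tmp-b</server><database>tmp</database></connection>
</transformation>
"""

def bind_data_sources(ktr_xml, config):
    """Replace each temporary connection with the actual data source
    named for it in the configuration information."""
    root = ET.fromstring(ktr_xml)
    for conn in root.iter("connection"):
        actual = config.get(conn.findtext("name"))
        if actual:  # dynamically assign the real source to the temporary one
            conn.find("server").text = actual["server"]
            conn.find("database").text = actual["database"]
    return ET.tostring(root, encoding="unicode")
```

Giving a different `config` mapping per import is what lets one template serve many target databases.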
It should be noted that the data source in the configuration information may be any type of data source supported by a KETTLE project, such as an SQLServer database, an ORACLE database, a MySQL database, and the like.
In step S220, configuration information is set according to the information that needs to be reconfigured and is preset in the KETTLE file.
The tool for generating the KETTLE file in this embodiment is the KETTLE tool. In this step, the changed requirements are placed in a visual graphic file (the KETTLE file): the KETTLE tool generates the KETTLE file from the configured data source according to the logic for importing the data to be imported. Each node in the visual graphic file is one step of processing the import service, so the logic is clearer. After editing is finished, the file is saved, uploaded to the corresponding directory of the WEB server, and the import function is triggered. It should be noted that an analysis engine may be encapsulated over the API according to the KETTLE graph nodes used for generating the KETTLE file, and the analysis engine is used to analyze the KETTLE file so as to import the data to be imported to the WEB server.
In the related art, the KETTLE file is mainly designed to help implement ETL requirements and is usually used for operations such as data extraction, conversion and cleaning. After a KETTLE file is generated for such functions, those operations can be completed by directly executing the KETTLE file in the KETTLE tool, and each piece of configuration information, in particular the data sources involved in the KETTLE file (including the source database, the target database, and/or the database involved in the table input for the stream query), is used exactly as configured in the file. In this embodiment, by contrast, the KETTLE file is innovatively used as a data import template: the KETTLE file mainly carries the operation logic of the data import, while the configuration information it involves, in particular its data sources (including the source database, the target database, and/or the database involved in the table input for the stream query), is treated as temporary information used when creating the file. In the actual data import process, configuration information is set according to the information preset in the KETTLE file that needs to be reconfigured, so that when the KETTLE file is executed, the temporary information in it is dynamically replaced by the configuration information.
In step S230, the KETTLE file is dynamically analyzed according to the configuration information, so as to automatically import the data.
This step completes the import of the data to the WEB server by dynamically parsing the KETTLE file and dynamically reading the data sources given in the configuration information.
In this step, the API provided by KETTLE can be used for analyzing the KETTLE file and importing the data to be imported into the WEB server. Specifically, an analysis engine may be encapsulated over the API according to the KETTLE graph nodes used for generating the KETTLE file, and the analysis engine may be used to analyze the KETTLE file and import the data to be imported into the WEB server.
Fig. 4 is a flowchart illustrating, according to an exemplary embodiment, a method for parsing the KETTLE file with such an analysis engine, encapsulated over the API according to the KETTLE graph nodes used for generating the KETTLE file, so as to import the data to be imported into the WEB server. As shown in fig. 4, the method for parsing the KETTLE file may include:
in step S410, the execution engine is called.
The example is based on an execution engine that encapsulates the APIs into a custom parsing class and dynamically parses the KETTLE file according to the configuration information. In this embodiment, when parsing the KETTLE file, the execution engine is called first.
In step S420, the Kettle handler is called.
This step includes using the APIs provided by the KETTLE tool, such as KettleEnvironment.init(), EnvUtil.environmentInit(), and the like.
In step S430, the resource information is initialized, and the conversion object is created.
That is, the configured data source information is initialized, and the Transformation object is created.
In step S440, the data source configured in the KETTLE file is parsed.
That is, the actual data source specified in the configuration information is determined for the conversion object.
In step S450, the parameters in the configuration information are dynamically assigned to the transformation object.
The conversion object described in this embodiment refers to the preset information that needs to be reconfigured in the KETTLE file, for example, the data sources involved in the KETTLE file.
In step S460, execute () is executed.
This step is actually executed after the preset information needing to be reconfigured in the KETTLE file has been replaced by the corresponding information specified in the configuration information.
In step S470, the import is performed.
This realizes the process of dynamically analyzing the KETTLE file according to the configuration information to import data, and meets changing requirements without changing the code.
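The sequence S410 to S470 can be modeled in outline as follows. In the real system the execution step goes through KETTLE's Java API (e.g. KettleEnvironment.init() and a Transformation object); the Python sketch below only models the control flow, with the actual execution stubbed out, so every name in it is illustrative:

```python
class KettleFileEngine:
    """Illustrative model of the custom parsing engine: the body of
    run() mirrors steps S410-S470, and the real Kettle execution is
    replaced by a pluggable stub."""

    def __init__(self, ktr_template, executor=None):
        self.template = ktr_template                    # the KETTLE template
        self.executor = executor or (lambda t: "imported")

    def run(self, config):
        # S410/S420: call the execution engine and the Kettle handler
        # (environment initialization is assumed to happen here).
        transformation = dict(self.template)            # S430: create the conversion object
        source = config["data_source"]                  # S440: parse the configured data source
        transformation["data_source"] = source          # S450: dynamically assign parameters
        result = self.executor(transformation)          # S460: execute()
        return result, transformation                   # S470: import done
```

Because the template itself is never mutated, each call to run() with different configuration information imports into a different database, which is the point of steps S440 and S450.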
According to the data import method and system of this embodiment, the created KETTLE file is dynamically analyzed according to the configuration information so as to import the data automatically; the changed requirements are reflected in the KETTLE file, and the configuration information is set according to the actual import requirements.
Fig. 5 is a block diagram illustrating a data import apparatus according to an exemplary embodiment. As shown in fig. 5, the data import apparatus of this embodiment includes a KETTLE file specifying unit 510, a configuration information setting unit 520, and a service unit 530.
The KETTLE file specifying unit 510 is configured to specify a KETTLE file which is created in advance and used as a data import template;
the configuration information setting unit 520 is configured to set configuration information according to information that needs to be reconfigured and is preset in the KETTLE file;
the service unit 530 is configured to perform dynamic analysis on the KETTLE file according to the configuration information, so as to automatically perform the data import.
Further, the KETTLE file is a KETTLE file generated in advance, by using a tool for generating KETTLE files, through importing a data source of a first preset type into a database of a second preset type according to set logic.
Further, the data source of the first preset type is an EXCEL file.
Further:
the information needing to be reconfigured in the KETTLE file comprises a data source;
the preset information needing to be reconfigured comprises the data source involved in the KETTLE file.
Further, the KETTLE file includes: a data source input part, a JavaScript code part, a table input part, a stream query part, a constant adding part, a sorting and de-duplication part, a field selection part, a table output part and/or a no-operation part;
the data source input part comprises an uploaded demand file;
the JavaScript code part comprises a JavaScript script, and the JavaScript script is used for carrying out data verification and data format conversion on the cells in the uploaded file;
the table input part is used for acquiring data of a data source, and comprises query statements;
the stream query part is used for comparing the data acquired by the table input with column fields in the data source;
the constant adding part is used for assigning values to default fields of the database;
the sorting and de-duplication part is used for carrying out duplicate-record verification on the specified row records in the data source;
the field selection part is used for selecting the columns to be inserted into the database;
the table output part is used for selecting a service table, specifying the columns to be inserted, and inserting the data;
and the no-operation part is used for handling insertion exceptions.
Further, the service unit 530 is configured to automatically perform data import through a WEB server.
With regard to the apparatus in the above-described embodiment, the specific manner in which each unit performs the operation has been described in detail in the embodiment related to the method, and will not be described in detail here.
The data import apparatus provided by this embodiment can execute the data import method provided by the method embodiments of the invention, and has the corresponding functional modules and the beneficial effects of the executed method.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (8)

1. A data import method, comprising:
designating a KETTLE file which is created in advance and used as a data import template;
setting configuration information according to preset information needing to be reconfigured in the KETTLE file;
dynamically analyzing the KETTLE file according to the configuration information so as to automatically import data;
the KETTLE file includes: a data source input part, a JavaScript code part, a table input part, a stream query part, a constant adding part, a sorting and de-duplication part, a field selection part, a table output part and/or a no-operation part;
the data source input part comprises an uploaded demand file;
the JavaScript code part comprises a JavaScript script, and the JavaScript script is used for carrying out data verification and data format conversion on the cells in the uploaded file;
the table input part is used for acquiring data of a data source, including query statements; the query statement is used for checking data and matching the integrity of imported data;
the stream query part is used for comparing the data acquired by the table input with column fields in the data source;
the constant adding part is used for assigning values to default fields of the database;
the sorting and duplicate removal part is used for carrying out repeated record verification on the specified row record in the data source;
the field selection part is used for selecting a column to be inserted into the database;
the table output part is used for selecting a service table and appointing a column to be inserted for data insertion;
the null operation part is used for inserting exception handling; the method for inserting exception handling comprises the following steps: skipping or rolling back after encountering the exception, wherein the type of rolling back comprises rolling back to a position before the date of the data and rolling back to a position before the exception record and continuously executing to skip the exception record;
the operation of automatically importing the data comprises the following steps: automatically importing data through a WEB server; wherein, carry out data import automatically through the WEB server, include: and encapsulating an analysis engine according to API (application programming interface) according to the KETTLE graph node used for generating the KETTLE file, and analyzing the KETTLE file by using the analysis engine to import the data to be imported into the WEB server.
2. The method according to claim 1, wherein the KETTLE file is a KETTLE file generated in advance, by using a tool for generating KETTLE files, through importing a data source of a first preset type into a database of a second preset type according to set logic.
3. The method according to any one of claims 1 to 2, characterized in that:
the information needing to be reconfigured in the KETTLE file comprises a data source;
the preset information needing to be reconfigured comprises the data source involved in the KETTLE file.
4. A data importing apparatus, comprising:
a KETTLE file specifying unit, which is used for specifying a KETTLE file which is created in advance and is used as a data import template;
the configuration information setting unit is used for setting configuration information according to preset information needing to be reconfigured in the KETTLE file;
the service unit is used for dynamically analyzing the KETTLE file according to the configuration information so as to automatically import data;
the KETTLE file includes: a data source input part, a JavaScript code part, a table input part, a stream query part, a constant adding part, a sorting and de-duplication part, a field selection part, a table output part and/or a no-operation part;
the data source input part comprises an uploaded demand file;
the JavaScript code part comprises a JavaScript script, and the JavaScript script is used for carrying out data verification and data format conversion on the cells in the uploaded file;
the table input part is used for acquiring data from a data source and comprises query statements;
the stream query part is used for comparing the data acquired by the table input with column fields in the data source;
the constant adding part is used for assigning values to default fields of the database;
the sorting and duplicate removal part is used for carrying out repeated record verification on the specified row record in the data source;
the field selection part is used for selecting a column to be inserted into the database;
the table output part is used for selecting a service table and appointing a column to be inserted for data insertion;
the no-operation part is used for exception handling during insertion;
the service unit is used for automatically importing data through a WEB server; wherein the service unit automatically importing the data through the WEB server comprises: encapsulating a parsing engine through an API (application programming interface) according to the KETTLE graph nodes used for generating the KETTLE file, and parsing the KETTLE file with the parsing engine so as to import the data to be imported into the WEB server.
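For illustration only (not part of the claims), the KETTLE-file parts enumerated in claim 4 can be modeled as a plain pipeline over rows represented as dicts. All table, field, and value names below are hypothetical:

```python
# Illustrative sketch of the claimed transformation parts, one function per part.

uploaded_rows = [                       # "data source input": the uploaded demand file
    {"sku": "A1", "qty": "3"},
    {"sku": "A1", "qty": "3"},          # duplicate record
    {"sku": "B2", "qty": "5"},
]
reference = {"A1", "B2", "C3"}          # "table input": keys queried from the database

def validate(row):                      # "JavaScript code": cell validation / conversion
    row = dict(row)
    row["qty"] = int(row["qty"])        # data format conversion
    return row

def stream_query(rows, ref):            # "stream query": compare against source columns
    return [r for r in rows if r["sku"] in ref]

def add_constants(rows):                # "add constant": assign database default fields
    return [{**r, "status": "NEW"} for r in rows]

def sort_dedupe(rows, key):             # "sort and de-duplicate": drop repeated records
    seen, out = set(), []
    for r in sorted(rows, key=lambda r: r[key]):
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

def select_fields(rows, fields):        # "field selection": keep columns to insert
    return [{f: r[f] for f in fields} for r in rows]

rows = select_fields(
    sort_dedupe(
        add_constants(stream_query([validate(r) for r in uploaded_rows], reference)),
        "sku",
    ),
    ["sku", "qty", "status"],
)
# "table output" would now insert `rows` into the chosen business table.
```

The value of templating these parts in a KETTLE file, as the claims describe, is that only the data source and a few fields need reconfiguring per import rather than rewriting the pipeline.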
5. The apparatus of claim 4, wherein the KETTLE file is a KETTLE file generated in advance, using a tool for generating KETTLE files, by importing a data source of a first preset type into a database of a second preset type according to set logic.
6. The apparatus according to any one of claims 4 to 5, wherein:
the information needing to be reconfigured in the KETTLE file comprises a data source;
the preset information needing to be reconfigured comprises the data source involved in the KETTLE file.
7. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-3.
8. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-3.
CN201610041954.6A 2016-01-21 2016-01-21 Data import method and device Active CN106991100B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610041954.6A CN106991100B (en) 2016-01-21 2016-01-21 Data import method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610041954.6A CN106991100B (en) 2016-01-21 2016-01-21 Data import method and device

Publications (2)

Publication Number Publication Date
CN106991100A CN106991100A (en) 2017-07-28
CN106991100B true CN106991100B (en) 2021-10-01

Family

ID=59413621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610041954.6A Active CN106991100B (en) 2016-01-21 2016-01-21 Data import method and device

Country Status (1)

Country Link
CN (1) CN106991100B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107679251A (en) * 2017-11-02 2018-02-09 成都信息工程大学 Universal Database abstracting method based on Spoon under big data environment
CN108182963A (en) * 2017-12-14 2018-06-19 山东浪潮云服务信息科技有限公司 A kind of medical data processing method and processing device
CN108256087B (en) * 2018-01-22 2020-12-04 北京腾云天下科技有限公司 Data importing, inquiring and processing method based on bitmap structure
CN108874866A (en) * 2018-04-22 2018-11-23 平安科技(深圳)有限公司 Data import management method, apparatus, mobile terminal and storage medium
CN109875521A (en) * 2019-04-18 2019-06-14 厦门纳龙科技有限公司 A kind of analysis of ECG data and system
CN110958292A (en) * 2019-09-17 2020-04-03 平安银行股份有限公司 File uploading method, electronic device, computer equipment and storage medium
CN111625581A (en) * 2020-04-28 2020-09-04 四川省金科成地理信息技术有限公司 System data processing method adopting button to start service

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101655873A (en) * 2009-08-28 2010-02-24 金蝶软件(中国)有限公司 Single sign-on system as well as method and device for inputting and outputting data thereof
CN103309945A (en) * 2013-05-15 2013-09-18 上海证券交易所 Device for importing data to database

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9779069B2 (en) * 2014-01-31 2017-10-03 Yahoo Holdings, Inc. Model traversing based compressed serialization of user interaction data and communication from a client-side application
CN104361139B (en) * 2014-12-10 2019-04-16 用友网络科技股份有限公司 Data importing device and method
CN105068770A (en) * 2015-08-28 2015-11-18 国家电网公司 Data integration method and apparatus

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101655873A (en) * 2009-08-28 2010-02-24 金蝶软件(中国)有限公司 Single sign-on system as well as method and device for inputting and outputting data thereof
CN103309945A (en) * 2013-05-15 2013-09-18 上海证券交易所 Device for importing data to database

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Practical Tips: Data Processing Practice Based on Kettle (干货放送:基于Kettle的数据处理实践)"; Wang Wei (王伟); http://tech.it168.com/a2015/1201/1783/000001783611.shtml; 2015-12-01; pages 1-12 *

Also Published As

Publication number Publication date
CN106991100A (en) 2017-07-28

Similar Documents

Publication Publication Date Title
CN106991100B (en) Data import method and device
CN107918666B (en) Data synchronization method and system on block chain
JP5598017B2 (en) Judgment program, method and apparatus
JP5298117B2 (en) Data merging in distributed computing
WO2017084410A1 (en) Network management data synchronization method and apparatus
CN104133772A (en) Automatic test data generation method
CN106557307B (en) Service data processing method and system
CN106126205A (en) The rapid batch of a kind of Android program installation kit generates method and system
CN106970929B (en) Data import method and device
CN111061475B (en) Software code generating method, device, computer equipment and storage medium
CN109032631B (en) Application program patch package obtaining method and device, computer equipment and storage medium
US20120266131A1 (en) Automatic program generation device, method, and computer program
CN108536745B (en) Shell-based data table extraction method, terminal, equipment and storage medium
CN108241720B (en) Data processing method, device and computer readable storage medium
CN106599167B (en) System and method for supporting increment updating of database
US8965797B2 (en) Explosions of bill-of-materials lists
CN114780641A (en) Multi-library multi-table synchronization method and device, computer equipment and storage medium
CN113495728A (en) Dependency relationship determination method, dependency relationship determination device, electronic equipment and medium
CN112948473A (en) Data processing method, device and system of data warehouse and storage medium
CN107025233B (en) Data feature processing method and device
US11354165B1 (en) Automated cluster execution support for diverse code sources
WO2022262240A1 (en) Data processing method, electronic device, and storage medium
CN115757172A (en) Test execution method and device, storage medium and computer equipment
US20150193854A1 (en) Automated compilation of graph input for the hipergraph solver
CN112114794B (en) Automatic generation method and device of website application program and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant