CN115562859A - Data processing method and device, electronic equipment and computer storage medium - Google Patents

Data processing method and device, electronic equipment and computer storage medium

Info

Publication number
CN115562859A
Authority
CN
China
Prior art keywords
data
processed
processing
configuration
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211186747.1A
Other languages
Chinese (zh)
Inventor
潘海华
罗亚
杜秀清
徐梓舰
马奕彬
王贤兵
陈晓矫
徐三江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Shuhui System Technology Co ltd
Original Assignee
Shanghai Shuhui System Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Shuhui System Technology Co ltd filed Critical Shanghai Shuhui System Technology Co ltd
Publication of CN115562859A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • G06F9/5044Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals considering hardware capabilities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/48Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806Task transfer initiation or dispatching
    • G06F9/4843Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/4881Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2209/00Indexing scheme relating to G06F9/00
    • G06F2209/50Indexing scheme relating to G06F9/50
    • G06F2209/5011Pool
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application provides a data processing method, a data processing apparatus, an electronic device, and a computer storage medium. The data processing method comprises the following steps: first, acquiring an information set of data to be processed, wherein the information set comprises information of at least one piece of data to be processed, and the information of each piece of data to be processed comprises a unique identifier and a data volume; then, for each piece of data to be processed, creating a target processing queue according to the data volume of the data to be processed and the computing capability of the current server, wherein the target processing queue is used for processing the data to be processed, and the computing capability of the server is determined according to the central processing unit utilization and the memory utilization; and finally, processing the data to be processed by using the target processing queue. In this way, the computing resources of the hardware are fully utilized, the data processing capability is improved, the processing time is reduced, and the overall efficiency is improved.

Description

Data processing method and device, electronic equipment and computer storage medium
The present application claims priority to the Chinese application entitled "data processing method, apparatus, electronic device, and computer storage medium", filed with the China National Intellectual Property Administration on May 17, 2022 under application number 202210536360.8, which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the field of computer technologies, and in particular, to a data processing method and apparatus, an electronic device, and a computer storage medium.
Background
With the continuous and rapid growth of data, data processing suffers from large data volumes and slow single-task processing, so the overall processing time is long and overall efficiency is greatly affected.
Disclosure of Invention
In view of this, the present application provides a data processing method, an apparatus, an electronic device, and a computer storage medium, which fully utilize the computing resources of the hardware, improve the data processing capability, reduce the processing time, and improve overall efficiency.
A first aspect of the present application provides a data processing method, including:
acquiring an information set of data to be processed; wherein the information set of the data to be processed comprises information of at least one piece of data to be processed; the information of the data to be processed comprises a unique identifier and a data volume;
for each piece of data to be processed, creating a target processing queue according to the data volume of the data to be processed and the computing capability of the current server; the target processing queue is used for processing the data to be processed; the computing capability of the server is determined according to the central processing unit utilization and the memory utilization;
and processing the data to be processed by utilizing the target processing queue.
Optionally, the acquiring an information set of the data to be processed includes:
reading a data pool configuration; wherein the data pool configuration comprises: a data source usage identifier and a related table; the data source comprises a data connection and a driver configuration; the related table is the name of a table selected by the user;
loading a driver for accessing a database according to the driver configuration;
and loading the data to be processed in batches in the database to obtain an information set of the data to be processed.
Optionally, the processing the data to be processed by using the target processing queue includes:
acquiring an execution configuration of the data to be processed; wherein the execution configuration comprises: configuration information related to the process of processing the data to be processed;
if the execution configuration indicates that the interaction with a third-party system is involved in the process of processing the data to be processed, scheduling a processing program related to the third-party system to cooperatively process the data to be processed;
and if the execution configuration indicates that the interaction with a third-party system is not involved in the process of processing the data to be processed, directly utilizing the target processing queue to process the data to be processed.
Optionally, after the processing the data to be processed by using the target processing queue, the method further includes:
monitoring the utilization rate of a central processing unit and the utilization rate of a memory in real time;
and if the monitored central processing unit utilization does not exceed a first threshold and the memory utilization does not exceed a second threshold, adding a processing queue for processing the data to be processed.
Optionally, after the processing the data to be processed by using the target processing queue, the method further includes:
when the processing of a piece of data to be processed is finished, removing the piece of data from the target processing queue corresponding to it;
and persisting the processed data according to a user-customized persistence mode to obtain a persistence processing result.
A second aspect of the present application provides an apparatus for processing data, including:
a first acquisition unit, configured to acquire an information set of data to be processed; wherein the information set of the data to be processed comprises information of at least one piece of data to be processed; the information of the data to be processed comprises a unique identifier and a data volume;
a creating unit, configured to create, for each piece of data to be processed, a target processing queue according to the data volume of the data to be processed and the computing capability of the current server; the target processing queue is used for processing the data to be processed; the computing capability of the server is determined according to the central processing unit utilization and the memory utilization;
and the processing unit is used for processing the data to be processed by utilizing the target processing queue.
Optionally, the first obtaining unit includes:
a reading unit, configured to read a data pool configuration; wherein the data pool configuration comprises: a data source usage identifier and a related table; the data source comprises a data connection and a driver configuration; the related table is the name of a table selected by the user;
the first loading unit is used for loading a driver for accessing the database according to the driver configuration;
and the second loading unit is used for loading the data to be processed in batches in the database to obtain an information set of the data to be processed.
Optionally, the processing unit includes:
a second acquisition unit, configured to acquire an execution configuration of the data to be processed; wherein the execution configuration comprises: configuration information related to the process of processing the data to be processed;
the scheduling unit is used for scheduling a processing program related to a third-party system to cooperatively process the data to be processed if the execution configuration indicates that the interaction with the third-party system is involved in the process of processing the data to be processed;
and the processing subunit is configured to, if the execution configuration indicates that no interaction with a third-party system is involved in the process of processing the to-be-processed data, directly utilize the target processing queue to process the to-be-processed data.
Optionally, the data processing apparatus further includes:
the monitoring unit is used for monitoring the utilization rate of the central processing unit and the utilization rate of the memory in real time;
and an adding unit, configured to add a processing queue for processing the data to be processed if the monitored central processing unit utilization does not exceed a first threshold and the memory utilization does not exceed a second threshold.
Optionally, the data processing apparatus further includes:
an exiting unit, configured to remove a piece of data to be processed from the target processing queue corresponding to it when its processing is finished;
and a persistence processing unit, configured to persist the processed data according to a user-customized persistence mode to obtain a persistence processing result.
A third aspect of the present application provides an electronic device comprising:
one or more processors;
a storage device having one or more programs stored thereon;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement a method of processing data as described in any of the first aspects.
A fourth aspect of the present application provides a computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements a method of processing data as set forth in any one of the first aspects.
As can be seen from the above aspects, the present application provides a data processing method, an apparatus, an electronic device, and a computer storage medium, where the data processing method includes: first, acquiring an information set of data to be processed; the information set comprises information of at least one piece of data to be processed, and the information of each piece of data to be processed comprises a unique identifier and a data volume; then, for each piece of data to be processed, creating a target processing queue according to the data volume of the data to be processed and the computing capability of the current server; the target processing queue is used for processing the data to be processed, and the computing capability of the server is determined according to the central processing unit utilization and the memory utilization; and finally, processing the data to be processed by using the target processing queue. In this way, the computing resources of the hardware are fully utilized, the data processing capability is improved, the processing time is reduced, and the overall efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a detailed flowchart of a data processing method according to an embodiment of the present application;
fig. 2 is a flowchart of a data processing method according to another embodiment of the present application;
fig. 3 is a flowchart of a data processing method according to another embodiment of the present application;
fig. 4 is a flowchart of a data processing method according to another embodiment of the present application;
fig. 5 is a flowchart of a data processing method according to another embodiment of the present application;
fig. 6 is a schematic diagram of a data processing apparatus according to another embodiment of the present application;
fig. 7 is a schematic view of an electronic device implementing a data processing method according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present application are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this application are intended to be illustrative rather than limiting, and those skilled in the art will understand them to mean "one or more" unless the context clearly indicates otherwise.
An embodiment of the present application provides a data processing method, as shown in fig. 1, which specifically includes the following steps:
s101, acquiring an information set of data to be processed.
The information set of the data to be processed comprises information of at least one piece of data to be processed; the information of each piece of data to be processed comprises a unique identifier and a data volume.
Optionally, in another embodiment of the present application, an implementation manner of step S101, as shown in fig. 2, includes:
s201, reading data pool configuration.
Wherein, the data pool configuration comprises: a data source usage identifier and a related table; the data source comprises a data connection and a driver configuration; the related table is the name of a table selected by the user.
S202, loading a driver for accessing the database according to the driver configuration.
It should be noted that the database may be, but is not limited to, Oracle, MySQL, etc., and is not limited herein.
S203, loading the data to be processed in batches in the database to obtain an information set of the data to be processed.
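By way of non-limiting illustration only, the following Python sketch shows one possible way to carry out steps S201 to S203 under assumed conventions: the data pool configuration is represented as a dictionary whose driver, connection, and table keys, as well as the id and data_size column names, are hypothetical names introduced for this example and are not prescribed by the present application; any DB-API 2.0 driver module (for example pymysql for MySQL or cx_Oracle for Oracle) may be loaded by name.

import importlib

def load_pending_info_set(pool_cfg, batch_size=1000):
    """Read the data-pool configuration, load the configured database
    driver, and batch-load the information set of data to be processed."""
    # Assumed (hypothetical) configuration layout:
    # pool_cfg = {"driver": "pymysql",                       # DB-API module name
    #             "connection": {"host": "...", "user": "...",
    #                            "password": "...", "database": "..."},
    #             "table": "pending_jobs"}                   # table selected by the user
    driver = importlib.import_module(pool_cfg["driver"])      # S202: load driver per config
    conn = driver.connect(**pool_cfg["connection"])           # DB-API 2.0 connect()
    info_set = []
    try:
        cur = conn.cursor()
        # S203: "id" and "data_size" are illustrative column names.
        cur.execute(f"SELECT id, data_size FROM {pool_cfg['table']}")
        while True:
            rows = cur.fetchmany(batch_size)                  # load in batches
            if not rows:
                break
            info_set.extend({"id": r[0], "size": r[1]} for r in rows)
    finally:
        conn.close()
    return info_set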
S102, for each piece of data to be processed, creating a target processing queue according to the data volume of the data to be processed and the computing capability of the current server.
The target processing queue is used for processing the data to be processed; the computing capability of the server is determined according to the central processing unit utilization and the memory utilization.
It should be noted that, in a specific implementation process of the present application, one target processing queue corresponds to one execution thread for processing data, and the to-be-processed data in the target processing queue is processed through the execution thread.
Specifically, the size of the target processing queue to be created is determined according to the data volume of the data to be processed and the current central processing unit utilization and memory utilization, and the target processing queue is then created. Through elastic computing-power calculation, dynamic expansion of concurrent multi-task processing, and automatic calculation of the processing concurrency, the overall processing capability is improved, the overall processing time is shortened, and the performance of the data processing system is improved.
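As a non-limiting sketch of such elastic sizing, the fragment below derives a worker count from the data volume and from the spare CPU and memory capacity of the server, using the third-party psutil package and a thread pool; the max_workers and chunk_size parameters and the sizing formula itself are assumptions made for illustration, as the present application does not prescribe a particular formula. Each worker of the returned pool plays the role of the execution thread associated with a target processing queue mentioned above.

import math
from concurrent.futures import ThreadPoolExecutor

import psutil  # third-party package for CPU and memory utilization readings

def create_target_queue(data_size, max_workers=32, chunk_size=10_000):
    """Create a target processing queue (worker pool) sized from the data
    volume and the server's spare CPU/memory capacity."""
    cpu_used = psutil.cpu_percent(interval=0.5) / 100.0   # current CPU utilization
    mem_used = psutil.virtual_memory().percent / 100.0    # current memory utilization
    headroom = max(0.0, 1.0 - max(cpu_used, mem_used))    # spare capacity in [0, 1]
    wanted = max(1, math.ceil(data_size / chunk_size))    # workers implied by data volume
    allowed = max(1, int(max_workers * headroom))         # workers the server can spare
    return ThreadPoolExecutor(max_workers=min(wanted, allowed))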
And S103, processing the data to be processed by using the target processing queue.
Optionally, in another embodiment of the present application, an implementation manner of step S103, as shown in fig. 3, includes:
s301, acquiring execution configuration of data to be processed.
Wherein, the execution configuration comprises: configuration information related to the process of processing the data to be processed.
S302, if the execution configuration shows that the interaction with a third-party system is involved in the process of processing the data to be processed, scheduling a processing program related to the third-party system to cooperatively process the data to be processed.
And S303, if the execution configuration shows that the interaction with a third-party system is not involved in the process of processing the data to be processed, directly processing the data to be processed by using the target processing queue.
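A minimal illustrative sketch of this dispatch logic (steps S301 to S303) follows; the needs_third_party, third_party_handler, and local_handler keys of the execution configuration are hypothetical names introduced only for this example, and queue is assumed to be a thread-pool-style executor such as the one created in the previous sketch.

def process(data, exec_cfg, queue):
    """Dispatch one piece of data to be processed according to its
    execution configuration (cf. S301-S303)."""
    if exec_cfg.get("needs_third_party"):            # S302: third-party interaction needed
        handler = exec_cfg["third_party_handler"]    # cooperating handler (assumed callable)
    else:                                            # S303: no external interaction
        handler = exec_cfg["local_handler"]          # process directly on the target queue
    return queue.submit(handler, data)               # queue: a ThreadPoolExecutor-like object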
Optionally, in another embodiment of the present application, after processing the data to be processed by using the target processing queue, an implementation manner of the data processing method, as shown in fig. 4, further includes:
s401, monitoring the utilization rate of a central processing unit and the utilization rate of a memory in real time.
S402, if the utilization rate of the central processing unit does not exceed a first threshold value and the utilization rate of the memory does not exceed a second threshold value, adding a processing queue for processing to-be-processed data.
The first threshold and the second threshold may be preset and modified by a technician or an authorized person, and are not limited herein.
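The following sketch illustrates one possible monitoring loop for steps S401 and S402, again assuming psutil for the utilization readings; the threshold values, polling interval, and the cap on additional queues are illustrative assumptions rather than values prescribed by the present application.

import time

import psutil  # third-party package for CPU and memory utilization readings

def autoscale_queues(add_queue, cpu_limit=70.0, mem_limit=80.0,
                     interval=5.0, max_extra=8):
    """Monitor CPU and memory utilization in real time (S401) and add a
    processing queue while both stay below their thresholds (S402)."""
    added = 0
    while added < max_extra:
        cpu = psutil.cpu_percent(interval=interval)   # averaged over the polling interval
        mem = psutil.virtual_memory().percent
        if cpu <= cpu_limit and mem <= mem_limit:     # spare capacity -> add a queue
            add_queue()                               # caller-supplied queue factory
            added += 1
        else:
            time.sleep(interval)                      # back off while the server is busy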
Optionally, in another embodiment of the present application, after processing the data to be processed by using the target processing queue, as shown in fig. 5, an implementation manner of the data processing method further includes:
and S501, when each piece of data to be processed is finished, quitting the target processing queue corresponding to the data to be processed.
And S502, according to a persistence mode customized by a user, performing persistence processing on the to-be-processed data after the processing is finished to obtain a persistence processing result.
Wherein, the user-customized persistence mode may be, but is not limited to, pushing a result that contains the data's unique identifier and state into an independent table, or writing the state into a state column of the source data pool table, and the like; this is not limited herein.
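The snippet below sketches both persistence modes under assumed names: either an independent result table with id and state columns, or a state column on the source data-pool table. It assumes a DB-API connection whose driver uses the "format" parameter style (for example pymysql); the layout of the mode dictionary is hypothetical.

def persist_result(conn, item_id, state, mode):
    """Persist one finished item according to a user-defined mode (cf. S502)."""
    # Assumed (hypothetical) mode layouts:
    #   {"kind": "independent_table", "table": "job_result"}
    #   {"kind": "state_column", "table": "pending_jobs", "column": "state"}
    cur = conn.cursor()
    if mode["kind"] == "independent_table":
        # Push unique identifier + state into a separate result table.
        cur.execute(f"INSERT INTO {mode['table']} (id, state) VALUES (%s, %s)",
                    (item_id, state))
    else:
        # Write the state back to a state column of the source data-pool table.
        cur.execute(f"UPDATE {mode['table']} SET {mode['column']} = %s WHERE id = %s",
                    (state, item_id))
    conn.commit()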
As can be seen from the above, the present application provides a data processing method: first, an information set of data to be processed is acquired; the information set comprises information of at least one piece of data to be processed, and the information of each piece of data to be processed comprises a unique identifier and a data volume; then, for each piece of data to be processed, a target processing queue is created according to the data volume of the data to be processed and the computing capability of the current server; the target processing queue is used for processing the data to be processed, and the computing capability of the server is determined according to the central processing unit utilization and the memory utilization; finally, the data to be processed is processed by using the target processing queue. In this way, the computing resources of the hardware are fully utilized, the data processing capability is improved, the processing time is reduced, and the overall efficiency is improved.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The names of messages or information exchanged between a plurality of devices in the embodiments of the present application are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Python, Java, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Another embodiment of the present application provides a data processing apparatus, as shown in fig. 6, specifically including:
a first obtaining unit 601, configured to obtain an information set of data to be processed.
The information set of the data to be processed comprises information of at least one piece of data to be processed; the information of each piece of data to be processed comprises a unique identifier and a data volume.
The creating unit 602 is configured to create, for each piece of data to be processed, a target processing queue according to the data volume of the data to be processed and the computing capability of the current server.
The target processing queue is used for processing the data to be processed; the computing capability of the server is determined according to the central processing unit utilization and the memory utilization.
A processing unit 603, configured to process the data to be processed by using the target processing queue.
For a specific working process of the unit disclosed in the above embodiment of the present application, reference may be made to the content of the corresponding method embodiment, as shown in fig. 1, which is not described herein again.
Optionally, in another embodiment of the present application, an implementation manner of the first obtaining unit 601 includes:
and the reading unit is used for reading the configuration of the data pool.
Wherein, the data pool configuration comprises: a data source usage identifier and a related table; the data source comprises a data connection and a driver configuration; the related table is the name of a table selected by the user.
And the first loading unit is used for loading the driver for accessing the database according to the driver configuration.
And the second loading unit is used for loading the data to be processed in batches in the database to obtain an information set of the data to be processed.
For a specific working process of the unit disclosed in the above embodiment of the present application, reference may be made to the content of the corresponding method embodiment, as shown in fig. 2, which is not described herein again.
Optionally, in another embodiment of the present application, an implementation manner of the processing unit 603 includes:
and the second acquisition unit is used for acquiring the execution configuration of the data to be processed.
Wherein, the execution configuration comprises: configuration information related to the process of processing the data to be processed.
And the scheduling unit is used for scheduling the processing program related to the third-party system to cooperatively process the data to be processed if the execution configuration indicates that the interaction with the third-party system is involved in the process of processing the data to be processed.
And the processing subunit is used for directly processing the data to be processed by using the target processing queue if the execution configuration indicates that the interaction with a third-party system is not involved in the process of processing the data to be processed.
For a specific working process of the unit disclosed in the above embodiment of the present application, reference may be made to the content of the corresponding method embodiment, as shown in fig. 3, which is not described herein again.
Optionally, in another embodiment of the present application, an implementation manner of the data processing apparatus further includes:
and the monitoring unit is used for monitoring the utilization rate of the central processing unit and the utilization rate of the memory in real time.
And the adding unit is used for adding a processing queue for processing the data to be processed if the utilization rate of the central processing unit does not exceed the first threshold value and the utilization rate of the memory does not exceed the second threshold value.
For a specific working process of the unit disclosed in the above embodiment of the present application, reference may be made to the content of the corresponding method embodiment, as shown in fig. 4, which is not described herein again.
Optionally, in another embodiment of the present application, an implementation manner of the data processing apparatus further includes:
and the quitting unit is used for quitting the target processing queue corresponding to the data to be processed when the data to be processed is processed.
And the persistence processing unit is used for performing persistence processing on the data to be processed after the processing according to a persistence mode customized by a user to obtain a persistence processing result.
For the specific working process of the units disclosed in the above embodiments of the present application, reference may be made to the content of the corresponding method embodiment, as shown in fig. 5, which is not described herein again.
As can be seen from the above, the present application provides a data processing apparatus: first, the first acquisition unit 601 acquires an information set of data to be processed; the information set comprises information of at least one piece of data to be processed, and the information of each piece of data to be processed comprises a unique identifier and a data volume; then, for each piece of data to be processed, the creating unit 602 creates a target processing queue according to the data volume of the data to be processed and the computing capability of the current server; the target processing queue is used for processing the data to be processed, and the computing capability of the server is determined according to the central processing unit utilization and the memory utilization; finally, the processing unit 603 processes the data to be processed by using the target processing queue. In this way, the computing resources of the hardware are fully utilized, the data processing capability is improved, the processing time is reduced, and the overall efficiency is improved.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
Another embodiment of the present application provides an electronic device, as shown in fig. 7, including:
one or more processors 701.
A storage 702 having one or more programs stored thereon.
The one or more programs, when executed by the one or more processors 701, cause the one or more processors 701 to implement a method of processing data as in any one of the above embodiments.
Another embodiment of the present application provides a computer storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements a method for processing data as in any one of the above embodiments.
In the context of this application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the computer readable medium mentioned above in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
Another embodiment of the present application provides a computer program product for performing the method of processing data of any one of the above when the computer program product is executed.
In particular, according to embodiments of the application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means, or installed from a storage means, or installed from a ROM. The computer program, when executed by a processing device, performs the above-described functions defined in the method of the embodiments of the present application.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
While several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the application. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
The foregoing description is only exemplary of the preferred embodiments of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the application referred to in the present application is not limited to the embodiments with a particular combination of the above-mentioned features, but also encompasses other embodiments with any combination of the above-mentioned features or their equivalents without departing from the scope of the application. For example, the above features may be replaced with (but not limited to) features having similar functions as those described in this application.

Claims (10)

1. A method of processing data, comprising:
acquiring an information set of data to be processed; wherein the information set of the data to be processed comprises information of at least one piece of data to be processed; the information of the data to be processed comprises a unique identifier and a data volume;
for each piece of the data to be processed, creating a target processing queue according to the data volume of the data to be processed and the computing capability of the current server; the target processing queue is used for processing the data to be processed; the computing capability of the server is determined according to the central processing unit utilization and the memory utilization;
and processing the data to be processed by utilizing the target processing queue.
2. The processing method according to claim 1, wherein the obtaining of the information set of the data to be processed comprises:
reading a data pool configuration; wherein the data pool configuration comprises: a data source usage identifier and a related table; the data source comprises a data connection and a driver configuration; the related table is the name of a table selected by the user;
loading a driver for accessing a database according to the driver configuration;
and loading the data to be processed in batches in the database to obtain an information set of the data to be processed.
3. The processing method according to claim 1, wherein said processing the data to be processed by using the target processing queue comprises:
acquiring an execution configuration of the data to be processed; wherein the execution configuration comprises: configuration information related to the process of processing the data to be processed;
if the execution configuration indicates that the interaction with a third-party system is involved in the process of processing the data to be processed, scheduling a processing program related to the third-party system to cooperatively process the data to be processed;
and if the execution configuration indicates that the interaction with a third-party system is not involved in the process of processing the data to be processed, directly utilizing the target processing queue to process the data to be processed.
4. The processing method according to claim 1, wherein after processing the data to be processed by using the target processing queue, further comprising:
monitoring the utilization rate of a central processing unit and the utilization rate of a memory in real time;
and if the monitored central processing unit utilization does not exceed a first threshold and the memory utilization does not exceed a second threshold, adding a processing queue for processing the data to be processed.
5. The processing method according to claim 1, wherein after processing the data to be processed by using the target processing queue, the processing method further comprises:
when the processing of a piece of data to be processed is finished, removing the piece of data from the target processing queue corresponding to it;
and persisting the processed data according to a user-customized persistence mode to obtain a persistence processing result.
6. An apparatus for processing data, comprising:
a first acquisition unit, configured to acquire an information set of data to be processed; wherein the information set of the data to be processed comprises information of at least one piece of data to be processed; the information of the data to be processed comprises a unique identifier and a data volume;
a creating unit, configured to create, for each piece of data to be processed, a target processing queue according to the data volume of the data to be processed and the computing capability of the current server; the target processing queue is used for processing the data to be processed; the computing capability of the server is determined according to the central processing unit utilization and the memory utilization;
and the processing unit is used for processing the data to be processed by utilizing the target processing queue.
7. The processing apparatus according to claim 6, wherein the first obtaining unit includes:
a reading unit, configured to read a data pool configuration; wherein the data pool configuration comprises: a data source usage identifier and a related table; the data source comprises a data connection and a driver configuration; the related table is the name of a table selected by the user;
the first loading unit is used for loading a driver for accessing the database according to the driver configuration;
and the second loading unit is used for loading the data to be processed in batches in the database to obtain an information set of the data to be processed.
8. The processing apparatus according to claim 6, wherein the processing unit comprises:
a second acquisition unit, configured to acquire an execution configuration of the data to be processed; wherein the execution configuration comprises: configuration information related to the process of processing the data to be processed;
the scheduling unit is used for scheduling a processing program related to a third-party system to cooperatively process the data to be processed if the execution configuration indicates that the interaction with the third-party system is involved in the process of processing the data to be processed;
and the processing subunit is configured to, if the execution configuration indicates that interaction with a third-party system is not involved in the process of processing the to-be-processed data, directly utilize the target processing queue to process the to-be-processed data.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement a method of processing data as recited in any of claims 1 to 5.
10. A computer storage medium, characterized in that a computer program is stored thereon, wherein the computer program, when executed by a processor, implements a method of processing data according to any one of claims 1 to 5.
CN202211186747.1A 2022-05-17 2022-09-27 Data processing method and device, electronic equipment and computer storage medium Pending CN115562859A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210536360 2022-05-17
CN2022105363608 2022-05-17

Publications (1)

Publication Number Publication Date
CN115562859A true CN115562859A (en) 2023-01-03

Family

ID=84742082

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211186747.1A Pending CN115562859A (en) 2022-05-17 2022-09-27 Data processing method and device, electronic equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN115562859A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115964181A (en) * 2023-03-10 2023-04-14 之江实验室 Data processing method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
US11762697B2 (en) Method and apparatus for scheduling resource for deep learning framework
EP4113299A2 (en) Task processing method and device, and electronic device
CN110825436B (en) Calculation method applied to artificial intelligence chip and artificial intelligence chip
CN115562859A (en) Data processing method and device, electronic equipment and computer storage medium
CN115357350A (en) Task configuration method and device, electronic equipment and computer readable medium
CN116627333A (en) Log caching method and device, electronic equipment and computer readable storage medium
CN110659340A (en) Electronic fence generation method, device, medium and electronic equipment
CN112182111A (en) Block chain based distributed system layered processing method and electronic equipment
US9671779B2 (en) Method and system for filtering lot schedules using a previous schedule
CN113760494B (en) Task scheduling method and device
CN109697592B (en) Goods source off-shelf method, system, equipment and storage medium based on annular array
CN109255641B (en) Business object processing method and device
CN111784295A (en) Flight validation method and device
CN112965827B (en) Information scheduling method and device, electronic equipment and computer medium
CN110908886A (en) Data sending method and device, electronic equipment and storage medium
CN113778711B (en) Event processing method and device, electronic equipment and storage medium
CN110262756B (en) Method and device for caching data
CN114996169B (en) Device diagnosis method, device, electronic device, and storage medium
CN114257598B (en) Resource downloading method and device, storage medium and electronic equipment
CN115374320B (en) Text matching method and device, electronic equipment and computer medium
CN113204497A (en) Test case generation method and device, electronic equipment and computer storage medium
CN117667456A (en) Message sending management method, device, equipment and storage medium
CN112953810A (en) Network request processing method and device
CN115952088A (en) Memory leak detection method, device, equipment and storage medium
CN114253826A (en) Memory leak determination method and device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination