CN111178540A - Training data transmission method, device, equipment and medium - Google Patents

Training data transmission method, device, equipment and medium

Info

Publication number
CN111178540A
CN111178540A (application CN201911386514.4A)
Authority
CN
China
Prior art keywords
training data
data
current
compressed data
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911386514.4A
Other languages
Chinese (zh)
Inventor
Zhao Xudong (赵旭东)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inspur Beijing Electronic Information Industry Co Ltd
Original Assignee
Inspur Beijing Electronic Information Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inspur Beijing Electronic Information Industry Co Ltd filed Critical Inspur Beijing Electronic Information Industry Co Ltd
Priority to CN201911386514.4A
Publication of CN111178540A
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 - Multiprogramming arrangements
    • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5061 - Partitioning or combining of resources
    • G06F 9/5072 - Grid computing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)

Abstract

The invention discloses a method for transmitting training data, comprising the following steps: monitoring, in a node, a current monitoring value that reflects the current training condition, and judging whether the current monitoring value is greater than a preset value; if yes, compressing the obtained current training data into compressed data of a preset type; and establishing communication with a destination node and sending the compressed data to the destination node. After the calculation is finished, a decompression operation is performed and the model training process continues. Compressing the current training data into compressed data of a preset type therefore reduces the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources and ensuring both accuracy and efficiency of data transmission. The invention further provides a transmission apparatus, a device and a storage medium for training data corresponding to the method.

Description

Training data transmission method, device, equipment and medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method, an apparatus, a device, and a medium for transmitting training data.
Background
At present, deep learning models are widely applied in various fields, such as computer vision, recommendation systems and natural language processing. To achieve better training results for deep learning models, model parameter counts have reached the order of one billion (10^9). In view of this, researchers distribute the training process of a deep learning model across multiple computing nodes using a distributed computing framework, so that the computing nodes compute in parallel and communicate during training to keep the gradient data consistent in the back-propagation process. Current distributed computing frameworks support computation on data types such as int32, float32 and float16.
In a cloud computing environment, a 25 Gb network is typically employed. In the prior art, when transmitting training data, in order to relieve network resource pressure and ensure normal transmission, the float16 type, whose floating-point representation occupies fewer bits, is generally chosen for deep learning model training, and the resulting float16 training data is transmitted, reducing the amount of communicated data and thereby relieving network resource pressure.
However, as the number of layers of deep learning models increases, the parameter scale grows ever larger, and large-scale training generates a large amount of communication data even with float16; the resulting shortage of communication bandwidth becomes the bottleneck of model training, so the problem is not fundamentally solved.
Disclosure of Invention
The invention aims to provide a method, apparatus, device and medium for transmitting training data that compress the current training data into compressed data of a preset type, thereby reducing the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources, and ensuring both accuracy and efficiency of data transmission.
In order to solve the above technical problem, the present invention provides a method for transmitting training data, including:
monitoring, in a node, a current monitoring value that reflects the current training condition, and judging whether the current monitoring value is greater than a preset value;
if yes, compressing the obtained current training data into preset type compressed data;
and establishing communication with a destination node, and sending the compressed data to the destination node.
Preferably, the current monitoring value is the current time interval between communications or the current amount of data placed into a buffer.
Preferably, the compressing the obtained current training data into the preset type of compressed data specifically includes:
and compressing the obtained current training data into preset type compressed data according to a compression algorithm.
Preferably, the preset type is specifically an int8 data type.
Preferably, the method further comprises the following steps:
judging whether compressed data sent by other nodes is received or not;
and if so, decompressing the compressed data according to a decompression algorithm corresponding to the compression algorithm.
Preferably, the method further comprises the following steps:
detecting whether the communication with a destination node is normal;
if not, feeding back exception information indicating a communication failure.
Preferably, the method further comprises the following steps:
and notifying operation and maintenance personnel to handle the exception information according to their pre-stored contact information.
In order to solve the above technical problem, the present invention further provides a transmission device for training data, including:
the monitoring module is used for monitoring a current monitoring value in the node that reflects the current training condition and judging whether the current monitoring value is greater than a preset value; if yes, triggering the compression module;
the compression module is used for compressing the obtained current training data into preset type of compressed data;
and the sending module is used for establishing communication with a destination node and sending the compressed data to the destination node.
Preferably, the method further comprises the following steps:
the judging module is used for judging whether compressed data sent by other nodes is received or not; and if so, decompressing the compressed data according to a decompression algorithm corresponding to the compression algorithm.
Preferably, the method further comprises the following steps:
the detection module is used for detecting whether the communication with the destination node is normal; if not, feeding back exception information indicating a communication failure.
Preferably, the method further comprises the following steps:
and the notification module is used for notifying operation and maintenance personnel to handle the exception information according to their pre-stored contact information.
In order to solve the above technical problem, the present invention further provides a transmission device for training data, including a memory for storing a computer program;
a processor for implementing the steps of the method of transmission of training data according to any one of the preceding claims when executing said computer program.
In order to solve the above technical problem, the present invention further provides a computer-readable storage medium, having a computer program stored thereon, where the computer program is executed by a processor to implement the steps of the transmission method of training data according to any one of the above embodiments.
The invention provides a method for transmitting training data, comprising the following steps: monitoring, in a node, a current monitoring value that reflects the current training condition, and judging whether the current monitoring value is greater than a preset value; if yes, compressing the obtained current training data into compressed data of a preset type; and establishing communication with the destination node and sending the compressed data to the destination node. Compressing the current training data into compressed data of a preset type therefore reduces the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources and ensuring both accuracy and efficiency of data transmission.
In addition, the transmission device, the equipment and the storage medium of the training data provided by the invention correspond to the method, and have the same beneficial effects.
Drawings
In order to illustrate the embodiments of the present invention more clearly, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings can be obtained by those skilled in the art without inventive effort.
Fig. 1 is a flowchart of a training data transmission method according to an embodiment of the present invention;
fig. 2 is a flowchart of another training data transmission method according to an embodiment of the present invention;
fig. 3 is a structural diagram of a transmission device for training data according to an embodiment of the present invention;
fig. 4 is a structural diagram of a transmission device for training data according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without any creative work belong to the protection scope of the present invention.
The core of the invention is to provide a method, apparatus, device and medium for transmitting training data that compress the current training data into compressed data of a preset type, thereby reducing the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources, and ensuring both accuracy and efficiency of data transmission.
In order that those skilled in the art will better understand the disclosure, the invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flowchart of a training data transmission method according to an embodiment of the present invention; as shown in fig. 1, a method for transmitting training data according to an embodiment of the present invention includes steps S101 to S103:
step S101: the monitoring node is used for reflecting a current monitoring value of a current training condition and judging whether the current monitoring value is larger than a preset value or not; if yes, go to step S102;
In an embodiment, the current monitoring value reflecting the current training condition in the node may be monitored in real time or periodically. Those skilled in the art can determine the specific monitoring mode according to the actual application; the embodiment of the present invention is not limited in this respect.
Further, the current monitoring value is specifically the current time interval or the current amount of data placed into the buffer. It should be noted that, if the current monitoring value is the time interval between two data communications, the preset value is a preset time value, and whether the current training data needs to be compressed and transmitted is determined by judging whether the time elapsed since the last data communication is greater than the preset time value. If the current monitoring value is the amount of data currently placed into the buffer, the preset value is a preset buffer-size threshold, and whether the current training data needs to be compressed and transmitted is determined by judging whether the current amount of training data is greater than that threshold. Those skilled in the art can set the preset value reasonably according to the actual application; the embodiment of the present invention is not limited in this respect.
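The trigger described above can be sketched as follows; this is a minimal illustration, and the class name, default thresholds and method names are assumptions rather than anything specified in the patent:

```python
import time

class CommTrigger:
    """Decide when buffered training data should be compressed and sent.

    Hypothetical sketch: the text only says the trigger is either the time
    elapsed since the last communication or the amount of data placed into
    the buffer; names and default values here are illustrative.
    """

    def __init__(self, max_interval_s=1.0, max_buffer_bytes=4 << 20):
        self.max_interval_s = max_interval_s      # preset time value
        self.max_buffer_bytes = max_buffer_bytes  # preset buffer-size threshold
        self.last_send = time.monotonic()
        self.buffered = 0

    def add(self, nbytes):
        # Account for newly buffered gradient bytes.
        self.buffered += nbytes

    def should_send(self):
        # Fire when either current monitoring value exceeds its preset value.
        return (time.monotonic() - self.last_send > self.max_interval_s
                or self.buffered > self.max_buffer_bytes)

    def reset(self):
        # Called after a successful compressed transmission.
        self.last_send = time.monotonic()
        self.buffered = 0
```

Either condition alone is enough to trigger compression and transmission, matching the two alternatives described above.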
Step S102: compressing the obtained current training data into preset type compressed data;
step S103: and establishing communication with the destination node, and sending the compressed data to the destination node.
In a specific implementation, if the current monitoring value is greater than the preset value, the obtained current training data is compressed to obtain compressed data of the preset type. The preset type may be int32, float32, float16, int8, or the like. It should be noted that those skilled in the art may preset the compressed-data type according to actual needs so that, on the premise of ensuring the training precision of the deep learning model, the amount of transmitted data is reduced as much as possible.
In one embodiment, the preset type is specifically the int8 data type, which reduces the amount of data to be transmitted while keeping the reduction process simple. The Horovod distributed framework used in the prior art does not support gradient reduction over int8. In one embodiment, the int8 and uint8 parameter types are therefore added to the Allreduce function of the Horovod framework, and the MPI and NCCL function libraries are called to realize gradient reduction over int8 and uint8 data.
In a specific implementation, the obtained current training data may be compressed into compressed data of the preset type according to a compression algorithm. First, the scaling coefficient scale used in the compression process is calculated by the following formula:
scale = target / (N · (Max - Min) / 2)
where target is the upper bound of the integer range obtained after scaling; for example, when the preset type occupies 8 bits, target = 2^7 - 1 = 127. Max and Min are, respectively, the maximum and minimum values in the training data that needs to be communicated, and N is the number of computing nodes used for data parallelism in the model training process.
And calculating compressed data corresponding to the current training data through the scaling coefficient. Specifically, the following formula:
A_quantized = round((A_float - Mid) · scale);
Mid = (Max + Min) / 2;
where A_float is the training data that needs to be communicated, A_quantized is the compressed data after compression, Mid is the midpoint of the range of the training data, and round is the rounding operation.
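A minimal sketch of these compression formulas, assuming the reconstructed definitions of scale and Mid above (the function name and signature are illustrative, not from the patent):

```python
import numpy as np

def compress_to_int8(a_float, num_nodes):
    """Quantize float training data to int8 following the formulas above.

    A sketch under the reconstructed scale definition: dividing by
    num_nodes keeps the sum of the quantized values contributed by N
    nodes within the int8 range after reduction.
    """
    target = 2 ** 7 - 1                       # upper bound for 8-bit data
    mx, mn = float(a_float.max()), float(a_float.min())
    mid = (mx + mn) / 2.0                     # midpoint of the data range
    half_range = max((mx - mn) / 2.0, 1e-12)  # guard against constant data
    scale = target / (num_nodes * half_range)
    a_quantized = np.round((a_float - mid) * scale).astype(np.int8)
    return a_quantized, scale, mid
```

The scale and mid values must accompany the compressed data so that the destination node can invert the mapping.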
In one embodiment, the training data to be transmitted is compressed into compressed data, communication is established with a destination node, and a ring-reduce calculation is performed. It should be noted that there may be multiple destination nodes; those skilled in the art may determine which destination nodes need to be contacted according to the actual application. Sending the compressed data to the destination nodes realizes ring-reduce calculation across the different nodes, reducing the amount of data that needs to be transmitted and relieving the pressure on network bandwidth during multi-node training of the deep learning model.
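The ring-reduce pattern mentioned above can be illustrated with a toy single-process simulation; this sketches only the communication schedule, not the MPI/NCCL implementation the text refers to, and all names are assumptions:

```python
import numpy as np

def ring_allreduce_sum(vectors):
    """Toy simulation of ring-reduce (Allreduce-sum) over N logical nodes.

    Each "node" starts with a full gradient vector; after a scatter-reduce
    phase and an allgather phase, every node holds the element-wise sum.
    A didactic sketch of the ring schedule only.
    """
    n = len(vectors)
    segs = [list(np.array_split(np.asarray(v, dtype=np.float64), n))
            for v in vectors]
    # Scatter-reduce: in step s, node i receives segment (i - s - 1) mod n
    # from its left neighbour and accumulates it.
    for s in range(n - 1):
        for i in range(n):
            k = (i - s - 1) % n
            segs[i][k] = segs[i][k] + segs[(i - 1) % n][k]
    # Allgather: in step s, node i receives the fully reduced
    # segment (i - s) mod n from its left neighbour.
    for s in range(n - 1):
        for i in range(n):
            k = (i - s) % n
            segs[i][k] = segs[(i - 1) % n][k].copy()
    return [np.concatenate(sg) for sg in segs]
```

Each node sends and receives only 1/N of the vector per step, which is why the pattern scales well when combined with the int8 compression described above.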
The invention provides a method for transmitting training data, comprising the following steps: monitoring, in a node, a current monitoring value that reflects the current training condition, and judging whether the current monitoring value is greater than a preset value; if yes, compressing the obtained current training data into compressed data of a preset type; and establishing communication with the destination node and sending the compressed data to the destination node. Compressing the current training data into compressed data of a preset type therefore reduces the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources and ensuring both accuracy and efficiency of data transmission.
Fig. 2 is a flowchart of another training data transmission method according to an embodiment of the present invention; as shown in fig. 2, the transmission method of training data according to the embodiment of the present invention further includes steps S104 to S105:
step S104: judging whether compressed data sent by other nodes is received or not; if yes, go to step S105;
step S105: and decompressing the compressed data according to a decompression algorithm corresponding to the compression algorithm.
In one embodiment, every node used in multi-node training of the deep learning model sends compressed data and, at the same time, acts as a destination node receiving compressed data sent by other nodes, so that the aggregation of the compressed data is completed. When compressed data sent by another node is received, it must be decompressed. Specifically, the received compressed data is decompressed according to the decompression algorithm corresponding to the compression algorithm, by the following formula:
A_dequantized = A_quantized / scale + Mid
where A_dequantized is the data obtained by decompressing the received compressed data. For example, if the preset type is the int8 data type, the received compressed data is of type int8; the decompression algorithm restores it to the data type the training data had before compression, thereby ensuring the training precision of the deep learning model.
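A matching sketch of the decompression formula, under the same reconstructed definitions of scale and Mid (the function name and signature are assumptions):

```python
import numpy as np

def decompress_from_int8(a_quantized, scale, mid):
    """Inverse of the compression step: rescale the int8 values and shift
    them back by the range midpoint, recovering float training data."""
    return a_quantized.astype(np.float32) / scale + mid
```

Applied to the output of the compression sketch, this round-trips the data up to the quantization error of the rounding step.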
As shown in fig. 2, the transmission method of training data according to the embodiment of the present invention further includes steps S106 to S107:
step S106: detecting whether the communication with a destination node is normal; if not, go to step S107;
step S107: the feedback is used for indicating abnormal information of communication failure.
In one embodiment, whether communication with the destination node is normal is detected; specifically, communication between the local node and the destination node can be checked with a ping operation. When communication is normal, the compressed current training data can be sent to the destination node over the connection between the two nodes; when communication is abnormal, the compressed data cannot be transmitted, and the abnormality is signalled by feeding back exception information indicating a communication failure. Operation and maintenance personnel can then learn the current communication situation from the exception information and handle the abnormal situation.
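The ping-based detection can be sketched with the system ping utility; the helper name, flags and timeout are assumptions:

```python
import platform
import subprocess

def peer_reachable(host, timeout_s=2):
    """Check communication with a destination node via a single ping.

    A sketch: the text only says a ping operation may be used. Returns
    False on a non-zero exit code, a timeout, or a missing ping binary.
    """
    count_flag = "-n" if platform.system() == "Windows" else "-c"
    try:
        result = subprocess.run(
            ["ping", count_flag, "1", host],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
            timeout=timeout_s,
        )
        return result.returncode == 0
    except (subprocess.TimeoutExpired, FileNotFoundError):
        return False
```

A False result would trigger the exception feedback of step S107 instead of the transmission of step S103.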
As shown in fig. 2, the method for transmitting training data according to the embodiment of the present invention further includes step S108:
step S108: and informing the operation and maintenance personnel to process the abnormal information according to the pre-stored contact information of the operation and maintenance personnel.
Specifically, when communication with the destination node is found to be abnormal, operation and maintenance personnel can be notified to handle the exception information according to their pre-stored contact information, such as a mailbox or telephone number. The personnel can thus deal with the exception in time, preventing the training efficiency of the deep learning model from being affected by delayed handling.
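The notification step can be sketched with a plain e-mail as the stored contact method; every field value, host name and function name here is an illustrative assumption:

```python
import smtplib
from email.message import EmailMessage

def build_failure_alert(ops_email, detail):
    """Compose the exception notification for operations staff.

    Hypothetical sketch: the text only says personnel are notified via
    stored contact information such as a mailbox.
    """
    msg = EmailMessage()
    msg["Subject"] = "Training-data transmission failure"
    msg["From"] = "trainer@cluster.local"   # assumed sender address
    msg["To"] = ops_email
    msg.set_content(detail)
    return msg

def notify_ops(ops_email, detail, smtp_host="localhost"):
    # Sending requires a reachable SMTP relay; the host is an assumption.
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(build_failure_alert(ops_email, detail))
```

Separating message construction from delivery keeps the contact-information lookup and the transport mechanism independently replaceable (e.g. swapping e-mail for an SMS gateway).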
The invention also provides a transmission device of the training data and a corresponding embodiment of the transmission equipment of the training data. It should be noted that the present invention describes the embodiments from two perspectives, one is based on the functional module, and the other is based on the hardware.
Fig. 3 is a structural diagram of a transmission device for training data according to an embodiment of the present invention; as shown in fig. 3, an apparatus for transmitting training data according to an embodiment of the present invention includes:
the monitoring module 10 is used for monitoring a current monitoring value in the node that reflects the current training condition and judging whether the current monitoring value is greater than a preset value; if yes, triggering the compression module;
the compression module 11 is configured to compress the obtained current training data into preset type compressed data;
and a sending module 12, configured to establish communication with the destination node and send the compressed data to the destination node.
The transmission device for training data provided by the embodiment of the invention further comprises:
the judging module is used for judging whether compressed data sent by other nodes is received or not; if so, decompressing the compressed data according to a decompression algorithm corresponding to the compression algorithm.
The transmission device for training data provided by the embodiment of the invention further comprises:
the detection module is used for detecting whether the communication with the destination node is normal; if not, feeding back exception information indicating a communication failure.
The transmission device for training data provided by the embodiment of the invention further comprises:
and the notification module is used for notifying operation and maintenance personnel to handle the exception information according to their pre-stored contact information.
Since the embodiments of this section correspond to the embodiments of the method section, reference is made to the description of the embodiments of the method section for the embodiments of this section, and details are not repeated here.
The invention provides a transmission apparatus for training data that: monitors, in a node, a current monitoring value reflecting the current training condition and judges whether the current monitoring value is greater than a preset value; if yes, compresses the obtained current training data into compressed data of a preset type; and establishes communication with the destination node and sends the compressed data to the destination node. Compressing the current training data into compressed data of a preset type therefore reduces the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources and ensuring both accuracy and efficiency of data transmission.
Fig. 4 is a structural diagram of a transmission device for training data according to an embodiment of the present invention. As shown in fig. 4, the transmission apparatus of training data according to the embodiment of the present invention includes a memory 20 for storing a computer program;
a processor 21 for implementing the steps of the method for transmitting training data according to any one of the above when executing a computer program.
The processor 21 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 21 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 21 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 21 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 21 may further include an AI (Artificial Intelligence) processor for processing a calculation operation related to machine learning.
The memory 20 may include one or more computer-readable storage media, which may be non-transitory. Memory 20 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In this embodiment, the memory 20 is at least used for storing the following computer program 201, wherein after being loaded and executed by the processor 21, the computer program can implement relevant steps in the transmission method of training data disclosed in any of the foregoing embodiments. In addition, the resources stored in the memory 20 may also include an operating system 202, data 203, and the like, and the storage manner may be a transient storage manner or a permanent storage manner. Operating system 202 may include, among others, Windows, Unix, Linux, and the like.
In some embodiments, the apparatus for transmitting training data may further comprise an input/output interface 22, a communication interface 23, a power supply 24, and a communication bus 25.
Those skilled in the art will appreciate that the configuration shown in fig. 4 does not constitute a limitation of the transmission device of the training data and may include more or fewer components than those shown.
Since the embodiments of this section correspond to the embodiments of the method section, reference is made to the description of the embodiments of the method section for the embodiments of this section, and details are not repeated here. In some embodiments of the invention, the processor and memory may be connected by a bus or other means.
The transmission device for training data provided by the invention can implement the following method: monitoring, in a node, a current monitoring value that reflects the current training condition, and judging whether the current monitoring value is greater than a preset value; if yes, compressing the obtained current training data into compressed data of a preset type; and establishing communication with the destination node and sending the compressed data to the destination node. Compressing the current training data into compressed data of a preset type therefore reduces the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources and ensuring both accuracy and efficiency of data transmission.
Finally, the invention also provides a computer-readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the method for transmitting training data according to any one of the preceding claims.
It is to be understood that if the method in the above embodiments is implemented in the form of software functional units and sold or used as a stand-alone product, it can be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and performs all or part of the steps of the methods according to the embodiments of the present invention, or all or part of the technical solution. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The invention provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the following method: monitoring, in a node, a current monitoring value that reflects the current training condition, and judging whether the current monitoring value is greater than a preset value; if yes, compressing the obtained current training data into compressed data of a preset type; and establishing communication with the destination node and sending the compressed data to the destination node. Compressing the current training data into compressed data of a preset type therefore reduces the amount of data transmitted during communication while ensuring model training precision, relieving the pressure on network resources and ensuring both accuracy and efficiency of data transmission.
The method, device, equipment and medium for transmitting training data provided by the invention are described in detail above. The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.
It is further noted that, in this specification, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. The terms "comprises", "comprising", and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.

Claims (10)

1. A method for transmitting training data, comprising:
monitoring, in a node, a current monitoring value used for reflecting a current training condition, and judging whether the current monitoring value is larger than a preset value;
if yes, compressing the obtained current training data into compressed data of a preset type; and
establishing communication with a destination node, and sending the compressed data to the destination node.
2. The method for transmitting training data according to claim 1, wherein the current monitoring value is a current time interval or a current amount of data put into a buffer.
3. The method for transmitting training data according to claim 1, wherein the compressing the obtained current training data into a preset type of compressed data specifically comprises:
and compressing the obtained current training data into preset type compressed data according to a compression algorithm.
4. The method according to claim 3, wherein the preset type is the data type int8.
5. The method for transmitting training data according to claim 3, further comprising:
judging whether compressed data sent by other nodes is received or not;
and if so, decompressing the compressed data according to a decompression algorithm corresponding to the compression algorithm.
6. The method for transmitting training data according to claim 1, further comprising:
detecting whether the communication with the destination node is normal;
if not, feeding back abnormal information indicating a communication failure.
7. The method for transmitting training data according to claim 6, further comprising:
and notifying operation and maintenance personnel to handle the abnormal information, according to pre-stored contact information of the operation and maintenance personnel.
8. An apparatus for transmitting training data, comprising:
the monitoring module is used for monitoring, in a node, a current monitoring value used for reflecting the current training condition, and judging whether the current monitoring value is larger than a preset value; if yes, triggering the compression module;
the compression module is used for compressing the obtained current training data into preset type of compressed data;
and the sending module is used for establishing communication with a destination node and sending the compressed data to the destination node.
9. A transmission device for training data, comprising a memory for storing a computer program;
a processor for implementing the steps of the training data transmission method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the training data transmission method according to any one of claims 1 to 7.
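Claims 3 to 5 pair a compression algorithm (producing the preset int8 type) with a matching decompression algorithm. The patent does not specify which algorithm is used; the sketch below assumes a simple shared-scale quantizer, so the function names and the scheme itself are hypothetical, shown only to illustrate the compress/decompress round trip.

```python
def compress_int8(values):
    """Quantize float training data to int8 codes plus a shared scale.

    One possible compression algorithm; the patent only requires that the
    compressed data be of the preset type int8.
    """
    scale = max(abs(v) for v in values) or 1.0        # avoid dividing by zero
    codes = [round(v / scale * 127) for v in values]  # each code fits in int8
    return codes, scale

def decompress_int8(codes, scale):
    """The matching decompression algorithm (cf. claim 5): undo the quantization."""
    return [c * scale / 127 for c in codes]

gradients = [0.75, -0.5, 0.1]
codes, scale = compress_int8(gradients)
restored = decompress_int8(codes, scale)
# each restored value differs from the original by at most one quantization step
```

Sending the int8 codes plus one float scale transmits roughly a quarter of the bytes of the original float32 data, which is the data-volume reduction the summary describes, at the cost of a bounded quantization error.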
CN201911386514.4A 2019-12-29 2019-12-29 Training data transmission method, device, equipment and medium Pending CN111178540A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911386514.4A CN111178540A (en) 2019-12-29 2019-12-29 Training data transmission method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN111178540A true CN111178540A (en) 2020-05-19

Family

ID=70654177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911386514.4A Pending CN111178540A (en) 2019-12-29 2019-12-29 Training data transmission method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN111178540A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111651490A (en) * 2020-06-04 2020-09-11 深圳前海微众银行股份有限公司 Data screening method, device, equipment and computer storage medium
CN111865326A (en) * 2020-07-14 2020-10-30 北京灵汐科技有限公司 Data compression method, device, equipment and storage medium
CN113872947A (en) * 2021-09-15 2021-12-31 珠海格力电器股份有限公司 Data reporting method and device, electronic equipment and computer readable storage medium
WO2023242927A1 (en) * 2022-06-13 2023-12-21 日本電信電話株式会社 Data management device, data management method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101394402A (en) * 2008-10-13 2009-03-25 邓学锋 Method for fast code changing in large range to audio information to break virus
CN108510975A (en) * 2017-02-24 2018-09-07 百度(美国)有限责任公司 System and method for real-time neural text-to-speech
CN109343978A (en) * 2018-09-27 2019-02-15 郑州云海信息技术有限公司 A kind of method for interchanging data and device of deep learning Distributed Architecture
CN110147710A (en) * 2018-12-10 2019-08-20 腾讯科技(深圳)有限公司 Processing method, device and the storage medium of face characteristic
US20190324856A1 (en) * 2018-04-18 2019-10-24 EMC IP Holding Company LLC Optimization of checkpoint operations for deep learning computing

Similar Documents

Publication Publication Date Title
CN111178540A (en) Training data transmission method, device, equipment and medium
CN111008075B (en) Load balancing system, method, device, equipment and medium
CN112769897A (en) Synchronization method and device for edge calculation message, electronic equipment and storage medium
CN114138700B (en) Flow control method, device, equipment and storage medium for serial port data transmission
CN114448989B (en) Method, device, electronic equipment, storage medium and product for adjusting message distribution
CN105262680A (en) Multi-threaded NAS Gateway applied to cloud storage system
CN111159195A (en) Data storage control method and equipment in block chain system
CN109788251B (en) Video processing method, device and storage medium
CN113468021B (en) Method, device, equipment and storage medium for monitoring performance data
CN113037489B (en) Data processing method, device, equipment and storage medium
CN113961289A (en) Data processing method, device, equipment and storage medium
CN112948081A (en) Method, device and equipment for processing task in delayed mode and storage medium
CN110752972A (en) Network card state monitoring method, device, equipment and medium
CN112905119B (en) Data write-in control method, device and equipment of distributed storage system
CN111953569B (en) State information reporting method, device, equipment and medium
CN112084099B (en) Method, device, equipment and storage medium for acquiring alarm state value based on host
CN105264499B (en) Message treatment method, device and reception core in a kind of shared queue
CN113064620A (en) Method and device for processing system data
CN109547439B (en) Processing method and device for service node access network
CN111858129A (en) Erasure code reading request processing method, system, equipment and computer medium
CN115600671B (en) Data processing method, device, equipment and storage medium of deep learning framework
CN116781667A (en) Energy storage battery pack address coding method and device, electronic equipment and medium
CN114115718A (en) Distributed block storage system service quality control method, device, equipment and medium
CN113052592A (en) Mobile banking APP performance monitoring device and method
CN116527498A (en) Model transmission method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200519