CN111612158B - Model deployment method, device, equipment and storage medium - Google Patents
Model deployment method, device, equipment and storage medium
- Publication number
- CN111612158B (application CN202010443415.1A)
- Authority
- CN
- China
- Prior art keywords
- training
- model
- identification
- sample data
- directory
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The invention provides a model deployment method, device, equipment and storage medium. The method comprises the following steps: if a training request is received, performing training based on sample data carried by the training request to obtain a training model; deploying the training model to a specified directory; and pushing the identifier of the training model and the specified directory to a model server, so that the model server loads the training model according to the identifier and the specified directory. The training model is thus deployed automatically without restarting the model server, and use of the model server is not affected while the model is being deployed. The technical scheme of the invention can therefore improve model deployment efficiency.
Description
Technical Field
The present invention relates to the field of artificial intelligence technologies, and in particular, to a method, an apparatus, a device, and a storage medium for model deployment.
Background
At present, the training stage and the application stage of machine learning models such as the BERT model are separated from each other: in the training stage, the required model is trained through extensive computation over a large amount of data; in the application stage, an executable program is manually built around the trained model, and the model server must be restarted before it can load the trained model. For example, with the BERT training code, the collected training set is loaded and the model is trained once; the collated test set is then used for model verification; after verification is completed, an executable program is manually established, and only after the model server is restarted can it load the trained model, allowing models to be added, updated, and so on.
However, every time a model is added or updated, an executable program has to be established manually and the model server restarted, during which time the model server cannot be used; this reduces model deployment efficiency.
Disclosure of Invention
In view of the above, the present invention aims to provide a model deployment method, device, apparatus and storage medium, so as to solve the problem of low model deployment efficiency in the prior art.
Based on the above object, the present invention provides a model deployment method, comprising:
if a training request is received, training is carried out based on sample data carried by the training request, and a training model is obtained;
deploying the training model to a specified directory;
pushing the identification of the training model and the specified directory to a model server so that the model server loads the training model according to the identification of the training model and the specified directory.
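As a minimal sketch, the three steps above can be expressed as a single orchestration routine. The helper names and the file-based stand-in for a trained model are illustrative assumptions, not the patented implementation.

```python
import os
import uuid

def handle_training_request(sample_data, specified_dir, push):
    """Sketch of the claimed flow: train, deploy to a directory, push to the server."""
    # Step 1: "train" on the sample data carried by the request
    # (stand-in: the model is just a file summarising the data).
    model_id = uuid.uuid4().hex  # unique identifier for this training model
    model_blob = "model trained on %d samples" % len(sample_data)
    # Step 2: deploy the training model to the specified directory.
    os.makedirs(specified_dir, exist_ok=True)
    with open(os.path.join(specified_dir, model_id + ".model"), "w") as f:
        f.write(model_blob)
    # Step 3: push the model identifier and the directory to the model server,
    # which can then load the model without being restarted.
    push({"model_id": model_id, "directory": specified_dir})
    return model_id
```

Here `push` stands in for whatever transport notifies the model server; the detailed description later mentions an HTTP interface.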
Further, in the model deployment method, training is performed based on the sample data carried by the training request to obtain a training model, including:
loading a training set based on an identification of the training set in the sample data and a first storage directory;
training the training set to obtain a pre-training model;
loading a test set in the sample data based on the identification of the test set and a second storage directory;
inputting the test set into the pre-training model to obtain a test result;
and if the test result meets a preset model on-line condition, taking the pre-training model as the training model.
Further, in the model deployment method described above, the specified directory is provided in the model server.
Further, the model deployment method further includes:
determining intention information corresponding to the sample data;
and determining a training algorithm corresponding to the intention information based on the association relation between the preset intention and the training algorithm so as to train by using the training algorithm.
The invention also provides a model deployment device, which comprises:
the training module is used for training based on sample data carried by the training request if the training request is received, so as to obtain a training model;
the deployment module is used for deploying the training model to a designated directory;
and the pushing module is used for pushing the identification of the training model and the specified directory to a model server so that the model server loads the training model according to the identification of the training model and the specified directory.
Further, in the model deployment device described above, the training module is specifically configured to:
loading a training set based on an identification of the training set in the sample data and a first storage directory;
training the training set to obtain a pre-training model;
loading a test set in the sample data based on the identification of the test set and a second storage directory;
inputting the test set into the pre-training model to obtain a test result;
and if the test result meets a preset model on-line condition, taking the pre-training model as the training model.
Further, in the model deployment apparatus described above, the specified directory is provided in the model server.
Further, in the model deployment device described above, the training module is further configured to:
determining intention information corresponding to the sample data;
and determining a training algorithm corresponding to the intention information based on the association relation between the preset intention and the training algorithm so as to train by using the training algorithm.
The invention also provides a model deployment device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to any one of the preceding claims when executing the program.
The present invention also provides a storage medium storing computer instructions for causing the computer to perform the method of any one of the above.
From the above it can be seen that, if a training request is received, training is performed based on the sample data carried by the request; once a training model is obtained it is deployed to a designated directory, and the identifier of the training model and the designated directory are pushed to a model server so that the model server loads the training model according to them. The training model is thus deployed automatically without restarting the model server, and use of the model server is not affected during deployment. The technical scheme of the invention can improve model deployment efficiency.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an embodiment of a model deployment method of the present invention;
FIG. 2 is a schematic diagram of an embodiment of a model deployment apparatus of the present invention;
fig. 3 is a schematic structural view of an embodiment of the model deployment apparatus of the present invention.
Detailed Description
The present invention will be further described in detail below with reference to specific embodiments and with reference to the accompanying drawings, in order to make the objects, technical solutions and advantages of the present invention more apparent.
It should be noted that unless otherwise defined, technical or scientific terms used in the embodiments of the present invention should be given the ordinary meaning as understood by one of ordinary skill in the art to which the present disclosure pertains. The terms "first," "second," and the like, as used in this disclosure, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that elements or items preceding the word are included in the element or item listed after the word and equivalents thereof, but does not exclude other elements or items. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", etc. are used merely to indicate relative positional relationships, which may also be changed when the absolute position of the object to be described is changed.
Fig. 1 is a flowchart of an embodiment of a model deployment method of the present invention, and as shown in fig. 1, the model deployment method of the present embodiment may specifically include the following steps:
100. if a training request is received, training is carried out based on sample data carried by the training request, and a training model is obtained;
in a specific implementation, a user can import the collated corpus, the corresponding intent labels and the like into the online training platform as sample data and submit a training request; if the training request is received, training can then be performed based on the sample data carried by the request to obtain a training model.
Specifically, the sample data may be divided into a training set and a test set, where the training set is preferably 80% of the sample data and the test set the remaining 20%; the division may be performed by random sampling. In this embodiment, when the division is complete, the identifier of the training set and its first storage directory are recorded, as are the identifier of the test set and its second storage directory, and all four are sent to the online training platform. The training set can then be loaded based on its identifier and the first storage directory and trained to obtain a pre-training model. Once the pre-training model is obtained, it can be tested: the test set is loaded based on its identifier and the second storage directory and input into the pre-training model to obtain a test result. If the test result meets a preset model online condition, the pre-training model is taken as the training model; if not, training continues.
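The preferred 80/20 random division can be sketched with the standard library alone; the fixed seed is an added assumption so that the split is reproducible.

```python
import random

def split_samples(sample_data, train_ratio=0.8, seed=42):
    """Randomly divide sample data into a training set (default 80%)
    and a test set (the remaining 20%)."""
    items = list(sample_data)
    random.Random(seed).shuffle(items)  # random division, reproducible via the seed
    cut = int(len(items) * train_ratio)
    return items[:cut], items[cut:]
```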
For example, the test result may include the precision and/or the recall of the pre-training model; correspondingly, the model online condition may be that the precision is greater than a first preset threshold and/or that the recall is greater than a second preset threshold.
In this embodiment, the test result can be understood as follows. Among the samples predicted to be positive, each is either truly positive (a true positive, TP) or actually negative (a false positive, FP); the precision P = TP/(TP+FP) therefore measures how many of the predicted positives are genuinely positive. Among the samples that are actually positive, each is either predicted positive (TP) or predicted negative (a false negative, FN); the recall R = TP/(TP+FN) therefore measures how many of the actual positives are predicted correctly.
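Both formulas reduce to simple counting, and the model online condition from the previous paragraph is a pair of threshold checks. The threshold values below are illustrative assumptions.

```python
def precision_recall(y_true, y_pred, positive=1):
    """Precision P = TP/(TP+FP); recall R = TP/(TP+FN)."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if p == positive and t == positive)
    fp = sum(1 for t, p in pairs if p == positive and t != positive)
    fn = sum(1 for t, p in pairs if p != positive and t == positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def meets_online_condition(precision, recall, p_min=0.9, r_min=0.8):
    """Model online condition: precision above a first preset threshold
    and recall above a second preset threshold (thresholds illustrative)."""
    return precision > p_min and recall > r_min
```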
In this embodiment, only manually collected sample data such as an intent-classification corpus and intent labels are needed; the online training platform automatically loads the training set and test set and completes model training and evaluation, which improves training efficiency.
101. Deploying the training model to a specified directory;
after the training model is obtained, the training model may be deployed into a specified directory, which is preferably located in the model server.
102. Pushing the identifier of the training model and the specified directory to a model server, so that the model server loads the training model according to the identifier and the specified directory.
In one implementation, a unique identifier is generated each time a training model is obtained. In this embodiment, after deployment of the training model is completed, its identifier and the specified directory may be pushed automatically to the model server through an HTTP interface or the like, so that the model server can dynamically load the corresponding training model without restarting, and the newly obtained training model takes effect.
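A push over a plain HTTP interface might look like the following sketch. The `/models/load` endpoint and the JSON payload shape are assumptions; the patent only says the identifier and directory are pushed through an HTTP interface or similar.

```python
import json
import urllib.request

def push_model(model_id, directory, server_url):
    """Notify the model server over HTTP which model identifier and
    directory to load, so it can load the model without a restart."""
    payload = json.dumps({"model_id": model_id, "directory": directory}).encode()
    req = urllib.request.Request(
        server_url + "/models/load",  # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status == 200
```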
According to the model deployment method, if a training request is received, training is performed based on the sample data carried by the request; once a training model is obtained it is deployed to a designated directory, and the identifier of the training model and the designated directory are pushed to a model server so that the model server loads the training model according to them. The training model is thus deployed automatically without restarting the model server, and use of the model server is not affected during deployment. The technical scheme of the invention can improve model deployment efficiency.
In practical application, in order to support various training methods, calling interfaces for each of them can be provided, and the user can select the required training method when inputting sample data. However, some users are unfamiliar with the training methods and cannot accurately select the one required; to solve this problem, the present invention further provides the following technical solution.
Data features can be extracted from part of the sample data in order to determine the intention information corresponding to the sample data; a training algorithm corresponding to that intention information is then determined based on the preset association between intentions and training algorithms, and training is performed with the determined algorithm. In this way, a suitable training method is selected even when the user is unfamiliar with the available methods, and the situation where training cannot proceed because the user forgot to select a training algorithm is avoided.
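The preset association between intentions and training algorithms can be as simple as a lookup table with a fallback; the intents and algorithm names below are hypothetical.

```python
# Hypothetical preset association between intentions and training algorithms;
# in practice this mapping would be configured on the training platform.
INTENT_TO_ALGORITHM = {
    "faq": "bert_classifier",
    "chitchat": "text_cnn",
}

def select_algorithm(intent, default="bert_classifier"):
    """Pick the training algorithm associated with the detected intention,
    falling back to a default so training is never blocked by a missing choice."""
    return INTENT_TO_ALGORITHM.get(intent, default)
```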
It should be noted that, the method of the embodiment of the present invention may be performed by a single device, for example, a computer or a server. The method of the embodiment can also be applied to a distributed scene, and is completed by mutually matching a plurality of devices. In the case of such a distributed scenario, one of the devices may perform only one or more steps of the method of an embodiment of the present invention, and the devices interact with each other to complete the method.
Fig. 2 is a schematic structural diagram of an embodiment of the model deployment device of the present invention, and as shown in fig. 2, the model deployment device of the present embodiment includes a training module 20, a deployment module 21, and a pushing module 22.
The training module 20 is configured to, if a training request is received, perform training based on sample data carried by the training request, and obtain a training model;
specifically, loading a training set based on an identification of the training set in the sample data and a first storage directory; training the training set to obtain a pre-training model; loading the test set based on the identification of the test set in the sample data and the second storage directory; inputting the test set into a pre-training model to obtain a test result; and if the test result meets the preset model on-line condition, taking the pre-training model as a training model.
A deployment module 21 for deploying the training model to a specified directory;
wherein the specified directory is preferably provided in the model server.
And the pushing module 22 is configured to push the identifier of the training model and the specified directory to a model server, so that the model server loads the training model according to the identifier of the training model and the specified directory.
According to the model deployment device, if a training request is received, training is performed based on the sample data carried by the request; once a training model is obtained it is deployed to a designated directory, and the identifier of the training model and the designated directory are pushed to a model server so that the model server loads the training model according to them. The training model is thus deployed automatically without restarting the model server, and use of the model server is not affected during deployment. The technical scheme of the invention can improve model deployment efficiency.
Further, in the above embodiment, the training module 20 is further configured to:
determining intention information corresponding to the sample data;
based on the association relation between the preset intention and the training algorithm, determining the training algorithm corresponding to the intention information so as to train by using the training algorithm.
The device of the foregoing embodiment is configured to implement the corresponding method in the foregoing embodiment, and has the beneficial effects of the corresponding method embodiment, which is not described herein.
Fig. 3 is a schematic structural diagram of an embodiment of a model deployment device of the present invention. As shown in fig. 3, the device of this embodiment may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. The processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 establish communication connections with one another within the device via the bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits, and executes relevant programs to implement the technical solutions provided in the embodiments of the present disclosure.
The memory 1020 may be implemented in the form of ROM (Read-Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs; when the embodiments of the present specification are implemented in software or firmware, the associated program code is stored in the memory 1020 and executed by the processor 1010.
The input/output interface 1030 is used to connect with an input/output module for inputting and outputting information. The input/output module may be configured as a component in a device (not shown) or may be external to the device to provide corresponding functionality. Wherein the input devices may include a keyboard, mouse, touch screen, microphone, various types of sensors, etc., and the output devices may include a display, speaker, vibrator, indicator lights, etc.
Communication interface 1040 is used to connect communication modules (not shown) to enable communication interactions of the present device with other devices. The communication module may implement communication through a wired manner (such as USB, network cable, etc.), or may implement communication through a wireless manner (such as mobile network, WIFI, bluetooth, etc.).
Bus 1050 includes a path for transferring information between components of the device (e.g., processor 1010, memory 1020, input/output interface 1030, and communication interface 1040).
It should be noted that although the above-described device only shows processor 1010, memory 1020, input/output interface 1030, communication interface 1040, and bus 1050, in an implementation, the device may include other components necessary to achieve proper operation. Furthermore, it will be understood by those skilled in the art that the above-described apparatus may include only the components necessary to implement the embodiments of the present description, and not all the components shown in the drawings.
The invention also provides a training system which comprises the model server and the model deployment equipment of the embodiment.
The present invention also provides a storage medium storing computer instructions for causing a computer to perform the model deployment method of the above embodiments.
The computer readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may be used to implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device.
Those of ordinary skill in the art will appreciate that: the discussion of any of the embodiments above is merely exemplary and is not intended to suggest that the scope of the disclosure, including the claims, is limited to these examples; the technical features of the above embodiments or in the different embodiments may also be combined within the idea of the invention, the steps may be implemented in any order and there are many other variations of the different aspects of the invention as described above, which are not provided in detail for the sake of brevity.
Additionally, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown within the provided figures, in order to simplify the illustration and discussion, and so as not to obscure the invention. Furthermore, the devices may be shown in block diagram form in order to avoid obscuring the invention, and also in view of the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the present invention is to be implemented (i.e., such specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the invention, it should be apparent to one skilled in the art that the invention can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative in nature and not as restrictive.
While the invention has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of those embodiments will be apparent to those skilled in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the embodiments discussed.
The embodiments of the invention are intended to embrace all such alternatives, modifications and variances which fall within the broad scope of the appended claims. Therefore, any omission, modification, equivalent replacement, improvement, etc. of the present invention should be included in the scope of the present invention.
Claims (6)
1. A method of model deployment, comprising:
importing the sorted corpus and the corresponding intention labels into an online training platform as sample data, inputting a training request, and training based on the sample data carried by the training request if the training request is received, so as to obtain a training model;
deploying the training model to a specified directory;
pushing the identification of the training model and the specified directory to a model server so that the model server loads the training model according to the identification of the training model and the specified directory;
the training is performed based on the sample data carried by the training request to obtain a training model, which comprises the following steps:
respectively transmitting the identification of the training set, the first storage directory, the identification of the test set and the second storage directory to the online training platform; loading the training set based on the identification of the training set in the sample data and the first storage directory;
training the training set to obtain a pre-training model;
loading the test set based on the identification of the test set in the sample data and the second storage directory;
inputting the test set into the pre-training model to obtain a test result;
if the test result meets a preset model online condition, taking the pre-training model as the training model;
the method further comprises the steps of: determining intention information corresponding to the sample data; and determining a training algorithm corresponding to the intention information based on the association relation between the preset intention and the training algorithm so as to train by using the training algorithm.
2. The model deployment method of claim 1, wherein the specified directory is provided in the model server.
3. A model deployment apparatus, comprising:
the training module is used for importing the tidied corpus and the corresponding intention labels into the online training platform as sample data, inputting a training request, and training based on the sample data carried by the training request if the training request is received, so as to obtain a training model;
the deployment module is used for deploying the training model to a designated directory;
the pushing module is used for pushing the identification of the training model and the specified directory to a model server so that the model server loads the training model according to the identification of the training model and the specified directory;
the training module is specifically configured to:
respectively transmitting the identification of the training set, the first storage directory, the identification of the test set and the second storage directory to the online training platform; loading the training set based on the identification of the training set in the sample data and the first storage directory;
training the training set to obtain a pre-training model;
loading the test set based on the identification of the test set in the sample data and the second storage directory;
inputting the test set into the pre-training model to obtain a test result;
if the test result meets a preset model online condition, taking the pre-training model as the training model;
the training module is also configured to: determining intention information corresponding to the sample data; and determining a training algorithm corresponding to the intention information based on the association relation between the preset intention and the training algorithm so as to train by using the training algorithm.
4. The model deployment apparatus of claim 3, wherein the specified directory is provided in the model server.
5. A model deployment device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the method of claim 1 or 2.
6. A storage medium storing computer instructions for causing a computer to perform the method of claim 1 or 2.
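The deploy-and-push flow in the claims (deploy the training model to the specified directory, then push the model identifier and that directory to the model server so the server can load the model) can be sketched as below. The directory layout and the in-process `ModelServer` stand-in are illustrative assumptions; a real deployment would notify the server over RPC or HTTP rather than by a direct method call.

```python
import os
import shutil

class ModelServer:
    """Minimal in-process stand-in for the model server in the claims."""
    def __init__(self):
        self.loaded = {}

    def notify(self, model_id: str, specified_dir: str) -> None:
        # Load the model according to its identifier and the specified directory.
        model_path = os.path.join(specified_dir, model_id)
        with open(model_path, "rb") as f:
            self.loaded[model_id] = f.read()

def deploy_and_push(model_file: str, model_id: str, specified_dir: str,
                    server: ModelServer) -> None:
    """Deploy the training model to the specified directory, then push its
    identifier and the directory to the model server."""
    os.makedirs(specified_dir, exist_ok=True)
    shutil.copy(model_file, os.path.join(specified_dir, model_id))
    server.notify(model_id, specified_dir)
```

Keeping the specified directory on the model server itself (claims 2 and 4) makes the `notify` step a pure metadata push: only the identifier and path travel, not the model weights.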
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010443415.1A CN111612158B (en) | 2020-05-22 | 2020-05-22 | Model deployment method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111612158A CN111612158A (en) | 2020-09-01 |
CN111612158B true CN111612158B (en) | 2024-03-01 |
Family
ID=72203817
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010443415.1A Active CN111612158B (en) | 2020-05-22 | 2020-05-22 | Model deployment method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111612158B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112230898A (en) * | 2020-10-23 | 2021-01-15 | 贝壳技术有限公司 | Model application interaction system, method, readable storage medium and electronic device |
CN114153540A (en) * | 2021-11-30 | 2022-03-08 | 上海商汤科技开发有限公司 | Pre-training model issuing method and device, electronic equipment and storage medium |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107180035A (en) * | 2016-03-09 | 2017-09-19 | 阿里巴巴集团控股有限公司 | A kind of training pattern information output method and device |
CN107808004A (en) * | 2017-11-15 | 2018-03-16 | 北京百度网讯科技有限公司 | Model training method and system, server, storage medium |
CN108665072A (en) * | 2018-05-23 | 2018-10-16 | 中国电力科学研究院有限公司 | A kind of machine learning algorithm overall process training method and system based on cloud framework |
CN108764808A (en) * | 2018-03-29 | 2018-11-06 | 北京九章云极科技有限公司 | Data Analysis Services system and its on-time model dispositions method |
CN109685160A (en) * | 2019-01-18 | 2019-04-26 | 创新奇智(合肥)科技有限公司 | A kind of on-time model trained and dispositions method and system automatically |
CN110175677A (en) * | 2019-04-16 | 2019-08-27 | 平安普惠企业管理有限公司 | Automatic update method, device, computer equipment and storage medium |
CN110308910A (en) * | 2019-05-30 | 2019-10-08 | 苏宁金融服务(上海)有限公司 | The method, apparatus and computer equipment of algorithm model deployment and risk monitoring and control |
CN110727468A (en) * | 2018-06-28 | 2020-01-24 | 北京京东尚科信息技术有限公司 | Method and apparatus for managing algorithm models |
CN110737538A (en) * | 2019-10-29 | 2020-01-31 | 曹严清 | algorithm model calling system based on thrift |
CN110738323A (en) * | 2018-07-03 | 2020-01-31 | 百度在线网络技术(北京)有限公司 | Method and device for establishing machine learning model based on data sharing |
CN110808881A (en) * | 2019-11-05 | 2020-02-18 | 广州虎牙科技有限公司 | Model deployment method and device, target monitoring method and device, equipment and system |
CN110928528A (en) * | 2019-10-23 | 2020-03-27 | 深圳市华讯方舟太赫兹科技有限公司 | Development method of algorithm model, terminal device and computer storage medium |
CN111062520A (en) * | 2019-11-29 | 2020-04-24 | 苏州迈科网络安全技术股份有限公司 | Hostname feature prediction method based on random forest algorithm |
CN111095308A (en) * | 2017-05-14 | 2020-05-01 | 数字推理系统有限公司 | System and method for quickly building, managing and sharing machine learning models |
CN111104495A (en) * | 2019-11-19 | 2020-05-05 | 深圳追一科技有限公司 | Information interaction method, device, equipment and storage medium based on intention recognition |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11222281B2 (en) * | 2018-06-26 | 2022-01-11 | International Business Machines Corporation | Cloud sharing and selection of machine learning models for service use |
- 2020-05-22: CN CN202010443415.1A patent/CN111612158B/en, status: Active
Non-Patent Citations (2)
Title |
---|
Lu Hou et al. DynaBERT: Dynamic BERT with Adaptive Width and Depth. arXiv, 2020, 1-16. *
Luo Shijie. Engineering construction of named-entity recognition and topic extraction for text based on information extraction technology. China Master's Theses Full-text Database, 2020, 1-87. * (in Chinese)
Also Published As
Publication number | Publication date |
---|---|
CN111612158A (en) | 2020-09-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TWI719557B (en) | Model training method and device based on gradient boosting decision tree | |
CN111611797B (en) | Method, device and equipment for marking prediction data based on Albert model | |
CN109976998B (en) | Software defect prediction method and device and electronic equipment | |
CN111612158B (en) | Model deployment method, device, equipment and storage medium | |
RU2606880C2 (en) | Method, device and software for activity sensor data processing | |
CN107145446B (en) | Application program APP test method, device and medium | |
CN113836885A (en) | Text matching model training method, text matching device and electronic equipment | |
CN109190879B (en) | Method and device for training adaptation level evaluation model and evaluating adaptation level | |
CN108509348A (en) | A kind of test method and mobile terminal of system aging | |
CN113132181B (en) | Method and device for detecting network protocol support degree of IPv6 mobile application program | |
CN111325291B (en) | Entity object classification method for selectively integrating heterogeneous models and related equipment | |
CN111798263A (en) | Transaction trend prediction method and device | |
CN115796210B (en) | Sample detection method and related equipment | |
CN116560968A (en) | Simulation calculation time prediction method, system and equipment based on machine learning | |
CN116361552A (en) | Campus book retrieval method, device, equipment and readable storage medium | |
CN115168575A (en) | Subject supplement method applied to audit field and related equipment | |
CN110888036A (en) | Test item determination method and device, storage medium and electronic equipment | |
CN114741047A (en) | Volume adjusting method and volume adjusting system | |
CN111461328A (en) | Neural network training method and electronic equipment | |
CN116091263B (en) | Theoretical power calculation method of photovoltaic power station and related equipment | |
CN111967273B (en) | Dialog management system, method and rule engine device | |
CN114237754B (en) | Data loading method and device, electronic equipment and storage medium | |
CN114301638B (en) | Firewall rule reproduction method and device, storage medium and processor | |
WO2019051704A1 (en) | Method and device for identifying junk file | |
CN115314413B (en) | CAN signal testing method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||