CN110288089B - Method and apparatus for transmitting information - Google Patents


Info

Publication number
CN110288089B
Authority
CN
China
Prior art keywords
category information
target
external
internal
data
Prior art date
Legal status
Active
Application number
CN201910575820.6A
Other languages
Chinese (zh)
Other versions
CN110288089A (en)
Inventor
李旭
黄靖博
王文博
陈川石
叶芷
马彩虹
王冠皓
舒俊华
陈波
孙雯
丁扬
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910575820.6A priority Critical patent/CN110288089B/en
Publication of CN110288089A publication Critical patent/CN110288089A/en
Application granted granted Critical
Publication of CN110288089B publication Critical patent/CN110288089B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

Embodiments of the present disclosure disclose a method and apparatus for sending information. The method relates to the field of cloud computing. One implementation of the method includes: acquiring a category information set sent by a target terminal as an external category information set; in response to determining that no target external category information exists in the external category information set, determining a pre-trained model as the trained classification model, where the target external category information is external category information that does not match any internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and each training sample in the set includes internal data and the internal category information corresponding to that internal data; and sending a calling interface of the trained classification model to the target terminal. This implementation speeds up model training and helps improve the accuracy and recall of the trained model.

Description

Method and apparatus for transmitting information
Technical Field
Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to a method and an apparatus for transmitting information.
Background
Artificial Intelligence (AI) technology is characterized by broad coverage, a high technical threshold, and complex processing flows. In practice, although machine learning and deep learning techniques are now well developed, considerable preparation is still needed before they can be applied to practical problems. For example, selecting which technique to use for analyzing and processing data can itself take a long time.
In general, a technician needs to perform the following steps to use a model: model selection, data preparation, model training, model testing, model deployment, and so on.
Disclosure of Invention
The present disclosure presents methods and apparatus for transmitting information.
In a first aspect, an embodiment of the present disclosure provides a method for sending information, the method including: acquiring a category information set sent by a target terminal as an external category information set; in response to determining that no target external category information exists in the external category information set, determining a pre-trained model as the trained classification model, where the target external category information is external category information that does not match any internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and each training sample in the set includes internal data and the internal category information corresponding to that internal data; and sending a calling interface of the trained classification model to the target terminal.
In some embodiments, the method further includes: acquiring a test sample set sent by the target terminal through the calling interface, where each test sample in the test sample set includes data and category information of that data; inputting the data in the test sample set into the classification model corresponding to the calling interface to obtain the category information output by the classification model; generating, based on the category information output by the classification model and the category information in the test sample set, at least one of the following items of evaluation information for the classification model: accuracy, recall, and F1 score (F1-score); and sending the generated evaluation information to the target terminal.
In some embodiments, before sending the calling interface of the trained classification model to the target terminal, the method further includes: in response to determining that the target external category information exists in the external category information set, acquiring a target external data set, sent by the target terminal, that corresponds to the target external category information; and training an initial model with a machine learning algorithm based on the external category information set, the target external data set, and a target internal data set, and determining an initial model that satisfies a predetermined end-of-training condition as the trained classification model. Here, the target internal data set is the internal data set corresponding to the internal category information that matches external category information in the external category information set.
In some embodiments, before sending the calling interface of the trained classification model to the target terminal, the method further includes: calculating a remaining time based on the number of external data items in the target external data set and the number of internal data items in the target internal data set, where the remaining time indicates the difference between the time at which the trained classification model will be obtained and the current time; and sending the remaining time to the target terminal.
In some embodiments, the method further comprises: acquiring target data to be classified sent by a target terminal through a calling interface; inputting the target data to be classified into a classification model corresponding to the calling interface to generate class information of the target data to be classified; and transmitting the generated category information to the target terminal.
In some embodiments, the internal category information set and the internal data set corresponding to the internal category information in the internal category information set are obtained after feature engineering.
In a second aspect, an embodiment of the present disclosure provides an apparatus for transmitting information, the apparatus including: a first acquisition unit configured to acquire a category information set transmitted by a target terminal as an external category information set; a determining unit configured to determine a pre-trained model as a trained classification model in response to the target external category information not existing in the external category information set, wherein the target external category information is external category information not matching internal category information in the pre-determined internal category information set, the pre-trained model is trained based on a pre-determined training sample set, and training samples in the training sample set comprise internal data and internal category information corresponding to the internal data in the internal category information set; and the first sending unit is configured to send the trained calling interface of the classification model to the target terminal.
In some embodiments, the apparatus further comprises: the second acquisition unit is configured to acquire a test sample set sent by the target terminal through the calling interface, wherein the test samples in the test sample set comprise data and class information of the data; the input unit is configured to input the data in the test sample set to the classification model corresponding to the calling interface to obtain the class information output by the classification model; a generating unit configured to generate at least one of the following evaluation information of the classification model based on the classification information output by the classification model and the classification information in the test sample set: accuracy, recall, F1 score; a second transmitting unit configured to transmit the generated evaluation information to the target terminal.
In some embodiments, the apparatus further includes: a third obtaining unit configured to acquire, in response to the target external category information existing in the external category information set, a target external data set corresponding to the target external category information and sent by the target terminal; and a unit configured to train an initial model with a machine learning algorithm based on the external category information set, the target external data set, and a target internal data set, and to determine an initial model satisfying a predetermined end-of-training condition as the trained classification model, where the target internal data set is the internal data set corresponding to the internal category information that matches external category information in the external category information set.
In some embodiments, the apparatus further comprises: a calculating unit configured to calculate a remaining time based on the number of external data in the target external data set and the number of internal data in the target internal data set, wherein the remaining time is used for indicating a time difference between a time of training the classification model and a current time; a third transmitting unit configured to transmit the remaining time to the target terminal.
In some embodiments, the apparatus further comprises: the fourth acquisition unit is configured to acquire target data to be classified sent by the target terminal through the calling interface; the input unit is configured to input the target data to be classified to the classification model corresponding to the calling interface and generate the class information of the target data to be classified; a fourth transmitting unit configured to transmit the generated category information to the target terminal.
In some embodiments, the internal category information set and the internal data set corresponding to the internal category information in the internal category information set are obtained after feature engineering.
In a third aspect, an embodiment of the present disclosure provides a server for sending information, including: one or more processors; a storage device having one or more programs stored thereon, which when executed by the one or more processors, cause the one or more processors to implement the method of any of the embodiments of the method for transmitting information as described above.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable medium for transmitting information, on which a computer program is stored, which when executed by a processor, implements the method of any of the embodiments of the method for transmitting information as described above.
In the method and apparatus for sending information provided by the embodiments of the present disclosure, a category information set sent by a target terminal is first acquired as an external category information set. Then, in response to determining that no target external category information exists in the external category information set, a pre-trained model is determined as the trained classification model, where the target external category information is external category information that does not match any internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and each training sample includes internal data and the internal category information corresponding to that internal data. Finally, a calling interface of the trained classification model is sent to the target terminal. This speeds up model training and helps improve the accuracy and recall of the trained model.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram for one embodiment of a method for transmitting information, according to the present disclosure;
FIG. 3 is a schematic diagram of one application scenario of a method for transmitting information according to the present disclosure;
FIG. 4 is a flow diagram of yet another embodiment of a method for transmitting information according to the present disclosure;
FIGS. 5A-5F are schematic diagrams of interaction processes of a target terminal for a method for transmitting information according to the present disclosure;
FIG. 6 is a schematic block diagram illustrating one embodiment of an apparatus for transmitting information according to the present disclosure;
FIG. 7 is a schematic block diagram of a computer system suitable for use as a server for implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 of an embodiment of a method for transmitting information or an apparatus for transmitting information to which embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or transmit data (e.g. external sets of category information) or the like. The terminal devices 101, 102, 103 may have various client applications installed thereon, such as a model training application, video playing software, news information application, image processing application, web browser application, shopping application, search application, instant messaging tool, mailbox client, social platform software, and the like.
The terminal devices 101, 102, and 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices with display screens, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop portable computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above. They may be implemented as multiple pieces of software or software modules (e.g., software or software modules used to provide distributed services) or as a single piece of software or software module. This is not particularly limited herein.
The server 105 may be a server that provides various services, such as a backend server that processes data (e.g., external category information sets) transmitted by the terminal devices 101, 102, 103. The backend server may determine whether target external category information exists in the external category information set and, if it does not, determine a pre-trained model as the trained classification model and send a calling interface of the trained classification model to the terminal devices 101, 102, 103. As an example, the server 105 may be a cloud server or a physical server.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster formed by multiple servers, or may be implemented as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules (e.g., software or software modules used to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should also be noted that the method for sending information provided by the embodiments of the present disclosure may be executed by a server. Accordingly, various parts (e.g., various units, sub-units, modules, sub-modules) included in the apparatus for transmitting information may be provided in the server.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. The system architecture may include only the electronic device (e.g., server) on which the method for sending information operates, when the electronic device on which the method for sending information operates does not require data transfer with other electronic devices.
With continued reference to fig. 2, a flow 200 of one embodiment of a method for transmitting information in accordance with the present disclosure is shown. The method for transmitting information comprises the following steps:
step 201, acquiring a category information set sent by a target terminal as an external category information set.
In the present embodiment, an execution subject (e.g., a server shown in fig. 1) of the method for transmitting information may acquire a set of category information transmitted by a target terminal as an external set of category information by a wired connection manner or a wireless connection manner.
The target terminal may be a terminal communicatively connected to the execution main body. The category information in the category information set sent by the target terminal may be used to indicate the category. As an example, the category information may be used to indicate any of: human, animal, plant, containing human face, not containing human face, etc.
The category information in the category information set transmitted by the target terminal may be used to indicate a category of video, a category of text, a category of image, or a category of other data.
Step 202, in response to that no target external category information exists in the external category information set, determining a pre-trained model as a trained classification model.
In this embodiment, in the case that the target external category information does not exist in the external category information set, the executing entity may determine a model trained in advance as the trained classification model.
The target external category information is external category information which is not matched with internal category information in a predetermined internal category information set. The pre-trained model is trained based on a predetermined set of training samples. The training samples in the training sample set comprise internal data and internal class information corresponding to the internal data in the internal class information set. The classification model is used for determining external category information corresponding to the input data from the external category information set.
The internal category information in the internal category information set is used to indicate a predetermined category. As an example, the internal category information in the internal category information set is used to indicate any of the following categories: vehicles, people, plants, etc. The external category information may be category information transmitted by the target terminal. As an example, the external category information may be used to indicate any of the following categories: automobiles, landscapes, plants, etc. Here, the technician may determine in advance a matching rule to determine whether the internal category information matches the external category information. For example, the matching rule may be "if the internal category information is the same as the external category information, the internal category information matches the external category information". Thus, the execution body can determine that the internal category information indicating "plant class" matches the external category information indicating "plant class".
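The matching rule described above (external category information matches internal category information when the two are identical) can be sketched as follows. This is an illustrative sketch, not code from the patent; the function name and category labels are hypothetical:

```python
def find_target_external_info(external_set, internal_set):
    """Return the external category information with no matching internal
    category information (the 'target external category information').
    An empty result means a pre-trained model can be reused directly."""
    # Under the example rule, 'matches' simply means the labels are identical.
    return {ext for ext in external_set if ext not in internal_set}

# Hypothetical category label sets for illustration.
internal_categories = {"vehicle", "person", "plant"}
external_categories = {"car", "landscape", "plant"}

# "plant" matches an internal category; "car" and "landscape" do not,
# so the classification model would need to be retrained.
unmatched = find_target_external_info(external_categories, internal_categories)
```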
The target internal data set is: and the internal data set corresponds to the internal category information matched with the external category information in the external category information set. The categories of the target external data in the target external data set are: a category indicated by the external category information corresponding to the target external data set.
The pre-trained model can be obtained by training by adopting the following steps:
first, a set of training samples is obtained. The training samples in the training sample set comprise internal category information in an internal category information set and internal data corresponding to the internal category information.
Then, using a machine learning algorithm, the internal data included in the training samples of the training sample set are used as input data of an initial model, and the internal category information corresponding to the input internal data is used as the expected output data, so as to train the initial model; an initial model satisfying a predetermined end-of-training condition is determined as the trained model.
The initial model may include various model structures, such as AlexNet, ZFNet, OverFeat, VGG (Visual Geometry Group) Network, and so on. As an example, the initial model may be a convolutional neural network. The end-of-training conditions may include, but are not limited to, at least one of the following: the training time length exceeds the preset time length, the training times exceed the preset times, and the function value calculated based on the predetermined loss function is smaller than the preset threshold value.
It is understood that, when the initial model does not satisfy the training end condition, the model may be trained by adjusting the model parameters of the initial model so that the initial model satisfies the training end condition.
In practice, the pre-trained model may be trained using algorithms such as Batch Gradient Descent (BGD), Stochastic Gradient Descent (SGD), and Mini-Batch Gradient Descent (MBGD).
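As an illustrative sketch (not the patent's actual training code), the stochastic gradient descent variant above can be demonstrated on a toy logistic-regression "initial model", stopping when a loss threshold, one of the end-of-training conditions mentioned earlier, is satisfied. All names, data, and hyperparameters here are hypothetical:

```python
import math
import random

def predict(w, b, x):
    """Sigmoid output of a one-feature logistic-regression model."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def log_loss(w, b, samples):
    """Mean binary cross-entropy over the training sample set."""
    eps = 1e-12
    return -sum(
        y * math.log(predict(w, b, x) + eps)
        + (1 - y) * math.log(1 - predict(w, b, x) + eps)
        for x, y in samples
    ) / len(samples)

def train_sgd(samples, lr=0.5, loss_threshold=0.1, max_steps=10000):
    """SGD: one randomly chosen sample per parameter update, ending when
    the loss falls below a preset threshold (an end-of-training condition)
    or a maximum step count (another such condition) is reached."""
    w, b = 0.0, 0.0
    rng = random.Random(0)
    for _ in range(max_steps):
        x, y = rng.choice(samples)      # stochastic: a single sample per step
        p = predict(w, b, x)
        w -= lr * (p - y) * x           # gradient of the log loss w.r.t. w
        b -= lr * (p - y)               # gradient of the log loss w.r.t. b
        if log_loss(w, b, samples) < loss_threshold:
            break
    return w, b

# Toy training samples: (internal data, internal category label).
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w, b = train_sgd(data)
```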
In some optional implementation manners of this embodiment, the executing body may further determine whether the target external category information exists in the external category information set by executing any one of the following steps (i.e., step one or step two):
step one, for external category information in an external category information set, in response to the fact that internal category information which is in a pre-established association relation with the external category information does not exist in an internal category information set, determining that target external category information exists in the external category information set.
Here, the execution agent may store an internal category information set locally or in an electronic device communicatively connected to the execution agent, and store the internal category information and external category information in association with each other for each internal category information in the internal category information set. Thus, when it is determined that there is no internal category information in the internal category information set that is associated with the external category information in advance, the execution subject may determine that there is target external category information in the external category information set; in the case where each external category information has internal category information with which an association relationship is established in advance, the execution subject may determine that the target external category information does not exist in the external category information set.
And step two, for the external category information in the external category information set, in response to determining that there is no internal category information with the similarity to the external category information being greater than or equal to a preset threshold in the internal category information set, determining that there is target external category information in the external category information set.
Here, the execution subject described above may determine the similarity between the external category information and the internal category information in various ways, for example, a Deep Structure Semantic Model (DSSM), cosine similarity, and the like.
It can be understood that, in the case that it is determined that there is no internal category information in the internal category information set whose similarity with the external category information is greater than or equal to the preset threshold, the execution subject may determine that there is target external category information in the external category information set; when the similarity between each external category information and the internal category information is smaller than the preset threshold, the execution subject may determine that the target external category information does not exist in the external category information set.
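A minimal sketch of the similarity-based check in step two, assuming category labels have already been embedded as vectors (e.g., by a deep structured semantic model); the embeddings and threshold below are purely illustrative:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def has_match(external_vec, internal_vecs, threshold=0.9):
    """True if some internal category embedding is similar enough to the
    external one; if False, the external category information is 'target
    external category information' and retraining is needed."""
    return any(cosine_similarity(external_vec, v) >= threshold
               for v in internal_vecs)

# Hypothetical embeddings for two internal category labels.
internal_embeddings = [[1.0, 0.0], [0.0, 1.0]]
matched = has_match([0.95, 0.05], internal_embeddings)
```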
And step 203, sending the calling interface of the trained classification model to the target terminal.
In this embodiment, after obtaining the classification model, the execution body may deploy the classification model and send a calling interface (Application Programming Interface, API) of the trained classification model to the target terminal. The calling interface may include the address and port number at which the classification model is deployed.
Optionally, the execution subject may deploy the classification model to a public cloud, a private cloud, or an embedded device.
It is understood that after the target terminal receives the call interface, the user can use the trained classification model through the call interface.
In practice, deploying the classification model may comprise the steps of: and packaging the classification model, uploading the packaged classification model to a storage (such as a BOS), and deploying a cluster operating system (such as a matrix).
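The deployment and calling interface described above can be illustrated with a minimal, hypothetical sketch: a stub classification service is exposed at an address and port, and the target terminal invokes it over HTTP. The endpoint path, JSON payload shape, and keyword "model" are assumptions for illustration only; a real deployment would serve the trained classification model behind the interface:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class ClassifyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        data = json.loads(body)["data"]
        # Stub model: a real deployment would run the trained classifier here.
        category = "plant" if "tree" in data else "other"
        reply = json.dumps({"category": category}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):   # keep the demo quiet
        pass

# Port 0 asks the OS for a free port; the chosen address and port together
# form the calling interface handed to the target terminal.
server = ThreadingHTTPServer(("127.0.0.1", 0), ClassifyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def call_interface(address, port, data):
    """Send data through the calling interface and return the category
    information produced by the deployed model."""
    req = urllib.request.Request(
        f"http://{address}:{port}/classify",
        data=json.dumps({"data": data}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["category"]

result = call_interface("127.0.0.1", server.server_port, "a tall tree")
server.shutdown()
```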
In some optional implementation manners of this embodiment, the executing main body may further perform the following steps:
step one, a test sample set sent by a target terminal through a calling interface is obtained. The test samples in the test sample set comprise data and class information of the data. As an example, the data included in the test sample may be a video, and the category information included in the test sample may be "person class", and it is understood that the category information "person class" included in the test sample may be used to indicate that the category of the video included in the test sample is "person class".
And step two, inputting the data in the test sample set to a classification model corresponding to the calling interface to obtain the class information output by the classification model.
It can be understood that, here, the class information obtained in the second step is output data obtained by calculating the data in the test sample set by the classification model.
Step three, based on the category information output by the classification model and the category information in the test sample set, generating at least one of the following items of evaluation information for the classification model: accuracy, recall, F1 score.
And step four, sending the generated evaluation information to the target terminal.
It can be understood that, in this alternative implementation, the generated evaluation information may be sent to the target terminal, so that a user of the target terminal can evaluate the obtained classification model through the evaluation information, determine whether the classification model meets actual requirements, and then decide whether to continue training the classification model, retrain it, or start using it. This makes it possible to train classification models tailored to different user requirements, enriches the ways in which models can be trained, and helps improve at least one of the accuracy, recall, and F1 score of the model while reducing model training time.
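The evaluation step can be sketched as follows; the label values are hypothetical, and the formulas are the standard definitions of accuracy, precision, recall, and F1 score computed against a single positive class:

```python
def evaluate(predicted, expected, positive):
    """Accuracy, precision, recall, and F1 score for one positive class,
    comparing model outputs against test-sample category information."""
    pairs = list(zip(predicted, expected))
    tp = sum(1 for p, e in pairs if p == positive and e == positive)
    fp = sum(1 for p, e in pairs if p == positive and e != positive)
    fn = sum(1 for p, e in pairs if p != positive and e == positive)
    accuracy = sum(1 for p, e in pairs if p == e) / len(pairs)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Hypothetical test-sample labels versus the model's outputs.
metrics = evaluate(
    predicted=["person", "person", "animal", "person"],
    expected=["person", "animal", "animal", "person"],
    positive="person",
)
```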
In some optional implementations of this embodiment, the execution body may further perform the following steps before performing step 203:
In the case where target external category information exists in the external category information set, acquire a target external data set, sent by the target terminal, that corresponds to the target external category information. Then, train the initial model with a machine learning algorithm based on the external category information set, the target external data set, and the target internal data set, and determine the initial model that satisfies a predetermined training end condition as the trained classification model. Here, the target internal data set is the set of internal data corresponding to the internal category information that matches external category information in the external category information set.
It is to be understood that each target external data set may correspond to one piece of external category information, and different target external data sets may correspond to different external category information. In practice, a plurality of internal data sets and an internal category information set may be stored in advance on the execution body or on an electronic device communicatively connected to it. Each internal data set may correspond to one piece of internal category information in the internal category information set, which indicates the category of each internal data item in that set; in other words, the category of the internal data in an internal data set is the category indicated by its corresponding internal category information.
The category of the target external data in the target external data set (sent by the target terminal and corresponding to the target external category information) may be the category indicated by the target external category information.
In some optional implementations of this embodiment, the executing entity may train to obtain the classification model by using the following steps:
Step one: acquire a training sample set. Each training sample in the training sample set includes target data and the category information of the target data. The target data is either target external data in the target external data set or target internal data in the target internal data set. The category information of the target data is external category information in the external category information set.
Step two: using a machine learning algorithm, train the initial model with the target data included in the training samples as input data and the category information corresponding to the input data as expected output data, and determine the initial model that satisfies a predetermined training end condition as the trained classification model.
The initial model may have any of various model structures, such as AlexNet, ZFNet, OverFeat, VGG (Visual Geometry Group) network, and so on. As an example, the initial model may be a convolutional neural network. The training end condition may include, but is not limited to, at least one of the following: the training duration exceeds a preset duration; the number of training iterations exceeds a preset number; the function value computed from a predetermined loss function is smaller than a preset threshold.
It is understood that when the initial model does not satisfy the training end condition, algorithms such as gradient descent and back propagation may be used to adjust the model parameters of the initial model.
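The training end conditions and parameter adjustment described above can be sketched as a single loop that stops once any predetermined condition holds. This is a toy illustration under stated assumptions: the quadratic loss, learning rate, and threshold values are all stand-ins, not values from the patent:

```python
import time

def train(param, loss_fn, grad_fn, lr=0.1,
          max_seconds=10.0, max_steps=1000, loss_threshold=1e-4):
    """Gradient-descent loop that ends when training time, step count,
    or loss reaches its predetermined bound."""
    start = time.monotonic()
    steps = 0
    while True:
        loss = loss_fn(param)
        if (loss < loss_threshold or steps >= max_steps
                or time.monotonic() - start > max_seconds):
            return param, loss, steps
        param -= lr * grad_fn(param)  # adjust the model parameter
        steps += 1
```

For example, minimizing the loss (p − 3)² from a starting parameter of 0 converges near 3 well before the step and time limits are reached, so the loss-threshold condition triggers first.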
In some optional implementations of this embodiment, the execution body may further train a classification model by using the following steps:
First, a pre-trained model is acquired. The hyper-parameters of the pre-trained model are the optimal hyper-parameters among the target models in the target model set, for example, the hyper-parameters of the target model with the highest learning speed. Each target model in the target model set is obtained by adjusting the hyper-parameters of the initial model using one of a plurality of predetermined hyper-parameter adjustment methods. The hyper-parameters may include, but are not limited to, at least one of: the learning rate, regularization parameters, the number of layers of the neural network, the number of neurons in each hidden layer, the number of training epochs, the mini-batch size, the encoding of the output neurons, the choice of cost function, the weight initialization method, the type of neuron activation function, the scale of the training data, and so on. Different hyper-parameters may be set for different initial models. The hyper-parameter adjustment methods may include at least one of: grid search, Bayesian optimization, random search, gradient-based optimization, and so on.
Second, train the pre-trained model with a machine learning algorithm based on the external category information set, the target external data set, and the target internal data set to obtain the classification model.
Specifically, the execution body may train the pre-trained model by using the target data as input data of the pre-trained model and the external category information corresponding to the input data as the expected output data of the pre-trained model, and determine the pre-trained model satisfying a predetermined training end condition as the trained classification model.
The initial model may have any of various model structures, such as AlexNet, ZFNet, OverFeat, VGG (Visual Geometry Group) network, and so on. As an example, the initial model may be a convolutional neural network. The training end condition may include, but is not limited to, at least one of the following: the training duration exceeds a preset duration; the number of training iterations exceeds a preset number; the function value computed from a predetermined loss function is smaller than a preset threshold.
It is understood that when the pre-trained model does not satisfy the training end condition, algorithms such as gradient descent and back propagation may be used to adjust its model parameters.
It should be understood that, in this alternative implementation, a plurality of hyper-parameter adjustment methods may be used to train the initial model so as to obtain a plurality of target models; an optimal target model (for example, the one with the highest accuracy or recall, or the fastest learning speed) is then selected from the obtained target models as the pre-trained model. This increases the speed of the subsequent training that produces the classification model, or improves at least one of the accuracy, recall, and F1 score of the resulting classification model.
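Of the hyper-parameter adjustment methods listed above, grid search is the simplest to sketch: each hyper-parameter combination yields one target model, and the best-scoring one supplies the hyper-parameters of the pre-trained model. The scoring function here is a stand-in assumption for actually training a target model and measuring its accuracy, recall, or learning speed:

```python
import itertools

def grid_search(param_grid, score_fn):
    """Evaluate every hyper-parameter combination in the grid and
    return the best-scoring one (higher score is better)."""
    best_params, best_score = None, float("-inf")
    for values in itertools.product(*param_grid.values()):
        params = dict(zip(param_grid.keys(), values))
        score = score_fn(params)  # stands in for train-and-evaluate
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Random search and Bayesian optimization differ only in how candidate combinations are proposed; the select-the-best step is the same.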
In some optional implementations of this embodiment, the execution body may further perform the following steps before performing step 203:
Step one: compute the remaining time for generating the classification model based on the number of external data items in the target external data set and the number of internal data items in the target internal data set. The remaining time indicates the difference between the time at which the trained classification model will be obtained and the current time.
As an example, the remaining time may be calculated as follows:
First, through extensive statistics, a technician may determine the correspondence between the number of training samples and the duration needed to train the classification model. The correspondence may be represented as a two-dimensional table, a graph, a line chart, or the like. The number of training samples is the sum of the number of external data items in the target external data set and the number of internal data items in the target internal data set.
Then, the execution body may determine, according to the correspondence, the duration corresponding to the current number of training samples, i.e., the sum of the number of external data items in the target external data set and the number of internal data items in the target internal data set in step one.
Finally, the execution body may determine the difference between the determined duration and the elapsed duration as the remaining time. The elapsed duration is the time elapsed from when the training sample set was acquired to the current time, where the current time may be the time at which step one is performed.
Optionally, the execution body may instead calculate the remaining time as follows:
first, the sum of the number of external data in the target external data set and the number of internal data in the target internal data set is calculated as the training sample number.
Then, the dimensions of the external data in the target external data set are calculated, and the dimensions of the internal data in the target internal data set are calculated.
Then, determine the product of the sum of the two dimensions, the number of training samples, and a preset duration as the total training duration. The preset duration represents the time needed to train the classification model with a single-dimensional training sample.
Finally, the execution body may determine the difference between the total training duration and the elapsed duration as the remaining time. The elapsed duration is the time elapsed from when the training sample set was acquired to the current time, where the current time may be the time at which step one is performed.
Step two: send the remaining time calculated in step one to the target terminal.
It can be understood that this alternative implementation may send the remaining time to the user terminal so that the user knows when the classification model will be obtained.
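The second (dimension-based) calculation above amounts to one product and one subtraction. A sketch, where every argument name is illustrative and `seconds_per_unit` plays the role of the preset per-dimension, per-sample duration:

```python
def remaining_time(n_external, n_internal, dim_external, dim_internal,
                   seconds_per_unit, elapsed_seconds):
    """Total training duration = (sum of the two dimensions) x number of
    training samples x preset per-unit duration; the remaining time is
    what is left after subtracting the elapsed duration (floored at 0)."""
    n_samples = n_external + n_internal  # training sample count
    total = (dim_external + dim_internal) * n_samples * seconds_per_unit
    return max(total - elapsed_seconds, 0.0)
```

For 100 external and 100 internal items of dimension 3 each, with 0.01 s per unit and 2 s elapsed, the estimate is (3+3) × 200 × 0.01 − 2 = 10 seconds remaining.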
In some optional implementations of this embodiment, the execution body may further perform the following steps:
Step one: acquire, through the calling interface, target data to be classified sent by the target terminal. In practice, the target data to be classified may be of the same type as the internal data and the external data. For example, if the internal data and the external data are text, the target data to be classified may also be text; if the internal data and the external data are images, the target data to be classified may also be images.
Step two: input the target data to be classified into the classification model corresponding to the calling interface to generate the category information of the target data to be classified.
Step three: send the generated category information to the target terminal.
It can be understood that, in this optional implementation, the trained classification model classifies the target data to be classified that the target terminal sends through the calling interface, generates its category information, and sends the generated category information to the target terminal. The classification model can thus be called by the user without being stored on the target terminal, which reduces the resource occupation of the target terminal, saves its computing resources, and reduces its hardware wear.
Optionally, before generating the category information of the target data to be classified, the execution body may further perform verification to improve the accuracy of the final output category information.
In some optional implementations of this embodiment, the internal category information set and the internal data sets corresponding to the internal category information in it are obtained after feature engineering processing.
It is to be appreciated that feature engineering processing may include, but is not limited to, at least one of: feature construction, feature extraction, feature selection, and so on. In general, before training a model, a model trainer needs to spend a great deal of time and effort on feature engineering for the training samples so that the trained model performs relatively well in terms of accuracy, recall, and the like. In this optional implementation, the internal category information set and internal data sets that have already undergone feature engineering may be stored in advance on the execution body or on an electronic device communicatively connected to it, so that a user of the target terminal can obtain a classification model trained on feature-engineered data without performing feature engineering. This removes the data-acquisition obstacles a user faces when training a model, simplifies the steps of training and calling the model, and guarantees the accuracy of the model while improving the efficiency of model generation.
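As one concrete instance of the feature-selection step mentioned above, a low-variance filter drops columns that carry no information. This is only a sketch of one possible technique; the threshold value is an illustrative assumption:

```python
import statistics

def drop_low_variance(rows, min_variance=1e-3):
    """Feature selection sketch: keep only the columns whose population
    variance across the data set meets the threshold."""
    keep = [j for j in range(len(rows[0]))
            if statistics.pvariance([row[j] for row in rows]) >= min_variance]
    return [[row[j] for j in keep] for row in rows]
```

A column that is constant across all samples (variance 0) is removed, while varying columns are kept.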
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for sending information according to the present embodiment. In the application scenario of fig. 3, the server 301 first acquires the category information set 303 sent by the target terminal 302 as the external category information set 303. Then, when no target external category information exists in the external category information set 303, the server 301 determines the pre-trained model 305 as the trained classification model 305. The target external category information is external category information that does not match internal category information in the predetermined internal category information set 304. The pre-trained model 305 is trained on a predetermined training sample set whose training samples comprise internal category information in the internal category information set 304 and the internal data corresponding to it. Finally, the server 301 sends the calling interface 306 of the trained classification model to the target terminal 302.
In the prior art, before using a model, a user is generally required to first perform the following steps to implement the use of the model: model selection, data preparation, model training, model testing, model deployment, and the like.
The method provided by the above embodiment of the present disclosure acquires a category information set sent by a target terminal as an external category information set; then, in the case that no target external category information exists in the external category information set, determines a pre-trained model as the trained classification model, where the target external category information is external category information that does not match internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and each training sample includes internal category information in the internal category information set and the internal data corresponding to it; finally, the method sends the calling interface of the trained classification model to the target terminal. As a result, a user only needs to upload the data (e.g., the category information set) for training the model in order to obtain a classification model that classifies data according to the categories indicated by the uploaded category information. This improves the model training speed as well as the accuracy and recall of the trained model. Moreover, the classification model can be called by the user without being stored on the target terminal, which reduces the resource occupation of the target terminal, saves its computing resources, and reduces its hardware wear.
With further reference to fig. 4, a flow 400 of yet another embodiment of a method for transmitting information is shown. The process 400 of the method for transmitting information includes the steps of:
step 401, acquiring a category information set sent by a target terminal as an external category information set. Thereafter, step 402 is performed.
In this embodiment, step 401 is substantially the same as step 201 in the corresponding embodiment of fig. 2, and is not described here again.
Step 402, determining whether target external category information exists in the external category information set. Then, if yes, go to step 404; if not, go to step 403.
In this embodiment, the execution body may determine whether target external category information exists in the external category information set by using the method described in the optional implementation of fig. 2, which is not described again here.
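Under an exact-string-match assumption (the patent leaves the matching rule open), determining whether target external category information exists reduces to a set difference between the external and internal category information sets:

```python
def target_external_categories(external_categories, internal_categories):
    """External category information with no matching internal category
    information; a non-empty result routes the flow to step 404,
    an empty one to step 403."""
    internal = set(internal_categories)
    return [c for c in external_categories if c not in internal]
```

For example, with external categories ["life", "technology", "entertainment"] and stored internal categories ["technology", "news"], the target external categories are "life" and "entertainment".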
In step 403, the pre-trained model is determined as the trained classification model. Thereafter, step 406 is performed.
In this embodiment, the executing entity may determine a pre-trained model as the trained classification model. The execution subject may use the method described in fig. 2 to train a pre-trained model, which is not described herein again.
It is to be understood that, in the case that no target external category information exists in the external category information set, the execution body may directly determine the pre-trained model as the trained classification model. Since the pre-trained model has already been trained on the internal category information set and internal data set, the execution body can obtain a classification model meeting the user's requirements without performing model training in this application scenario, which improves the efficiency of model generation.
And step 404, acquiring a target external data set which is sent by the target terminal and corresponds to the target external category information. Thereafter, step 405 is performed.
In this embodiment, the execution body may acquire the target external data set, sent by the target terminal, that corresponds to the target external category information. It is to be understood that each target external data set may correspond to one piece of external category information, and different target external data sets may correspond to different external category information. In practice, a plurality of internal data sets and an internal category information set may be stored in advance on the execution body or on an electronic device communicatively connected to it. Each internal data set may correspond to one piece of internal category information in the internal category information set, which indicates the category of each internal data item in that set; in other words, the category of the internal data in an internal data set is the category indicated by its corresponding internal category information.
It is understood that the internal category information set and the internal data sets corresponding to the internal category information may be obtained after feature engineering processing. For example, they may be structured data (e.g., data laid out in a two-dimensional table structure). This avoids the data-acquisition obstacles a user faces when training a model, spares the user high-threshold work such as algorithm investigation, scheme selection, and parameter tuning from scratch, and improves the efficiency of model generation.
And 405, training an initial model by adopting a machine learning algorithm based on the external category information set, the target external data set and the target internal data set, and determining the initial model meeting the predetermined training end condition as a classification model obtained by training. Thereafter, step 406 is performed.
In this embodiment, the execution body may use a machine learning algorithm to train the initial model based on the external category information set, the target external data set, and the target internal data set, and determine the initial model satisfying a predetermined training end condition as the trained classification model. The category of the target external data in the target external data set (sent by the terminal and corresponding to the target external category information) may be the category indicated by the target external category information.
It can be understood that, when every piece of external category information in the external category information set is target external category information, the target internal data set is empty, and the execution body can train the classification model directly on the external category information set and the external data set. When some (but not all) of the external category information is target external category information, the execution body may train the classification model based on the external category information set, the external data set, and the target internal data set. In this way, for external category information that matches internal category information in the predetermined internal category information set, the execution body uses the internal data set corresponding to the matched internal category information instead of the external data set corresponding to that external category information. Consequently, when the target internal data set consists of data that has undergone feature engineering processing, it usually takes only 10 hours to generate the classification model after the execution body acquires the external data set and the external category information, which increases the speed of model generation while ensuring the accuracy of the trained classification model.
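The substitution described above, preferring a stored, feature-engineered internal data set whenever an external category matches an internal one and falling back to the user-uploaded target external data set otherwise, can be sketched as follows (the dict-of-lists layout and exact-match rule are illustrative assumptions):

```python
def assemble_training_samples(external_categories, external_data, internal_data):
    """Build (data, category) training pairs: matched categories draw
    from the pre-stored internal data sets, unmatched (target external)
    categories draw from the user-uploaded target external data sets."""
    samples = []
    for category in external_categories:
        source = internal_data.get(category) or external_data[category]
        samples.extend((item, category) for item in source)
    return samples
```

Here "life" is a target external category (user-uploaded videos), while "technology" matches a stored internal data set, so its internal data is used instead.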
In some optional implementations of this embodiment, the execution body may further perform the following steps:
In the case of receiving target modification information sent by the target terminal, retrain to obtain the classification model using the target external data set, the modified external category information set, and the internal data sets corresponding to the internal category information that matches the modified external category information. The target modification information is used to modify the external category information corresponding to external data or to internal data.
Here, the execution body may retrain with the target data as input data of the initial model and the modified external category information corresponding to the target data as the expected output data of the initial model to obtain the classification model. The target data is target external data in the target external data set, or internal data in an internal data set.
It can be understood that, in this alternative implementation, the retraining procedure may be substantially the same as the training procedure described above, and details are not repeated here.
It should be understood that, in this alternative implementation, after the user modifies, through the target terminal, the external category information corresponding to external data or internal data, a classification model meeting the user's new requirements can be obtained by retraining. This enriches the model training modes and allows classification models meeting the requirements of different users to be trained. The execution body may then send the calling interface of the retrained classification model to the target user, so that the user can use the retrained classification model.
And step 406, sending the calling interface of the trained classification model to the target terminal.
In this embodiment, after obtaining the classification model, the execution body may deploy it and send the calling interface of the trained classification model to the target terminal. The calling interface may include the address and port number at which the classification model is deployed.
It is understood that after the target terminal receives the calling interface, the user can use the trained classification model through it. If no target external category information exists in the external category information set, the calling interface sent in step 406 is the calling interface of the classification model of step 403, i.e., of the pre-trained model; if target external category information exists in the external category information set, the calling interface sent in step 406 is that of the classification model trained in step 405.
In practice, deploying the classification model may comprise the steps of: and packaging the classification model, uploading the packaged classification model to a storage (such as a BOS), and deploying a cluster operating system (such as a matrix).
It can be understood that the classification model trained in this alternative implementation may be used in classification, detection, recognition, and other scenarios.
In some optional implementations of this embodiment, the execution body may further perform the following steps:
Step one: generate identification information of the trained classification model. The identification information may be used to identify the classification model; for example, it may be a timestamp of when the classification model was generated, or a version number of the classification model.
Step two: store the pieces of storage data in association with each other, where the storage data comprises at least two of the following: the identification information of the classification model, the trained classification model, the data set used to train the classification model, and the external category information set. The data in the data set used to train the classification model is internal data or external data.
Step three: based on storage data received from the target terminal, send the found storage data stored in association with the received storage data to the target terminal.
It can be understood that, since at least two of these items (i.e., pieces of storage data) are stored in association with each other, this optional implementation can look up, from a piece of storage data received from the target terminal, the other storage data stored in association with it. For example, when the execution body stores the identification information, the trained classification model, its training data set, and the external category information set in association with each other, it can find, from the identification information received from the target terminal, the classification model, training data set, and external category information set stored with that identification information. A user can thereby review the iteration history of the classification model or the training samples used during training.
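The associated storage and lookup in steps two and three can be sketched with records whose fields are stored together, so that any one received piece of storage data retrieves the others. The field names and the timestamp-style identifier are illustrative assumptions (the patent also allows a version number):

```python
import time

def make_record(model, dataset, external_categories):
    """One unit of associated storage data; the identifier sketched
    here is a generation timestamp."""
    return {"id": str(time.time_ns()), "model": model,
            "dataset": dataset, "categories": external_categories}

def lookup(records, **received):
    """Return every record whose fields match all received storage data,
    i.e., the storage data stored in association with what was received."""
    return [r for r in records if all(r.get(k) == v for k, v in received.items())]
```

For instance, a query by model name returns the category set stored with it, and a query by category set returns the associated model.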
It should be noted that, besides the above-mentioned contents, the embodiment of the present application may further include the same or similar features and effects as the embodiment corresponding to fig. 2, and details are not repeated herein.
Referring now to fig. 5A-5F, fig. 5A-5F illustrate an interaction process diagram of a target terminal for a method for transmitting information according to the present disclosure.
As shown in fig. 5A, in the process of creating a model, first, a user inputs a model name "short video classification" and a model classification "scene classification" to a target terminal. It is understood that in this application scenario, the user's requirement is to perform scene classification for short video.
Thereafter, referring to fig. 5B, the user uploads a category information set ("life, science and technology, entertainment" in fig. 5B) to the target terminal as the external category information set. If no target external category information exists in the external category information set, the execution body determines a pre-trained model as the trained classification model. The target external category information is external category information that does not match internal category information in a predetermined internal category information set. The pre-trained model is trained on a predetermined training sample set whose training samples comprise internal category information in the internal category information set and the internal data corresponding to it. If target external category information exists in the external category information set, the execution body acquires the target external data set, sent by the target terminal, that corresponds to the target external category information; it then trains the initial model with a machine learning algorithm based on the external category information set, the target external data set, and the target internal data set, and determines the initial model satisfying a predetermined training end condition as the trained classification model. As shown in fig. 5B, when target external category information exists in the external category information set, the target terminal sends the external data set "video 1, video 2, video 3, video 4, video 5, video 6, video 7 … …" to the execution body.
As shown in fig. 5C, for each external data in the external data set, the user selects external category information of the external data from the external category information set. Thus, the target terminal obtains a target external data set (for example, video 1, video 3 … …) corresponding to the target external category information (the target external category information is "life" in fig. 5B).
Thereafter, the execution body starts to train the model; as shown in fig. 5D, the current remaining time is "10 minutes", meaning that the execution body will finish training the classification model in 10 minutes.
As shown in fig. 5E, after the classification model is trained, the target terminal presents basic information, a test report, a per-label (i.e., per-category) accuracy analysis, and other information.
Finally, as shown in fig. 5F, the target terminal receives, from the execution subject, the calling interface "http:// xxx.xx.com/xx: 8088" of the trained classification model.
Returning to fig. 4, it can be seen that, compared with the embodiment corresponding to fig. 2, the flow 400 of the method for sending information in the present embodiment highlights the step of training the classification model when target external category information exists in the external category information set. In the scheme described in this embodiment, when target external category information exists in the category information set sent by the target terminal, the classification model can be trained based on the external category information set, the target external data set and the target internal data set, which enriches the ways in which the model can be trained. Moreover, when the target internal data set is a set of data obtained after feature engineering, this optional implementation can improve both the accuracy and the training speed of the resulting classification model.
With further reference to fig. 6, as an implementation of the method shown in the above figures, the present disclosure provides an embodiment of an apparatus for transmitting information. This apparatus embodiment corresponds to the method embodiment shown in fig. 2; in addition to the features described below, it may include the same or corresponding features as that method embodiment and produce the same or corresponding effects. The apparatus can be applied to various electronic devices.
As shown in fig. 6, the apparatus 600 for transmitting information of the present embodiment includes: a first acquiring unit 601, a determining unit 602, and a first sending unit 603. The first acquiring unit 601 is configured to acquire a category information set sent by a target terminal as an external category information set. The determining unit 602 is configured to determine, in response to no target external category information existing in the external category information set, a pre-trained model as the trained classification model, where the target external category information is external category information that does not match internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and training samples in the training sample set comprise internal category information in the internal category information set and internal data corresponding to the internal category information. The first sending unit 603 is configured to send the calling interface of the trained classification model to the target terminal.
In this embodiment, the first acquiring unit 601 of the apparatus 600 for transmitting information may acquire the category information set transmitted by the target terminal, as the external category information set, via a wired or wireless connection.
The target terminal may be a terminal communicatively connected to the apparatus 600. The category information in the category information set sent by the target terminal is used to indicate a category. As an example, the category information may be used to indicate any of the following: humans, animals, plants, etc. It may indicate a category of video, text, images, or other data.
In this embodiment, when no target external category information exists in the external category information set, the determining unit 602 may determine a pre-trained model as the trained classification model. The target external category information is external category information that does not match internal category information in a predetermined internal category information set; the pre-trained model is trained on a predetermined training sample set; and training samples in the training sample set comprise internal category information in the internal category information set and internal data corresponding to the internal category information.
The classification model is used to determine, from the external category information set, the external category information corresponding to input data.
The internal category information in the internal category information set is used to indicate a predetermined category. As an example, the internal category information in the internal category information set is used to indicate any of the following categories: vehicles, people, plants, etc. The external category information may be category information transmitted by the target terminal. As an example, the external category information may be used to indicate any of the following categories: automobiles, landscapes, plants, etc.
In this embodiment, the first sending unit 603 may send, to the target terminal, the calling interface of the classification model determined by the determining unit 602. The calling interface may include the address and port number at which the classification model is deployed.
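From the target terminal's side, a calling interface consisting of an address and a port number might be invoked roughly as follows. The endpoint path and JSON schema are illustrative assumptions; the patent specifies only that the interface comprises an address and a port number.

```python
import json
import urllib.request


def build_classify_request(interface_url, payload):
    """Build a POST request to the deployed classification model's calling
    interface. The URL, endpoint, and {"data": ...} schema are assumptions,
    not the patent's actual interface."""
    body = json.dumps({"data": payload}).encode("utf-8")
    return urllib.request.Request(
        interface_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

The terminal would then send this request with `urllib.request.urlopen` and read the returned category information from the response body.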
In some optional implementations of this embodiment, the apparatus 600 further includes: a second obtaining unit (not shown in the figure), configured to obtain a test sample set sent by the target terminal via the calling interface, where the test samples in the test sample set comprise data and category information of the data; an input unit (not shown in the figure), configured to input the data in the test sample set to the classification model corresponding to the calling interface and obtain the category information output by the classification model; a generating unit (not shown in the figure), configured to generate, based on the category information output by the classification model and the category information in the test sample set, at least one of the following items of evaluation information of the classification model: accuracy, recall, F1 score; and a second transmitting unit (not shown in the figure), configured to transmit the generated evaluation information to the target terminal.
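The three evaluation items named above have standard definitions, which the generating unit could compute as sketched below. Treating one category as the positive class is a simplifying assumption; the patent does not state whether the metrics are per-category or averaged.

```python
def evaluate(predicted, actual, positive):
    """Accuracy, recall, and F1 score of predicted vs. actual category
    information, with `positive` as the category of interest (a binary
    simplification, assumed for illustration)."""
    assert len(predicted) == len(actual) and actual
    pairs = list(zip(predicted, actual))
    accuracy = sum(p == a for p, a in pairs) / len(pairs)
    # True positives, false positives, false negatives for the chosen category.
    tp = sum(p == positive and a == positive for p, a in pairs)
    fp = sum(p == positive and a != positive for p, a in pairs)
    fn = sum(p != positive and a == positive for p, a in pairs)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, recall, f1
```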
In some optional implementations of this embodiment, the apparatus 600 further includes a third acquiring unit (not shown in the figure), configured to: acquire, in response to target external category information existing in the external category information set, a target external data set corresponding to the target external category information sent by the target terminal; train an initial model with a machine learning algorithm based on the external category information set, the target external data set and the target internal data set; and determine the initial model satisfying a predetermined training end condition as the trained classification model. The target internal data set is the internal data set corresponding to the internal category information matched with the external category information in the external category information set.
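"Train until a predetermined training end condition is met" reduces to a loop of the following shape. The step and condition callables stand in for the unspecified machine learning algorithm; nothing here is claimed as the patent's actual training procedure.

```python
def train_until_done(initial_model, training_step, end_condition,
                     max_steps=10_000):
    """Iterate a training step on the initial model until the predetermined
    end condition holds (or a safety cap is reached)."""
    model = initial_model
    for _ in range(max_steps):
        if end_condition(model):
            break
        model = training_step(model)
    return model
```

The end condition might be a loss threshold, an iteration budget, or a validation-accuracy target; the patent leaves it predetermined but unspecified.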
In some optional implementations of this embodiment, the apparatus 600 further includes: a calculating unit (not shown in the figure), configured to calculate a remaining time based on the quantity of external data in the target external data set and the quantity of internal data in the target internal data set, where the remaining time indicates the time difference between the time at which the classification model is trained and the current time; and a third transmitting unit (not shown in the figure), configured to transmit the remaining time to the target terminal.
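One way the calculating unit could derive a remaining time from the two data quantities is a linear per-sample cost model, as in the hedged sketch below. The cost model and the rate constant are illustrative assumptions only (e.g., 5,000 samples at the assumed rate happen to give the "10 minutes" of fig. 5D).

```python
import math


def remaining_minutes(num_external, num_internal, processed=0,
                      minutes_per_sample=0.002):
    """Estimate training minutes left from the external and internal data
    quantities. The linear cost model and the per-sample rate are
    assumptions, not values from the patent."""
    total = num_external + num_internal
    left = max(total - processed, 0)
    return math.ceil(left * minutes_per_sample)
```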
In some optional implementations of this embodiment, the apparatus 600 further includes: the fourth acquiring unit (not shown in the figure) is configured to acquire the target data to be classified sent by the target terminal via the calling interface. The input unit (not shown in the figure) is configured to input the target data to be classified to the classification model corresponding to the calling interface, and generate the class information of the target data to be classified. The fourth transmitting unit (not shown in the figure) is configured to transmit the generated category information to the target terminal.
In some optional implementation manners of this embodiment, the internal category information set and the internal data set corresponding to the internal category information in the internal category information set are obtained after feature engineering processing.
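The patent does not specify which feature-engineering operations produce the internal sets. As one hedged illustration, a cleaning pass over text-like internal data might look like this; normalization, empty-record removal, and deduplication are assumed operations, not the patent's.

```python
def feature_engineer(internal_data):
    """One possible feature-engineering pass over internal data: normalize
    each record, drop empty records, and deduplicate. The concrete operations
    are assumptions; the patent only states the sets are feature-engineered."""
    seen, cleaned = set(), []
    for record in internal_data:
        normalized = record.strip().lower()
        if normalized and normalized not in seen:
            seen.add(normalized)
            cleaned.append(normalized)
    return cleaned
```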
In the apparatus provided in the above embodiment of the present disclosure, the first acquiring unit 601 acquires a category information set sent by a target terminal as an external category information set. In response to no target external category information existing in the external category information set, the determining unit 602 determines a pre-trained model as the trained classification model, where the target external category information is external category information that does not match internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and training samples in the training sample set comprise internal category information in the internal category information set and internal data corresponding to the internal category information. Finally, the first sending unit 603 sends the calling interface of the trained classification model to the target terminal. As a result, a user only needs to upload data for training the model (e.g., the category information set) to generate a classification model that classifies data according to the categories indicated by the uploaded category information, which improves the model training speed as well as the accuracy and recall of the trained model. Moreover, the classification model can be called by the user without being stored on the target terminal, which reduces the resource occupation of the target terminal, saves its computing resources, and reduces its hardware wear.
Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for use as a server in implementing embodiments of the present disclosure. The server shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the computer system 700 includes a central processing unit (CPU) 701, which can perform various appropriate actions and processes in accordance with a program stored in a read-only memory (ROM) 702 or a program loaded from a storage section 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data necessary for the operation of the system 700. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by a Central Processing Unit (CPU)701, performs the above-described functions defined in the method of the present disclosure.
It should be noted that the computer readable medium in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Python, Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the remote-computer case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a first acquisition unit, a determination unit, and a first transmission unit. The names of these units do not in some cases form a limitation on the unit itself, and for example, the first acquisition unit may also be described as a "unit that acquires a set of category information transmitted by a target terminal".
As another aspect, the present disclosure also provides a computer-readable medium, which may be contained in the server described in the above embodiments; or may exist separately and not be assembled into the server. The computer readable medium carries one or more programs which, when executed by the server, cause the server to: acquiring a category information set sent by a target terminal as an external category information set; in response to the fact that target external category information does not exist in the external category information set, determining a pre-trained model as a trained classification model, wherein the target external category information is external category information which is not matched with internal category information in the pre-determined internal category information set, the pre-trained model is obtained by training on the basis of a pre-determined training sample set, and training samples in the training sample set comprise internal category information in the internal category information set and internal data corresponding to the internal category information; and sending the calling interface of the classification model obtained by training to the target terminal.
The foregoing description presents only the preferred embodiments of the disclosure and illustrates the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the present disclosure is not limited to technical solutions formed by the specific combination of the above features; it also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, a solution formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present disclosure.

Claims (14)

1. A method for transmitting information, comprising:
acquiring a category information set sent by a target terminal as an external category information set;
in response to that no target external category information exists in the external category information set, determining a pre-trained model as a trained classification model, wherein the target external category information is external category information which is not matched with internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and training samples in the training sample set comprise internal category information in the internal category information set and internal data corresponding to the internal category information;
and sending a calling interface of the classification model obtained by training to the target terminal.
2. The method of claim 1, wherein the method further comprises:
acquiring a test sample set sent by the target terminal via the calling interface, wherein test samples in the test sample set comprise data and category information of the data;
inputting the data in the test sample set to a classification model corresponding to the calling interface to obtain class information output by the classification model;
generating, based on the category information output by the classification model and the category information in the test sample set, at least one of the following items of evaluation information of the classification model: accuracy, recall, F1 score;
and sending the generated evaluation information to the target terminal.
3. The method of claim 1, wherein prior to the sending of the trained classification model's calling interface to the target terminal, the method further comprises:
responding to the target external category information in the external category information set, and acquiring a target external data set which is sent by the target terminal and corresponds to the target external category information; training an initial model by adopting a machine learning algorithm based on the external category information set, the target external data set and the target internal data set, and determining the initial model meeting a predetermined training end condition as a classification model obtained by training; the target internal data set is an internal data set corresponding to internal category information matched with the external category information in the external category information set.
4. The method of claim 3, wherein prior to the sending of the trained classification model's calling interface to the target terminal, the method further comprises:
calculating a remaining time based on the number of external data in the target external data set and the number of internal data in the target internal data set, wherein the remaining time is used for indicating a time difference between the time of training the obtained classification model and the current time;
and sending the remaining time to the target terminal.
5. The method according to one of claims 1-4, wherein the method further comprises:
acquiring target data to be classified sent by the target terminal through the calling interface;
inputting the target data to be classified into a classification model corresponding to the calling interface, and generating class information of the target data to be classified;
and sending the generated category information to the target terminal.
6. The method according to one of claims 1 to 4, wherein the internal category information set and the internal data set corresponding to the internal category information in the internal category information set are obtained after feature engineering.
7. An apparatus for transmitting information, comprising:
a first acquisition unit configured to acquire a category information set transmitted by a target terminal as an external category information set;
a determining unit configured to determine a pre-trained model as a trained classification model in response to no target external category information existing in the external category information set, wherein the target external category information is external category information not matching internal category information in a predetermined internal category information set, the pre-trained model is trained on a predetermined training sample set, and training samples in the training sample set comprise internal category information in the internal category information set and internal data corresponding to the internal category information;
and the first sending unit is configured to send the trained calling interface of the classification model to the target terminal.
8. The apparatus of claim 7, wherein the apparatus further comprises:
the second acquisition unit is configured to acquire a test sample set sent by the target terminal, wherein the test samples in the test sample set comprise data and class information of the data;
the input unit is configured to input the data in the test sample set into a trained classification model to obtain class information output by the classification model;
a generating unit configured to generate at least one of the following evaluation information of the classification model based on the classification information output by the classification model and the classification information in the test sample set: accuracy, recall, F1 score;
a second transmitting unit configured to transmit the generated evaluation information to the target terminal.
9. The apparatus of claim 7, wherein the apparatus further comprises:
a third obtaining unit, configured to obtain a target external data set corresponding to target external category information sent by the target terminal in response to the target external category information existing in the external category information set; training an initial model by adopting a machine learning algorithm based on the external category information set, the target external data set and the target internal data set, and determining the initial model meeting a predetermined training end condition as a classification model obtained by training; the target internal data set is an internal data set corresponding to internal category information matched with the external category information in the external category information set.
10. The apparatus of claim 9, wherein the apparatus further comprises:
a calculating unit configured to calculate a remaining time based on the number of external data in the target external data set and the number of internal data in the target internal data set, wherein the remaining time is used for indicating a time difference between the time at which the trained classification model is obtained and the current time;
a third transmitting unit configured to transmit the remaining time to the target terminal.
11. The apparatus according to one of claims 7-10, wherein the apparatus further comprises:
a fourth obtaining unit, configured to obtain target data to be classified sent by the target terminal via the calling interface;
the input unit is configured to input the target data to be classified to the classification model corresponding to the calling interface, and generate class information of the target data to be classified;
a fourth transmitting unit configured to transmit the generated category information to the target terminal.
12. The apparatus according to one of claims 7 to 10, wherein the internal category information set and the internal data set corresponding to the internal category information in the internal category information set are obtained by feature engineering.
13. A server, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
14. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-6.
CN201910575820.6A 2019-06-28 2019-06-28 Method and apparatus for transmitting information Active CN110288089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910575820.6A CN110288089B (en) 2019-06-28 2019-06-28 Method and apparatus for transmitting information

Publications (2)

Publication Number Publication Date
CN110288089A CN110288089A (en) 2019-09-27
CN110288089B 2021-07-09

Family

ID=68019591

Country Status (1)

Country Link
CN (1) CN110288089B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106446754A (en) * 2015-08-11 2017-02-22 阿里巴巴集团控股有限公司 Image identification method, metric learning method, image source identification method and devices
CN107766940A (en) * 2017-11-20 2018-03-06 北京百度网讯科技有限公司 Method and apparatus for generation model
CN108197664A (en) * 2018-01-24 2018-06-22 北京墨丘科技有限公司 Model acquisition methods, device, electronic equipment and computer readable storage medium
CN108304936A (en) * 2017-07-12 2018-07-20 腾讯科技(深圳)有限公司 Machine learning model training method and device, facial expression image sorting technique and device
US10115032B2 (en) * 2015-11-04 2018-10-30 Nec Corporation Universal correspondence network
CN108830235A (en) * 2018-06-21 2018-11-16 北京字节跳动网络技术有限公司 Method and apparatus for generating information

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10579925B2 (en) * 2013-08-26 2020-03-03 Aut Ventures Limited Method and system for predicting outcomes based on spatio/spectro-temporal data
US20160180214A1 (en) * 2014-12-19 2016-06-23 Google Inc. Sharp discrepancy learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Effective distributed convolutional neural network architecture for remote sensing images target classification with a pre-training approach; LI Binquan et al.; Journal of Systems Engineering and Electronics; 2019-05-13; pp. 238-244 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant