CN117785964A - Data processing method and system applied to network service


Info

Publication number
CN117785964A
Authority
CN
China
Prior art keywords: network, class, mining, training, application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410221256.9A
Other languages
Chinese (zh)
Inventor
吕安全
彭小琴
吕莉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yibin Wanshitong Network Information Service Co ltd
Original Assignee
Yibin Wanshitong Network Information Service Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yibin Wanshitong Network Information Service Co ltd filed Critical Yibin Wanshitong Network Information Service Co ltd
Priority to CN202410221256.9A priority Critical patent/CN117785964A/en
Publication of CN117785964A publication Critical patent/CN117785964A/en
Pending legal-status Critical Current


Abstract

The application provides a data processing method and system applied to network services. By introducing a reference class network session training text and the reference class priori labeling knowledge it carries, existing priori mining results can be fully utilized in the data processing of network services, improving mining accuracy and efficiency. In each cyclic network training stage, the optimized application class guided network is converted into a new application class guided network by means of migration parameter learning, and the new network is used for mining application class network session training texts, further improving the mining effect. Finally, by generating the first network service mining network, efficient mining of application class network session text and reference class network session text is achieved; this network inherits the optimization results of the previous cyclic network training stages, can be flexibly adjusted and optimized according to actual network service mining requirements, and fully accounts for the error discrimination of network parameter learning and the characteristics of migration parameter learning.

Description

Data processing method and system applied to network service
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a data processing method and system applied to network services.
Background
With the rapid development and popularization of network technology, network services have become an integral part of people's daily lives. In network services, network session text is an important medium of interaction between users and service providers and carries rich information about user demands and behavior. Therefore, effectively mining and analyzing network session text is of great significance for improving the quality of network services and the user experience.
However, conventional network service mining methods often face a number of challenges. First, network session text is highly diverse and complex, making it difficult for the mining process to accurately capture the critical information it contains. Second, the real-time nature of network services requires that the mining method have efficient processing capability to cope with large-scale network session data. In addition, how to fully utilize prior knowledge to guide the mining process and improve mining accuracy and efficiency is also a problem that is difficult to solve with traditional methods.
Disclosure of Invention
In view of this, the present application aims to provide a data processing method and system applied to network services, which gradually generate an accurate and efficient network service mining network by introducing a reference type network session training text and a reference type priori labeling knowledge carried by the reference type network session training text and combining the ideas of cyclic network training and migration parameter learning. Specifically, in each cyclic network training stage, the application class guided network is subjected to network parameter learning according to the mining error generated in the previous stage, so that the network performance is continuously optimized. Meanwhile, the optimized network is converted into a new application class guiding network in a migration parameter learning mode, and the new application class guiding network is used for mining application class network session training texts. The finally generated first network service mining network can fully utilize priori knowledge, accurately and efficiently mine key information in the network session text, and provide powerful support for optimizing and improving the network service. The method not only improves the accuracy and efficiency of network service mining, but also has stronger flexibility and expandability. By adjusting the network structure and parameter settings, the system can adapt to network service mining tasks of different types and scales.
According to a first aspect of the present application, there is provided a data processing method applied to a network service, the method comprising:
acquiring a reference type network session training text and an application type network session training text, wherein the reference type network session training text carries reference type priori labeling knowledge, and the reference type priori labeling knowledge characterizes the priori mining result of the reference type network session training text in a target network service mining process;
in the (k+1)th cyclic network training stage, mining the content of the reference class network session training text through the application class guided network generated in the kth cyclic network training stage, and performing network parameter learning error discrimination according to the reference class priori labeling knowledge to generate a first reference class training error, wherein k is a positive integer;
according to the first reference class training error, performing network parameter learning on the application class guided network generated in the kth cyclic network training stage, and generating a reference application class guided network;
the application class guidance network generated in the (k+1) th cyclic network training stage is used for mining the application class network session training text to generate application class fuzzy mining content, the reference application class guided network is used for mining the application class network session training text, network parameter learning error discrimination is carried out according to the application class fuzzy mining content to generate a first application characteristic space training error, and the application class guidance network generated in the (k+1) th cyclic network training stage is a neural network generated by carrying out migration parameter learning on the reference application class guided network;
And carrying out network parameter learning on the reference application class guided network according to the first application feature space training error, and generating an application class guided network generated in the (k+1) th cyclic network training stage until a first network service mining network is generated, wherein the first network service mining network is used for mining application class network session texts and reference class network session texts.
In a possible implementation manner of the first aspect, the method further includes:
in the (k+1) th cyclic network training stage, according to the application class generated in the (k) th cyclic network training stage, guiding a network to perform semantic scene vector suitability analysis on the reference class network session training text and the application class network session training text, and generating a semantic scene suitability error;
the generating the reference application class guided network by performing network parameter learning on the application class guided network generated in the kth cyclic network training stage according to the first reference class training error includes:
and according to the semantic scene suitability error and the first reference class training error, performing network parameter learning on the application class guided network generated in the kth cycle network training stage, and generating the reference application class guided network.
In a possible implementation manner of the first aspect, the performing, by the application class guided network generated in the kth cyclic network training stage, semantic scene vector suitability parsing on the reference class network session training text and the application class network session training text to generate a semantic scene suitability error includes:
extracting the reference class semantic embedded vector of the reference class network session training text through the application class guided network generated in the kth cyclic network training stage;
extracting the application class semantic embedded vector of the application class network session training text through the application class guided network generated in the kth cyclic network training stage;
performing service risk mining on the reference class semantic embedded vector to generate first mining content, wherein the first mining content characterizes the confidence coefficient of the reference class network session training text to which the reference class semantic embedded vector belongs;
performing service risk mining on the application class semantic embedded vector to generate second mining content, wherein the second mining content characterizes the confidence coefficient of the application class network session training text to which the application class semantic embedded vector belongs;
Determining the semantic scene suitability error based on a direction of decreasing a feature distance between the first mined content and the second mined content, increasing a feature distance between the reference class semantic embedding vector and the application class semantic embedding vector.
In a possible implementation manner of the first aspect, the step of determining the semantic scene suitability error based on a direction of decreasing a feature distance between the first mined content and the second mined content and increasing a feature distance between the reference class semantic embedded vector and the application class semantic embedded vector includes:
calculating a first feature distance between the first mined content and the second mined content;
calculating a second feature distance between the reference class semantic embedded vector and the application class semantic embedded vector;
based on the first feature distance and the second feature distance, a corresponding suitability error function is defined, which is used to define respective weights for the first feature distance and the second feature distance, so that the first feature distance is minimized and the second feature distance is maximized.
In a possible implementation manner of the first aspect, before the mining of the application class network session training text by the application class guidance network generated in the (k+1)th cyclic network training stage to generate the application class fuzzy mining content, the method includes:
acquiring first network function layer definition information of the guided network of the reference application class;
acquiring second network function layer definition information of an application class guidance network generated in a kth cyclic network training stage;
integrating the first network function layer definition information and the second network function layer definition information based on the set network optimization parameters to generate integrated network function layer definition information;
and generating an application class guidance network generated in the (k+1) th cyclic network training stage according to the integrated network function layer definition information.
In a possible implementation manner of the first aspect, after the mining of the application class network session training text by the application class guidance network generated in the (k+1)th cyclic network training stage to generate the application class fuzzy mining content, the method further includes:
the method comprises the steps of mining content of a reduced guided network generated in a kth cycle network training stage on a reference class network session training text, judging network parameter learning errors according to the reference class priori labeling knowledge, and generating a second reference class training error, wherein the parameter quantity of network function layer definition information of the reduced guided network is smaller than that of the application class guiding network;
The reduced guided network generated in the kth cycle network training stage is used for carrying out network parameter learning error discrimination on the mining content of the application type network session training text according to the application type fuzzy mining content, and a second application feature space training error is generated;
and according to the second reference class training error and the second application feature space training error, performing network parameter learning on the reduced guided network generated in the kth cyclic network training stage, and generating a reduced guided network generated in the kth+1th cyclic network training stage until a second network service mining network is generated, wherein the second network service mining network is used for mining the application class network session text and the reference class network session text.
In a possible implementation manner of the first aspect, the mining of the application class network session training text by the application class guidance network generated in the (k+1)th cyclic network training stage to generate the application class fuzzy mining content includes:
the application class guidance network generated in the (k+1) th cyclic network training stage is used for mining the application class network session training text, and reference fuzzy mining content corresponding to the application class network session training text and a credibility value corresponding to the reference fuzzy mining content are generated;
And if the credibility value corresponding to the reference fuzzy mining content is larger than the set credibility value, determining the reference fuzzy mining content as the application class fuzzy mining content.
In a possible implementation manner of the first aspect, the step of mining the application class network session training text by the reference application class guided network and performing network parameter learning error discrimination according to the application class fuzzy mining content to generate a first application feature space training error includes:
acquiring a credibility value corresponding to the application class fuzzy mining content;
the reference application class guided network is used for carrying out mining contents of the application class network session training text, and network parameter learning error discrimination is carried out according to the application class fuzzy mining contents, so that a reference application characteristic space training error is generated;
and fusing the reference application feature space training errors according to the credibility value to generate the first application feature space training errors.
In a possible implementation manner of the first aspect, the method further includes:
and mining the application type network session text and the reference type network session text based on the first network service mining network to generate mining results.
According to a second aspect of the present application, there is provided a data processing system for use in a network service, the data processing system for use in a network service comprising a machine-readable storage medium storing machine-executable instructions and a processor, the processor, when executing the machine-executable instructions, implementing the data processing method for use in a network service as described above.
According to a third aspect of the present application, there is provided a computer-readable storage medium having stored therein computer-executable instructions which, when executed, implement the aforementioned data processing method applied to a network service.
According to any one of the aspects, the technical effects of the application are as follows:
by introducing the reference class network session training text and the reference class priori annotation knowledge carried by the reference class network session training text, the existing priori mining results can be fully utilized in the data processing process of network service, and the mining accuracy and efficiency are improved. In each cyclic network training stage, according to the mining content of the application class generated in the previous stage on the reference class network session training text by the guided network, and combining the reference class priori labeling knowledge to judge the network parameter learning error, thereby generating more accurate training errors. Secondly, by adopting a loop network training mode, a more accurate and efficient application class guided network is gradually generated through continuous iteration and optimization. In each cycle, the application class guided network is subjected to network parameter learning according to the training error generated in the previous stage, so that the application class guided network can better adapt to the requirements of network service mining tasks. Meanwhile, the optimized application class guided network is converted into a new application class guided network by means of transfer parameter learning, and the new application class guided network is used for mining the application class network session training text, so that the mining effect is further improved. Finally, by generating a first network service mining network, efficient mining of application class network session text and reference class network session text is achieved. The first network service mining network not only inherits the optimization result of the previous cyclic network training stage, but also can flexibly adjust and optimize according to the actual network service mining requirement, and provides powerful support for the data processing of the network service. Meanwhile, the generation process of the first network service mining network fully considers the characteristics of error discrimination and migration parameter learning of network parameter learning, and ensures the accuracy and high efficiency of the first network service mining network in the network service mining task.
That is, the main objective of the embodiments of the present application is to optimize the mining process of the web session text, and perform multiple cycle web training and error discrimination by using a priori labeling knowledge of the reference web session training text and applying the fuzzy mining content of the web session training text, so as to generate the final web service mining network. By using a priori labeled knowledge of the reference class web session training text, the method can more accurately guide the web parameter learning process, thereby reducing training errors. In the multi-cycle network training stage, the network structure can be gradually optimized by learning the migration parameters of the reference application class guided network and the application class guided network, so that the efficiency and the precision of network service mining can be improved. By performing network parameter learning error discrimination on the fuzzy mining content of the application-class web session training text, the network parameter learning process can be further optimized, thereby generating a more effective network service mining network. Finally, through the generated network service mining network, the application type network session text and the reference type network session text can be mined more effectively, so that the data processing efficiency and accuracy of the network service are improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting in scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a data processing method applied to a network service according to an embodiment of the present application.
Fig. 2 is a schematic component structure of a data processing system applied to a network service for implementing the data processing method applied to a network service according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. It should be understood that the drawings in the present application are only for illustration and description and are not intended to limit the protection scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this application, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of the flowchart may be implemented out of order, and that steps without a logical contextual relationship may be performed in reverse order or concurrently. Furthermore, a person skilled in the art, under the guidance of this application, may add at least one other operation to the flowchart or remove at least one operation from the flowchart.
As shown in fig. 1, which is a flowchart of the data processing method applied to a network service according to an embodiment of the present application, it should be understood that, in other embodiments, the order of some of the steps of the method may be exchanged according to actual needs, or some of the steps may be omitted or deleted. The detailed steps of the data processing method applied to the network service include:
step S110, a reference type network session training text and an application type network session training text are obtained, wherein the reference type network session training text carries reference type priori labeling knowledge, and the reference type priori labeling knowledge characterizes the priori mining result of the reference type network session training text in the target network service mining process.
The reference type web session training text refers to a group of web session text data which is subjected to labeling processing and is used for training a machine learning model. Such data typically originates from real network session scenarios such as chat rooms, forums, social media, and the like. The labeling process refers to adding additional information to the text to indicate the category, topic, or other important feature that it belongs to. For example, assuming an intelligent customer service system is being developed, different types of requests such as complaints, consultations and advice in a user session need to be identified. Then, some historical customer service chat records may be collected and the expert may be asked to annotate the records, indicating the type of each record (e.g., complaint, consultation, etc.). These annotated records constitute the reference class web session training text.
The application type web session training text refers to another group of original web session text data which is not marked and processed, and is also derived from a real web session scene. These data will be used to train the model and expect the model to automatically learn patterns and rules therein. Taking an intelligent customer service system as an example, besides the marked reference text, a large number of unlabeled customer service chat records exist. These records will be used directly to train the model to verify the performance of the model in practical applications.
The reference class priori annotation knowledge refers to annotation information appended to the reference class web session training texts, which represents known results or priori information of these texts in a certain web service mining task. Such knowledge is typically derived based on expert judgment, historical data, or other reliable sources. For example, in the case of an intelligent customer service system, the a priori labeling knowledge is a type tag (e.g., complaint, consultation, etc.) specified by an expert for each reference class text. These labels are known during the training process and are used to guide the model in learning how to correctly classify new conversational text.
The target web service mining process refers to a process of analyzing and mining web session text to achieve a certain target in a specific web service scene. This goal may be to improve customer satisfaction, optimize service flows, find potential problems, etc. For example, in the case of intelligent customer service systems, the target web service mining process automatically recognizes the user's different types of requests by analyzing and mining customer service chat records and provides personalized service responses accordingly. This process aims to improve customer service efficiency, reduce human intervention, and enhance user experience.
Thus, in the present embodiment, a data processing system applied to a web service is first connected as a server to a web session database storing a large amount of web session data. The server retrieves the reference class web session training text and the application class web session training text from the web session database by executing the SQL query statement.
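By way of a non-limiting illustration only, the retrieval described above could be sketched as follows in Python; the `web_sessions` table, its columns, and the use of sqlite3 are assumptions made for the example and are not specified by the embodiment.

```python
import sqlite3  # any SQL client would do; sqlite3 is used here purely for illustration

def load_training_texts(db_path: str):
    """Fetch reference class (labeled) and application class (unlabeled) session texts.

    Assumes a hypothetical web_sessions(text, label) table: rows with a label form
    the reference class corpus, rows without one form the application class corpus.
    """
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.execute("SELECT text, label FROM web_sessions WHERE label IS NOT NULL")
    reference_texts = cur.fetchall()          # [(text, prior_annotation), ...]
    cur.execute("SELECT text FROM web_sessions WHERE label IS NULL")
    application_texts = [row[0] for row in cur.fetchall()]
    conn.close()
    return reference_texts, application_texts
```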
Step S120, in the (k+1)th cyclic network training stage, mining the content of the reference class network session training text through the application class guided network generated in the kth cyclic network training stage, and performing network parameter learning error discrimination according to the reference class priori labeling knowledge to generate a first reference class training error, wherein k is a positive integer.
In this embodiment, at the beginning of the (k+1)th cyclic network training phase, the server loads the application class guided network generated in the kth phase. The server then inputs the reference class training text into the model and performs forward propagation calculation to obtain the model's mining result (such as a predicted satisfaction label or topic classification) for the reference class network session training text.
The server then compares the mining results with a priori labeled knowledge carried by the reference class web session training text. The server evaluates the performance of the application class guided network when processing the reference class network session training text by calculating indexes such as accuracy, recall, F1 score and the like, and calculates a first reference class training error according to the performance.
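A minimal sketch of this error computation is given below, assuming the reference class priori labeling knowledge is a class index per text and the guided network outputs class logits; cross-entropy is used here as one plausible realization of the discriminated error and is not mandated by the embodiment.

```python
import torch
import torch.nn.functional as F

def first_reference_class_error(guided_net, ref_inputs, ref_labels):
    """Mine the reference class texts with the k-th stage application class guided
    network and score the result against the prior annotations (step S120)."""
    logits = guided_net(ref_inputs)                 # forward-propagation mining result
    error = F.cross_entropy(logits, ref_labels)     # deviation from prior labeling knowledge
    accuracy = (logits.argmax(dim=1) == ref_labels).float().mean().item()
    return error, accuracy
```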
And step S130, performing network parameter learning on the application class guided network generated in the kth cyclic network training stage according to the first reference class training error, and generating a reference application class guided network.
In this embodiment, according to the first reference class training error calculated in step S120, the server starts to adjust parameters of the guided network of the application class generated in the kth loop network training phase. In particular, the weights and bias terms of the network may be updated using a back-propagation algorithm and gradient descent optimizer to reduce training errors.
In the parameter adjustment process, the server can try different super-parameter settings such as learning rate, batch size, iteration number and the like to find the optimal optimization strategy. Meanwhile, in order to prevent the occurrence of the overfitting phenomenon, the server may further employ regularization techniques (such as L1 regularization and L2 regularization) or Dropout strategies to enhance the generalization capability of the model.
After multiple rounds of iterative optimization, when the training error reaches a preset threshold value or the iteration number reaches an upper limit, the server stops training and stores the current network state as a reference application class to be used as a guided network. This newly generated reference application class is directed to the network to have greater accuracy and robustness in processing the reference class web session training text.
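The parameter learning described in step S130 could be sketched as the following training loop; the Adam optimizer, the weight decay used as L2 regularization, and the stopping thresholds are illustrative assumptions rather than values fixed by the embodiment.

```python
import copy
import torch

def learn_reference_application_guided_net(guided_net, error_fn, data_loader,
                                            lr=1e-4, weight_decay=1e-5,
                                            max_epochs=50, error_threshold=1e-3):
    """Back-propagation / gradient-descent update of the k-th stage application class
    guided network, stopping when the mean training error falls below a preset
    threshold or the iteration cap is reached (step S130)."""
    optimizer = torch.optim.Adam(guided_net.parameters(), lr=lr, weight_decay=weight_decay)
    for _ in range(max_epochs):
        epoch_error = 0.0
        for inputs, labels in data_loader:
            optimizer.zero_grad()
            error, _ = error_fn(guided_net, inputs, labels)  # e.g. the first reference class training error
            error.backward()                                 # back-propagation
            optimizer.step()                                 # gradient-descent update of weights and biases
            epoch_error += error.item()
        if epoch_error / len(data_loader) < error_threshold:
            break                                            # preset threshold reached
    # persist the current state as the reference application class guided network
    return copy.deepcopy(guided_net)
```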
Step S140, the application class network session training text is mined through an application class guidance network generated in the (k+1) th cyclic network training stage, application class fuzzy mining content is generated, the reference application class guided network is used for mining the application class network session training text, network parameter learning error discrimination is carried out according to the application class fuzzy mining content, a first application feature space training error is generated, and the application class guidance network generated in the (k+1) th cyclic network training stage is a neural network generated by learning migration parameters of the reference application class guided network.
In the (k+1) th cyclic network training phase, the server loads the newly generated application class guidance network (this is obtained by performing migration parameter learning on the reference application class guided network). And then, the server inputs the application type network session training text into the application type guiding network to perform forward propagation calculation, so as to obtain the mining result of the application type guiding network on the application type network session training text.
These mining results may be ambiguous or uncertain because the application class network session training text has no prior labeling knowledge available for comparative verification. To evaluate the performance of the application class guidance network in processing the application class network session training text and to calculate the training error, the server adopts an unsupervised approach: it uses a similarity measure (such as cosine similarity or Euclidean distance) to compare the two mining results for the same batch of application class network session training text, i.e., the results obtained by the reference application class guided network and the application class guidance network respectively. The difference value calculated in this way is referred to as the first application feature space training error. This error value reflects the stability and consistency of the reference application class guided network when processing data in the actual application scenario.
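One plausible realization of this consistency check is sketched below, assuming both networks output a class distribution for each session text; the use of cosine similarity is one of the measures mentioned above, and the (1 - similarity) form of the error is an assumption.

```python
import torch
import torch.nn.functional as F

def first_application_feature_space_error(reference_guided_net, application_guidance_net, app_inputs):
    """Compare the two mining results for the same batch of application class session
    texts (step S140). The application class fuzzy mining content acts as a fixed
    target, so low consistency yields a high error."""
    with torch.no_grad():
        fuzzy_content = F.softmax(application_guidance_net(app_inputs), dim=1)  # application class fuzzy mining content
    reference_content = F.softmax(reference_guided_net(app_inputs), dim=1)
    consistency = F.cosine_similarity(reference_content, fuzzy_content, dim=1)
    return 1.0 - consistency.mean()
```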
Step S150, performing network parameter learning on the reference application class guided network according to the first application feature space training error, and generating the application class guided network of the (k+1)th cyclic network training stage, until a first network service mining network is generated, where the first network service mining network is used for mining application class network session text and reference class network session text.
According to the first application feature space training error calculated in step S140, the server again performs parameter adjustment and optimization on the reference application class guided network. Similar to step S130, the weights and bias terms of the network are updated using a back-propagation algorithm and gradient descent optimizer, but this time with the goal of minimizing the application feature space training error.
During parameter adjustment, the server may also try different super parameter settings and regularization strategies to find the best optimization scheme. In addition, to further improve the performance of the model and prevent the occurrence of the over-fitting phenomenon, the server may also use an ensemble learning method (e.g. Bagging, boosting, etc.) to combine the prediction results of the multiple models.
After multiple rounds of iterative optimization, when the training error of the application feature space reaches a preset threshold value or the iteration number reaches an upper limit, the server stops training and stores the current network state as an application class guided network generated in the (k+1) th cyclic network training stage. This newly generated network model has greater accuracy and stability in processing application-like web session training text.
Finally, when all the cyclic network training phases are completed (i.e., the preset maximum number of cycles is reached or other termination conditions are met), the server will generate a final network service mining network, the first network service mining network. The network can effectively mine and analyze the reference type network session text and the application type network session text at the same time, and provides powerful support for future network services.
Based on the steps, the prior mining result can be fully utilized in the data processing process of the network service by introducing the reference type network session training text and the prior labeling knowledge of the reference type carried by the reference type network session training text, so that the mining accuracy and efficiency are improved. In each cyclic network training stage, according to the mining content of the application class generated in the previous stage on the reference class network session training text by the guided network, and combining the reference class priori labeling knowledge to judge the network parameter learning error, thereby generating more accurate training errors. Secondly, by adopting a loop network training mode, a more accurate and efficient application class guided network is gradually generated through continuous iteration and optimization. In each cycle, the application class guided network is subjected to network parameter learning according to the training error generated in the previous stage, so that the application class guided network can better adapt to the requirements of network service mining tasks. Meanwhile, the optimized application class guided network is converted into a new application class guided network by means of transfer parameter learning, and the new application class guided network is used for mining the application class network session training text, so that the mining effect is further improved. Finally, by generating a first network service mining network, efficient mining of application class network session text and reference class network session text is achieved. The first network service mining network not only inherits the optimization result of the previous cyclic network training stage, but also can flexibly adjust and optimize according to the actual network service mining requirement, and provides powerful support for the data processing of the network service. Meanwhile, the generation process of the first network service mining network fully considers the characteristics of error discrimination and migration parameter learning of network parameter learning, and ensures the accuracy and high efficiency of the first network service mining network in the network service mining task.
That is, the main objective of the embodiments of the present application is to optimize the mining process of the web session text, and perform multiple cycle web training and error discrimination by using a priori labeling knowledge of the reference web session training text and applying the fuzzy mining content of the web session training text, so as to generate the final web service mining network. By using a priori labeled knowledge of the reference class web session training text, the method can more accurately guide the web parameter learning process, thereby reducing training errors. In the multi-cycle network training stage, the network structure can be gradually optimized by learning the migration parameters of the reference application class guided network and the application class guided network, so that the efficiency and the precision of network service mining can be improved. By performing network parameter learning error discrimination on the fuzzy mining content of the application-class web session training text, the network parameter learning process can be further optimized, thereby generating a more effective network service mining network. Finally, through the generated network service mining network, the application type network session text and the reference type network session text can be mined more effectively, so that the data processing efficiency and accuracy of the network service are improved.
In one possible embodiment, the method further comprises:
and step A110, in the (k+1) th cyclic network training stage, guiding a network to perform semantic scene vector suitability analysis on the reference class network session training text and the application class network session training text according to the application class generated in the (k) th cyclic network training stage, and generating a semantic scene suitability error.
In this embodiment, in the (k+1) th cyclic network training phase, the server first loads the application class generated in the kth cyclic network training phase to be directed to the network. This network already has some processing power for web session text.
The server inputs the reference class web session training text and the application class web session training text simultaneously into the application class guided network. And the network performs suitability analysis of semantic scene vectors on the two types of texts. By semantic context vector, it is meant that semantic information in text is converted into a vector representation in high-dimensional space for mathematical operations and comparison. The suitability resolution is to check whether these vectors match the current learning state of the network.
Thus, the server gets a semantic scene suitability error. This error reflects the difference between the semantic scene vector it generates and the current learning state of the network when it processes both types of text. If the error is large, the semantic understanding of the two types of texts by the network is not accurate enough or has deviation.
Step S130 may include: and according to the semantic scene suitability error and the first reference class training error, performing network parameter learning on the application class guided network generated in the kth cycle network training stage, and generating the reference application class guided network.
After the semantic scene suitability error is obtained, the server prepares to conduct parameter adjustment on the application class generated in the kth cyclic network training stage by guiding the network.
In detail, the server not only considers the first reference class training error calculated before (namely, the difference between the network and the actual labeling knowledge when processing the reference class text), but also combines the semantic scene suitability error just obtained. The two errors are used as the basis of network parameter learning.
In particular, the server uses a back-propagation algorithm and gradient descent optimizer to update the weights and bias terms of the network. In the updating process, the server adjusts the network parameters according to the magnitude and direction of the two errors, so that the network can be expected to more accurately understand the semantics and generate vector representations more conforming to the actual scene when the similar text is processed next time.
After a round of network parameter learning, the server generates a new network model, the reference application class is guided to the network. This network not only has higher accuracy in processing the reference class text (due to the first reference class training error being taken into account), but also generates a vector representation that better fits the actual semantic scene (due to the semantic scene suitability error being taken into account) when processing the application class text.
In this way, by combining the semantic scene suitability error and the first reference class training error to perform network parameter learning, the server can generate a more comprehensive and accurate network service mining model. This model will be better able to understand and process various web session text in future web services, providing a better quality of service experience.
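A hedged sketch of one parameter learning step that combines the first reference class training error with the semantic scene suitability error, as described above, is given below; the weighted-sum combination and the 0.5 weighting are assumptions, and `reference_error_fn` / `scene_error_fn` are hypothetical callables standing in for the two error computations.

```python
def parameter_learning_step(guided_net, optimizer, ref_batch, app_inputs,
                            reference_error_fn, scene_error_fn, scene_weight=0.5):
    """One update of the k-th stage application class guided network using both the
    first reference class training error and the semantic scene suitability error."""
    ref_inputs, ref_labels = ref_batch
    reference_error, _ = reference_error_fn(guided_net, ref_inputs, ref_labels)
    scene_error = scene_error_fn(guided_net, ref_inputs, app_inputs)
    total_error = reference_error + scene_weight * scene_error   # weighted combination of the two errors
    optimizer.zero_grad()
    total_error.backward()
    optimizer.step()
    return total_error.item()
```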
In one possible embodiment, step a110 may include:
and step A111, the application class generated according to the kth cyclic network training stage is guided to the network to extract the reference class semantic embedded vector of the reference class network session training text.
In the (k+1) th cyclic network training phase, the server first directs the network to process the reference class web session training text using the application class generated in the kth cyclic network training phase.
In detail, the server inputs the reference class web session training text into the application class guided network, and extracts the reference class semantic embedded vector through the internal operation of the network. These vectors are points in high-dimensional space that capture semantic information in the text such that similar text is located closer together in vector space.
Therefore, the server obtains the reference class semantic embedded vector set corresponding to the reference class web session training text.
Step A112, extracting the application class semantic embedded vector of the application class network session training text through the application class guided network generated in the kth cyclic network training stage.
The server then uses the same application class guided network to process application class web session training text.
In detail, the server inputs the application class web session training text into the network and extracts the application class semantic embedded vector. These vectors also capture semantic information in the text, but they are generated for the text in the actual application scenario.
Therefore, the server obtains the application class semantic embedded vector set corresponding to the application class web session training text.
And step A113, performing service risk mining on the reference class semantic embedded vector to generate first mining content, wherein the first mining content characterizes the confidence coefficient of the reference class network session training text to which the reference class semantic embedded vector belongs.
And step A114, performing service risk mining on the application-class semantic embedded vector to generate second mining contents, wherein the second mining contents characterize the confidence that the application-class semantic embedded vector belongs to the application-class network session training text.
The server now needs to further analyze the extracted semantic embedded vectors to evaluate the confidence that they each belong to their original text category.
In detail, the server first performs service risk mining on the reference class semantic embedded vector. This typically involves using some algorithm or model to calculate the degree of matching between each vector and the class to which it belongs. The result of the mining is a first mined content that characterizes the confidence that the reference class semantic embedded vector belongs to the reference class web session training text.
Then, the server performs similar service risk mining on the application class semantic embedded vector to generate second mining content, wherein the second mining content characterizes the confidence that the application class semantic embedded vector belongs to the application class web session training text.
Thus, the server obtains two mined contents, which respectively represent the matching degree of the semantic embedded vectors of the reference class and the application class and the original text class.
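As an illustrative sketch of the "service risk mining" described in steps A113 and A114, the confidence could be produced by a small feed-forward head over the semantic embedded vectors; the layer sizes and the sigmoid output are assumptions, since the embodiment does not fix an architecture.

```python
import torch
import torch.nn as nn

class ConfidenceMiner(nn.Module):
    """Maps a semantic embedded vector to the confidence that it belongs to its
    originating corpus (first / second mining content in steps A113 and A114)."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(embed_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
            nn.Sigmoid(),        # confidence in [0, 1]
        )

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        return self.head(embeddings).squeeze(-1)

# usage sketch: ref_vecs / app_vecs are the embeddings obtained in steps A111 / A112
# miner = ConfidenceMiner(embed_dim=256)
# first_mined_content = miner(ref_vecs)    # confidence for reference class vectors
# second_mined_content = miner(app_vecs)   # confidence for application class vectors
```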
Step a115, determining the semantic scene suitability error based on a direction of reducing the feature distance between the first mined content and the second mined content and increasing the feature distance between the reference class semantic embedded vector and the application class semantic embedded vector.
Finally, the server needs to determine a semantic scene suitability error, which will be used to guide further training of the network.
In detail, the server determines the error based on two principles: firstly, the feature distance between the first mined content and the second mined content is reduced, which means that a server hopes that a network can maintain consistency when processing different types of texts; and secondly, feature distance between the reference class semantic embedded vector and the application class semantic embedded vector is increased, which reflects that a server hopes that a network can better distinguish different types of texts.
The server synthesizes the semantic scene suitability error by calculating the difference between the two mined contents and the difference between the two types of vectors. This error value will be one of the basis for the adjustment of network parameters.
Thus, the server obtains a quantized semantic scene suitability error value reflecting the current network's adaptation and discrimination capabilities at the semantic level when processing reference class and application class text. This error value will be used to guide the next training and optimization of the network.
In one possible embodiment, step a115 may include:
step a1151, calculating a first feature distance between the first mined content and the second mined content.
Step a1152, calculating a second feature distance between the reference class semantic embedded vector and the application class semantic embedded vector.
Step a1153, defining a corresponding suitability error function based on the first feature distance and the second feature distance, where the suitability error function is used to define a corresponding weight for the first feature distance and the second feature distance, so as to minimize the first feature distance and maximize the second feature distance.
In this embodiment, the server has obtained the first mined content and the second mined content by service risk mining, and has characterized confidence degrees that the semantic embedded vectors of the reference class and the application class belong to respective original text classes respectively. Now, the server needs to calculate the feature distance between the two.
In detail, the server calculates a first feature distance between the first mined content and the second mined content using some distance measurement method (e.g., euclidean distance, cosine similarity, etc.). This distance reflects the difference in confidence level of the network when processing different types of text.
The server thus obtains a quantized first feature distance value that measures the consistency of the confidence of the network in processing the reference class and application class text.
Next, the server needs to calculate the feature distance between the reference class semantic embedded vector and the application class semantic embedded vector. This distance will reflect the ability of the network to distinguish between different types of text at the semantic level.
In detail, the server also uses a distance metric method to calculate the second feature distance between the two classes of semantic embedded vectors. This calculation takes into account the position and distribution of the vectors in the high-dimensional space to ensure that similar text is close in vector space, while different text is far away.
The server thus obtains a quantized second feature distance value that measures the distinguishing capability of the network at the semantic level for reference class and application class text.
Now, the server has calculated two key feature distance values. In order to integrate this information into a unified error metric for network training and optimization, the server needs to define an adaptive error function.
In detail, the server defines the suitability error function according to the first feature distance and the second feature distance. The goal of this function is to minimize the first feature distance (i.e., increase confidence consistency) while maximizing the second feature distance (i.e., enhance semantic level discrimination). To balance these two goals, the server will assign them corresponding weights.
The suitability error function may be a complex mathematical expression that combines the weighted first feature distance and second feature distance, and possibly other regularization terms or constraints. The server can flexibly control the processing modes and the importance degree of different types of texts in the network training process by adjusting the weight and the function form.
Thus, the server obtains an adaptability error function which comprehensively considers the confidence consistency and semantic discrimination capability of the network when processing different types of texts. This function will be the basis for the subsequent network training and optimization process.
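Putting steps A1151 to A1153 together, a minimal sketch of the suitability error function is given below; the concrete distance measures, the use of batch means, and the alpha / beta weights are assumptions, while the sign structure follows the stated goal of minimizing the first feature distance and maximizing the second.

```python
import torch

def suitability_error(first_mined, second_mined, ref_vecs, app_vecs,
                      alpha: float = 1.0, beta: float = 0.1):
    """Weighted suitability error of step A1153:
    error = alpha * d1 - beta * d2, so that minimizing it pulls the two mined
    confidence levels together (d1 down) while pushing the reference class and
    application class embeddings apart (d2 up)."""
    d1 = (first_mined.mean() - second_mined.mean()).abs()            # first feature distance
    d2 = torch.norm(ref_vecs.mean(dim=0) - app_vecs.mean(dim=0))     # second feature distance
    return alpha * d1 - beta * d2
```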
In one possible implementation manner, before the step S140, the method includes:
step B110, obtaining the first network function layer definition information of the guided network of the reference application class.
In this embodiment, before the server prepares to perform the (k+1) th cyclic network training phase, it first needs to acquire the first network function layer definition information of the guided network of the reference application class obtained by previous training. This information defines the structure of the network, the layer-to-layer connection relationships, and the specific functions and parameters of each layer.
In detail, the server extracts the first network function layer definition information by accessing a file or database storing a guided network model with reference to the application class. Such information may include parameters such as the type of layer (e.g., convolutional layer, pooled layer, fully-connected layer, etc.), the size of the layer (e.g., the number of neurons), the choice of activation function, and the weight and bias of the layer.
Thus, the server successfully acquires the definition information of the first network function layer of the guided network of the reference application class, and prepares for subsequent network integration and optimization.
And step B120, acquiring second network function layer definition information of the application class guidance network generated in the kth cyclic network training stage.
In this embodiment, the server needs to acquire the second network function layer definition information of the application class guidance network generated in the kth cyclic network training stage. This network is the product of the previous training phase, whose structure and parameters have been subjected to some learning and adjustment.
In detail, the server also extracts the second network function layer definition information by accessing a file or database storing the application class guidance network model generated in the kth cycle network training phase. This information is similar to the first network function layer definition information but may differ due to differences in training data and targets.
Therefore, the server successfully acquires the definition information of the second network function layer of the application class guidance network generated in the kth cyclic network training stage, and provides a necessary basis for the subsequent network integration.
And step B130, integrating the first network function layer definition information and the second network function layer definition information based on the set network optimization parameters to generate integrated network function layer definition information.
In this embodiment, the server now has two pieces of network function layer definition information, which needs to be integrated according to the set network optimization parameters to generate a new and more powerful network model.
In detail, the server first analyzes commonalities and differences between the first network function layer definition information and the second network function layer definition information. The server then uses some integration strategy (e.g., layer overlay, feature fusion, etc.) to effectively combine the two pieces of information based on the set network optimization parameters (e.g., learning rate, regularization strength, etc.). During the integration process, the server may adjust parameters or structures of certain layers to accommodate new training goals and data distributions.
Thus, the server successfully integrates the first network function layer definition information and the second network function layer definition information, and generates definition information of a new network model containing the advantages of the first network function layer definition information and the second network function layer definition information, namely the integrated network function layer definition information.
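The integration of the two sets of network function layer definition information can be read as a momentum-style blend of the two parameter sets, similar to an exponential-moving-average (mean-teacher) update; the sketch below is one assumed realization, with the momentum coefficient standing in for the "set network optimization parameter", and it presumes the two networks share the same architecture.

```python
import copy
import torch

@torch.no_grad()
def integrate_network_definitions(reference_guided_net, prev_guidance_net, momentum: float = 0.9):
    """Blend the parameters of the reference application class guided network with those
    of the k-th stage application class guidance network (steps B110 - B140), producing
    the application class guidance network for stage k+1."""
    new_guidance_net = copy.deepcopy(prev_guidance_net)
    for new_p, ref_p in zip(new_guidance_net.parameters(), reference_guided_net.parameters()):
        new_p.mul_(momentum).add_(ref_p, alpha=1.0 - momentum)   # momentum * old + (1 - momentum) * reference
    return new_guidance_net
```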
And step B140, generating an application class guidance network generated in the (k+1) th cyclic network training stage according to the integrated network function layer definition information.
In this embodiment, finally, the server needs to generate the application class guidance network of the kth+1th cyclic network training stage according to the integrated network function layer definition information. This new network will serve as the starting point for the next training phase, responsible for further learning and optimizing the application class session text processing capabilities in the network service.
In detail, the server constructs a new application class guidance network according to the network structure specified in the integrated network function layer definition information, the connection relationship between layers, and the specific function and parameter of each layer. During the build process, the server may utilize some deep learning framework or library (e.g., tensorFlow, pyTorch, etc.) to simplify operation and improve efficiency. At the same time, the server needs to configure the new network with appropriate training algorithms and optimizers in order to be able to learn and update its parameters efficiently in the following training phase.
Thus, the server successfully generates the application class guidance network of the (k+1) th cyclic network training stage, and fully prepares for the subsequent training and optimization work. This new network will combine the advantages of the first two networks and exhibit greater performance and generalization capability in processing application-like web session text.
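As a purely illustrative sketch of this construction step, assuming a PyTorch workflow in which the integrated network function layer definition information has been reduced to a parameter dictionary: the placeholder architecture, vocabulary size and learning rate below are assumptions of the sketch, not features of this embodiment.

import torch
import torch.nn as nn

class GuidanceNetwork(nn.Module):
    # Placeholder architecture standing in for the application class guidance network.
    def __init__(self, vocab_size=30000, hidden=256, num_labels=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_labels)

    def forward(self, token_ids):
        x = self.embed(token_ids)
        _, h = self.encoder(x)
        return self.head(h[-1])  # mining logits for each session text

integrated_state = {}  # assumed to be produced by the integration step above
guidance_net_k1 = GuidanceNetwork()
guidance_net_k1.load_state_dict(integrated_state, strict=False)  # adopt the integrated definition

# Configure a training algorithm and optimizer for the (k+1)th cyclic network training stage.
optimizer = torch.optim.Adam(guidance_net_k1.parameters(), lr=1e-4)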
In one possible implementation manner, after the step S140, the method includes:
And step C110, network parameter learning error discrimination is performed, according to the reference class priori labeling knowledge, on the mining content of the reduced guided network generated in the kth cyclic network training stage for the reference class network session training text, and a second reference class training error is generated, wherein the parameter quantity of the network function layer definition information of the reduced guided network is smaller than that of the network function layer definition information of the application class guidance network.
In this embodiment, after completing the (k+1) th cyclic network training stage and generating the application class guide network, the server now needs to utilize the reduced guided network generated in the kth cyclic network training stage to mine the reference class network session training text, and determine the network parameter learning error according to the reference class priori labeling knowledge, so as to generate the second reference class training error.
In detail, the server first processes the reference class web session training text using the reduced guided network generated in the kth cyclic network training stage to extract mined content.
Then, the server compares the extracted mined content with known prior labeling knowledge of the reference class, and calculates the difference between the mined content and the known prior labeling knowledge.
According to the differences, the server judges out the network parameter learning errors and generates second reference class training errors. This error reflects the degree of deviation of the reduced guided network from a priori knowledge in processing the reference class network session training text.
The server thus obtains a second reference class training error, which is a quantified indicator that measures the performance of the reduced guided network in processing the reference class web session training text.
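The error discrimination described above can be sketched, for illustration only, as a supervised loss between the reduced guided network's mining logits and the prior labels; the cross-entropy choice, the tensor shapes and the stand-in linear network below are assumptions of the sketch.

import torch
import torch.nn.functional as F

def second_reference_class_error(reduced_net, reference_features, prior_labels):
    # reduced_net: the reduced guided network generated in the kth cyclic network training stage.
    # reference_features: encoded reference class network session training texts.
    # prior_labels: the reference class priori labeling knowledge, as class indices.
    mined_logits = reduced_net(reference_features)
    # Quantify the deviation of the mined content from the prior labeling knowledge.
    return F.cross_entropy(mined_logits, prior_labels)

# Illustrative call with a stand-in reduced guided network:
reduced_net = torch.nn.Linear(128, 8)
features = torch.randn(16, 128)
labels = torch.randint(0, 8, (16,))
error = second_reference_class_error(reduced_net, features, labels)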
And step C120, the reduced guided network generated in the kth cyclic network training stage is used for mining the application class network session training text, and the network parameter learning error is discriminated according to the application class fuzzy mining content to generate a second application feature space training error.
In this embodiment, the server then needs to process the application-class web session training text using the same reduced guided network, and determine the network parameter learning error according to the previously generated application-class fuzzy mining content, so as to generate the second application feature space training error.
In detail, the server processes the application class web session training text using the reduced guided network generated in the kth cyclic network training stage to extract mined content.
Then, the server compares the extracted mined content with the previously generated application class fuzzy mined content, and calculates the difference between the extracted mined content and the previously generated application class fuzzy mined content.
According to the differences, the server judges out the network parameter learning errors and generates second application feature space training errors. This error reflects the degree of deviation of the reduced guided network from the fuzzy mining content in processing the application class web session training text.
Thus, the server obtains a second application feature space training error, which is also a quantified indicator for measuring the performance of the reduced guided network in processing application-class web session training text.
And step C130, performing network parameter learning on the reduced guided network generated in the kth cyclic network training stage according to the second reference class training error and the second application feature space training error, and generating a reduced guided network generated in the (k+1)th cyclic network training stage until a second network service mining network is generated, wherein the second network service mining network is used for mining the application class network session text and the reference class network session text.
In this embodiment, the server needs to perform network parameter learning on the reduced guided network generated in the kth cyclic network training stage according to the second reference class training error and the second application feature space training error, so as to generate the reduced guided network generated in the (k+1)th cyclic network training stage.
In detail, the server takes as input a second reference class training error and a second application feature space training error, and updates the parameters of the reduced guided network with some optimization algorithm (e.g., gradient descent algorithm).
In the process of parameter updating, the server can continuously adjust parameters such as weight and bias of the network so as to minimize training errors and improve the performance of the network.
After a certain number of iterations and optimizations, the server generates the reduced guided network of the (k+1)th cyclic network training stage.
Thus, the server obtains a new, optimized, reduced guided network with better performance in processing reference class and application class web session text.
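For illustration, one such parameter update can be sketched as follows; treating the application class fuzzy mining content as soft targets (a knowledge-distillation-style KL term) and the 0.5 weighting coefficient are assumptions introduced for the sketch, not a limitation of this embodiment.

import torch
import torch.nn.functional as F

reduced_net = torch.nn.Linear(128, 8)  # stand-in for the reduced guided network
optimizer = torch.optim.SGD(reduced_net.parameters(), lr=1e-3)

ref_feats = torch.randn(16, 128); prior_labels = torch.randint(0, 8, (16,))
app_feats = torch.randn(16, 128); fuzzy_targets = torch.softmax(torch.randn(16, 8), dim=-1)

# Second reference class training error: deviation from the prior labeling knowledge.
second_ref_error = F.cross_entropy(reduced_net(ref_feats), prior_labels)
# Second application feature space training error: deviation from the fuzzy mining content.
second_app_error = F.kl_div(F.log_softmax(reduced_net(app_feats), dim=-1),
                            fuzzy_targets, reduction="batchmean")

# One gradient-descent update, moving toward the reduced guided network of the (k+1)th stage.
loss = second_ref_error + 0.5 * second_app_error
optimizer.zero_grad()
loss.backward()
optimizer.step()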
Finally, the server needs to repeat the above steps until a second network service mining network is generated that meets the requirements. This network will be used to mine application class web session text and reference class web session text.
In detail, the server will repeat the cyclic network training phase, continually generating new reduced guided networks and optimizing their performance.
After each cyclic network training phase is completed, the server evaluates whether the performance of the current reduced guided network meets the requirements. If the requirements are met, stopping training and outputting a second network service mining network; if the requirements are not met, the next cyclic network training phase is continued to be trained and optimized.
Finally, when the reduced guided network generated in a certain cyclic network training stage meets all requirements (for example, reaches preset indexes such as accuracy and recall rate), the server outputs the reduced guided network as a second network service mining network.
Thus, the server obtains a high-performance second network service mining network which can effectively mine and analyze the application type network session text and the reference type network session text.
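The stopping condition can be sketched as a simple evaluation loop; the evaluate callable, the accuracy and recall thresholds, and the maximum number of stages below are hypothetical placeholders that would be set according to the actual network service mining requirements.

def train_until_requirements_met(net, train_one_stage, evaluate,
                                 max_stages=50, min_accuracy=0.95, min_recall=0.90):
    # net: the reduced guided network optimized across cyclic network training stages.
    # train_one_stage: callable that runs one cyclic network training stage and returns the updated net.
    # evaluate: callable that returns (accuracy, recall) on held-out session texts.
    for stage in range(max_stages):
        net = train_one_stage(net)
        accuracy, recall = evaluate(net)
        if accuracy >= min_accuracy and recall >= min_recall:
            return net  # requirements met: output as the second network service mining network
    return net  # otherwise return the network of the last stage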
In one possible implementation, step S140 may include:
and step S141, mining the application type network session training text through an application type guidance network generated in the (k+1) th cyclic network training stage, and generating reference fuzzy mining content corresponding to the application type network session training text and a credibility value corresponding to the reference fuzzy mining content.
In this embodiment, the server has generated an application class guidance network through the (k+1) th cyclic network training phase. The server will now use this network to mine the application-class web session training text to generate reference fuzzy mined content and assign confidence values to the content.
In detail, the server inputs the application class web session training text into the application class guidance network generated in the (k+1)th cyclic network training stage. The application class guidance network processes and analyzes the input text and extracts its features and patterns.
Based on its internal logic and learned parameters, the application class guidance network generates reference fuzzy mining content corresponding to the input text. The content may be keywords, phrases, sentences or combinations thereof in the text, reflecting the primary information and potential meaning of the text.
Meanwhile, the application class guidance network also assigns a credibility value to each piece of reference fuzzy mining content. This value represents the degree of confidence of the network in the mined content, i.e., the relevance and accuracy of the content. The higher the credibility value, the more confident the network is in the mined content; the lower the value, the less confident it is.
Thus, the server obtains the reference fuzzy mining content corresponding to the application class web session training text and the credibility values of the reference fuzzy mining content.
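One common way to obtain such credibility values, shown here only as an assumption of this sketch, is the maximum softmax probability of each prediction:

import torch

logits = torch.randn(16, 8)  # mining logits for 16 application class session training texts
probs = torch.softmax(logits, dim=-1)
credibility, mined_class = probs.max(dim=-1)
# mined_class[i] indexes the reference fuzzy mining content for text i,
# credibility[i] is the network's confidence in that content.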
Step S142, if the confidence value corresponding to the reference fuzzy mining content is greater than the set confidence value, determining that the reference fuzzy mining content is the application class fuzzy mining content.
In this embodiment, the server now needs to determine the final application class fuzzy mining content based on the credibility value of the reference fuzzy mining content. Only when the credibility value exceeds the set credibility threshold will the corresponding reference fuzzy mining content be selected as application class fuzzy mining content.
In detail, the server traverses all reference fuzzy mined content and its corresponding confidence values.
For each reference fuzzy mined content, the server compares its confidence value with a set confidence threshold. The threshold is a standard preset according to actual requirements and network performance and is used for screening high-quality mined content.
If the credibility value of a certain reference fuzzy mining content is larger than a set credibility threshold, the server determines the reference fuzzy mining content as application fuzzy mining content. This means that the content is considered sufficiently relevant and accurate to be used for subsequent analysis and processing.
If the credibility value of a certain reference fuzzy mining content is smaller than or equal to the set credibility threshold, the server excludes the reference fuzzy mining content and does not take the reference fuzzy mining content as the application class fuzzy mining content. This may be because the relevance or accuracy of the content is insufficient to meet the requirements of subsequent processing.
Therefore, the server determines the final application fuzzy mining content according to the comparison result of the credibility values. The content is screened from the reference fuzzy mining content, has higher quality and accuracy, and can be used for subsequent text analysis and processing of network session.
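Continuing the previous sketch, the screening of step S142 reduces to a comparison with the set credibility value; the threshold of 0.8 is an illustrative placeholder.

set_credibility = 0.8                               # set credibility value (illustrative)
keep = credibility > set_credibility                # strictly greater, as in step S142
application_fuzzy_content = mined_class[keep]       # retained application class fuzzy mining content
application_fuzzy_credibility = credibility[keep]   # their credibility values, reused in step S143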
In one possible implementation manner, the step S140 may further include:
step S143, obtaining a credibility value corresponding to the application class fuzzy mining content.
And S144, the reference application class is guided to conduct network parameter learning error discrimination on the mining content of the application class network session training text according to the application class fuzzy mining content, and a reference application feature space training error is generated.
Step S145, fusing the reference application feature space training errors according to the reliability value, and generating the first application feature space training errors.
In this embodiment, the server has assigned a confidence value to each mined content during the process of the application class guidance network generating the application class fuzzy mined content. The server now needs to obtain these confidence values for use in subsequent network parameter learning error decisions.
For example, the server retrieves the application class fuzzy mining content and its corresponding credibility value from a previously stored or cached location. The server ensures that each piece of mined content is correctly matched with its credibility value so that these values can be used accurately in subsequent processing. Therefore, the server successfully acquires the credibility value corresponding to the application class fuzzy mining content, and prepares for the next step of network parameter learning error discrimination.
Then, the reference application class guided network is required to mine the application class network session training text, and the network parameter learning error is discriminated according to the application class fuzzy mining content, so as to generate the reference application feature space training error. In detail, the server inputs the application class network session training text into the reference application class guided network to obtain mined content. The server compares the mined content generated by the reference application class guided network with the previously obtained application class fuzzy mining content. The server calculates the difference between the mined content and the application class fuzzy mining content, and this difference reflects the learning error of the reference application class guided network in processing the application class network session training text. The server quantifies this difference into a reference application feature space training error. Thus, the server generates a reference application feature space training error that reflects the degree of deviation from the application class fuzzy mining content when the reference application class guided network processes the application class network session training text.
Then, the reference application feature space training errors are fused according to the credibility values of the application class fuzzy mining content, and a first application feature space training error is generated. In detail, the server traverses all the reference application feature space training errors and the credibility values of the corresponding application class fuzzy mining content. For each reference application feature space training error, the server performs weighting according to the credibility value of the corresponding application class fuzzy mining content. Training errors corresponding to mined content with higher credibility values obtain larger weights, while training errors corresponding to mined content with lower credibility values obtain smaller weights. The server then fuses all the weighted reference application feature space training errors to obtain the first application feature space training error. This error comprehensively considers the influence of mined content with different credibility values on network parameter learning. Therefore, a first application feature space training error is generated that is a weighted fusion over mined content with different credibility values and more accurately reflects the performance of the reference application class guided network in processing application class network session training text.
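The weighted fusion can be sketched as a credibility-weighted average of per-sample errors; the per-sample cross-entropy, the normalization, and the shapes below are assumptions of the sketch.

import torch
import torch.nn.functional as F

def fuse_application_errors(mined_logits, fuzzy_content, credibility):
    # mined_logits: outputs of the reference application class guided network for the
    #               application class network session training texts.
    # fuzzy_content: the application class fuzzy mining content, as class indices.
    # credibility: the credibility value attached to each piece of fuzzy mining content.
    per_sample_error = F.cross_entropy(mined_logits, fuzzy_content, reduction="none")
    weights = credibility / credibility.sum().clamp(min=1e-8)
    # Higher-credibility content contributes a larger weight to the fused error.
    return (weights * per_sample_error).sum()  # first application feature space training error

first_app_error = fuse_application_errors(torch.randn(8, 5),
                                           torch.randint(0, 5, (8,)),
                                           torch.rand(8))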
In one possible implementation, the mining result may be generated by mining the application class web session text and the reference class web session text based on the first web service mining network.
For example, the server inputs the application class web session text and the reference class web session text into the first web service mining network, respectively. The text may contain various forms of data such as user conversations, chat logs, comments, feedback, and the like.
The first web service mining network processes and analyzes the input text. It may use techniques such as Natural Language Processing (NLP), deep learning, machine learning, etc. to extract features and representations of text. During the processing, the first network service mining network may recognize information such as keywords, phrases, entities, emotional tendency, topic classification, etc. in the text. Such information is critical to understanding the content and intent of the text. The first web service mining network may also compare and correlate the application class web session text with the reference class web session text to find similarities and differences between them. This helps reveal insight in user behavior, preferences, needs, etc.
After processing and analysis of the mining network, the server will generate a series of mining results. These results may be presented in structured data, visual charts, reports, or other forms so that the user can easily understand and use them.
For example, the server obtains processed data and analysis results from the first network service mining network. Such data may include extracted features, class labels, association rules, and the like. The server performs post-processing, such as aggregation, sorting and filtering, on the mining results as needed to make them better match users' expectations and needs. Finally, the server presents the mining results to the user in an appropriate format and manner. Through these results, the user may gain insight into the application class web session text and the reference class web session text, making more informed decisions or taking more effective actions.
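At inference time, the mining flow described above can be reduced to a few lines, shown here only as a sketch; the encode callable, the label names, and the model interface are placeholders, not a description of a specific deployment.

import torch

LABELS = ["demand", "complaint", "consultation", "risk", "other"]  # illustrative categories

def mine_session_texts(mining_net, encode, session_texts):
    # mining_net: the trained first network service mining network.
    # encode: callable turning one session text into a (1, feature_dim) tensor.
    results = []
    with torch.no_grad():
        for text in session_texts:
            probs = torch.softmax(mining_net(encode(text)), dim=-1).squeeze(0)
            confidence, idx = probs.max(dim=-1)
            results.append({"text": text,
                            "label": LABELS[int(idx)],
                            "confidence": float(confidence)})
    return results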
The data processing system 100 applied to the network service shown in fig. 2 includes: a processor 1001 and a memory 1003. The processor 1001 is coupled to the memory 1003, for example via a bus 1002. Optionally, the data processing system 100 applied to the network service may further include a transceiver 1004, and the transceiver 1004 may be used for data interaction between the server and other servers, such as transmission and/or reception of data. It should be noted that, in practical applications, the transceiver 1004 is not limited to one, and the structure of the data processing system 100 applied to the network service does not constitute a limitation on the embodiments of the present application.
The processor 1001 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor 1001 may also be a combination that implements computing functionality, such as a combination comprising one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 1002 may include a path to transfer information between the components. The bus 1002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 1002 may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 2, but this does not mean that there is only one bus or only one type of bus.
The memory 1003 may be, but is not limited to, a ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical disk storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store program code and that can be read by a computer.
The memory 1003 is used for storing program codes for executing the embodiments of the present application, and is controlled to be executed by the processor 1001. The processor 1001 is configured to execute the program code stored in the memory 1003 to implement the steps shown in the foregoing method embodiment.
The embodiments of the present application provide a computer readable storage medium having a program code stored thereon, which when executed by a processor, implements the steps of the foregoing method embodiments and corresponding content.
It should be understood that, although the flowcharts of the embodiments of the present application indicate the respective operation steps by arrows, the order of implementation of these steps is not limited to the order indicated by the arrows. In some implementations of embodiments of the present application, the implementation steps in the flowcharts may be performed in other orders based on demand, unless explicitly stated herein. Furthermore, some or all of the steps in the flowcharts may include a plurality of sub-steps or a plurality of stages, depending on the actual implementation scenario. Some or all of these sub-steps or phases may be performed at the same time, or each of these sub-steps or phases may be performed at different times, respectively. In the scenario that the execution time is different, the execution sequence of the sub-steps or stages can be flexibly configured based on requirements, which is not limited by the embodiment of the present application.
The foregoing is merely an optional implementation of the application. It should be noted that other similar implementations adopted by those skilled in the art according to the technical ideas of the application, without departing from those technical ideas, also fall within the protection scope of the embodiments of the application.

Claims (10)

1. A data processing method applied to a network service, the method comprising:
acquiring a reference type network session training text and an application type network session training text, wherein the reference type network session training text carries reference type priori labeling knowledge, and the reference type priori labeling knowledge characterizes the priori mining result of the reference type network session training text in a target network service mining process;
in the (k+1)th cyclic network training stage, performing network parameter learning error discrimination, according to the reference class priori labeling knowledge, on the mining content of the application class guidance network generated in the kth cyclic network training stage for the reference class network session training text, to generate a first reference class training error, wherein k is a positive integer;
according to the first reference class training error, performing network parameter learning on the application class guided network generated in the kth cyclic network training stage, and generating a reference application class guided network;
the application class guidance network generated in the (k+1) th cyclic network training stage is used for mining the application class network session training text to generate application class fuzzy mining content, the reference application class guided network is used for mining the application class network session training text, network parameter learning error discrimination is carried out according to the application class fuzzy mining content to generate a first application characteristic space training error, and the application class guidance network generated in the (k+1) th cyclic network training stage is a neural network generated by carrying out migration parameter learning on the reference application class guided network;
And carrying out network parameter learning on the reference application class guided network according to the first application feature space training error, and generating an application class guided network generated in the (k+1) th cyclic network training stage until a first network service mining network is generated, wherein the first network service mining network is used for mining application class network session texts and reference class network session texts.
2. The data processing method applied to a network service according to claim 1, wherein the method further comprises:
in the (k+1)th cyclic network training stage, performing semantic scene vector suitability analysis on the reference class network session training text and the application class network session training text through the application class guidance network generated in the kth cyclic network training stage, and generating a semantic scene suitability error;
the generating the reference application class guided network by performing network parameter learning on the application class guided network generated in the kth cyclic network training stage according to the first reference class training error includes:
and according to the semantic scene suitability error and the first reference class training error, performing network parameter learning on the application class guided network generated in the kth cycle network training stage, and generating the reference application class guided network.
3. The data processing method applied to a network service according to claim 2, wherein the performing semantic scene vector suitability analysis on the reference class network session training text and the application class network session training text through the application class guidance network generated in the kth cyclic network training stage to generate the semantic scene suitability error includes:
extracting the reference class semantic embedded vector of the reference class network session training text through the application class guidance network generated in the kth cyclic network training stage;
extracting the application class semantic embedded vector of the application class network session training text through the application class guidance network generated in the kth cyclic network training stage;
performing service risk mining on the reference class semantic embedded vector to generate first mining content, wherein the first mining content characterizes the confidence coefficient of the reference class network session training text to which the reference class semantic embedded vector belongs;
performing service risk mining on the application class semantic embedded vector to generate second mining content, wherein the second mining content characterizes the confidence coefficient of the application class network session training text to which the application class semantic embedded vector belongs;
determining the semantic scene suitability error based on a direction of decreasing a feature distance between the first mined content and the second mined content and increasing a feature distance between the reference class semantic embedded vector and the application class semantic embedded vector.
4. The data processing method applied to a web service according to claim 3, wherein the step of determining the semantic scene suitability error based on a direction of decreasing a feature distance between the first mined content and the second mined content and increasing a feature distance between the reference class semantic embedded vector and the application class semantic embedded vector comprises:
calculating a first feature distance between the first mined content and the second mined content;
calculating a second feature distance between the reference class semantic embedded vector and the application class semantic embedded vector;
based on the first feature distance and the second feature distance, a corresponding suitability error function is defined, which is used to define respective weights for the first feature distance and the second feature distance, so that the first feature distance is minimized and the second feature distance is maximized.
5. The data processing method applied to the network service according to claim 2, wherein the application class guidance network generated through the k+1st cyclic network training phase mines the application class web session training text, and before the step of generating the application class fuzzy mining content, the method comprises:
acquiring first network function layer definition information of the guided network of the reference application class;
acquiring second network function layer definition information of an application class guidance network generated in a kth cyclic network training stage;
integrating the first network function layer definition information and the second network function layer definition information based on the set network optimization parameters to generate integrated network function layer definition information;
and generating an application class guidance network generated in the (k+1) th cyclic network training stage according to the integrated network function layer definition information.
6. The data processing method applied to a network service according to any one of claims 2 to 5, wherein after the application class guidance network generated in the (k+1)th cyclic network training stage mines the application class web session training text and generates the application class fuzzy mining content, the method comprises:
performing network parameter learning error discrimination, according to the reference class priori labeling knowledge, on the mining content of a reduced guided network generated in the kth cyclic network training stage for the reference class network session training text, and generating a second reference class training error, wherein the parameter quantity of the network function layer definition information of the reduced guided network is smaller than that of the application class guidance network;
the reduced guided network generated in the kth cycle network training stage is used for carrying out network parameter learning error discrimination on the mining content of the application type network session training text according to the application type fuzzy mining content, and a second application feature space training error is generated;
and according to the second reference class training error and the second application feature space training error, performing network parameter learning on the reduced guided network generated in the kth cyclic network training stage, and generating a reduced guided network generated in the (k+1)th cyclic network training stage until a second network service mining network is generated, wherein the second network service mining network is used for mining the application class network session text and the reference class network session text.
7. The data processing method applied to a network service according to any one of claims 2 to 5, wherein the mining of the application class web session training text by the application class guidance network generated in the (k+1)th cyclic network training stage to generate application class fuzzy mining content includes:
the application class guidance network generated in the (k+1) th cyclic network training stage is used for mining the application class network session training text, and reference fuzzy mining content corresponding to the application class network session training text and a credibility value corresponding to the reference fuzzy mining content are generated;
and if the credibility value corresponding to the reference fuzzy mining content is larger than the set credibility value, determining the reference fuzzy mining content as the application class fuzzy mining content.
8. The data processing method applied to a network service according to claim 7, wherein the step of mining the application class web session training text through the reference application class guided network, and performing network parameter learning error discrimination according to the application class fuzzy mining content to generate a first application feature space training error comprises:
Acquiring a credibility value corresponding to the application class fuzzy mining content;
the reference application class guided network is used for mining the application class network session training text, and network parameter learning error discrimination is performed according to the application class fuzzy mining content to generate a reference application feature space training error;
and fusing the reference application feature space training errors according to the credibility value to generate the first application feature space training errors.
9. The data processing method applied to a network service according to claim 1, wherein the method further comprises:
and mining the application type network session text and the reference type network session text based on the first network service mining network to generate mining results.
10. A data processing system for application to a network service, comprising a processor and a computer readable storage medium storing machine executable instructions which when executed by the processor implement the data processing method for application to a network service of any one of claims 1 to 9.
CN202410221256.9A 2024-02-28 2024-02-28 Data processing method and system applied to network service Pending CN117785964A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410221256.9A CN117785964A (en) 2024-02-28 2024-02-28 Data processing method and system applied to network service

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410221256.9A CN117785964A (en) 2024-02-28 2024-02-28 Data processing method and system applied to network service

Publications (1)

Publication Number Publication Date
CN117785964A true CN117785964A (en) 2024-03-29

Family

ID=90400384

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410221256.9A Pending CN117785964A (en) 2024-02-28 2024-02-28 Data processing method and system applied to network service

Country Status (1)

Country Link
CN (1) CN117785964A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111680123A (en) * 2020-05-25 2020-09-18 腾讯科技(深圳)有限公司 Method and device for training conversation model, computer equipment and storage medium
CN111709493A (en) * 2020-07-10 2020-09-25 腾讯科技(深圳)有限公司 Object classification method, training method, device, equipment and storage medium
CN111724083A (en) * 2020-07-21 2020-09-29 腾讯科技(深圳)有限公司 Training method and device for financial risk recognition model, computer equipment and medium
CN115705678A (en) * 2021-08-09 2023-02-17 腾讯科技(深圳)有限公司 Image data processing method, computer equipment and medium
CN113807183A (en) * 2021-08-17 2021-12-17 华为技术有限公司 Model training method and related equipment
WO2023047117A1 (en) * 2021-09-23 2023-03-30 UCL Business Ltd. A computer-implemented method, data processing apparatus, and computer program for active learning for computer vision in digital images
CN114219971A (en) * 2021-12-13 2022-03-22 腾讯科技(深圳)有限公司 Data processing method, data processing equipment and computer readable storage medium
CN114359526A (en) * 2021-12-29 2022-04-15 中山大学 Cross-domain image style migration method based on semantic GAN
CN115560983A (en) * 2022-09-30 2023-01-03 哈尔滨理工大学 Rolling bearing fault diagnosis method and system under different working conditions based on federal feature transfer learning
CN117474748A (en) * 2023-11-03 2024-01-30 腾讯科技(深圳)有限公司 Image generation method and device, electronic equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
LICHEN JIN et al.: "Active Domain Transfer on Network Embedding", WWW '20: Proceedings of The Web Conference 2020, 30 April 2020 (2020-04-30), page 2683, XP058722868, DOI: 10.1145/3366423.3380024 *
DU Juan, YANG Junzhi: "Small-sample connector defect detection method based on transfer learning", Automation & Information Engineering, vol. 43, no. 05, 28 October 2022 (2022-10-28), pages 1-7 *
HUANG Xinyi, CAO Hao: "Transfer learning algorithm for image classification based on knowledge ontology", Journal of Bengbu University, vol. 10, no. 02, 10 March 2021 (2021-03-10), pages 88-92 *

Similar Documents

Publication Publication Date Title
US20230100376A1 (en) Text sentence processing method and apparatus, computer device, and storage medium
US11816439B2 (en) Multi-turn dialogue response generation with template generation
US11763091B2 (en) Automated content tagging with latent dirichlet allocation of contextual word embeddings
WO2020208444A1 (en) Fairness improvement through reinforcement learning
US11360927B1 (en) Architecture for predicting network access probability of data files accessible over a computer network
US20210042344A1 (en) Generating or modifying an ontology representing relationships within input data
CN113312447A (en) Semi-supervised log anomaly detection method based on probability label estimation
US11811708B2 (en) Systems and methods for generating dynamic conversational responses using cluster-level collaborative filtering matrices
Napoli et al. An agent-driven semantical identifier using radial basis neural networks and reinforcement learning
US11790183B2 (en) Systems and methods for generating dynamic conversational responses based on historical and dynamically updated information
EP3832485A1 (en) Question answering systems
US20230368028A1 (en) Automated machine learning pre-trained model selector
CN116561542B (en) Model optimization training system, method and related device
US20220351634A1 (en) Question answering systems
JP2020135689A (en) Model learning system, intention interpretation system, method for learning model, and model learning program
Zhao et al. Safe semi-supervised classification algorithm combined with active learning sampling strategy
KR102586799B1 (en) Method, device and system for automatically processing creation of web book based on web novel using artificial intelligence model
Wen et al. A Cross-Project Defect Prediction Model Based on Deep Learning With Self-Attention
CN117785964A (en) Data processing method and system applied to network service
CN114912623A (en) Method and device for model interpretation
US20210241040A1 (en) Systems and Methods for Ground Truth Dataset Curation
KR20200010679A (en) Heterogeneity learning based information classification apparatus
US11934794B1 (en) Systems and methods for algorithmically orchestrating conversational dialogue transitions within an automated conversational system
US20230008628A1 (en) Determining data suitability for training machine learning models
US20240037425A1 (en) Integrated machine learning and rules platform for improved accuracy and root cause analysis

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination