CN114372442A - Automatic arranging method, device and equipment for message application function and computer readable storage medium - Google Patents

Automatic arranging method, device and equipment for message application function and computer readable storage medium Download PDF

Info

Publication number
CN114372442A
Authority
CN
China
Prior art keywords
target
function
message application
historical
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011104540.6A
Other languages
Chinese (zh)
Inventor
邢彪
郑屹峰
陈维新
程佳鸣
彭熙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile Group Zhejiang Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Zhejiang Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, China Mobile Group Zhejiang Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN202011104540.6A priority Critical patent/CN114372442A/en
Publication of CN114372442A publication Critical patent/CN114372442A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/166 Editing, e.g. inserting or deleting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/12 Use of codes for handling textual entities
    • G06F40/126 Character encoding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/237 Lexical tools
    • G06F40/242 Dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks

Abstract

A method, a device and equipment for automatically arranging message application functions and a computer readable storage medium are provided. The embodiment of the invention relates to the technical field of communication, and discloses an automatic arranging method for message application functions, which comprises the following steps: acquiring a target message application demand parameter and a function parameter set, the function parameter set comprising function parameters of each target function node; converting the target message application demand parameters and the function parameters into target demand characteristic vectors and target function characteristic vectors; inputting the target demand characteristic vector and the target function characteristic vector into a message composite ability arrangement model to obtain a target arrangement relation between target function nodes corresponding to the target message application demand parameters, the message composite ability arrangement model being obtained by training according to historical message application demand parameters, the function parameters of the historical function nodes and the historical arrangement relation among the corresponding historical function nodes; and outputting the target arrangement relation between the target function nodes corresponding to the target message application requirement parameters. In this way, the embodiment of the invention realizes automatic arrangement of the required functional nodes.

Description

Automatic arranging method, device and equipment for message application function and computer readable storage medium
Technical Field
The embodiment of the invention relates to the technical field of mobile communication, in particular to an automatic arranging method and device for message application functions, automatic arranging equipment for message application functions and a computer readable storage medium.
Background
At present, with the development of technology, application function development is becoming more and more intelligent, but there is still a lack of an intelligent development tool that can automatically form an application according to requirements. For example, 5G message applications offer a greater diversity of functions and information than traditional short messages. However, in the prior art the composite capability arrangement required by a 5G message application is mainly realized by manually developing a typical 5G message industry application based on expert experience. Because industry requirements are ever-changing and diverse, this manual arrangement mode is inefficient, time-consuming, labor-intensive and prone to errors.
Disclosure of Invention
In view of the foregoing problems, embodiments of the present invention provide an automatic message application function arranging method, an automatic message application function arranging device, an automatic message application function arranging apparatus, and a computer-readable storage medium, which are used to solve the technical problem in the prior art that a message application cannot automatically perform function arrangement according to needs.
According to an aspect of the embodiments of the present invention, there is provided an automatic arranging method for message application functions, the method including:
acquiring a target message application demand parameter and a function parameter set; the function parameter set comprises function parameters of each target function node;
respectively converting the target message application demand parameters and the functional parameters of each target functional node into target demand characteristic vectors and target functional characteristic vectors;
inputting the target demand characteristic vector and the target function characteristic vector into a message composite ability arrangement model to obtain a target arrangement relation between target function nodes corresponding to the target message application demand parameters; the message composite capacity arranging model is obtained by training according to historical message application demand parameters, the function parameters of the historical function nodes and the historical arranging relation among the corresponding historical function nodes;
and outputting the target arrangement relation between the target function nodes corresponding to the target message application demand parameters.
In an optional mode, the message composite capability orchestration model includes a network capability topology encoder, a message application requirement encoder, and a capability orchestration topology generator;
the network capability topology encoder is used for extracting the features of the target function feature vector to obtain a function potential feature representation;
the message application requirement encoder is used for extracting the characteristics of the requirement characteristic vector to obtain requirement potential characteristic representation;
and the capability orchestration topology generator is used for determining a target arrangement relation between target function nodes according to the function potential characteristic representation and the demand potential characteristic representation.
In an optional manner, the network capability topology encoder includes a first relational graph convolution layer, a first fully-connected layer, a first random discard layer, a second relational graph convolution layer, a second fully-connected layer, and a second random discard layer; the network capability topology encoder is used for extracting the features of the target function feature vector to obtain a function potential feature representation;
the message application requirement encoder comprises a first word embedding layer, a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a first flattening layer; the message application requirement encoder is used for extracting the characteristics of the requirement characteristic vector to obtain requirement potential characteristic representation;
the capability orchestration topology generator comprises a first merging layer and a tensor factorization layer; and the capability orchestration topology generator is used for determining a target arrangement relation between target function nodes according to the function potential characteristic representation and the demand potential characteristic representation.
In an optional manner, the message composite ability arrangement model is obtained by training according to the historical message application demand parameter, the functional parameter of the historical functional node, and the historical arrangement relationship between the corresponding historical functional nodes, and further includes:
acquiring historical message application demand parameters and functional parameters of historical functional nodes;
converting the historical message application demand parameters into historical demand characteristic vectors;
converting the function parameters of the historical function nodes into historical function characteristic vectors corresponding to the historical function nodes;
marking the arrangement relation of each historical function feature vector according to the arrangement relation among the historical function nodes to obtain a historical feature vector set, wherein the historical feature vector set comprises the historical function feature vectors and the historical arrangement relation of the historical function nodes;
and inputting the historical demand characteristic vector and the historical characteristic vector set into a preset neural network model for training to obtain a message composite ability arrangement model.
In an optional manner, the target message application requirement parameter includes a target message application requirement description text, and the function parameter of each target function node includes a function description text of each target function node; converting the target message application demand parameters and the functional parameters of each target functional node into target demand characteristic vectors and target functional characteristic vectors respectively, including:
performing text cleaning on the message application requirement description text and the function description texts of the function nodes to obtain a target requirement text and a target function text;
serializing the target demand text and the target function text to obtain a target demand text sequence and a target function text sequence;
and respectively converting the target demand text sequence and the target function text sequence for vector representation to obtain a target demand characteristic vector and a target function characteristic vector.
In an optional mode, acquiring a target message application requirement parameter and a function parameter set; the function parameter set includes function parameters of each target function node, and includes:
acquiring a target message application demand request, wherein the target message application demand request carries the target message application demand parameters.
According to another aspect of the embodiments of the present invention, there is provided an automatic arrangement apparatus for a message application function, including:
the acquisition module is used for acquiring the target message application requirement parameters and the function parameter set; the function parameter set comprises function parameters of each target function node;
the conversion module is used for respectively converting the target message application demand parameters and the functional parameters of each target functional node into target demand characteristic vectors and target functional characteristic vectors;
the prediction module is used for inputting the target demand characteristic vector and the target function characteristic vector into a message composite capacity arrangement model to obtain a target arrangement relation between target function nodes corresponding to the target message application demand parameters; the message composite capacity arranging model is obtained by training according to historical message application demand parameters, the function parameters of the historical function nodes and the historical arranging relation among the corresponding historical function nodes;
and the output module is used for outputting the target arrangement relation between the target function nodes corresponding to the target message application demand parameters.
In an optional mode, the message composite capability orchestration model includes a network capability topology encoder, a message application requirement encoder, and a capability orchestration topology generator;
the network capability topology encoder is used for extracting the features of the target function feature vector to obtain a function potential feature representation;
the message application requirement encoder is used for extracting the characteristics of the requirement characteristic vector to obtain requirement potential characteristic representation;
and the capability orchestration topology generator is used for determining a target arrangement relation between target function nodes according to the function potential characteristic representation and the demand potential characteristic representation.
According to another aspect of the embodiments of the present invention, there is provided a message application function automatic arrangement apparatus, including:
the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction enables the processor to execute the operation of the automatic arranging method of the message application function.
According to a further aspect of the embodiments of the present invention, there is provided a computer-readable storage medium having at least one executable instruction stored therein, which when running on a message application function automatic arrangement device, causes the message application function automatic arrangement device to perform the operations of the message application function automatic arrangement method described above.
According to the embodiment of the invention, the arrangement relation of the capacity nodes corresponding to the message application requirements is predicted through the neural network, multi-scene message application can be realized according to the message application requirements, an enterprise can rapidly complete the deployment of the message application through a platform without complex code development, and the message application can be automatically, simply and conveniently created.
In addition, the specific structures of the network capability topology encoder, the message application requirement encoder and the capability arranging topology generator are arranged, so that the neural network can predict the arranging relation more accurately.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments of the present invention may be understood more clearly and implemented according to the content of this description, and in order that the above and other objects, features and advantages of the embodiments of the present invention may be more readily understood, specific embodiments of the present invention are described in detail below.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flow chart illustrating an automatic orchestration method of message application functions according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a structure of a message composition capability orchestration model according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a method for training a message composition capability orchestration model according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an automatic orchestration device for message application functions according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a message application function automatic orchestration device according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein.
5G message: an upgrade of the short message service and a basic telecommunication service of operators. Based on IP technology, it achieves a leap in service experience, supports more media formats, and offers richer forms of expression. Compared with the traditional single-function short message, the 5G message not only widens the range of information that can be sent and received, supporting multimedia content such as text, audio, video, cards and locations, but also deepens the interaction experience: a user can complete services such as service search, discovery, interaction and payment within the message window, forming a one-stop information window. The 5G message has the characteristics of both 2C and 2B applications; it allows rich media such as voice, pictures, videos, cards and files to be conveniently exchanged between users, and it also supports enterprises in providing interactive services in Chatbot form on a 5G message platform. The 5G message is a comprehensive upgrade of the traditional short message and provides a readily accessible entrance to a series of rich 5G applications. People can use 5G messages to shop, order food, book tickets and hotels, watch films, hold video conferences, and experience VR and online education, making the 5G message a bridge connecting people with society.
In the embodiment of the present invention, the functional node refers to a minimum granularity capability in the network capabilities, that is, a node that implements a minimum function. Such as location capabilities, call capabilities, big data capabilities, short message capabilities, etc.
Fig. 1 is a flowchart illustrating a message application function automatic arrangement method according to an embodiment of the present invention, which is executed by a message application function automatic arrangement device. The automatic arranging device of the message application function can be a computer device, a terminal device and the like. As shown in fig. 1, the method comprises the steps of:
step 110: acquiring a target message application demand parameter and a function parameter set; the function parameter set comprises function parameters of each target function node.
The embodiment of the present invention takes a 5G message application as an example, which does not limit the invention to 5G message applications. An industry user can initiate a new target message application requirement request to a 5G message application open platform, wherein the target message application requirement request carries the target message application requirement parameters. Since one message application may have multiple functions, the target message application requirement parameter includes multiple sub-message application requirement parameters, each of which is implemented by an arrangement of at least one target function node. The requirement parameter set of a 5G message application includes the functions that the message application needs to implement; for example, a message application requirement for the education sector may be described as: collect and analyze the daily health information reported by students and teachers, and issue epidemic prevention health guidance to students and teachers.
Step 120: and converting the target message application demand parameters and the functional parameters of each target functional node into target demand characteristic vectors and target functional characteristic vectors respectively.
The target message application requirement parameters comprise target message application requirement description texts, and the function parameters of each target function node comprise function description texts of each target function node.
In the embodiment of the present invention, the converting the target message application requirement parameter and the function parameter of each target function node into the target requirement feature vector and the target function feature vector respectively includes the following steps:
performing text cleaning on the message application requirement description text and the function description texts of the function nodes to obtain a target requirement text and a target function text; the message application requirement description text comprises a plurality of sub-requirement texts, and the function description text comprises a plurality of sub-function description texts. The text cleaning comprises word segmentation processing, special symbol removal processing and the like.
And serializing the target demand text and the target function text to obtain a target demand text sequence and a target function text sequence. Serialization can refer to the conversion of the words of an entity into an abstract numerical representation, so that text information can be converted into a data stream for reading and transmission. Because the message application requirement description text comprises a plurality of sub-requirement texts, the target requirement text sequence corresponds to a plurality of sub-target requirement text sequences; accordingly, the function description text sequence corresponds to a plurality of sub-function description text sequences.
And respectively converting the target demand text sequence and the target function text sequence for vector representation to obtain a target demand characteristic vector and a target function characteristic vector. And the target function feature vector comprises the feature attribute of the corresponding target function node.
The target requirement text and the target function text are serialized to obtain a target requirement text sequence and a target function text sequence. Specifically, the target requirement text and the target function text are converted into integer sequences through a preset dictionary, and the sequence length of the longest sub-target requirement text in the target requirement text is taken as the requirement coding sequence length demand_length. Any sub-target requirement text sequence shorter than the requirement coding sequence length is padded up to that length. Similarly, the function coding sequence length is determined by the longest sub-target function text sequence in the target function text sequence, and every sub-target function text sequence is padded to the function coding sequence length. The embodiment of the invention does not specifically limit the content of the preset dictionary; a person skilled in the art can construct the dictionary according to existing text-to-dictionary conversion rules. The dictionary size is demand_vocab_size. The target function feature vector of a target function node is {x1, x2, x3, …, xN}, containing the function text sequence of each target function node.
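As an illustration of the serialization and padding described above, the following Python sketch converts requirement texts into integer sequences through a simple dictionary and pads them to demand_length; the dictionary construction, sample texts and helper names are assumptions for illustration, not the patent's implementation.

```python
# Minimal sketch of the serialization step described above.
# The dictionary contents and sample texts are hypothetical placeholders.

def build_dictionary(texts):
    """Assign each distinct token an integer id, starting from 1 (0 is reserved for padding)."""
    vocab = {}
    for text in texts:
        for token in text.split():
            if token not in vocab:
                vocab[token] = len(vocab) + 1
    return vocab

def serialize(texts, vocab):
    """Convert each text into a sequence of integer ids."""
    return [[vocab[token] for token in text.split() if token in vocab] for text in texts]

def pad(sequences, length):
    """Pad every sequence with 0 up to the requirement coding sequence length."""
    return [seq + [0] * (length - len(seq)) for seq in sequences]

sub_requirements = [
    "collect daily health reports of students and teachers",
    "issue epidemic prevention health guidance",
]
vocab = build_dictionary(sub_requirements)       # demand_vocab_size == len(vocab)
sequences = serialize(sub_requirements, vocab)
demand_length = max(len(s) for s in sequences)   # longest sub-requirement sequence
padded = pad(sequences, demand_length)
print(padded)
```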
In the embodiment of the invention, for a 5G message application scenario, a historical 5G message industry application requirement description set and a corresponding historical function node feature set are obtained from a 5G message open platform, and the missing arrangement relations among the function nodes of each 5G message application requirement in the data set are labeled. The list of historically existing function nodes is represented as a directed, unweighted topological graph, where the function feature vectors contain the feature attributes of each function node. The function node topology graph can be represented as a directed, unweighted graph G = (V, E, R), where E is the set of edges, an edge representing an arrangement relation between two function nodes. Since the calling relations between the target function nodes are unknown, the input E is empty. V is the set of atomic capability nodes, V = {V1, V2, V3, …, VN}, representing the set of target function feature vectors of the function nodes.
Step 130: inputting the target demand characteristic vector and the target function characteristic vector into a message composite ability arrangement model to obtain a target arrangement relation between target function nodes corresponding to the target message application demand parameters; the message composite ability arranging model is obtained by training according to historical message application demand parameters, the function parameters of the historical function nodes and the historical arranging relation among the corresponding historical function nodes.
Fig. 2 shows a specific structure of the message composite capability orchestration model according to an embodiment of the present invention. The embodiment of the invention builds and optimizes the message composite capability arrangement model on the basis of a relational graph convolutional network (R-GCN). The essence of a GCN is to extract the spatial features of a topological graph, and an R-GCN can process multi-relational data features such as those in a knowledge base. The graph in the embodiment of the invention is the capability arrangement topological graph: each function node in the topological graph represents a network capability, each edge represents a call relation between function nodes, and the feature of each function node is its feature attribute. The message composite capability orchestration model learns a mapping of signals or features on the capability arrangement topology graph G = (V, E, R): the feature vector of each function node is input into the model, a function potential feature representation is generated for each function node using the relational graph convolution network, and a tensor factorization model then predicts labeled edges from these function potential feature representations, thereby predicting the invocation relations between function nodes.
In the embodiment of the invention, the message composite capability arrangement model comprises a network capability topology encoder, a message application requirement encoder and a capability orchestration topology generator.
The network capability topology encoder comprises a first relational graph convolution layer, a first fully-connected layer, a first random discard layer, a second relational graph convolution layer, a second fully-connected layer and a second random discard layer; the network capability topology encoder is used for extracting features from the target function feature vector to obtain a function potential feature representation.
The message application requirement encoder comprises a first word embedding layer, a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a first flattening layer; the message application requirement encoder is used for extracting features from the demand feature vector to obtain a demand potential feature representation.
The capability orchestration topology generator comprises a first merging layer and a tensor factorization layer; the capability orchestration topology generator is used for determining the target arrangement relation between target function nodes according to the function potential feature representation and the demand potential feature representation.
Namely, the target function feature vector is input into the network capability topology encoder, which maps each target function node according to the target function feature vector to obtain the function potential feature representation of each network capability node. The target demand feature vector is input into the message application requirement encoder, which obtains the demand potential feature representation corresponding to the target message application requirement parameter. After merging the function potential feature representation and the demand potential feature representation, the capability orchestration topology generator scores the potential arrangement relation between every two target function nodes through a tensor factorization operation, thereby obtaining the target arrangement relation between the target function nodes corresponding to the target message application requirement parameters.
Step 140: and outputting the target arrangement relation between the target function nodes corresponding to the target message application demand parameters.
After the target arrangement relation among the target function nodes corresponding to the target message application requirement parameters is obtained through the above method, a function node arrangement scheme is formed and fed back to the 5G message open platform, and the 5G message open platform carries out the arrangement according to the function node arrangement scheme, thereby realizing the industry user's message application requirement.
According to the embodiment of the invention, the arrangement relation of the capacity nodes corresponding to the message application requirements is predicted through the neural network, multi-scene message application can be realized according to the message application requirements, an enterprise can rapidly complete the deployment of the message application through a platform without complex code development, and the message application can be automatically, simply and conveniently created.
In addition, the specific structures of the network capability topology encoder, the message application requirement encoder and the capability arranging topology generator are arranged, so that the neural network can predict the arranging relation more accurately.
Referring to fig. 2 and 3, fig. 3 is a flow chart illustrating the training of the message composition capability orchestration model according to the embodiment of the invention. Before performing steps 110 to 140, the message composite capability arrangement model needs to be trained. The training specifically comprises the following steps:
step 210: acquiring historical data, wherein the historical data comprises historical message application demand parameters, corresponding function parameters of various historical function nodes and an arrangement relation among the historical function nodes.
Step 220: and respectively converting the historical message application demand parameters and the functional parameters of the historical functional nodes into historical demand characteristic vectors and historical functional characteristic vectors.
The conversion process of the historical requirement feature vector and the historical function feature vector is the same as the conversion process of the target requirement feature vector and the target function feature vector, and is not described herein again.
After the historical function feature vectors corresponding to the historical function nodes are obtained, the historical arrangement relations among the historical function feature vectors are labeled according to the historical arrangement relations among the historical function nodes. A label of 1 indicates that functional node i needs to call functional node j, a label of -1 indicates that functional node i needs to be called by functional node j, and a label of 0 indicates that no call relation exists between functional node i and functional node j. A training set is obtained after the labeling is finished. The training set comprises the historical demand feature vectors and the corresponding set of historical function feature vectors carrying the historical arrangement relations. After the training set is obtained, it can be divided into a training sample set and a testing sample set for use in training the message composite ability arrangement model.
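A minimal sketch of this labelling, assuming a hypothetical toy list of historical call relations, could look as follows:

```python
import numpy as np

# Hedged sketch of the labelling described above; the edge list below is a
# hypothetical example of historical call relations (caller -> callee).
num_nodes = 4
calls = [(0, 1), (1, 3)]          # functional node 0 calls node 1, node 1 calls node 3

labels = np.zeros((num_nodes, num_nodes), dtype=int)
for i, j in calls:
    labels[i, j] = 1              # node i needs to call node j
    labels[j, i] = -1             # node j is called by node i
# all remaining entries stay 0: no call relation

print(labels)
```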
For a 5G message application scene, a historical 5G message industry application requirement description set and a corresponding historical function node feature set are obtained from a 5G message open platform, and meanwhile, a missing arrangement relation among function nodes of each 5G message application requirement in a data set is marked. And representing the historical existing function node list as a directed and unweighted topological graph, wherein the function feature vector comprises feature attributes of each function node.
Step 230: and constructing a message composite ability arrangement model.
As described above, in an embodiment of the present invention, a message composition capability orchestration model includes a network capability topology encoder, a message application requirement encoder, and a capability orchestration topology generator.
Firstly, the network capability topology encoder comprises a first relational graph convolution layer, a first fully-connected layer, a first random discard layer, a second relational graph convolution layer, a second fully-connected layer and a second random discard layer which are connected in sequence; the network capability topology encoder is used for extracting features from the target function feature vector to obtain a function potential feature representation.
Specifically, the number of convolution kernels of the first relational graph convolution layer is set to 128 (that is, the output dimension), and the activation function is set to "relu". Each relational graph convolution layer computes

$$h_i^{(l+1)} = \mathrm{ReLU}\Big(W_0^{(l)} h_i^{(l)} + \sum_{r \in R}\sum_{j \in N_i^r} \frac{1}{c_{i,r}} W_r^{(l)} h_j^{(l)}\Big)$$

where h_i^{(l)} is the hidden state of node v_i in the l-th layer of the neural network, r is a relationship type, and W_r^{(l)} is the parameter matrix of the l-th neural network layer for that specific relationship type. The activation function is ReLU(x) = max(0, x), the rectified linear unit. The input of the first layer is the function node feature vector, h_i^{(0)} = x_i, and if L layers are stacked in total, the final output of the encoder is Z = h^{(L)}.
In an ordinary GCN, D'^{-1/2} A' D'^{-1/2} is the symmetric normalization of the adjacency matrix A, where A' = A + I and D' is the node-degree diagonal matrix of A'. For a single functional node in the embodiment of the invention, normalization is performed by dividing by the degree of the functional node, so that the information transmitted along each adjacent edge is normalized and one functional node does not exert a larger influence than another merely because it has more edges. The factor 1/c_{i,r} is therefore equivalent to the adjacency-matrix normalization in GCN, where c_{i,r} is a regularization constant chosen as c_{i,r} = |N_i^r|, and N_i^r denotes the set of neighbors of functional node i under the relationship r.
The number of neurons of the first fully-connected layer is 128, and the activation function is set to "relu". The discard probability of the first random discard layer is set to 0.2: input neurons are randomly disconnected with this preset probability each time the parameters are updated during training, and the first random discard layer is used to prevent overfitting.
The number of convolution kernels of the second relational graph convolution layer is 64 and the activation function is set to "lambda"; the expression of each layer of the second relational graph convolution layer has the same form as that of the first relational graph convolution layer, and is not repeated here.
The number of neurons of the second fully-connected layer is 64, and the activation function is set to "relu".
The discard probability of the second random discard layer is set to 0.2.
And outputting the function potential feature representation Z of each function node by utilizing the network capability topology encoder.
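For illustration, the following NumPy sketch implements one relational graph convolution layer following the propagation rule above; the toy node features, the single hypothetical relation type and the random weight matrices are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def rgcn_layer(H, neighbors, W_rel, W_self):
    """One relational graph convolution layer, as a sketch of the rule above.

    H         : (N, d_in) node feature matrix at layer l
    neighbors : dict mapping relation r -> list of (i, j) pairs, j being a neighbour of i under r
    W_rel     : dict mapping relation r -> (d_in, d_out) weight matrix W_r
    W_self    : (d_in, d_out) self-connection weight matrix W_0
    """
    out = H @ W_self                                  # self-connection term W_0 h_i
    for r, edges in neighbors.items():
        counts = np.zeros(H.shape[0])                 # c_{i,r} = |N_i^r|
        for i, _ in edges:
            counts[i] += 1
        for i, j in edges:
            out[i] += (H[j] @ W_rel[r]) / counts[i]   # normalised message from neighbour j
    return np.maximum(out, 0.0)                       # ReLU activation

# Hypothetical toy input: 3 functional nodes, 4-dim features, one relation type, 128-dim output
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
neighbors = {"calls": [(0, 1), (1, 2)]}
W_rel = {"calls": rng.normal(size=(4, 128))}
W_self = rng.normal(size=(4, 128))

Z = rgcn_layer(H, neighbors, W_rel, W_self)
print(Z.shape)  # (3, 128)
```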
Furthermore, the message application requirement encoder comprises a first word embedding layer, a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a first flattening layer which are connected in sequence; the message application requirement encoder is used for extracting features from the demand feature vector to obtain a demand potential feature representation.
Specifically, the input data dimensions of the first word embedding layer (embedding) are set to scene_length and template_length, respectively, and the output dimension is set to 128, i.e., the dimension of the vector space into which words are converted. The role of this layer is to perform vector mapping (word embedding) for each word in the input text, i.e., to convert the integer sequence of the words in the text into fixed 128-dimensional vectors. In the embodiment of the invention, the demand feature vector is converted into fixed 128-dimensional vectors.
The number of convolution kernels of the first convolution layer is 128 (i.e. the output dimension), the spatial window length of the convolution kernel is set to 2 (i.e. the convolution kernel reads 2 words at a time in succession), and the activation function is set to "relu". The convolution layer is used to extract text features; in the embodiment of the invention it extracts text features from the data processed by the first word embedding layer.
The first pooling layer is a maximum pooling layer, the size of a pooling window is set to be 2, the maximum pooling layer reserves the maximum value in the characteristic values extracted by the convolution kernels of the first convolution layer, and all other characteristic values are discarded.
The number of convolution kernels of the second convolution layer is 128 (i.e. the output dimension), the spatial window length of the convolution kernel is set to 2 (i.e. the convolution kernel reads 2 words at a time in succession), and the activation function is set to "relu". The convolution layer is used to extract text features; in the embodiment of the invention it extracts text features from the data processed by the first pooling layer.
The second pooling layer is a maximum pooling layer, the size of the pooling window is set to be 2, the maximum pooling layer reserves the maximum value in the characteristic values extracted by the convolution kernels of the second convolution layer, and all other characteristic values are discarded.
The first flattening layer is used to "flatten" the input, converting the three-dimensional input into two dimensions.
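Assuming a Keras-style implementation (which the layer names above suggest but the patent does not state), the message application requirement encoder could be sketched as follows; demand_vocab_size and demand_length are hypothetical placeholder values.

```python
import numpy as np
from tensorflow.keras import layers, models

demand_vocab_size = 5000   # hypothetical dictionary size (demand_vocab_size)
demand_length = 20         # hypothetical requirement coding sequence length

requirement_encoder = models.Sequential([
    layers.Embedding(input_dim=demand_vocab_size, output_dim=128),  # word embedding layer
    layers.Conv1D(filters=128, kernel_size=2, activation="relu"),   # first convolution layer
    layers.MaxPooling1D(pool_size=2),                               # first max pooling layer
    layers.Conv1D(filters=128, kernel_size=2, activation="relu"),   # second convolution layer
    layers.MaxPooling1D(pool_size=2),                               # second max pooling layer
    layers.Flatten(),                                               # flattening layer
])

# A padded integer sequence stands in for one serialized requirement text.
sample = np.random.randint(1, demand_vocab_size, size=(1, demand_length))
latent = requirement_encoder(sample)
print(latent.shape)   # the flattened demand potential feature representation U
```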
And respectively mapping the target function characteristic vector and the target demand characteristic vector to a function potential characteristic representation Z and a demand potential characteristic representation U with the dimension d through a network capability topology encoder and a message application demand encoder.
Finally, the capability orchestration topology generator comprises a first merging layer and a tensor factorization layer which are connected in sequence; the capability orchestration topology generator is used for determining the target arrangement relation between target function nodes according to the function potential feature representation and the demand potential feature representation.
In particular, the first merging layer is used to merge the functional latent feature representation Z and the demand latent feature representation U into one latent feature vector Z'. For example, for functional node i and functional node j, the latent feature vectors are Z_i' and Z_j', respectively.
The tensor factorization layer (DistMult factorization) is used for predicting candidate edges (v_i, r, v_j) among the functional nodes through a decomposition operation, and its activation function is set to "sigmoid".
A function g(v_i, r, v_j) scores each possible edge (v_i, r, v_j) to determine the likelihood that the edge belongs to the set E; the score g(v_i, r, v_j) represents the likelihood that functional node v_i and functional node v_j are associated through the relationship r. The relationship r represents the arrangement relation between functional node v_i and functional node v_j that can realize any sub-target demand feature vector among the target demand feature vectors. Using the combined latent feature vectors z_i' and z_j' of functional node i and functional node j, the candidate edges (v_i, r, v_j) are predicted with DistMult decomposition as the scoring function, with the concrete formula:

$$g(v_i, r, v_j) = z_i'^{\top} R_r \, z_j'$$

where R_r is a diagonal matrix of shape d x d representing the importance of each dimension of z_i' to the association r. The final sigmoid(g(v_i, r, v_j)) represents the predicted probability of the edge (v_i, r, v_j): an output of 1 indicates that capability node i needs to call capability node j, an output of -1 indicates that capability node i needs to be called by capability node j, and an output of 0 indicates that no calling relation exists between capability node i and capability node j.
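A small NumPy sketch of the DistMult scoring above, with hypothetical latent vectors of dimension d = 4; r_diag stands for the diagonal of the relation matrix R_r:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def distmult_score(z_i, z_j, r_diag):
    """DistMult score g(v_i, r, v_j) = z_i^T diag(r) z_j, as in the formula above."""
    return float(z_i @ (r_diag * z_j))

# Hypothetical latent feature vectors and relation diagonal of dimension d = 4
d = 4
rng = np.random.default_rng(1)
z_i, z_j = rng.normal(size=d), rng.normal(size=d)
r_diag = rng.normal(size=d)         # diagonal entries of the d x d relation matrix R_r

score = distmult_score(z_i, z_j, r_diag)
print(sigmoid(score))               # likelihood that the edge (v_i, r, v_j) exists
```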
Step 240: and inputting the training sample set into the constructed message composite ability arrangement model for training, thereby obtaining the trained message composite ability arrangement model.
Specifically, the historical demand feature vectors and the corresponding historical function feature vector sets carrying the historical arrangement relations are input into the network capability topology encoder, the message application requirement encoder and the capability orchestration topology generator, the error between the predicted arrangement relation and the real function arrangement relation (namely, the arrangement relation label) is calculated, and the training objective is to minimize this error. The objective function is the "categorical_crossentropy" multi-class log-loss function. The number of training rounds is set to 2000 (epochs = 2000), and the Adam optimizer is selected as the gradient descent optimization algorithm to improve the learning speed of conventional gradient descent (optimizer = 'adam'). Through gradient descent, the parameter weights of the message composite capability arrangement model that minimize the objective function can be found. Training is performed on the training set, the smaller the objective function the better; the model is evaluated on the test set after each training round and the weights are adjusted, and after the objective function converges the current weights are taken as the parameter weights of the message composite capability arrangement model, thereby obtaining the trained message composite capability arrangement model.
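Assuming a Keras-style implementation, the training configuration described above (categorical cross-entropy objective, Adam optimizer, evaluation on the test set after each round) could be sketched as follows; the tiny dense model and the random data are placeholders, not the patent's model or data set.

```python
import numpy as np
from tensorflow.keras import layers, models

# Placeholder model standing in for the message composite capability arrangement model.
model = models.Sequential([layers.Dense(3, activation="softmax")])  # 3 classes: call, called-by, none
model.compile(optimizer="adam",                    # Adam optimizer (optimizer = 'adam')
              loss="categorical_crossentropy")     # multi-class log-loss objective

# Hypothetical random training and test data in place of the historical feature vectors.
x_train = np.random.random((32, 8))
y_train = np.eye(3)[np.random.randint(0, 3, 32)]   # one-hot arrangement-relation labels
x_test = np.random.random((8, 8))
y_test = np.eye(3)[np.random.randint(0, 3, 8)]

model.fit(x_train, y_train,
          epochs=5,                                # the patent sets epochs = 2000
          validation_data=(x_test, y_test))        # evaluate on the test set each round
```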
Fig. 4 is a schematic structural diagram illustrating an automatic orchestration device for message application functions according to an embodiment of the present invention. As shown in fig. 4, the apparatus 300 includes: an acquisition module 310, a conversion module 320, a prediction module 330, and an output module 340.
An obtaining module 310, configured to obtain a target message application requirement parameter and a function parameter set; the function parameter set comprises function parameters of each target function node;
a conversion module 320, configured to convert the target message application requirement parameter and the function parameter of each target function node into a target requirement feature vector and a target function feature vector, respectively;
the prediction module 330 is configured to input the target demand feature vector and the target function feature vector into a message composite capability arrangement model, so as to obtain a target arrangement relationship between target function nodes corresponding to the target message application demand parameter; the message composite capacity arranging model is obtained by training according to historical message application demand parameters, the function parameters of the historical function nodes and the historical arranging relation among the corresponding historical function nodes;
and the output module 340 is configured to output a target arrangement relationship between target function nodes corresponding to the target message application requirement parameter.
In an optional mode, the message composite capability orchestration model includes a network capability topology encoder, a message application requirement encoder, and a capability orchestration topology generator;
the network capability topology encoder is used for extracting the features of the target function feature vector to obtain a function potential feature representation;
the message application requirement encoder is used for extracting the characteristics of the requirement characteristic vector to obtain requirement potential characteristic representation;
and the capability orchestration topology generator is used for determining a target arrangement relation between target function nodes according to the function potential characteristic representation and the demand potential characteristic representation.
In an optional manner, the network capability topology encoder includes a first relational graph convolution layer, a first fully-connected layer, a first random discard layer, a second relational graph convolution layer, a second fully-connected layer, and a second random discard layer; the network capability topology encoder is used for extracting the features of the target function feature vector to obtain a function potential feature representation;
the message application requirement encoder comprises a first word embedding layer, a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a first flattening layer; the message application requirement encoder is used for extracting the characteristics of the requirement characteristic vector to obtain requirement potential characteristic representation;
the capability orchestration topology generator comprises a first merging layer and a tensor factorization layer; and the capability orchestration topology generator is used for determining a target arrangement relation between target function nodes according to the function potential characteristic representation and the demand potential characteristic representation.
In an optional manner, the message composite ability arrangement model is obtained by training according to the historical message application demand parameter, the functional parameter of the historical functional node, and the historical arrangement relationship between the corresponding historical functional nodes, and further includes:
acquiring historical message application demand parameters and functional parameters of historical functional nodes;
converting the historical message application demand parameters into historical demand characteristic vectors;
converting the function parameters of the historical function nodes into historical function characteristic vectors corresponding to the historical function nodes;
marking the arrangement relation of each historical function feature vector according to the arrangement relation among the historical function nodes to obtain a historical feature vector set, wherein the historical feature vector set comprises the historical function feature vectors and the historical arrangement relation of the historical function nodes;
and inputting the historical demand characteristic vector and the historical characteristic vector set into a preset neural network model for training to obtain a message composite ability arrangement model.
In an optional manner, the target message application requirement parameter includes a target message application requirement description text, and the function parameter of each target function node includes a function description text of each target function node; converting the target message application demand parameters and the functional parameters of each target functional node into target demand characteristic vectors and target functional characteristic vectors respectively, including:
performing text cleaning on the message application requirement description text and the function description texts of the function nodes to obtain a target requirement text and a target function text;
serializing the target demand text and the target function text to obtain a target demand text sequence and a target function text sequence;
and respectively converting the target demand text sequence and the target function text sequence for vector representation to obtain a target demand characteristic vector and a target function characteristic vector.
In an optional mode, acquiring a target message application requirement parameter and a function parameter set; the function parameter set includes function parameters of each target function node, and includes:
acquiring a target message application demand request, wherein the target message application demand request carries the target message application demand parameters.
According to the embodiment of the invention, the arrangement relation of the capacity nodes corresponding to the message application requirements is predicted through the neural network, multi-scene message application can be realized according to the message application requirements, an enterprise can rapidly complete the deployment of the message application through a platform without complex code development, and the message application can be automatically, simply and conveniently created.
In addition, the specific structures of the network capability topology encoder, the message application requirement encoder and the capability arranging topology generator are arranged, so that the neural network can predict the arranging relation more accurately.
Fig. 5 is a schematic structural diagram illustrating an automatic message application function arranging apparatus according to an embodiment of the present invention, and the specific embodiment of the present invention does not limit the specific implementation of the apparatus.
As shown in fig. 5, the message application function automatic orchestration device may include: a processor (processor)402, a Communications Interface 404, a memory 406, and a Communications bus 408.
Wherein: the processor 402, communication interface 404, and memory 406 communicate with each other via a communication bus 408. A communication interface 404 for communicating with network elements of other devices, such as clients or other servers. The processor 402, configured to execute the program 410, may specifically perform the relevant steps in the above-described embodiment of the method for automatically arranging message application functions.
In particular, program 410 may include program code comprising computer-executable instructions.
The processor 402 may be a central processing unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The message application function automatic arrangement device comprises one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
And a memory 406 for storing a program 410. Memory 406 may comprise high-speed RAM memory, and may also include non-volatile memory, such as at least one disk memory.
The program 410 may be specifically invoked by the processor 402 to cause the message application function automatic arrangement device to perform the following operations:
acquiring a target message application demand parameter and a function parameter set; the function parameter set comprises function parameters of each target function node;
respectively converting the target message application demand parameters and the functional parameters of each target functional node into target demand characteristic vectors and target functional characteristic vectors;
inputting the target demand characteristic vector and the target function characteristic vector into a message composite ability arrangement model to obtain a target arrangement relation between target function nodes corresponding to the target message application demand parameters; the message composite capacity arranging model is obtained by training according to historical message application demand parameters, the function parameters of the historical function nodes and the historical arranging relation among the corresponding historical function nodes;
and outputting the target arrangement relation between the target function nodes corresponding to the target message application demand parameters.
In an optional manner, the message composite capability orchestration model comprises a network capability topology encoder, a message application requirement encoder and a capability orchestration topology generator;
the network capability topology encoder is configured to perform feature extraction on the target function feature vectors to obtain a function latent feature representation;
the message application requirement encoder is configured to perform feature extraction on the target requirement feature vector to obtain a requirement latent feature representation;
and the capability orchestration topology generator is configured to determine the target orchestration relation between the target function nodes according to the function latent feature representation and the requirement latent feature representation.
In an optional manner, the network capability topology encoder comprises a first relational graph convolution layer, a first fully-connected layer, a first dropout layer, a second relational graph convolution layer, a second fully-connected layer and a second dropout layer, and is configured to perform feature extraction on the target function feature vectors to obtain the function latent feature representation;
the message application requirement encoder comprises a first word embedding layer, a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a first flatten layer, and is configured to perform feature extraction on the target requirement feature vector to obtain the requirement latent feature representation;
and the capability orchestration topology generator comprises a first merging layer and a tensor factorization layer, and is configured to determine the target orchestration relation between the target function nodes according to the function latent feature representation and the requirement latent feature representation.
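As a concrete reading of this layer list, the sketch below assembles the three components with Keras. It is a rough approximation: the relational graph convolution layers and the tensor factorization layer are stood in by ordinary Dense layers (a faithful build would use an R-GCN layer and a factorization layer from a graph-learning library), and every shape and size is an assumed example value rather than one fixed by the embodiment.

```python
from tensorflow.keras import layers, models

NUM_NODES, NODE_FEAT = 32, 64   # assumed number of function nodes / feature size
SEQ_LEN, VOCAB = 100, 5000      # assumed requirement-text sequence length / vocabulary size

# Network capability topology encoder: graph conv -> fully-connected -> dropout, twice.
node_in = layers.Input(shape=(NUM_NODES, NODE_FEAT), name="function_features")
g = layers.Dense(128, activation="relu")(node_in)  # stand-in for 1st relational graph convolution
g = layers.Dense(128, activation="relu")(g)        # 1st fully-connected layer
g = layers.Dropout(0.3)(g)                         # 1st dropout (random-discard) layer
g = layers.Dense(64, activation="relu")(g)         # stand-in for 2nd relational graph convolution
g = layers.Dense(64, activation="relu")(g)         # 2nd fully-connected layer
g = layers.Dropout(0.3)(g)                         # 2nd dropout (random-discard) layer
func_latent = layers.Flatten()(g)                  # flatten node features for merging (sketch-only step)

# Message application requirement encoder: embedding -> conv/pool x2 -> flatten.
text_in = layers.Input(shape=(SEQ_LEN,), name="requirement_sequence")
t = layers.Embedding(VOCAB, 64)(text_in)           # word embedding layer
t = layers.Conv1D(64, 3, activation="relu")(t)     # 1st convolution layer
t = layers.MaxPooling1D(2)(t)                      # 1st pooling layer
t = layers.Conv1D(32, 3, activation="relu")(t)     # 2nd convolution layer
t = layers.MaxPooling1D(2)(t)                      # 2nd pooling layer
demand_latent = layers.Flatten()(t)                # flatten layer

# Capability orchestration topology generator: merging layer + factorized output.
merged = layers.Concatenate()([func_latent, demand_latent])                  # merging layer
adj = layers.Dense(NUM_NODES * NUM_NODES, activation="sigmoid")(merged)      # stand-in for tensor factorization
adj = layers.Reshape((NUM_NODES, NUM_NODES))(adj)  # predicted orchestration relation as an adjacency matrix

model = models.Model([node_in, text_in], adj)
```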
In an optional manner, training the message composite capability orchestration model according to the historical message application requirement parameters, the function parameters of the historical function nodes and the historical orchestration relations between the corresponding historical function nodes further comprises:
acquiring historical message application requirement parameters and function parameters of historical function nodes;
converting the historical message application requirement parameters into historical requirement feature vectors;
converting the function parameters of the historical function nodes into historical function feature vectors corresponding to the historical function nodes;
labelling each historical function feature vector with the orchestration relation between the historical function nodes to obtain a historical feature vector set, wherein the historical feature vector set comprises the historical function feature vectors and the historical orchestration relations of the historical function nodes;
and inputting the historical requirement feature vectors and the historical feature vector set into a preset neural network model for training to obtain the message composite capability orchestration model.
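A minimal training sketch for these steps might look as follows; the label encoding (0/1 adjacency matrices for the historical orchestration relations), the loss and the optimizer are assumptions, since the embodiment only states that a preset neural network model is trained.

```python
import numpy as np

def train_orchestration_model(model, requirement_vectors, function_vectors,
                              historical_relations, epochs=50, batch_size=16):
    """Hypothetical training loop for the message composite capability orchestration model.

    requirement_vectors:  historical requirement feature vectors, shape (N, seq_len)
    function_vectors:     historical function feature vectors, shape (N, num_nodes, feat_dim)
    historical_relations: historical orchestration relations encoded as 0/1
                          adjacency matrices, shape (N, num_nodes, num_nodes)
    """
    # Binary cross-entropy treats each node-to-node link as an independent label.
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(
        [np.asarray(function_vectors), np.asarray(requirement_vectors)],
        np.asarray(historical_relations),
        epochs=epochs,
        batch_size=batch_size,
        validation_split=0.1,  # hold out part of the history for validation
    )
    return model
```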
In an optional manner, the target message application requirement parameter comprises a target message application requirement description text, and the function parameter of each target function node comprises a function description text of that target function node; converting the target message application requirement parameter and the function parameters of each target function node into the target requirement feature vector and the target function feature vectors respectively comprises:
performing text cleaning on the message application requirement description text and the function description texts of the function nodes to obtain a target requirement text and target function texts;
serializing the target requirement text and the target function texts to obtain a target requirement text sequence and target function text sequences;
and converting the target requirement text sequence and the target function text sequences into vector representations, respectively, to obtain the target requirement feature vector and the target function feature vectors.
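One plausible realisation of this clean, serialize and vectorize chain uses the Keras text utilities; the cleaning rules, vocabulary size and sequence length below are illustrative assumptions rather than values fixed by the embodiment.

```python
import re
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

def texts_to_feature_vectors(description_texts, vocab_size=5000, seq_len=100):
    # Text cleaning: drop punctuation and extra whitespace, lower-case.
    cleaned = [re.sub(r"[^\w\s]", " ", t).lower().strip() for t in description_texts]
    # Serialization: map each token to an integer index to form text sequences.
    tokenizer = Tokenizer(num_words=vocab_size, oov_token="<UNK>")
    tokenizer.fit_on_texts(cleaned)
    sequences = tokenizer.texts_to_sequences(cleaned)
    # Vector representation: pad or truncate to fixed-length feature vectors.
    feature_vectors = pad_sequences(sequences, maxlen=seq_len, padding="post")
    return feature_vectors, tokenizer
```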
In an optional manner, acquiring the target message application requirement parameter and the function parameter set, wherein the function parameter set comprises function parameters of each target function node, comprises:
acquiring a target message application requirement request, wherein the target message application requirement request carries the target message application requirement parameter.
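For illustration only, such a requirement request might carry the requirement parameter as a simple structured payload; the field names below are hypothetical and not defined by this embodiment.

```python
# Hypothetical request payload; field names are illustrative only.
requirement_request = {
    "request_id": "req-0001",
    "target_message_application_requirement_parameters": {
        "requirement_description": "Send a coupon card message after a user completes an order",
    },
}
requirement_params = requirement_request["target_message_application_requirement_parameters"]
```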
According to the embodiments of the invention, the orchestration relation of the capability nodes corresponding to a message application requirement is predicted by a neural network, so that message applications for multiple scenarios can be realized directly from the requirement: an enterprise can rapidly complete the deployment of a message application through the platform without complex code development, and the message application is created automatically and conveniently.
In addition, the specific structures of the network capability topology encoder, the message application requirement encoder and the capability orchestration topology generator enable the neural network to predict the orchestration relation more accurately.
An embodiment of the present invention provides a computer-readable storage medium storing at least one executable instruction which, when run on a message application function automatic orchestration device, causes the device to execute the message application function automatic orchestration method in any of the above method embodiments.
The executable instruction may specifically be configured to cause the message application function automatic orchestration device to perform the following operations:
acquiring a target message application requirement parameter and a function parameter set, wherein the function parameter set comprises function parameters of each target function node;
converting the target message application requirement parameter and the function parameters of each target function node into a target requirement feature vector and target function feature vectors, respectively;
inputting the target requirement feature vector and the target function feature vectors into a message composite capability orchestration model to obtain a target orchestration relation between the target function nodes corresponding to the target message application requirement parameter, wherein the message composite capability orchestration model is trained according to historical message application requirement parameters, function parameters of historical function nodes and the historical orchestration relations between the corresponding historical function nodes;
and outputting the target orchestration relation between the target function nodes corresponding to the target message application requirement parameter.
In an optional manner, the message composite capability orchestration model comprises a network capability topology encoder, a message application requirement encoder and a capability orchestration topology generator;
the network capability topology encoder is configured to perform feature extraction on the target function feature vectors to obtain a function latent feature representation;
the message application requirement encoder is configured to perform feature extraction on the target requirement feature vector to obtain a requirement latent feature representation;
and the capability orchestration topology generator is configured to determine the target orchestration relation between the target function nodes according to the function latent feature representation and the requirement latent feature representation.
In an optional manner, the network capability topology encoder comprises a first relational graph convolution layer, a first fully-connected layer, a first dropout layer, a second relational graph convolution layer, a second fully-connected layer and a second dropout layer, and is configured to perform feature extraction on the target function feature vectors to obtain the function latent feature representation;
the message application requirement encoder comprises a first word embedding layer, a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a first flatten layer, and is configured to perform feature extraction on the target requirement feature vector to obtain the requirement latent feature representation;
and the capability orchestration topology generator comprises a first merging layer and a tensor factorization layer, and is configured to determine the target orchestration relation between the target function nodes according to the function latent feature representation and the requirement latent feature representation.
In an optional manner, training the message composite capability orchestration model according to the historical message application requirement parameters, the function parameters of the historical function nodes and the historical orchestration relations between the corresponding historical function nodes further comprises:
acquiring historical message application requirement parameters and function parameters of historical function nodes;
converting the historical message application requirement parameters into historical requirement feature vectors;
converting the function parameters of the historical function nodes into historical function feature vectors corresponding to the historical function nodes;
labelling each historical function feature vector with the orchestration relation between the historical function nodes to obtain a historical feature vector set, wherein the historical feature vector set comprises the historical function feature vectors and the historical orchestration relations of the historical function nodes;
and inputting the historical requirement feature vectors and the historical feature vector set into a preset neural network model for training to obtain the message composite capability orchestration model.
In an optional manner, the target message application requirement parameter comprises a target message application requirement description text, and the function parameter of each target function node comprises a function description text of that target function node; converting the target message application requirement parameter and the function parameters of each target function node into the target requirement feature vector and the target function feature vectors respectively comprises:
performing text cleaning on the message application requirement description text and the function description texts of the function nodes to obtain a target requirement text and target function texts;
serializing the target requirement text and the target function texts to obtain a target requirement text sequence and target function text sequences;
and converting the target requirement text sequence and the target function text sequences into vector representations, respectively, to obtain the target requirement feature vector and the target function feature vectors.
In an optional manner, acquiring the target message application requirement parameter and the function parameter set, wherein the function parameter set comprises function parameters of each target function node, comprises:
acquiring a target message application requirement request, wherein the target message application requirement request carries the target message application requirement parameter.
According to the embodiments of the invention, the orchestration relation of the capability nodes corresponding to a message application requirement is predicted by a neural network, so that message applications for multiple scenarios can be realized directly from the requirement: an enterprise can rapidly complete the deployment of a message application through the platform without complex code development, and the message application is created automatically and conveniently.
In addition, the specific structures of the network capability topology encoder, the message application requirement encoder and the capability orchestration topology generator enable the neural network to predict the orchestration relation more accurately.
An embodiment of the present invention provides a message application function automatic orchestration apparatus for executing the message application function automatic orchestration method described above.
An embodiment of the present invention provides a computer program which can be invoked by a processor to cause a message application function automatic orchestration device to execute the message application function automatic orchestration method in any of the above method embodiments.
An embodiment of the present invention provides a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when run on a computer, cause the computer to execute the message application function automatic orchestration method in any of the above method embodiments.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the embodiments are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the various inventive aspects. This manner of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third and so on does not indicate any ordering; these words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless otherwise specified.

Claims (10)

1. A message application function automatic orchestration method, the method comprising:
acquiring a target message application requirement parameter and a function parameter set, wherein the function parameter set comprises function parameters of each target function node;
converting the target message application requirement parameter and the function parameters of each target function node into a target requirement feature vector and target function feature vectors, respectively;
inputting the target requirement feature vector and the target function feature vectors into a message composite capability orchestration model to obtain a target orchestration relation between the target function nodes corresponding to the target message application requirement parameter, wherein the message composite capability orchestration model is trained according to historical message application requirement parameters, function parameters of historical function nodes and the historical orchestration relations between the corresponding historical function nodes;
and outputting the target orchestration relation between the target function nodes corresponding to the target message application requirement parameter.
2. The method of claim 1, wherein the message composite capability orchestration model comprises a network capability topology encoder, a message application requirement encoder and a capability orchestration topology generator;
the network capability topology encoder is configured to perform feature extraction on the target function feature vectors to obtain a function latent feature representation;
the message application requirement encoder is configured to perform feature extraction on the target requirement feature vector to obtain a requirement latent feature representation;
and the capability orchestration topology generator is configured to determine the target orchestration relation between the target function nodes according to the function latent feature representation and the requirement latent feature representation.
3. The method of claim 1, wherein the network capability topology encoder comprises a first relational graph convolution layer, a first fully-connected layer, a first dropout layer, a second relational graph convolution layer, a second fully-connected layer and a second dropout layer, and is configured to perform feature extraction on the target function feature vectors to obtain the function latent feature representation;
the message application requirement encoder comprises a first word embedding layer, a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a first flatten layer, and is configured to perform feature extraction on the target requirement feature vector to obtain the requirement latent feature representation;
and the capability orchestration topology generator comprises a first merging layer and a tensor factorization layer, and is configured to determine the target orchestration relation between the target function nodes according to the function latent feature representation and the requirement latent feature representation.
4. The method of claim 1, wherein training the message composite capability orchestration model according to the historical message application requirement parameters, the function parameters of the historical function nodes and the historical orchestration relations between the corresponding historical function nodes further comprises:
acquiring historical message application requirement parameters and function parameters of historical function nodes;
converting the historical message application requirement parameters into historical requirement feature vectors;
converting the function parameters of the historical function nodes into historical function feature vectors corresponding to the historical function nodes;
labelling each historical function feature vector with the orchestration relation between the historical function nodes to obtain a historical feature vector set, wherein the historical feature vector set comprises the historical function feature vectors and the historical orchestration relations of the historical function nodes;
and inputting the historical requirement feature vectors and the historical feature vector set into a preset neural network model for training to obtain the message composite capability orchestration model.
5. The method according to any one of claims 1-4, wherein the target message application requirement parameter comprises a target message application requirement description text, the function parameter of each target function node comprises a function description text of that target function node, and converting the target message application requirement parameter and the function parameters of each target function node into the target requirement feature vector and the target function feature vectors respectively comprises:
performing text cleaning on the message application requirement description text and the function description texts of the function nodes to obtain a target requirement text and target function texts;
serializing the target requirement text and the target function texts to obtain a target requirement text sequence and target function text sequences;
and converting the target requirement text sequence and the target function text sequences into vector representations, respectively, to obtain the target requirement feature vector and the target function feature vectors.
6. The method of claim 1, wherein acquiring the target message application requirement parameter and the function parameter set comprises:
acquiring a target message application requirement request, wherein the target message application requirement request carries the target message application requirement parameter.
7. A message application function automatic orchestration apparatus, the apparatus comprising:
an acquisition module, configured to acquire a target message application requirement parameter and a function parameter set, wherein the function parameter set comprises function parameters of each target function node;
a conversion module, configured to convert the target message application requirement parameter and the function parameters of each target function node into a target requirement feature vector and target function feature vectors, respectively;
a prediction module, configured to input the target requirement feature vector and the target function feature vectors into a message composite capability orchestration model to obtain a target orchestration relation between the target function nodes corresponding to the target message application requirement parameter, wherein the message composite capability orchestration model is trained according to historical message application requirement parameters, function parameters of historical function nodes and the historical orchestration relations between the corresponding historical function nodes;
and an output module, configured to output the target orchestration relation between the target function nodes corresponding to the target message application requirement parameter.
8. The apparatus of claim 7, wherein the message composite capability orchestration model comprises a network capability topology encoder, a message application requirement encoder and a capability orchestration topology generator;
the network capability topology encoder is configured to perform feature extraction on the target function feature vectors to obtain a function latent feature representation;
the message application requirement encoder is configured to perform feature extraction on the target requirement feature vector to obtain a requirement latent feature representation;
and the capability orchestration topology generator is configured to determine the target orchestration relation between the target function nodes according to the function latent feature representation and the requirement latent feature representation.
9. A message application function automatic orchestration device, comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with each other through the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations of the message application function automatic orchestration method according to any one of claims 1-6.
10. A computer-readable storage medium having stored therein at least one executable instruction which, when run on a message application function automatic orchestration device, causes the message application function automatic orchestration device to perform the operations of the message application function automatic orchestration method according to any one of claims 1-6.
CN202011104540.6A 2020-10-15 2020-10-15 Automatic arranging method, device and equipment for message application function and computer readable storage medium Pending CN114372442A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011104540.6A CN114372442A (en) 2020-10-15 2020-10-15 Automatic arranging method, device and equipment for message application function and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011104540.6A CN114372442A (en) 2020-10-15 2020-10-15 Automatic arranging method, device and equipment for message application function and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114372442A (en) 2022-04-19

Family

ID=81138867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011104540.6A Pending CN114372442A (en) 2020-10-15 2020-10-15 Automatic arranging method, device and equipment for message application function and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114372442A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination