CN115442324B - Message generation method, device, message management equipment and storage medium - Google Patents

Message generation method, device, message management equipment and storage medium

Info

Publication number
CN115442324B
CN115442324B CN202110629209.4A
Authority
CN
China
Prior art keywords
message
generator
initial
sent
topological graph
Prior art date
Legal status
Active
Application number
CN202110629209.4A
Other languages
Chinese (zh)
Other versions
CN115442324A (en)
Inventor
邢彪
张汉良
丁东
胡皓
陈嫦娇
Current Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Zhejiang Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Zhejiang Co Ltd
Priority date
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, China Mobile Group Zhejiang Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN202110629209.4A priority Critical patent/CN115442324B/en
Publication of CN115442324A publication Critical patent/CN115442324A/en
Application granted granted Critical
Publication of CN115442324B publication Critical patent/CN115442324B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/06: Message adaptation to terminal or network requirements
    • H04L51/066: Format adaptation, e.g. format conversion or compression
    • H04L51/07: User-to-user messaging characterised by the inclusion of specific contents
    • H04L51/10: Multimedia information

Abstract

The invention discloses a message generation method, which comprises the following steps: receiving message data to be sent; converting the message data to be sent into a first message structure topological graph; inputting the first message structure topological graph into a generator obtained by training, so as to generate an issued message code in a preset format; and generating the message to be sent based on the message code to be sent. The invention also discloses a message generation apparatus, a message management device, and a computer-readable storage medium. With the method of the invention, issued messages are generated more efficiently.

Description

Message generation method, device, message management equipment and storage medium
Technical Field
The present invention relates to the field of information processing, and in particular, to a message generating method, apparatus, message management device, and computer readable storage medium.
Background
Based on the native short-message entrance of a mobile terminal, the message service provides users with functions for sending and receiving media content such as text, pictures, audio, video, location, and contacts; the messages include point-to-point messages, group sending messages, group chat messages, point-to-application messages, and the like. Compared with the traditional single-function short message, this kind of message not only widens the breadth of message sending and receiving by supporting multimedia content such as text, audio and video, cards, and location, but also extends the depth of the interaction experience: a user can complete services such as service search, discovery, interaction, and payment within the message window, building a one-stop-service message window.
In the related art, a message generation method is disclosed in which an industry technician manually converts the message data to be sent, submitted by a user, to obtain the corresponding message code, and the final issued message is obtained using that message code.
However, with this existing message generation method, the efficiency of obtaining the issued message is low.
Disclosure of Invention
The main purpose of the present invention is to provide a message generation method, an apparatus, a message management device, and a computer-readable storage medium, aiming to solve the technical problem that the efficiency of issuing messages is low when the existing message generation method is used.
To achieve the above object, the present invention provides a message generating method, which includes the steps of:
receiving message data to be sent;
converting the message data to be sent into a first message structure topological graph;
inputting the first message structure topological graph into a generator obtained by training, so as to generate a message code to be sent in a preset format;
and generating a message to be sent based on the message code to be sent.
Optionally, before the step of receiving the message data to be sent, the method further includes:
acquiring a first training sample, wherein the first training sample comprises first historical message data to be sent and a first real message code corresponding to the first historical message data to be sent;
Converting the first historical message data to be sent into a second message structure topological graph;
training an initial generator by using the second message structure topological graph, the first real message code, and a discriminator obtained by training, so as to obtain the generator.
Optionally, before the step of training the initial generator by using the second message structure topological graph, the first real message code, and the discriminator obtained by training to obtain the generator, the method further includes:
acquiring a second training sample, wherein the second training sample comprises second historical message data to be sent and second real message codes corresponding to the second historical message data to be sent;
converting the second historical message data to be sent into a third message structure topological graph;
training an initial discriminator by using the third message structure topological graph, the second real message code, and the initial generator, so as to obtain the discriminator.
Optionally, the training an initial discriminator by using the third message structure topological graph, the second real message code, and the initial generator to obtain the discriminator includes:
Determining a first selected message structure topological graph in the third message structure topological graph, and determining a first selected message code corresponding to the first selected message structure topological graph in the second real message code;
acquiring a first selected random noise corresponding to the first selected message structure topological graph;
generating a first result message code using the initial generator based on the first selected message structure topology map and the first selected random noise;
combining the first information pair set and the second information pair set to obtain a third information pair set, wherein the first selected message structure topological graph and the first selected message code form the information pairs in the first information pair set, and the first selected message structure topological graph and the first result message code form the information pairs in the second information pair set;
assigning a value to each information pair in the first information pair set, each information pair in the second information pair set, and each information pair in the third information pair set;
acquiring a first target parameter based on a first target function, assignment of each information pair in the first information pair set, assignment of each information pair in the second information pair set and assignment of each information pair in the third information pair set;
updating the initial discriminator with the first target parameter to obtain a new discriminator;
and taking the new discriminator as the initial discriminator, and returning to execute the step of determining a first selected message structure topological graph from the third message structure topological graph until the first target parameter meets a first preset condition, so as to obtain the discriminator.
Optionally, the step of training an initial generator by using the second message structure topological graph, the first real message code, and the discriminator obtained by training to obtain the generator includes:
determining a second selected message structure topological graph in the second message structure topological graph, and determining a second selected message code corresponding to the second selected message structure topological graph in the first real message code;
acquiring a second selected random noise corresponding to the second selected message structure topological graph;
generating a second result message code using the initial generator based on the second selected message structure topology map and the second selected random noise;
inputting the second result message code into the discriminator to obtain a decision result;
obtaining a second target parameter based on a second target function and the decision result;
updating the initial generator with the second target parameter to obtain a new generator;
and taking the new generator as the initial generator, and returning to execute the step of determining a second selected message structure topological graph from the second message structure topological graph until a second target parameter meets a second preset condition, so as to obtain the generator.
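The generator training loop described above can be sketched as follows. The toy generator, discriminator, and update callback are stand-ins so the loop runs end to end, and treating the "second target function" as the usual non-saturating GAN loss -log D is an assumption; this text does not define the concrete objective.

```python
import numpy as np

def generator_step(topo_graphs, discriminator, generator, update):
    """One generator iteration: generate message codes from (topological
    graph, noise) pairs, score them with the fixed discriminator, and
    derive the loss whose gradient would update the generator. The
    non-saturating loss -log D is an assumed stand-in for the 'second
    target function'."""
    noise = np.random.default_rng(3).standard_normal(len(topo_graphs))
    fake_codes = [generator(g, z) for g, z in zip(topo_graphs, noise)]
    scores = np.array([discriminator(g, c)
                       for g, c in zip(topo_graphs, fake_codes)])
    loss = -np.log(scores + 1e-9).mean()  # source of the second target parameter
    return update(loss), loss

# Toy scalar stand-ins so the loop is executable.
gen = lambda g, z: g + z                              # "generated code"
disc = lambda g, c: 1.0 / (1.0 + np.exp(-(c - g)))    # score in (0, 1)
new_gen, loss = generator_step(np.zeros(4), disc, gen, lambda l: gen)
print(loss > 0)  # True: -log of scores in (0, 1) is always positive
```

In a full implementation the `update` callback would apply the gradient of the loss to the generator's parameters and the loop would repeat until the second preset condition is met.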
Optionally, the step of generating, with the initial generator, a first resulting message code based on the first selected message structure topology map and the first selected random noise includes:
inputting the first selected random noise to a noise encoder in the initial generator to obtain a first feature vector;
inputting a first selected message structure topology map to a topology encoder in the initial generator to obtain a second feature vector;
inputting the first feature vector and the second feature vector into a merging layer in the initial generator to obtain a merging vector;
inputting the combined vector into a sequence decoder in the initial generator to obtain an output message code sequence;
inputting the output message code sequence into a first output layer in the initial generator to obtain the first result message code.
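The data flow through the five steps above can be sketched purely in terms of tensor shapes, using the layer sizes given later in this description (a 32-unit noise encoder, a 64-unit final topology embedding, and a softmax output layer). The vocabulary size and the linear stand-ins for the LSTM layers and the merge operation are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Outputs of the two encoders, using the widths stated in this description:
# the noise encoder's last LSTM layer has 32 units, and the topology
# encoder's last graph convolution layer has 64 kernels.
noise_feat = rng.standard_normal(32)   # first feature vector
topo_feat = rng.standard_normal(64)    # second feature vector

# Merging layer: concatenation is an assumed merge operation.
merged = np.concatenate([noise_feat, topo_feat])
print(merged.shape)  # (96,)

# Sequence decoder plus softmax output layer, sketched as a single linear
# map over a hypothetical token vocabulary of the xml message-code format.
vocab_size = 100
W_out = rng.standard_normal((merged.size, vocab_size))
scores = np.exp(merged @ W_out)
probs = scores / scores.sum()          # softmax over the vocabulary
print(np.isclose(probs.sum(), 1.0))    # True: softmax yields a distribution
```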
Optionally:
the noise encoder comprises three long short-term memory (LSTM) layers, each LSTM layer has 32 neurons, and the activation function is relu;
the topology encoder comprises a first graph convolution layer, a second graph convolution layer and a third graph convolution layer; the number of convolution kernels of the first graph convolution layer is 256, and the activation function is relu; the number of convolution kernels of the second graph convolution layer is 128, and the activation function is relu; the number of convolution kernels of the third graph convolution layer is 64, and the activation function is a lambda function;
the sequence decoder comprises three LSTM layers, the number of neurons of each LSTM layer is set to 128, and the activation function is set to relu;
the activation function of the first output layer is softmax, and the number of fully connected neurons of the first output layer is the same as the output dimension of the first result message code;
the initial discriminator comprises a plurality of fully connected layers and a second output layer, wherein each fully connected layer comprises 64 neurons, the activation function of each fully connected layer is relu, the second output layer comprises one fully connected neuron, and the activation function of the second output layer is sigmoid.
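As an illustration of the discriminator structure just described, the following sketch runs a forward pass through fully connected relu layers of 64 neurons ending in a single sigmoid neuron. The number of hidden layers, the input dimension, and the encoding of a (topological graph, message code) information pair as a vector are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def discriminator(pair_vec, weights):
    """Fully connected stack with 64 relu neurons per hidden layer, ending
    in one sigmoid neuron, per the layer sizes stated above; the depth and
    the input encoding are assumptions."""
    h = pair_vec
    for W, b in weights[:-1]:
        h = np.maximum(0, h @ W + b)       # relu hidden layer
    W_out, b_out = weights[-1]
    return sigmoid(h @ W_out + b_out)      # probability the pair is real

in_dim = 16  # assumed size of an encoded (graph, code) information pair
weights = [
    (rng.standard_normal((in_dim, 64)) * 0.1, np.zeros(64)),
    (rng.standard_normal((64, 64)) * 0.1, np.zeros(64)),
    (rng.standard_normal((64, 1)) * 0.1, np.zeros(1)),
]
p = discriminator(rng.standard_normal(in_dim), weights)
print(0.0 < p[0] < 1.0)  # True: the sigmoid output is a probability
```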
In addition, to achieve the above object, the present invention also proposes a message generating apparatus, including:
the receiving module is used for receiving the message data to be sent;
the first conversion module is used for converting the message data to be sent into a first message structure topological graph;
the generation module is used for inputting the first message structure topological graph into a generator obtained by training, so as to generate a message code to be sent in a preset format;
and the second conversion module is used for generating a message to be sent based on the message code to be sent.
In addition, in order to achieve the above object, the present invention also proposes a message management device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the message generation method according to any one of the above.
Furthermore, to achieve the above object, the present invention also proposes a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the message generation method as set forth in any one of the above.
The technical scheme of the present invention provides a message generation method applied to a message management device, the method comprising the following steps: receiving message data to be sent; converting the message data to be sent into a first message structure topological graph; inputting the first message structure topological graph into a generator obtained by training, so as to generate a message code to be sent in a preset format; and generating the message to be sent based on the message code to be sent.
In the existing message generation method, industry technicians are required to manually convert the message data to be sent to obtain the message code, and the final issued message is obtained based on that message code, so obtaining the message code takes a long time and the efficiency of obtaining the issued message is low. In the present invention, the message management device directly uses the generator obtained by training to generate the message code to be sent in the preset format, based on the first message structure topological graph corresponding to the message data to be sent, without requiring industry technicians to manually convert the message data to be sent, thereby reducing the conversion time and improving the efficiency of obtaining the message to be sent.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to the structures shown in these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a message management device in a hardware operating environment according to an embodiment of the present application;
FIG. 2 is a flow chart of a first embodiment of a message generating method according to the present application;
FIG. 3 is a schematic diagram of the structure of message data to be sent according to the present application;
FIG. 4 is a schematic diagram of the initial generator data processing flow of the present application;
FIG. 5 is a schematic diagram of the initial generator and initial discriminator training process of the present application;
fig. 6 is a block diagram showing the construction of a first embodiment of the message generating apparatus of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The main solutions of the embodiments of the present application are as follows. A message generation method applied to a message management device is provided, the method comprising the steps of: receiving message data to be sent; converting the message data to be sent into a first message structure topological graph; inputting the first message structure topological graph into a generator obtained by training, so as to generate a message code to be sent in a preset format; and generating the message to be sent based on the message code to be sent.
In the existing message generation method, industry technicians are required to manually convert the message data to be sent to obtain the message code, and the final issued message is obtained based on that message code, so obtaining the message code takes a long time and the efficiency of obtaining the issued message is low. In the present invention, the message management device directly uses the generator obtained by training to generate the message code to be sent in the preset format, based on the first message structure topological graph corresponding to the message data to be sent, without requiring industry technicians to manually convert the message data to be sent, thereby reducing the conversion time and improving the efficiency of obtaining the message to be sent.
Referring to fig. 1, fig. 1 is a schematic diagram of a message management device in a hardware operating environment according to an embodiment of the present invention.
In general, the message management device may be a message management server, also called a message platform, the message management device comprising: at least one processor 301, a memory 302 and a computer program stored on said memory and executable on said processor, said computer program being configured to implement the steps of the message generation method as described above.
Processor 301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 301 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 301 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 301 may integrate a GPU (Graphics Processing Unit) for rendering and drawing the content to be displayed on the display screen. The processor 301 may also include an AI (Artificial Intelligence) processor for processing operations related to the message generation method, so that the message generation model can learn through self-training, improving efficiency and accuracy.
Memory 302 may include one or more computer-readable storage media, which may be non-transitory. Memory 302 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 302 is used to store at least one instruction for execution by processor 301 to implement the message generation method provided by the method embodiments of the present application.
In some embodiments, the terminal may further optionally include: a communication interface 303, and at least one peripheral device. The processor 301, the memory 302 and the communication interface 303 may be connected by a bus or signal lines. The respective peripheral devices may be connected to the communication interface 303 through a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 304, a display screen 305, and a power supply 306.
The communication interface 303 may be used to connect at least one peripheral device associated with an I/O (Input/Output) to the processor 301 and the memory 302. In some embodiments, processor 301, memory 302, and communication interface 303 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 301, the memory 302, and the communication interface 303 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The radio frequency circuit 304 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 304 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 304 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, the various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 304 may also include NFC (Near Field Communication) related circuitry, which is not limited in the present application.
The display screen 305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 305 is a touch screen, the display screen 305 also has the ability to collect touch signals at or above its surface. The touch signal may be input to the processor 301 as a control signal for processing. At this point, the display screen 305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 305, disposed on the front panel of the electronic device; in other embodiments, there may be at least two display screens 305, respectively disposed on different surfaces of the electronic device or in a folded design; in still other embodiments, the display screen 305 may be a flexible display disposed on a curved surface or a folded surface of the electronic device. The display screen 305 may even be arranged in an irregular, non-rectangular pattern, that is, a shaped screen. The display screen 305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The power supply 306 is used to power the various components in the electronic device. The power source 306 may be alternating current, direct current, disposable or rechargeable. When the power source 306 comprises a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology. It will be appreciated by those skilled in the art that the structure shown in fig. 1 does not constitute a limitation of the message management device, and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
Furthermore, embodiments of the present application also propose a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the message generation method described above; a detailed description is therefore not repeated here, and the description of the corresponding beneficial effects is likewise omitted. For technical details not disclosed in the computer-readable storage medium embodiments of the present application, refer to the description of the method embodiments of the present application. By way of example, the program instructions may be deployed to be executed on one message management device, on multiple message management devices located at one site, or on multiple message management devices distributed across multiple sites and interconnected by a communication network.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by way of computer programs, which may be stored on a computer-readable storage medium and which, when executed, may comprise the steps of the embodiments of the methods described above. The computer-readable storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
Based on the above hardware structure, an embodiment of the message generation method of the present invention is presented.
Referring to fig. 2, fig. 2 is a flowchart of a first embodiment of a message generating method of the present invention, which is applied to a message management device, and the method includes the following steps:
step S11: message data to be sent is received.
The execution body of the method is a message management device, which may be a message management server, also called an information platform, and the message management device is provided with a computer program, and when the message management device executes the computer program, the steps of the message generation method of the invention are realized.
In the invention, the mobile terminal sending the message data to be sent is a first mobile terminal, and the mobile terminal receiving the converted information (namely the message sent by the invention) is a second mobile terminal. The information transmission between the mobile terminals is carried out through the message management equipment: the first mobile terminal sends information, the information is processed by the information management equipment, and the information management equipment sends the processed information to the second mobile terminal.
It can be understood that, in the present invention, the information transmission manner is similar to the existing transmission manner: the message data to be sent by the first mobile terminal may include information such as the phone number of the second mobile terminal, and the message management device may send the issued message corresponding to the message data to be sent to the second mobile terminal based on that information.
Specifically, in the present invention, the message data to be sent may be a 5G message, and may take the form of multimedia content such as text, audio and video, cards, and location, or other types of content, which is not limited in the present invention; in addition, the message data to be sent may be a point-to-point message, a group sending message, a group chat message, a point-to-application message, or the like.
Step S12: and converting the message data to be sent into a first message structure topological graph.
Step S13: inputting the first message structure topological graph into a generator obtained by training, so as to generate an issued message code in a preset format.
It should be noted that the message data to be sent cannot be directly used by the generator obtained by training; it needs to be converted into a message structure topological graph, that is, the first message structure topological graph, which is input to the generator as an adjacency matrix and a feature matrix.
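As a concrete illustration of this input format, the sketch below builds the adjacency matrix and feature matrix for a tiny, hypothetical message-structure graph; the node types and the one-hot feature encoding are assumptions for illustration and are not specified in this text.

```python
import numpy as np

# Hypothetical message-structure graph: a root "message" node with "text"
# and "image" child nodes (the node types are illustrative, not from the patent).
nodes = ["message", "text", "image"]
edges = [(0, 1), (0, 2)]  # parent-child links in the structure tree

# Adjacency matrix (treated as undirected here; the patent does not fix this).
A = np.zeros((len(nodes), len(nodes)))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# One-hot node-type encoding as a stand-in feature matrix.
X = np.eye(len(nodes))

print(int(A.sum()))  # 4: two undirected edges give four non-zero entries
```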
In the present invention, the generator needs to be constructed first: the structure of the initial generator (described later) is determined, and the constructed initial generator is then trained using a training sample (the first training sample) to obtain the trained generator, with which step S13 is implemented. The generator generates a message code. It can be understood that the preset format in the present invention may be the xml format, that is, the issued message code is xml code; what the generator generates is the message code, rather than the final text message or picture message itself.
Specifically, the initial generator includes a noise encoder, a topology encoder, a sequence decoder, a merging layer, and a first output layer. The noise encoder comprises three layers of long and short memory neural networks, each layer of long and short memory neural network is provided with 32 neurons, and the activation function is relu; the topological encoder comprises a first graph convolution layer, a second graph convolution layer and a third graph convolution layer; the number of convolution kernels of the first graph convolution layer is 256, and the activation function is relu; the number of convolution kernels of the second graph convolution layer is 128, and the activation function is relu; the number of convolution kernels of the third graph convolution layer is 64, and the activation function is lamda; the sequence decoder comprises three layers of long-short-period memory neural networks, the number of neurons of each layer of long-short-period memory neural network is set to 128, and the activation function is set to relu; the activation function of the first output layer is softmax, and the number of fully connected neurons of the first output layer is the same as the output dimension of the first result message code. The structure is the structure of the initial generator constructed.
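For readers unfamiliar with graph convolution layers, the following minimal sketch shows how a three-layer topology encoder with the widths stated above (256, 128, and 64 convolution kernels) could transform an adjacency matrix and feature matrix into node embeddings. The normalised propagation rule and the random weights are assumptions; this text specifies only the layer widths and activation functions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A, H, W):
    """One graph convolution step: aggregate neighbour features, then
    project. The normalised rule relu(D^-1/2 (A + I) D^-1/2 H W) is a
    common choice and an assumption here."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)
    return np.maximum(0, d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W)

# Three-node toy topology and one-hot features.
A = np.array([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
H = np.eye(3)
W1 = rng.standard_normal((3, 256))    # first layer: 256 kernels
W2 = rng.standard_normal((256, 128))  # second layer: 128 kernels
W3 = rng.standard_normal((128, 64))   # third layer: 64 kernels
H3 = gcn_layer(A, gcn_layer(A, gcn_layer(A, H, W1), W2), W3)
print(H3.shape)  # (3, 64): a 64-dimensional embedding per node
```

The resulting per-node embeddings play the role of the second feature vector produced by the topology encoder in the pipeline described above.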
The training mode of the generator in the invention is as follows: acquiring a first training sample, wherein the first training sample comprises first historical message data to be sent and a first real message code corresponding to the first historical message data to be sent; converting the first historical message data to be sent into a second message structure topological graph; training an initial generator by using the second message structure topological graph, the first real message code and the obtained discriminators to obtain the generator.
In the present invention, a discriminator (the initial discriminator) is constructed first, and the constructed initial discriminator is then trained with a training sample (the second training sample) to obtain the discriminator, so that the step of training the initial generator using the second message structure topological graph, the first real message code and the trained discriminator to obtain the generator can then be carried out.
The discriminator is trained as follows: acquiring a second training sample, wherein the second training sample comprises second historical message data to be sent and second real message codes corresponding to the second historical message data to be sent; converting the second historical message data to be sent into a third message structure topological graph; and training an initial discriminator using the third message structure topological graph, the second real message code and the initial generator, to obtain the discriminator.
It will be appreciated that in the above training process, the corresponding historical message data to be sent must be converted into a message structure topological graph before the initial generator and the initial discriminator are trained.
Specifically, the training of the initial discriminator using the third message structure topological graph, the second real message code and the initial generator to obtain the discriminator includes: determining a first selected message structure topological graph in the third message structure topological graph, and determining a first selected message code corresponding to the first selected message structure topological graph in the second real message code; acquiring a first selected random noise corresponding to the first selected message structure topological graph; generating a first result message code using the initial generator based on the first selected message structure topological graph and the first selected random noise; combining the first information pair set and the second information pair set to obtain a third information pair set, wherein the first selected message structure topological graph and the first selected message code form the information pairs in the first information pair set, and the first selected message structure topological graph and the first result message code form the information pairs in the second information pair set; assigning a value to each information pair in the first information pair set, each information pair in the second information pair set and each information pair in the third information pair set; acquiring a first target parameter based on a first objective function and the assignments of the information pairs in the first, second and third information pair sets; updating the initial discriminator with the first target parameter to obtain a new discriminator; and taking the new discriminator as the initial discriminator and returning to execute the step of determining a first selected message structure topological graph from the third message structure topological graph, until the first target parameter meets a first preset condition, so as to obtain the discriminator.
Firstly, samples for training the initial discriminator, namely the second training samples, are collected. The messages contained in the second training samples are the second historical message data to be sent (the data before conversion), and the message codes contained in the second training samples are the second real message codes; one piece of second historical message data to be sent corresponds to one second real message code, and the second training samples comprise a large amount of second historical message data to be sent and a large number of second real message codes. The second training samples may be historical message data to be sent collected in the message management device together with the real message codes corresponding to that historical message data.
It will be appreciated that the process of training the initial discriminator divides the second training samples into a plurality of batches and performs multiple rounds of training with these batches to obtain the discriminator. Within the second training samples, the first selected message structure topological graphs and the first selected message codes constitute one batch of training samples. Because each piece of message data in the second historical message data to be sent corresponds to a second real message code, the third message structure topological graph corresponding to that message also corresponds to that real message code. It will be appreciated that the first selected message structure topological graphs are in one-to-one correspondence with the first selected message codes.
Message data to be sent (covering the message to be sent, the first historical message data to be sent and the second historical message data to be sent) often includes headers (a header may be a primary header, a secondary header, a tertiary header and the like) and body text. It can be understood that in a message structure topological graph (covering the first, second and third message structure topological graphs and the first and second selected message structure topological graphs in the present invention), headers and body texts form two types of heterogeneous nodes, and the relations between header nodes and text nodes form the edges of the graph.
Referring to fig. 3, fig. 3 is a schematic diagram of the structure of message data to be sent according to the present invention. In fig. 3, the message data to be sent includes one primary header, three secondary headers, two tertiary headers and body text: two of the secondary headers each correspond to a body text, those two secondary headers also correspond to the two tertiary headers respectively, and the two tertiary headers each correspond to a body text. The message structure topological graph corresponding to this message data therefore contains 10 nodes and 9 edges.
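Under one consistent reading of the fig. 3 description, the node and edge counts can be checked in a few lines of Python; the node labels and the exact attachment of the tertiary headers below are assumptions, since fig. 3 itself is not reproduced here.

```python
# Hypothetical labels: h1 = primary header, h2a/h2b/h2c = secondary headers,
# h3a/h3b = tertiary headers, t1..t4 = body texts.
edges = [
    ("h1", "h2a"), ("h1", "h2b"), ("h1", "h2c"),   # primary header -> three secondary headers
    ("h2a", "t1"), ("h2b", "t2"),                  # two secondary headers -> body texts
    ("h2a", "h3a"), ("h2b", "h3b"),                # the same two -> the two tertiary headers
    ("h3a", "t3"), ("h3b", "t4"),                  # tertiary headers -> body texts
]
nodes = {n for e in edges for n in e}
print(len(nodes), len(edges))  # 10 9, matching the counts stated in the text
```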
In a specific application, the message structure topological graph is represented as G = (V, E), where V is the set of nodes and E is the set of edges. The message structure topological graph is then converted into an adjacency matrix and a feature matrix and input into the generator (either the generator obtained by training or the initial generator). The adjacency matrix represents the logical relationships between nodes and takes the form:

A = (e_ij), i, j = 1, 2, ..., N

wherein e_ij represents the logical relationship between the i-th node and the j-th node, which may include parallel, primary-secondary, general-specific and the like, and N represents the total number of nodes. The feature matrix represents the feature sequence of each header node and each text node; the header features comprise the header grade (which may be primary, secondary, tertiary and the like) and the header content, while a text node is characterized by its text content. The feature matrix takes the form:

B = (B_ij), i = 1, ..., N, j = 1, ..., M

wherein B_ij represents the j-th element in the feature description of the i-th node, and the feature sequence of each node is padded to the total length M. The feature sequence of each node can be obtained through an established feature information dictionary; feature sequences with different padded total lengths correspond to different feature information dictionaries. Based on the content of a node, the feature sequence corresponding to that node can be queried directly from the feature information dictionary.
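A minimal sketch of this conversion into the two matrices, assuming hypothetical relation codes and a toy feature information dictionary (neither is specified numerically in the text):

```python
import numpy as np

# Toy nodes; the relation codes (1 = primary-secondary, 2 = header-body) are assumptions.
nodes = ["primary header", "secondary header", "body text"]
relations = {(0, 1): 1, (1, 2): 2}

N, M = len(nodes), 6                 # M: padded feature-sequence length

A = np.zeros((N, N), dtype=int)      # adjacency matrix: e_ij = relation of node i to node j
for (i, j), rel in relations.items():
    A[i, j] = A[j, i] = rel

# Hypothetical feature information dictionary mapping node content to index sequences.
feature_dict = {"primary header": [1, 4], "secondary header": [2, 4], "body text": [3, 5, 5]}

B = np.zeros((N, M), dtype=int)      # feature matrix, each row zero-padded to length M
for i, n in enumerate(nodes):
    seq = feature_dict[n][:M]
    B[i, :len(seq)] = seq
```

The resulting A is N×N and symmetric, and B is N×M, matching the forms given above.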
It is understood that in the present invention, the real message codes (the first real message code and the second real message code) corresponding to the historical message data to be sent (the first historical message data to be sent and the second historical message data to be sent) originally exist in the form of real sent messages, and these real sent messages need to be converted into real message codes (for example, xml codes). The longest length of a real message code can be set to K (a natural number), where K is the index length of the real message code; the historical real message codes then have the shape N×K, N being the total number of nodes. The content of the real sent message corresponding to the historical message data can be converted into a real message code using a transcoding dictionary, whose size corresponds to the longest length K.
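A transcoding dictionary of this kind can be sketched as a simple character-to-index map with padding and truncation to the longest length K; the dictionary contents and the value of K below are assumptions for illustration.

```python
# Hypothetical transcoding dictionary: xml-code characters -> indices, 0 reserved for padding.
K = 12
code_dict = {ch: i + 1 for i, ch in enumerate("<>/abcdeghilmnoprstx ")}

def to_real_message_code(xml_text, k=K):
    """Convert a real sent message (as xml text) into an index sequence of length k."""
    seq = [code_dict.get(ch, 0) for ch in xml_text][:k]   # unknown chars -> 0, truncate to k
    return seq + [0] * (k - len(seq))                      # pad short codes with 0

code = to_real_message_code("<msg>hi</msg>")
```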
Wherein the step of generating a first resulting message code using the initial generator based on the first selected message structure topology map and the first selected random noise comprises: inputting the first selected random noise to a noise encoder in the initial generator to obtain a first feature vector; inputting a first selected message structure topology map to a topology encoder in the initial generator to obtain a second feature vector; inputting the first feature vector and the second feature vector into a merging layer in the initial generator to obtain a merging vector; inputting the combined vector into a sequence decoder in the initial generator to obtain an output message code sequence; the sequence of output message codes is input to a first output layer in the initial generator to obtain a first result message code.
Referring to fig. 4, fig. 4 is a schematic diagram of the data processing flow of the initial generator of the present invention; fig. 4 takes the generation of the first result message code as an example. The first selected message structure topological graph is the message topological structure c, and the first selected random noise is the noise u; the upper branch of the encoder (three long short-term memory layers) is the noise encoder, and the lower branch (three graph convolution layers) is the topology encoder, where the feature vector u is the first feature vector and the feature vector z is the second feature vector. The generator further comprises a sequence decoder (comprising three long short-term memory layers), and the output xml code in the 5G message format is the first result message code, the code format being xml.
In a specific application, the third message structure topological graph is input in the form of a feature matrix and an adjacency matrix through the input layer of the initial generator; the feature matrix and the adjacency matrix are then processed by three graph convolution layers, each of which is represented as follows:
H^(l+1) = f(H^(l), A)
wherein H^(0) is the data input to the input layer, A is the adjacency matrix, and l is the index of the graph convolution layer (l = 2 indicates the second graph convolution layer). Different models are obtained by selecting different mapping functions f and parameters; in the present application, the mapping function f is as follows:

f(H^(l), A) = σ(D^(−1/2) A D^(−1/2) H^(l) W^(l))

wherein σ is the activation function relu, A is the adjacency matrix, D is the diagonal degree matrix of A, and W^(l) is the parameter matrix of the l-th graph convolution layer.
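This propagation rule can be exercised numerically. The sketch below is a plain-numpy rendering of one graph convolution step; the toy adjacency matrix includes self-loops, which is a common modelling assumption rather than something the text specifies.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def gcn_layer(H, A, W):
    """One step of f(H^l, A) = relu(D^{-1/2} A D^{-1/2} H^l W^l), D = degree matrix of A."""
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return relu(D_inv_sqrt @ A @ D_inv_sqrt @ H @ W)

rng = np.random.default_rng(1)
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)   # toy adjacency with self-loops (an assumption)
H0 = rng.normal(size=(3, 4))             # input node features
W0 = rng.normal(size=(4, 8))             # layer parameters W^(0)
H1 = gcn_layer(H0, A, W0)                # output node features after one layer
```

Stacking three such calls with successively smaller W matrices (256, 128, 64 columns) would mirror the three graph convolution layers described above.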
In other words: the feature matrix and the adjacency matrix are input to the topology encoder, and the topology encoder outputs the second feature vector, which represents the latent space vector of the first selected message structure topological graph.
The first selected random noise refers to the random noise corresponding to the first selected message structure topological graph; the random noise can be drawn from a normal distribution, and one first selected message structure topological graph corresponds to one random noise.
The first feature vector and the second feature vector of the two branches are then merged; the sequence decoder extracts features from the merged vector and generates a message code sequence, i.e. the output message code sequence. The output message code sequence is then input to the first output layer in the initial generator to obtain the first result message code. The first output layer may be a Dense fully connected layer.
Thus far, two kinds of information pairs are obtained. The first kind consists of a first selected message structure topological graph and its first selected message code, and all such pairs form the first information pair set; the second kind consists of a first selected message structure topological graph and its first result message code (the message code generated from that topological graph), and all such pairs form the second information pair set. The first and second information pairs are then mixed, and E information pairs are drawn from the mixture as the third information pair set.
Each information pair in the first information pair set is assigned the value 1 (true), indicating that the real message code is the most accurate result; each information pair in the second information pair set and in the third information pair set is assigned the value 0 (false), indicating that the message code generated by the generator at this stage is fake, has low accuracy, and is the least accurate result.
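The construction and labelling of the three information pair sets can be sketched as follows; the pair contents are hypothetical placeholders.

```python
import random

random.seed(0)

# Hypothetical (topology, code) pairs standing in for real data.
real_pairs      = [("c%d" % i, "real_code_%d" % i) for i in range(4)]
generated_pairs = [("c%d" % i, "fake_code_%d" % i) for i in range(4)]
E = 4  # number of pairs drawn into the third set

first_set  = [(p, 1) for p in real_pairs]        # real pairs, assigned 1 (true)
second_set = [(p, 0) for p in generated_pairs]   # generated pairs, assigned 0 (false)
# Third set: E pairs drawn from the mixture of the first two kinds, also assigned 0.
third_set  = [(p, 0) for p in random.sample(real_pairs + generated_pairs, E)]
```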
Specifically, the initial discriminator includes a plurality of fully connected layers and a second output layer (the second output layer may be a Dense fully connected layer), where each fully connected layer includes 64 neurons with the activation function relu, and the second output layer includes one fully connected neuron with the activation function sigmoid. The sigmoid output is then fed into a cross-entropy loss function to obtain the target parameters (the first target parameter and the second target parameter).
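A minimal numpy sketch of such a discriminator, assuming a 96-dimensional input embedding for a topology/code pair and two hidden layers (the text only says "a plurality"); the input size and weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
relu = lambda x: np.maximum(x, 0.0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

IN = 96  # assumed embedding size of a (topology, code) pair
# Two hidden fully connected layers of 64 neurons each, relu activations.
W1 = rng.normal(size=(IN, 64)) * 0.1
W2 = rng.normal(size=(64, 64)) * 0.1
w_out = rng.normal(size=64) * 0.1          # second output layer: one neuron, sigmoid

def discriminator(x):
    h = relu(relu(x @ W1) @ W2)
    return float(sigmoid(h @ w_out))        # value in (0, 1): the pair's assignment

score = discriminator(rng.normal(size=(IN,)))
```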
The first objective function used in the present application is as follows:

L_D = −(1/E) Σ_(i=1..E) [ log D(c_i, x_i) + log(1 − D(c_i, x̂_i)) + log(1 − D(c_i, x̄_i)) ]

wherein L_D is the loss function value of the initial discriminator, i.e. the first target parameter; E is the number of information pairs in each set (the three sets contain the same number of pairs); D(c_i, x_i) is the assignment of each information pair in the first information pair set; D(c_i, x̂_i) is the assignment of each information pair in the second information pair set; and D(c_i, x̄_i) is the assignment of each information pair in the third information pair set. The parameters of the initial discriminator are updated using the following formula:

θ_d1 = θ_d − η_1 ∇_θd L_D

wherein θ_d denotes the parameters of the initial discriminator, θ_d1 denotes the parameters of the new discriminator (the initial discriminator after the parameter update), and η_1 is the learning rate of the initial discriminator.
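The first objective function and the update rule can be exercised with toy numbers. The cross-entropy form below is a reconstruction consistent with the assignments described above (real pairs pushed toward 1, generated and mixed pairs toward 0), and the scalar "gradient" is a stand-in for the true gradient of the loss.

```python
import numpy as np

def d_loss(real, fake, mixed):
    """Reconstructed first target parameter: averaged cross-entropy over the three pair sets."""
    real, fake, mixed = map(np.asarray, (real, fake, mixed))
    E = len(real)
    return -(np.log(real) + np.log(1 - fake) + np.log(1 - mixed)).sum() / E

# Toy discriminator outputs for the three information pair sets.
good = d_loss([0.9, 0.8], [0.1, 0.2], [0.15, 0.1])   # outputs close to the target labels
bad  = d_loss([0.5, 0.5], [0.5, 0.5], [0.5, 0.5])    # uninformative outputs -> higher loss

# Gradient-descent update theta_d1 = theta_d - eta_1 * grad (toy scalar stand-ins).
theta_d, eta_1, grad = 0.5, 0.01, 2.0
theta_d1 = theta_d - eta_1 * grad
```

As expected for a cross-entropy objective, a discriminator whose outputs match the labels achieves a lower loss than an uninformative one.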
The first preset condition is that the first target parameter converges: for the initial discriminator trained on one batch of second training samples, the value of the first target parameter changes very little relative to the initial discriminators trained on the preceding batches, and the first target parameter remains relatively stable over the last rounds of training. The initial discriminator trained on the last batch of second training samples is then taken as the discriminator.
Further, specifically, the step of training the initial generator using the second message structure topological graph, the first real message code and the obtained discriminator to obtain the generator includes: determining a second selected message structure topological graph in the second message structure topological graph, and determining a second selected message code corresponding to the second selected message structure topological graph in the first real message code; acquiring a second selected random noise corresponding to the second selected message structure topological graph; generating a second result message code using the initial generator based on the second selected message structure topological graph and the second selected random noise; inputting the second result message code into the discriminator to obtain a judgment result; obtaining a second target parameter based on a second objective function and the judgment result; updating the initial generator with the second target parameter to obtain a new generator; and taking the new generator as the initial generator and returning to execute the step of determining a second selected message structure topological graph from the second message structure topological graph, until the second target parameter meets a second preset condition, so as to obtain the generator.
It will be appreciated that for the steps preceding the generation of the second result message code, reference may be made to the training procedure of the discriminator above, which is not repeated here. The first training sample and its corresponding intermediate data may also be used directly as the second training sample and its corresponding intermediate data; for example, the first result message code corresponding to the first training sample may be used as the second result message code, so that the steps before generating the result message code are omitted and the already-generated data is used directly for training.
After the discriminator is obtained, the discriminator is used to assign a value to the second result message code. If the value assigned to the message code (the second result message code) generated by the initial generator is 0, the accuracy of the initial generator is low; if the value is 1, the accuracy is high, although the value corresponding to the initial generator is usually 0. Specifically, the second objective function is as follows:
L_G = −(1/F) Σ_(i=1..F) log D(G(c_i, z_i))

wherein L_G is the second target parameter; F is the number of training samples in a batch, i.e. the number of second selected message structure topological graphs corresponding to second selected message codes; G(c_i, z_i) is any one of the second result message codes; and D(G(c_i, z_i)) is the value assigned to G(c_i, z_i) by the discriminator. The parameters of the initial generator are updated using the following formula:

θ_g1 = θ_g − η_2 ∇_θg L_G

wherein θ_g denotes the parameters of the initial generator, θ_g1 denotes the parameters of the new generator (the initial generator after the parameter update), and η_2 is the learning rate of the initial generator.
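The second objective function and the generator update can likewise be sketched with toy numbers; the cross-entropy form below (the generator is rewarded when the discriminator scores approach 1) and the scalar gradient are illustrative reconstructions, not the patented formulas.

```python
import numpy as np

def g_loss(d_scores):
    """Reconstructed second target parameter: -(1/F) * sum(log D(G(c_i, z_i)))."""
    d_scores = np.asarray(d_scores)
    F = len(d_scores)
    return -np.log(d_scores).sum() / F

early = g_loss([0.05, 0.1, 0.08])   # discriminator rejects the generated codes -> high loss
late  = g_loss([0.7, 0.8, 0.75])    # generated codes start fooling the discriminator -> low loss

# Update rule theta_g1 = theta_g - eta_2 * grad (toy scalar stand-ins).
theta_g, eta_2, grad = 1.0, 0.05, -0.4
theta_g1 = theta_g - eta_2 * grad
```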
The second preset condition is that the second target parameter converges: for the initial generator trained on one batch of first training samples, the value of the second target parameter changes very little relative to the initial generators trained on the preceding batches, and the second target parameter remains relatively stable over the last rounds of training. The initial generator trained on the last batch of first training samples is then taken as the generator.
It can be understood that the step of inputting the first message structure topological graph into the generator to obtain the message code to be sent is similar to the step of obtaining the first result message code from the first selected message structure topological graph and the first selected random noise in the training process, and is not described again here.
Referring to fig. 5, fig. 5 is a schematic diagram of the training process of the initial generator and the initial discriminator according to the present invention. In fig. 5, the graph-conditional generative network is the network formed by the initial generator and the initial discriminator, where the graph condition is the message structure topological graph added as a condition to the structures of the initial generator and the initial discriminator.
Training process of the initial discriminator: the initial generator generates a result message code x based on the message structure topological graph c corresponding to the message to be sent in the training sample, and the initial discriminator is trained using the result message code and the message structure topological graph to obtain the trained discriminator; for the specific training process, refer to the description above, which is not repeated here.
Training process of the initial generator: the initial generator generates a result message code x based on the message structure topological graph c corresponding to the message to be sent in the training sample; the trained discriminator judges the result message code (outputting 1 for true and 0 for false) to obtain an output judgment result, and the output judgment result D(c, x) is used as the training feedback for the initial generator parameters to complete the training of the initial generator; for the specific training process, refer to the description above, which is not repeated here.
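The alternating training of the two networks can be summarized as a schematic loop; the loss sequences below are fabricated stand-ins, and the convergence test mirrors the "value changes very little between batches" conditions described above.

```python
# Per-batch values of the first and second target parameters (fabricated examples).
d_losses = [2.0, 1.2, 0.9, 0.82, 0.81, 0.809]
g_losses = [3.0, 2.1, 1.5, 1.31, 1.30, 1.299]

def converged(history, eps=0.05):
    """Preset condition: the last change of the target parameter is very small."""
    return len(history) >= 2 and abs(history[-1] - history[-2]) < eps

batches = 0
for d, g in zip(d_losses, g_losses):
    batches += 1
    # step 1: update the discriminator on this batch (stubbed by recording d)
    # step 2: update the generator against the updated discriminator (stubbed by recording g)
    if converged(d_losses[:batches]) and converged(g_losses[:batches]):
        break
```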
Step S14: and generating a message to be sent based on the message code to be sent.
After the message code is obtained, it is converted into a specific message to be sent.
The invention provides a message generation method which is applied to message management equipment, and comprises the following steps: receiving message data to be sent; converting the message data to be sent into a first message structure topological graph; inputting the first message structure topological graph into a training obtained generator to generate a message code sent out in a preset format; and generating a message to be sent based on the message code to be sent.
In the existing message generation method, industry technicians must manually convert the message data to be sent to obtain the message code, and the final issued message is obtained based on that message code, so obtaining the message code takes a long time and the issued message is obtained inefficiently. In the present invention, the message management device directly uses the generator obtained by training to generate the message code to be sent in the preset format based on the first message structure topological graph corresponding to the message data to be sent, without industry technicians manually converting the message data to be sent, thereby reducing the conversion time and improving the efficiency of obtaining the message to be sent.
Referring to fig. 6, fig. 6 is a block diagram of a first embodiment of a message generating apparatus according to the present invention. Based on the same inventive concept as the previous embodiments, the apparatus comprises:
a receiving module 10, configured to receive message data to be sent;
a first conversion module 20, configured to convert the message data to be sent into a first message structure topology map;
a generating module 30, configured to input the first message structure topological graph into a training-obtained generator to generate a message code sent in a preset format;
and a second conversion module 40, configured to generate a message to be sent based on the message code to be sent.
It should be noted that, since the steps executed by the apparatus of this embodiment are the same as those of the foregoing method embodiment, specific implementation manners and technical effects that can be achieved of the apparatus of this embodiment may refer to the foregoing embodiment, and will not be repeated herein.
The foregoing description is only of the optional embodiments of the present invention, and is not intended to limit the scope of the invention, and all the equivalent structural changes made by the description of the present invention and the accompanying drawings or the direct/indirect application in other related technical fields are included in the scope of the invention.

Claims (10)

1. A method of message generation, the method comprising the steps of:
receiving message data to be sent;
converting the message data to be sent into a first message structure topological graph;
inputting the first message structure topological graph into a training obtained generator to generate a message code sent out in a preset format;
generating a message to be sent based on the message code to be sent;
before the step of inputting the first message structure topological graph into a training obtained generator to generate a preset format of message code, the method further comprises:
training an initial discriminator by using an initial generator to obtain a discriminator;
training the initial generator by using the discriminator to obtain the generator;
the step of training the initial discriminator by using the initial generator to obtain the discriminator comprises the following steps:
training an initial discriminator by using the third message structure topological graph, the second real message code and the initial generator to obtain a discriminator; the third message structure topological graph is obtained by converting the second historical message data to be sent, and the second real message code is the real message code corresponding to the second historical message data to be sent.
2. The method of claim 1, wherein prior to the step of receiving message data to be transmitted, the method further comprises:
acquiring a first training sample, wherein the first training sample comprises first historical message data to be sent and a first real message code corresponding to the first historical message data to be sent;
converting the first historical message data to be sent into a second message structure topological graph;
the step of training the initial generator by using the discriminator to obtain the generator comprises the following steps:
training the initial generator by using the second message structure topological graph, the first real message code and the obtained discriminators to obtain the generator.
3. The method of claim 2, wherein the step of training the initial generator using the second message structure topological graph, the first real message code, and the trained discriminator to obtain the generator is preceded by the steps of:
acquiring a second training sample, wherein the second training sample comprises the second historical message data to be sent and the second real message code corresponding to the second historical message data to be sent;
And converting the second historical message data to be sent into the third message structure topological graph.
4. The method of claim 3, wherein the step of training an initial discriminator using the third message structure topological graph, the second real message code, and the initial generator to obtain the discriminator comprises:
determining a first selected message structure topological graph in the third message structure topological graph, and determining a first selected message code corresponding to the first selected message structure topological graph in the second real message code;
acquiring a first selected random noise corresponding to the first selected message structure topological graph;
generating a first result message code using the initial generator based on the first selected message structure topology map and the first selected random noise;
combining the first information pair set and the second information pair set to obtain a third information pair set, wherein the first selected message structure topological graph and the first selected message code form information pairs in the first information pair set, and the first selected message structure topological graph and the first result message code form information pairs in the second information pair set;
Assigning each information pair in the first information pair set, each information pair in the second information pair set and each information pair in the third information pair set to each other;
acquiring a first target parameter based on a first target function, assignment of each information pair in the first information pair set, assignment of each information pair in the second information pair set and assignment of each information pair in the third information pair set;
updating the initial discriminator with the first target parameter to obtain a new discriminator;
and taking the new discriminator as the initial discriminator, and returning to execute the step of determining a first selected message structure topological graph from the third message structure topological graph until a first target parameter meets a first preset condition, so as to obtain the discriminator.
5. The method of claim 4, wherein the training the initial generator to obtain the generator using the second message structure topological graph, the first real message code, and the trained discriminator comprises:
determining a second selected message structure topological graph in the second message structure topological graph, and determining a second selected message code corresponding to the second selected message structure topological graph in the first real message code;
Acquiring a second selected random noise corresponding to the second selected message structure topological graph;
generating a second result message code using the initial generator based on the second selected message structure topology map and the second selected random noise;
inputting the second result message code into the discriminator to obtain a decision result;
obtaining a second target parameter based on a second target function and the judging result;
updating the initial generator with the second target parameter to obtain a new generator;
and taking the new generator as the initial generator, and returning to execute the step of determining a second selected message structure topological graph from the second message structure topological graph until a second target parameter meets a second preset condition, so as to obtain the generator.
6. The method of claim 5, wherein the step of generating a first resulting message code using the initial generator based on the first selected message structure topology map and the first selected random noise comprises:
inputting the first selected random noise to a noise encoder in the initial generator to obtain a first feature vector;
Inputting a first selected message structure topology map to a topology encoder in the initial generator to obtain a second feature vector;
inputting the first feature vector and the second feature vector into a merging layer in the initial generator to obtain a merging vector;
inputting the combined vector into a sequence decoder in the initial generator to obtain an output message code sequence;
the sequence of output message codes is input to a first output layer in the initial generator to obtain a first result message code.
7. The method of claim 6, wherein:
the noise encoder comprises three long short-term memory (LSTM) layers, each LSTM layer has 32 neurons, and its activation function is relu;
the topology encoder comprises a first graph convolution layer, a second graph convolution layer and a third graph convolution layer; the number of convolution kernels of the first graph convolution layer is 256, and its activation function is relu; the number of convolution kernels of the second graph convolution layer is 128, and its activation function is relu; the number of convolution kernels of the third graph convolution layer is 64, and its activation function is lambda;
the sequence decoder comprises three LSTM layers, the number of neurons in each LSTM layer is set to 128, and the activation function is set to relu;
the activation function of the first output layer is softmax, and the number of fully-connected neurons in the first output layer is the same as the output dimension of the first result message code; and
the initial discriminator comprises a plurality of fully-connected layers and a second output layer, wherein each fully-connected layer comprises 64 neurons with a relu activation function, and the second output layer comprises a single fully-connected neuron with a sigmoid activation function.
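The discriminator shape in claim 7 — several fully-connected layers of 64 ReLU neurons followed by a single sigmoid output neuron judging whether a message code is real — can be sketched directly in NumPy. The weights below are random placeholders and the depth is an assumption (the claim only says "a plurality" of layers); only the layer widths and activations follow the claim.

```python
import numpy as np

rng = np.random.default_rng(1)

def discriminator(code, n_hidden_layers=3):
    x = code
    for _ in range(n_hidden_layers):
        # Fully-connected layer with 64 relu neurons, per the claim.
        w = rng.normal(scale=0.1, size=(x.shape[0], 64))
        x = np.maximum(x @ w, 0.0)
    # Second output layer: a single fully-connected neuron with sigmoid.
    w_out = rng.normal(scale=0.1, size=(64, 1))
    return 1.0 / (1.0 + np.exp(-(x @ w_out)))[0]  # score in (0, 1)

score = discriminator(rng.normal(size=10))
```

The sigmoid output is the "judging result" fed into the second target function during generator training: values near 1 mean the input message code is judged real.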
8. A message generation apparatus, the apparatus comprising:
a receiving module, configured to receive message data to be sent;
a first conversion module, configured to convert the message data to be sent into a first message structure topological graph;
a generation module, configured to input the first message structure topological graph into a generator obtained by training, so as to generate a message code to be sent in a preset format;
a second conversion module, configured to generate a message to be sent based on the message code to be sent;
wherein the generation module is further configured to train an initial discriminator using an initial generator to obtain a discriminator, and to train the initial generator using the discriminator to obtain the generator; and
the generation module is further configured to train the initial discriminator using a third message structure topological graph, a second real message code and the initial generator to obtain the discriminator, the third message structure topological graph being obtained by converting second historical message data to be sent, and the second real message code being the real message code corresponding to the second historical message data to be sent.
9. A message management device, comprising: a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the computer program, when executed by the processor, implements the steps of the message generation method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the message generation method according to any one of claims 1 to 7.
CN202110629209.4A 2021-06-04 2021-06-04 Message generation method, device, message management equipment and storage medium Active CN115442324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110629209.4A CN115442324B (en) 2021-06-04 2021-06-04 Message generation method, device, message management equipment and storage medium


Publications (2)

Publication Number Publication Date
CN115442324A CN115442324A (en) 2022-12-06
CN115442324B true CN115442324B (en) 2023-08-18

Family

ID=84271706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110629209.4A Active CN115442324B (en) 2021-06-04 2021-06-04 Message generation method, device, message management equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115442324B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7689531B1 (en) * 2005-09-28 2010-03-30 Trend Micro Incorporated Automatic charset detection using support vector machines with charset grouping
CN108648135A (en) * 2018-06-01 2018-10-12 深圳大学 Hide model training and application method, device and computer readable storage medium
CN109885667A (en) * 2019-01-24 2019-06-14 平安科技(深圳)有限公司 Document creation method, device, computer equipment and medium
CN111865752A (en) * 2019-04-23 2020-10-30 北京嘀嘀无限科技发展有限公司 Text processing device, method, electronic device and computer readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10089974B2 (en) * 2016-03-31 2018-10-02 Microsoft Technology Licensing, Llc Speech recognition and text-to-speech learning system
US10503834B2 (en) * 2017-11-17 2019-12-10 Digital Genius Limited Template generation for a conversational agent


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhong Meng, "Adversarial Speaker Adaptation," ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), full text *

Also Published As

Publication number Publication date
CN115442324A (en) 2022-12-06

Similar Documents

Publication Publication Date Title
US10284705B2 (en) Method and apparatus for controlling smart device, and computer storage medium
CN110598213A (en) Keyword extraction method, device, equipment and storage medium
CN109299458A (en) Entity recognition method, device, equipment and storage medium
CN109948633A (en) User gender prediction method, apparatus, storage medium and electronic equipment
CN111753498B (en) Text processing method, device, equipment and storage medium
CN110069715A (en) A kind of method of information recommendation model training, the method and device of information recommendation
CN111222647A (en) Federal learning system optimization method, device, equipment and storage medium
CN110162604B (en) Statement generation method, device, equipment and storage medium
CN113723378B (en) Model training method and device, computer equipment and storage medium
CN110020022A (en) Data processing method, device, equipment and readable storage medium storing program for executing
CN111507094B (en) Text processing model training method, device and equipment based on deep learning
CN114004905B (en) Method, device, equipment and storage medium for generating character style pictogram
CN115546589A (en) Image generation method based on graph neural network
CN114610677A (en) Method for determining conversion model and related device
EP4083860A1 (en) Method and apparatus for training item coding model
CN115442324B (en) Message generation method, device, message management equipment and storage medium
CN113869377A (en) Training method and device and electronic equipment
CN116186295B (en) Attention-based knowledge graph link prediction method, attention-based knowledge graph link prediction device, attention-based knowledge graph link prediction equipment and attention-based knowledge graph link prediction medium
CN112862021A (en) Content labeling method and related device
CN116229188A (en) Image processing display method, classification model generation method and equipment thereof
CN110532448B (en) Document classification method, device, equipment and storage medium based on neural network
CN111814044A (en) Recommendation method and device, terminal equipment and storage medium
CN116233495A (en) Program resource recommendation method and device, program management engine and storage medium
CN112200198B (en) Target data feature extraction method, device and storage medium
CN115168609A (en) Text matching method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant