CN115687031A - Method, device, equipment and medium for generating alarm description text - Google Patents


Info

Publication number
CN115687031A
Authority
CN
China
Prior art keywords
alarm
text
target
training
model
Prior art date
Legal status
Pending
Application number
CN202211431021.XA
Other languages
Chinese (zh)
Inventor
饶琛琳
梁玫娟
Current Assignee
Beijing Youtejie Information Technology Co ltd
Original Assignee
Beijing Youtejie Information Technology Co ltd
Application filed by Beijing Youtejie Information Technology Co ltd
Priority to CN202211431021.XA
Publication of CN115687031A

Landscapes

  • Alarm Systems (AREA)

Abstract

The invention discloses a method, a device, equipment and a medium for generating an alarm description text. The method comprises: obtaining a target alarm log generated by a monitoring system for a target service system, inputting the target alarm log into a pre-trained text description model, and outputting, through the text description model, a target alarm description text matched with the target alarm log. The technical scheme of the embodiments of the invention can save the labor cost and time cost consumed in the alarm log processing process and improve the repair efficiency of the service system.

Description

Method, device, equipment and medium for generating alarm description text
Technical Field
The invention relates to the technical field of computers, in particular to a method, a device, equipment and a medium for generating an alarm description text.
Background
With the development of information technology, most business operations are now integrated into computer systems. In order to ensure the normal operation of a business system, an enterprise deploys a monitoring system to acquire the operation data of the business system in real time and generate alarm logs from the operation data. As service systems grow ever larger and the number of alarm logs keeps increasing, screening out the logs with higher severity levels from the large number of alarm logs is particularly important for the development of the service system.
In the prior art, operation and maintenance personnel usually determine the severity level of each alarm log in a large number of alarm logs according to preset indexes in a manual mode, then screen the alarm logs with higher severity levels, and investigate the fault reasons of the alarm logs.
However, the existing alarm log processing method consumes considerable labor cost and time cost, resulting in low repair efficiency of the service system.
Disclosure of Invention
The invention provides a method, a device, equipment and a medium for generating an alarm description text, which can save the labor cost and the time cost consumed in the alarm log processing process and improve the repair efficiency of a service system.
According to an aspect of the present invention, a method for generating an alarm description text is provided, including:
acquiring a target alarm log generated by a monitoring system aiming at a target service system;
inputting the target alarm log into a pre-trained text description model;
the text description model is obtained by training a unified pre-training language model UniLM through an alarm text training set; the UniLM model comprises a plurality of Transformer network layers;
outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
Optionally, before obtaining the target alarm log generated by the monitoring system for the target service system, the method further includes:
acquiring an alarm text training set corresponding to a monitoring system, wherein the alarm text training set comprises a plurality of alarm texts;
and using the alarm text training set to carry out iterative training on a UniLM model to obtain the text description model.
Optionally, obtaining an alarm text training set corresponding to the monitoring system includes:
acquiring a plurality of historical alarm logs generated by a monitoring system aiming at a plurality of service systems;
acquiring a plurality of extended alarm texts from a preset open source data warehouse;
and generating the alarm text training set according to the plurality of historical alarm logs and the plurality of extended alarm texts.
Optionally, generating the alarm text training set according to the multiple historical alarm logs and the multiple extended alarm texts includes:
according to preset alarm fields, respectively extracting alarm field information from each historical alarm log;
formatting each piece of alarm field information to obtain a plurality of historical alarm texts;
and generating the alarm text training set according to the plurality of historical alarm texts and the plurality of extended alarm texts.
Optionally, using the alarm text training set to perform iterative training on a UniLM model to obtain the text description model, including:
generating, through the UniLM model, a word embedding vector matrix corresponding to each alarm text in the alarm text training set;
inputting the word embedding vector matrix corresponding to each alarm text into a plurality of Transformer network layers in sequence, and training the plurality of Transformer network layers to obtain the text description model.
Optionally, training the plurality of Transformer network layers to obtain the text description model includes:
dividing the alarm texts into a first text set, a second text set and a third text set according to a preset weight proportion;
performing unidirectional training on the plurality of Transformer network layers using the first text set;
performing bidirectional training on the plurality of Transformer network layers using the second text set;
and performing sequence training on the plurality of Transformer network layers using the third text set.
Optionally, inputting the target alarm log into a pre-trained text description model, including:
generating a target word embedding vector matrix corresponding to the target alarm log;
and inputting the target word embedding vector matrix into the pre-trained text description model.
According to another aspect of the present invention, there is provided an apparatus for generating an alarm description text, the apparatus comprising:
the log acquisition module is used for acquiring a target alarm log generated by the monitoring system aiming at the target service system;
the log input module is used for inputting the target alarm log into a pre-trained text description model;
the text description model is obtained by training a unified pre-training language model UniLM through an alarm text training set; the UniLM model comprises a plurality of Transformer network layers;
the text output module is used for outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, the computer program, when executed, enabling the at least one processor to perform the method of generating an alarm description text according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions which, when executed, cause a processor to implement the method for generating an alarm description text according to any one of the embodiments of the present invention.
According to the technical scheme provided by the embodiment of the invention, a target alarm log generated by a monitoring system for a target service system is acquired, the target alarm log is input into a pre-trained text description model, and a target alarm description text matched with the target alarm log is output through the text description model, the target alarm description text comprising the alarm level and the fault reason corresponding to the target alarm log. This technical means can save the labor cost and time cost consumed in the alarm log processing process and improve the repair efficiency of the service system.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings required to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the description below are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flowchart of a method for generating an alarm description text according to an embodiment of the present invention;
FIG. 2 is a flowchart of another method for generating an alarm description text according to an embodiment of the present invention;
FIG. 3 is a flowchart of another method for generating an alarm description text according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an apparatus for generating an alarm description text according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device implementing the method for generating an alarm description text according to the embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a flowchart of a method for generating an alarm description text according to an embodiment of the present invention, where this embodiment is applicable to a situation where a corresponding description text is generated for an alarm log generated by a monitoring system, and the method may be executed by an alarm description text generation apparatus, where the alarm description text generation apparatus may be implemented in a form of hardware and/or software, and the alarm description text generation apparatus may be configured in an electronic device (for example, a terminal or a server) with a data processing function. As shown in fig. 1, the method includes:
and step 110, acquiring a target alarm log generated by the monitoring system aiming at the target service system.
In this embodiment, the monitoring system may generate a corresponding target alarm log according to the operating condition of the target service system, where the target alarm log may include multiple alarm events.
Step 120, inputting the target alarm log into a pre-trained text description model;
in this embodiment, the text description Model is obtained by using an alarm text training set to train a Unified Language Model (UniLM); the UniLM model comprises a plurality of Transformer network layers.
In a specific embodiment, the alarm text training set may include a plurality of alarm texts for training the UniLM model. The UniLM model is a model for understanding natural language and generating corresponding description text, and it can be fine-tuned for both natural language understanding and generation tasks. The model is pre-trained using three types of language modeling tasks (unidirectional, bidirectional, and sequence-to-sequence prediction) and uses a shared Transformer network with task-specific self-attention masks to control the context that each prediction conditions on.
The structure of the Transformer network layer is consistent with that of the BERT model, and each layer is built around a self-attention mechanism.
Step 130, outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
In this embodiment, after the target alarm log is input into the text description model, the text description model may determine the alarm level and the fault cause of the target alarm log according to the language characteristics of the target alarm log, the logical relationships between sentences and their context in the log, and the previously learned logical relationships between fault characteristics and fault causes, and then generate the target alarm description text from the alarm level and the fault cause.
In a specific embodiment, the higher the alarm level corresponding to the target alarm log is, the more serious the fault condition in the target alarm log can be considered, so that operation and maintenance personnel can repair the fault condition in time according to the fault reason.
In this embodiment, by constructing a text description model in advance, after a monitoring system generates a new alarm log, the severity level and the failure reason of the alarm log are automatically determined, so that operation and maintenance personnel can screen logs with higher levels in a large number of alarm logs in time and quickly repair failure conditions related to the logs, thereby saving labor cost and time cost consumed in the alarm log processing process and improving the repair efficiency of a service system.
According to the technical scheme provided by the embodiment of the invention, a target alarm log generated by a monitoring system for a target service system is acquired, the target alarm log is input into a pre-trained text description model, and a target alarm description text matched with the target alarm log is output through the text description model, the target alarm description text comprising the alarm level and the fault reason corresponding to the target alarm log. This technical means can save the labor cost and time cost consumed in the alarm log processing process and improve the repair efficiency of the service system.
Fig. 2 is a flowchart of a method for generating an alarm description text according to a second embodiment of the present invention, which is a further refinement of the above-described embodiment. As shown in fig. 2, the method includes:
step 210, obtaining an alarm text training set corresponding to the monitoring system, where the alarm text training set includes a plurality of alarm texts.
In an embodiment of this embodiment, acquiring an alarm text training set corresponding to a monitoring system includes: acquiring a plurality of historical alarm logs generated by a monitoring system aiming at a plurality of service systems; acquiring a plurality of extended alarm texts from a preset open source data warehouse; and generating the alarm text training set according to the plurality of historical alarm logs and the plurality of extended alarm texts.
In this embodiment, the open source data warehouse may include data platforms oriented to open source software projects (e.g., GitHub) and various open source retrieval websites. The open source data warehouse may include a large number of alarm texts (i.e., extended alarm texts) corresponding to other business systems.
The advantage of this arrangement is that, since the historical alarm logs cover only a limited range of system types and fault reasons, obtaining alarm texts generated by other business systems from the open source data warehouse when training the UniLM model enriches the sample types in the training set and improves the effectiveness and accuracy of the subsequently trained text description model.
In a specific embodiment, generating the alarm text training set according to the plurality of historical alarm logs and the plurality of extended alarm texts includes: extracting alarm field information from each historical alarm log according to preset alarm fields; formatting each piece of alarm field information to obtain a plurality of historical alarm texts; and generating the alarm text training set according to the plurality of historical alarm texts and the plurality of extended alarm texts.
In this embodiment, in order to ensure consistency of the alarm text format, the alarm field information in the historical alarm logs may be formatted. The advantage of this arrangement is that it allows the UniLM model to extract the features of each alarm text in a unified manner, thereby improving the training efficiency of the model.
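As an illustration of this extraction-and-formatting step, the sketch below pulls preset alarm fields out of a JSON-formatted log and renders them into one uniform line. The field names and the output template are assumptions for the example only; the patent does not specify the preset alarm fields or the log format.

```python
import json

# Illustrative preset alarm fields; these names are assumptions, not taken
# from the patent.
ALARM_FIELDS = ["timestamp", "host", "metric", "level", "message"]

def extract_fields(raw_log: str) -> dict:
    """Pull the preset alarm fields out of a JSON-formatted alarm log."""
    record = json.loads(raw_log)
    return {field: record.get(field, "") for field in ALARM_FIELDS}

def format_alarm_text(fields: dict) -> str:
    """Render the extracted fields into one consistently formatted line."""
    return "[{timestamp}] {host} {metric} level={level}: {message}".format(**fields)

log = ('{"timestamp": "2022-11-15T08:00:00", "host": "db-01", '
       '"metric": "cpu_load", "level": "critical", '
       '"message": "load above threshold"}')
text = format_alarm_text(extract_fields(log))
```

Formatting every historical log through one template like this is what gives the training set the uniform structure the paragraph above describes.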
And step 220, performing iterative training on a UniLM model by using the alarm text training set to obtain the text description model.
Step 230, acquiring a target alarm log generated by the monitoring system for the target service system.
Step 240, inputting the target alarm log into the pre-trained text description model.
Step 250, outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
According to the technical scheme provided by the embodiment of the invention, through acquiring the alarm text training set corresponding to the monitoring system, using the alarm text training set to perform iterative training on the UniLM model to obtain the text description model, acquiring the target alarm log generated by the monitoring system aiming at the target service system, inputting the target alarm log into the pre-trained text description model, and outputting the target alarm description text matched with the target alarm log through the text description model, the labor cost and the time cost consumed in the alarm log processing process can be saved, and the repair efficiency of the service system is improved.
Fig. 3 is a flowchart of a method for generating an alarm description text according to a third embodiment of the present invention, which is a further refinement of the above-described embodiments. As shown in fig. 3, the method includes:
step 310, an alarm text training set corresponding to the monitoring system is obtained, wherein the alarm text training set comprises a plurality of alarm texts.
And step 320, generating a word embedding vector matrix corresponding to each alarm text in the alarm text training set through the UniLM model.
In this step, optionally, each alarm text may be preprocessed through the UniLM model, including semantic feature embedding, sentence feature embedding, position feature embedding, and the like, and a word embedding vector matrix H corresponding to each alarm text is obtained from the processing result, where H = [x_1, x_2, ..., x_j] and each x is the word vector corresponding to one participle of the alarm text.
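The preprocessing described above can be sketched as the BERT-style element-wise sum of three lookup tables (token, segment/sentence, and position embeddings). Every table and value below is a toy stand-in, not a trained embedding:

```python
# Toy sketch: each row of H = token embedding + segment embedding + position
# embedding, summed element-wise (BERT/UniLM-style input representation).
def embed_sequence(token_ids, segment_ids, tok_table, seg_table, pos_table):
    return [
        [t + s + p for t, s, p in zip(tok_table[tok], seg_table[seg], pos_table[i])]
        for i, (tok, seg) in enumerate(zip(token_ids, segment_ids))
    ]

# Stand-in lookup tables with 2-dimensional vectors (values chosen to be
# exactly representable in binary floating point).
tok_table = {0: [0.5, 1.0], 1: [2.0, 4.0]}   # token id   -> word vector
seg_table = {0: [0.25, 0.25]}                 # segment id -> segment vector
pos_table = [[0.0, 0.0], [8.0, 8.0]]          # position   -> position vector

H = embed_sequence([0, 1], [0, 0], tok_table, seg_table, pos_table)
# → [[0.75, 1.25], [10.25, 12.25]]
```

Each row of the resulting H corresponds to one x in H = [x_1, x_2, ..., x_j].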
Step 330, inputting the word embedding vector matrix corresponding to each alarm text into a plurality of Transformer network layers in sequence, and training the plurality of Transformer network layers to obtain the text description model.
In this embodiment, optionally, the number of Transformer network layers may be set to 24; the specific value may be preset according to the actual situation, which is not limited in this embodiment. Specifically, the word embedding vector matrix H_0 of each alarm text may be taken as the input to the 24 Transformer network layers, and the output of the l-th Transformer network layer may be:
H_l = Transformer_l(H_{l-1})
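As a toy illustration of this stacked computation, each layer can be treated as a callable applied in sequence to the previous layer's output. The stand-in layers below, which just add 1.0 to every entry, are of course not real Transformer layers; a real implementation would use something like torch.nn.TransformerEncoderLayer:

```python
NUM_LAYERS = 24  # the layer count suggested in the description above

def run_layers(h0, layers):
    """Apply H_l = Transformer_l(H_{l-1}) for l = 1..len(layers)."""
    h = h0
    for layer in layers:
        h = layer(h)
    return h

# Stand-in "layers": each adds 1.0 to every entry, so 24 layers add 24.0.
toy_layers = [lambda h: [[v + 1.0 for v in row] for row in h]] * NUM_LAYERS
out = run_layers([[0.0, 0.0]], toy_layers)  # → [[24.0, 24.0]]
```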
In a specific embodiment, before the word embedding vector matrix is input into a Transformer network layer, it may first be multiplied by preset weight matrices, and the products input into the layer. The specific calculation may be as follows:
Q = H_{l-1} W_l^Q
K = H_{l-1} W_l^K
V = H_{l-1} W_l^V
where W_l^Q, W_l^K and W_l^V are the weight matrices corresponding to the l-th Transformer network layer, and Q, K and V are respectively the Query vector, Key vector and Value vector corresponding to that layer.
In this embodiment, when the Transformer network layer is trained, the parameters of the Transformer network layer may also be adjusted according to the mask matrix corresponding to the alarm text. Specifically, the transform network layer may control an attention range of each participle in the text according to a mask matrix of the alarm text.
If a position in the alarm text may be attended to, the corresponding mask matrix element M_ij may be 0; if the position cannot be attended to, M_ij may be -∞. After the mask matrix of the alarm text is set, the parameters of the network layer may be adjusted according to the following formula:
Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k) + M) V
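A minimal sketch of this masked self-attention, softmax(Q K^T / sqrt(d_k) + M) V, with plain Python lists as matrices. Setting a mask entry to -inf drives the corresponding attention weight to exactly zero, which is the blocking behavior described above:

```python
import math

NEG_INF = float("-inf")

def masked_attention(Q, K, V, M):
    """Compute softmax(Q K^T / sqrt(d_k) + M) V row by row."""
    d_k = len(K[0])
    out = []
    for i, q in enumerate(Q):
        # raw score for query i against every key, plus the mask entry
        scores = [sum(qv * kv for qv, kv in zip(q, k)) / math.sqrt(d_k) + M[i][j]
                  for j, k in enumerate(K)]
        mx = max(scores)
        exps = [math.exp(s - mx) for s in scores]   # math.exp(-inf) == 0.0
        total = sum(exps)
        weights = [e / total for e in exps]
        out.append([sum(w * v[d] for w, v in zip(weights, V))
                    for d in range(len(V[0]))])
    return out

# One query, two identical keys; the mask blocks the second value entirely.
attn = masked_attention([[1.0, 0.0]],
                        [[1.0, 0.0], [1.0, 0.0]],
                        [[1.0], [5.0]],
                        [[0.0, NEG_INF]])  # → [[1.0]]
```

With an all-zero mask instead, the two equal scores would each get weight 0.5 and the result would be the average of the two value rows.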
in an implementation manner of this embodiment, training the multiple transform network layers to obtain the text description model includes: dividing the alarm texts into a first text set, a second text set and a third text set according to a preset weight proportion; performing unidirectional training on the plurality of transform network layers using the first text set; performing bidirectional training on the plurality of transform network layers using the second text set; performing sequence training on the plurality of transform network layers by using the third text set.
In a specific embodiment, among all the alarm texts, 1/3 may be used as the first text set, 1/3 as the second text set, and 1/3 as the third text set. The specific weight proportion may be preset according to the actual situation, which is not limited in this embodiment.
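A minimal sketch of splitting the alarm texts into three sets under the equal-thirds proportion given above; the shuffle, seed and helper name are illustrative choices, not specified by the patent:

```python
import random

def split_training_set(texts, seed=0):
    """Shuffle the alarm texts and cut them into three roughly equal sets."""
    shuffled = texts[:]
    random.Random(seed).shuffle(shuffled)   # deterministic toy shuffle
    n = len(shuffled)
    first = shuffled[: n // 3]
    second = shuffled[n // 3 : 2 * n // 3]
    third = shuffled[2 * n // 3 :]
    return first, second, third

texts = [f"alarm text {i}" for i in range(9)]
first, second, third = split_training_set(texts)
```

Unequal proportions would simply change the two cut points.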
In a specific embodiment, optionally, half of the texts in the first text set may be used to perform unidirectional left-to-right training on the Transformer network layers, and the other half to perform unidirectional right-to-left training. For example, assuming the input sequence is x_1, x_2, [MASK], x_4, then x_1 and x_2 may be used for self-encoding, with the upper triangular matrix used as the mask matrix.
In one embodiment, when the plurality of Transformer network layers are trained bidirectionally using the second text set, all tokens in the sequence may be used for encoding. For example, assuming the input sequence is x_1, x_2, [MASK], x_4, then x_1, x_2 and x_4 may be used for self-encoding, with the all-zero matrix used as the mask matrix.
In a specific embodiment, when sequence training is performed on the plurality of Transformer network layers using the third text set, an [SOS] mark may be added at the beginning of the input text and an [EOS] mark at the end, so that these marks serve as task boundary identifiers.
Specifically, during sequence training, if the special mark [MASK] to be predicted appears in the first text segment, all tokens in the first text segment may be used for the prediction; if it appears in the second text segment, all tokens in the first text segment and all tokens to the left of the mark in the second text segment may be used.
For example, assume the input sequence is: [SOS] x_1 x_2 [MASK1] x_4 [EOS] x_5 x_6 [MASK2] [EOS]. When predicting [MASK1], the [SOS] and [EOS] marks may be removed and x_1, x_2 and x_4 used for self-encoding; when predicting [MASK2], the [SOS] and [EOS] marks may be removed and x_1, x_2, [MASK1], x_4, x_5 and x_6 used for self-encoding.
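The mask matrices used by the three training modes above (upper-triangular blocking for unidirectional left-to-right training, all-zero for bidirectional training, and a mixed source/target mask for sequence training) can be sketched as follows, using the convention that 0 means a position may be attended to and -inf means it is blocked. The helper names are illustrative:

```python
NEG_INF = float("-inf")

def unidirectional_mask(n):
    """Left-to-right LM: token i may attend only to positions j <= i
    (the upper triangle is blocked)."""
    return [[0.0 if j <= i else NEG_INF for j in range(n)] for i in range(n)]

def bidirectional_mask(n):
    """Bidirectional LM: every token attends to every token (all-zero mask)."""
    return [[0.0] * n for _ in range(n)]

def seq2seq_mask(n_src, n_tgt):
    """Sequence-to-sequence LM: all tokens may see the whole first (source)
    segment; second (target) segment tokens additionally see target tokens
    at or before their own position."""
    n = n_src + n_tgt
    mask = [[NEG_INF] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if j < n_src:                 # anyone may see the source segment
                mask[i][j] = 0.0
            elif i >= n_src and j <= i:   # causal within the target segment
                mask[i][j] = 0.0
    return mask
```

Added to the attention scores before the softmax, each mask restricts exactly the context described for its training mode.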
Step 340, acquiring a target alarm log generated by the monitoring system for the target service system.
Step 350, generating a target word embedding vector matrix corresponding to the target alarm log.
In this step, optionally, the target alarm log may be preprocessed through a UniLM model, where the preprocessing includes semantic feature embedding, sentence feature embedding, position feature embedding, and the like, and a target word embedding vector matrix is obtained according to a processing result.
Step 360, inputting the target word embedding vector matrix into the pre-trained text description model.
Step 370, outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
According to the technical scheme provided by this embodiment of the invention, an alarm text training set corresponding to the monitoring system is acquired; a word embedding vector matrix corresponding to each alarm text in the training set is generated through the UniLM model; the word embedding vector matrices are input in sequence into a plurality of Transformer network layers, which are trained to obtain the text description model; a target alarm log generated by the monitoring system for the target service system is acquired; a target word embedding vector matrix corresponding to the target alarm log is generated and input into the pre-trained text description model; and the target alarm description text matched with the target alarm log is output through the model.
Fig. 4 is a schematic structural diagram of an apparatus for generating an alarm description text according to a fourth embodiment of the present invention, and as shown in fig. 4, the apparatus includes: a log acquisition module 410, a log input module 420, and a text output module 430.
The log obtaining module 410 is configured to obtain a target alarm log generated by the monitoring system for the target service system;
a log input module 420, configured to input the target alarm log into a pre-trained text description model;
the text description model is obtained by training a unified pre-training language model UniLM through an alarm text training set; the UniLM model comprises a plurality of Transformer network layers;
the text output module 430 is used for outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
According to the technical scheme provided by the embodiment of the invention, a target alarm log generated by a monitoring system for a target service system is acquired, the target alarm log is input into a pre-trained text description model, and a target alarm description text matched with the target alarm log is output through the text description model, the target alarm description text comprising the alarm level and the fault reason corresponding to the target alarm log. This technical means can save the labor cost and time cost consumed in the alarm log processing process and improve the repair efficiency of the service system.
On the basis of the above embodiment, the apparatus further includes:
the system comprises a training set acquisition module, a monitoring system acquisition module and a warning text processing module, wherein the training set acquisition module is used for acquiring a warning text training set corresponding to the monitoring system, and the warning text training set comprises a plurality of warning texts;
and the model training module is used for performing iterative training on the UniLM model by using the alarm text training set to obtain the text description model.
The training set acquisition module comprises:
the historical log acquiring unit is used for acquiring a plurality of historical alarm logs generated by the monitoring system aiming at a plurality of service systems;
the extended text acquisition unit is used for acquiring a plurality of extended alarm texts from a preset open source data warehouse;
a training set generating unit, configured to generate the alarm text training set according to the multiple historical alarm logs and the multiple extended alarm texts;
a field extraction unit, configured to extract alarm field information from each historical alarm log according to a preset alarm field;
the field processing unit is used for formatting each alarm field information to obtain a plurality of historical alarm texts;
and the text processing unit is used for generating the alarm text training set according to the plurality of historical alarm texts and the plurality of extended alarm texts.
The model training module comprises:
the matrix generation unit is used for generating, through the UniLM model, a word embedding vector matrix corresponding to each alarm text in the alarm text training set;
the matrix input unit is used for sequentially inputting the word embedding vector matrix corresponding to each alarm text into a plurality of Transformer network layers, and training the plurality of Transformer network layers to obtain the text description model;
the text dividing unit is used for dividing the plurality of alarm texts into a first text set, a second text set and a third text set according to a preset weight proportion;
a unidirectional training unit, configured to perform unidirectional training on the plurality of Transformer network layers by using the first text set;
a bidirectional training unit, configured to perform bidirectional training on the plurality of Transformer network layers by using the second text set;
and the sequence training unit is used for performing sequence training on the plurality of Transformer network layers by using the third text set.
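The three training units above correspond to the three self-attention masking patterns of the UniLM model: unidirectional (left-to-right) language modeling, bidirectional language modeling, and sequence-to-sequence language modeling. A sketch of the three masks and of the weight-ratio split follows; the mask conventions are those of UniLM, while the 1:1:1 ratio is only an assumed example of the "preset weight proportion".

```python
# Sketch of UniLM's three self-attention masks (1 = may attend, 0 = masked),
# plus an assumed weight-ratio split of the alarm texts into three sets.
import numpy as np

def unidirectional_mask(n: int) -> np.ndarray:
    # Each token attends only to itself and tokens to its left.
    return np.tril(np.ones((n, n), dtype=int))

def bidirectional_mask(n: int) -> np.ndarray:
    # Every token attends to every token.
    return np.ones((n, n), dtype=int)

def seq2seq_mask(n_src: int, n_tgt: int) -> np.ndarray:
    # Source tokens attend bidirectionally within the source; target tokens
    # attend to the whole source and left-to-right within the target.
    n = n_src + n_tgt
    m = np.zeros((n, n), dtype=int)
    m[:, :n_src] = 1                                        # everyone sees the source
    m[n_src:, n_src:] = np.tril(np.ones((n_tgt, n_tgt), dtype=int))
    return m

def split_by_ratio(texts: list, ratio=(1, 1, 1)):
    # Assumed preset weight proportion: divide the alarm texts into three sets.
    total, n = sum(ratio), len(texts)
    a = n * ratio[0] // total
    b = a + n * ratio[1] // total
    return texts[:a], texts[a:b], texts[b:]

print(seq2seq_mask(2, 2))
```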
The log input module 420 includes:
the alarm log processing unit is used for generating a target word embedding vector matrix corresponding to the target alarm log, and inputting the target word embedding vector matrix into the pre-trained text description model.
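The embedding step performed by the alarm log processing unit can be illustrated as follows. The vocabulary, tokenization, and embedding dimension are toy assumptions, not values from the patent; in practice the embedding table would come from the trained UniLM model.

```python
# Illustrative sketch: build a word (token) embedding vector matrix for the
# target alarm log before feeding it to the text description model.
import numpy as np

rng = np.random.default_rng(0)
vocab = {"[UNK]": 0, "cpu": 1, "usage": 2, "high": 3}   # toy vocabulary
embedding_table = rng.normal(size=(len(vocab), 8))       # |V| x d embedding table

def embed(log_text: str) -> np.ndarray:
    # Map each whitespace token to its vocabulary id (unknowns -> [UNK]),
    # then look up one d-dimensional embedding row per token.
    ids = [vocab.get(tok, vocab["[UNK]"]) for tok in log_text.lower().split()]
    return embedding_table[ids]

matrix = embed("CPU usage high")
print(matrix.shape)
```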
The device can execute the methods provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects for executing those methods. For technical details not described in detail in this embodiment, reference may be made to the methods provided in the aforementioned embodiments of the invention.
FIG. 5 illustrates a schematic diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in FIG. 5, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 can perform various suitable actions and processes according to the computer program stored in the ROM 12 or loaded from a storage unit 18 into the RAM 13. The RAM 13 can also store various programs and data necessary for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as the generation of alarm description text.
In some embodiments, the method of generating the alarm description text may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the method of generating an alarm description text described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured by any other suitable means (e.g., by means of firmware) to perform the method of generating the alarm description text.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, and which receives data and instructions from, and transmits data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine; partly on a machine; as a stand-alone software package, partly on a machine and partly on a remote machine; or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service scalability in traditional physical hosts and virtual private server (VPS) services.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for generating an alarm description text is characterized by comprising the following steps:
acquiring a target alarm log generated by a monitoring system aiming at a target service system;
inputting the target alarm log into a pre-trained text description model;
the text description model is obtained by training a unified pre-training language model UniLM through an alarm text training set; the UniLM model comprises a plurality of Transformer network layers;
outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
2. The method of claim 1, before obtaining the target alarm log generated by the monitoring system for the target service system, further comprising:
acquiring an alarm text training set corresponding to a monitoring system, wherein the alarm text training set comprises a plurality of alarm texts;
and performing iterative training on a UniLM model by using the alarm text training set to obtain the text description model.
3. The method of claim 2, wherein obtaining an alert text training set corresponding to a monitoring system comprises:
acquiring a plurality of historical alarm logs generated by a monitoring system aiming at a plurality of service systems;
acquiring a plurality of extended alarm texts from a preset open source data warehouse;
and generating the alarm text training set according to the plurality of historical alarm logs and the plurality of extended alarm texts.
4. The method of claim 3, wherein generating the alarm text training set according to the plurality of historical alarm logs and the plurality of extended alarm texts comprises:
according to preset alarm fields, respectively extracting alarm field information from each historical alarm log;
formatting each alarm field information to obtain a plurality of historical alarm texts;
and generating the alarm text training set according to the plurality of historical alarm texts and the plurality of extended alarm texts.
5. The method of claim 2, wherein iteratively training a UniLM model using the alarm text training set to obtain the text description model comprises:
generating, through the UniLM model, a word embedding vector matrix corresponding to each alarm text in the alarm text training set;
and sequentially inputting the word embedding vector matrix corresponding to each alarm text into a plurality of Transformer network layers, and training the plurality of Transformer network layers to obtain the text description model.
6. The method of claim 5, wherein training the plurality of Transformer network layers to obtain the text description model comprises:
dividing the plurality of alarm texts into a first text set, a second text set and a third text set according to a preset weight proportion;
performing unidirectional training on the plurality of Transformer network layers using the first text set;
performing bidirectional training on the plurality of Transformer network layers using the second text set;
and performing sequence training on the plurality of Transformer network layers using the third text set.
7. The method of claim 1, wherein inputting the target alarm log into a pre-trained textual description model comprises:
generating a target word embedding vector matrix corresponding to the target alarm log;
and inputting the target word embedding vector matrix into the pre-trained text description model.
8. An apparatus for generating an alarm description text, comprising:
the log acquisition module is used for acquiring a target alarm log generated by the monitoring system aiming at the target service system;
the log input module is used for inputting the target alarm log into a pre-trained text description model;
the text description model is obtained by training a unified pre-training language model UniLM through an alarm text training set; the UniLM model comprises a plurality of Transformer network layers;
the text output module is used for outputting a target alarm description text matched with the target alarm log through the text description model; the target alarm description text comprises an alarm level corresponding to the target alarm log and a fault reason.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, and the computer program is executed by the at least one processor to enable the at least one processor to perform the method for generating an alarm description text according to any one of claims 1-7.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions which, when executed, cause a processor to implement the method for generating an alarm description text according to any one of claims 1-7.
CN202211431021.XA 2022-11-15 2022-11-15 Method, device, equipment and medium for generating alarm description text Pending CN115687031A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211431021.XA CN115687031A (en) 2022-11-15 2022-11-15 Method, device, equipment and medium for generating alarm description text

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211431021.XA CN115687031A (en) 2022-11-15 2022-11-15 Method, device, equipment and medium for generating alarm description text

Publications (1)

Publication Number Publication Date
CN115687031A true CN115687031A (en) 2023-02-03

Family

ID=85051134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211431021.XA Pending CN115687031A (en) 2022-11-15 2022-11-15 Method, device, equipment and medium for generating alarm description text

Country Status (1)

Country Link
CN (1) CN115687031A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116089231A (en) * 2023-02-13 2023-05-09 北京优特捷信息技术有限公司 Fault alarm method and device, electronic equipment and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110083826A (en) * 2019-03-21 2019-08-02 昆明理工大学 A kind of old man's bilingual alignment method based on Transformer model
WO2019210820A1 (en) * 2018-05-03 2019-11-07 华为技术有限公司 Information output method and apparatus
CN110765270A (en) * 2019-11-04 2020-02-07 苏州思必驰信息科技有限公司 Training method and system of text classification model for spoken language interaction
CN111639163A (en) * 2020-04-29 2020-09-08 深圳壹账通智能科技有限公司 Problem generation model training method, problem generation method and related equipment
CN111723547A (en) * 2020-05-25 2020-09-29 河海大学 Text automatic summarization method based on pre-training language model
CN111783459A (en) * 2020-05-08 2020-10-16 昆明理工大学 Laos named entity recognition method based on improved transform + CRF
CN113094200A (en) * 2021-06-07 2021-07-09 腾讯科技(深圳)有限公司 Application program fault prediction method and device
CN113723115A (en) * 2021-09-30 2021-11-30 平安科技(深圳)有限公司 Open domain question-answer prediction method based on pre-training model and related equipment
CN113821408A (en) * 2021-09-23 2021-12-21 中国建设银行股份有限公司 Server alarm processing method and related equipment
CN114528845A (en) * 2022-02-14 2022-05-24 中国工商银行股份有限公司 Abnormal log analysis method and device and electronic equipment
CN114968633A (en) * 2022-04-14 2022-08-30 阿里巴巴(中国)有限公司 Abnormal log detection method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
岳增营; 叶霞; 刘睿珩: "A survey of pre-training techniques based on language models" (基于语言模型的预训练技术研究综述), 《中文信息学报》 (Journal of Chinese Information Processing), vol. 35, no. 9, pages 2 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116089231A (en) * 2023-02-13 2023-05-09 北京优特捷信息技术有限公司 Fault alarm method and device, electronic equipment and storage medium
CN116089231B (en) * 2023-02-13 2023-09-15 北京优特捷信息技术有限公司 Fault alarm method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN113239705B (en) Pre-training method and device of semantic representation model, electronic equipment and storage medium
CN113792154B (en) Method and device for determining fault association relationship, electronic equipment and storage medium
CN113722493A (en) Data processing method, device, storage medium and program product for text classification
CN113986864A (en) Log data processing method and device, electronic equipment and storage medium
CN116089231B (en) Fault alarm method and device, electronic equipment and storage medium
CN113407610A (en) Information extraction method and device, electronic equipment and readable storage medium
CN115454706A (en) System abnormity determining method and device, electronic equipment and storage medium
CN116307672A (en) Fault diagnosis method, device, electronic equipment and medium
CN112559885A (en) Method and device for determining training model of map interest point and electronic equipment
CN115293149A (en) Entity relationship identification method, device, equipment and storage medium
JP2023025126A (en) Training method and apparatus for deep learning model, text data processing method and apparatus, electronic device, storage medium, and computer program
CN115687031A (en) Method, device, equipment and medium for generating alarm description text
CN113806522A (en) Abstract generation method, device, equipment and storage medium
CN113190746A (en) Recommendation model evaluation method and device and electronic equipment
CN112906368A (en) Industry text increment method, related device and computer program product
US20230185646A1 (en) Method for early warning of failure, electronic device and non-transitory computer-readable storage medium
CN116755974A (en) Cloud computing platform operation and maintenance method and device, electronic equipment and storage medium
US20230052623A1 (en) Word mining method and apparatus, electronic device and readable storage medium
CN115618234A (en) Model training method, device, equipment and storage medium
CN114254028A (en) Event attribute extraction method and device, electronic equipment and storage medium
CN114202309A (en) Method for determining matching parameters of user and enterprise, electronic device and program product
CN114254650A (en) Information processing method, device, equipment and medium
CN114330718A (en) Method and device for extracting causal relationship and electronic equipment
CN113051926A (en) Text extraction method, equipment and storage medium
CN113360346B (en) Method and device for training model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230203
