US11443168B2 - Log analysis system employing long short-term memory recurrent neural networks - Google Patents
- Publication number
- US11443168B2 (application US16/817,799)
- Authority
- US
- United States
- Prior art keywords
- log
- message
- vectors
- sequence
- processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G06N3/0454—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Responding to the occurrence of a fault, e.g. fault tolerance
- G06F11/0703—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
- G06F11/0706—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation the processing taking place on a specific hardware platform or in a specific software environment
- G06F11/0727—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation the processing taking place on a specific hardware platform or in a specific software environment in a storage system, e.g. in a DASD or network based storage system
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Responding to the occurrence of a fault, e.g. fault tolerance
- G06F11/0703—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
- G06F11/0751—Error or fault detection not based on redundancy
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Responding to the occurrence of a fault, e.g. fault tolerance
- G06F11/0703—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
- G06F11/0766—Error or fault reporting or storing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3466—Performance evaluation by tracing or monitoring
- G06F11/3476—Data logging
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
- G06F18/2137—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on criteria of topology preservation, e.g. multidimensional scaling or self-organising maps
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/2433—Single-class perspective, e.g. one-against-all classification; Novelty detection; Outlier detection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/30—Arrangements for executing machine instructions, e.g. instruction decode
- G06F9/30003—Arrangements for executing specific machine instructions
- G06F9/30007—Arrangements for executing specific machine instructions to perform operations on data operands
- G06F9/30036—Instructions to perform operations on packed data, e.g. vector, tile or matrix operations
-
- G06K9/6251—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0442—Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1408—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
- H04L63/1425—Traffic logging, e.g. anomaly detection
Definitions
- the present invention is related to the field of data storage, and in particular to techniques for automated analysis of operational logs from a data storage system or similar log source.
- a disclosed method includes vectorizing log messages of the system logs to generate corresponding log-message vectors; applying long short-term memory (LSTM) neural network processing to the log-message vectors to generate an LSTM output sequence representing a production flow of the processes; and applying second-level neural network processing to a combination of the LSTM output sequence and a training sequence to generate an analysis sequence containing a representation of anomalies in the production flow of the processes, the training sequence generated from a non-anomalous training flow of the processes.
- an anomaly report is generated and provided to a report consumer for taking further action with respect to the anomalies represented in the analysis sequence.
- FIG. 1 is a block diagram of a data processing system including a log analyzer
- FIG. 2 is a flow diagram of log analysis processing
- FIGS. 3-5 are schematic diagrams of recurrent neural networks (RNNs), including a long short-term memory (LSTM) variant shown in FIG. 5; and
- FIG. 6 is a simplified plot of vectorized log messages exhibiting clustering due to relatedness by source process.
- a disclosed technique is a solution that can be incorporated into a data storage system (or other intelligent system) that employs aspects of natural language processing (NLP) and deep learning technologies.
- the log analyzer is an artificially intelligent feature, amalgamating NLP and deep learning technologies to address the above needs.
- the system has the capability to narrow down to a specific problem area using the system logs, reducing manual engineering effort, which can be a large savings in time and effort for both customers and product personnel.
- the technique employs a technology called word embedding, which is the use of a learned representation for text where words that have the same meaning have a similar representation.
- Word embeddings are a class of techniques where individual words are represented as real-valued vectors in a predefined vector space. Each word is represented by a real-valued vector, often tens or hundreds of dimensions. The distributed representation is learned based on the usage of words. This allows words that are used in similar ways to result in having similar representations, naturally capturing their meaning.
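As an illustration of this idea (a minimal sketch, not the embedding method used in the patent), the toy example below derives small dense word vectors from a co-occurrence matrix of hypothetical log messages and checks that words used in similar contexts end up with similar vectors; the corpus and the SVD-based factorization are illustrative assumptions:

```python
import numpy as np

# Hypothetical toy corpus of cleaned log messages (not from the patent).
corpus = [
    "filesystem object received io",
    "filesystem object processing io",
    "filesystem object completed io",
    "snapshot create started",
    "snapshot create completed",
]

# Build a vocabulary and a word-word co-occurrence matrix
# (two words co-occur if they appear in the same message).
vocab = sorted({w for msg in corpus for w in msg.split()})
index = {w: i for i, w in enumerate(vocab)}
cooc = np.zeros((len(vocab), len(vocab)))
for msg in corpus:
    words = msg.split()
    for a in words:
        for b in words:
            if a != b:
                cooc[index[a], index[b]] += 1.0

# Factor the co-occurrence matrix with SVD; the scaled left singular
# vectors give each word a small dense embedding.
U, S, _ = np.linalg.svd(cooc)
embeddings = U[:, :3] * S[:3]          # 3-dimensional word vectors

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

# Words used in similar contexts get similar vectors: "received" and
# "processing" both co-occur with filesystem/object/io, "snapshot" does not.
sim_related = cosine(embeddings[index["received"]], embeddings[index["processing"]])
sim_unrelated = cosine(embeddings[index["received"]], embeddings[index["snapshot"]])
print(sim_related > sim_unrelated)
```

In practice a learned embedding (e.g., a trained word2vec-style model) would replace the SVD step, but the geometric property exploited later in the document is the same.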
- the technique also employs recurrent neural networks (RNNs), which are a class of artificial neural networks where connections between nodes form a directed graph along a temporal sequence. This allows an RNN to exhibit temporal dynamic behavior. Unlike feedforward neural networks, RNNs can use their internal state (memory) to process sequences of inputs.
- FIG. 1 is a block diagram of a data processing system having a first system 10 shown as a “log source”, i.e., a source of operational logs 12 as briefly discussed above, a log analyzer 14 , and a management (MGMT) system 16 .
- the log source 10 is a data storage system (DSS) that provides data storage services to remote host computers (HOSTS), but it will be appreciated that the disclosed technique may be used with other types of log sources and in other application domains.
- the log analyzer 14 generates one or more anomaly reports (ANOM RPT) 18 based on the logs 12 , and provides the report(s) 18 to the management system 16 .
- the management system 16 may perform control functions and generate corresponding control signals (CNTL) 20 for the log source 10 and/or log analyzer 14 , and it may also have an interface to a local or remote management user (MGMT USER). As explained below, in alternative embodiments an anomaly report(s) 18 may be used in other ways.
- the remaining description refers to a single anomaly report 18 .
- a collection of such concurrent reports may be viewed as a single logical report 18 .
- the log source 10 , log analyzer 14 , and management system 16 are all computerized systems having hardware structures as generally known in the art.
- the log analyzer 14 and management system 16 may be realized as computer servers with specialized programming to provide functionality as described herein.
- the log source 10 may include purpose-built hardware including a specialized storage processing complex, an array of data storage devices, and specialized interface circuitry for high-speed, storage-oriented interconnection with the hosts.
- the log source 10 may be realized as a specialized computerized device, specifically one that performs operational logging and makes the logs 12 available to the log analyzer 14 .
- the logs 12 contain streams of log messages pertaining to operations of the log source 10 .
- One aspect of the log analyzer 14 is an ability to segregate log messages of different processes being performed concurrently at the log source 10, as reflected in the log messages and described more below. Thus, in general, the log source 10 may be a complicated, modern computerized system having very complex, multi-threaded functionality (such as the example of a DSS).
- the anomaly report 18 identifies anomalies (deviations from normal operation) on a per-process basis. Illustrative examples are provided below.
- the management system 16 may use the anomaly report 18 to take further action with respect to the reported anomalies, such as alerting a system administrator, invoking an automated process for deeper analysis or corrective re-configuration (using control signals 20), etc. Additionally or instead, the report may be sent to engineers who can more quickly diagnose problems in operational code, with this whole process taking considerably less time than having the engineers go through a large volume of logs just to find the issue.
- FIG. 2 provides a flow diagram of operation of the log analyzer 14 at a high level.
- the illustrated processing may be initiated automatically or manually (e.g., invocation by a management user), and it may be performed periodically or on demand, e.g., when an administrator or automated supervisor has a particular reason to perform log analysis.
- a core aspect is a multiple-level vectorization of log messages, followed by a certain type of neural-network processing using so-called Long Short-Term Memory (LSTM) networks. This processing provides certain benefits as described more below.
- at a first step 30, the log analyzer 14 first cleans the logs 12 to remove noise words such as "the", "of", etc., and then performs a first-level (word-level) tokenization and vectorization of the remaining words in the log messages contained in the logs 12.
- the processing at 30 uses word embedding as described above.
- the output of step 30 is an alternative representation of the logs in which each log message is represented as a respective sequence of word vectors produced by the cleaning and word embedding.
- a log message "Filesystem is degrading" might be processed into two word vectors V1, V2, where V1 represents "filesystem" and V2 represents "degrading".
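A minimal sketch of this cleaning and word-level tokenization step, assuming an illustrative stop-word list (the patent does not enumerate its noise words):

```python
# Illustrative stop-word list; the actual noise words are an assumption.
STOP_WORDS = {"the", "of", "is", "a", "an", "has", "to"}

def tokenize(message: str) -> list[str]:
    """Lower-case, split, and drop noise words from a log message."""
    return [w for w in message.lower().split() if w not in STOP_WORDS]

# Assign each surviving word a token id; a word-embedding table would
# then map each id to a real-valued vector (the V1, V2, ... in the text).
vocab: dict[str, int] = {}

def word_ids(message: str) -> list[int]:
    return [vocab.setdefault(w, len(vocab)) for w in tokenize(message)]

print(tokenize("Filesystem is degrading"))   # ['filesystem', 'degrading']
print(word_ids("Filesystem is degrading"))   # [0, 1]
```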
- a second-level (message-level) vectorization of the sequence of word vectors of each log message then generates a respective log-message vector for each distinct log message.
- This is essentially a set of representations of the log messages in a multi-dimensional vector space. Similar to the word-level vectorization, the messages are vectorized into respective vectors M1, M2, . . . in this multi-dimensional log-message space.
- One feature of this vectorization is that log messages that are apparently related, such as those belonging to the same process flow, tend to be clustered nearer to each other than to messages of other process flows. This clustering is exploited to extract distinct sequences of vectorized messages for the respective process flows. This is performed by cluster processing 34, which produces the sequences shown as CLUS 1, . . . , CLUS n. An example of this clustering is provided below for illustration.
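The message-level vectorization and clustering can be sketched as follows; the 2-D word vectors, the mean-pooling, and the greedy distance-threshold grouping are all illustrative assumptions, not the patented cluster processing 34:

```python
import numpy as np

# Illustrative 2-D word embeddings (assumed, not learned here).
word_vecs = {
    "filesystem": np.array([1.0, 0.0]),
    "io":         np.array([0.9, 0.1]),
    "snapshot":   np.array([0.0, 1.0]),
    "delete":     np.array([0.1, 0.9]),
}

def message_vector(words):
    """Message-level vector: mean of the message's word vectors."""
    return np.mean([word_vecs[w] for w in words], axis=0)

def cluster(vectors, threshold=0.5):
    """Greedily assign each vector to the first cluster whose centroid
    lies within the distance threshold, else start a new cluster."""
    centroids, labels = [], []
    for v in vectors:
        for i, c in enumerate(centroids):
            if np.linalg.norm(v - c) < threshold:
                labels.append(i)
                break
        else:
            centroids.append(v)
            labels.append(len(centroids) - 1)
    return labels

messages = [["filesystem", "io"], ["snapshot", "delete"], ["filesystem"]]
labels = cluster([message_vector(m) for m in messages])
print(labels)   # messages 0 and 2 share a cluster; message 1 is separate
```

Each resulting cluster plays the role of one of the per-process sequences CLUS 1, . . . , CLUS n fed to the LSTM stage.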
- Each of the sequences CLUS 1, . . . , CLUS n is then processed in a multi-layer fashion to identify any anomalies that the sequence may contain.
- this processing includes LSTM-based processing (LSTM PROC'G) 36 followed by dense neural network processing (DNN PROC'G) 38 , which generates anomaly indicators (ANOMs) when anomalies are present.
- the DNN processing 38 obtains current or “production” (PROD'N) inputs from the LSTM processing 36 and “happy path” (H.P.) inputs from a model 40 .
- the production inputs represent a production flow of the respective processes that may be anomalous, while the happy path inputs represent a training flow during a separate, non-anomalous execution of the same process, on which the model 40 has been trained.
- the DNN processing 38 operates to predict and report on the amount of deviation in the production flow of log messages from the happy path flow. After the model 40 has been trained on an ideal dataset, the DNN processing 38 is able to detect anomalies (deviations from normal log flow, which may reflect errors of interest) in the sequence of log messages from the production flow.
- the anomaly indicators ANOMs from the DNN processing 38 for the various clusters are provided to a reporting function (or report generator) 42, which converts the anomaly indicators into corresponding anomaly reports in a format suitable for a report consumer.
- the reports are text files describing the anomalies, e.g., extra or missing log messages (actual messages, not message vectors).
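As a simplified stand-in for the trained DNN comparison, a plain sequence diff against the happy-path flow can surface the same kinds of indicators (extra or missing messages); this sketch uses Python's difflib rather than a neural network, and the message texts are hypothetical:

```python
import difflib

# "Happy path" training flow vs. a production flow (illustrative messages).
happy_path = [
    "object received io",
    "object processing io",
    "object completed io",
]
production = [
    "object received io",
    "object aborted io",       # extra (anomalous) message
    "object completed io",     # the "processing" message is missing
]

# Align the two flows and flag deviations as anomaly indicators.
anomalies = []
matcher = difflib.SequenceMatcher(a=happy_path, b=production)
for op, a0, a1, b0, b1 in matcher.get_opcodes():
    if op in ("delete", "replace"):
        anomalies += [("missing", m) for m in happy_path[a0:a1]]
    if op in ("insert", "replace"):
        anomalies += [("extra", m) for m in production[b0:b1]]

print(anomalies)
```

The DNN of the patent learns this comparison from the trained model 40 rather than performing an exact diff, which lets it tolerate benign variation in the production flow.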
- utilities like “cscope” can be used to identify the code path corresponding to the log messages.
- FIGS. 3-5 illustrate certain characteristics of recurrent neural networks (RNNs) and LSTMs in particular.
- FIG. 3 illustrates basic RNN structure, which is a cyclical repetition of a function A on a time series of inputs x_t to generate a time series of outputs h_t, with each stage also receiving the output of the preceding stage as a second input.
- FIG. 3 shows a compact representation at left and an “unrolled” representation at right.
- FIG. 4 shows the basic per-stage processing, which in this example is a hyperbolic tangent (tanh) function of the current input x_t and the preceding-stage output h_(t-1).
- FIG. 5 shows a single stage 50 of an LSTM, which is a special kind of RNN that is capable of learning long-term dependencies, incorporating features to mitigate the problem of vanishing gradient and thus increasing the accuracy of the output.
- an LSTM has the form of a chain of repeating modules of neural network, but the repeating module has a more specific structure. Instead of having a single neural network path or operation (such as a single tanh), there are four paths/operations, and they interact in a special way.
- an LSTM includes a cell, an input gate, an output gate and a “forget” gate.
- the cell remembers values over potentially long time periods, and the three gates regulate the flow of information into and out of the cell. This structure helps preserve the error that can be backpropagated through time and layers. By maintaining a more constant error, LSTMs allow recurrent nets to continue to learn over many time steps (e.g., over 1000), thereby opening a channel to link causes and effects remotely. LSTMs contain information outside the normal flow of the recurrent network in a gated cell. Information can be stored in, written to, or read from the cell, much like data in a computer's memory. The cell makes decisions about what to store, and when to allow reads, writes, and erasures, via gates that open and close. Unlike the digital storage on computers, however, these gates are analog, implemented with element-wise multiplication by sigmoids (σ), whose outputs all lie in the range 0-1. Analog has the advantage over digital of being differentiable, and therefore suitable for backpropagation.
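A single LSTM stage with its forget, input, and output gates can be sketched in a few lines; the weights here are random and untrained, for illustration only (a real network would learn them from the log-message vector sequences):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM stage: forget, input, and output gates regulate the cell."""
    z = W @ np.concatenate([x_t, h_prev]) + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)   # analog gates in (0, 1)
    g = np.tanh(g)                                 # candidate cell values
    c_t = f * c_prev + i * g                       # cell state: long-term memory
    h_t = o * np.tanh(c_t)                         # gated output
    return h_t, c_t

# Run a few time steps with random inputs and weights.
d_in, d_hid = 4, 8
W = rng.normal(size=(4 * d_hid, d_in + d_hid)) * 0.1
b = np.zeros(4 * d_hid)
h = np.zeros(d_hid)
c = np.zeros(d_hid)
for t in range(5):
    x = rng.normal(size=d_in)
    h, c = lstm_step(x, h, c, W, b)
print(h.shape)
```

The additive update of c_t (rather than repeated multiplication) is what keeps the backpropagated error from vanishing over long sequences.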
- FIG. 6 shows a representation of vectorized log messages in which the above-mentioned clustering is apparent (clustering indicated by circles).
- each line is a log message generated by a respective process of the log source 10 (in this case a DSS) at a respective time, with time progressing downward in this sequence.
- This example relates to logging of events in the processing of input/output (IO) commands which are directed to objects of a file system.
- the messages are tagged with numbers 1-8 to correlate them with the vectorized representation in FIG. 6 .
- the filesystem object 0x01 has received an IO. - - - 1
- the filesystem object 0x01 is processing IO. - - - 2
- the filesystem object 0x01 has completed processing the IO. - - - 5
- the filesystem object 0x02 is processing delete request. - - - 6
- the first step is to tokenize the individual words in the log messages after cleaning them for commonly occurring words.
- This array of word tokens is then tokenized again at the message level, essentially tokenizing each whole log message using only the relevant word tokens.
- These message tokens are then represented in the form of multi-dimensional space vectors using the word embedding techniques as described above.
- similar log messages are clustered near each other in the multi-dimensional space.
- four clusters appear: log messages 1, 2, 5, 8 in one cluster group (IO for 0x01), messages 3, 6 in another (IO for 0x02), and messages 4 (abort) and 7 (failover) in respective single-element “clusters”.
- the cluster processing 34 intelligently groups log messages of the same process flows together and separates them from other flows, forming a solid data set for training and analysis using the LSTMs.
- the log analyzer can be realized as a standalone system or tool such as described above, e.g., as a specialized server. In alternative embodiments, it may be integrated with the log source, and/or with a consumer of the anomaly reports such as a management system. In another aspect, the anomaly reports may alternatively be provided to a higher-level automated tool or process such as an analytics engine, classification engine, etc. for additional processing and use.
Abstract
Description
Claims (8)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/817,799 US11443168B2 (en) | 2020-03-13 | 2020-03-13 | Log analysis system employing long short-term memory recurrent neural net works |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/817,799 US11443168B2 (en) | 2020-03-13 | 2020-03-13 | Log analysis system employing long short-term memory recurrent neural net works |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210287068A1 US20210287068A1 (en) | 2021-09-16 |
| US11443168B2 true US11443168B2 (en) | 2022-09-13 |
Family
ID=77664784
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/817,799 Active 2041-04-21 US11443168B2 (en) | 2020-03-13 | 2020-03-13 | Log analysis system employing long short-term memory recurrent neural net works |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US11443168B2 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP4446942A3 (en) * | 2023-04-14 | 2024-11-20 | Viavi Solutions Inc. | Determination of dense embedding tensors for log data using blockwise recurrent neural networks |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12086038B2 (en) * | 2021-01-06 | 2024-09-10 | Kyndryl, Inc. | Unsupervised log data anomaly detection |
| CN113890821B (en) * | 2021-09-24 | 2023-11-17 | 绿盟科技集团股份有限公司 | Log association method and device and electronic equipment |
| CN114785606B (en) * | 2022-04-27 | 2024-02-02 | 哈尔滨工业大学 | A log anomaly detection method, electronic device and storage medium based on pre-trained LogXLNet model |
| CN114640548A (en) * | 2022-05-18 | 2022-06-17 | 宁波市镇海区大数据投资发展有限公司 | Network security sensing and early warning method and system based on big data |
| CN115021981B (en) * | 2022-05-18 | 2024-06-18 | 桂林电子科技大学 | A method for intrusion detection and tracing of industrial control systems |
| CN118885362A (en) * | 2024-09-30 | 2024-11-01 | 湖南工商大学 | Log dual-mode anomaly detection method and system based on full semantic information |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9508340B2 (en) | 2014-12-22 | 2016-11-29 | Google Inc. | User specified keyword spotting using long short term memory neural network feature extractor |
| US20170063912A1 (en) * | 2015-08-31 | 2017-03-02 | Splunk Inc. | Event mini-graphs in data intake stage of machine data processing platform |
| US20170262996A1 (en) | 2016-03-11 | 2017-09-14 | Qualcomm Incorporated | Action localization in sequential data with attention proposals from a recurrent network |
| US9830315B1 (en) | 2016-07-13 | 2017-11-28 | Xerox Corporation | Sequence-based structured prediction for semantic parsing |
| US10558750B2 (en) | 2016-11-18 | 2020-02-11 | Salesforce.Com, Inc. | Spatial attention model for image captioning |
| US20200279157A1 (en) | 2017-10-16 | 2020-09-03 | Illumina, Inc. | Deep Learning-Based Techniques for Training Deep Convolutional Neural Networks |
| US20210019209A1 (en) * | 2019-07-15 | 2021-01-21 | Microsoft Technology Licensing, Llc | Health indicator platform for software regression reduction |
| US20210281592A1 (en) * | 2020-03-06 | 2021-09-09 | International Business Machines Corporation | Hybrid Machine Learning to Detect Anomalies |
| US20220036154A1 (en) * | 2020-07-30 | 2022-02-03 | International Business Machines Corporation | Unsupervised multi-dimensional computer-generated log data anomaly detection |
| US20220114437A1 (en) * | 2020-10-14 | 2022-04-14 | Dell Products L.P. | Correlating data center resources in a multi-tenant execution environment using machine learning techniques |
- 2020-03-13: US application US16/817,799 granted as patent US11443168B2 (active)
Also Published As
| Publication number | Publication date |
|---|---|
| US20210287068A1 (en) | 2021-09-16 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| AS | Assignment |
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., TEXAS Free format text: SECURITY AGREEMENT;ASSIGNORS:CREDANT TECHNOLOGIES INC.;DELL INTERNATIONAL L.L.C.;DELL MARKETING L.P.;AND OTHERS;REEL/FRAME:053546/0001 Effective date: 20200409 |
|
| AS | Assignment |
Owner name: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNORS:DELL PRODUCTS L.P.;EMC IP HOLDING COMPANY LLC;REEL/FRAME:052771/0906 Effective date: 20200528 |
|
| AS | Assignment |
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS Free format text: SECURITY INTEREST;ASSIGNORS:DELL PRODUCTS L.P.;EMC CORPORATION;EMC IP HOLDING COMPANY LLC;REEL/FRAME:053311/0169 Effective date: 20200603
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS Free format text: SECURITY INTEREST;ASSIGNORS:DELL PRODUCTS L.P.;EMC IP HOLDING COMPANY LLC;REEL/FRAME:052852/0022 Effective date: 20200603
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS Free format text: SECURITY INTEREST;ASSIGNORS:DELL PRODUCTS L.P.;EMC IP HOLDING COMPANY LLC;REEL/FRAME:052851/0917 Effective date: 20200603
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS Free format text: SECURITY INTEREST;ASSIGNORS:DELL PRODUCTS L.P.;EMC IP HOLDING COMPANY LLC;THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT;REEL/FRAME:052851/0081 Effective date: 20200603 |
|
| AS | Assignment |
Owner name: EMC IP HOLDING COMPANY LLC, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SRINIVAS, VIVEK;JINDAL, BHAVNA;REEL/FRAME:052864/0701 Effective date: 20200316 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| AS | Assignment |
Owner name: EMC IP HOLDING COMPANY LLC, TEXAS Free format text: RELEASE OF SECURITY INTEREST AT REEL 052771 FRAME 0906;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058001/0298 Effective date: 20211101
Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST AT REEL 052771 FRAME 0906;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:058001/0298 Effective date: 20211101 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| AS | Assignment |
Owner name: EMC IP HOLDING COMPANY LLC, TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053311/0169);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0742 Effective date: 20220329
Owner name: EMC CORPORATION, MASSACHUSETTS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053311/0169);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0742 Effective date: 20220329
Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053311/0169);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060438/0742 Effective date: 20220329
Owner name: EMC IP HOLDING COMPANY LLC, TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052851/0917);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060436/0509 Effective date: 20220329
Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052851/0917);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060436/0509 Effective date: 20220329
Owner name: EMC IP HOLDING COMPANY LLC, TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052851/0081);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060436/0441 Effective date: 20220329
Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052851/0081);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060436/0441 Effective date: 20220329
Owner name: EMC IP HOLDING COMPANY LLC, TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052852/0022);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060436/0582 Effective date: 20220329
Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (052852/0022);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:060436/0582 Effective date: 20220329
Owner name: DELL MARKETING L.P. (ON BEHALF OF ITSELF AND AS SUCCESSOR-IN-INTEREST TO CREDANT TECHNOLOGIES, INC.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053546/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:071642/0001 Effective date: 20220329
Owner name: DELL INTERNATIONAL L.L.C., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053546/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:071642/0001 Effective date: 20220329
Owner name: DELL PRODUCTS L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053546/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:071642/0001 Effective date: 20220329
Owner name: DELL USA L.P., TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053546/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:071642/0001 Effective date: 20220329
Owner name: EMC CORPORATION, MASSACHUSETTS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053546/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:071642/0001 Effective date: 20220329
Owner name: DELL MARKETING CORPORATION (SUCCESSOR-IN-INTEREST TO FORCE10 NETWORKS, INC. AND WYSE TECHNOLOGY L.L.C.), TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053546/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:071642/0001 Effective date: 20220329
Owner name: EMC IP HOLDING COMPANY LLC, TEXAS Free format text: RELEASE OF SECURITY INTEREST IN PATENTS PREVIOUSLY RECORDED AT REEL/FRAME (053546/0001);ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS NOTES COLLATERAL AGENT;REEL/FRAME:071642/0001 Effective date: 20220329 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |