CN114611816A - Latent event prediction method, device, equipment and storage medium

Latent event prediction method, device, equipment and storage medium

Info

Publication number
CN114611816A
Authority
CN
China
Prior art keywords
event
graph
nodes
scene
neural network
Prior art date
Legal status
Granted
Application number
CN202210281274.7A
Other languages
Chinese (zh)
Other versions
CN114611816B (en)
Inventor
姚旭杨
李伟
谷红明
Current Assignee
China Telecom Corp Ltd
Original Assignee
China Telecom Corp Ltd
Priority date
Filing date
Publication date
Application filed by China Telecom Corp Ltd filed Critical China Telecom Corp Ltd
Priority to CN202210281274.7A priority Critical patent/CN114611816B/en
Publication of CN114611816A publication Critical patent/CN114611816A/en
Application granted granted Critical
Publication of CN114611816B publication Critical patent/CN114611816B/en
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/04: Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30: Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33: Querying
    • G06F16/3331: Query processing
    • G06F16/334: Query execution
    • G06F16/3344: Query execution using natural language analysis
    • G06F16/36: Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367: Ontology
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks

Abstract

The disclosure provides a potential event prediction method, device, equipment and storage medium, and relates to the technical field of machine learning and cognitive computing. The method comprises the following steps: acquiring an event graph of a target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to an event in the target event scene; determining a feature vector of the event graph using a graph neural network model; inputting the feature vector of the event graph into a pre-trained neural network model, and predicting missing event nodes in the event graph and the corresponding node connection relations; and determining potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relations. The method and the device can improve the accuracy of predicting events based on event text.

Description

Potential event prediction method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of machine learning and cognitive computing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for predicting a potential event.
Background
In the information age, events that have occurred are often recorded as documents for storage and later used in scenarios such as message distribution or event analysis. However, when events are summarized into text, key information is inevitably lost, omitted, or blurred, which seriously reduces the usefulness of the information in the text.
In the related art, algorithms based on natural language processing are used to predict missing events. Text describing the events is taken as input, and the relations among the events in the text are modeled by relying on the complexity of the model; however, the prediction result depends entirely on how well-formed the input text is, so the reliability of the output is not stable enough.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides a method, an apparatus, an electronic device and a storage medium for predicting a potential event, which at least to some extent overcome the problem in the related art that methods for predicting missing events from event text produce unreliable and inaccurate results.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, there is provided a potential event prediction method, including: acquiring an event graph of a target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to one event in the target event scene; determining a feature vector of the event graph using a graph neural network model; inputting the feature vector of the event graph into a pre-trained neural network model, and predicting the missing event nodes in the event graph and the corresponding node connection relations; and determining potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relations.
In one embodiment of the present disclosure, determining the feature vector of the event graph using a graph neural network model comprises: determining a feature vector of each event node in the event graph by using the graph neural network model; and determining the feature vector of the event graph according to the feature vectors of the event nodes in the event graph.
In an embodiment of the present disclosure, before inputting the feature vector of the event graph into a pre-trained neural network model to predict missing event nodes and corresponding node connection relations in the event graph, the method further includes: acquiring a plurality of complete event graphs, wherein each complete event graph corresponds to an event scene; deleting a proportion of the event nodes and of the node connection relations among the event nodes in each complete event graph to obtain a missing event graph for each complete event graph; and training the neural network model with the plurality of complete event graphs and the corresponding missing event graphs as training data to obtain the trained neural network model.
In one embodiment of the present disclosure, obtaining a plurality of complete event graphs comprises: acquiring a plurality of events under each event scene; generalizing the plurality of events in each event scene to obtain a plurality of event nodes in each event scene; inputting the plurality of events under each event scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes under each event scene; and generating a complete event graph under each event scene according to the plurality of event nodes under each event scene and the corresponding node connection relations.
In an embodiment of the present disclosure, inputting the feature vector of the event graph into a pre-trained neural network model and predicting missing event nodes and corresponding node connection relations in the event graph includes: predicting the missing event nodes in the event graph and the corresponding node connection relations layer by layer, using a multi-layer attention mechanism based on the pre-trained neural network model.
In an embodiment of the present disclosure, predicting missing event nodes and corresponding node connection relations in the event graph layer by layer using a multi-layer attention mechanism based on a pre-trained neural network model includes: inputting the feature vector of the event graph into the pre-trained neural network model, and outputting the missing event node type; inputting the missing event node type into the pre-trained neural network model, and outputting the missing event node content; and inputting the missing event node type and the missing event node content into the pre-trained neural network model, and outputting the missing event nodes in the event graph and the corresponding node connection relations.
In one embodiment of the present disclosure, obtaining the event graph of the target event scene includes: acquiring a plurality of events in the target scene; generalizing the plurality of events in the target scene to obtain a plurality of event nodes in the target scene; inputting the plurality of events in the target scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes in the target scene; and generating the event graph of the target event scene according to the plurality of event nodes in the target scene and the corresponding node connection relations.
According to another aspect of the present disclosure, there is provided a potential event prediction apparatus including: an event graph acquisition module, configured to acquire an event graph of a target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to one event in the target event scene; a feature vector determination module, configured to determine a feature vector of the event graph using a graph neural network model; a neural network input module, configured to input the feature vector of the event graph into a pre-trained neural network model and predict missing event nodes in the event graph and corresponding node connection relations; and a potential event determination module, configured to determine potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relations.
According to still another aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the potential event prediction method described above via execution of the executable instructions.
According to yet another aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the potential event prediction method described above.
According to the potential event prediction method provided by the present disclosure, the event graph of the target event scene is obtained, the feature vector of the event graph is calculated and input into a pre-trained neural network model to obtain the missing event nodes in the event graph and the corresponding node connection relations, and the potential events in the target event scene are determined from those missing event nodes and connection relations. In the embodiments of the present disclosure, an event graph is constructed from event text, and events expressed in natural language are abstracted into event nodes and connection relations, so that potential events can be predicted accurately.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 illustrates a flow chart of a method for potential event prediction in an embodiment of the present disclosure;
FIG. 2 illustrates a feature vector determination flow diagram in an embodiment of the present disclosure;
FIG. 3 illustrates a neural network model training flow diagram in an embodiment of the disclosure;
FIG. 4 illustrates a complete event graph acquisition flow diagram in an embodiment of the disclosure;
FIG. 5 illustrates a neural network model layer-by-layer prediction flow diagram in an embodiment of the disclosure;
FIG. 6 illustrates a neural network model layer-by-layer prediction flow diagram in an embodiment of the disclosure;
FIG. 7 illustrates an event graph acquisition flow diagram for a target event scenario in an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a potential event prediction device in an embodiment of the disclosure;
FIG. 9 is a block diagram of an electronic device according to an embodiment of the disclosure;
FIG. 10 is a schematic diagram of a computer-readable storage medium in an embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The present exemplary embodiment will be described in detail below with reference to the drawings and examples.
First, the embodiments of the present disclosure provide a method for predicting a potential event, which may be performed by any electronic device with computing processing capability.
Fig. 1 shows a flowchart of a potential event prediction method in an embodiment of the present disclosure, and as shown in fig. 1, the potential event prediction method provided in the embodiment of the present disclosure includes the following steps:
S102, acquiring an event graph of the target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to one event in the target event scene.
The target event scene may be any event scene to be predicted, and it includes multiple events and the logical relations between them. Optionally, the target event scene in the embodiments of the present disclosure may be a medical scene, an industrial scene, a financial scene, or the like. The event graph may be a graph formed by a plurality of event nodes and the connection relations between the event nodes, where each event node corresponds to an event and the connection relations between event nodes represent the logical relations between the corresponding events.
And S104, determining a feature vector of the event graph by using the graph neural network model.
It should be noted that the graph neural network model may be a network model obtained by training a Graph Neural Network (GNN). The feature vector of an event graph refers to the feature vector of the matrix formed by all event nodes in that event graph.
And S106, inputting the feature vector of the event graph into a pre-trained neural network model, and predicting the missing event nodes in the event graph and the corresponding node connection relations.
It should be noted that the neural network model may be a model obtained by machine learning training. Optionally, the neural network model in the embodiments of the present disclosure may be a model capable of predicting the missing event nodes in the event graph and the corresponding node connection relations from the feature vector of the input event graph. A missing event node may be a node for an event in the event graph that lacks a textual record.
And S108, determining potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relation.
It should be noted that the potential events may be events missing from the event graph.
In a specific implementation, all events contained in the target event scene are generalized into event nodes, and the event scene is represented as an event graph. Compared with methods that process the text directly, this explicitly represents the event structure through the event graph, reduces the dependence on model complexity, and allows potential events to be predicted accurately.
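For illustration only, the event graph described in S102 can be represented as a set of event nodes plus directed connection relations between them. The following is a minimal Python sketch; the class names, field names, and the sample scenario are assumptions introduced for this example and are not part of the original disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EventNode:
    node_id: int
    event_type: str   # e.g. "fault_alarm" or "field_survey"; the labels are illustrative
    content: str      # generalized description of the event

@dataclass
class EventGraph:
    nodes: List[EventNode] = field(default_factory=list)
    # directed edges (source node_id, target node_id) expressing the logical relation between events
    edges: List[Tuple[int, int]] = field(default_factory=list)

# A tiny device-maintenance scenario: a fault alarm that leads to a field survey.
graph = EventGraph(
    nodes=[
        EventNode(0, "fault_alarm", "temperature alarm raised on a device"),
        EventNode(1, "field_survey", "engineer inspects the device on site"),
    ],
    edges=[(0, 1)],
)
```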
In one embodiment of the present disclosure, as shown in fig. 2, the potential event prediction method provided in the embodiment of the present disclosure may determine the feature vector of the event graph using the graph neural network model by:
s202, determining feature vectors of all event nodes in the event graph by using the graph neural network model.
In a specific implementation, each event node in the event graph is formed into a corresponding matrix, the matrices are input into the graph neural network model, and the graph neural network model outputs one or more feature vectors for each matrix.
And S204, determining the feature vector of the event graph according to the feature vector of each event node in the event graph.
In a specific implementation, an adjacency matrix of the event graph is constructed according to the node connection relations among the event nodes, and feature extraction is performed on the feature vectors of the event nodes and the adjacency matrix through a graph attention network model to obtain the feature vector of the event graph.
In this way, the feature vector of the event graph is determined accurately by computing a feature vector for each event node in the event graph.
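The following PyTorch fragment is a minimal sketch of S202-S204, assuming node feature vectors are already available: a single attention layer restricted by the adjacency matrix is followed by mean pooling to produce one feature vector for the whole event graph. The layer structure, dimensions and pooling choice are illustrative assumptions, not the exact model of the disclosure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGraphAttentionPooling(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_feats: (N, in_dim); adj: (N, N) with 1 where a connection relation exists
        h = self.proj(node_feats)                                   # (N, out_dim)
        n = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1), h.unsqueeze(0).expand(n, n, -1)], dim=-1
        )                                                           # (N, N, 2*out_dim)
        scores = self.attn(pairs).squeeze(-1)                       # (N, N) pairwise scores
        scores = scores.masked_fill(adj == 0, float("-inf"))        # attend only along edges
        alpha = torch.nan_to_num(F.softmax(scores, dim=-1))         # rows without edges become 0
        h_out = alpha @ h                                           # (N, out_dim) aggregated node features
        return h_out.mean(dim=0)                                    # graph-level feature vector

# Usage with random node features for a three-node event graph
layer = SimpleGraphAttentionPooling(in_dim=16, out_dim=32)
node_feats = torch.randn(3, 16)
adj = torch.tensor([[0, 1, 0], [0, 0, 1], [0, 0, 0]], dtype=torch.float)
graph_vec = layer(node_feats, adj)   # shape: (32,)
```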
In an embodiment of the present disclosure, as shown in fig. 3, before inputting the feature vector of the event graph into a pre-trained neural network model to predict missing event nodes and corresponding node connection relations in the event graph, the potential event prediction method provided in the embodiment of the present disclosure further includes the following steps:
s302, a plurality of complete affair maps are obtained, wherein each complete affair map corresponds to an event scene.
It should be noted that a complete event graph may be a directed graph composed of all event nodes contained in one event scene and the node connection relations between those event nodes. The event scene may be any scene with complete textual records, such as a medical scene, an industrial scene, or a financial scene.
S304, deleting a proportion of the event nodes in each complete event graph and of the node connection relations among the event nodes, to obtain a missing event graph for each complete event graph.
S306, training the neural network model with the plurality of complete event graphs and the corresponding missing event graphs as training data, to obtain the trained neural network model.
In a specific implementation, some of the event nodes in the complete event graph and some of the node connection relations among the event nodes are deleted, and the resulting deleted event graph, together with the complete event graph, is used as training data for the neural network. Training the neural network model on complete event graphs paired with partially deleted versions of themselves improves the prediction accuracy of the neural network model.
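A minimal sketch of the training-data construction in S304, assuming the EventGraph structure from the earlier example: a fixed proportion of event nodes, together with the connection relations that touch them, is deleted from a complete event graph to produce the missing event graph. The deletion ratio and the random selection strategy are illustrative assumptions.

```python
import random

def make_missing_graph(complete: EventGraph, drop_ratio: float = 0.2, seed: int = 0) -> EventGraph:
    rng = random.Random(seed)
    n_drop = max(1, int(len(complete.nodes) * drop_ratio))
    dropped = set(rng.sample([n.node_id for n in complete.nodes], n_drop))
    kept_nodes = [n for n in complete.nodes if n.node_id not in dropped]
    kept_edges = [(s, t) for (s, t) in complete.edges
                  if s not in dropped and t not in dropped]
    return EventGraph(nodes=kept_nodes, edges=kept_edges)

# Each pair (make_missing_graph(g), g) then serves as one (input, target) training sample.
```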
In one embodiment of the present disclosure, as shown in fig. 4, the potential event prediction method provided in the embodiment of the present disclosure may obtain a plurality of complete event graphs by the following steps:
s402, acquiring a plurality of events in each event scene.
In a specific implementation, a plurality of events with complete textual records are acquired, for example from a medical scene, an industrial scene, or a financial scene. For example, a fault alarm event, a field survey event, a fault location event, and a fault elimination event are obtained in an industrial scenario.
S404, generalizing the plurality of events in each event scene to obtain a plurality of event nodes in each event scene.
In one embodiment, the fault alarm events, field survey events, fault location events and fault elimination events in an industrial scene are generalized to obtain a plurality of event nodes for each of these event categories. Generalization means abstracting specific, individual events into general events; it may be performed by expanding specific events into representative nodes.
S406, inputting the plurality of events in each event scene into a natural language processing model, and outputting node connection relations among the plurality of event nodes in each event scene.
In one embodiment, the fault alarm events, field survey events, fault location events and fault elimination events in an industrial scene are respectively input into a natural language processing model to obtain the logical relations among the events within each category (fault alarm, field survey, fault location and fault elimination), and these logical relations are used as the connection relations of the event nodes.
And S408, generating a complete event graph under each event scene according to the plurality of event nodes under each event scene and the corresponding node connection relations.
In one embodiment, a complete event graph is generated for each of the fault alarm, field survey, fault location and fault elimination events, according to the plurality of event nodes of each category and the logical relations (corresponding to the connection relations) among the events within that category.
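A minimal sketch of the construction flow of S402-S408, again assuming the EventGraph structure from the earlier example. The helpers generalize_event and extract_relations stand in for the generalization step and the natural language processing model; they are hypothetical placeholders, not components named in the disclosure.

```python
from typing import Callable, List, Tuple

def build_complete_event_graph(
    event_texts: List[str],
    generalize_event: Callable[[str], str],
    extract_relations: Callable[[List[str]], List[Tuple[int, int]]],
) -> EventGraph:
    # 1) generalize every concrete event text into a representative event node
    nodes = [EventNode(i, event_type="unknown", content=generalize_event(t))
             for i, t in enumerate(event_texts)]
    # 2) let the NLP model output the logical relations between the events
    edges = extract_relations(event_texts)
    # 3) assemble nodes and connection relations into a complete event graph
    return EventGraph(nodes=nodes, edges=edges)
```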
In an embodiment of the present disclosure, as shown in fig. 5, when inputting the feature vector of the event graph into the pre-trained neural network model to predict missing event nodes and corresponding node connection relations in the event graph, the potential event prediction method provided in the embodiment of the present disclosure further includes the following step:
S502, predicting the missing event nodes and the corresponding node connection relations in the event graph layer by layer, using a multi-layer attention mechanism based on the pre-trained neural network model.
It should be noted that in the multi-layer attention mechanism the output of the previous attention layer (including the output vector and manually set weight parameters) is used as the input of the next attention layer; in this way, the information of the previous layer guides the prediction of the next layer.
To further illustrate the layer-by-layer prediction using the multi-layer attention mechanism, in an embodiment of the present disclosure, as shown in fig. 6, S502 may specifically include the following steps:
and S602, inputting the feature vector of the event graph into a pre-trained neural network model, and outputting the missing event node type.
It should be noted that the event node type may be an attribute representing the category of the event. For example, in an industrial scenario, the types may include a fault alarm event type, a field survey event type, a fault location event type, and a fault elimination event type.
And S604, inputting the types of the missing event nodes into a pre-trained neural network model, and outputting the content of the missing event nodes.
It should be noted that the event node content may be the specific content of the event represented by the node.
For example, the content of a field survey is closely related to the fault alarm condition, so when predicting a specific field survey event the model can pay more attention to the fault alarm nodes, and the node content of the missing node is thereby obtained through prediction.
And S606, inputting the types of the missing event nodes and the content of the missing event nodes into a pre-trained neural network model, and outputting the missing event nodes in the event graph and the corresponding node connection relation.
For example, suppose the target event scene is a device maintenance record, and the event graph of the target event contains fault alarm events, field survey events, fault location events and fault elimination events, each with a plurality of event nodes. First-layer prediction: the feature vector of the event graph is input into the pre-trained neural network model, and the model infers from the trained event structure (fault alarm event, field survey event, fault location event, fault elimination event) that a field survey event is missing from the event graph, i.e., it determines the first-layer attribute of the missing node: the node type. Second-layer prediction: among the fault alarm, field survey, fault location and fault elimination events, the field survey event is most closely related to the fault alarm event, so when the node type output by the first layer is input into the neural network model, the model assigns a high weight to the fault alarm nodes and outputs the second-layer attribute of the missing node: the node content (corresponding to the node of the field survey event described above). Third-layer prediction: the node type output by the first layer and the node content output by the second layer are input into the neural network model, which outputs the nodes most strongly correlated with the missing node and the event logical relations (corresponding to the connection relations) between the missing node and those related nodes, for example which fault alarm event caused the field survey event and which fault location event it should be connected to.
This method introduces a multi-layer attention structure into potential event prediction: the event text is generalized into event nodes with multiple layers of attribute information, and during the layer-by-layer prediction of the node attributes an attention mechanism conditioned on the previous layer's prediction result is used, which improves the reliability and interpretability of the prediction results.
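The following PyTorch fragment sketches the three-stage, layer-by-layer prediction described above, with each stage's output fed into the next (type, then content, then connection relations). For brevity the attention layers are replaced by simple linear heads; all dimensions, vocabulary sizes and the concatenation scheme are illustrative assumptions rather than the actual network of the disclosure.

```python
import torch
import torch.nn as nn

class LayerwisePredictor(nn.Module):
    def __init__(self, graph_dim: int, n_types: int, content_dim: int, n_nodes: int):
        super().__init__()
        self.type_head = nn.Linear(graph_dim, n_types)                          # layer 1: node type
        self.content_head = nn.Linear(graph_dim + n_types, content_dim)         # layer 2: node content
        self.link_head = nn.Linear(graph_dim + n_types + content_dim, n_nodes)  # layer 3: connections

    def forward(self, graph_vec: torch.Tensor):
        type_logits = self.type_head(graph_vec)                      # which type of node is missing
        h1 = torch.cat([graph_vec, torch.softmax(type_logits, -1)], dim=-1)
        content_vec = self.content_head(h1)                          # content guided by the predicted type
        h2 = torch.cat([h1, content_vec], dim=-1)
        link_logits = self.link_head(h2)                             # which existing nodes to connect to
        return type_logits, content_vec, link_logits

predictor = LayerwisePredictor(graph_dim=32, n_types=4, content_dim=16, n_nodes=10)
type_logits, content_vec, link_logits = predictor(torch.randn(32))
```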
In an embodiment of the present disclosure, as shown in fig. 7, when obtaining the event graph of the target event scene, the potential event prediction method provided in the embodiment of the present disclosure further includes the following steps:
S702, acquiring a plurality of events in a target scene;
S704, generalizing the plurality of events in the target scene to obtain a plurality of event nodes in the target scene;
S706, inputting the plurality of events in the target scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes in the target scene;
and S708, generating an event graph of the target event scene according to the plurality of event nodes in the target scene and the corresponding node connection relations.
In a specific implementation, the event text is generalized into event nodes and the event scene is represented as an event graph. Compared with methods that process the text directly, the event structure can be represented explicitly through the event graph, which reduces the dependence on model complexity and thus allows potential events to be predicted accurately.
Based on the same inventive concept, the embodiment of the present disclosure further provides a potential event prediction device, as described in the following embodiments. Because the principle of the embodiment of the apparatus for solving the problem is similar to that of the embodiment of the method, the embodiment of the apparatus can be implemented by referring to the implementation of the embodiment of the method, and repeated details are not described again.
Fig. 8 is a schematic diagram of a potential event prediction apparatus in an embodiment of the disclosure. As shown in fig. 8, the apparatus includes: an event graph acquisition module 801, a feature vector determination module 802, a neural network input module 803, and a potential event determination module 804.
The event graph acquisition module 801 is configured to acquire an event graph of a target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to one event in the target event scene; the feature vector determination module 802 is configured to determine the feature vector of the event graph using a graph neural network model; the neural network input module 803 is configured to input the feature vector of the event graph into a pre-trained neural network model and predict the missing event nodes in the event graph and the corresponding node connection relations; and the potential event determination module 804 is configured to determine the potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relations.
In an embodiment of the present disclosure, the feature vector determination module 802 is further configured to: determine a feature vector of each event node in the event graph using the graph neural network model; and determine the feature vector of the event graph according to the feature vectors of the event nodes in the event graph.
In an embodiment of the present disclosure, the potential event prediction apparatus further includes a neural network model training module 805, configured to: acquire a plurality of complete event graphs, wherein each complete event graph corresponds to an event scene; delete a proportion of the event nodes in each complete event graph and of the node connection relations among the event nodes to obtain a missing event graph for each complete event graph; and train the neural network model with the plurality of complete event graphs and the corresponding missing event graphs as training data to obtain the trained neural network model.
In an embodiment of the present disclosure, the neural network model training module 805 is further configured to: acquire a plurality of events under each event scene; generalize the plurality of events in each event scene to obtain a plurality of event nodes in each event scene; input the plurality of events under each event scene into a natural language processing model, and output the node connection relations among the plurality of event nodes under each event scene; and generate a complete event graph under each event scene according to the plurality of event nodes under each event scene and the corresponding node connection relations.
In an embodiment of the present disclosure, the neural network input module 803 is further configured to: predict the missing event nodes in the event graph and the corresponding node connection relations layer by layer, using a multi-layer attention mechanism based on the pre-trained neural network model.
In an embodiment of the present disclosure, the neural network input module 803 is further configured to: input the feature vector of the event graph into the pre-trained neural network model and output the missing event node type; input the missing event node type into the pre-trained neural network model and output the missing event node content; and input the missing event node type and the missing event node content into the pre-trained neural network model, and output the missing event nodes in the event graph and the corresponding node connection relations.
In an embodiment of the present disclosure, the event graph acquisition module 801 is further configured to: acquire a plurality of events in the target scene; generalize the plurality of events in the target scene to obtain a plurality of event nodes in the target scene; input the plurality of events in the target scene into a natural language processing model, and output the node connection relations among the plurality of event nodes in the target scene; and generate the event graph of the target event scene according to the plurality of event nodes in the target scene and the corresponding node connection relations.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 900 according to this embodiment of the disclosure is described below with reference to fig. 9. The electronic device 900 shown in fig. 9 is only an example and should not bring any limitations to the functionality or scope of use of the embodiments of the present disclosure.
As shown in fig. 9, the electronic device 900 is embodied in the form of a general purpose computing device. Components of electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one memory unit 920, and a bus 930 that couples various system components including the memory unit 920 and the processing unit 910.
Wherein the storage unit stores program code that is executable by the processing unit 910 to cause the processing unit 910 to perform steps according to various exemplary embodiments of the present disclosure described in the above section "exemplary method" of the present specification.
For example, the processing unit 910 may perform the following steps of the above method embodiments: acquiring an event graph of a target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to one event in the target event scene; determining a feature vector of the event graph using a graph neural network model; inputting the feature vector of the event graph into a pre-trained neural network model, and predicting missing event nodes in the event graph and corresponding node connection relations; and determining potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relations.
In one embodiment, the processing unit 910 may perform the following steps of the above-described method embodiments to determine the feature vector of the event graph using the graph neural network model: determining a feature vector of each event node in the event graph using the graph neural network model; and determining the feature vector of the event graph according to the feature vectors of the event nodes in the event graph.
In one embodiment, before inputting the feature vector of the event graph into the pre-trained neural network model to predict the missing event nodes and the corresponding node connection relations in the event graph, the processing unit 910 may perform the following steps of the method embodiment: acquiring a plurality of complete event graphs, wherein each complete event graph corresponds to an event scene; deleting a proportion of the event nodes in each complete event graph and of the node connection relations among the event nodes to obtain a missing event graph for each complete event graph; and training the neural network model with the plurality of complete event graphs and the corresponding missing event graphs as training data to obtain the trained neural network model.
In one embodiment, the processing unit 910 may perform the following steps of the above method embodiments to obtain a plurality of complete event graphs: acquiring a plurality of events under each event scene; generalizing the plurality of events in each event scene to obtain a plurality of event nodes in each event scene; inputting the plurality of events under each event scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes under each event scene; and generating a complete event graph under each event scene according to the plurality of event nodes under each event scene and the corresponding node connection relations.
In one embodiment, the processing unit 910 may perform the following step of the above method embodiments to predict missing event nodes and corresponding node connection relations in the event graph: predicting the missing event nodes in the event graph and the corresponding node connection relations layer by layer, using a multi-layer attention mechanism based on the pre-trained neural network model.
In one embodiment, the processing unit 910 may perform the following steps of the above method embodiment to predict missing event nodes and corresponding node connection relations in the event graph layer by layer: inputting the feature vector of the event graph into the pre-trained neural network model, and outputting the missing event node type; inputting the missing event node type into the pre-trained neural network model, and outputting the missing event node content; and inputting the missing event node type and the missing event node content into the pre-trained neural network model, and outputting the missing event nodes in the event graph and the corresponding node connection relations.
In one embodiment, the processing unit 910 may perform the following steps of the above method embodiments to obtain the event graph of the target event scene: acquiring a plurality of events in the target scene; generalizing the plurality of events in the target scene to obtain a plurality of event nodes in the target scene; inputting the plurality of events in the target scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes in the target scene; and generating the event graph of the target event scene according to the plurality of event nodes in the target scene and the corresponding node connection relations.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 9201 and/or a cache memory unit 9202, and may further include a read-only memory unit (ROM) 9203.
Storage unit 920 may also include a program/utility 9204 having a set (at least one) of program modules 9205, such program modules 9205 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which or some combination thereof may comprise an implementation of a network environment.
Bus 930 can be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 900 may also communicate with one or more external devices 940 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium, which may be a readable signal medium or a readable storage medium. Fig. 10 is a schematic diagram of a computer-readable storage medium in an embodiment of the disclosure, and as shown in fig. 10, the computer-readable storage medium 1000 has a program product stored thereon, which is capable of implementing the above-mentioned method of the disclosure. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
For example, the program product in the embodiments of the present disclosure, when executed by a processor, implements a method comprising: acquiring an event graph of a target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to one event in the target event scene; determining a feature vector of the event graph using a graph neural network model; inputting the feature vector of the event graph into a pre-trained neural network model, and predicting missing event nodes in the event graph and corresponding node connection relations; and determining potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relations.
In some embodiments, the program product in the embodiments of the present disclosure, when executed by a processor, may further implement a method of: determining a feature vector of each event node in the event graph using the graph neural network model; and determining the feature vector of the event graph according to the feature vectors of the event nodes in the event graph.
In some embodiments, the program product in the embodiments of the present disclosure, when executed by a processor, may further implement a method of: acquiring a plurality of complete event graphs, wherein each complete event graph corresponds to an event scene; deleting a proportion of the event nodes in each complete event graph and of the node connection relations among the event nodes to obtain a missing event graph for each complete event graph; and training the neural network model with the plurality of complete event graphs and the corresponding missing event graphs as training data to obtain the trained neural network model.
In some embodiments, the program product in the embodiments of the present disclosure, when executed by a processor, may further implement a method of: acquiring a plurality of events under each event scene; generalizing the plurality of events in each event scene to obtain a plurality of event nodes in each event scene; inputting the plurality of events under each event scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes under each event scene; and generating a complete event graph under each event scene according to the plurality of event nodes under each event scene and the corresponding node connection relations.
In some embodiments, the program product in the embodiments of the present disclosure, when executed by a processor, may further implement a method of: predicting the missing event nodes in the event graph and the corresponding node connection relations layer by layer, using a multi-layer attention mechanism based on the pre-trained neural network model.
In some embodiments, the program product in the embodiments of the present disclosure, when executed by a processor, may further implement a method of: inputting the feature vector of the event graph into the pre-trained neural network model, and outputting the missing event node type; inputting the missing event node type into the pre-trained neural network model, and outputting the missing event node content; and inputting the missing event node type and the missing event node content into the pre-trained neural network model, and outputting the missing event nodes in the event graph and the corresponding node connection relations.
In some embodiments, the program product in the embodiments of the present disclosure, when executed by a processor, may further implement a method of: acquiring a plurality of events in the target scene; generalizing the plurality of events in the target scene to obtain a plurality of event nodes in the target scene; inputting the plurality of events in the target scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes in the target scene; and generating the event graph of the target event scene according to the plurality of event nodes in the target scene and the corresponding node connection relations.
More specific examples of the computer-readable storage medium in the present disclosure may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may include a propagated data signal with readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Alternatively, program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
In particular implementations, program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. A method for predicting a potential event, comprising:
acquiring an event graph of a target event scene, wherein the event graph comprises: a plurality of event nodes and node connection relations among the event nodes, wherein each event node corresponds to one event in the target event scene;
determining feature vectors of the event graph using a graph neural network model;
inputting the feature vectors of the event graph into a pre-trained neural network model, and predicting the missing event nodes in the event graph and the corresponding node connection relations;
and determining potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relation.
2. The method of predicting potential events according to claim 1, wherein determining feature vectors of the event graph using a graph neural network model comprises:
determining feature vectors of each event node in the event graph by using a graph neural network model;
and determining the characteristic vector of the event graph according to the characteristic vector of each event node in the event graph.
3. The method of predicting potential events according to claim 1, wherein before inputting the feature vectors of the event graph into a pre-trained neural network model to predict missing event nodes and corresponding node connection relationships in the event graph, the method further comprises:
acquiring a plurality of complete event graphs, wherein each complete event graph corresponds to an event scene;
deleting a proportion of the event nodes in each complete event graph and of the node connection relations among the event nodes to obtain a missing event graph for each complete event graph;
and training the neural network model with the plurality of complete event graphs and the corresponding missing event graphs as training data to obtain the trained neural network model.
4. The method of predicting potential events according to claim 3, wherein obtaining a plurality of complete event graphs comprises:
acquiring a plurality of events under each event scene;
generalizing a plurality of events in each event scene to obtain a plurality of event nodes in each event scene;
inputting a plurality of events under each event scene into a natural language processing model, and outputting node connection relations among a plurality of event nodes under each event scene;
and generating a complete affair map under each event scene according to the plurality of event nodes under each event scene and the corresponding node connection relation.
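Claim 4 builds each complete event graph in three moves: generalize the raw events into event nodes, let a natural language processing model propose the connection relations between them, and assemble nodes and relations into a graph. The toy sketch below mirrors those moves; the colon-stripping generalization rule, the keyword-overlap relation extractor, and the example event texts are stand-ins for the trained NLP components a real system would use.

```python
from itertools import combinations

def generalize(event_text: str) -> str:
    """Toy generalization: keep the part before a colon or dash as the event node label."""
    return event_text.split(":")[0].split(" - ")[0].strip().lower()

def extract_relations(events, nodes):
    """Stand-in for the NLP model: connect nodes whose source events share a keyword."""
    relations = []
    for (e1, n1), (e2, n2) in combinations(zip(events, nodes), 2):
        if set(e1.lower().split()) & set(e2.lower().split()):
            relations.append((n1, n2))
    return relations

def build_complete_graph(scene_events):
    nodes = [generalize(e) for e in scene_events]
    edges = extract_relations(scene_events, nodes)
    return {"nodes": sorted(set(nodes)), "edges": edges}

graph = build_complete_graph([
    "link failure: fibre cut on backbone segment B7",
    "traffic reroute: backbone traffic shifted to backup path",
    "congestion alarm: backup path utilisation above threshold",
])
# -> three event nodes connected in a chain via the shared "backbone" / "backup path" keywords
```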
5. The method for predicting a potential event according to claim 1, wherein inputting the feature vector of the event graph into the pre-trained neural network model to predict the missing event nodes and the corresponding node connection relations in the event graph comprises:
predicting the missing event nodes and the corresponding node connection relations in the event graph layer by layer through a multi-layer attention mechanism based on the pre-trained neural network model.
6. The method for predicting a potential event according to claim 5, wherein predicting the missing event nodes and the corresponding node connection relations in the event graph layer by layer through the multi-layer attention mechanism based on the pre-trained neural network model comprises:
inputting the feature vector of the event graph into the pre-trained neural network model, and outputting types of the missing event nodes;
inputting the types of the missing event nodes into the pre-trained neural network model, and outputting content of the missing event nodes; and
inputting the types and the content of the missing event nodes into the pre-trained neural network model, and outputting the missing event nodes in the event graph and the corresponding node connection relations.
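Claims 5 and 6 describe a layer-by-layer decoding: a first attention layer infers the type of the missing node from the graph feature vector, a second infers the node's content conditioned on that type, and a third emits the node together with its connection relations. The PyTorch sketch below is one way such a stack could look at inference time; the dimensions, the number of node types, the single-query formulation, and the argmax-based type selection are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class LayeredAttentionDecoder(nn.Module):
    def __init__(self, dim: int = 64, n_types: int = 8, vocab: int = 1000, heads: int = 4):
        super().__init__()
        self.type_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.content_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.link_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.type_head = nn.Linear(dim, n_types)
        self.type_embed = nn.Embedding(n_types, dim)
        self.content_head = nn.Linear(dim, vocab)
        self.link_head = nn.Linear(dim, 1)   # score: does the missing node connect to node i?

    def forward(self, graph_vec, node_feats):
        # graph_vec: (B, 1, D) feature vector of the event graph
        # node_feats: (B, N, D) per-node feature vectors from the graph neural network
        h1, _ = self.type_attn(graph_vec, node_feats, node_feats)
        node_type = self.type_head(h1).argmax(-1)               # layer 1: missing node type
        q2 = h1 + self.type_embed(node_type)
        h2, _ = self.content_attn(q2, node_feats, node_feats)
        content_logits = self.content_head(h2)                  # layer 2: missing node content
        q3 = h1 + h2
        h3, _ = self.link_attn(q3, node_feats, node_feats)
        link_scores = torch.sigmoid(self.link_head(node_feats + h3))  # layer 3: connections
        return node_type, content_logits, link_scores

decoder = LayeredAttentionDecoder()
g = torch.randn(2, 1, 64)       # batch of 2 event-graph feature vectors
nodes = torch.randn(2, 5, 64)   # 5 event nodes each
t, c, links = decoder(g, nodes)
```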
7. The method for predicting a potential event according to claim 1, wherein acquiring the event graph of the target event scene comprises:
acquiring a plurality of events in the target event scene;
generalizing the plurality of events in the target event scene to obtain a plurality of event nodes in the target event scene;
inputting the plurality of events in the target event scene into a natural language processing model, and outputting the node connection relations among the plurality of event nodes in the target event scene; and
generating the event graph of the target event scene according to the plurality of event nodes in the target event scene and the corresponding node connection relations.
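Claim 7 applies the same construction as claim 4, only to the target scene whose potential events are being sought. Continuing the illustrative helpers sketched above (hypothetical event texts, toy builder), the target-scene graph could be obtained like this before being handed to the claim 1 pipeline:

```python
# Hypothetical target-scene events; build_complete_graph() is the toy builder above.
target_events = [
    "disk alert: storage pool P2 above 90% utilisation",
    "backup job: nightly backup to storage pool P2 delayed",
]
target_graph = build_complete_graph(target_events)
# -> nodes ["backup job", "disk alert"], connected because both events mention pool P2;
#    wrapped as an EventGraph, this becomes the input to predict_potential_events().
```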
8. A potential event prediction device, comprising:
an event graph acquisition module, configured to acquire an event graph of a target event scene, wherein the event graph comprises a plurality of event nodes and node connection relations among the event nodes, and each event node corresponds to an event in the target event scene;
a feature vector determination module, configured to determine a feature vector of the event graph by using a graph neural network model;
a neural network input module, configured to input the feature vector of the event graph into a pre-trained neural network model to predict missing event nodes in the event graph and the corresponding node connection relations; and
a potential event determination module, configured to determine potential events in the target event scene according to the missing event nodes in the event graph and the corresponding node connection relations.
9. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method for predicting a potential event according to any one of claims 1 to 7 by executing the executable instructions.
10. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method for predicting a potential event according to any one of claims 1 to 7.
CN202210281274.7A 2022-03-21 2022-03-21 Potential event prediction method, device, equipment and storage medium Active CN114611816B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210281274.7A CN114611816B (en) 2022-03-21 2022-03-21 Potential event prediction method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210281274.7A CN114611816B (en) 2022-03-21 2022-03-21 Potential event prediction method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114611816A true CN114611816A (en) 2022-06-10
CN114611816B CN114611816B (en) 2024-02-27

Family

ID=81864940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210281274.7A Active CN114611816B (en) 2022-03-21 2022-03-21 Potential event prediction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114611816B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110363449A (en) * 2019-07-25 2019-10-22 中国工商银行股份有限公司 A kind of Risk Identification Method, apparatus and system
CN110688489A (en) * 2019-09-09 2020-01-14 中国电子科技集团公司电子科学研究院 Knowledge graph deduction method and device based on interactive attention and storage medium
CN110968699A (en) * 2019-11-01 2020-04-07 数地科技(北京)有限公司 Logic map construction and early warning method and device based on event recommendation
CN111881219A (en) * 2020-05-19 2020-11-03 杭州中奥科技有限公司 Dynamic knowledge graph completion method and device, electronic equipment and storage medium
US20210012216A1 (en) * 2019-07-10 2021-01-14 International Business Machines Corporation Detecting and predicting object events from images
CN112819164A (en) * 2021-02-02 2021-05-18 京东数科海益信息科技有限公司 Inference method and device of affair map and computer equipment
CN113822494A (en) * 2021-10-19 2021-12-21 平安科技(深圳)有限公司 Risk prediction method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN114611816B (en) 2024-02-27

Similar Documents

Publication Title
CN108491320A (en) Exception analysis method, device, computer equipment and the storage medium of application program
US10635521B2 (en) Conversational problem determination based on bipartite graph
CN111291882A (en) Model conversion method, device, equipment and computer storage medium
CN112003834B (en) Abnormal behavior detection method and device
CN111258832B (en) Interface parameter verification method, device, equipment and medium
CN115034596A (en) Risk conduction prediction method, device, equipment and medium
CN113098715B (en) Information processing method, device, system, medium and computing equipment
CN111126422B (en) Method, device, equipment and medium for establishing industry model and determining industry
US11119892B2 (en) Method, device and computer-readable storage medium for guiding symbolic execution
CN114611816B (en) Potential event prediction method, device, equipment and storage medium
CN110515758A (en) A kind of Fault Locating Method, device, computer equipment and storage medium
CN115964701A (en) Application security detection method and device, storage medium and electronic equipment
CN114500075B (en) User abnormal behavior detection method and device, electronic equipment and storage medium
CN111523681A (en) Global feature importance representation method and device, electronic equipment and storage medium
CN113434193B (en) Root cause change positioning method and device
CN117033318B (en) Method and device for generating data to be tested, storage medium and electronic equipment
CN115600216B (en) Detection method, detection device, detection equipment and storage medium
CN113298636B (en) Risk control method, device and system based on simulation resource application
CN117519591A (en) Data storage method and device, storage medium and electronic equipment
CN116975289A (en) Text attribute-level emotion classification method based on semantic information and related equipment
CN115934501A (en) Application program detection method and device, storage medium and electronic equipment
CN116455619A (en) Risk identification method and device, electronic equipment and storage medium
CN117439748A (en) Network attack detection method, device, equipment and storage medium
CN115329097A (en) Method, device, equipment and storage medium for constructing threat intelligence knowledge graph
CN114385142A (en) Data storage method and device, electronic equipment and storage medium

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant