CN116882408B - Construction method and device of transformer graph model, computer equipment and storage medium


Info

Publication number
CN116882408B
CN116882408B (application CN202311147309.9A)
Authority
CN
China
Prior art keywords
target
initial
cluster
graph model
name data
Prior art date
Legal status
Active
Application number
CN202311147309.9A
Other languages
Chinese (zh)
Other versions
CN116882408A
Inventor
李鹏
黄文琦
戴珍
李轩昂
习伟
侯佳萱
冯勤宇
Current Assignee
Southern Power Grid Digital Grid Research Institute Co Ltd
Original Assignee
Southern Power Grid Digital Grid Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Southern Power Grid Digital Grid Research Institute Co Ltd
Priority to CN202311147309.9A
Publication of CN116882408A
Application granted
Publication of CN116882408B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 - Handling natural language data
    • G06F40/20 - Natural language analysis
    • G06F40/279 - Recognition of textual entities
    • G06F40/289 - Phrasal analysis, e.g. finite state techniques or chunking
    • G06F40/295 - Named entity recognition
    • G06F40/30 - Semantic analysis
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/901 - Indexing; Data structures therefor; Storage structures
    • G06F16/9024 - Graphs; Linked lists
    • G06F16/903 - Querying
    • G06F16/9032 - Query formulation
    • G06F16/906 - Clustering; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Supply And Distribution Of Alternating Current (AREA)

Abstract

The application relates to a method, an apparatus, computer equipment and a storage medium for constructing a transformer graph model. The method comprises the following steps: entity points and relationship edges are first extracted from the equipment data of each transformer and an initial graph model of each transformer is constructed; the attribute data of the entity points in the initial graph models is then clustered, and each initial graph model is processed according to the target clusters obtained by the clustering to obtain a target graph model. With this method, a user does not need to learn each transformer graph model in advance when using the transformer graph models, which greatly improves the convenience of using them.

Description

Construction method and device of transformer graph model, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of automation technologies, and in particular, to a method and an apparatus for constructing a transformer graph model, a computer device, and a storage medium.
Background
The graph model of a transformer plays a vital role in transformer applications; all information related to the transformer can be conveniently queried by using the transformer graph model.
However, with the development of society, the power grid covers an ever wider area and more transformers are needed, yet the names of different transformers, as well as the names of the devices and units related to them, may be inconsistent. As a result, current transformer graph models have many limitations. For example, because the names used in different transformer graph models cannot be unified, a user cannot move smoothly between different transformer graph models and must learn the usage of each one in advance, which greatly reduces working efficiency.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a method, an apparatus, a computer device, and a storage medium for constructing a transformer graph model that can improve the convenience of using transformer graph models.
In a first aspect, the present application provides a method for constructing a transformer graph model. The method comprises the following steps:
extracting entity points and relationship edges from the equipment data of each transformer;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
And processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
In one embodiment, the attribute data includes name data, and clustering the attribute data of the entity points in each initial graph model to obtain a plurality of target clusters, including:
determining word vectors of name data of entity points in each initial graph model;
and clustering the name data according to the word vector of the name data to obtain a plurality of target clusters.
In one embodiment, clustering each name data according to a word vector of each name data to obtain a plurality of target clusters includes:
clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers;
determining the center of a target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster;
judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center;
if yes, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain an initial cluster corresponding to each initial cluster center;
If not, each initial cluster is used as a target cluster for name data.
In one embodiment, according to a plurality of target clusters, processing each initial graph model to obtain a target graph model corresponding to each transformer includes:
extracting keywords from each target cluster;
and replacing the name data of the entity points in each initial graph model by adopting the extracted keywords to obtain target graph models corresponding to the transformers.
In one embodiment, replacing name data of entity points in each initial graph model by using the extracted keywords to obtain target graph models corresponding to each transformer, including:
acquiring at least two pieces of optional semantic information of the extracted keywords;
determining target semantic information from the selectable semantic information according to the scores of the selectable semantic information;
replacing name data of entity points in each initial graph model by adopting the extracted keywords to obtain an intermediate graph model of each transformer;
and removing other optional semantic information except the target semantic information corresponding to the keywords in each intermediate graph model to obtain the target graph model corresponding to each transformer.
In one embodiment, extracting keywords from each target cluster includes:
Taking name data corresponding to the target cluster center of each target cluster as a keyword of each target cluster; or,
and taking the name data closest to the center of the target cluster in each target cluster as the key word of each target cluster.
In one embodiment, the method further comprises:
displaying each target graph model to the operation and maintenance terminal;
and optimizing each target graph model according to the editing operation on each target graph model.
In a second aspect, the application also provides a device for constructing the transformer graph model. The device comprises:
the data extraction module is used for extracting entity points and relationship edges from the equipment data of each transformer;
the model construction module is used for constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
the data clustering module is used for clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
and the target determining module is used for processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor which when executing the computer program performs the steps of:
Extracting entity points and relationship edges from the equipment data of each transformer;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
and processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
In a fourth aspect, the present application also provides a computer-readable storage medium. A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
extracting entity points and relationship edges from the equipment data of each transformer;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
and processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, implements the steps of:
Extracting entity points and relationship edges from the equipment data of each transformer;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
and processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
According to the above method, apparatus, computer equipment and storage medium for constructing a transformer graph model, entity points and relationship edges are first extracted from the equipment data of each transformer and an initial graph model of each transformer is constructed; the attribute data of the entity points in the initial graph models is then clustered, and each initial graph model is processed according to the target clusters obtained by the clustering to obtain the target graph models. Because the target graph models are obtained after clustering the initial graph models, and the clustering process unifies the attribute data of the entity points in the initial graph models, a user does not need to learn each transformer graph model in advance when using the transformer graph models, which greatly improves the convenience of using them.
Drawings
FIG. 1 is an application environment diagram of a method of constructing a transformer graph model in one embodiment;
FIG. 2 is a flow chart of a method of constructing a transformer graph model in one embodiment;
FIG. 3 is a flow diagram of obtaining a target cluster in one embodiment;
FIG. 4 is a flow diagram of obtaining a target graph model in one embodiment;
FIG. 5 is a flow diagram of optimizing a target graph model in one embodiment;
FIG. 6 is a flow chart of a method of constructing a transformer graph model in another embodiment;
FIG. 7 is a block diagram of an apparatus for constructing a transformer graph model in one embodiment;
FIG. 8 is a block diagram of an apparatus for constructing a transformer graph model in another embodiment;
FIG. 9 is a block diagram of an apparatus for constructing a transformer graph model in yet another embodiment;
FIG. 10 is a block diagram of an apparatus for constructing a transformer graph model in still another embodiment;
FIG. 11 is an internal block diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The method for constructing the transformer graph model, which is provided by the embodiment of the application, can be applied to an application environment shown in fig. 1. In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in FIG. 1. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is used for storing data required for the relevant processing. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by a processor, implements the method of constructing a transformer map model as shown in any of the embodiments described below.
In one embodiment, as shown in fig. 2, a method for constructing a transformer graph model is provided, and the method is applied to the computer device in fig. 1 for illustration, and includes the following steps:
S201, extracting entity points and relation edges from equipment data of each transformer.
The equipment data of the transformer comprises attribute data of transformer equipment, and specifically can comprise, but is not limited to, transformer equipment ledger data, transformer equipment provider management data, transformer equipment topological relation data and the like; the transformer graph model is a graph for visually describing relevant information of the transformer; an entity point represents a random variable in the graph model, and a relationship edge represents the dependency relationship between the random variables. For example, the transformer and its manufacturer can be used as entity points, and the relationship between the transformer manufacturer and the transformer can be used as a relationship edge.
Alternatively, the device data of the transformer may be input into an extraction model to extract the entity points and relationship edges of the transformer equipment. Or, the extraction requirements for the entity points and relationship edges can be obtained from the storage system of the server, and the entity points and relationship edges can be extracted from the equipment data of each transformer according to the extraction requirements.
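For illustration only, the following is a minimal sketch of how entity points and relationship edges might be extracted from structured device records using simple rules; the record fields and relation names used here (device_name, manufacturer, substation) are assumptions and not part of the original disclosure.

```python
from typing import Dict, List, Set, Tuple

def extract_points_and_edges(records: List[Dict[str, str]]) -> Tuple[Set[str], Set[Tuple[str, str, str]]]:
    """Extract entity points and relationship edges from transformer device records."""
    entity_points: Set[str] = set()
    relation_edges: Set[Tuple[str, str, str]] = set()   # (head entity, relation, tail entity)
    for rec in records:
        transformer = rec["device_name"]
        entity_points.add(transformer)
        if rec.get("manufacturer"):
            entity_points.add(rec["manufacturer"])
            relation_edges.add((rec["manufacturer"], "manufactures", transformer))
        if rec.get("substation"):
            entity_points.add(rec["substation"])
            relation_edges.add((transformer, "installed_in", rec["substation"]))
    return entity_points, relation_edges
```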
S202, constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges.
Specifically, according to the extracted relation edges, all the entity points are connected, and the construction of the initial graph model corresponding to each transformer is completed.
Alternatively, a graph model building model can be trained in advance according to the equipment data of historical transformers and the historical transformer graph models, and the entity points and relationship edges extracted from the equipment data of the transformers are input into the graph model building model, so that the initial graph model corresponding to each transformer can be obtained.
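As one possible illustration of S202, the sketch below connects the extracted entity points along the relationship edges; the choice of networkx as the graph backend is an assumption made only for this example.

```python
import networkx as nx

def build_initial_graph(entity_points, relation_edges) -> nx.MultiDiGraph:
    """Connect all entity points according to the extracted relationship edges."""
    graph = nx.MultiDiGraph()
    graph.add_nodes_from(entity_points)
    for head, relation, tail in relation_edges:
        graph.add_edge(head, tail, relation=relation)
    return graph
```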
And S203, clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters.
The clustering is to classify attribute data of entity points in each initial graph model. Specifically, a clustering standard uploaded by a user can be obtained, attribute data of entity points in each initial graph model are clustered, and the clustered attribute data of each type of entity points are used as a target cluster, so that a plurality of target clusters are obtained.
S204, processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
Specifically, disambiguation is performed on each target cluster to eliminate its ambiguous portions. Because each target cluster is constructed based on the attribute data of entity points in the initial graph models, after a target cluster is disambiguated, the initial graph model corresponding to that target cluster is adjusted adaptively; that is, the disambiguation is propagated to the initial graph model, and the disambiguated initial graph model is the target graph model corresponding to each transformer.
In the above embodiment, the entity points and relationship edges are extracted from the device data of each transformer and the initial graph model of each transformer is constructed; the attribute data of the entity points in the initial graph models is then clustered, and each initial graph model is processed according to the target clusters obtained after the clustering to obtain the target graph models. Because the target graph models are obtained after clustering the initial graph models, and the clustering process unifies the attribute data of the entity points in the initial graph models, a user does not need to learn each transformer graph model in advance when using the transformer graph models, which greatly improves the convenience of using them.
The above embodiment describes, as a whole, the method for constructing the target graph model from the device data of the transformers, in which the process of acquiring the plurality of target clusters is an important step. Therefore, in the present embodiment, as shown in fig. 3, the process of acquiring the plurality of target clusters is described in detail, and the specific method includes:
s301, determining word vectors of name data of entity points in each initial graph model.
The word vector is a vector converted according to words in natural language, and the more complex the semantics of the words in natural language are, the higher the corresponding word vector dimension is.
Specifically, a word vector model is trained according to a word vector algorithm, the name data of the entity points in each initial graph model is input into the word vector model, and the word vectors of the name data of the entity points in each initial graph model are output.
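A minimal sketch of S301 follows, assuming a Word2Vec-style word vector algorithm (gensim) and character-level tokenisation of the name data; both choices are illustrative assumptions rather than part of the disclosed embodiment.

```python
from typing import Dict, List

import numpy as np
from gensim.models import Word2Vec

def name_word_vectors(name_data: List[str], dim: int = 64) -> Dict[str, np.ndarray]:
    """Train a small word-vector model and return one vector per piece of name data."""
    tokenised = [list(name) for name in name_data]        # split each name into characters
    model = Word2Vec(tokenised, vector_size=dim, min_count=1, epochs=50)
    # Average the character vectors to obtain a single word vector per name.
    return {name: model.wv[list(name)].mean(axis=0) for name in name_data}
```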
S302, clustering the name data according to the word vector of the name data to obtain a plurality of target clusters.
Specifically, the dimension of each word vector may be counted first, and the name data whose word vectors have the same or similar dimensions are grouped together, where each group of name data corresponds to one target cluster.
Alternatively, after the word vector distances between the word vectors are calculated, sorting is performed according to the word vector distances, name data corresponding to the word vectors with the word vector distances lower than the preset distance threshold are classified into a group, and each group of name data corresponds to one target cluster.
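As an illustration of this alternative, the sketch below groups name data whose word-vector distance is below a preset threshold; the greedy single-pass grouping strategy is an assumption, since the embodiment does not prescribe one.

```python
import numpy as np

def group_by_distance(names, vectors, threshold: float):
    """Group name data whose word vectors lie within the preset distance threshold."""
    vectors = np.asarray(vectors)
    clusters = []                                  # each cluster is a list of indices
    for i in range(len(names)):
        for cluster in clusters:
            # Compare against the first member of an existing group.
            if np.linalg.norm(vectors[i] - vectors[cluster[0]]) < threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])                   # start a new target cluster
    return [[names[i] for i in cluster] for cluster in clusters]
```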
Alternatively, the method may also include clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers; determining the center of a target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster; judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center; if yes, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain an initial cluster corresponding to each initial cluster center; if not, each initial cluster is used as a target cluster for name data.
Specifically, according to the number of kinds of name data (for ease of explanation, k kinds of name data are taken as an example in this embodiment, where k is not less than 2), the word vectors of k pieces of name data are randomly selected as the k initial cluster centers. The distance from each word vector to the k initial cluster centers is calculated to obtain k distance values, the k distance values are compared, and the word vector is classified into the initial cluster whose initial cluster center corresponds to the minimum distance; all word vectors are classified by the same method to obtain k initial clusters. Then, according to the word vectors of the name data contained in each initial cluster, the word vectors are connected pairwise to form a polygon, the center point of the polygon is calculated by analytic geometry, and this center point is taken as the target cluster center of that initial cluster. Whether the initial cluster center of any initial cluster is inconsistent with its target cluster center is then judged: if they are all consistent, the k initial clusters are taken as the target clusters; if any is inconsistent, each target cluster center is taken as a new initial cluster center, and the operation of clustering the name data according to the word vectors of the name data to obtain the initial clusters corresponding to the initial cluster centers is executed again, until every target cluster center coincides with the position of its initial cluster center. The process of obtaining the initial clusters corresponding to the initial cluster centers has been elaborated above and is not repeated here.
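The iterative procedure above is essentially k-means clustering over the word vectors of the name data. The sketch below illustrates it, with the arithmetic mean of the member vectors standing in for the polygon center point; this substitution, and testing the stopping condition by numerical coincidence of the centers, are simplifying assumptions.

```python
import numpy as np

def cluster_names(vectors: np.ndarray, k: int, seed: int = 0) -> np.ndarray:
    """Cluster name-data word vectors; returns one cluster label per vector."""
    rng = np.random.default_rng(seed)
    # Randomly select the word vectors of k pieces of name data as initial cluster centers.
    centers = vectors[rng.choice(len(vectors), size=k, replace=False)]
    while True:
        # Distance from every word vector to every initial cluster center.
        dists = np.linalg.norm(vectors[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)           # assign each vector to its nearest center
        # Target cluster center of each initial cluster (empty clusters are not handled here).
        new_centers = np.array([vectors[labels == i].mean(axis=0) for i in range(k)])
        if np.allclose(new_centers, centers):   # all centers coincide: clusters are final
            return labels
        centers = new_centers                   # otherwise re-cluster with the new centers
```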
In the above embodiment, the word vector of the name data of the entity point in each initial graph model is determined first, and then each name data is clustered according to the word vector of each name data to obtain a plurality of target clusters.
The above embodiment describes how to obtain the target clusters. In this embodiment, as shown in fig. 4, how to obtain the target graph model according to the target clusters is described, and the specific method includes:
s401, extracting keywords from each target cluster.
Optionally, name data corresponding to the target cluster center of each target cluster can be used as keywords of each target cluster; or, name data closest to the center of each target cluster is used as a keyword of each target cluster.
Specifically, if the target cluster center of a target cluster coincides with the word vector of any one piece of name data, that name data is used as the keyword of the target cluster. If the target cluster center does not coincide with the word vector of any name data, the name data closest to the target cluster center in each target cluster is used as the keyword of that target cluster; in this case, the distances between the word vectors of all the name data in each target cluster and the target cluster center need to be calculated. The Euclidean distance may be used as the distance between a word vector and the target cluster center; the cosine distance may also be used; or a comprehensive distance may be obtained from the Euclidean distance and the cosine distance in a preset calculation mode and used as the distance between the word vector and the target cluster center. The name data corresponding to the word vector nearest to the target cluster center is then selected as the keyword of the target cluster.
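As an illustration of the distance rules described above, the following sketch selects the keyword of a target cluster using a weighted combination of the Euclidean and cosine distances; the specific weighting is an assumed example of the "preset calculation mode" and is not prescribed by the embodiment.

```python
import numpy as np

def cluster_keyword(names, vectors, center, alpha: float = 0.5) -> str:
    """Pick the name data whose word vector lies nearest to the target cluster center."""
    vectors = np.asarray(vectors)
    euclidean = np.linalg.norm(vectors - center, axis=1)
    cosine = 1.0 - (vectors @ center) / (
        np.linalg.norm(vectors, axis=1) * np.linalg.norm(center) + 1e-12)
    combined = alpha * euclidean + (1.0 - alpha) * cosine   # comprehensive distance
    return names[int(np.argmin(combined))]
```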
And S402, replacing the name data of the entity points in each initial graph model by using the extracted keywords to obtain target graph models corresponding to the transformers.
Specifically, deleting all other name data except the keywords, and taking the extracted keywords as the name data of the entity points in each initial graph model to obtain the target graph model corresponding to each transformer.
Optionally, at least two pieces of optional semantic information of the extracted keywords can be acquired first; determining target semantic information from the selectable semantic information according to the scores of the selectable semantic information; replacing name data of entity points in each initial graph model by adopting the extracted keywords to obtain an intermediate graph model of each transformer; and finally, removing other optional semantic information except the target semantic information corresponding to the keywords in each intermediate graph model to obtain the target graph model corresponding to each transformer.
Specifically, all semantic information of the keyword is analyzed, and the analyzed semantic information is scored according to a preset scoring rule. All semantic information of the keyword is ranked by score, and the semantic information with the highest score is taken as the target semantic information of the keyword. The extracted keyword is used as the name data of the entity points in each initial graph model to obtain an intermediate graph model, and the other semantic information with lower scores (namely, the semantic information other than the target semantic information) is removed from the intermediate graph model to obtain the target graph model corresponding to each transformer.
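For illustration, the sketch below scores the optional semantic information of a keyword, keeps only the highest-scoring (target) semantics, and replaces the clustered name data of the entity points with the keyword; the node attribute keys ("name", "semantics") and the externally supplied scores are assumptions made for this example.

```python
def disambiguate_and_replace(graph, cluster_names, keyword, candidates):
    """candidates: list of (semantic_text, score) pairs for the extracted keyword."""
    # The optional semantic information with the highest score becomes the target semantics.
    target_semantics = max(candidates, key=lambda c: c[1])[0]
    for node, data in graph.nodes(data=True):
        if data.get("name") in cluster_names:
            data["name"] = keyword                # replace the name data with the keyword
            data["semantics"] = target_semantics  # drop the other optional semantics
    return graph
```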
In the above embodiment, the keywords of the target cluster are extracted first, and then the name data of the entity points in each initial graph model are replaced by the keywords, so that the target graph models corresponding to each transformer are obtained.
On the basis of the above embodiment, as shown in fig. 5, in order to make the target graph model better meet the needs of the user, the target graph model may be optimized according to the needs of the user, and the specific method includes:
s501, displaying each target graph model to the operation and maintenance terminal.
Specifically, each target graph model is displayed through a display screen of the operation and maintenance terminal, or names of each target graph model are displayed on the display screen of the operation and maintenance terminal, and after the user clicks, the target graph model clicked by the user is displayed.
S502, optimizing each target graph model according to the editing operation of each target graph model.
Specifically, according to the editing operation of the user on the target graph model in the display screen, the entity points and the relation edges of the target graph model can be optimized, so that the optimization of each target graph model is realized.
Optionally, the user's editing operations on each target graph model can be converted into an optimization requirement. After the server obtains the optimization requirement, it parses the requirements concerning the entity points of the target graph model and those concerning the relationship edges of the target graph model, and optimizes the entity points and relationship edges accordingly, thereby completing the optimization of the target graph model.
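A minimal sketch of S502 is given below, assuming the operation and maintenance terminal submits editing operations as simple dictionaries applied to a networkx graph; the operation vocabulary shown (rename_point, add_edge, remove_edge) is illustrative only.

```python
def apply_edit_operations(graph, edits):
    """Optimise a target graph model according to user editing operations."""
    for op in edits:
        if op["type"] == "rename_point":
            graph.nodes[op["point"]]["name"] = op["new_name"]
        elif op["type"] == "add_edge":
            graph.add_edge(op["head"], op["tail"], relation=op["relation"])
        elif op["type"] == "remove_edge":
            graph.remove_edge(op["head"], op["tail"])
    return graph
```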
In the above embodiment, the target graph model is displayed first, and then each target graph model is optimized according to the editing operation of each target graph model, so that the optimized target transformer graph model can better meet the requirements of users.
In order to more fully demonstrate the scheme, this embodiment provides an alternative way of constructing a transformer graph model, as shown in fig. 6:
s601, extracting entity points and relation edges from equipment data of each transformer.
S602, constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges.
S603, determining word vectors of name data of entity points in each initial graph model.
S604, clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers.
S605, determining the center of the target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster.
S606, judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center, if so, executing S607, otherwise, executing S608.
S607, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain the initial cluster corresponding to each initial cluster center.
S608, each initial cluster is set as a target cluster for the name data.
S609, using name data corresponding to the target cluster center of each target cluster as a keyword of each target cluster; or, name data closest to the center of each target cluster is used as a keyword of each target cluster.
S610, acquiring at least two pieces of optional semantic information of the extracted keywords.
S611, determining target semantic information from the optional semantic information according to the scores of the optional semantic information.
And S612, replacing the name data of the entity points in each initial graph model by using the extracted keywords to obtain an intermediate graph model of each transformer.
And S613, removing other optional semantic information except the target semantic information corresponding to the keywords in each intermediate graph model to obtain the target graph model corresponding to each transformer.
And S614, displaying each target graph model to the operation and maintenance terminal.
S615 optimizes each target graph model according to the editing operation on each target graph model.
The specific process of S601 to S615 may refer to the description of the above method embodiment, and its implementation principle and technical effect are similar, and will not be described herein.
It should be understood that, although the steps in the flowcharts related to the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated. Unless explicitly stated herein, the steps are not strictly limited to that order and may be performed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; the order of these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with at least some of the other steps or stages.
Based on the same inventive concept, the embodiment of the application also provides a device for constructing a transformer graph model, which is used to implement the method for constructing a transformer graph model described above. The implementation of the solution provided by the device is similar to the implementation described in the above method, so for the specific limitations in the one or more embodiments of the device for constructing a transformer graph model provided below, reference may be made to the limitations of the method for constructing a transformer graph model above, which will not be repeated here.
In one embodiment, as shown in fig. 7, there is provided an apparatus 7 for constructing a transformer graph model, including: a data extraction module 70, a model construction module 71, a data clustering module 72, and a target determination module 73, wherein:
a data extraction module 70, configured to extract entity points and relationship edges from device data of each transformer;
the model construction module 71 is configured to construct an initial graph model corresponding to each transformer according to the extracted entity points and the relationship edges;
the data clustering module 72 is configured to cluster attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
the target determining module 73 is configured to process each initial graph model according to a plurality of target clusters, so as to obtain a target graph model corresponding to each transformer.
In another embodiment, as shown in fig. 8, the data clustering module 72 in fig. 7 includes:
a word vector determining unit 720, configured to determine a word vector of name data of entity points in each initial graph model;
the data clustering unit 721 is configured to cluster each name data according to the word vector of each name data, so as to obtain a plurality of target clusters.
In another embodiment, the data clustering unit 721 in fig. 7 is specifically configured to:
clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers; determining the center of a target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster; judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center; if yes, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain an initial cluster corresponding to each initial cluster center; if not, each initial cluster is used as a target cluster for name data.
In another embodiment, as shown in fig. 9, the above-mentioned targeting module 73 in fig. 7 includes:
a keyword obtaining unit 730 for extracting keywords from each target cluster;
and the data replacing unit 731 is configured to replace name data of the entity points in each initial graph model by using the extracted keywords, so as to obtain target graph models corresponding to each transformer.
In another embodiment, the keyword obtaining unit 730 in fig. 9 is specifically configured to:
taking name data corresponding to the target cluster center of each target cluster as a keyword of each target cluster; or, name data closest to the center of each target cluster is used as a keyword of each target cluster.
In another embodiment, the data replacement unit 731 in fig. 9 is specifically configured to:
acquiring at least two pieces of optional semantic information of the extracted keywords; determining target semantic information from the selectable semantic information according to the scores of the selectable semantic information; replacing name data of entity points in each initial graph model by adopting the extracted keywords to obtain an intermediate graph model of each transformer; and removing other optional semantic information except the target semantic information corresponding to the keywords in each intermediate graph model to obtain the target graph model corresponding to each transformer.
In another embodiment, as shown in fig. 10, the apparatus 7 for constructing a transformer graph model in fig. 7 further includes:
a model display module 74 for displaying each target graph model to the operation and maintenance terminal;
the model optimization module 75 is configured to optimize each target graph model according to the editing operation on each target graph model.
The respective modules in the above-described apparatus for constructing a transformer graph model may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware in, or be independent of, a processor in the computer device, or may be stored in software in a memory of the computer device, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure thereof may be as shown in fig. 11. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input means. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a method of constructing a transformer map model. The display unit of the computer device is used for forming a visual picture, and can be a display screen, a projection device or a virtual reality imaging device. The display screen can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be a key, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 11 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the computer device to which the present application applies, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
extracting entity points and relationship edges from the equipment data of each transformer;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
and processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
In one embodiment, the processor when executing the computer program further performs the steps of:
determining word vectors of name data of entity points in each initial graph model; and clustering the name data according to the word vector of the name data to obtain a plurality of target clusters.
In one embodiment, the processor when executing the computer program further performs the steps of:
clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers; determining the center of a target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster; judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center; if yes, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain an initial cluster corresponding to each initial cluster center; if not, each initial cluster is used as a target cluster for name data.
In one embodiment, the processor when executing the computer program further performs the steps of:
extracting keywords from each target cluster; and replacing the name data of the entity points in each initial graph model by adopting the extracted keywords to obtain target graph models corresponding to the transformers.
In one embodiment, the processor when executing the computer program further performs the steps of:
Acquiring at least two pieces of optional semantic information of the extracted keywords; determining target semantic information from the selectable semantic information according to the scores of the selectable semantic information; replacing name data of entity points in each initial graph model by adopting the extracted keywords to obtain an intermediate graph model of each transformer; and removing other optional semantic information except the target semantic information corresponding to the keywords in each intermediate graph model to obtain the target graph model corresponding to each transformer.
In one embodiment, the processor when executing the computer program further performs the steps of:
taking name data corresponding to the target cluster center of each target cluster as a keyword of each target cluster; or, name data closest to the center of each target cluster is used as a keyword of each target cluster.
In one embodiment, the processor when executing the computer program further performs the steps of:
displaying each target graph model to the operation and maintenance terminal; and optimizing each target graph model according to the editing operation of each target graph model.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
Extracting entity points and relationship edges from the equipment data of each transformer;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
and processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining word vectors of name data of entity points in each initial graph model; and clustering the name data according to the word vector of the name data to obtain a plurality of target clusters.
In one embodiment, the computer program when executed by the processor further performs the steps of:
clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers; determining the center of a target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster; judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center; if yes, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain an initial cluster corresponding to each initial cluster center; if not, each initial cluster is used as a target cluster for name data.
In one embodiment, the computer program when executed by the processor further performs the steps of:
extracting keywords from each target cluster; and replacing the name data of the entity points in each initial graph model by adopting the extracted keywords to obtain target graph models corresponding to the transformers.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring at least two pieces of optional semantic information of the extracted keywords; determining target semantic information from the selectable semantic information according to the scores of the selectable semantic information; replacing name data of entity points in each initial graph model by adopting the extracted keywords to obtain an intermediate graph model of each transformer; and removing other optional semantic information except the target semantic information corresponding to the keywords in each intermediate graph model to obtain the target graph model corresponding to each transformer.
In one embodiment, the computer program when executed by the processor further performs the steps of:
taking name data corresponding to the target cluster center of each target cluster as a keyword of each target cluster; or, name data closest to the center of each target cluster is used as a keyword of each target cluster.
In one embodiment, the computer program when executed by the processor further performs the steps of:
displaying each target graph model to the operation and maintenance terminal; and optimizing each target graph model according to the editing operation of each target graph model.
In one embodiment, a computer program product is provided comprising a computer program which, when executed by a processor, performs the steps of:
extracting entity points and relationship edges from the equipment data of each transformer;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
clustering attribute data of entity points in each initial graph model to obtain a plurality of target clusters;
and processing each initial graph model according to the plurality of target clusters to obtain target graph models corresponding to each transformer.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining word vectors of name data of entity points in each initial graph model; and clustering the name data according to the word vector of the name data to obtain a plurality of target clusters.
In one embodiment, the computer program when executed by the processor further performs the steps of:
Clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers; determining the center of a target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster; judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center; if yes, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain an initial cluster corresponding to each initial cluster center; if not, each initial cluster is used as a target cluster for name data.
In one embodiment, the computer program when executed by the processor further performs the steps of:
extracting keywords from each target cluster; and replacing the name data of the entity points in each initial graph model by adopting the extracted keywords to obtain target graph models corresponding to the transformers.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring at least two pieces of optional semantic information of the extracted keywords; determining target semantic information from the selectable semantic information according to the scores of the selectable semantic information; replacing name data of entity points in each initial graph model by adopting the extracted keywords to obtain an intermediate graph model of each transformer; and removing other optional semantic information except the target semantic information corresponding to the keywords in each intermediate graph model to obtain the target graph model corresponding to each transformer.
In one embodiment, the computer program when executed by the processor further performs the steps of:
taking name data corresponding to the target cluster center of each target cluster as a keyword of each target cluster; or, name data closest to the center of each target cluster is used as a keyword of each target cluster.
In one embodiment, the computer program when executed by the processor further performs the steps of:
displaying each target graph model to the operation and maintenance terminal; and optimizing each target graph model according to the editing operation of each target graph model.
Those skilled in the art will appreciate that implementing all or part of the methods in the above embodiments may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may include the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take various forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational databases and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum-computing-based data processing logic units, etc., without being limited thereto.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered to be within the scope of this description.
The foregoing examples represent only a few embodiments of the present application, which are described in considerable detail, but they are not thereby to be construed as limiting the scope of the present application. It should be noted that various modifications and improvements could be made by those skilled in the art without departing from the concept of the present application, and these all fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (10)

1. A method for constructing a transformer graph model, the method comprising:
extracting entity points and relationship edges from the equipment data of each transformer; wherein the entity points represent random variables in the transformer graph model, and the relationship edges represent the dependency relationships among different entity points;
constructing an initial graph model corresponding to each transformer according to the extracted entity points and the relation edges;
Determining word vectors of name data of entity points in each initial graph model; clustering the name data according to the word vector of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers; determining the center of a target cluster corresponding to each initial cluster according to the word vector of the name data contained in each initial cluster; judging whether the initial cluster center of any initial cluster is inconsistent with the target cluster center; if yes, taking each target cluster center as a new initial cluster center, and returning to execute the operation of clustering each name data according to the word vector of each name data based on at least two initial cluster centers to obtain an initial cluster corresponding to each initial cluster center; if not, each initial cluster is used as a target cluster for name data;
extracting keywords from each target cluster; replacing the name data of the entity points in each initial graph model with the extracted keywords to obtain a target graph model corresponding to each transformer; acquiring at least two pieces of optional semantic information of the extracted keywords; determining target semantic information from the optional semantic information according to the scores of the optional semantic information; replacing the name data of the entity points in each initial graph model with the extracted keywords to obtain an intermediate graph model of each transformer; and removing, from each intermediate graph model, the optional semantic information of the keywords other than the target semantic information, to obtain the target graph model corresponding to each transformer.
2. The method of claim 1, wherein the extracting keywords from each target cluster comprises:
taking the name data corresponding to the target cluster center of each target cluster as the keyword of each target cluster; or
taking the name data closest to the target cluster center in each target cluster as the keyword of each target cluster.
3. The method according to claim 2, wherein the step of taking the name data closest to the target cluster center in each target cluster as the keyword of each target cluster comprises:
for each target cluster, calculating a Euclidean distance and/or a cosine distance between the word vector of each name data and the target cluster center;
taking the Euclidean distance and/or the cosine distance as the distance between the word vector and the target cluster center;
and selecting the name data corresponding to the word vector nearest to the target cluster center as the keyword of the target cluster (see the second sketch following the claims).
4. The method of claim 1, wherein the extracting entity points and relationship edges from the equipment data of each transformer comprises:
inputting the equipment data of each transformer into an extraction model to extract the entity points and the relationship edges.
5. The method of claim 1, wherein replacing name data of entity points in each initial graph model comprises:
deleting all other name data except the keywords;
and taking the extracted keywords as name data of entity points in each initial graph model.
6. The method of claim 1, wherein the determining target semantic information from the optional semantic information comprises:
ranking all the semantic information of the keyword according to the scores;
and taking the semantic information with the highest score as the target semantic information of the keyword (see the third sketch following the claims).
7. The method according to claim 1, further comprising:
displaying each target graph model to the operation and maintenance terminal;
and optimizing each target graph model according to an editing operation performed on that target graph model.
8. A device for constructing a transformer graph model, the device comprising:
the data extraction module is used for extracting entity points and relationship edges from the equipment data of each transformer;
the model construction module is used for constructing an initial graph model corresponding to each transformer according to the extracted entity points and relationship edges;
the data clustering module is used for determining word vectors of the name data of the entity points in each initial graph model; clustering the name data according to the word vectors of the name data based on at least two initial cluster centers to obtain initial clusters corresponding to the initial cluster centers; determining a target cluster center corresponding to each initial cluster according to the word vectors of the name data contained in that initial cluster; judging whether the initial cluster center of any initial cluster is inconsistent with its target cluster center; if yes, taking each target cluster center as a new initial cluster center, and returning to the operation of clustering the name data according to the word vectors of the name data based on the at least two initial cluster centers to obtain the initial clusters corresponding to the initial cluster centers; if not, taking each initial cluster as a target cluster of the name data;
the target determining module is used for extracting keywords from each target cluster; replacing the name data of the entity points in each initial graph model with the extracted keywords to obtain a target graph model corresponding to each transformer; acquiring at least two pieces of optional semantic information of the extracted keywords; determining target semantic information from the optional semantic information according to the scores of the optional semantic information; replacing the name data of the entity points in each initial graph model with the extracted keywords to obtain an intermediate graph model of each transformer; and removing, from each intermediate graph model, the optional semantic information of the keywords other than the target semantic information, to obtain the target graph model corresponding to each transformer.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
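
For readers implementing the method, the iterative clustering recited in claims 1 and 8 follows a k-means-style loop: assign each name-data word vector to its nearest initial cluster center, recompute a target cluster center for each cluster, and repeat until no initial center differs from its target center. The following is a minimal illustrative sketch only; it assumes the word vectors are already available as NumPy arrays, and the function and parameter names (cluster_name_vectors, max_iter) are hypothetical rather than taken from the application.

```python
# Illustrative sketch; names and data layout are assumptions, not from the application.
import numpy as np

def cluster_name_vectors(vectors: np.ndarray, initial_centers: np.ndarray, max_iter: int = 100):
    """Cluster name-data word vectors around the given initial cluster centers.

    vectors:         (n, d) array, one word vector per name data.
    initial_centers: (k, d) array, at least two initial cluster centers.
    Returns the final cluster labels and the target cluster centers.
    """
    centers = initial_centers.astype(float).copy()
    for _ in range(max_iter):
        # Assign every word vector to its nearest current cluster center.
        distances = np.linalg.norm(vectors[:, None, :] - centers[None, :, :], axis=2)
        labels = distances.argmin(axis=1)
        # Recompute the target cluster center of each cluster from its members.
        new_centers = np.array([
            vectors[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
            for k in range(len(centers))
        ])
        # If no initial center is inconsistent with its target center, stop;
        # otherwise the target centers become the new initial centers.
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```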
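
Claims 2 and 3 pick, as the keyword of a target cluster, the name data whose word vector lies closest to the target cluster center under a Euclidean and/or cosine distance. A minimal sketch of that selection under the same assumptions (NumPy arrays, hypothetical names):

```python
# Illustrative sketch; names are assumptions, not from the application.
import numpy as np

def extract_keyword(names: list, vectors: np.ndarray, center: np.ndarray,
                    use_cosine: bool = False) -> str:
    """Return the name data whose word vector is nearest to the target cluster center."""
    if use_cosine:
        # Cosine distance: 1 minus the cosine similarity to the center.
        norms = np.linalg.norm(vectors, axis=1) * np.linalg.norm(center) + 1e-12
        dists = 1.0 - (vectors @ center) / norms
    else:
        # Euclidean distance to the center.
        dists = np.linalg.norm(vectors - center, axis=1)
    return names[int(dists.argmin())]
```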
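
Claims 5 and 6 (and the corresponding steps of claim 1) replace the name data of each entity point with its cluster keyword and retain only the highest-scoring optional semantic information of that keyword. The sketch below assumes a simple dictionary layout for the graph model (nodes/edges lists, a score field on each semantic entry); that layout is an illustrative assumption, not the data structure used in the application.

```python
# Illustrative sketch; the dictionary layout is an assumption, not from the application.
def build_target_graph(initial_graph: dict, name_to_keyword: dict, semantics: dict) -> dict:
    """Replace entity-point names with cluster keywords and keep only the
    target (highest-scoring) semantic information for each keyword."""
    target_graph = {"nodes": [], "edges": list(initial_graph.get("edges", []))}
    for node in initial_graph.get("nodes", []):
        # Replace the name data with the keyword of its target cluster.
        keyword = name_to_keyword.get(node["name"], node["name"])
        # Rank the keyword's optional semantic information by score and keep the best.
        options = semantics.get(keyword, [])
        target_semantic = max(options, key=lambda s: s["score"]) if options else None
        target_graph["nodes"].append({"name": keyword, "semantic": target_semantic})
    return target_graph
```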
CN202311147309.9A 2023-09-07 2023-09-07 Construction method and device of transformer graph model, computer equipment and storage medium Active CN116882408B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311147309.9A CN116882408B (en) 2023-09-07 2023-09-07 Construction method and device of transformer graph model, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311147309.9A CN116882408B (en) 2023-09-07 2023-09-07 Construction method and device of transformer graph model, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116882408A CN116882408A (en) 2023-10-13
CN116882408B true CN116882408B (en) 2024-02-27

Family

ID=88262575

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311147309.9A Active CN116882408B (en) 2023-09-07 2023-09-07 Construction method and device of transformer graph model, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116882408B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108427956A (en) * 2017-02-14 2018-08-21 腾讯科技(深圳)有限公司 A kind of clustering objects method and apparatus
CN109687464A (en) * 2018-07-19 2019-04-26 国网北京市电力公司 The analysis method and device of electric network swim, storage medium, processor
CN113298267A (en) * 2021-06-10 2021-08-24 浙江工业大学 Vertical federal model defense method based on node embedding difference detection
WO2022022045A1 (en) * 2020-07-27 2022-02-03 平安科技(深圳)有限公司 Knowledge graph-based text comparison method and apparatus, device, and storage medium
CN114385816A (en) * 2022-01-12 2022-04-22 阿里巴巴(中国)有限公司 Conversation flow mining method and device, electronic equipment and computer storage medium
CN114780727A (en) * 2022-04-24 2022-07-22 润联软件系统(深圳)有限公司 Text classification method and device based on reinforcement learning, computer equipment and medium
CN115687606A (en) * 2021-07-23 2023-02-03 深信服科技股份有限公司 Corpus processing method and device, electronic equipment and storage medium
CN115796187A (en) * 2022-11-26 2023-03-14 南京航空航天大学 Open domain dialogue method based on dialogue structure diagram constraint
CN116150397A (en) * 2023-01-05 2023-05-23 马上消费金融股份有限公司 Ontology construction method and device, electronic equipment and computer readable storage medium
CN116167289A (en) * 2023-04-26 2023-05-26 南方电网数字电网研究院有限公司 Power grid operation scene generation method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11461349B2 (en) * 2020-05-01 2022-10-04 Sap Se Data filtering utilizing constructed graph structure

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108427956A (en) * 2017-02-14 2018-08-21 腾讯科技(深圳)有限公司 A kind of clustering objects method and apparatus
CN109687464A (en) * 2018-07-19 2019-04-26 国网北京市电力公司 The analysis method and device of electric network swim, storage medium, processor
WO2022022045A1 (en) * 2020-07-27 2022-02-03 平安科技(深圳)有限公司 Knowledge graph-based text comparison method and apparatus, device, and storage medium
CN113298267A (en) * 2021-06-10 2021-08-24 浙江工业大学 Vertical federal model defense method based on node embedding difference detection
CN115687606A (en) * 2021-07-23 2023-02-03 深信服科技股份有限公司 Corpus processing method and device, electronic equipment and storage medium
CN114385816A (en) * 2022-01-12 2022-04-22 阿里巴巴(中国)有限公司 Conversation flow mining method and device, electronic equipment and computer storage medium
CN114780727A (en) * 2022-04-24 2022-07-22 润联软件系统(深圳)有限公司 Text classification method and device based on reinforcement learning, computer equipment and medium
CN115796187A (en) * 2022-11-26 2023-03-14 南京航空航天大学 Open domain dialogue method based on dialogue structure diagram constraint
CN116150397A (en) * 2023-01-05 2023-05-23 马上消费金融股份有限公司 Ontology construction method and device, electronic equipment and computer readable storage medium
CN116167289A (en) * 2023-04-26 2023-05-26 南方电网数字电网研究院有限公司 Power grid operation scene generation method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Key Information Extraction from Forestry Texts Based on TextRank and Cluster Filtering; Chen Zhibo et al.; Transactions of the Chinese Society for Agricultural Machinery (No. 05); pp. 179-221 *

Also Published As

Publication number Publication date
CN116882408A (en) 2023-10-13

Similar Documents

Publication Publication Date Title
US20160203191A1 (en) Recommendation system with metric transformation
EP1890257A2 (en) Clustering for structured data
CN114579584B (en) Data table processing method and device, computer equipment and storage medium
CN115795000A (en) Joint similarity algorithm comparison-based enclosure identification method and device
CN115905630A (en) Graph database query method, device, equipment and storage medium
CN117332766A (en) Flow chart generation method, device, computer equipment and storage medium
CN116882408B (en) Construction method and device of transformer graph model, computer equipment and storage medium
JP7213890B2 (en) Accelerated large-scale similarity computation
JP2013242675A (en) Dispersion information control device, dispersion information search method, data dispersion arrangement method and program
Büscher et al. VPI-FP: an integrative information system for factory planning
CN117829951A (en) Three-dimensional model recommendation method, device, computer equipment and storage medium
CN117667999A (en) Data pushing method, device, computer equipment and computer readable storage medium
CN116909875A (en) Real-time test data generation method, device, computer equipment and storage medium
CN117670443A (en) Model recommendation method, device, computer equipment and storage medium
CN117035997A (en) Risk prediction method, risk prediction device, computer equipment and storage medium
CN118152504A (en) Unstructured data indexing method, device, apparatus, medium and program product
CN116932935A (en) Address matching method, device, equipment, medium and program product
CN117033451A (en) Searching method, searching device, computer equipment and storage medium
CN116881543A (en) Financial resource object recommendation method, device, equipment, storage medium and product
CN114781346A (en) Audit model sharing method and device, computer equipment and computer program product
CN116091209A (en) Credit service processing method, apparatus, computer device and storage medium
CN117521599A (en) Commodity code determining method, commodity code determining device, electronic equipment and storage medium
CN117312653A (en) Service policy determination method, device, computer equipment and storage medium
CN117150311A (en) Data processing method, device, equipment and storage medium
CN117909550A (en) Query method, query device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant