CN114880490A - Knowledge graph completion method based on graph attention network - Google Patents
Knowledge graph completion method based on graph attention network
- Publication number
- CN114880490A CN114880490A CN202210646659.9A CN202210646659A CN114880490A CN 114880490 A CN114880490 A CN 114880490A CN 202210646659 A CN202210646659 A CN 202210646659A CN 114880490 A CN114880490 A CN 114880490A
- Authority
- CN
- China
- Prior art keywords
- neighbor
- graph
- triple
- triplet
- incomplete
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/36—Creation of semantic tools, e.g. ontology or thesauri
- G06F16/367—Ontology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Probability & Statistics with Applications (AREA)
- Animal Behavior & Ethology (AREA)
- Computational Linguistics (AREA)
- Databases & Information Systems (AREA)
- Machine Translation (AREA)
Abstract
The invention discloses a knowledge graph completion method based on a graph attention network, which is a completion method based on matching metrics.
Description
Technical Field
The invention belongs to the technical field of knowledge graphs, and particularly relates to a knowledge graph completion method based on a graph attention network.
Background
With the development of information technology, natural language processing has become increasingly important, and knowledge graphs have become a vital part of many downstream artificial intelligence applications such as question-answering systems, recommendation systems, and information retrieval. Many knowledge graphs represent each piece of information as a triple (h, r, t) comprising two entity elements, the head entity h and the tail entity t, and one relation element, the relation r.
A knowledge graph comprises a large number of triples, but some of them are often incomplete, lacking one or more elements. Such triples need to be completed; completion means predicting the missing elements of a triple, and completing the knowledge graph means completing all incomplete triples in it.
Unfortunately, real-world knowledge graphs exhibit long-tailed relations: a large portion of the relations have only a few training examples. For such sparse long-tailed relations, the lack of sufficient training triples greatly limits effective knowledge graph completion.
In view of this, how to complete a knowledge graph efficiently and accurately has become an urgent technical problem in the industry.
Disclosure of Invention
Therefore, the invention aims to provide a knowledge graph completion method based on a graph attention network, which can be used for efficiently and accurately completing the knowledge graph.
According to which element of a triple needs to be predicted, completion tasks can be divided into three types: head entity completion (?, r, t), relation completion (h, ?, t), and tail entity completion (h, r, ?).
In order to achieve the above object, the method for complementing a knowledge graph based on a graph attention network provided by the present invention specifically comprises the following steps:
step 1, extracting an incomplete triple in a knowledge graph, wherein the incomplete triple comprises at least an element e to be completed and an existing entity element;
step 2, based on the existing entity element, retrieving neighbor triplets (h, r, t) related to the existing entity element from the knowledge graph, and collecting all retrieved neighbor triplets (h, r, t) as a neighbor triplet set S;
step 3, calculating the normalized attention coefficient α_norm of each neighbor triplet (h, r, t) in the neighbor triplet set S, which specifically comprises the following steps:
first, the attention value α for each neighbor triplet (h, r, t) is calculated using the following formula:
α = σ(UᵀW(h, r, t))
wherein the magnitude of the attention value α represents the importance of the neighbor triplet to the existing entity element, σ is a nonlinear activation function of the graph attention network, U is the weight matrix of σ, W is a learnable linear transformation weight matrix, and W(h, r, t) is the vector representation of the neighbor triplet (h, r, t);
next, the normalized attention coefficient α_norm for each neighbor triplet (h, r, t) is calculated using the following formula:
α_norm = Softmax(α)
Wherein Softmax is a normalized exponential function for the graph attention network;
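The step-3 computation can be sketched as follows. The vector shapes, the LeakyReLU choice for σ, and all parameter values are illustrative assumptions, not mandated by the patent text:

```python
import numpy as np

def attention_coefficients(triplet_vecs, W, U, leaky_slope=0.2):
    """Compute normalized attention coefficients for a set of neighbor
    triplet vectors, following alpha = sigma(U^T W(h, r, t)) followed by
    a softmax over the neighbor set S."""
    transformed = triplet_vecs @ W.T                 # linear transform of each triplet vector
    raw = transformed @ U                            # raw attention value per neighbor
    raw = np.where(raw > 0, raw, leaky_slope * raw)  # LeakyReLU as the assumed sigma
    exp = np.exp(raw - raw.max())                    # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
vecs = rng.normal(size=(5, 8))   # 5 neighbor triplets, 8-dim representations
W = rng.normal(size=(8, 8))
U = rng.normal(size=(8,))
alpha = attention_coefficients(vecs, W, U)
print(alpha.sum())  # the normalized coefficients sum to 1
```

The softmax guarantees that the coefficients form a distribution over the neighbor set, so more important neighbors contribute proportionally more in the later aggregation step.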
step 4, updating a head entity element h and a tail entity element t of each neighbor triple (h, r, t) in the neighbor triple set S by using the following formula based on the attention mechanism of the graph attention network to obtain an updated neighbor triple (h ', r, t'):
wherein v(h', r, t') is the vector representation of the updated neighbor triplet (h', r, t'), σ is the nonlinear activation function of the graph attention network, W is the learnable linear transformation weight matrix, W(h, r, t) is the vector representation of the neighbor triplet (h, r, t), W₁ is a weight matrix, v_e is the vector representation of the existing entity element, and b is the bias;
step 5, selecting N updated neighbor triples (h', r, t') from the neighbor triple set S, and collecting the N updated neighbor triples as a reference set S_N;
step 6, calculating the reference normalized attention coefficient α_N between the incomplete triple and the reference set S_N using the following formula:
wherein the former term is the vector representation of the incomplete triple and the latter is the vector representation of the reference set S_N;
step 7, based on the attention mechanism, calculating a general expression suitable for different task relations using the following formula:
step 8, extracting an element e' in the knowledge graph as a candidate completion value to fill the position of the element e to be completed in the incomplete triple, and then calculating the semantic similarity β between the incomplete triple and the reference set S_N using the following formula:
wherein the similarity metric is a dot-product function, and the query term is the vector representation of the incomplete triple after completion with the element e';
step 9, continuously repeating the step 8 until the semantic similarity beta corresponding to all elements e 'in the knowledge graph is calculated, and filling the element e' corresponding to the maximum semantic similarity beta as a final completion value to the position of the element e to be completed in the incomplete triple;
step 10, repeating steps 1-9 until all incomplete triples in the knowledge graph are completed.
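Since the step-4 update formula appears only as an image in the source, the following is a plausible sketch of an attention-weighted neighbor aggregation of the general shape the surrounding text describes. The exact combination of W, W₁, the entity vector, and the bias b is an assumption:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def neighbor_encoder(triplet_vecs, entity_vec, alpha, W, W1, b):
    """Assumed form of the step-4 update: aggregate the attention-weighted,
    linearly transformed neighbor triplets, combine them with the existing
    entity vector, and apply a nonlinear activation."""
    aggregated = (alpha[:, None] * (triplet_vecs @ W.T)).sum(axis=0)
    return leaky_relu(aggregated @ W1 + entity_vec + b)  # combination assumed

rng = np.random.default_rng(1)
vecs = rng.normal(size=(5, 8))   # 5 neighbor triplets, 8-dim representations
alpha = np.full(5, 0.2)          # uniform attention, for the demo only
W = rng.normal(size=(8, 8))
W1 = rng.normal(size=(8, 8))
entity = rng.normal(size=8)
b = np.zeros(8)
updated = neighbor_encoder(vecs, entity, alpha, W, W1, b)
print(updated.shape)  # (8,)
```

The key design point the patent describes is that the attention coefficients weight each neighbor's contribution before aggregation, so informative neighbors dominate the updated representation.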
A Graph Attention Network (GAT) combines a graph neural network with attention layers, introducing an attention mechanism into a spatial-domain graph neural network. It can learn from graph data and provide more accurate results, and it does not require complex computations with Laplacian or similar matrices, updating node features only through the representations of first-order neighbor nodes.
According to the knowledge graph completion method based on the graph attention network, the local neighborhood of an element of an incomplete triple is modeled through the attention mechanism of the graph attention network, different weights are given to the neighbors in the local neighborhood according to the importance of their node features, and a more accurate element representation is generated, so that each triple in the knowledge graph can be completed more accurately and the completion results better match objective facts.
Essentially, the completion method provided by the invention is a completion method based on matching metrics: it encodes the local neighborhood of an element with a neighbor encoder, matches the incomplete triple against the reference set with a matching network, and selects the highest-ranked candidate element as the final completion value.
Specifically, the completion method first constructs a neighbor encoder based on the graph attention network in steps 3-4, obtaining a local neighborhood graph of the element by extracting the first-order neighbors of the target element, so that more element neighborhood information and relation information is encoded in the neighbor-encoder stage; then, in steps 5-8, a matching network is constructed based on the graph attention network, the meta-information representation obtained from the neighbor encoder is input into the matching network, matching scores are queried, and the highest-ranked candidate element is selected as the final completion value.
Compared with the prior art, the completion method provided by the invention obtains a more accurate meta representation when encoding the local neighborhood, reflects objective facts well, and improves knowledge graph completion performance; in particular, for small-sample knowledge graphs with pronounced long-tail relations, it can complete the completion task efficiently and accurately.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent from the following description of the embodiments, taken in conjunction with the accompanying drawings. In the drawings, like reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. These drawings depict only some embodiments of the disclosure and are intended to provide a further understanding of the invention; they are not to be considered limiting of its scope, which is defined by the claims of the present application. Wherein:
fig. 1 is a schematic block diagram of the completion method provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and should not be taken to be limiting.
According to the knowledge graph completion method based on the graph attention network, the local neighborhood of the element of the incomplete triple is modeled through the attention mechanism based on the graph attention network, different weights are given to the neighbors in the local neighborhood according to the importance of the node characteristics, and more accurate element representation is generated, so that each triple in the knowledge graph can be completed more accurately, and the result of the completion method is more in line with objective facts.
As shown in fig. 1, the completion method is a completion method based on matching metrics: it encodes the local neighborhood of an element with the neighbor encoder, matches the incomplete triple against the reference set with the matching network, and selects the highest-ranked candidate element as the final completion value.
In this embodiment, the tail entity completion task (h, r, ?) is specifically taken as an example for explanation. Assuming we want to complete an incomplete triple (Elon Musk, spouse, ?) in a knowledge graph, the method for completing a knowledge graph based on a graph attention network provided by the present invention comprises the following steps:
step 1, extracting an incomplete triple from the knowledge graph, wherein the incomplete triple extracted in this embodiment is assumed to be (Elon Musk, spouse, ?);
step 2, based on the existing head entity element Elon Musk, retrieving the neighbor triples (h, r, t) related to the head entity element Elon Musk from the knowledge graph, and grouping all retrieved neighbor triples (h, r, t) into a neighbor triple set S;
step 3, calculating the normalized attention coefficient α_norm of each neighbor triplet (h, r, t) in the neighbor triplet set S, which specifically comprises the following steps:
first, the attention value α for each neighbor triplet (h, r, t) is calculated using the following formula:
α = σ(UᵀW(h, r, t))
wherein the magnitude of the attention value α represents the importance of the neighbor triplet to the head entity element Elon Musk, σ is a nonlinear activation function of the graph attention network (e.g., the LeakyReLU activation function), U is the weight matrix of σ, W is a learnable linear transformation weight matrix, and W(h, r, t) is the vector representation of the neighbor triplet (h, r, t);
next, the normalized attention coefficient α_norm for each neighbor triplet (h, r, t) is calculated using the following formula:
α_norm = Softmax(α)
Wherein Softmax is a normalized exponential function for the graph attention network;
step 4, updating a head entity element h and a tail entity element t of each neighbor triple (h, r, t) in the neighbor triple set S by using the following formula based on the attention mechanism of the graph attention network to obtain an updated neighbor triple (h ', r, t'):
wherein v(h', r, t') is the vector representation of the updated neighbor triplet (h', r, t'), σ is the nonlinear activation function of the graph attention network (e.g., the LeakyReLU activation function), W is the learnable linear transformation weight matrix, W(h, r, t) is the vector representation of the neighbor triplet (h, r, t), W₁ is a weight matrix, v_e is the vector representation of the existing head entity element Elon Musk, and b is the bias;
step 5, selecting 5 updated neighbor triples (h', r, t') from the neighbor triple set S, that is, N is 5 in this embodiment, and collecting these updated neighbor triples as the reference set S_N;
In this embodiment, the reference set S_N is exemplarily chosen as follows:
(Elon Musk,father_in_law,Talulah Riley)
(Elon Musk,son_of,Elon Musk)
(Elon Musk,friend,Bill Gates)
(Elon Musk,education,Univ.of Pennsylvania)
(Elon Musk,occupation,Entrepreneur)
step 6, calculating the reference normalized attention coefficient α_N between the incomplete triple and the reference set S_N using the following formula:
wherein the former term is the vector representation of the incomplete triple and the latter is the vector representation of the reference set S_N;
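One plausible reading of step 6 (the formula itself appears only as an image in the source) is a softmax over dot products between the query vector and each reference vector; the form below is an assumption, not the patent's stated formula:

```python
import numpy as np

def reference_attention(query_vec, reference_vecs):
    """Assumed form of step 6: softmax-normalized dot-product attention
    between the incomplete-triple vector and each reference triple vector."""
    raw = reference_vecs @ query_vec      # one score per reference triple
    exp = np.exp(raw - raw.max())         # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(2)
query = rng.normal(size=8)               # vector of the incomplete triple
refs = rng.normal(size=(5, 8))           # 5 reference triples in S_N
a_n = reference_attention(query, refs)
print(a_n.shape)  # (5,)
```

Under this reading, α_N distributes unit mass over the five reference triples, so the later general expression becomes a weighted combination of the reference representations.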
step 7, based on the attention mechanism, calculating a general expression suitable for different task relations using the following formula:
wherein the resulting term is the general expression;
step 8, extracting an element e' in the knowledge graph as a candidate completion value to fill the position of the tail entity element e to be completed in the incomplete triple, and then calculating the semantic similarity β between the incomplete triple and the reference set S_N using the following formula:
wherein the similarity metric is a dot-product function, and the query term is the vector representation of the incomplete triple after completion with the element e';
step 9, continuously repeating the step 8 until the semantic similarity beta corresponding to all elements e 'in the knowledge graph is calculated, and filling the element e' corresponding to the maximum semantic similarity beta as a final completion value to the position of the tail entity element e to be completed in the incomplete triple;
In this embodiment, the knowledge graph includes, for example, the elements {Kevin Musk, apple, Alice Riley, piano, ……}, which are individually filled into the position of the tail entity element e to be completed in the incomplete triple, obtaining the following:
(Elon Musk,spouse,Kevin Musk)
(Elon Musk,spouse,apple)
(Elon Musk,spouse,Alice Riley)
(Elon Musk,spouse,piano)
……
Then, the semantic similarity β of these triples is calculated through step 8; (Elon Musk, spouse, Alice Riley) finally obtains the maximum semantic similarity β, so Alice Riley is filled into the incomplete triple as the final completion value.
Thus, through steps 1-9 above, the incomplete triple (Elon Musk, spouse, ?) in the knowledge graph is completed.
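The candidate ranking in steps 8-9 can be sketched as follows. The embedding function and the toy 2-dimensional vectors are hypothetical, chosen only so the dot-product scores reproduce the outcome of this embodiment:

```python
import numpy as np

def best_completion(embed, candidates, reference_vec):
    """Steps 8-9 as a sketch: score each candidate element e' by the dot
    product between the completed-triple vector and the reference
    representation, then keep the highest-scoring candidate."""
    scores = {e: float(embed(e) @ reference_vec) for e in candidates}
    return max(scores, key=scores.get), scores

# Hypothetical reference vector and candidate embeddings for the demo
ref = np.array([1.0, 0.0])
emb = {
    "Kevin Musk": np.array([0.1, 0.9]),
    "apple": np.array([-0.5, 0.2]),
    "Alice Riley": np.array([0.9, 0.1]),
    "piano": np.array([0.0, 0.3]),
}
best, scores = best_completion(lambda e: emb[e], emb, ref)
print(best)  # prints: Alice Riley
```

In the real method the candidate set is the whole entity vocabulary and the embeddings come from the neighbor encoder, but the argmax-over-similarity selection is the same.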
Obviously, the above-described completion method is equally applicable to the other two completion tasks.
It is understood that the completion method of the present invention further includes: step 10, repeating steps 1-9 until all incomplete triples in the knowledge graph are completed.
Furthermore, subsets were extracted from two well-known knowledge graph data sets, the NELL data set and the FB15k237 data set, to obtain the NELL-one and FB15k237-one data sets respectively, for verifying the effect of the completion method.
Specifically, relations with more than 50 and fewer than 500 triples in the data sets are selected as learning tasks; the numbers of tasks in the two data sets are 67 and 45 respectively, divided into 51/5/11 and 32/5/8 task relations serving as the training, validation, and test sets respectively.
The statistics of the data sets are shown in Table 1 below, where "#Ent" denotes the number of entities, "#Tri" the number of relation triples, "#Rel" the number of relations, and "Tasks" the number of relations selected as small-sample tasks.
Data set | #Ent | #Tri | #Rel | Tasks
---|---|---|---|---
NELL-one | 68,545 | 181,109 | 358 | 67
FB15k237-one | 14,478 | 309,621 | 237 | 45

TABLE 1 Statistics of the NELL-one and FB15k237-one data sets
In the verification, the network model is implemented with the PyTorch deep learning framework, and entity embeddings are initialized with TransE. The maximum number of neighbors is set to 50, the embedding dimension for both the NELL-one and FB15k237-one data sets is set to 50, the value of N is set to 5, and the learning rate is set to 5e-5. An Adam optimizer is used to optimize the model parameters, and L2 regularization is further used to avoid overfitting. Two traditional evaluation indices of link prediction, MRR and Hits@N, are used to evaluate the effects of the different methods on the two data sets: MRR is the mean reciprocal rank, and Hits@N is the proportion of correct entities ranked within the top N in the relation prediction tasks, with N set to 1, 5, and 10 in the verification.
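The two evaluation indices can be computed as in the following sketch; the list of ranks is invented purely for illustration:

```python
def mrr_and_hits(ranks, ns=(1, 5, 10)):
    """MRR: mean reciprocal rank of the correct entity over all queries.
    Hits@N: fraction of queries whose correct entity ranks in the top N.
    `ranks` holds the 1-based rank of the true entity for each query."""
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = {n: sum(r <= n for r in ranks) / len(ranks) for n in ns}
    return mrr, hits

# Hypothetical ranks of the true tail entity in four prediction tasks
mrr, hits = mrr_and_hits([1, 3, 12, 2])
print(round(mrr, 4), hits[1], hits[10])  # 0.4792 0.25 0.75
```

Because 1/1 + 1/3 + 1/12 + 1/2 = 23/12, the MRR here is 23/48 ≈ 0.4792, and only the rank-12 query misses Hits@10.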
The model was experimentally compared with the known baseline models GMatching, FSRL, and MetaR, using the same experimental environment on both data sets; the verification results are shown in Tables 2 and 3 below:
TABLE 2 validation results for data set NELL-one
TABLE 3 validation results for data set FB15k237-one
It can be seen that the MRR value of the completion method provided by the present invention is higher than those of the other baseline models. This is because the GMatching, FSRL, and MetaR models all use R-GCN to encode the local graph structure, whereas the present invention uses a graph attention network with an attention mechanism to encode the local graph structure and can capture more entity information, thereby effectively improving knowledge graph completion performance.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention.
Claims (4)
1. A knowledge graph completion method based on a graph attention network is characterized by comprising the following steps:
step 1, extracting an incomplete triple in the knowledge graph, wherein the incomplete triple comprises at least an element e to be completed and an existing entity element;
step 2, based on the existing entity elements, retrieving neighbor triplets (h, r, t) related to the existing entity elements from the knowledge-graph, and grouping all the retrieved neighbor triplets (h, r, t) into a neighbor triplet set S;
step 3, calculating the normalized attention coefficient α_norm of each neighbor triplet (h, r, t) in the neighbor triplet set S, which specifically comprises the following steps:
first, the attention value α for each neighbor triplet (h, r, t) is calculated using the following formula:
α = σ(UᵀW(h, r, t))
wherein the magnitude of the attention value α represents the importance of the neighbor triplet (h, r, t) to the existing entity element, σ is a nonlinear activation function of the graph attention network, U is the weight matrix of σ, W is a learnable linear transformation weight matrix, and W(h, r, t) is the vector representation of the neighbor triplet (h, r, t);
next, the normalized attention coefficient α_norm for each neighbor triplet (h, r, t) is calculated using the following formula:
α_norm = Softmax(α)
Wherein Softmax is a normalized exponential function for the graph attention network;
step 4, based on the attention mechanism of the graph attention network, updating a head entity element h and a tail entity element t of each neighbor triple (h, r, t) in the neighbor triple set S by using the following formula to obtain an updated neighbor triple (h ', r, t'):
wherein v(h', r, t') is the vector representation of the updated neighbor triplet (h', r, t'), σ is the nonlinear activation function of the graph attention network, W is the learnable linear transformation weight matrix, W(h, r, t) is the vector representation of the neighbor triplet (h, r, t), W₁ is a weight matrix, v_e is the vector representation of the existing entity element, and b is the bias;
step 5, selecting N updated neighbor triples (h', r, t') from the neighbor triple set S, and collecting the N updated neighbor triples as a reference set S_N;
step 6, calculating the reference normalized attention coefficient α_N between the incomplete triple and the reference set S_N using the following formula:
wherein the former term is the vector representation of the incomplete triple, the latter is the vector representation of the reference set S_N, and Softmax is a normalized exponential function of the graph attention network;
step 7, based on the attention mechanism, calculating a general expression suitable for different task relations using the following formula:
step 8, extracting an element e' in the knowledge graph as a candidate completion value to fill the position of the element e to be completed in the incomplete triple, and then calculating the semantic similarity β between the incomplete triple and the reference set S_N using the following formula:
wherein the similarity metric is a dot-product function, and the query term is the vector representation of the incomplete triple after completion with the element e';
step 9, continuously repeating the step 8 until the semantic similarity beta corresponding to all elements e 'in the knowledge graph is calculated, and filling the element e' corresponding to the maximum semantic similarity beta as a final completion value to the position of the element e to be completed in the incomplete triple;
step 10, repeating steps 1-9 until all incomplete triples in the knowledge graph are completed.
2. The method for completing a knowledge graph based on a graph attention network according to claim 1, wherein in steps 3 and 4, the nonlinear activation function σ is the LeakyReLU activation function.
3. The method according to claim 1, wherein in step 5, the value of N is set to 5.
4. The method for completing a knowledge graph based on a graph attention network according to any one of claims 1-3, wherein the knowledge graph is a small-sample knowledge graph.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210646659.9A CN114880490A (en) | 2022-06-08 | 2022-06-08 | Knowledge graph completion method based on graph attention network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210646659.9A CN114880490A (en) | 2022-06-08 | 2022-06-08 | Knowledge graph completion method based on graph attention network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114880490A true CN114880490A (en) | 2022-08-09 |
Family
ID=82680728
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210646659.9A Pending CN114880490A (en) | 2022-06-08 | 2022-06-08 | Knowledge graph completion method based on graph attention network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114880490A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116187446A (en) * | 2023-05-04 | 2023-05-30 | 中国人民解放军国防科技大学 | Knowledge graph completion method, device and equipment based on self-adaptive attention mechanism |
CN116187446B (en) * | 2023-05-04 | 2023-07-04 | 中国人民解放军国防科技大学 | Knowledge graph completion method, device and equipment based on self-adaptive attention mechanism |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113535984B (en) | Knowledge graph relation prediction method and device based on attention mechanism | |
CN110796313B (en) | Session recommendation method based on weighted graph volume and item attraction model | |
CN109740924B (en) | Article scoring prediction method integrating attribute information network and matrix decomposition | |
KR20210040248A (en) | Generative structure-property inverse computational co-design of materials | |
CN105956089A (en) | Recommendation method capable of aiming at classification information with items | |
CN113190688B (en) | Complex network link prediction method and system based on logical reasoning and graph convolution | |
CN116541607B (en) | Intelligent recommendation method based on commodity retrieval data analysis | |
CN104199826A (en) | Heterogeneous media similarity calculation method and retrieval method based on correlation analysis | |
CN112199508A (en) | Parameter adaptive agricultural knowledge graph recommendation method based on remote supervision | |
CN112650933A (en) | High-order aggregation-based graph convolution and multi-head attention mechanism conversation recommendation method | |
CN114880490A (en) | Knowledge graph completion method based on graph attention network | |
CN110472659B (en) | Data processing method, device, computer readable storage medium and computer equipment | |
CN111079011A (en) | Deep learning-based information recommendation method | |
CN113239266B (en) | Personalized recommendation method and system based on local matrix decomposition | |
CN112905906B (en) | Recommendation method and system fusing local collaboration and feature intersection | |
CN112612948B (en) | Deep reinforcement learning-based recommendation system construction method | |
Paul et al. | Concomitant record ranked set sampling | |
JP2023043703A (en) | Data analysis device, method, and program | |
CN113449182A (en) | Knowledge information personalized recommendation method and system | |
CN116911949A (en) | Article recommendation method based on boundary rank loss and neighborhood perception graph neural network | |
Hatami et al. | A graph-based multi-label feature selection using ant colony optimization | |
CN105653686A (en) | Domain name network address activeness statistics method and system | |
CN114860952A (en) | Graph topology learning method and system based on data statistics and knowledge guidance | |
CN115344698A (en) | Label processing method, label processing device, computer equipment, storage medium and program product | |
JP6993250B2 (en) | Content feature extractor, method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||