CN117851888A - Hypergraph node classification method and hypergraph node classification device based on fusion - Google Patents


Info

Publication number
CN117851888A
Authority
CN
China
Prior art keywords
graph
layer
node
hypergraph
fusion
Prior art date
Legal status
Pending
Application number
CN202410036694.8A
Other languages
Chinese (zh)
Inventor
房祥飞
潘庆霖
张珩
宦成颖
武延军
赵琛
Current Assignee
Institute of Software of CAS
Original Assignee
Institute of Software of CAS
Priority date
Filing date
Publication date
Application filed by Institute of Software of CAS
Priority to CN202410036694.8A
Publication of CN117851888A
Legal status: Pending


Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a fusion-based hypergraph node classification method and device. The method comprises: fusing the graph and the hypergraph through the bipartite representation of the hypergraph; performing message passing and message aggregation on the fused graph structure to obtain updated embedded vector representations of the nodes, wherein the fused graph structure comprises a first-layer graph G1 composed of the nodes of the graph, a second-layer graph G2 composed of the supernodes of the hypergraph, and the interconnections AE12 between the first-layer graph G1 and the second-layer graph G2; and performing node classification based on the updated node embeddings to obtain the hypergraph node classification result. The method addresses joint learning of binary and multi-way relations, the difficulty of node feature extraction in graph neural networks and hypergraph neural networks caused by the modelling limitations of graphs and hypergraphs, and the problems of low classification accuracy and poor model robustness across different data sets.

Description

Hypergraph node classification method and hypergraph node classification device based on fusion
Technical Field
The invention relates to the field of graph neural networks, in particular to a hypergraph node classification method and device based on fusion.
Background
The successful use of neural networks in non-Euclidean graph representation learning has led to a surge of interest in studying correlations between entities. On the one hand, many conventional graph neural networks (GNNs) have been widely applied to tasks such as node classification, link prediction and graph classification. In particular, message passing neural networks (MPNNs) are popular because they iteratively update embedding vectors by propagating and aggregating information from neighboring entities; the most classical graph neural network models at present include GIN, GAT and GraphSAGE. On the other hand, the need to learn higher-order correlations from more complex structures has driven the development of hypergraph neural networks (HNNs). Meanwhile, recent hypergraph neural networks have introduced techniques from graph neural networks to improve hypergraph learning capability.
However, although graph neural networks are very effective at capturing inter-dependencies between instances in non-Euclidean data, a graph only models pairwise connections between entities, and graph neural networks, in particular MPNNs, propagate messages along the graph structure, which limits their ability to express higher-order correlations. Furthermore, graph neural networks are typically shallow, most having no more than three layers, which makes it difficult to obtain information from distant nodes. Hypergraphs provide an effective mathematical abstraction for these problems, but they usually incur significant information loss, since a hypergraph is typically obtained from the original graph structure by information extraction or similar means. Meanwhile, in strengthening the expressive power for multi-way relations, the hypergraph weakens the expressive power for classical binary relations. This poses serious challenges for hypergraph neural networks in learning complex and irregular representations. For both graph neural networks and hypergraph neural networks, these problems mean that features cannot be extracted well during node classification tasks, resulting in low classification accuracy and large accuracy fluctuations across different data sets.
Disclosure of Invention
In view of these problems, the invention provides a fusion-based hypergraph node classification method and device, which fuse the graph structure and the hypergraph structure through an effective fusion neural network framework and perform message passing on the fused structure. This addresses joint learning of binary and multi-way relations, the difficulty of node feature extraction in graph neural networks and hypergraph neural networks caused by the modelling limitations of graphs and hypergraphs, and the problems of low classification accuracy and poor model robustness across different data sets.
To achieve the above object, the present invention adopts the following technical solution:
a hypergraph node classification method based on fusion comprises the following steps:
fusing the graph and the hypergraph through the bipartite representation of the hypergraph;
performing message passing and message aggregation on the fused graph structure to obtain updated embedded vector representations of the nodes; wherein the fused graph structure comprises: a first-layer graph G1 composed of the nodes of the graph, a second-layer graph G2 composed of the supernodes of the hypergraph, and the interconnections AE12 between the first-layer graph G1 and the second-layer graph G2;
performing node classification based on the updated embedded vector representations of the nodes to obtain the hypergraph node classification result.
Further, the fusing the graph structure and the hypergraph structure through the bipartite representation of the hypergraph comprises:
aligning nodes of the graph and the hypergraph;
representing the hypergraph in bipartite form as (U, W, AE) by abstracting the hyperedges of the hypergraph into supernodes, wherein U denotes the node set corresponding to the graph structure and its elements correspond one-to-one to the nodes of the graph, W denotes the supernode set corresponding to the hypergraph structure, and AE denotes the membership relation between nodes and hyperedges in the hypergraph;
extracting the edge set E1 of the graph structure and the edge set E2 of the hypergraph structure;
mapping the node set U to one layer and combining it with the edge set E1 to generate the first-layer graph G1;
mapping the supernode set W to another layer and combining it with the edge set E2 to generate the second-layer graph G'2;
generating the interconnections AE12 between the first-layer graph G1 and the second-layer graph G2 based on the membership relation AE between nodes and hyperedges in the hypergraph;
generating the original fused graph structure UG' = (G1, G'2, AE12);
performing edge information completion on the second-layer graph G'2 to generate the fused graph structure UG = (G1, G2, AE12), wherein G2 is the information-completed second-layer graph.
Further, the method for aligning the nodes of the graph and the hypergraph comprises: remapping or data distillation.
Further, the method for extracting the edge set E1 of the graph structure and the edge set E2 of the hypergraph structure comprises: KNN clustering, graph structure learning or high-dimensional information extraction.
Further, the performing edge information completion on the second-layer graph G'2 comprises:
if two supernodes of the second-layer graph G'2 are both connected to at least one identical node of the first-layer graph G1, connecting the two supernodes.
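As a concrete illustration of this completion rule, the following minimal Python sketch (illustrative only, not part of the claimed implementation) assumes AE12 is stored as a set of (node, supernode) pairs:

# Hypothetical helper: two supernodes of G'2 are connected when they share at
# least one common neighbor node of G1 through AE12.
def should_connect(e_a: int, e_b: int, AE12: set) -> bool:
    members_a = {v for v, e in AE12 if e == e_a}
    members_b = {v for v, e in AE12 if e == e_b}
    return len(members_a & members_b) > 0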
Further, the obtaining the updated embedded vector representations of the nodes after message passing and message aggregation on the fused graph structure comprises:
in each iteration round, performing intra-layer message passing for the nodes of the first-layer graph G1 and the supernodes of the second-layer graph G2, and inter-layer message passing between the nodes of the first-layer graph G1 and the supernodes of the second-layer graph G2; then, for each node and supernode, aggregating the information obtained from the intra-layer and inter-layer message passing processes and updating it in combination with its previous state, so as to obtain the embedded vector representation of each node and supernode after that iteration round;
after the specified number of iteration rounds is completed, outputting the updated embedded vector representations of the nodes.
Further, the information obtained by a node of the first-layer graph G1 from the intra-layer message passing process is m^{t+1}_{1←1}(v_{1j}) = Σ_{v_{1i}∈N_1(v_{1j})} M^t_{1←1}(h^t_{v_{1j}}, h^t_{v_{1i}}), where t denotes the iteration round, v_{1j} denotes the node numbered j in the first-layer graph G1, v_{1i} ∈ N_1(v_{1j}) ranges over the neighbor node set of node v_{1j} in the first-layer graph G1, h^t_{v_{1j}} and h^t_{v_{1i}} denote the embedded vector representations of nodes v_{1j} and v_{1i} after the t-th iteration round, and M^t_{1←1} denotes the intra-layer message passing function of the first-layer graph G1 in the t-th iteration round.
Further, the information obtained by a node of the first-layer graph G1 from the inter-layer message passing process is m^{t+1}_{1←2}(v_{1j}) = Σ_{v_{2i}∈N_2(v_{1j})} M^t_{1←2}(h^t_{v_{1j}}, h^t_{v_{2i}}), where v_{2i} ∈ N_2(v_{1j}) ranges over the neighbor supernode set of node v_{1j} in the second-layer graph G2 (connected through AE12), h^t_{v_{2i}} denotes the embedded vector representation of supernode v_{2i} after the t-th iteration round, and M^t_{1←2} denotes the message passing function from the second-layer graph G2 to the first-layer graph G1 in the t-th iteration round.
Further, the embedded vector representation of the node after the (t+1)-th iteration round is h^{t+1}_{v_{1j}} = U^t_1(h^t_{v_{1j}}, γ·m^{t+1}_{1←1}(v_{1j}) + δ·m^{t+1}_{1←2}(v_{1j})), where U^t_1 denotes the message aggregation function for the first-layer graph information in the t-th iteration round, γ denotes the first weight coefficient, and δ denotes the second weight coefficient.
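The three formulas above can be read as the following minimal Python/NumPy sketch for a single first-layer node v_1j; here the message functions are simplified to identity maps and the aggregation to a weighted residual sum, which is an illustrative assumption rather than the mandated form:

import numpy as np

def update_first_layer_node(h_v, h1_neighbors, h2_neighbors, gamma=1.0, delta=1.0):
    # h_v: embedding of v_1j after round t
    # h1_neighbors: embeddings of its neighbor nodes inside G1 (intra-layer)
    # h2_neighbors: embeddings of its neighbor supernodes in G2 via AE12 (inter-layer)
    m_intra = np.sum(h1_neighbors, axis=0)   # m_{1<-1}^{t+1}(v_1j)
    m_inter = np.sum(h2_neighbors, axis=0)   # m_{1<-2}^{t+1}(v_1j)
    # U_1^t simplified here to: previous state plus weighted message sums
    return h_v + gamma * m_intra + delta * m_inter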
A fusion-based hypergraph node classification apparatus comprising:
the graph fusion module is used for fusing the graph and the hypergraph through the bipartite representation of the hypergraph;
the graph computation module is used for performing message passing and message aggregation on the fused graph structure to obtain updated embedded vector representations of the nodes; wherein the fused graph structure comprises: a first-layer graph G1 composed of the nodes of the graph, a second-layer graph G2 composed of the supernodes of the hypergraph, and the interconnections AE12 between the first-layer graph G1 and the second-layer graph G2;
the node classification module is used for performing node classification based on the updated embedded vector representations of the nodes to obtain the hypergraph node classification result.
The invention has the beneficial effects that:
1) Compared with existing graph neural networks and hypergraph neural networks, the invention can combine the information of the hypergraph and the graph structure and extract binary and multi-way relation information more effectively.
2) The invention uses several hyperparameters to dynamically adjust the weights of different kinds of information in the information aggregation process, which increases the proportion of beneficial information and further improves the accuracy of the model.
3) The classification accuracy on node classification tasks is higher, the accuracy fluctuation across different data sets is smaller, and good results are obtained on most data sets.
Drawings
Fig. 1 is a flowchart of a fusion method of a graph convolution network and a hypergraph convolution network according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of the unigraph structure according to the present invention.
Fig. 3 is a schematic diagram of the message passing and aggregation process according to the present invention.
Fig. 4 shows an accuracy comparison between the present invention and four prior algorithms.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
The technical scheme of the invention adopts an information fusion method: first, the graph structure and the hypergraph structure are fused through the bipartite representation of the hypergraph; second, message passing and message aggregation are performed on the fused graph structure and the embedded vectors of the nodes are updated; finally, the nodes are classified based on their vector representations.
Compared with conventional graph neural networks and hypergraph neural networks, the method can jointly exploit the information of the graph structure and of the hypergraph structure and combine binary and multi-way relations more effectively. On this basis it provides a new data structure, the unigraph, defines a message-passing graph convolutional neural network FGNN on the unigraph, and gives realizations of the message passing function and the message aggregation function used in the message passing process. The specific technical scheme is as follows:
step 1: the method of node alignment is used, the numbers of nodes of the graph and the hypergraph are in one-to-one correspondence according to label information of data in reality, node vector representation and other information, and the edge E in the graph is obtained through KNN clustering, graph structure learning, high-dimensional information extraction and other methods 1 And E is 2
The node labels in the graph and the hypergraph are reordered through the remapping, the data distillation and the like, so that the same node in different data sets can acquire the same number again. And edges in the graph are generated by KNN clustering, graph structure learning, high-dimensional information extraction and other methodsE 1 And E is 2
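A minimal sketch of the KNN-based edge extraction named above (illustrative only; the feature matrix X and the neighborhood size k are assumed inputs):

import numpy as np

def knn_edges(X, k=5):
    # Return undirected edges (i, j) linking each node to its k nearest neighbors
    # in feature space, using pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    dist = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(dist, np.inf)            # exclude self-loops
    edges = set()
    for i in range(X.shape[0]):
        for j in np.argsort(dist[i])[:k]:     # k closest nodes to node i
            edges.add((min(i, int(j)), max(i, int(j))))
    return edges

# e.g. E1 = knn_edges(node_features); E2 = knn_edges(supernode_features)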
Step 2: represent the hypergraph in bipartite form by abstracting the hyperedges of the hypergraph into supernodes; through this bipartite representation the hypergraph becomes a structure of interconnections between the nodes of two layers of graphs.
The hypergraph can be represented as (U, W, AE), in which U and W correspond to the sets of nodes and supernodes respectively, and AE represents the membership relation between nodes and hyperedges in the original hypergraph. The nodes of the two sets are placed in separate layers, and AE is added as the connection between the two levels of nodes.
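A minimal sketch of this bipartite representation, assuming the hypergraph is given as a mapping from hyperedge identifiers to lists of member node identifiers (an assumed input format):

def bipartite_representation(hyperedges, num_nodes):
    U = list(range(num_nodes))                    # first-level nodes (original nodes)
    W = list(hyperedges.keys())                   # second-level nodes, one per hyperedge
    # AE: one cross-level connection per (node, hyperedge) membership
    AE = {(v, e) for e, members in hyperedges.items() for v in members}
    return U, W, AE

# e.g. U, W, AE = bipartite_representation({0: [0, 1, 2], 1: [2, 3]}, num_nodes=4)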
Step 3: map the nodes and the supernodes into a two-layer graph structure, fuse the graph and the hypergraph into a unigraph structure according to the shared node set, and complete the structural information missing from the graph.
The unigraph structure is constructed as follows:
Step 3.1: the node set is mapped to one layer as the first-layer graph, and the supernode set is mapped to another layer as the second-layer graph. The two layers are expressed respectively as:
G1 = (V1, E1), G2 = (V2, E2)
Step 3.2: fusion is performed according to the first-layer node set V1 shared by the hypergraph and the graph. The graph is represented as the nodes of the first layer together with the edges between them, i.e. G1 = (V1, E1), while the hypergraph is represented as the nodes of the two-level graph structure together with the interconnections AE12 between them. The whole of G1, G2 and AE12 is defined as the unigraph, which can be expressed as UG = (G1, G2, AE12).
Step 3.3: complete the missing edge information of the second-layer graph: if two nodes of the second-layer graph are both connected to at least one identical node of the first-layer graph, the two second-layer nodes are connected by an edge.
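The assembly of steps 3.1 to 3.3 can be sketched as follows (illustrative only; E1, E2 and AE12 are the edge and interconnection sets defined above, with AE12 stored as (node, supernode) pairs):

from collections import defaultdict
from itertools import combinations

def build_unigraph(E1, E2, AE12):
    # Group the supernodes attached to each first-layer node through AE12.
    supernodes_of = defaultdict(set)
    for v, e in AE12:
        supernodes_of[v].add(e)
    # Step 3.3: connect any two supernodes sharing a common first-layer node.
    E2_completed = set(E2)
    for shared in supernodes_of.values():
        for e_a, e_b in combinations(sorted(shared), 2):
            E2_completed.add((e_a, e_b))
    # The unigraph keeps both intra-layer edge sets plus the cross-layer links.
    return {"E1": set(E1), "E2": E2_completed, "AE12": set(AE12)}

# e.g. unigraph = build_unigraph(E1, E2=set(), AE12=AE)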
Step 4: implement the graph convolutional neural network FGNN on the unigraph, realize the message passing and message aggregation process of graph convolution on the unigraph, and perform node classification through the iteratively updated vector representations.
The network structure of step 4 specifically includes:
step 4.1: first, intra-layer message transmission of the nodes is carried out, and messages which can be obtained by each node in the first layer diagram and the second layer diagram in the message transmission process are respectively
And
Wherein t expresses the iteration round of the neural network; n (N) k (v kj ) Representing node v in a k-layer graph kj A set of neighbor nodes in the k-th layer graph, k=1, 2;representing node v in a k-layer graph kj Feature vectors in iteration t. />Representing a message transfer function in a t-th round from a k-th layer to a first layer graph; />Representing node v in layer I lj The information sum is obtained from the k-th layer map in the t-th iteration.
Step 4.2: then, inter-layer message passing is performed. The message obtained by a node of the second-layer graph from the first-layer nodes, and the message obtained by a node of the first-layer graph from the second-layer nodes, are respectively
m^{t+1}_{2←1}(v_{2j}) = Σ_{v_{1i}∈N_1(v_{2j})} M^t_{2←1}(h^t_{v_{2j}}, h^t_{v_{1i}})
and
m^{t+1}_{1←2}(v_{1j}) = Σ_{v_{2i}∈N_2(v_{1j})} M^t_{1←2}(h^t_{v_{1j}}, h^t_{v_{2i}}),
where the cross-layer neighborhoods are given by the interconnections AE12.
Step 4.3: for each node, the information obtained from the intra-layer and inter-layer message passing processes is aggregated, and the node is updated in combination with its previous state. The message aggregation functions of the second-layer nodes and of the first-layer nodes are, respectively,
h^{t+1}_{v_{2j}} = U^t_2(h^t_{v_{2j}}, α·m^{t+1}_{2←2}(v_{2j}) + β·m^{t+1}_{2←1}(v_{2j}))
and
h^{t+1}_{v_{1j}} = U^t_1(h^t_{v_{1j}}, γ·m^{t+1}_{1←1}(v_{1j}) + δ·m^{t+1}_{1←2}(v_{1j})),
where α, β, γ and δ are the weights of the different kinds of information, and U^t_l denotes the message aggregation function of the l-th layer information in the t-th iteration round. The message aggregation of the first-layer nodes uses the freshly updated state information of the upper-layer (second-layer) nodes, so the message aggregation order of the whole model is from top to bottom; the message aggregation process is shown in Fig. 3. The model uses these hyperparameters to adjust the weights of the different kinds of information during message passing.
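One propagation round of this scheme can be sketched in PyTorch as follows (an illustrative sketch, not the claimed implementation): A1 and A2 are assumed to be dense adjacency matrices built from E1 and the completed E2, B is the |V1| x |V2| incidence matrix built from AE12, and the message functions are instantiated as plain linear maps, which is an assumption.

import torch
import torch.nn as nn

class FGNNLayer(nn.Module):
    def __init__(self, dim, alpha=1.0, beta=1.0, gamma=1.0, delta=1.0):
        super().__init__()
        self.msg_11 = nn.Linear(dim, dim)   # intra-layer messages, level 1
        self.msg_22 = nn.Linear(dim, dim)   # intra-layer messages, level 2
        self.msg_21 = nn.Linear(dim, dim)   # level 1 -> level 2
        self.msg_12 = nn.Linear(dim, dim)   # level 2 -> level 1
        self.upd_1 = nn.Linear(2 * dim, dim)
        self.upd_2 = nn.Linear(2 * dim, dim)
        self.alpha, self.beta, self.gamma, self.delta = alpha, beta, gamma, delta

    def forward(self, h1, h2, A1, A2, B):
        # Second-layer (supernode) update first: top-down aggregation order.
        m22 = A2 @ self.msg_22(h2)            # intra-layer messages among supernodes
        m21 = B.t() @ self.msg_21(h1)         # messages from first-layer nodes via AE12
        h2_new = torch.relu(self.upd_2(torch.cat([h2, self.alpha * m22 + self.beta * m21], dim=-1)))
        # First-layer update uses the freshly updated supernode states.
        m11 = A1 @ self.msg_11(h1)            # intra-layer messages among nodes
        m12 = B @ self.msg_12(h2_new)         # messages from supernodes via AE12
        h1_new = torch.relu(self.upd_1(torch.cat([h1, self.gamma * m11 + self.delta * m12], dim=-1)))
        return h1_new, h2_new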
Step 4.4: iterative updating is performed according to the message passing definitions above; the depth of the model is 2, and after 200 training rounds the updated node vector representations are passed through a normalization layer and a classifier.
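A hedged sketch of this training setup, reusing the FGNNLayer sketch above (the data tensors, labels and the training mask are assumed to be prepared elsewhere):

class FGNN(nn.Module):
    def __init__(self, dim, num_classes):
        super().__init__()
        self.layer1 = FGNNLayer(dim)              # model depth 2
        self.layer2 = FGNNLayer(dim)
        self.norm = nn.LayerNorm(dim)
        self.classifier = nn.Linear(dim, num_classes)

    def forward(self, h1, h2, A1, A2, B):
        h1, h2 = self.layer1(h1, h2, A1, A2, B)
        h1, h2 = self.layer2(h1, h2, A1, A2, B)
        return self.classifier(self.norm(h1))     # class scores for first-layer nodes

# model = FGNN(dim=h1.shape[1], num_classes=num_classes)
# optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
# for epoch in range(200):                        # 200 training rounds
#     optimizer.zero_grad()
#     logits = model(h1, h2, A1, A2, B)
#     loss = nn.functional.cross_entropy(logits[train_mask], labels[train_mask])
#     loss.backward()
#     optimizer.step()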
Step 5: conduct experiments on data sets such as Cora, Citeseer and Pubmed and compare the results; train the network model on the training set, select the model with the highest validation accuracy as the test model, and evaluate the classification performance on the test set.
More specifically, by comparing with existing graph neural network and hypergraph neural network methods on the public data sets Cora, Citeseer and Pubmed under the evaluation indexes of the node classification task, such as accuracy, recall, precision and F1 score, a notable improvement is obtained. Fig. 4 compares the accuracy of the method of the present application with AllSetTransformer (prior algorithm 1), AllDeepSets (prior algorithm 2), UniGCNII (prior algorithm 3) and HGNN (prior algorithm 4) on the public data sets Cora, Citeseer, Pubmed, Coauthorship-Cora, DBLP, House, Senate, NTU2012 and ModelNet40.
In summary, the technical scheme of the invention adopts an information fusion method: the graph structure and the hypergraph structure are fused through the bipartite representation of the hypergraph; message passing and message aggregation are then performed on the fused graph structure and the embedded vectors of the nodes are updated; finally, the nodes are classified based on their vector representations. The method and the device therefore address joint learning of binary and multi-way relations, the difficulty of node feature extraction in graph neural networks and hypergraph neural networks caused by the modelling limitations of graphs and hypergraphs, and the problems of low classification accuracy and poor model robustness across different data sets.
The foregoing is merely a preferred example of the present disclosure, and is not intended to limit the present disclosure, so that various modifications and changes may be made to the present disclosure by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.

Claims (10)

1. A fusion-based hypergraph node classification method, the method comprising:
fusing the graph and the hypergraph through the bipartite representation of the hypergraph;
performing message passing and message aggregation on the fused graph structure to obtain updated embedded vector representations of the nodes; wherein the fused graph structure comprises: a first-layer graph G1 composed of the nodes of the graph, a second-layer graph G2 composed of the supernodes of the hypergraph, and the interconnections AE12 between the first-layer graph G1 and the second-layer graph G2;
performing node classification based on the updated embedded vector representations of the nodes to obtain the hypergraph node classification result.
2. The method of claim 1, wherein fusing the graph structure and the hypergraph structure by the bipartite representation of the hypergraph comprises:
aligning nodes of the graph and the hypergraph;
representing the hypergraph in bipartite form as (U, W, AE) by abstracting the hyperedges of the hypergraph into supernodes, wherein U denotes the node set corresponding to the graph structure and its elements correspond one-to-one to the nodes of the graph, W denotes the supernode set corresponding to the hypergraph structure, and AE denotes the membership relation between nodes and hyperedges in the hypergraph;
extracting the edge set E1 of the graph structure and the edge set E2 of the hypergraph structure;
mapping the node set U to one layer and combining it with the edge set E1 to generate the first-layer graph G1;
mapping the supernode set W to another layer and combining it with the edge set E2 to generate the second-layer graph G'2;
generating the interconnections AE12 between the first-layer graph G1 and the second-layer graph G2 based on the membership relation AE between nodes and hyperedges in the hypergraph;
generating the original fused graph structure UG' = (G1, G'2, AE12);
performing edge information completion on the second-layer graph G'2 to generate the fused graph structure UG = (G1, G2, AE12), wherein G2 is the information-completed second-layer graph.
3. The method of claim 2, wherein the method of aligning nodes of the graph and hypergraph comprises: remapping or data distillation.
4. The method of claim 2, wherein the method for extracting the edge set E1 of the graph structure and the edge set E2 of the hypergraph structure comprises: KNN clustering, graph structure learning or high-dimensional information extraction.
5. The method according to claim 2, wherein the performing edge information completion on the second-layer graph G'2 comprises:
if two supernodes of the second-layer graph G'2 are both connected to at least one identical node of the first-layer graph G1, connecting the two supernodes.
6. The method of claim 1, wherein the obtaining the updated embedded vector representation of the node after the message passing and message aggregation on the fused graph structure comprises:
in each iteration round, performing intra-layer message passing for the nodes of the first-layer graph G1 and the supernodes of the second-layer graph G2, and inter-layer message passing between the nodes of the first-layer graph G1 and the supernodes of the second-layer graph G2; then, for each node and supernode, aggregating the information obtained from the intra-layer and inter-layer message passing processes and updating it in combination with its previous state, so as to obtain the embedded vector representation of each node and supernode after that iteration round;
after the specified number of iteration rounds is completed, outputting the updated embedded vector representations of the nodes.
7. The method of claim 6, wherein the information obtained by a node of the first-layer graph G1 from the intra-layer message passing process is m^{t+1}_{1←1}(v_{1j}) = Σ_{v_{1i}∈N_1(v_{1j})} M^t_{1←1}(h^t_{v_{1j}}, h^t_{v_{1i}}), wherein t denotes the iteration round, v_{1j} denotes the node numbered j in the first-layer graph G1, v_{1i} ∈ N_1(v_{1j}) ranges over the neighbor node set of node v_{1j} in the first-layer graph G1, h^t_{v_{1j}} and h^t_{v_{1i}} denote the embedded vector representations of nodes v_{1j} and v_{1i} after the t-th iteration round, and M^t_{1←1} denotes the intra-layer message passing function of the first-layer graph G1 in the t-th iteration round.
8. The method of claim 7, wherein the information obtained by a node of the first-layer graph G1 from the inter-layer message passing process is m^{t+1}_{1←2}(v_{1j}) = Σ_{v_{2i}∈N_2(v_{1j})} M^t_{1←2}(h^t_{v_{1j}}, h^t_{v_{2i}}), wherein v_{2i} ∈ N_2(v_{1j}) ranges over the neighbor supernode set of node v_{1j} in the second-layer graph G2, h^t_{v_{2i}} denotes the embedded vector representation of supernode v_{2i} after the t-th iteration round, and M^t_{1←2} denotes the message passing function from the second-layer graph G2 to the first-layer graph G1 in the t-th iteration round.
9. The method of claim 8, wherein the embedded vector representation of the node after the (t+1)-th iteration round is h^{t+1}_{v_{1j}} = U^t_1(h^t_{v_{1j}}, γ·m^{t+1}_{1←1}(v_{1j}) + δ·m^{t+1}_{1←2}(v_{1j})), wherein U^t_1 denotes the message aggregation function of the first-layer graph information in the t-th iteration round, γ denotes the first weight coefficient, and δ denotes the second weight coefficient.
10. A fusion-based hypergraph node classification apparatus, the apparatus comprising:
the graph fusion module is used for fusing the graph and the hypergraph through the bipartite representation of the hypergraph;
the graph computation module is used for performing message passing and message aggregation on the fused graph structure to obtain updated embedded vector representations of the nodes; wherein the fused graph structure comprises: a first-layer graph G1 composed of the nodes of the graph, a second-layer graph G2 composed of the supernodes of the hypergraph, and the interconnections AE12 between the first-layer graph G1 and the second-layer graph G2;
the node classification module is used for performing node classification based on the updated embedded vector representations of the nodes to obtain the hypergraph node classification result.
CN202410036694.8A 2024-01-10 2024-01-10 Hypergraph node classification method and hypergraph node classification device based on fusion Pending CN117851888A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410036694.8A CN117851888A (en) 2024-01-10 2024-01-10 Hypergraph node classification method and hypergraph node classification device based on fusion


Publications (1)

Publication Number Publication Date
CN117851888A 2024-04-09

Family

ID=90545879




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination