CN112328578B - Database query optimization method based on reinforcement learning and graph attention network - Google Patents

Database query optimization method based on reinforcement learning and graph attention network

Info

Publication number
CN112328578B
CN112328578B
Authority
CN
China
Prior art keywords
node
query
bits
network
graph
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011351761.3A
Other languages
Chinese (zh)
Other versions
CN112328578A (en)
Inventor
詹思瑜
周维清
王玉林
卢国明
戴波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN202011351761.3A priority Critical patent/CN112328578B/en
Publication of CN112328578A publication Critical patent/CN112328578A/en
Application granted granted Critical
Publication of CN112328578B publication Critical patent/CN112328578B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/21Design, administration or maintenance of databases
    • G06F16/217Database tuning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/22Indexing; Data structures therefor; Storage structures
    • G06F16/2228Indexing structures
    • G06F16/2246Trees, e.g. B+trees
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/242Query formulation
    • G06F16/2433Query languages
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25Integrating or interfacing systems involving database management systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention relates to the technical field of databases and provides a database query optimization method based on reinforcement learning and a graph attention network, aiming to solve the technical problem that, when the join relations of a query statement are very complex, the query execution plan space becomes huge and searching the whole space consumes a large amount of time. Query statements are randomly generated in the database and executed; the execution plan tree corresponding to each query statement is split from the root node, and the join relation of each node is recorded. The Q-network parameters w in a DQN model are initialized, where the Q-network adopts a GAT graph attention network that takes the encoding feature matrix and the graph description set Edge as network input, and the DQN model is trained. For a new query statement, the graph description and encoding are initialized, and join decisions are generated with the DQN model obtained by training in step 2 until all tables are joined, producing a complete query plan.

Description

Database query optimization method based on reinforcement learning and graph attention network
Technical Field
The invention relates to a database query optimization method based on reinforcement learning and a graph attention network. For large-scale multi-join queries, a better database query execution plan can be obtained in a shorter time, thereby reducing the execution time of queries in the database.
Background
A query statement cannot be executed by the database directly. The database first parses the query statement, then the optimizer generates a corresponding query execution plan, and finally the plan is handed to the execution engine for execution. The invention provides an effective solution for generating a better query plan for multi-join queries in a shorter time.
The two prior-art technical schemes most similar to the one proposed in the present application are:
1. chinese invention patent, patent name: a database multi-connection query optimization method based on an improved SDD-1 algorithm is disclosed, and the application number is as follows: CN201110043615.9.
First, the improved SDD-1 algorithm is executed to obtain a set of query execution strategies, which serves as the basis for generating the initial population of a genetic algorithm. The genetic algorithm is then run, using its global search capability to optimize the result obtained by the SDD-1 algorithm, and a relatively ideal query execution strategy is finally obtained. The method comprises the following steps:
Step 1: Set the initial parameters, including those of the SDD-1 and genetic algorithms;
Step 2: Obtain the query execution strategy set: search the constructed query graph for beneficial bidirectional semi-joins, select beneficial bidirectional semi-joins from the candidate set and add them to the beneficial bidirectional semi-join set BS, repeating until no beneficial bidirectional semi-join remains in the query graph; add the value of the obtained set BS to the execution strategy set ES, and repeat the whole procedure until it has been run N times;
Step 3: Construct the initial population of the genetic algorithm: encode the elements of the execution strategy set ES in turn and use the results as the initial population;
Step 4: Run the genetic algorithm: repeatedly apply crossover, mutation and selection operations to the population until it has been run M times;
Step 5: Output the query execution strategy: output the best individual in the population as the final result and decode it into a query tree, i.e., the query execution strategy. A minimal sketch of this genetic-algorithm stage is given below.
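The following is a minimal Python sketch of the genetic-algorithm stage in Steps 3-5, written only to illustrate the idea; the function name optimize_join_order, the cost() callback and the permutation encoding of join orders are assumptions of this sketch and are not taken from CN201110043615.9, which seeds the population with SDD-1 results and uses its own cost model.

```python
import random

def optimize_join_order(tables, cost, initial_population,
                        generations=50, pop_size=20, mutation_rate=0.1):
    """Evolve join orders (permutations of table indexes) by selection,
    crossover and mutation; cost(order) is an assumed callback estimating
    the execution cost of the plan built in that order."""
    population = [list(ind) for ind in initial_population][:pop_size]
    while len(population) < pop_size:              # pad with random orders if needed
        population.append(random.sample(list(tables), len(tables)))

    def crossover(a, b):                           # order crossover: keep a slice of a,
        i, j = sorted(random.sample(range(len(a)), 2))
        child = a[i:j]                             # fill the rest in b's order
        child += [t for t in b if t not in child]
        return child

    def mutate(order):                             # occasionally swap two positions
        if random.random() < mutation_rate:
            i, j = random.sample(range(len(order)), 2)
            order[i], order[j] = order[j], order[i]
        return order

    for _ in range(generations):                   # Step 4: crossover, mutation, selection
        parents = sorted(population, key=cost)[:pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children

    return min(population, key=cost)               # Step 5: best individual found
```

Selection, crossover and mutation are local, greedy moves over the population, which is exactly why this style of search can stall in a local optimum, as noted in the defects below.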
2. Chinese invention patent, title: "Big data real-time query optimization method based on hypergraph and dynamic programming", application number CN201020231887.2.
This big data real-time query optimization method based on hypergraph and dynamic programming comprises an optimal-cost-model construction process and an execution-plan-space search process. The optimal-cost-model construction process comprises the following steps:
Step 1: Analyze the table data in the metadata server, construct fine-grained column-level statistics histograms, and store them in the metadata server;
and 2, step: and constructing a corresponding optimal cost model for use in generating a plan by using the statistical information.
The execution-plan-space search process comprises the following steps:
step 1: and analyzing the database query statement, and storing and querying the result in the hypergraph data structure.
And 2, step: the execution plan is initially set for a single relationship and saved in the corresponding dynamic schedule.
And step 3: a compute enumeration policy is defined: each connected subgraph and the connected complement set are generated only once;
Step 4: Enumerate connected subgraphs over the computation domain;
Step 5: Find a suitable connected complement set for each connected subgraph;
Step 6: Compute the cost of the execution plan formed by each pair of connected subgraph and complement set, and update the execution plan according to the cost model;
and 7: and (5) repeating the step (4) to the step (7) until the execution plan space formed by the whole left linear tree is searched, and generating an execution plan tree.
The first technical scheme has the following defects:
1. The genetic algorithm is essentially a greedy strategy and easily falls into a locally optimal solution;
2. The scheme requires setting the number of genetic-algorithm iterations in advance. When the number of iterations is too small, a good query execution plan cannot be obtained; when it is large, the algorithm takes a long time to run and may still be trapped in a local optimum;
3. The query execution plan is encoded one-dimensionally, so the tree-structure information of the plan cannot be captured.
The second technical scheme has the following disadvantages:
1. A left-deep linear tree is used to enumerate and search the whole query execution plan space; when the join relations of the query statement are very complex, this space becomes huge and searching all of it consumes a great amount of time.
Disclosure of Invention
The invention aims to solve the technical problem that, when the join relations of a query statement are very complex, the query execution plan space becomes huge and searching the whole space consumes a great amount of time.
In order to solve this technical problem, the invention adopts the following technical scheme:
the invention provides a database query optimization method based on reinforcement learning and a graph attention network, which comprises the following specific steps of:
step 1: data collection, namely randomly generating query statements in a database and executing the query statements, splitting an execution plan tree corresponding to the query statements from a root node, and recording the connection relation of each node;
step 2: model training, namely performing coding description on each node according to the connection relation of each node to obtain a coding characteristic matrix, performing graph description on each node to obtain a graph description set Edge, initializing a Q-network parameter w in the DQN model, adopting a GAT (generic object model) attention network for the Q-network in the DQN model, taking the coding characteristic matrix and the graph description set Edge as network input, and training the DQN model;
and step 3: and (3) model application, namely taking each table as a node for the tables related to a query statement, initializing the graph description and the code of each node, selecting the connection of 2 nodes in each step in the query execution plan, generating by using the DQN model obtained by training in the step 2, and at the moment, carrying out state transition, updating the graph description and the code description until all the tables are connected to generate the complete query plan.
In the above technical solution, the graph description is as follows:
The n tables involved in the query statement are numbered [1, 2, 3, 4, …, n]; each table is taken as a node index, so there are n initial node indexes. All currently available node indexes are stored in the set Node = [1, 2, …, n], and the edge set Edge is initialized to be empty;
Select 2 nodes i and j in the Node set that are not yet marked as joined, add the query plan joining nodes i and j to the Node set as a new node max(index)+1, where index ∈ Node, and mark nodes i and j as joined; represent nodes i, j and the new node max(index)+1 as the two edges (i, max(index)+1) and (j, max(index)+1), and add these edges to the edge set to obtain the graph description set Edge.
In the above technical solution, the encoding description is as follows:
For each node, the encoding has n+m+k bits: the first n bits are the one-hot code of the tables involved in the current node, the middle m bits are the one-hot code of the column attributes involved in the current node, and the last k bits are the one-hot code of the join type. At initialization, the first n+m bits of each node's encoding are assigned and the last k bits are set to 0. When a join operation is performed, an encoding is added for the new node: its first n+m bits are the bitwise OR of the corresponding bits of the 2 joined nodes, and its last k bits carry the one-hot code of the join type. This finally yields the encoding feature matrix. A sketch of both descriptions is given below.
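A minimal NumPy sketch of the graph description and the (n+m+k)-bit encoding described above; the class name PlanState, the 0-based node indexes and the constructor arguments are illustrative assumptions of the sketch, not terms from the patent.

```python
import numpy as np

class PlanState:
    """Graph description (Node / Edge sets) plus the (n+m+k)-bit node encodings."""

    def __init__(self, table_onehots, column_onehots, k):
        # table_onehots[i]: n-bit one-hot of table i; column_onehots[i]: m-bit
        # (multi-)hot of the columns of table i referenced by the query.
        self.n, self.m, self.k = len(table_onehots), len(column_onehots[0]), k
        self.codes = [np.concatenate([np.asarray(t), np.asarray(c), np.zeros(k)])
                      for t, c in zip(table_onehots, column_onehots)]   # last k bits start at 0
        self.nodes = list(range(len(table_onehots)))   # Node = [0, 1, ..., n-1] (0-based here)
        self.edges = []                                # Edge starts empty
        self.unjoined = set(self.nodes)                # nodes not yet marked as joined

    def join(self, i, j, join_type):
        """Join nodes i and j: add node max(index)+1, two edges, and its encoding."""
        new = max(self.nodes) + 1
        self.nodes.append(new)
        self.edges += [(i, new), (j, new)]             # graph description update
        code = np.zeros(self.n + self.m + self.k)
        code[:self.n + self.m] = np.maximum(self.codes[i][:self.n + self.m],
                                            self.codes[j][:self.n + self.m])  # bitwise OR
        code[self.n + self.m + join_type] = 1          # one-hot of the join type
        self.codes.append(code)
        self.unjoined -= {i, j}
        self.unjoined.add(new)
        return new

    def feature_matrix(self):
        return np.stack(self.codes)                    # one row of n+m+k bits per node
```

Calling feature_matrix() after each join() yields the encoding feature matrix and the edge list that, in this sketch, would be fed to the Q-network in Step 2.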
Because the invention adopts the above technical scheme, it has the following beneficial effects:
1. Compared with the dynamic programming algorithm: dynamic programming has to search all possible query execution plans, and the size of that solution space grows exponentially with the number of joins. Although dynamic programming can obtain the optimal query execution plan, obtaining it may take a great deal of time. With the reinforcement-learning DQN model, the model is trained offline, so training time need not be counted against query optimization; when the DQN model is used to decide a query execution plan, the number of algorithm executions is linear in the number of joins: for n joins, at most n-1 executions are needed to obtain a good query execution plan.
2. Compared with intelligent algorithms such as the genetic algorithm: as noted above, the number of iterations must be set in advance, yet no single setting suits all query statements; if the number of iterations is too small, a good solution cannot be obtained, and if it is too large, a great deal of time is consumed while the search may still fall into a local optimum. With the DQN model, each step is selected by a Q-network trained on a large amount of data, so the search is less prone to local optima and the result is obtained with at most n-1 executions of the algorithm.
3. Compared with ordinary reinforcement learning algorithms: an ordinary fully connected Q-network can describe neither the tree-structure characteristics of the query execution plan nor the fact that the join selected at each step influences the final query execution plan to a different degree. Replacing the ordinary fully connected Q-network in the DQN model with a GAT graph attention network allows both of these to be described better.
Drawings
FIG. 1 is a schematic view of a model structure;
FIG. 2 is a schematic diagram of an execution plan tree split from a root node.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration only and not by way of limitation; that is, the described embodiments are only a selection of the embodiments of the invention, not all of them. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, as presented in the figures, is not intended to limit the scope of the invention as claimed, but is merely representative of selected embodiments of the invention. All other embodiments obtained by a person skilled in the art from the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
In the reinforcement-learning DQN model, the most important part is predicting, through the Q-network, the long-term impact of different single-step selections; the model always takes the selection with the best long-term impact and is thus expected to finally obtain the best query execution plan. The correctness of this long-term-impact prediction for each single-step selection is therefore crucial. An ordinary Q-network is a fully connected network and can only take one-dimensional data as input. Using one-dimensional input causes many problems in describing the current query execution plan: with common one-dimensional encodings, some different query execution plans share the same encoding, and since the same input always produces the same output, training on such data makes the Q-network's estimate of the long-term impact of a join selection inaccurate. Therefore, to capture the structural information of the whole query execution plan, a GAT graph attention network is chosen as the Q-network instead of an ordinary fully connected network. This network structure makes the trained model's estimate of the long-term impact of single-step join selections more accurate, so the best join can also be selected more accurately each time the DQN model is used.
A query statement must be executed by the database according to its query execution plan. Query statements involve many table-join operations, and table joins are commutative and associative. Therefore, the query execution plan must decide, at every step, which two parts are joined (each part may be a table or the result of previously joined tables); once every table has been joined, the complete query execution plan is formed. The construction of the query execution plan tree is abstracted as a Markov decision process: in the start state, no tables are joined, and two tables must be selected to join. After the selection, the two parts are replaced by their join result and the next state is entered, where the join result produced in the previous step may itself be selected; that is, each join selection may involve two tables, one table and one join result, or two join results. When only one join result remains, a complete query plan has been generated. A minimal sketch of this decision process follows.
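The following Python fragment sketches only the Markov decision process described above, with states as sets of not-yet-joined parts and actions as pairs of parts; the helper names and the use of frozensets are assumptions of the sketch, and the reward signal derived from plan cost is omitted.

```python
from itertools import combinations

def initial_state(tables):
    """Start state: every table is its own unjoined part."""
    return frozenset(frozenset([t]) for t in tables)

def actions(state):
    """Each action joins two of the current parts (tables or earlier join results)."""
    return list(combinations(state, 2))

def step(state, action):
    """Apply a join: the two chosen parts are replaced by their union (the join result)."""
    a, b = action
    next_state = (state - {a, b}) | {a | b}
    done = len(next_state) == 1        # a single remaining result = complete plan
    return next_state, done
```

Starting from initial_state over n tables and calling step repeatedly, done becomes True after exactly n-1 joins, which is why a plan is completed in at most n-1 decisions.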
The invention specifically provides a database query optimization method based on reinforcement learning and a graph attention network, characterized by comprising the following specific steps:
Step 1: Data collection: randomly generate query statements in the database and execute them, split the execution plan tree corresponding to each query statement from the root node, and record the join relation of each node. As shown in FIG. 2, a query execution plan containing 6 nodes is illustrated, where node 6 is the root node; each split deletes the current root node, and the splitting ends when every remaining node is a leaf node. A sketch of this splitting is given below.
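A minimal sketch of the split used during data collection, assuming a hypothetical plan representation in which a leaf is a table name and an internal join node is a (left, right) tuple; real execution plans returned by a database would of course need their own parsing.

```python
def split_plan_tree(root):
    """Repeatedly delete the current root and record which two children it joined,
    until only leaf nodes remain; a leaf is a table name, an internal join node is
    a (left, right) tuple in this hypothetical representation."""
    joins = []
    frontier = [root]
    while frontier:
        node = frontier.pop()
        if isinstance(node, tuple):        # internal node: record its join relation
            left, right = node
            joins.append((left, right))
            frontier += [left, right]      # deleting the root exposes its two children
    return joins

# Example: split_plan_tree(((("A", "B"), "C"), "D")) records the three joins of the plan.
```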
Step 2: Model training: produce an encoding description of each node according to its join relation to obtain the encoding feature matrix, produce a graph description of each node to obtain the graph description set Edge, and initialize the Q-network parameters w in the DQN model, where the Q-network adopts a GAT graph attention network; take the encoding feature matrix and the graph description set Edge as network input and train the DQN model;
Step 3: Model application: for the tables involved in a query statement, take each table as a node and initialize the graph description and encoding of each node; at every step of building the query execution plan, the join of 2 nodes to be selected is generated by the DQN model trained in Step 2, the state then transitions, and the graph description and encoding description are updated, until all tables are joined and a complete query plan is generated.
The GAT graph attention network is composed of 4 layers: an attention convolution layer, an activation layer, an attention convolution layer and an activation layer. The first attention convolution layer receives as input the next state state_{t+1} determined by state_t and action_t. To describe the query execution plan state_t as data that the graph attention network can process, state_t is represented by two parts, a graph description and an encoding description:
the figures are described as follows:
the number of n tables related to the query statement is represented as [1,2,3, 4' \ 8230n ]; n ], each table is used as a Node index, the number of initial Node indexes is n, all current common Node index sets Node [1,2,. Once, n ]) are stored, and the initial Edge set Edge is null;
selecting 2 nodes i and j which are not marked as connected query plans from the Node set, adding the query plans connected with the nodes i and j into the Node set as a new Node max (index) +1, wherein the index belongs to the Node, marking the nodes i and j as connected query plans, representing the nodes i, j and the new Node max (index) +1 into two edges (i, max (index) + 1) and (j, max (index) + j), and adding the edges into the Edge set to obtain a graph description set Edge.
The encoding description is as follows:
In the query execution plan, any node i is joined at a later stage with some query plan node, and the result of each such query plan node serves as node j;
For each node, the encoding has n+m+k bits: the first n bits are the one-hot code of the tables involved in the current node, the middle m bits are the one-hot code of the column attributes involved in the current node, and the last k bits are the one-hot code of the join type. At initialization, the first n+m bits of each node's encoding are assigned and the last k bits are set to 0. When a join operation is performed, an encoding is added for the new node: its first n+m bits are the bitwise OR of the corresponding bits of the 2 joined nodes, and its last k bits carry the one-hot code of the join type. This finally yields the encoding feature matrix. A sketch of a GAT Q-network that consumes the encoding feature matrix and the Edge set is given below.
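As an illustration only, the following PyTorch sketch stacks two graph-attention convolution layers with ELU activations, matching the 4-layer structure described above; it assumes the PyTorch Geometric GATConv layer is available, and the hidden dimension, head count and the dot-product readout used to score a candidate join are assumptions of the sketch rather than details taken from the patent.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv   # assumes PyTorch Geometric is installed

class GATQNetwork(torch.nn.Module):
    """Attention conv -> ELU -> attention conv -> ELU over the plan graph."""

    def __init__(self, in_dim, hidden_dim=64, heads=4):
        super().__init__()
        self.conv1 = GATConv(in_dim, hidden_dim, heads=heads)        # in_dim = n+m+k
        self.conv2 = GATConv(hidden_dim * heads, hidden_dim)

    def forward(self, x, edge_index):
        # x: [num_nodes, n+m+k] encoding feature matrix
        # edge_index: [2, num_edges] tensor built from the graph description set Edge
        h = F.elu(self.conv1(x, edge_index))
        h = F.elu(self.conv2(h, edge_index))
        return h                                                     # one embedding per node

    def q_value(self, x, edge_index, i, j):
        """Assumed readout: score the action 'join node i with node j'."""
        h = self.forward(x, edge_index)
        return (h[i] * h[j]).sum()        # dot product of the two node embeddings
```

During training, the DQN target for q_value would be built in the usual way from the observed reward (derived from plan cost or latency) plus the discounted best Q-value of the next state.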
Because the invention adopts the above technical scheme, it has the following features:
1. In Step 1, enough data can be collected so that a model with sufficient generalization capability can be obtained in the subsequent model training process;
2. In Step 1, all operations are performed offline, so although collecting a large amount of data takes time, this does not affect the time needed to generate a query plan with the model in Step 3;
3. In Step 2, the DQN model is used as the overall framework, so the join selection at each step is not judged only on whether the single step is good in isolation but on which selection is most beneficial to the whole query execution plan, making it less likely to fall into a local optimum;
4. In Step 2, the ordinary fully connected network in the DQN model is replaced by the GAT graph attention network as the Q-network, which can capture the tree-structure information of the query execution plan;
5. In Step 3, for any query statement, the time to generate a complete query execution plan is linear in the number of joins of the query statement, so a good execution plan can be generated in a very short time. An end-to-end application sketch is given below.
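To tie the steps together, here is a minimal greedy application loop for Step 3; generate_plan, q_value and apply_join are hypothetical names, where q_value is assumed to wrap the trained Q-network and apply_join is assumed to perform the state transition and the graph/encoding updates described above.

```python
from itertools import combinations

def generate_plan(tables, q_value, apply_join):
    """Greedy Step-3 loop: at every step, score all pairs of not-yet-joined nodes
    with the trained Q-network and join the best pair; q_value(state, i, j) and
    apply_join(state, i, j) are assumed callbacks wrapping the DQN model and the
    graph/encoding state update."""
    state = {"unjoined": set(tables), "joins": []}
    while len(state["unjoined"]) > 1:                         # at most n-1 decisions
        i, j = max(combinations(state["unjoined"], 2),
                   key=lambda pair: q_value(state, *pair))
        state = apply_join(state, i, j)                       # state transition + updates
    return state["joins"]                                     # the complete query plan
```

Each iteration removes one candidate part, so the loop runs at most n-1 times for n tables, matching the linear behaviour claimed in feature 5.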

Claims (1)

1. A database query optimization method based on reinforcement learning and a graph attention network, characterized by comprising the following specific steps:
Step 1: Data collection: randomly generate query statements in the database and execute them, split the execution plan tree corresponding to each query statement from the root node, and record the join relation of each node;
Step 2: Model training: produce an encoding description of each node according to its join relation to obtain an encoding feature matrix, produce a graph description of each node to obtain a graph description set Edge, and initialize the Q-network parameters w in the DQN model, where the Q-network adopts a GAT graph attention network; take the encoding feature matrix and the graph description set Edge as network input and train the DQN model;
Step 3: Model application: for the tables involved in a query statement, take each table as a node and initialize the graph description and encoding of each node; at every step of building the query execution plan, the join of 2 nodes to be selected is generated by the DQN model trained in Step 2, the state then transitions, and the graph description and encoding description are updated, until all tables are joined and a complete query plan is generated;
The graph description is as follows:
The n tables involved in the query statement are numbered [1, 2, 3, 4, …, n]; each table is taken as a node index, so there are n initial node indexes; all currently available node indexes are stored in the set Node = [1, 2, …, n], and the edge set Edge is initialized to be empty;
Select 2 nodes i and j in the Node set that are not yet marked as joined, add the query plan joining nodes i and j to the Node set as a new node max(index)+1, where index ∈ Node, and mark nodes i and j as joined; represent nodes i, j and the new node max(index)+1 as the two edges (i, max(index)+1) and (j, max(index)+1), and add these edges to the edge set to obtain the graph description set Edge;
The encoding description is as follows:
For each node, the encoding has n+m+k bits: the first n bits are the one-hot code of the tables involved in the current node, the middle m bits are the one-hot code of the column attributes involved in the current node, and the last k bits are the one-hot code of the join type; at initialization, the first n+m bits of each node's encoding are assigned and the last k bits are set to 0; when a join operation is performed, an encoding is added for the new node, whose first n+m bits are the bitwise OR of the corresponding bits of the 2 joined nodes and whose last k bits carry the one-hot code of the join type, finally yielding the encoding feature matrix.
CN202011351761.3A 2020-11-26 2020-11-26 Database query optimization method based on reinforcement learning and graph attention network Active CN112328578B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011351761.3A CN112328578B (en) 2020-11-26 2020-11-26 Database query optimization method based on reinforcement learning and graph attention network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011351761.3A CN112328578B (en) 2020-11-26 2020-11-26 Database query optimization method based on reinforcement learning and graph attention network

Publications (2)

Publication Number Publication Date
CN112328578A CN112328578A (en) 2021-02-05
CN112328578B true CN112328578B (en) 2023-03-28

Family

ID=74309558

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011351761.3A Active CN112328578B (en) 2020-11-26 2020-11-26 Database query optimization method based on reinforcement learning and graph attention network

Country Status (1)

Country Link
CN (1) CN112328578B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112883066B (en) * 2021-03-29 2022-07-08 电子科技大学 Method for estimating multi-dimensional range query cardinality on database
CN112988802B (en) * 2021-04-29 2022-07-15 电子科技大学 Relational database query optimization method and system based on reinforcement learning
CN113010547B (en) * 2021-05-06 2023-04-07 电子科技大学 Database query optimization method and system based on graph neural network
CN115168408A (en) * 2022-08-16 2022-10-11 北京永洪商智科技有限公司 Query optimization method, device, equipment and storage medium based on reinforcement learning
CN116383454B (en) * 2023-04-10 2024-01-30 星环信息科技(上海)股份有限公司 Data query method of graph database, electronic equipment and storage medium
CN116561173B (en) * 2023-07-11 2023-10-13 太原理工大学 Method and system for selecting query execution plan by using relational graph and attention neural network

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106294888A (en) * 2016-10-24 2017-01-04 北京亚控科技发展有限公司 A kind of method for subscribing of object data based on space-time database
CN110084245A (en) * 2019-04-04 2019-08-02 中国科学院自动化研究所 The Weakly supervised image detecting method of view-based access control model attention mechanism intensified learning, system
CN111581454A (en) * 2020-04-27 2020-08-25 清华大学 Depth map compression algorithm-based parallel query expression prediction system and method
CN111611274A (en) * 2020-05-28 2020-09-01 华中科技大学 Database query optimization method and system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8457781B2 (en) * 2007-09-13 2013-06-04 Lockheed Martin Corporation Facility wide mixed mail sorting and/or sequencing system and components and methods thereof
US11699083B2 (en) * 2018-06-08 2023-07-11 The United States Of America, As Represented By The Secretary Of The Navy Swarm system including an operator control section enabling operator input of mission objectives and responses to advice requests from a heterogeneous multi-agent population including information fusion, control diffusion, and operator infusion agents that controls platforms, effectors, and sensors
US20200327118A1 (en) * 2020-06-27 2020-10-15 Intel Corporation Similarity search using guided reinforcement learning

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106294888A (en) * 2016-10-24 2017-01-04 北京亚控科技发展有限公司 A kind of method for subscribing of object data based on space-time database
CN110084245A (en) * 2019-04-04 2019-08-02 中国科学院自动化研究所 The Weakly supervised image detecting method of view-based access control model attention mechanism intensified learning, system
CN111581454A (en) * 2020-04-27 2020-08-25 清华大学 Depth map compression algorithm-based parallel query expression prediction system and method
CN111611274A (en) * 2020-05-28 2020-09-01 华中科技大学 Database query optimization method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Weiqing Zhou et al. SOAR: a learned join order selector with graph attention mechanism. 2022 International Joint Conference on Neural Networks, 2022, pp. 1-14. *

Also Published As

Publication number Publication date
CN112328578A (en) 2021-02-05

Similar Documents

Publication Publication Date Title
CN112328578B (en) Database query optimization method based on reinforcement learning and graph attention network
CN111597209B (en) Database materialized view construction system, method and system creation method
Zou et al. Finding top-k maximal cliques in an uncertain graph
CN111597347B (en) Knowledge embedding defect report reconstruction method and device
CN111428054A (en) Construction and storage method of knowledge graph in network space security field
CN111581454B (en) Parallel query performance prediction system and method based on depth map compression algorithm
CN113010547B (en) Database query optimization method and system based on graph neural network
CN113515539B (en) Method for inquiring data in database
CN113535972B (en) Knowledge graph link prediction model method and device fusing context semantics
CN101944141A (en) High-efficiency global optimization method using adaptive radial basis function based on fuzzy clustering
Wang et al. 3DM: domain-oriented data-driven data mining
EP4075292A1 (en) Method and apparatus for processing database
CN104504018A (en) Top-down real-time big data query optimization method based on bushy tree
CN105335510A (en) Text data efficient searching method
CN114911844B (en) Approximate query optimization system based on machine learning
CN104182489B (en) A kind of inquiry processing method of text big data
CN105160046A (en) Text-based data retrieval method
Zou et al. Survey on learnable databases: A machine learning perspective
CN113515540A (en) Query rewriting method for database
Fang et al. A query-level distributed database tuning system with machine learning
CN115185728A (en) Software system architecture recovery method based on graph node embedding
Lu et al. Assessment of urban water supply system based on query optimization strategy
CN117390064B (en) Database query optimization method based on embeddable subgraph
CN116483863A (en) Query optimization method based on tree attention and radix perception and storage medium
CN112650770B (en) MySQL parameter recommendation method based on query work load analysis

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant