CN115982736B - Data encryption method and system for computer network information - Google Patents


Info

Publication number
CN115982736B
Authority
CN
China
Prior art keywords
encrypted
feature
vectors
vector
feature vectors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211649545.6A
Other languages
Chinese (zh)
Other versions
CN115982736A (en)
Inventor
段新东 (Duan Xindong)
赵飞 (Zhao Fei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanyang Institute of Technology
Original Assignee
Nanyang Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanyang Institute of Technology filed Critical Nanyang Institute of Technology
Priority to CN202211649545.6A
Publication of CN115982736A
Application granted
Publication of CN115982736B
Legal status: Active


Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 — Reducing energy consumption in communication networks
    • Y02D30/50 — Reducing energy consumption in communication networks in wire-line communication networks, e.g. low power modes or reduced link rate

Landscapes

  • Storage Device Security (AREA)

Abstract

The application discloses a data encryption method and system for computer network information. The method first passes each piece of acquired information to be encrypted through a context encoder to obtain a plurality of semantic feature vectors to be encrypted. The plurality of semantic feature vectors are then two-dimensionally matrixed and passed through a convolutional neural network model to obtain a global semantic feature matrix to be encrypted. Next, each semantic feature vector to be encrypted is taken as a query feature vector, and its product with the global semantic feature matrix is calculated to obtain a plurality of classification feature vectors. Structural fuzzy correction is then performed on each of the plurality of classification feature vectors, and the corrected classification feature vectors are passed through a classifier to obtain a plurality of probability values. Finally, the encryption priority of the plurality of pieces of information to be encrypted is determined based on the ordering among the plurality of probability values. In this way, the encryption control effect can be improved.

Description

Data encryption method and system for computer network information
Technical Field
The present application relates to the field of data encryption technology, and more particularly, to a data encryption method and system for computer network information.
Background
In the application of computer network data, data security is an important guarantee, and encrypting the data is an important means of ensuring data security. However, in the prior art, when data encryption resources conflict with each other, data encryption processing is generally performed in a random order, so that the control of data encryption is ineffective.
Accordingly, an optimized data encryption scheme for computer network information is desired.
Disclosure of Invention
The present application has been made in order to solve the above technical problems. The embodiment of the application provides a data encryption method and system for computer network information. The method first passes each piece of acquired information to be encrypted through a context encoder to obtain a plurality of semantic feature vectors to be encrypted. The plurality of semantic feature vectors are then two-dimensionally matrixed and passed through a convolutional neural network model to obtain a global semantic feature matrix to be encrypted. Next, each semantic feature vector to be encrypted is taken as a query feature vector, and its product with the global semantic feature matrix is calculated to obtain a plurality of classification feature vectors. Structural fuzzy correction is then performed on each of the plurality of classification feature vectors, and the corrected classification feature vectors are passed through a classifier to obtain a plurality of probability values. Finally, the encryption priority of the plurality of pieces of information to be encrypted is determined based on the ordering among the plurality of probability values. In this way, the encryption control effect can be improved.
According to one aspect of the present application, there is provided a data encryption method for computer network information, comprising:
acquiring a plurality of pieces of information to be encrypted;
the information to be encrypted passes through a context encoder comprising an embedded layer to obtain a plurality of semantic feature vectors to be encrypted;
two-dimensional matrixing the semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted;
the global semantic matrix to be encrypted is passed through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted;
taking each semantic feature vector to be encrypted as a query feature vector, and calculating the product between the query feature vector and the global semantic feature matrix to be encrypted to obtain a plurality of classification feature vectors;
respectively carrying out structural fuzzy correction on each classification characteristic vector in the plurality of classification characteristic vectors to obtain a plurality of corrected classification characteristic vectors;
passing the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values; and
an encryption priority of the pieces of information to be encrypted is determined based on the ordering among the plurality of probability values.
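As a rough illustration only, the end-to-end flow of the steps above can be sketched with stand-ins for the learned models. The random "encoder", the construction of the global matrix, and the scoring below are all illustrative assumptions, not the patent's actual networks:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(messages, dim=8):
    # Stand-in for the context encoder with embedding layer: produces one
    # semantic feature vector per message (random here; a trained encoder is assumed).
    return rng.standard_normal((len(messages), dim))

def rank_encryption_priority(messages):
    V = encode(messages)             # (n, dim) semantic feature vectors to be encrypted
    M = V.T @ V / len(messages)      # toy stand-in for the global semantic feature matrix
    C = V @ M                        # one classification feature vector per message
    logits = C.sum(axis=1)           # stand-in classifier score per message
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return list(np.argsort(-probs))  # message indices, highest encryption priority first

priority = rank_encryption_priority(["msg-a", "msg-b", "msg-c"])
```

The returned list orders the messages so that the one most likely to deserve the highest encryption priority is encrypted first.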
In the above data encryption method for computer network information, the performing structural fuzzy correction on each classification feature vector of the plurality of classification feature vectors to obtain a plurality of corrected classification feature vectors includes:
Performing structural fuzzy correction of feature distribution on each classification feature vector in the plurality of classification feature vectors by using the following formula to obtain a plurality of corrected classification feature vectors;
wherein, the formula is:
V' = (1/‖V‖₂²) · V_sort ⊙ ((V ⊗ Vᵀ) · V)
wherein V represents each classification feature vector of the plurality of classification feature vectors, the classification feature vector being in the form of a column vector; V' represents each corrected classification feature vector of the plurality of corrected classification feature vectors; ‖V‖₂² represents the square of the two-norm of each classification feature vector of the plurality of classification feature vectors; V_sort represents an ordered vector in which the feature values of each classification feature vector of the plurality of classification feature vectors are arranged in order of magnitude; Vᵀ represents the transpose vector of each classification feature vector of the plurality of classification feature vectors; ⊙ represents position-wise multiplication; and ⊗ represents vector multiplication.
In the above data encryption method for computer network information, the step of passing the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values includes:
processing the plurality of corrected classification feature vectors using the classifier in the following formula to obtain a plurality of probability values;
wherein, the formula is: O = softmax{(W_n, B_n) : ⋯ : (W_1, B_1) | V'}, wherein W_1 to W_n are weight matrices, B_1 to B_n are bias vectors, and V' is each corrected classification feature vector of the plurality of corrected classification feature vectors.
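The classifier described here — a stack of fully-connected layers whose output is passed through a Softmax — can be sketched as follows; the layer sizes and the two-class output are assumptions for illustration:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(v, layers):
    # layers: [(W_1, B_1), ..., (W_n, B_n)], applied innermost-first,
    # matching softmax{(W_n, B_n) : ... : (W_1, B_1) | V'}.
    for W, B in layers:
        v = W @ v + B
    return softmax(v)

rng = np.random.default_rng(1)
v = rng.standard_normal(6)  # a corrected classification feature vector (dim 6 assumed)
layers = [
    (rng.standard_normal((6, 6)), rng.standard_normal(6)),
    (rng.standard_normal((2, 6)), rng.standard_normal(2)),  # two-class output assumed
]
p = classify(v, layers)
```

The output is a probability vector, one entry of which serves as the probability value used in the priority ordering.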
According to another aspect of the present application, there is provided a data encryption system for computer network information, comprising:
the information to be encrypted acquisition module is used for acquiring a plurality of pieces of information to be encrypted;
the context coding module is used for enabling the information to be encrypted to pass through a context coder comprising an embedded layer to obtain a plurality of semantic feature vectors to be encrypted;
the two-dimensional matrixing module is used for two-dimensionally matrixing the plurality of semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted;
the convolution coding module is used for enabling the global semantic matrix to be encrypted to pass through a convolution neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted;
the classified feature vector calculation module is used for calculating the product between each semantic feature vector to be encrypted and the global semantic feature matrix to obtain a plurality of classified feature vectors by taking the semantic feature vector to be encrypted as a query feature vector;
the structure fuzzy correction module is used for respectively carrying out structure fuzzy correction on each classification characteristic vector in the plurality of classification characteristic vectors so as to obtain a plurality of corrected classification characteristic vectors;
the probability value generation module is used for passing the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values; and
and the encryption priority determining module is used for determining the encryption priority of the pieces of information to be encrypted based on the ordering among the probability values.
In the above data encryption system for computer network information, the context encoding module is further configured to:
word segmentation processing is carried out on each piece of information to be encrypted so as to convert each piece of information to be encrypted into a word sequence consisting of a plurality of words;
mapping each word in the word sequence into a word embedding vector using an embedding layer of the context encoder including the embedding layer to obtain a sequence of word embedding vectors;
performing context semantic understanding based on long distance dependence on the sequence of word embedding vectors using the context encoder comprising an embedding layer to obtain a plurality of context word feature vectors; and
and cascading the context word feature vectors to obtain the semantic feature vectors to be encrypted.
In the above data encryption system for computer network information, the performing, using the context encoder including the embedding layer, a long-distance-dependent context semantic understanding on the sequence of word embedding vectors to obtain a plurality of context word feature vectors includes:
One-dimensional arrangement is carried out on the sequence of the word embedding vectors so as to obtain global unfolding feature vectors;
calculating the product between the global expansion feature vector and the transpose vector of each word embedding vector in the sequence of word embedding vectors to obtain a plurality of self-attention association matrices;
respectively carrying out standardization processing on each self-attention correlation matrix in the plurality of self-attention correlation matrices to obtain a plurality of standardized self-attention correlation matrices;
passing each normalized self-attention association matrix of the plurality of normalized self-attention association matrices through a Softmax classification function to obtain a plurality of probability values;
and weighting each word embedding vector in the sequence of word embedding vectors by taking each probability value in the plurality of probability values as a weight so as to obtain the plurality of context word feature vectors.
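The weighting scheme above can be sketched as follows. How each standardized self-attention association matrix is reduced to a single value before the Softmax is not specified in the text, so the reduction used here (taking the maximum entry) is an assumption:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def context_word_features(E):
    # E: (num_words, dim) sequence of word embedding vectors.
    g = E.reshape(-1)                                    # one-dimensional arrangement -> global unfolded vector
    A = [np.outer(g, e) for e in E]                      # one self-attention association matrix per word
    A = [(a - a.mean()) / (a.std() + 1e-8) for a in A]   # standardize each association matrix
    scores = np.array([a.max() for a in A])              # scalar per matrix (reduction is an assumption)
    w = softmax(scores)                                  # one probability value per word
    return w[:, None] * E                                # weight each word embedding vector

rng = np.random.default_rng(2)
E = rng.standard_normal((4, 3))  # 4 words, embedding dim 3 (toy sizes)
F = context_word_features(E)
```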
In the above data encryption system for computer network information, the convolutional encoding module is further configured to:
and using each layer of the convolutional neural network model serving as the feature extractor to respectively perform convolution processing, pooling processing along the channel dimension, and nonlinear activation processing on the input data in the forward pass of the layers, wherein the input of the first layer of the convolutional neural network model serving as the feature extractor is the global semantic matrix to be encrypted, and the global semantic feature matrix to be encrypted is output by the last layer of the convolutional neural network model serving as the feature extractor.
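A single layer of such a feature extractor — convolution, pooling along the channel dimension, and nonlinear activation — might look like this minimal NumPy sketch; the kernel count, kernel size, mean pooling, and ReLU activation are assumptions:

```python
import numpy as np

def conv2d_valid(x, k):
    # Naive 'valid'-mode 2-D cross-correlation of input x with kernel k.
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + kh, j:j + kw] * k).sum()
    return out

def cnn_layer(x, kernels):
    maps = np.stack([conv2d_valid(x, k) for k in kernels])  # convolution processing
    pooled = maps.mean(axis=0)                              # pooling along the channel dimension
    return np.maximum(pooled, 0.0)                          # nonlinear activation (ReLU assumed)

rng = np.random.default_rng(3)
x = rng.standard_normal((6, 6))           # toy global semantic matrix to be encrypted
kernels = rng.standard_normal((4, 3, 3))  # 4 convolution kernels of size 3x3 (assumed)
y = cnn_layer(x, kernels)
```

Stacking several such layers, with the first taking the global semantic matrix as input, yields the global semantic feature matrix from the last layer.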
Compared with the prior art, in the data encryption method and system for computer network information provided by the application, each piece of acquired information to be encrypted is first passed through a context encoder to obtain a plurality of semantic feature vectors to be encrypted. The semantic feature vectors to be encrypted are then two-dimensionally matrixed and passed through a convolutional neural network model to obtain a global semantic feature matrix to be encrypted. Next, each semantic feature vector to be encrypted is taken as a query feature vector, and its product with the global semantic feature matrix to be encrypted is calculated to obtain a plurality of classification feature vectors. Each classification feature vector of the plurality of classification feature vectors is then subjected to structural fuzzy correction and passed through a classifier to obtain a plurality of probability values. Finally, the encryption priority of the pieces of information to be encrypted is determined based on the ordering among the probability values. In this way, the encryption control effect can be improved.
Drawings
The foregoing and other objects, features and advantages of the present application will become more apparent from the following more particular description of embodiments of the present application, as illustrated in the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the application, are incorporated in and constitute a part of this specification, illustrate the application, and do not constitute a limitation of the application. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 is a schematic view of a scenario of a data encryption method for computer network information according to an embodiment of the present application.
Fig. 2 is a flow chart of a data encryption method for computer network information according to an embodiment of the present application.
Fig. 3 is a schematic architecture diagram of a data encryption method for computer network information according to an embodiment of the present application.
Fig. 4 is a flowchart of sub-steps of step S120 in a data encryption method for computer network information according to an embodiment of the present application.
Fig. 5 is a flowchart of sub-steps of step S123 in a data encryption method for computer network information according to an embodiment of the present application.
Fig. 6 is a block diagram of a data encryption system for computer network information according to an embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application and not all of the embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
Scene overview
As described above, when data encryption resources conflict with each other, data encryption processing is generally performed in a random order, so that the management of data encryption is ineffective. Accordingly, an optimized data encryption scheme for computer network information is desired.
In the technical scheme of the present application, a data-intelligence scheme is adopted to determine the encryption priority based on the high-dimensional implicit associations among the pieces of data to be encrypted, so that the characteristics of the data themselves are fully utilized to determine an encryption priority suited to the distribution of the data, thereby improving the encryption management and control effect.
Specifically, a plurality of pieces of information to be encrypted are first acquired. The pieces of information to be encrypted are then passed through a context encoder comprising an embedding layer to obtain a plurality of semantic feature vectors to be encrypted. That is, the context encoder comprising the embedding layer is used to perform global-context-based semantic understanding on each piece of information to be encrypted so as to obtain the plurality of semantic feature vectors to be encrypted. Specifically, word segmentation is first performed on each piece of information to be encrypted to obtain a word sequence; then, each word in the word sequence is mapped into a word embedding vector using the embedding layer to obtain a sequence of word embedding vectors; further, a transformer-based context encoder, for example a transformer-based Bert model, is used to perform long-distance-dependence-based context semantic understanding on the sequence of word embedding vectors to obtain a plurality of context word feature vectors; finally, the context word feature vectors are cascaded to obtain the semantic feature vectors to be encrypted.
And carrying out two-dimensional matrixing on the plurality of semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted. That is, in a high-dimensional feature space, semantic feature representations of pieces of information to be encrypted are structurally integrated. And then, the global semantic matrix to be encrypted passes through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted. Namely, the convolution neural network model with excellent performance in the field of local feature extraction is used for carrying out local feature extraction based on convolution kernels on the global semantic matrix to be encrypted to obtain the global semantic feature matrix to be encrypted, namely, local associated semantic feature information among various semantic feature vectors to be encrypted is extracted.
And then, taking the semantic feature vectors to be encrypted as query feature vectors, and calculating the product between the query feature vectors and the global semantic feature matrix to be encrypted to obtain a plurality of classification feature vectors. It should be understood that the global semantic feature matrix to be encrypted is used to represent global semantic feature representations of the pieces of information to be encrypted, and each piece of semantic feature vector to be encrypted is a semantic feature representation of each piece of information to be encrypted, and calculating a product between the two can map the global semantic feature matrix to each piece of semantic feature space of the information to be encrypted so that each obtained classification feature vector can reflect a high-dimensional implicit association mode feature of each piece of information to be encrypted relative to the whole pieces of information to be encrypted.
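The mapping described above is a plain matrix product; a minimal sketch with assumed dimensions:

```python
import numpy as np

rng = np.random.default_rng(4)
V = rng.standard_normal((5, 8))  # 5 semantic feature vectors to be encrypted (dim 8 assumed)
M = rng.standard_normal((8, 8))  # global semantic feature matrix (square shape assumed)

# Each row of V acts as a query feature vector; its product with M yields
# one classification feature vector per piece of information to be encrypted.
C = V @ M
```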
The classification feature vectors are then passed through a classifier to obtain a plurality of probability values, where each probability value represents the probability of giving the highest encryption priority to the corresponding piece of information to be encrypted. After the probability values are obtained, the encryption priority of the pieces of information to be encrypted is determined based on the ordering among the probability values. In this way, the encryption priority is determined, through a data-intelligence scheme, based on the high-dimensional implicit associations among the pieces of data to be encrypted, so that the characteristics of the data are fully utilized to determine an encryption priority suited to the distribution of the data, and the encryption management and control effect is improved.
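The final ranking step reduces to sorting the probability values; for example (the probability values here are illustrative, not produced by the patent's classifier):

```python
import numpy as np

# Probability of receiving the highest encryption priority, one per message.
probs = np.array([0.12, 0.41, 0.05, 0.42])

priority = np.argsort(-probs)  # message indices, highest probability first
# priority -> [3, 1, 0, 2]: message 3 is encrypted first, message 2 last
```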
In particular, in the technical solution of the present application, each semantic feature vector to be encrypted is used as a query feature vector, and its product with the global semantic feature matrix to be encrypted is calculated to obtain the plurality of classification feature vectors; this maps the global inter-sample association expressed by the global semantic feature matrix to be encrypted into the context semantic association features of each piece of information to be encrypted. However, since the global semantic feature matrix to be encrypted expresses high-order context associations between the pieces of information to be encrypted, mapping these high-order associations into the low-order context association space of a single piece of information to be encrypted may blur the structure of the feature distribution of the classification feature vector, thereby reducing the expression certainty of the classification feature vector and affecting the accuracy of the classification result obtained by passing the classification feature vector through the classifier.
Based on this, the applicant of the present application performs vector-ordered Hilbert completion on the classification feature vector, expressed as:
V' = (1/‖V‖₂²) · V_sort ⊙ ((V ⊗ Vᵀ) · V)
wherein V and V' are the classification feature vectors before and after correction, respectively; ‖V‖₂² represents the square of the two-norm of the classification feature vector, i.e., the inner product of the classification feature vector with itself; V_sort is an ordered vector in which the feature values of the classification feature vector are arranged in order of magnitude; and the classification feature vector V is in the form of a column vector.
Here, by mapping the ordered vector into the Hilbert space defined by the self inner product of the vector, a meaningful measure of the numerical relations of the feature set within a consistency space can be realized. On this basis, a feature space with an ordered structure is built by embedding the feature vector with its relative positions, and the structure of the high-dimensional feature manifold of the feature vector is completed in that feature space based on vector query, so that the reduction of the expression certainty of the feature vector due to the blurred structure can be avoided, and the accuracy of the classification result obtained by passing the classification feature vector through the classifier is increased. That is, the accuracy of determining the probability of giving the highest encryption priority to each piece of information to be encrypted is improved.
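Under one reconstruction of the correction formula (the source rendering is garbled, so the exact grouping of terms is an assumption), the completion described above can be sketched as:

```python
import numpy as np

def vector_ordered_hilbert_completion(v):
    # Assumed reading of the correction:
    #   v' = (1 / ||v||^2) * v_sort ⊙ ((v vᵀ) v)
    # where v_sort has the feature values arranged in order of magnitude.
    norm_sq = float(v @ v)                       # squared two-norm = self inner product
    v_sort = np.sort(v)                          # ordered vector of feature values
    projected = (np.outer(v, v) @ v) / norm_sq   # (v vᵀ) v / ||v||², algebraically equal to v
    return v_sort * projected                    # position-wise multiplication

v = np.array([0.5, -1.0, 2.0])  # a toy classification feature vector
v_corr = vector_ordered_hilbert_completion(v)
```

Note that (v vᵀ)v / ‖v‖² reduces to v itself, so under this reading the correction amounts to a position-wise reweighting of the vector by its own sorted values.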
Based on this, the present application provides a data encryption method for computer network information, which is characterized by comprising: acquiring a plurality of pieces of information to be encrypted; the information to be encrypted passes through a context encoder comprising an embedded layer to obtain a plurality of semantic feature vectors to be encrypted; two-dimensional matrixing the semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted; the global semantic matrix to be encrypted is passed through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted; taking each semantic feature vector to be encrypted as a query feature vector, and calculating the product between the query feature vector and the global semantic feature matrix to be encrypted to obtain a plurality of classification feature vectors; respectively carrying out structural fuzzy correction on each classification characteristic vector in the plurality of classification characteristic vectors to obtain a plurality of corrected classification characteristic vectors; passing the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values; and determining encryption priority of the pieces of information to be encrypted based on the ordering among the plurality of probability values.
Fig. 1 is an application scenario diagram of a data encryption method for computer network information according to an embodiment of the present application. As shown in fig. 1, in this application scenario, a plurality of pieces of information to be encrypted (e.g., D as illustrated in fig. 1) are acquired, and then input into a server (e.g., S as illustrated in fig. 1) where a data encryption algorithm for computer network information is deployed, wherein the server is capable of generating a plurality of probability values based on the data encryption algorithm for computer network information, and then determining encryption priorities of the plurality of pieces of information to be encrypted based on an ordering among the plurality of probability values.
Having described the basic principles of the present application, various non-limiting embodiments of the present application will now be described in detail with reference to the accompanying drawings.
Exemplary method
Fig. 2 is a flow chart of a data encryption method for computer network information according to an embodiment of the present application. As shown in fig. 2, a data encryption method for computer network information according to an embodiment of the present application includes the steps of: s110, acquiring a plurality of pieces of information to be encrypted; s120, passing each piece of information to be encrypted through a context encoder comprising an embedded layer to obtain a plurality of semantic feature vectors to be encrypted; s130, carrying out two-dimensional matrixing on the plurality of semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted; s140, passing the global semantic matrix to be encrypted through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted; s150, taking each semantic feature vector to be encrypted as a query feature vector, and calculating the product between the query feature vector and the global semantic feature matrix to be encrypted to obtain a plurality of classification feature vectors; s160, respectively carrying out structural fuzzy correction on each classification characteristic vector in the plurality of classification characteristic vectors to obtain a plurality of corrected classification characteristic vectors; s170, the corrected classification feature vectors pass through a classifier to obtain a plurality of probability values; and S180, determining encryption priority of the pieces of information to be encrypted based on the ordering among the probability values.
Fig. 3 is a schematic architecture diagram of a data encryption method for computer network information according to an embodiment of the present application. As shown in fig. 3, in the network architecture, first, a plurality of pieces of information to be encrypted are acquired; then, passing each piece of information to be encrypted through a context encoder comprising an embedded layer to obtain a plurality of semantic feature vectors to be encrypted; then, carrying out two-dimensional matrixing on the semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted; then, the global semantic matrix to be encrypted passes through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted; then, taking each semantic feature vector to be encrypted as a query feature vector, and calculating the product between the query feature vector and the global semantic feature matrix to be encrypted to obtain a plurality of classification feature vectors; then, respectively carrying out structural fuzzy correction on each classification characteristic vector in the plurality of classification characteristic vectors to obtain a plurality of corrected classification characteristic vectors; then, the corrected classification feature vectors pass through a classifier to obtain a plurality of probability values; finally, based on the ordering among the plurality of probability values, the encryption priority of the pieces of information to be encrypted is determined.
More specifically, in step S110, a plurality of pieces of information to be encrypted are acquired. In the technical scheme of the present application, a data-intelligence scheme is adopted to determine the encryption priority based on the high-dimensional implicit associations among the pieces of data to be encrypted, so that the characteristics of the data themselves are fully utilized to determine an encryption priority suited to the distribution of the data, thereby improving the encryption management and control effect.
More specifically, in step S120, the pieces of information to be encrypted are passed through a context encoder including an embedded layer to obtain a plurality of semantic feature vectors to be encrypted. That is, the context encoder including the embedded layer is used to perform global context-based semantic understanding on each piece of information to be encrypted so as to obtain a plurality of semantic feature vectors to be encrypted.
Accordingly, in a specific example, as shown in fig. 4, the step of passing the pieces of information to be encrypted through a context encoder including an embedded layer to obtain a plurality of semantic feature vectors to be encrypted includes: s121, performing word segmentation processing on each piece of information to be encrypted to convert each piece of information to be encrypted into a word sequence consisting of a plurality of words; s122, mapping each word in the word sequence into a word embedding vector by using an embedding layer of the context encoder comprising the embedding layer to obtain a sequence of word embedding vectors; s123, performing context semantic understanding based on long distance dependence on the sequence of word embedding vectors by using the context encoder comprising an embedding layer to obtain a plurality of context word feature vectors; and S124, cascading the context word feature vectors to obtain the semantic feature vectors to be encrypted.
Accordingly, in one specific example, as shown in fig. 5, the performing, using the context encoder including the embedding layer, context semantic understanding of the sequence of word embedding vectors based on long distance dependence to obtain a plurality of context word feature vectors includes: s1231, performing one-dimensional arrangement on the sequence of the word embedding vectors to obtain global expansion feature vectors; s1232, calculating the product between the global expansion feature vector and the transpose vector of each word embedding vector in the sequence of word embedding vectors to obtain a plurality of self-attention association matrices; s1233, respectively carrying out standardization processing on each self-attention correlation matrix in the plurality of self-attention correlation matrices to obtain a plurality of standardized self-attention correlation matrices; s1234, each normalized self-attention correlation matrix in the normalized self-attention correlation matrices is processed by a Softmax classification function to obtain a plurality of probability values; s1235, weighting each word embedding vector in the sequence of word embedding vectors by taking each probability value in the plurality of probability values as a weight to obtain the plurality of context word feature vectors.
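A minimal sketch of steps S1231-S1235, assuming each self-attention association matrix is reduced to a single scalar weight after the Softmax (the patent leaves this reduction implicit, so the reduction used here is an assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
seq = rng.standard_normal((5, 4))            # sequence of 5 word-embedding vectors

global_vec = seq.reshape(-1)                 # S1231: one-dimensional arrangement, shape (20,)

context_vectors = []
for v in seq:
    attn = np.outer(global_vec, v)           # S1232: product with the transpose vector
    attn = (attn - attn.mean()) / attn.std() # S1233: standardisation
    # S1234: Softmax over the matrix entries, collapsed to one probability value
    p = np.exp(attn).max() / np.exp(attn).sum()
    context_vectors.append(p * v)            # S1235: weight the word embedding
context_vectors = np.stack(context_vectors)
print(context_vectors.shape)                 # one context word feature vector per word
```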
More specifically, in step S130, the plurality of semantic feature vectors to be encrypted are two-dimensionally matrixed to obtain a global semantic matrix to be encrypted. That is, in a high-dimensional feature space, semantic feature representations of pieces of information to be encrypted are structurally integrated.
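The two-dimensional matrixing of step S130 amounts to stacking the semantic feature vectors row-wise; the sizes here are illustrative assumptions:

```python
import numpy as np

# three hypothetical semantic feature vectors to be encrypted, length 4 each
vectors = [np.arange(4, dtype=float) + i for i in range(3)]

# S130: arrange the vectors as rows of the global semantic matrix to be encrypted
global_matrix = np.stack(vectors)   # shape (3, 4)
print(global_matrix.shape)
```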
More specifically, in step S140, the global semantic matrix to be encrypted is passed through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted. That is, a convolutional neural network model, which performs excellently in the field of local feature extraction, is used to perform convolution-kernel-based local feature extraction on the global semantic matrix to be encrypted to obtain the global semantic feature matrix to be encrypted, i.e., to extract the locally associated semantic feature information among the semantic feature vectors to be encrypted.
Accordingly, in a specific example, the passing the global semantic matrix to be encrypted through a convolutional neural network model as a feature extractor to obtain a global semantic feature matrix to be encrypted includes: and respectively carrying out convolution processing, pooling processing along a channel dimension and nonlinear activation processing on input data in forward transfer of layers by using each layer of the convolutional neural network model serving as the feature extractor to output the global semantic feature matrix to be encrypted by the last layer of the convolutional neural network model serving as the feature extractor, wherein the input of the first layer of the convolutional neural network model serving as the feature extractor is the global semantic matrix to be encrypted.
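A toy sketch of one such layer — convolution, pooling along the channel dimension, then a nonlinear activation — assuming a single-channel input, 'valid' padding, mean pooling over channels, and a ReLU activation, none of which the patent specifies:

```python
import numpy as np

rng = np.random.default_rng(2)

def conv2d_valid(x, k):
    # naive single-channel 'valid' 2-D convolution
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def layer(x, kernels):
    maps = np.stack([conv2d_valid(x, k) for k in kernels])  # (C, H', W')
    pooled = maps.mean(axis=0)       # pooling along the channel dimension
    return np.maximum(pooled, 0.0)   # nonlinear activation (ReLU)

x = rng.standard_normal((6, 6))          # global semantic matrix to be encrypted
kernels = rng.standard_normal((4, 3, 3)) # 4 hypothetical convolution kernels
feature_matrix = layer(x, kernels)
print(feature_matrix.shape)              # (4, 4)
```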
More specifically, in step S150, each semantic feature vector to be encrypted is taken as a query feature vector, and the product between the query feature vector and the global semantic feature matrix to be encrypted is calculated to obtain a plurality of classification feature vectors. It should be understood that the global semantic feature matrix to be encrypted represents the global semantic feature representation of the plurality of pieces of information to be encrypted, while each semantic feature vector to be encrypted is the semantic feature representation of an individual piece of information to be encrypted. Calculating the product between the two maps the global semantic feature matrix into the semantic feature space of each piece of information to be encrypted, so that each resulting classification feature vector can reflect the high-dimensional implicit association pattern features of each piece of information to be encrypted relative to the whole of the pieces of information to be encrypted.
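This product can be sketched as follows, treating each semantic feature vector as a row query against the global semantic feature matrix (the row-times-matrix convention and all dimensions are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
semantic_vectors = rng.standard_normal((3, 4))   # 3 semantic feature vectors to be encrypted
global_feature_matrix = rng.standard_normal((4, 4))

# S150: each vector acts as a query; its product with the global semantic
# feature matrix yields one classification feature vector
classification_vectors = [v @ global_feature_matrix for v in semantic_vectors]
print(len(classification_vectors), classification_vectors[0].shape)
```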
More specifically, in step S160, structural blur correction is performed on each of the plurality of classification feature vectors to obtain a plurality of corrected classification feature vectors, respectively.
In particular, in the technical solution of the present application, by taking the semantic feature vectors to be encrypted as query feature vectors and calculating the products between the query feature vectors and the global semantic feature matrix to be encrypted to obtain the plurality of classification feature vectors, the global inter-sample correlation expressed by the global semantic feature matrix to be encrypted can be mapped into the context semantic correlation features of each piece of information to be encrypted. However, since the global semantic feature matrix to be encrypted expresses high-order context associations among the pieces of information to be encrypted, mapping these high-order context associations into the low-order context association space of a single piece of information to be encrypted may blur the structure of the feature distribution of the classification feature vector, thereby reducing the expression certainty of the classification feature vector and affecting the accuracy of the classification result obtained by passing the classification feature vector through the classifier. Based on this, the applicant of the present application performs vector-ordered Hilbert completion on each classification feature vector.
Accordingly, in a specific example, the performing structural blur correction on each of the plurality of classification feature vectors to obtain a plurality of corrected classification feature vectors includes: performing structural fuzzy correction of feature distribution on each classification feature vector in the plurality of classification feature vectors with the following formula to obtain the plurality of corrected classification feature vectors; wherein, the formula is:

V' = (1 / ‖V‖₂²) · V_order ⊙ [(V ⊗ Vᵀ) V]

wherein V represents each classification feature vector in the plurality of classification feature vectors (in the form of a column vector), V' represents each corrected classification feature vector in the plurality of corrected classification feature vectors, ‖V‖₂² represents the square of the two-norm of each classification feature vector, V_order represents the ordered vector obtained by arranging the feature values of each classification feature vector in order of magnitude, Vᵀ represents the transpose vector of each classification feature vector, ⊙ represents position-wise multiplication, and ⊗ represents vector multiplication.
Here, by mapping the ordered vectors into the Hilbert space defined by the self-inner product of the vectors, a meaningful measure of the numerical relationships of the feature set in a consistency space can be realized. On this basis, a feature space with an orthogonal structure is built by embedding the relative positions of the feature vector, and the structure of the high-dimensional manifold of the vector-queried feature vector in this feature space is completed, so that the reduction of the expression certainty of the feature vector caused by the fuzzified structure can be avoided, and the accuracy of the classification result obtained by passing the classification feature vector through the classifier is improved. That is, the accuracy of determining the probability of giving the highest encryption priority to each piece of information to be encrypted is improved.
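Under one possible reading of the correction — reweighting each classification feature vector position-wise by its magnitude-ordered value vector and normalising by the squared two-norm — a sketch might look like this; this is an interpretation, not the patent's exact formula:

```python
import numpy as np

def structural_correction(v):
    # ordered vector: feature values arranged by magnitude (interpretation)
    order = np.sort(np.abs(v))
    # position-wise product, normalised by the squared two-norm ||v||_2^2
    return (order * v) / np.dot(v, v)

v = np.array([0.5, -2.0, 1.0])      # hypothetical classification feature vector
v_corr = structural_correction(v)
print(v_corr.shape)                 # same shape as the input vector
```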
More specifically, in step S170, the plurality of corrected classification feature vectors are passed through a classifier to obtain a plurality of probability values. Here, each probability value represents the probability of giving the corresponding piece of information to be encrypted the highest encryption priority. After the probability values are obtained, the encryption priority of the pieces of information to be encrypted is determined based on the ordering among the probability values. In this way, a data-intelligence scheme determines the encryption priority based on the high-dimensional implicit associations of the data to be encrypted, so that the characteristics of the data are fully utilized to determine an encryption priority suited to the data distribution, thereby improving the encryption management and control effect.
Accordingly, in one specific example, the passing the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values includes: processing the plurality of corrected classification feature vectors using the classifier with the following formula to obtain the plurality of probability values; wherein, the formula is: softmax{(W_n, B_n) : … : (W_1, B_1) | V}, wherein W_1 to W_n are weight matrices, B_1 to B_n are bias vectors, and V is each corrected classification feature vector in the plurality of corrected classification feature vectors.
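The classifier can be sketched as a stack of fully connected layers followed by a Softmax; the two-layer depth and all dimensions are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(v, weights, biases):
    # apply (W_1, B_1) ... (W_n, B_n) in sequence, then Softmax
    for W, B in zip(weights, biases):
        v = W @ v + B
    return softmax(v)

v = rng.standard_normal(4)              # a corrected classification feature vector
weights = [rng.standard_normal((4, 4)), rng.standard_normal((2, 4))]
biases = [rng.standard_normal(4), rng.standard_normal(2)]
probs = classify(v, weights, biases)
print(probs.sum())                      # sums to 1 (up to float error)
```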
More specifically, in step S180, encryption priority of the pieces of information to be encrypted is determined based on the ordering among the plurality of probability values.
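Step S180 then reduces to sorting the pieces of information by their probability values; the values below are illustrative:

```python
# one probability value per piece of information to be encrypted (hypothetical)
probability_values = [0.12, 0.55, 0.33]

# indices of the pieces of information, highest encryption priority first
priority_order = sorted(range(len(probability_values)),
                        key=lambda i: probability_values[i],
                        reverse=True)
print(priority_order)   # [1, 2, 0]
```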
In summary, according to the data encryption method and system for computer network information according to the embodiments of the present application, firstly, each piece of acquired information to be encrypted is passed through a context encoder to obtain a plurality of semantic feature vectors to be encrypted, then, the plurality of semantic feature vectors to be encrypted are subjected to two-dimensional matrixing to obtain a global semantic feature matrix to be encrypted through a convolutional neural network model, then, products between the each piece of semantic feature vectors to be encrypted and the global semantic feature matrix to be encrypted are calculated to obtain a plurality of classification feature vectors by taking the each piece of semantic feature vectors to be encrypted as query feature vectors, then, each classification feature vector in the plurality of classification feature vectors is subjected to structure fuzzy correction and then is passed through a classifier to obtain a plurality of probability values, and finally, encryption priority of the plurality of pieces of information to be encrypted is determined based on ordering among the plurality of probability values. In this way, the encryption control effect can be improved.
Exemplary System
Fig. 6 is a block diagram of a data encryption system 100 for computer network information according to an embodiment of the present application. As shown in fig. 6, a data encryption system 100 for computer network information according to an embodiment of the present application includes: the information to be encrypted acquisition module 110 is configured to acquire a plurality of pieces of information to be encrypted; the context coding module 120 is configured to pass the pieces of information to be encrypted through a context encoder including an embedded layer to obtain a plurality of semantic feature vectors to be encrypted; the two-dimensional matrixing module 130 is configured to two-dimensionally matrix the plurality of semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted; the convolutional encoding module 140 is configured to pass the global semantic matrix to be encrypted through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted; the classification feature vector calculation module 150 is configured to take each semantic feature vector to be encrypted as a query feature vector and calculate the product between the query feature vector and the global semantic feature matrix to be encrypted to obtain a plurality of classification feature vectors; the structure blur correction module 160 is configured to perform structure blur correction on each of the plurality of classification feature vectors to obtain a plurality of corrected classification feature vectors; the probability value generating module 170 is configured to pass the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values; and the encryption priority determining module 180 is configured to determine encryption priorities of the pieces of information to be encrypted based on the ordering among the plurality of probability values.
In one example, in the above data encryption system 100 for computer network information, the context encoding module 120 is further configured to: word segmentation processing is carried out on each piece of information to be encrypted so as to convert each piece of information to be encrypted into a word sequence consisting of a plurality of words; mapping each word in the word sequence into a word embedding vector using an embedding layer of the context encoder including the embedding layer to obtain a sequence of word embedding vectors; performing context semantic understanding based on long distance dependence on the sequence of word embedding vectors using the context encoder comprising an embedding layer to obtain a plurality of context word feature vectors; and cascading the context word feature vectors to obtain the semantic feature vectors to be encrypted.
In one example, in the above data encryption system 100 for computer network information, the performing, using the context encoder including an embedding layer, a long-range dependent context semantic understanding on the sequence of word embedding vectors to obtain a plurality of context word feature vectors includes: one-dimensional arrangement is carried out on the sequence of the word embedding vectors so as to obtain global unfolding feature vectors; calculating the product between the global expansion feature vector and the transpose vector of each word embedding vector in the sequence of word embedding vectors to obtain a plurality of self-attention association matrices; respectively carrying out standardization processing on each self-attention correlation matrix in the plurality of self-attention correlation matrices to obtain a plurality of standardized self-attention correlation matrices; obtaining a plurality of probability values by using a Softmax classification function through each normalized self-attention correlation matrix in the normalized self-attention correlation matrices; and weighting each word embedding vector in the sequence of word embedding vectors by taking each probability value in the plurality of probability values as a weight so as to obtain the plurality of context word feature vectors.
In one example, in the above data encryption system 100 for computer network information, the convolutional encoding module 140 is further configured to: and respectively carrying out convolution processing, pooling processing along a channel dimension and nonlinear activation processing on input data in forward transfer of layers by using each layer of the convolutional neural network model serving as the feature extractor to output the global semantic feature matrix to be encrypted by the last layer of the convolutional neural network model serving as the feature extractor, wherein the input of the first layer of the convolutional neural network model serving as the feature extractor is the global semantic matrix to be encrypted.
In one example, in the above data encryption system 100 for computer network information, the structure blur correction module 160 is further configured to: perform structural fuzzy correction of feature distribution on each classification feature vector in the plurality of classification feature vectors with the following formula to obtain the plurality of corrected classification feature vectors; wherein, the formula is:

V' = (1 / ‖V‖₂²) · V_order ⊙ [(V ⊗ Vᵀ) V]

wherein V represents each classification feature vector in the plurality of classification feature vectors (in the form of a column vector), V' represents each corrected classification feature vector in the plurality of corrected classification feature vectors, ‖V‖₂² represents the square of the two-norm of each classification feature vector, V_order represents the ordered vector obtained by arranging the feature values of each classification feature vector in order of magnitude, Vᵀ represents the transpose vector of each classification feature vector, ⊙ represents position-wise multiplication, and ⊗ represents vector multiplication.
In one example, in the data encryption system 100 for computer network information described above, the probability value generation module 170 is further configured to: processing the plurality of corrected classification feature vectors using the classifier in the following formula to obtain a plurality of probability values;
wherein, the formula is: softmax{(W_n, B_n) : … : (W_1, B_1) | V}, wherein W_1 to W_n are weight matrices, B_1 to B_n are bias vectors, and V is each corrected classification feature vector in the plurality of corrected classification feature vectors.
Here, it will be understood by those skilled in the art that the specific functions and operations of the respective units and modules in the above-described data encryption system 100 for computer network information have been described in detail in the above description of the data encryption method for computer network information with reference to figs. 1 to 5, and thus repeated descriptions thereof will be omitted.
As described above, the data encryption system 100 for computer network information according to the embodiment of the present application may be implemented in various wireless terminals, such as a server or the like having a data encryption algorithm for computer network information. In one example, the data encryption system 100 for computer network information according to embodiments of the present application may be integrated into a wireless terminal as a software module and/or hardware module. For example, the data encryption system 100 for computer network information may be a software module in the operating system of the wireless terminal or may be an application developed for the wireless terminal; of course, the data encryption system 100 for computer network information may also be one of many hardware modules of the wireless terminal.
Alternatively, in another example, the data encryption system 100 for computer network information and the wireless terminal may be separate devices, and the data encryption system 100 for computer network information may be connected to the wireless terminal through a wired and/or wireless network and transmit interactive information in an agreed data format.
The basic principles of the present application have been described above in connection with specific embodiments, however, it should be noted that the advantages, benefits, effects, etc. mentioned in the present application are merely examples and not limiting, and these advantages, benefits, effects, etc. are not to be considered as necessarily possessed by the various embodiments of the present application. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, as the application is not intended to be limited to the details disclosed herein as such.
The block diagrams of the devices, apparatuses, and systems referred to in this application are merely illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including but not limited to" and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or," unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to."
It is also noted that in the apparatuses, devices, and methods of the present application, the components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be considered as equivalent solutions of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the embodiments of the application to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (10)

1. A data encryption method for computer network information, comprising:
acquiring a plurality of pieces of information to be encrypted;
The information to be encrypted passes through a context encoder comprising an embedded layer to obtain a plurality of semantic feature vectors to be encrypted;
two-dimensional matrixing the semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted;
the global semantic matrix to be encrypted is passed through a convolutional neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted;
taking each semantic feature vector to be encrypted as a query feature vector, and calculating the product between the query feature vector and the global semantic feature matrix to be encrypted to obtain a plurality of classification feature vectors;
respectively carrying out structural fuzzy correction on each classification characteristic vector in the plurality of classification characteristic vectors to obtain a plurality of corrected classification characteristic vectors;
passing the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values; and
an encryption priority of the pieces of information to be encrypted is determined based on the ordering among the plurality of probability values.
2. The method for encrypting data for computer network information according to claim 1, wherein said passing said pieces of information to be encrypted through a context encoder comprising an embedded layer to obtain a plurality of pieces of semantic feature vectors to be encrypted comprises:
Word segmentation processing is carried out on each piece of information to be encrypted so as to convert each piece of information to be encrypted into a word sequence consisting of a plurality of words;
mapping each word in the word sequence into a word embedding vector using an embedding layer of the context encoder including the embedding layer to obtain a sequence of word embedding vectors;
performing context semantic understanding based on long distance dependence on the sequence of word embedding vectors using the context encoder comprising an embedding layer to obtain a plurality of context word feature vectors; and
and cascading the context word feature vectors to obtain the semantic feature vectors to be encrypted.
3. The method for encrypting data for computer network information according to claim 2, wherein said using said context encoder comprising an embedding layer to perform a long-range dependent context semantic understanding on said sequence of word embedding vectors to obtain a plurality of context word feature vectors comprises:
one-dimensional arrangement is carried out on the sequence of the word embedding vectors so as to obtain global unfolding feature vectors;
calculating the product between the global expansion feature vector and the transpose vector of each word embedding vector in the sequence of word embedding vectors to obtain a plurality of self-attention association matrices;
Respectively carrying out standardization processing on each self-attention correlation matrix in the plurality of self-attention correlation matrices to obtain a plurality of standardized self-attention correlation matrices;
obtaining a plurality of probability values by using a Softmax classification function through each normalized self-attention correlation matrix in the normalized self-attention correlation matrices;
and weighting each word embedding vector in the sequence of word embedding vectors by taking each probability value in the plurality of probability values as a weight so as to obtain the plurality of context word feature vectors.
4. A data encryption method for computer network information according to claim 3, wherein said passing the global semantic matrix to be encrypted through a convolutional neural network model as a feature extractor to obtain a global semantic feature matrix to be encrypted comprises:
and respectively carrying out convolution processing, pooling processing along a channel dimension and nonlinear activation processing on input data in forward transfer of layers by using each layer of the convolutional neural network model serving as the feature extractor to output the global semantic feature matrix to be encrypted by the last layer of the convolutional neural network model serving as the feature extractor, wherein the input of the first layer of the convolutional neural network model serving as the feature extractor is the global semantic matrix to be encrypted.
5. The method for encrypting data for computer network information according to claim 4, wherein said respectively performing structural blur correction on each of said plurality of classification feature vectors to obtain a plurality of corrected classification feature vectors comprises:
performing structural fuzzy correction of feature distribution on each classification feature vector in the plurality of classification feature vectors by using the following formula to obtain a plurality of corrected classification feature vectors;
wherein, the formula is:
V' = (1 / ‖V‖₂²) · V_order ⊙ [(V ⊗ Vᵀ) V]
wherein V represents each classification feature vector in the plurality of classification feature vectors (in the form of a column vector), V' represents each corrected classification feature vector in the plurality of corrected classification feature vectors, ‖V‖₂² represents the square of the two-norm of each classification feature vector, V_order represents the ordered vector obtained by arranging the feature values of each classification feature vector in order of magnitude, Vᵀ represents the transpose vector of each classification feature vector, ⊙ represents position-wise multiplication, and ⊗ represents vector multiplication.
6. The data encryption method for computer network information according to claim 5, wherein passing the plurality of corrected classification feature vectors through a classifier to obtain a plurality of probability values comprises:
Processing the plurality of corrected classification feature vectors using the classifier in the following formula to obtain a plurality of probability values;
wherein, the formula is: softmax{(W_n, B_n) : … : (W_1, B_1) | V}, wherein W_1 to W_n are weight matrices, B_1 to B_n are bias vectors, and V is each corrected classification feature vector in the plurality of corrected classification feature vectors.
7. A data encryption system for computer network information, comprising:
the information to be encrypted acquisition module is used for acquiring a plurality of pieces of information to be encrypted;
the context coding module is used for enabling the information to be encrypted to pass through a context coder comprising an embedded layer to obtain a plurality of semantic feature vectors to be encrypted;
the two-dimensional matrixing module is used for two-dimensionally matrixing the plurality of semantic feature vectors to be encrypted to obtain a global semantic matrix to be encrypted;
the convolution coding module is used for enabling the global semantic matrix to be encrypted to pass through a convolution neural network model serving as a feature extractor to obtain a global semantic feature matrix to be encrypted;
the classified feature vector calculation module is used for calculating the product between each semantic feature vector to be encrypted and the global semantic feature matrix to obtain a plurality of classified feature vectors by taking the semantic feature vector to be encrypted as a query feature vector;
The structure fuzzy correction module is used for respectively carrying out structure fuzzy correction on each classification characteristic vector in the plurality of classification characteristic vectors so as to obtain a plurality of corrected classification characteristic vectors;
the probability value generation module is used for enabling the corrected classification feature vectors to pass through a classifier to obtain a plurality of probability values; and
and the encryption priority determining module is used for determining the encryption priority of the pieces of information to be encrypted based on the ordering among the probability values.
8. The data encryption system for computer network information of claim 7, wherein the context encoding module is further configured to:
word segmentation processing is carried out on each piece of information to be encrypted so as to convert each piece of information to be encrypted into a word sequence consisting of a plurality of words;
mapping each word in the word sequence into a word embedding vector using an embedding layer of the context encoder including the embedding layer to obtain a sequence of word embedding vectors;
performing context semantic understanding based on long distance dependence on the sequence of word embedding vectors using the context encoder comprising an embedding layer to obtain a plurality of context word feature vectors; and
And cascading the context word feature vectors to obtain the semantic feature vectors to be encrypted.
9. The data encryption system for computer network information of claim 8, wherein the using the context encoder including an embedding layer to perform a long-range dependent based context semantic understanding of the sequence of word embedding vectors to obtain a plurality of context word feature vectors comprises:
one-dimensional arrangement is carried out on the sequence of the word embedding vectors so as to obtain global unfolding feature vectors;
calculating the product between the global expansion feature vector and the transpose vector of each word embedding vector in the sequence of word embedding vectors to obtain a plurality of self-attention association matrices;
respectively carrying out standardization processing on each self-attention correlation matrix in the plurality of self-attention correlation matrices to obtain a plurality of standardized self-attention correlation matrices;
obtaining a plurality of probability values by using a Softmax classification function through each normalized self-attention correlation matrix in the normalized self-attention correlation matrices;
and weighting each word embedding vector in the sequence of word embedding vectors by taking each probability value in the plurality of probability values as a weight so as to obtain the plurality of context word feature vectors.
10. The data encryption system for computer network information of claim 9, wherein the convolutional encoding module is further configured to:
perform convolution processing, pooling along the channel dimension, and nonlinear activation on the input data in the forward pass of each layer of the convolutional neural network model serving as the feature extractor, such that the last layer of the convolutional neural network model serving as the feature extractor outputs the global semantic feature matrix to be encrypted, wherein the input to the first layer of the convolutional neural network model serving as the feature extractor is the global semantic matrix to be encrypted.
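A minimal sketch of the per-layer processing in claim 10 — convolution, pooling along the channel dimension, then a nonlinear activation — assuming random kernels, a toy 8x8 single-channel input standing in for the global semantic matrix, mean pooling as the channel-wise pooling, and ReLU as the activation (none of which the claim fixes):

```python
import numpy as np

def conv_layer(x, kernels):
    # x: (channels, height, width); valid 2-D convolution, one output per kernel
    c, h, w = x.shape
    kc, kh, kw = kernels.shape[1:]
    out = np.zeros((len(kernels), h - kh + 1, w - kw + 1))
    for o, k in enumerate(kernels):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[o, i, j] = np.sum(x[:, i:i + kh, j:j + kw] * k)
    return out

def channel_pool(x):
    # pooling along the channel dimension: mean over channels, kept as 1 channel
    return x.mean(axis=0, keepdims=True)

def relu(x):
    # nonlinear activation
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 8, 8))   # stand-in for the global semantic matrix
for _ in range(2):                   # two stacked layers of the feature extractor
    k = rng.standard_normal((3, x.shape[0], 3, 3))   # 3 kernels of size 3x3
    x = relu(channel_pool(conv_layer(x, k)))
print(x.shape)  # (1, 4, 4): each 3x3 valid convolution shrinks each side by 2
```

Each pass feeds the previous layer's output into the next, mirroring the claim's forward-pass chaining from first layer to last.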
CN202211649545.6A 2022-12-21 2022-12-21 Data encryption method and system for computer network information Active CN115982736B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211649545.6A CN115982736B (en) 2022-12-21 2022-12-21 Data encryption method and system for computer network information


Publications (2)

Publication Number | Publication Date
CN115982736A | 2023-04-18
CN115982736B | 2023-08-08

Family

ID=85967489

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211649545.6A Active CN115982736B (en) 2022-12-21 2022-12-21 Data encryption method and system for computer network information

Country Status (1)

Country Link
CN (1) CN115982736B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117113987A (en) * 2023-04-19 2023-11-24 杭州达西信息技术有限公司 Keyword intelligent distinguishing method and system based on user behavior characteristics
CN117219249A (en) * 2023-05-10 2023-12-12 威海市立医院 Equipment positioning management system based on intelligent medical digital platform
CN116500379B (en) * 2023-05-15 2024-03-08 珠海中瑞电力科技有限公司 Accurate positioning method for voltage drop of STS device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111428026A (en) * 2020-02-20 2020-07-17 西安电子科技大学 Multi-label text classification processing method and system and information data processing terminal
US11194972B1 (en) * 2021-02-19 2021-12-07 Institute Of Automation, Chinese Academy Of Sciences Semantic sentiment analysis method fusing in-depth features and time sequence models
CN115186673A (en) * 2022-06-08 2022-10-14 福建华昊信息技术咨询有限公司 Scientific and technological information management system and management method thereof
CN115203380A (en) * 2022-09-19 2022-10-18 山东鼹鼠人才知果数据科技有限公司 Text processing system and method based on multi-mode data fusion


Also Published As

Publication number Publication date
CN115982736A (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN115982736B (en) Data encryption method and system for computer network information
CN115203380B (en) Text processing system and method based on multi-mode data fusion
CN109711422B (en) Image data processing method, image data processing device, image data model building method, image data model building device, computer equipment and storage medium
CN110084365B (en) Service providing system and method based on deep learning
Zhao et al. Synthesizing informative training samples with gan
CN114863573B (en) Category-level 6D attitude estimation method based on monocular RGB-D image
Chen et al. StereoEngine: An FPGA-based accelerator for real-time high-quality stereo estimation with binary neural network
CN115512420A (en) Visitor management system and method thereof
CN115373813A (en) Scheduling method and system based on GPU virtualization in cloud computing environment and electronic equipment
US20210182357A1 (en) System and method for model parameter optimization
JP2023520625A (en) IMAGE FEATURE MATCHING METHOD AND RELATED DEVICE, DEVICE AND STORAGE MEDIUM
CN116612288B (en) Multi-scale lightweight real-time semantic segmentation method and system
CN115118675B (en) Data stream transmission acceleration method and system based on intelligent network card equipment
CN114821169A (en) Method-level non-intrusive call link tracking method under micro-service architecture
Lu et al. Container: Context aggregation networks
CN115951883A (en) Service component management system and method of distributed micro-service architecture
CN116320459B (en) Computer network communication data processing method and system based on artificial intelligence
CN116150371A (en) Asset repayment plan mass data processing method based on sharingJDBC
Jia et al. Generalized unitarily invariant gauge regularization for fast low-rank matrix recovery
CN116027874A (en) Notebook computer power consumption control method and system thereof
CN115617636A (en) Distributed performance test system
CN114741532A (en) Knowledge graph completion method and system based on multi-scale decentralized dynamic routing
CN108985311A (en) A kind of dimension reduction method and system of high-spectrum remote sensing data
CN114066210A (en) Big data distributed task scheduling system and scheduling method thereof
CN113658236A (en) Incomplete point cloud registration method based on graph attention machine system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant