CN117669753B - Quantum model training method, multi-mode data processing method and device

Info

Publication number: CN117669753B (other versions: CN117669753A)
Application number: CN202410132334.8A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: quantum, data, modal, mode, neural network
Inventors: 吕金虎, 高庆, 郑瑾, 王薇
Original and current assignee: Hangzhou Innovation Research Institute of Beihang University
Legal status: Active (granted)

Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The invention belongs to the field of quantum machine learning and provides a quantum model training method, a multi-modal data processing method and a device. The training method comprises the following steps: constructing a multi-modal sample training set, wherein the multi-modal sample training set comprises classical input data of multiple modalities and sample labels; converting the classical input data into quantum input data; inputting the quantum input data into a quantum multi-modal neural network model to be trained, and obtaining a sample analysis result after quantum single-modal feature extraction, quantum multi-modal feature fusion and qubit measurement are performed on the quantum input data by the quantum multi-modal neural network model; and adjusting the model parameters according to the sample analysis result and the sample label until the iterative training termination condition is met, so as to obtain the trained quantum multi-modal neural network model. Because the quantum multi-modal neural network model can process multi-modal data efficiently and accurately, the performance of the model on multi-modal data processing and analysis tasks is improved.

Description

Quantum model training method, multi-mode data processing method and device
Technical Field
The invention relates to the technical field of quantum machine learning, in particular to a quantum model training method, a multi-mode data processing method and a device.
Background
With the proliferation of multi-modal data, data analysis tasks have become increasingly difficult. Multi-modal data integrates multiple forms of information such as images, text, audio and video, which greatly increases the complexity and information content of the data.
In the related art, traditional machine learning algorithms struggle to effectively fuse the correlations among different modalities and to efficiently perform cross-modal understanding, so they cannot process multi-modal data accurately and efficiently.
Therefore, how to process multi-modal data accurately and efficiently is a technical problem that currently needs to be solved.
Disclosure of Invention
The invention provides a quantum model training method, a multi-modal data processing method and a device, which are used to overcome the defect that traditional machine learning algorithms cannot process multi-modal data accurately and efficiently.
In one aspect, the present invention provides a quantum model training method, the method comprising:
constructing a multi-modal sample training set, wherein the multi-modal sample training set comprises classical input data and sample labels of multiple modalities;
Converting the classical input data into quantum input data;
Inputting the quantum input data into a quantum multi-modal neural network model to be trained, and obtaining a sample analysis result after quantum single-modal feature extraction, quantum multi-modal feature fusion and quantum bit measurement of the quantum input data through the quantum multi-modal neural network model;
And according to the sample analysis result and the sample label, adjusting model parameters of the quantum multi-modal neural network model until the iteration training termination condition is met, and obtaining a trained quantum multi-modal neural network model.
According to the quantum model training method provided by the invention, the step of converting classical input data into quantum input data comprises the following steps:
preprocessing the classical input data according to respective corresponding modes to obtain vector data corresponding to each mode;
Carrying out quantum conversion on the vector data through an amplitude coding algorithm to obtain quantum data corresponding to each mode;
And carrying out tensor product processing on the quantum data of each mode, and normalizing to obtain quantum input data.
According to the quantum model training method provided by the invention, the quantum multi-modal neural network model comprises the following steps: the device comprises a quantum single-mode feature extraction module, a quantum multi-mode feature fusion module and a quantum bit measurement module;
the quantum unimodal feature extraction module is used for carrying out feature extraction on quantum data of each mode in the quantum input data to obtain a plurality of unimodal features;
The quantum multi-modal feature fusion module is used for fusing the plurality of single-modal features to obtain multi-modal features;
And the quantum bit measurement module is used for performing projection measurement on target qubits in the quantum circuit after the multi-modal features are obtained, running the quantum circuit multiple times to obtain an expected value of a quantum observable, and mapping the expected value of the quantum observable to a sample analysis result.
According to the quantum model training method provided by the invention, the quantum single-mode feature extraction module comprises a plurality of two-quantum bit gates and a plurality of controlled NOT gates;
In the quantum single-mode feature extraction module, the two-qubit gates are sequentially applied to adjacent single qubits in the quantum circuit corresponding to each modality, the adjacent qubits are connected through the controlled-NOT gates, and the parameters of the plurality of two-qubit gates in each quantum single-mode feature extraction module are the same.
According to the quantum model training method provided by the invention, the quantum multi-mode characteristic fusion module comprises a plurality of single-quantum bit revolving gates and a plurality of controlled NOT gates;
In the quantum multimode feature fusion module, the plurality of single-quantum bit rotation gates are respectively applied to all quantum bits of a quantum circuit, and adjacent quantum bits are connected through the controlled NOT gate so as to fuse the plurality of single-mode features.
According to the quantum model training method provided by the invention, performing projection measurement on the target qubit in the quantum circuit and running the quantum circuit multiple times to obtain the expected value of the quantum observable comprises:
Setting a quantum observable, and performing projection measurement on the target qubit in the quantum circuit with the quantum observable;
And running the quantum circuit multiple times, and obtaining the expected value of the quantum observable based on the quantum input data, the overall evolution process of the quantum multi-modal neural network model and the realization of the quantum observable.
According to the quantum model training method provided by the invention, according to the sample analysis result and the sample label, the model parameters of the quantum multi-modal neural network model are adjusted, and the method comprises the following steps:
Calculating an output error between the sample analysis result and the sample label through a preset loss function;
Based on the output error, model parameters of the quantum multi-modal neural network model are adjusted by utilizing a gradient calculation and gradient descent optimization algorithm suitable for a quantum circuit so as to minimize the output error.
According to the quantum model training method provided by the invention, after the trained quantum multi-modal neural network model is obtained, the method further comprises the following steps:
Constructing a multi-modal sample test set, wherein the multi-modal sample test set comprises classical input data and sample tags of multiple modalities;
Converting the classical input data into quantum input data;
and testing the trained quantum multi-modal neural network model based on the quantum input data and the sample label.
In another aspect, the present invention further provides a multi-mode data processing method, where the method includes:
Acquiring multi-mode data to be processed; the multi-mode data to be processed comprises at least two of text, images, audio and video;
Converting the multi-mode data to be processed into quantum data to be processed;
Inputting the quantum to-be-processed data into a trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model;
the quantum multi-mode neural network model is obtained by training based on any one of the quantum model training methods.
In another aspect, the present invention also provides a multi-mode data processing apparatus, including:
the acquisition module is used for acquiring multi-mode data to be processed; the multi-mode data to be processed comprises at least two of text, images, audio and video;
the conversion module is used for converting the multi-mode data to be processed into quantum data to be processed;
the processing module is used for inputting the quantum to-be-processed data into a trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model; the quantum multi-mode neural network model is obtained by training based on any one of the quantum model training methods.
According to the quantum model training method, the multi-modal data processing method and the device provided by the invention, a quantum multi-modal neural network model is established, the classical input data in the multi-modal sample training set are converted into quantum input data, and the quantum input data are input into the quantum multi-modal neural network model to be trained. The quantum multi-modal neural network model performs quantum single-modal feature extraction, quantum multi-modal feature fusion and qubit measurement on the quantum input data to obtain a sample analysis result, and the model parameters of the quantum multi-modal neural network model are adjusted according to the sample analysis result and the sample label so that the model undergoes iterative training; once the iterative training termination condition is met, the trained quantum multi-modal neural network model is obtained. The quantum multi-modal neural network model obtained through training can effectively utilize the advantages of quantum computation, realize efficient and accurate processing of multi-modal data, and remarkably improve the performance of the model on multi-modal data processing and analysis tasks.
Drawings
In order to more clearly illustrate the invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a quantum model training method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a quantum multimodal neural network model;
FIG. 3 is a schematic diagram of a quantum single-mode feature extraction module;
FIG. 4 is a schematic diagram of a two-qubit gate structure;
FIG. 5 is a schematic diagram of a quantum multimode feature fusion module;
FIG. 6 is a schematic diagram of the structure of a qubit measurement module;
FIG. 7 is a second flow chart of a quantum model training method according to an embodiment of the present invention;
FIG. 8 is a flowchart of a multi-mode data processing method according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a multi-mode data processing apparatus according to an embodiment of the present invention;
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment relates to the field of quantum machine learning, and particularly can be applied to a scene of analyzing and processing multi-mode data. At present, the rapid development of information technology promotes the explosive growth of data types, structures and generation speeds, thereby further exacerbating the difficulty of processing large-scale and complex tasks and bringing unprecedented challenges to the traditional machine learning algorithm.
Neural networks, as a core technology in machine learning, have demonstrated significant efficacy in many fields. However, when handling large and complex tasks, the development and training of neural networks are quite demanding in terms of computational resources. With the gradual weakening of Moore's law and the physical limits of classical computing becoming increasingly apparent, traditional computing devices are more and more constrained in their ability to handle these demanding tasks.
Against this background, quantum computing, as a completely new physical computing paradigm, provides a unique way around the limitations of traditional computing techniques and their thermodynamic constraints. It exploits the basic principles of quantum mechanics, such as quantum state superposition and quantum entanglement, and can in theory achieve exponential storage capacity and parallel processing capability beyond traditional computing, opening up a new route to the complex problems faced by traditional machine learning algorithms. With the advent of noisy intermediate-scale quantum processors with stable computational performance, quantum machine learning and quantum neural networks have become hot spots of academic and industrial interest; they combine the efficiency of machine learning techniques with the computational power of quantum systems and demonstrate great potential for advancing artificial intelligence.
At present, the proliferation of multi-mode data brings about a new data analysis revolution, and the data integrates various information forms such as images, texts, audio and video, so that the complexity and the information quantity of the data are greatly increased. However, the complexity of processing such data, fusing correlations between different modalities, and efficiently performing cross-modality understanding pose a significant challenge to conventional machine learning algorithms.
Therefore, according to the characteristics of multi-mode data, a quantum multi-mode neural network model based on quantum computing hardware is designed and trained, the potential of quantum computing is fully utilized, and deep and comprehensive information in the multi-mode data is efficiently analyzed and accurately mined, so that the technical problem to be solved is currently needed.
Aiming at the technical problems, the embodiment of the invention provides an effective solution. Detailed schemes of the quantum model training method, the multi-mode data processing method and the device provided by the embodiment of the invention are described below with reference to fig. 1 to 10.
Fig. 1 is a schematic flow chart of a quantum model training method according to an embodiment of the present invention.
As shown in fig. 1, in the quantum model training method provided by the embodiment of the present invention, an execution body may be an electronic device having functions of data transmission and reception and data processing, and the method mainly includes the following steps:
Step 110: and constructing a multi-modal sample training set, wherein the multi-modal sample training set comprises classical input data and sample labels of multiple modalities.
In this embodiment, the multi-modal sample training set is a supervised learning data set, and the sample label may be label information corresponding to a multi-modal data pair, such as an emotion classification label corresponding to an image-text, image-video or text-video data pair; specifically, in an emotion classification scenario, the sample label may be a positive label or a negative label.
In some embodiments, some single-mode data in the multi-mode data pair may be marked as different labels, and for the case that the labels corresponding to the mode data in the multi-mode data pair are inconsistent, label consistency processing can be performed on the mode data in the multi-mode data pair, so that each data pair only corresponds to a unique label, and data accuracy of the multi-mode sample training set is ensured.
Step 120: classical input data is converted into quantum input data.
It is understood that quantum hardware can only process quantum data, so this embodiment needs to convert classical input data into quantum input data, and this process can be understood as a quantum state preparation process.
Step 130: inputting the quantum input data into a quantum multi-modal neural network model to be trained, and obtaining a sample analysis result after quantum single-modal feature extraction, quantum multi-modal feature fusion and quantum bit measurement of the quantum input data through the quantum multi-modal neural network model.
In this embodiment, the quantum multimode neural network model has the functions of quantum unimodal feature extraction, quantum multimode feature fusion and quantum bit measurement, and can accurately analyze quantum input data by utilizing the advantage of quantum computation, so as to obtain a sample analysis result.
It can be understood that the sample analysis process is a process of classifying the multi-modal data by the quantum multi-modal neural network model, for example, in a emotion analysis scene, the sample analysis result may be a numerical value capable of representing an emotion category corresponding to the quantum input data.
Step 140: and adjusting model parameters of the quantum multi-modal neural network model according to the sample analysis result and the sample label until the iteration training termination condition is met, so as to obtain the trained quantum multi-modal neural network model.
It can be understood that the quantum multi-modal neural network model is a quantum model, the sample analysis result is the predicted value of a single training round of the model, and the sample label is the corresponding true value. By analyzing the error between the predicted value and the true value, the adjustable model parameters in the quantum multi-modal neural network model can be adjusted and updated so as to continuously optimize the model, until the iterative training termination condition is met, for example the loss function value reaches its optimum or the maximum number of training iterations is reached. Training then ends and the trained quantum multi-modal neural network model is obtained, which provides effective model support for efficient and accurate multi-modal data processing.
In one embodiment, converting classical input data into quantum input data specifically includes:
Firstly, preprocessing classical input data according to respective corresponding modes to obtain vector data corresponding to each mode.
In this embodiment, a corresponding preprocessing scheme may be adopted for the data of each modality. For example, for image data, high-quality image data in a uniform format can be obtained through image preprocessing methods such as scaling, normalization, denoising, centering and standardization. For text data, preprocessing methods such as punctuation removal, case normalization, word segmentation (tokenization), stop-word removal and lemmatization can be used to clean the data.
In this embodiment, the vectorization of data of different modalities may be implemented in different ways. For example, for image data, a method based on hand-crafted features (such as principal component analysis or the histogram of oriented gradients) or a method based on deep learning (such as an autoencoder or a neural network) may be adopted to vectorize the preprocessed image data into a preset dimension, where the preset dimension may be a reasonable vector dimension set manually according to the actual application requirements, so as to obtain the vector data corresponding to the image data.
For text data, a discrete representation (e.g., one-hot encoding) or a distributed representation method (e.g., a method that efficiently encodes text data into a dense, low-dimensional vector space) may be employed to convert the cleaned text data into a vector representation of a fixed dimension (i.e., the vector data).
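As an illustration of this preprocessing and vectorization step, the following is a minimal NumPy sketch, assuming an 8x8 grayscale image, a hashed bag-of-words text representation and a target dimension of 16; the function names and the specific reduction choices are illustrative and are not prescribed by the patent.

```python
import re
import numpy as np

def image_to_vector(image: np.ndarray, dim: int = 16) -> np.ndarray:
    """Flatten a grayscale image and reduce it to a fixed dimension.

    The reduction here is a simple truncation after scaling and centering;
    PCA, an HOG descriptor or an autoencoder could be used instead.
    """
    flat = image.astype(np.float64).ravel() / 255.0   # scaling / normalization
    flat = flat - flat.mean()                         # centering
    return flat[:dim] if flat.size >= dim else np.pad(flat, (0, dim - flat.size))

def text_to_vector(text: str, dim: int = 16) -> np.ndarray:
    """Clean a sentence and hash its tokens into a fixed-dimension bag of words."""
    tokens = re.sub(r"[^\w\s]", "", text.lower()).split()  # strip punctuation, lowercase, tokenize
    vec = np.zeros(dim)
    for tok in tokens:
        vec[hash(tok) % dim] += 1.0    # hashed bucket (note: Python's str hash is salted per process)
    return vec

img_vec = image_to_vector(np.random.randint(0, 256, (8, 8)), dim=16)
txt_vec = text_to_vector("The scenery in this photo is absolutely beautiful", dim=16)
print(img_vec.shape, txt_vec.shape)   # (16,) (16,)
```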
And secondly, carrying out quantum state conversion on the vector data through an amplitude coding algorithm to obtain quantum data corresponding to each mode.
In this embodiment, since quantum hardware can only process quantum data, the vector data needs to be prepared as quantum data. This is realized by adopting an amplitude encoding algorithm to represent the vector data as a target quantum state, and then using a specific quantum circuit to prepare an initialized quantum state (generally the $|0\rangle^{\otimes n}$ state, where $n$ represents the number of qubits of the quantum circuit) into the target quantum state.
It is understood that the fundamental unit of information storage and processing in quantum computing is the qubit, similar to the classical bit in classical computing. It is physically implemented by a two-level quantum system and is mathematically represented by a two-dimensional complex vector in Hilbert space.
Specifically, a single qubit state can be represented as:
$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle \qquad (1)$$
Where $|\psi\rangle$ is the single-qubit state, $|0\rangle$ and $|1\rangle$ are the two ground states (computational basis states) of a single qubit, and the coefficients $\alpha$ and $\beta$ satisfy the normalization condition $|\alpha|^2 + |\beta|^2 = 1$.
The plurality of qubits may form a multiple qubit system whose quantum state space is the tensor product of the quantum states of each qubit, which may be expressed as:
$$|\psi\rangle = \sum_{i=0}^{2^{n}-1} c_i\,|i\rangle \qquad (2)$$
Where $|\psi\rangle$ is the multi-qubit state, $|i\rangle$ is the ground state of the $n$ qubits corresponding to the binary representation of the integer $i$, $c_i$ is the corresponding coefficient, and $n$ is the number of qubits.
In this embodiment, the quantum circuit is composed of wires and quantum gates to transmit and manipulate quantum information. The quantum gates operate on the states of the qubits and determine the evolution of the whole quantum system.
The evolution of a quantum system can be described by a quantum gate (i.e. a unitary transformation), represented by a unitary matrix $U$ satisfying $U^{\dagger}U = UU^{\dagger} = I$, where $U^{\dagger}$ is the conjugate transpose of $U$ and $I$ denotes the identity matrix of dimension $2^{n}$. A quantum gate $U$ acting on $|\psi\rangle$ yields the following transformation:
$$|\psi'\rangle = U|\psi\rangle = \sum_{i=0}^{2^{n}-1} c_i\,U|i\rangle = \sum_{i=0}^{2^{n}-1} c'_i\,|i\rangle \qquad (3)$$
Where $|\psi'\rangle$ is the result of the transformation of the quantum gate $U$ on the qubit state $|\psi\rangle$, $n$ is the number of qubits, $c_i$ and $c'_i$ are both coefficients, and $|i\rangle$ is the ground state of the $n$ qubits.
It will be appreciated that the amplitude encoding algorithm can encode classical data into the probability amplitudes of quantum data, so that $n$ qubits can represent $2^{n}$-dimensional classical data. In particular, vector data $\boldsymbol{x}$ in a $2^{n}$-dimensional space can be encoded into quantum data as follows:
$$|x\rangle = \frac{1}{\|\boldsymbol{x}\|}\sum_{i=0}^{2^{n}-1} x_i\,|i\rangle \qquad (4)$$
Where $|x\rangle$ is the quantum data corresponding to the vector data $\boldsymbol{x}$, $\|\boldsymbol{x}\|$ is the normalization constant, $|i\rangle$ denotes the quantum state corresponding to the binary representation of the integer $i$, which ranges from 0 to $2^{n}-1$, and $x_i$ denotes an element of the vector data.
Therefore, the initial state can be prepared into a target quantum state through a specific quantum circuit formed by a plurality of quantum gates based on an amplitude coding algorithm, so that quantum data can be obtained.
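The amplitude encoding of equation (4) can be illustrated with a short NumPy sketch that computes the target amplitudes; the zero-padding strategy and function name are assumptions, and on real hardware a state-preparation circuit would still be needed to load these amplitudes.

```python
import numpy as np

def amplitude_encode(vec: np.ndarray) -> np.ndarray:
    """Encode a classical vector into the probability amplitudes of a quantum state.

    The vector is zero-padded up to the next power of two (so it fits n qubits,
    2**n >= len(vec)) and normalized, giving the amplitudes x_i / ||x|| of eq. (4).
    """
    n_qubits = int(np.ceil(np.log2(len(vec))))
    padded = np.zeros(2 ** n_qubits, dtype=complex)
    padded[: len(vec)] = vec
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot encode the all-zero vector")
    return padded / norm

state = amplitude_encode(np.array([0.5, 1.0, 0.25, 2.0]))
print(state, np.isclose(np.sum(np.abs(state) ** 2), 1.0))   # normalized amplitudes, True
```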
Thirdly, carrying out tensor product processing on the quantum data of each mode, and normalizing to obtain quantum input data.
It can be understood that, because the quantum data obtained for each modality in the quantum state preparation process are separate states, they cannot be directly input into the quantum multi-modal neural network model for processing; the per-modality quantum data therefore need to be combined into a single piece of quantum data through a tensor product operation and normalized to obtain the quantum input data.
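A minimal sketch of this tensor-product combination and normalization, using NumPy's Kronecker product; the two example modal states and their qubit counts are illustrative only.

```python
import numpy as np

def combine_modalities(*states: np.ndarray) -> np.ndarray:
    """Combine per-modality quantum states into one input state via the tensor product.

    For unit-norm inputs the Kronecker product is already normalized; the explicit
    renormalization only guards against numerical drift.
    """
    joint = states[0]
    for s in states[1:]:
        joint = np.kron(joint, s)          # tensor (Kronecker) product of the modal states
    return joint / np.linalg.norm(joint)

# e.g. a 2-qubit image state and a 2-qubit text state give a 4-qubit joint input state
psi_img = np.array([0.5, 0.5, 0.5, 0.5])                  # modality 1
psi_txt = np.array([1.0, 0.0, 1.0, 0.0]) / np.sqrt(2.0)   # modality 2
psi_in = combine_modalities(psi_img, psi_txt)
print(psi_in.shape, np.isclose(np.linalg.norm(psi_in), 1.0))   # (16,) True
```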
In one embodiment, referring to fig. 2, the quantum multi-modal neural network model includes: a quantum unimodal feature extraction module 210, a quantum multimode feature fusion module 220, and a qubit measurement module 230.
The quantum unimodal feature extraction module 210 is configured to perform feature extraction on quantum data of each mode in the quantum input data, so as to obtain a plurality of unimodal features.
The quantum multimode feature fusion module 220 is configured to fuse a plurality of single-mode features to obtain multimode features.
The quantum bit measurement module 230 is configured to perform projection measurement on a target quantum bit in the quantum circuit after obtaining the multi-mode feature, operate the quantum circuit for multiple times to obtain an expected value of the quantum observability, and map the expected value of the quantum observability to a sample analysis result.
It can be understood that, as shown in fig. 2, for the data of two modes, namely, mode 1 and mode 2, after the data preprocessing of mode 1 and the data preprocessing of mode 2 and the preparation of quantum states, one quantum unimodal feature extraction module 210 extracts the unimodal feature of the quantum data corresponding to mode 1, and the other quantum unimodal feature extraction module 210 extracts the unimodal feature of the quantum data corresponding to mode 2.
In this embodiment, the quantum unimodal feature extraction module 210 may be configured individually on quantum circuits representing different modal data. Whether or not the configurations are identical, it is critical to ensure that each quantum unimodal feature extraction module 210 performs feature extraction specifically for its corresponding modal data.
The quantum multimode feature fusion module 220 may fuse the single-mode features extracted by each quantum single-mode feature extraction module 210 to obtain multimode features.
After the qubit measurement module 230 is applied following the quantum multi-modal feature fusion module 220, a measurement operation is performed on a specific qubit and the quantum circuit is run multiple times to obtain the expected value of the quantum observable, and the expected value of the quantum observable is mapped to the output result of the quantum multi-modal neural network model.
In one embodiment, referring to fig. 3, the quantum unimodal feature extraction module includes a plurality of two-qubit gates and a plurality of controlled-NOT gates.
In this embodiment, a general configuration of the quantum single-mode feature extraction module is adopted. The configuration is similar to the architecture of a convolutional neural network: based on the local-connection characteristic of convolutional neural networks, two-qubit gates are sequentially applied to adjacent single qubits in the quantum circuit corresponding to each modality, adjacent qubits are connected through controlled-NOT gates, and the operation of the two-qubit gates on the qubits drives the evolution of the quantum system so as to extract the single-modal features of the corresponding modality. In addition, based on the parameter-sharing characteristic of convolutional neural networks, the parameters of the two-qubit gates within each quantum single-mode feature extraction module are the same.
In practical applications, the two-qubit gate used in the quantum single-mode feature extraction module is a general two-qubit gate, so that any two-qubit unitary transformation can be realized and the expressive capability of the quantum single-mode feature extraction module is improved. As shown in fig. 4, the general two-qubit gate is decomposed into a series of basic quantum gates, specifically 3 controlled-NOT gates and 15 single-qubit rotation gates; this decomposition strategy minimizes the number of quantum gates, thereby enhancing the compactness and practical implementation feasibility of the quantum single-mode feature extraction module.
The single-qubit rotation gates may specifically include at least one of the parameter-adjustable $R_x$, $R_y$ and $R_z$ rotation quantum gates, whose expressions are respectively:
$$R_x(\theta) = \begin{pmatrix} \cos\frac{\theta}{2} & -i\sin\frac{\theta}{2} \\ -i\sin\frac{\theta}{2} & \cos\frac{\theta}{2} \end{pmatrix} \qquad (5)$$
$$R_y(\theta) = \begin{pmatrix} \cos\frac{\theta}{2} & -\sin\frac{\theta}{2} \\ \sin\frac{\theta}{2} & \cos\frac{\theta}{2} \end{pmatrix} \qquad (6)$$
$$R_z(\theta) = \begin{pmatrix} e^{-i\theta/2} & 0 \\ 0 & e^{i\theta/2} \end{pmatrix} \qquad (7)$$
Where $\theta$ represents the parameter in the single-qubit rotation gate, the parameters used in these gates constitute the trainable parameters of the quantum multi-modal neural network model, and $i$ represents the imaginary unit.
The controlled-NOT gate is denoted as the CNOT gate, and its corresponding matrix expression is:
$$\mathrm{CNOT} = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \end{pmatrix} \qquad (8)$$
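For concreteness, the rotation gates of equations (5)-(7), the CNOT gate of equation (8) and a two-qubit block composed of 15 single-qubit rotations and 3 CNOTs can be sketched in NumPy as follows; the exact gate placement is one common decomposition and may not match the patent's Fig. 4 gate-for-gate.

```python
import numpy as np

def Rx(t): return np.array([[np.cos(t/2), -1j*np.sin(t/2)],
                            [-1j*np.sin(t/2), np.cos(t/2)]])              # eq. (5)
def Ry(t): return np.array([[np.cos(t/2), -np.sin(t/2)],
                            [np.sin(t/2),  np.cos(t/2)]], dtype=complex)  # eq. (6)
def Rz(t): return np.array([[np.exp(-1j*t/2), 0],
                            [0, np.exp(1j*t/2)]])                         # eq. (7)

CNOT_01 = np.array([[1,0,0,0],[0,1,0,0],[0,0,0,1],[0,0,1,0]], dtype=complex)  # eq. (8), control q0
CNOT_10 = np.array([[1,0,0,0],[0,0,0,1],[0,0,1,0],[0,1,0,0]], dtype=complex)  # control q1

def u3(a, b, c):
    """General single-qubit gate as Rz-Ry-Rz (3 rotations)."""
    return Rz(a) @ Ry(b) @ Rz(c)

def two_qubit_block(p):
    """Two-qubit gate built from 15 single-qubit rotations and 3 CNOTs.

    p is a length-15 parameter vector: 4 outer Rz-Ry-Rz gates (12 parameters)
    around a 3-parameter entangling core; one common layout, not necessarily
    identical to the patent's figure.
    """
    outer_in  = np.kron(u3(*p[0:3]),  u3(*p[3:6]))
    core = (CNOT_10
            @ np.kron(Rz(p[12]), Ry(p[13]))
            @ CNOT_01
            @ np.kron(np.eye(2), Ry(p[14]))
            @ CNOT_10)
    outer_out = np.kron(u3(*p[6:9]), u3(*p[9:12]))
    return outer_out @ core @ outer_in

U = two_qubit_block(np.random.uniform(0, 2*np.pi, 15))
print(np.allclose(U @ U.conj().T, np.eye(4)))   # True: the block is unitary
```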
In one embodiment, the quantum multimode feature fusion module includes a plurality of single-quantum bit rotation gates and a plurality of controlled NOT gates.
In this embodiment, for the construction of the quantum multimode feature fusion module, a design similar to a full-connection layer is adopted, and the full-connection layer can capture complex relationships between features and perform deep fusion. The configuration of the closed-loop strongly entangled quantum circuit is formed by respectively acting a single-quantum bit rotation gate on all quantum bits of a quantum circuit representing a quantum multi-modal neural network model and then connecting adjacent quantum bits by adopting a controlled NOT gate.
As shown in FIG. 5, the single-qubit rotation gates adopt the parameterized rotation quantum gates introduced above, so that the configuration not only embodies a comprehensive feature fusion capability, but also remarkably enhances the integration and transmission efficiency of information among different modalities by utilizing the unique property of quantum entanglement, thereby effectively realizing the fusion of the plurality of single-modal features.
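A sketch of such a fusion layer, written with the PennyLane library as an assumed simulation backend: single-qubit rotations on every qubit followed by a closed ring of CNOTs, with a Pauli-Z expectation on the last qubit. The qubit count, the choice of R_y rotations and the library itself are illustrative assumptions.

```python
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def fusion_circuit(state, thetas):
    # Load the joint multi-modal input state (the tensor product of the modal states).
    qml.AmplitudeEmbedding(state, wires=range(n_qubits), normalize=True)
    # Quantum multi-modal feature fusion: one rotation per qubit ...
    for i in range(n_qubits):
        qml.RY(thetas[i], wires=i)
    # ... followed by a closed ring of CNOTs entangling adjacent qubits.
    for i in range(n_qubits):
        qml.CNOT(wires=[i, (i + 1) % n_qubits])
    # Qubit measurement module: projective Pauli-Z measurement on the last qubit.
    return qml.expval(qml.PauliZ(n_qubits - 1))

state = np.array(np.ones(2 ** n_qubits) / np.sqrt(2 ** n_qubits), requires_grad=False)
thetas = np.array(np.random.uniform(0, 2 * np.pi, n_qubits), requires_grad=True)
print(fusion_circuit(state, thetas))   # a value in [-1, 1]
```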
In one embodiment, after the qubit measurement module is applied following the quantum multi-modal feature fusion module, projection measurement is performed on a target qubit in the quantum circuit, and the expected value of the quantum observable is obtained by running the quantum circuit multiple times, which specifically includes:
Setting a quantum observable, and performing projection measurement on the target qubit in the quantum circuit with the quantum observable; it is understood that the target qubit can be the last qubit of the quantum circuit.
Running the quantum circuit multiple times, and obtaining the expected value of the quantum observable based on the quantum input data, the overall evolution process of the quantum multi-modal neural network model and the realization of the quantum observable.
In quantum computing, quantum measurement is able to extract information from quantum states. In particular, for projection measurement, assume that $M$ is a quantum observable, which can be represented by a Hermitian operator. According to the spectral decomposition theorem, the quantum observable $M$ can be expressed as:
$$M = \sum_{m} m\,P_m \qquad (9)$$
Where $m$ denotes a result that might be obtained when a projection measurement of the quantum observable $M$ is performed, i.e. an eigenvalue, and $P_m$ denotes the projector onto the eigenspace of $M$ corresponding to the eigenvalue $m$.
Assuming the quantum system is in the state $|\psi\rangle$, the measurement expectation of the observable $M$ can be calculated as:
$$\langle M \rangle = \langle\psi| M |\psi\rangle \qquad (10)$$
Where $\langle M \rangle$ represents the measurement expectation of the observable $M$, and $|\psi\rangle$ represents the quantum state in which the quantum system is located.
In this embodiment, projection measurement is performed on a specific target qubit to construct the qubit measurement module. Specifically, as shown in fig. 6, a measurement operation is applied in the quantum circuit, and by repeatedly executing the entire quantum circuit and measuring the last qubit with the quantum observable $M$, the measurement result can be expressed as the expected value of the quantum observable on that qubit, that is:
$$f(\boldsymbol{\theta}) = \langle M \rangle = \mathrm{Tr}\!\left(M\,U(\boldsymbol{\theta})\,|\psi_{\mathrm{in}}\rangle\langle\psi_{\mathrm{in}}|\,U^{\dagger}(\boldsymbol{\theta})\right) \qquad (11)$$
Where $f(\boldsymbol{\theta})$ represents the expected value of the quantum observable, $|\psi_{\mathrm{in}}\rangle$ represents the input quantum state of the multi-modal data, i.e. the quantum input data, $U(\boldsymbol{\theta})$ represents the overall unitary operation of the quantum multi-modal neural network model, $\boldsymbol{\theta}$ represents the parameter vector of the quantum multi-modal neural network model, $U^{\dagger}(\boldsymbol{\theta})$ represents the conjugate transpose of $U(\boldsymbol{\theta})$, and $\mathrm{Tr}(\cdot)$ represents the matrix trace operation; each element of the parameter vector corresponds to a parameter in one single-qubit rotation gate (i.e. $R_x$, $R_y$ or $R_z$), and $M$ is a Pauli matrix.
In practical applications, the quantum observable can adopt the Pauli-Z measurement; this configuration ensures that the expected value $f(\boldsymbol{\theta})$ of the quantum observable lies in the range $[-1, 1]$. For the classification task on multi-modal data, the expected value of the quantum observable can be used directly as the output of the quantum multi-modal neural network model, namely the analysis result output by the model.
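On a simulator, the expected value of equations (10)-(11) can be computed directly from the output state vector, as in the NumPy sketch below (on hardware it would instead be estimated from repeated measurement shots); the Pauli-Z observable on the last qubit follows the description above.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)      # Pauli-Z, the assumed observable

def expval_last_qubit(psi: np.ndarray) -> float:
    """Expectation <psi| I x ... x I x Z |psi> of Z on the last qubit (eqs. (10)-(11))."""
    n = int(np.log2(psi.size))
    M = np.kron(np.eye(2 ** (n - 1)), Z)             # observable acting only on the last qubit
    return float(np.real(np.vdot(psi, M @ psi)))     # always lies in [-1, 1]

psi = np.ones(8, dtype=complex) / np.sqrt(8)         # placeholder 3-qubit output state
print(expval_last_qubit(psi))                        # 0.0 for the uniform superposition
```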
The quantum multi-modal neural network model constructed by the embodiment has a flexible model structure, can be suitable for processing and analyzing multi-modal data of various types, has adjustability, and can be optimally configured according to the characteristics of different machine learning tasks and data processing requirements.
In an embodiment, according to a sample analysis result and a sample label, model parameters of the quantum multi-modal neural network model are adjusted, which specifically includes:
Calculating an output error between a sample analysis result and a sample label through a preset loss function;
based on the output error, the model parameters of the quantum multi-modal neural network model are adjusted by utilizing a gradient calculation and gradient descent optimization algorithm suitable for the quantum circuit so as to minimize the output error.
In some embodiments, the quantum multimode neural network model further includes a network parameter optimization module, and the model parameters can be adjusted by the network parameter optimization module to realize a parameter optimization function.
In a specific implementation, the preset loss function may use a mean square error loss function, and the mean square error loss function is used as a measurement index, so that an output error between a sample analysis result of the quantum multi-modal neural network model and a sample label corresponding to quantum input data may be estimated.
The mean square error loss function is defined as:
$$L(\boldsymbol{\theta}) = \frac{1}{N}\sum_{j=1}^{N}\big(f_j(\boldsymbol{\theta}) - y_j\big)^2 \qquad (12)$$
Where $L(\boldsymbol{\theta})$ denotes the mean square error loss function value, namely the output error, $f_j(\boldsymbol{\theta})$ denotes the expected value of the quantum observable for the $j$-th sample, namely the sample analysis result, $y_j$ denotes the corresponding sample label in the training set, $N$ denotes the size of the training set, and $\boldsymbol{\theta}$ denotes the parameter vector of the quantum multi-modal neural network model.
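A direct NumPy sketch of the loss of equation (12); the example expectation values and labels are made up for illustration.

```python
import numpy as np

def mse_loss(predictions, labels) -> float:
    """Mean squared error of eq. (12): average of (f_j(theta) - y_j)^2 over the training set."""
    predictions = np.asarray(predictions, dtype=float)
    labels = np.asarray(labels, dtype=float)
    return float(np.mean((predictions - labels) ** 2))

# expectation values produced by the model vs. sentiment labels in {-1, +1}
print(mse_loss([0.8, -0.3, 0.1], [1.0, -1.0, 1.0]))   # 0.447 (rounded)
```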
And then, according to the obtained output error, updating the adjustable model parameters in the quantum multi-modal neural network model by utilizing a gradient calculation and gradient descent optimization algorithm suitable for the parameterized quantum circuit so as to achieve the purpose of minimizing the loss function, namely minimizing the output error.
In some embodiments, the gradient may be calculated using a gradient calculation method suitable for parameterized quantum circuits, such as a parameter shift rule and a chain rule.
Specifically, to simplify the expression, the gradient calculation is given for a single training sample ($N=1$); the multi-sample case follows analogously. By the chain rule and the parameter-shift rule, the derivative of the mean square error loss function value $L(\boldsymbol{\theta})$ with respect to the parameter $\theta_k$ is:
$$\frac{\partial L(\boldsymbol{\theta})}{\partial \theta_k} = \big(f(\boldsymbol{\theta}) - y\big)\left[f\!\left(\boldsymbol{\theta}+\frac{\pi}{2}\boldsymbol{e}_k\right) - f\!\left(\boldsymbol{\theta}-\frac{\pi}{2}\boldsymbol{e}_k\right)\right] \qquad (13)$$
Where $L(\boldsymbol{\theta})$ denotes the mean square error loss function value, i.e. the output error; $\boldsymbol{e}_k$ is the vector whose $k$-th element is 1 and whose other elements are 0; $f(\boldsymbol{\theta})$ denotes the expected value of the quantum observable of equation (11) evaluated on the quantum input data, i.e. the sample analysis result; $y$ denotes the sample label; $\boldsymbol{\theta}$ denotes the parameter vector of the quantum multi-modal neural network model; and the $\pm\pi/2$ shifts act on the parameter of one single-qubit rotation gate.
According to the calculated derivative values, this embodiment further uses a stochastic gradient descent optimization algorithm to update the model parameters of the quantum multi-modal neural network model along the gradient descent direction of the loss function; the specific update rule is:
$$\boldsymbol{\theta} \leftarrow \boldsymbol{\theta} - \eta\,\nabla_{\boldsymbol{\theta}} L(\boldsymbol{\theta}) \qquad (15)$$
Where the left-hand $\boldsymbol{\theta}$ denotes the updated model parameters, $\eta$ denotes the learning rate, which controls the step size of the parameter update, and $\nabla_{\boldsymbol{\theta}} L(\boldsymbol{\theta})$ denotes the gradient of the mean square error loss function value, i.e. the output error, with respect to the parameters.
By iteratively updating the model parameters, the model parameters can be gradually optimized and adjusted, so that the output result of the quantum multi-modal neural network model is more similar to a sample label. And obtaining the trained quantum multi-modal neural network model with optimal performance after the iteration training termination condition is met.
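The gradient of equation (13) and the update rule of equation (15) can be sketched as follows, assuming the circuit expectation f satisfies the standard parameter-shift rule; the toy cost function, learning rate and termination threshold are illustrative, not taken from the patent.

```python
import numpy as np

def parameter_shift_grad(f, theta: np.ndarray, y: float) -> np.ndarray:
    """Gradient of the single-sample MSE loss (f(theta) - y)^2 via the parameter-shift rule.

    f is the circuit expectation value as a function of the parameter vector; each
    partial derivative of f is (f(theta + pi/2 e_k) - f(theta - pi/2 e_k)) / 2.
    """
    grad = np.zeros_like(theta)
    for k in range(theta.size):
        shift = np.zeros_like(theta)
        shift[k] = np.pi / 2
        df_dk = 0.5 * (f(theta + shift) - f(theta - shift))
        grad[k] = 2.0 * (f(theta) - y) * df_dk       # chain rule through the squared error
    return grad

def train(f, theta, y, lr=0.1, max_iter=200, tol=1e-6):
    """Gradient-descent update of eq. (15): theta <- theta - lr * grad."""
    for _ in range(max_iter):
        theta = theta - lr * parameter_shift_grad(f, theta, y)
        if (f(theta) - y) ** 2 < tol:                # iteration termination condition
            break
    return theta

# toy stand-in for a circuit expectation value; a real model would run the quantum circuit
f = lambda t: np.cos(t[0]) * np.cos(t[1])
theta = train(f, np.array([0.3, 0.2]), y=1.0)
print(f(theta))   # close to the target label 1.0 after training
```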
In an embodiment, after obtaining the trained quantum multi-modal neural network model, the method may further include:
Constructing a multi-mode sample test set, wherein the multi-mode sample test set comprises classical input data of multiple modes and sample labels;
converting classical input data into quantum input data;
and testing the trained quantum multi-modal neural network model based on the quantum input data and the sample label.
In some embodiments, a multi-modal sample dataset for a multi-modal data analysis task may be collected first, and the multi-modal sample dataset may be divided into a multi-modal sample training set and a multi-modal sample testing set according to a certain proportion.
It can be understood that the processing scheme of the classical input data in the multi-mode sample test set is basically consistent with the processing scheme of the classical input data in the multi-mode sample training set, and will not be described in detail herein.
In a specific training scenario, referring to fig. 7, the quantum model training method provided in this embodiment is specifically implemented as follows:
step 310: and constructing a multi-modal sample training set.
Step 320: the data is preprocessed and encoded into quantum states, and the process mainly carries out corresponding preprocessing on classical input data in a multi-mode sample training set according to different modes and encodes the processed data into quantum input data.
Step 330: quantum state input, the process inputs quantum input data into a quantum multi-modal neural network model.
Step 340: and constructing a quantum multi-modal neural network model and initializing parameters, wherein the quantum multi-modal neural network model is a model to be trained, and quantum input data can be input into the model to start a training process.
Step 350: and operating a quantum single-mode feature extraction module to extract single-mode features of different mode data in the quantum input data.
Step 360: and operating a quantum multi-mode feature fusion module to fuse all the single-mode features.
Step 370: operating the qubit measurement module and obtaining an output result; in this process, a measurement operation is performed on specific qubits after feature fusion and the quantum circuit is run multiple times to obtain the expected value of the quantum observable, and the expected value of the quantum observable is mapped to the output result of the quantum multi-modal neural network model.
Step 380: in this process, an output error is obtained by comparing the sample analysis result output by the quantum multi-modal neural network model with the corresponding sample label, and the adjustable model parameters of the quantum multi-modal neural network model are then updated and adjusted through gradient calculation and a gradient descent algorithm so as to optimize the model parameters.
Step 390: determining whether a set threshold condition is reached, for example whether the output error has reached its minimum or whether the maximum number of iterations has been reached; if neither of the two conditions is satisfied, indicating that the set threshold condition is not reached, it is necessary to return to step 350 to continue the iterative training.
Step 3100: if one of the conditions in step 390 is satisfied, indicating that the set threshold condition is reached, then the model is saved, that is, the trained quantum multi-modal neural network model is saved, and the training process is ended.
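The whole flow of steps 310-3100 can be sketched compactly with PennyLane as an assumed backend; the two-qubits-per-modality layout, the simplified single-mode blocks, the toy training pairs and the optimizer settings are all illustrative assumptions rather than the patent's concrete configuration.

```python
import pennylane as qml
from pennylane import numpy as np

n_per_mode, n_modes = 2, 2
n_qubits = n_per_mode * n_modes
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def model(state, w_single, w_fuse):
    # Step 330: quantum state input (the joint multi-modal input state).
    qml.AmplitudeEmbedding(state, wires=range(n_qubits), normalize=True)
    # Step 350: quantum single-mode feature extraction, one block per modality
    # (simplified here to a shared rotation parameter per block plus one CNOT).
    for m in range(n_modes):
        base = m * n_per_mode
        for q in range(n_per_mode):
            qml.RY(w_single[m], wires=base + q)   # parameter sharing within a block
        qml.CNOT(wires=[base, base + 1])
    # Step 360: quantum multi-modal feature fusion (rotations plus a CNOT ring).
    for q in range(n_qubits):
        qml.RY(w_fuse[q], wires=q)
    for q in range(n_qubits):
        qml.CNOT(wires=[q, (q + 1) % n_qubits])
    # Step 370: qubit measurement, used directly as the analysis result in [-1, 1].
    return qml.expval(qml.PauliZ(n_qubits - 1))

def cost(w_single, w_fuse):
    # Step 380: mean squared error between analysis results and sample labels.
    errs = [(model(s, w_single, w_fuse) - y) ** 2 for s, y in zip(states, labels)]
    return sum(errs) / len(errs)

# Toy training set: two joint input states with sentiment labels -1 and +1.
states = [np.array(np.ones(2 ** n_qubits) / 4.0, requires_grad=False),
          np.array(np.eye(2 ** n_qubits)[0], requires_grad=False)]
labels = np.array([-1.0, 1.0], requires_grad=False)
w_single = np.array(np.random.uniform(0, np.pi, n_modes), requires_grad=True)
w_fuse = np.array(np.random.uniform(0, np.pi, n_qubits), requires_grad=True)

opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(100):                               # Step 390: fixed iteration budget
    (w_single, w_fuse), loss = opt.step_and_cost(cost, w_single, w_fuse)
print(loss)   # Step 3100: save the trained parameters once the loss is acceptable
```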
It is not difficult to find that the quantum multi-modal neural network model obtained by training through the quantum model training method provided by the embodiment has at least the following advantages:
Firstly, the quantum multi-modal neural network model constructed and trained by the embodiment can effectively process multi-modal data analysis tasks by utilizing the advantages of quantum computation, breaks through the limitation that the quantum neural network is limited to process single-modal data, and remarkably expands the application field of the current quantum neural network.
Secondly, the quantum multi-mode neural network model efficiently extracts single-mode characteristics such as texts, images and the like through the local connection and parameter sharing characteristics of the quantum single-mode characteristic extraction module, and utilizes the strongly entangled quantum multi-mode characteristic fusion module to realize the effective fusion of the single-mode characteristics, so that the performance of the quantum multi-mode neural network model on multi-mode data processing and analysis tasks is remarkably improved.
Thirdly, the quantum multi-modal neural network model has excellent generalization capability and is suitable for processing various complex multi-modal data. The model is flexible in design, and can be customized and expanded according to different multi-mode task requirements, so that high-efficiency data processing capability is maintained in various application scenes.
Fig. 8 is a flowchart of a multi-mode data processing method according to an embodiment of the present invention.
As shown in fig. 8, the method for processing multi-mode data provided in this embodiment specifically includes:
Step 410: acquiring multi-mode data to be processed; the multi-mode data to be processed comprises at least two of text, images, audio and video.
In this embodiment, the multi-mode data to be processed may be any two or more modes of data combination, such as a text-image data combination composed of text and image.
For example, the multimodal data to be processed in an emotion analysis scene may be a text-image data combination of a text describing emotion and an image depicting emotion.
Step 420: and converting the multi-mode data to be processed into quantum data to be processed.
It can be understood that the process of converting the multi-mode data to be processed into the quantum data to be processed is also a preparation process of quantum state, and can refer to an implementation flow of converting classical input data into quantum input data.
Step 430: inputting the quantum to-be-processed data into a trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model.
The quantum multi-mode neural network model is obtained by training based on the quantum model training method provided by the embodiment.
Aiming at emotion analysis scenes, the implementation flow of the multi-mode data processing method provided by the embodiment is as follows:
firstly, multi-mode data to be processed with emotion polarity is obtained.
And secondly, converting the multi-mode data to be processed with emotion polarity into quantum data to be processed.
And thirdly, inputting the data to be processed into the trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model.
And fourthly, determining an emotion analysis result according to the data analysis result.
In this embodiment, the data analysis result is a predicted value within a specific range of values, for example, the data analysis result is a value within [ -1,1], which can represent the classification result of the multi-modal data to be processed.
For example, for emotion analysis scenario, if the data analysis result is a certain value between-1 and 0, it may be determined that the emotion analysis result of the multi-mode data to be processed is negative emotion. If the data analysis result is a certain value between 0 and 1, the emotion analysis result of the multi-mode data to be processed can be determined to be positive emotion.
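A one-line mapping from the analysis result to the emotion class, following the thresholds described above (the handling of exactly 0 is an assumption):

```python
def to_sentiment(expectation: float) -> str:
    """Map the model's analysis result in [-1, 1] to an emotion class."""
    return "positive" if expectation > 0 else "negative"

print(to_sentiment(0.62), to_sentiment(-0.45))   # positive negative
```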
With respect to the quantum multi-modal neural network model in the above embodiments, the model training process and the specific scheme of the model structure have been described in detail in the embodiments related to the quantum model training method, and will not be described in detail herein.
It is not difficult to find that the multi-mode data processing method provided by the embodiment can utilize the advantage of quantum computation by applying the trained quantum multi-mode neural network model, so that the processing result of multi-mode data can be accurately and efficiently obtained, and the application requirements of multi-mode data classification tasks under different scenes can be met.
Based on the same general inventive concept, the present invention also protects a multi-mode data processing device, and the multi-mode data processing device provided by the present invention is described below, and the multi-mode data processing device described below and the multi-mode data processing method described above can be referred to correspondingly.
FIG. 9 is a schematic diagram of a multi-mode data processing apparatus according to an embodiment of the present invention.
As shown in fig. 9, the multi-mode data processing apparatus provided in the embodiment of the present invention specifically includes:
An acquisition module 510, configured to acquire multi-mode data to be processed; the multi-mode data to be processed comprises at least two of text, images, audio and video.
The conversion module 520 is configured to convert the multi-mode data to be processed into quantum data to be processed.
The processing module 530 is configured to input the quantum to-be-processed data into the trained quantum multi-modal neural network model, and obtain a data analysis result output by the quantum multi-modal neural network model; the quantum multi-mode neural network model is obtained by training based on the quantum model training method provided by the embodiment.
The specific manner in which the respective modules perform the operations in the apparatus of the above embodiments has been described in detail in the embodiments related to the multi-mode data processing method, and will not be described in detail herein.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
As shown in fig. 10, the electronic device may include: classical computer 60 and quantum computer 70, classical computer 60 includes a processor (processor) 610, a communication interface (Communications Interface) 620, a memory (memory) 630, and a communication bus 640, where processor 610, communication interface 620, memory 630 complete communication with each other through communication bus 640. The processor 610 may call logic instructions in the memory 630 to perform the following operations: acquiring multi-mode data to be processed; the multi-mode data to be processed comprises at least two of texts, images and videos; the multi-modal data to be processed is converted into quantum data to be processed and the quantum data to be processed is sent to the quantum computer 70. The quantum computer 70 can input the data to be processed into the trained quantum multi-modal neural network model to obtain the data analysis result output by the quantum multi-modal neural network model; the quantum multi-mode neural network model is obtained by training based on the quantum model training method provided by the embodiment.
Further, the logic instructions in the memory 630 may be implemented in the form of software functional units and stored in a computer-readable storage medium when sold or used as a stand-alone product. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product, the computer program product comprising a computer program, the computer program being storable on a non-transitory computer readable storage medium, the computer program, when executed by a processor, enabling a classical computer to perform the multi-modal data processing method provided by the above embodiments in cooperation with a quantum computer, the method comprising: acquiring multi-mode data to be processed; the multi-mode data to be processed comprises at least two of texts, images and videos; converting the multi-mode data to be processed into quantum data to be processed; inputting the quantum to-be-processed data into a trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model; the quantum multi-mode neural network model is obtained by training based on the quantum model training method provided by the embodiment.
In yet another aspect, the present invention further provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor and a quantum computer, can implement the multi-modal data processing method provided in the foregoing embodiments, the method including: acquiring multi-mode data to be processed; the multi-mode data to be processed comprises at least two of texts, images and videos; converting the multi-mode data to be processed into quantum data to be processed; inputting the quantum to-be-processed data into a trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model; the quantum multi-mode neural network model is obtained by training based on the quantum model training method provided by the embodiment.
The apparatus embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
From the above description of the embodiments, it will be clear to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general hardware platform, where of course quantum hardware participation is required. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, a quantum memory, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the method described in the various embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (6)

1. A method of quantum model training, comprising:
Constructing a multi-modal sample training set, wherein the multi-modal sample training set comprises classical input data of multiple modes and sample labels, the sample labels are label information corresponding to multi-modal data pairs, and the multi-modal data comprises at least two of texts, images, audios and videos;
Converting the classical input data into quantum input data;
Inputting the quantum input data into a quantum multi-modal neural network model to be trained, and obtaining a sample analysis result after the quantum input data undergoes quantum single-mode feature extraction, quantum multi-modal feature fusion and quantum bit measurement in the quantum multi-modal neural network model;
Adjusting, according to the sample analysis result and the sample label, the model parameters of the quantum multi-modal neural network model until an iterative training termination condition is met, thereby obtaining the trained quantum multi-modal neural network model;
The quantum multi-modal neural network model comprises: a quantum single-mode feature extraction module, a quantum multi-modal feature fusion module and a quantum bit measurement module;
The quantum single-mode feature extraction module is configured to perform feature extraction on the quantum data of each modality in the quantum input data to obtain a plurality of single-mode features; the quantum multi-modal feature fusion module is configured to fuse the plurality of single-mode features to obtain a multi-modal feature; the quantum bit measurement module is configured to set a quantum observable after the multi-modal feature is obtained, perform projective measurement on target quantum bits in the quantum circuit by means of the quantum observable, run the quantum circuit a plurality of times, obtain an expected value of the quantum observable based on the quantum input data, the overall evolution process of the quantum multi-modal neural network model and the implementation process of the quantum observable, and map the expected value of the quantum observable to the sample analysis result;
The quantum single-mode feature extraction module comprises a plurality of two-qubit gates and a plurality of controlled-NOT gates; in each quantum single-mode feature extraction module, the two-qubit gates are applied sequentially to adjacent single qubits in the quantum circuit corresponding to the respective modality, and the adjacent single qubits are connected through the controlled-NOT gates, such that the parameters of the plurality of two-qubit gates in each quantum single-mode feature extraction module are the same;
The quantum multi-modal feature fusion module comprises a plurality of single-qubit rotation gates and a plurality of controlled-NOT gates; in the quantum multi-modal feature fusion module, the single-qubit rotation gates are applied to all the qubits of the quantum circuit, and adjacent qubits are connected through the controlled-NOT gates, so as to fuse the plurality of single-mode features.
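By way of a non-limiting illustration of the circuit structure recited in claim 1, the following PennyLane sketch uses a CRY gate as the parameter-shared two-qubit gate, RY as the single-qubit rotation gate, and two qubits per modality; these gate choices and the qubit layout are assumptions made for illustration and are not specified by the claim.

```python
# Illustrative only: CRY / RY gate choices and the 2-qubits-per-modality layout
# are assumptions, not requirements of claim 1.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4                                   # qubits 0-1: modality A, qubits 2-3: modality B
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(state, theta_mod, theta_fuse):
    # Quantum input data prepared as amplitudes over all qubits.
    qml.AmplitudeEmbedding(features=state, wires=range(n_qubits), normalize=True)

    # Quantum single-mode feature extraction: within each modality's sub-circuit,
    # a two-qubit gate acts on adjacent qubits with a parameter shared across the
    # module, and adjacent qubits are linked by a controlled-NOT gate.
    for m, block in enumerate([(0, 1), (2, 3)]):
        for a, b in zip(block[:-1], block[1:]):
            qml.CRY(theta_mod[m], wires=[a, b])    # shared parameter within the module
            qml.CNOT(wires=[a, b])

    # Quantum multi-modal feature fusion: a single-qubit rotation gate on every
    # qubit, with controlled-NOT gates entangling neighbouring qubits.
    for w in range(n_qubits):
        qml.RY(theta_fuse[w], wires=w)
    for w in range(n_qubits - 1):
        qml.CNOT(wires=[w, w + 1])

    # Qubit measurement: expectation value of an observable on a target qubit,
    # which is then mapped to the sample analysis result.
    return qml.expval(qml.PauliZ(0))

state = np.ones(2 ** n_qubits, requires_grad=False) / np.sqrt(2 ** n_qubits)  # toy quantum input
theta_mod  = np.array([0.3, 0.7], requires_grad=True)    # one shared parameter per modality
theta_fuse = np.array([0.1, 0.2, 0.3, 0.4], requires_grad=True)
print(circuit(state, theta_mod, theta_fuse))
```

One practical consequence of the parameter sharing recited in claim 1 is that each single-mode module contributes only one trainable parameter regardless of how many adjacent qubit pairs it spans, which keeps the number of circuit parameters small.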
2. The quantum model training method of claim 1, wherein the converting the classical input data into quantum input data comprises:
Preprocessing the classical input data according to the respective modality of each item of data to obtain vector data corresponding to each modality;
Performing quantum state conversion on the vector data through an amplitude encoding algorithm to obtain quantum data corresponding to each modality;
And performing tensor product processing on the quantum data of each modality, followed by normalization, to obtain the quantum input data.
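A minimal NumPy sketch of the conversion recited in claim 2 is given below; the zero-padding to a power of two and the example feature lengths are assumptions made for illustration only.

```python
# Illustrative sketch of claim 2's conversion; padding scheme and vector sizes are assumptions.
import numpy as np

def amplitude_encode(vec):
    """Amplitude encoding: pad the vector to the next power of two and normalize,
    so the entries can be read as amplitudes of a quantum state."""
    vec = np.asarray(vec, dtype=float)
    dim = 1 << (len(vec) - 1).bit_length()     # next power of two
    padded = np.zeros(dim)
    padded[: len(vec)] = vec
    return padded / np.linalg.norm(padded)

text_vec  = np.array([0.2, 0.5, 0.1, 0.9])          # preprocessed text modality
image_vec = np.array([0.7, 0.3, 0.8, 0.4, 0.6])     # preprocessed image modality

# Quantum data for each modality.
psi_text, psi_image = amplitude_encode(text_vec), amplitude_encode(image_vec)

# Tensor product across modalities, followed by normalization, yields the quantum input data.
quantum_input = np.kron(psi_text, psi_image)
quantum_input /= np.linalg.norm(quantum_input)
print(quantum_input.shape)   # (32,) amplitudes, i.e. a 5-qubit state
```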
3. The quantum model training method according to claim 1, wherein the adjusting the model parameters of the quantum multi-modal neural network model according to the sample analysis result and the sample label comprises:
Calculating an output error between the sample analysis result and the sample label through a preset loss function;
Based on the output error, adjusting the model parameters of the quantum multi-modal neural network model by using a gradient calculation and gradient descent optimization algorithm applicable to quantum circuits, so as to minimize the output error.
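Continuing the PennyLane sketch given after claim 1, the parameter adjustment of claim 3 could look as follows; the squared-error loss and plain gradient descent are illustrative choices, and PennyLane can evaluate circuit gradients with hardware-compatible rules such as the parameter-shift rule.

```python
# Illustrative training step for claim 3, reusing `circuit`, `state`,
# `theta_mod` and `theta_fuse` from the sketch after claim 1.
import pennylane as qml

opt = qml.GradientDescentOptimizer(stepsize=0.1)
label = 1.0                                             # toy sample label

def loss(theta_mod, theta_fuse):
    prediction = circuit(state, theta_mod, theta_fuse)  # expectation value in [-1, 1]
    return (prediction - label) ** 2                    # preset loss: squared output error

for step in range(50):                                  # iterate until the error is small enough
    (theta_mod, theta_fuse), cost = opt.step_and_cost(loss, theta_mod, theta_fuse)

print("final output error:", cost)
```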
4. The quantum model training method of claim 1, wherein after obtaining the trained quantum multi-modal neural network model, the method further comprises:
Constructing a multi-mode sample test set; the multi-modal sample test set comprises classical input data of multiple modalities and sample labels;
Converting the classical input data into quantum input data;
And testing the trained quantum multi-modal neural network model based on the quantum input data and the sample label.
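Under the same illustrative assumptions, the test step of claim 4 could be sketched as an accuracy check over the converted test samples; the sign threshold, the ±1 labels and the accuracy metric are assumptions, not part of the claim.

```python
# Illustrative test-set evaluation for claim 4; thresholding at zero and the
# accuracy metric are assumptions, and `circuit` is the sketch after claim 1.
def evaluate(quantum_test_inputs, test_labels, theta_mod, theta_fuse):
    correct = 0
    for state, label in zip(quantum_test_inputs, test_labels):
        prediction = 1.0 if circuit(state, theta_mod, theta_fuse) >= 0.0 else -1.0
        correct += int(prediction == label)
    return correct / len(test_labels)
```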
5. A method of multi-modal data processing comprising:
Acquiring multi-modal data to be processed, the multi-modal data to be processed comprising at least two of text, images, audio and video;
Converting the multi-modal data to be processed into quantum data to be processed;
Inputting the quantum data to be processed into a trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model;
the quantum multi-modal neural network model is trained based on the quantum model training method according to any one of claims 1 to 4.
6. A multi-modal data processing apparatus, comprising:
The acquisition module is configured to acquire multi-modal data to be processed, the multi-modal data to be processed comprising at least two of text, images, audio and video;
The conversion module is configured to convert the multi-modal data to be processed into quantum data to be processed;
The processing module is configured to input the quantum data to be processed into a trained quantum multi-modal neural network model to obtain a data analysis result output by the quantum multi-modal neural network model, wherein the quantum multi-modal neural network model is trained based on the quantum model training method according to any one of claims 1 to 4.
CN202410132334.8A 2024-01-31 2024-01-31 Quantum model training method, multi-mode data processing method and device Active CN117669753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410132334.8A CN117669753B (en) 2024-01-31 2024-01-31 Quantum model training method, multi-mode data processing method and device

Publications (2)

Publication Number Publication Date
CN117669753A (en) 2024-03-08
CN117669753B (en) 2024-04-16

Family

ID=90064493

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410132334.8A Active CN117669753B (en) 2024-01-31 2024-01-31 Quantum model training method, multi-mode data processing method and device

Country Status (1)

Country Link
CN (1) CN117669753B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3619655A1 (en) * 2017-06-02 2020-03-11 Google LLC Quantum neural network
US20230259779A1 (en) * 2022-02-15 2023-08-17 Samsung Electronics Co., Ltd. Method of processing multimodal tasks, and an apparatus for the same

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107832663A (en) * 2017-09-30 2018-03-23 天津大学 A kind of multi-modal sentiment analysis method based on quantum theory
CN110516723A (en) * 2019-08-15 2019-11-29 天津师范大学 A kind of multi-modal ground cloud atlas recognition methods based on the fusion of depth tensor
CN111401558A (en) * 2020-06-05 2020-07-10 腾讯科技(深圳)有限公司 Data processing model training method, data processing device and electronic equipment
CN113159239A (en) * 2021-06-28 2021-07-23 北京航空航天大学 Method for processing graph data by quantum graph convolutional neural network
CN113361664A (en) * 2021-08-10 2021-09-07 北京航空航天大学 Image recognition system and method based on quantum convolution neural network
CN114693942A (en) * 2022-03-31 2022-07-01 重庆大学 Multimode fault understanding and auxiliary labeling method for intelligent operation and maintenance of instruments and meters
CN116153390A (en) * 2022-07-15 2023-05-23 上海图灵智算量子科技有限公司 Quantum convolutional neural network-based drug binding energy prediction method
CN115374948A (en) * 2022-08-05 2022-11-22 北京百度网讯科技有限公司 Quantum neural network training method, data processing method, device and medium
CN115828999A (en) * 2022-10-21 2023-03-21 中国人民解放军战略支援部队信息工程大学 Quantum convolution neural network construction method and system based on quantum state amplitude transformation
CN116468075A (en) * 2022-11-02 2023-07-21 成都信息工程大学 Big data processing method and system based on quantum fuzzy neural network model
CN117291274A (en) * 2023-10-31 2023-12-26 叶淦晟 Quantum computing assisted multi-modal language large model learning system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
QNMF: A quantum neural network based multimodal fusion system for intelligent diagnosis; Qu, Zhiguo et al.; Web of Science ResearcherID; 20230826; full text *
Quantum neural network model based on quantum gate groups and its applications; Li Sheng; Zhang Peilin; Li Bing; Zhou Yunchuan; Computer Engineering and Applications; 20140625 (No. 06); full text *
A survey of multimodal sentiment analysis research; Zhang Yazhou; Rong Lu; Song Dawei; Zhang Peng; Pattern Recognition and Artificial Intelligence; 20200515 (No. 05); full text *

Also Published As

Publication number Publication date
CN117669753A (en) 2024-03-08

Similar Documents

Publication Publication Date Title
US10417525B2 (en) Object recognition with reduced neural network weight precision
CN113361664B (en) Image recognition system and method based on quantum convolution neural network
WO2023236977A1 (en) Data processing method and related device
CN114821217B (en) Image recognition method and device based on quantum classical hybrid neural network
CN113011568B (en) Model training method, data processing method and equipment
CN114564593A (en) Completion method and device of multi-mode knowledge graph and electronic equipment
CN112598053A (en) Active significance target detection method based on semi-supervised learning
EP4318313A1 (en) Data processing method, training method for neural network model, and apparatus
WO2023231954A1 (en) Data denoising method and related device
CN116543388B (en) Conditional image generation method and related device based on semantic guidance information
CN117437494B (en) Image classification method, system, electronic equipment and storage medium
CN112163114A (en) Image retrieval method based on feature fusion
CN112115744B (en) Point cloud data processing method and device, computer storage medium and electronic equipment
CN115830375A (en) Point cloud classification method and device
WO2022125181A1 (en) Recurrent neural network architectures based on synaptic connectivity graphs
CN117669753B (en) Quantum model training method, multi-mode data processing method and device
Altares-López et al. AutoQML: Automatic generation and training of robust quantum-inspired classifiers by using evolutionary algorithms on grayscale images
EP4189609A1 (en) Quantum computing device for determining a network parameter
CN116976405A (en) Variable component shadow quantum neural network based on immune optimization algorithm
CN114445692B (en) Image recognition model construction method and device, computer equipment and storage medium
CN116341666A (en) Quantum convolution neural network design method and system based on quantum circuit
CN115544307A (en) Directed graph data feature extraction and expression method and system based on incidence matrix
Kashyap et al. Quantum convolutional neural network architecture for multi-class classification
Xue et al. Fast and unsupervised neural architecture evolution for visual representation learning
CN113077003A (en) Graph attention network inductive learning method based on graph sampling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant