CN114519430A - Soft quantum neural network system and pattern recognition method - Google Patents


Info

Publication number
CN114519430A
Authority
CN
China
Prior art keywords
quantum
soft
layer
neurons
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210413229.2A
Other languages
Chinese (zh)
Inventor
尹华磊
周民罡
刘志平
富尧
徐同恺
陈增兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University
Matrix Time Digital Technology Co Ltd
Original Assignee
Nanjing University
Matrix Time Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University, Matrix Time Digital Technology Co Ltd
Priority to CN202210413229.2A
Publication of CN114519430A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N10/00 - Quantum computing, i.e. information processing based on quantum-mechanical phenomena
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G06N3/08 - Learning methods

Abstract

The invention discloses a soft quantum neural network system and a pattern recognition method. The system comprises: an input layer and an output layer comprising a plurality of soft quantum neurons; a quantum state input unit, which prepares a qubit set by encoding classical data and initializes the states of the input-layer soft quantum neurons according to the qubit set; a quantum state measurement unit, which measures the quantum states of the soft quantum neurons; a first operation unit, which directionally performs controlled operations between connected soft quantum neurons according to the measurement results; a second operation unit, which performs an evolution operation on a soft quantum neuron connected to the previous layer after all controlled operations are completed; and an optimization unit, which compares the difference between the measurement result of the output layer and the label value of the classical data and optimizes it until the difference falls within a preset range. The invention is easy to scale and implement, is more robust, and is highly fault-tolerant to noisy quantum evolution and measurement.

Description

Soft quantum neural network system and pattern recognition method
Technical Field
The invention relates to the technical field of pattern recognition, in particular to a soft quantum neural network system and a pattern recognition method.
Background
Soft computing is a collection of methods that includes techniques such as fuzzy logic, genetic algorithms, artificial neural networks, machine learning, and expert systems. It has been a major research area of automatic control engineering since it was first proposed in 1980. Soft computing aims to solve complex problems by tolerating inaccuracy, uncertainty and partial truth, so as to achieve simplicity, robustness and low cost in problem solving. This approach offers an opportunity to capture the fuzziness of human thinking in the face of an uncertain reality. In recent years, enabled by big data and high computing power and by the renaissance of deep learning, artificial neural networks have achieved tremendous success in fields such as automatic control, signal processing, pattern recognition and decision support. Meanwhile, in the big-data era, the data processing capacity of traditional computers is approaching its limit, while the scale and dimensionality of data keep growing at an exponential rate, which poses great challenges to classical machine learning techniques. Exploring the advantages of quantum computing in large-scale data processing and storage, and combining them with machine learning techniques, is expected to provide new solutions.
Quantum machine learning is an emerging research field combining quantum computing and machine learning, and is currently developing rapidly. Quantum machine learning aims to design quantum algorithms that can perform machine learning tasks. By exploiting quantum resources such as quantum superposition and quantum entanglement, quantum machine learning is expected to bring exponential speed-ups to traditional machine learning on the basis of the efficient parallelism of quantum computation, or to solve specific problems that traditional machine learning finds difficult. Therefore, how to design an appropriate quantum machine learning model for quantum devices, and how to demonstrate quantum machine learning performance exceeding that of classical machine learning, have become core research questions in this field.
Artificial neural networks are a dominant machine learning model. In recent years they have shown excellent performance in pattern recognition tasks such as computer vision and natural language processing. The simplest and most widely applied artificial neural network architecture is the fully connected feedforward neural network, which is divided into an input layer, hidden layers and an output layer. Each layer has several neurons, and each neuron is connected only to the neurons of the previous layer. Input data propagate in a single direction from the input layer to the output layer, and the hidden-layer structure endows the feedforward neural network with a strong ability to learn nonlinear features from data.
Because of the powerful representation capability of traditional artificial neural networks, the quantization of traditional neural network models has always been a major topic in the field of quantum machine learning. The concept of quantum neural networks was first proposed in 1995 by Kak (Advances in Imaging and Electron Physics, 94, 259-313) and has undergone decades of development. Current quantum neural network models can be roughly divided into the following categories: measurement-based quantum neural networks (Advances in Imaging and Electron Physics, 94, 259-313); quantum-dot-based neural networks (Proceedings of the 4th Workshop on Physics of Computation, 1996: 22-24); and quantum-circuit-based neural networks (International Conference on Natural Computation, Springer, Berlin, Heidelberg, 2005: 283-). The most representative quantum neural network today is the variational quantum circuit model proposed in 2018 by Farhi and Neven of the Google Quantum AI team (arXiv preprint arXiv:1802.06002 (2018)).
This model is a quantum circuit composed of a series of parameterized single-qubit and two-qubit quantum gates. Classical data (which must be encoded onto quantum states) or quantum data serve as the input, and the expectation value of the final measurement of the circuit serves as the label. Appropriate parameters can be found by a classical optimizer so that the model learns the hidden patterns in the data and completes pattern recognition tasks such as classification. Compared with traditional neural networks, the model has the advantages of a significantly reduced model size and high robustness and fault tolerance. However, the practical deployment of the model is limited by the development of gate-based quantum computing experimental techniques. Owing to decoherence and dissipation in quantum systems, manufacturing large-scale quantum circuits with a large number of qubits and high-fidelity quantum gates remains very challenging, so exploring a quantum neural network model that is physically easy to engineer, easy to scale up and robust as a system is of great significance.
Disclosure of Invention
The purpose of the invention is as follows: aiming at the shortcomings of existing models, the invention provides a soft quantum neural network system and a pattern recognition method that are scalable and can process data quickly to complete pattern recognition tasks.
The technical solution is as follows: the invention provides a soft quantum neural network system implemented by one or more quantum processors, the system comprising:
an input layer and an output layer comprising a plurality of soft quantum neurons, wherein each soft quantum neuron is a physical qubit and the quantum state of each soft quantum neuron is a pure state or a mixed state;
a quantum state input unit, which encodes classical data to prepare a qubit set carrying the information of the classical data, or receives a prepared qubit set as input; the qubit set is the initialization state of the soft quantum neurons of the input layer;
a quantum state measurement unit, which measures the quantum state of each soft quantum neuron of every layer before the output layer in the computational basis to obtain a measurement result of 0 or 1; and measures each soft quantum neuron of the output layer, where the measurement result is a single outcome 0 or 1 together with the probability distribution over 0 and 1 obtained after a large number of measurements, maps the measurement result to the label value of the known classical data, and gives the output result of the pattern recognition task;
a first operation unit, which directionally performs controlled operations between connected soft quantum neurons according to the measurement results;
a second operation unit, which performs an evolution operation on a soft quantum neuron connected to the previous layer after the controlled operations between it and all the previous-layer soft quantum neurons to which it is connected have been completed;
and an optimization unit, which compares the difference between the output result of the output layer and the known label value of the classical data and, when the difference exceeds a preset range, optimizes by adjusting the controlled operations of the first operation unit and/or the evolution operation of the second operation unit until the difference is within the preset range.
Further, the controlled operation of the first operation unit is specifically as follows: apart from the soft quantum neurons of the input layer, every other soft quantum neuron is initialized to |0⟩; for the i-th soft quantum neuron of the l-th layer, its measurement result is recorded as b, where b is 0 or 1; if b = 0, the identity transformation I is applied to the next-layer soft quantum neuron connected to it; if b = 1, an arbitrary quantum operation K(θ) is applied; wherein θ is a parameter to be adjusted and is an object optimized by the optimization unit;
the evolution operation of the second operation unit is as follows: after all the controlled operations applied by the connected previous-layer soft quantum neurons have been exhausted, an arbitrary noisy rotation gate R(β) is applied to the soft quantum neuron; wherein β is a parameter to be adjusted and is an object optimized by the optimization unit.
Further, the optimization unit compares the difference between the output result of the output layer and the known label value of the classical data and performs optimization;
in the optimization, a gradient-based method is specifically adopted to adjust the parameters to be adjusted in the first operation unit and the second operation unit: the difference is measured by defining a loss function, the gradient of the loss function with respect to the parameters to be adjusted is calculated, and some or all of the parameters to be adjusted are adjusted in the direction opposite to the gradient.
Further, the number of soft quantum neurons of the input layer is determined according to the dimension of the input data; the number of soft quantum neurons in the output layer is designed, and the specific form of evolution from the input layer to the output layer is determined.
Further, a hidden layer is further arranged between an input layer and an output layer of the plurality of soft quantum neurons, and the number of the soft quantum neurons of the input layer is determined according to the dimension of input data; and designing the number of layers of the hidden layer, the number of soft quantum neurons in each layer and the number of soft quantum neurons in the output layer, and determining a specific form of evolution from the input layer, the hidden layer to the output layer.
Further, the quantum state input unit encodes classical data to obtain a qubit set carrying the information of the classical data, specifically as follows: angle encoding is adopted, in which each component of the classical data is used as the angle x of a rotation gate; the rotation gate R(x) is applied to |0⟩ to obtain the quantum state that encodes the corresponding component.
The invention also comprises a pattern recognition method based on the soft quantum neural network system of any one of the preceding claims, comprising the following steps:
(1) acquiring a data set to be classified;
(2) inputting the data set to be classified into a soft quantum neural network which is trained in advance;
if the data in the data set to be classified is classical data, encoding according to the dimensionality of the classical data to obtain a qubit set with corresponding information, and inputting the qubit set into a soft quantum neural network; if the data in the data set to be classified is the quantum bit set obtained through the preparation, the data are directly input into the soft quantum neural network;
(3) and acquiring an output result of the soft quantum neural network as a classification result of the input data to be classified.
Further, the soft quantum neural network is trained as follows:
1) preparing a training data set for a corresponding classification task, including input data and expected label values as outputs;
2) coding each component of training data in a training data set to a quantum bit, taking each quantum bit as the initialization state of a soft quantum neuron of a corresponding input layer, and measuring the quantum state of all the soft quantum neurons of the input layer under a calculation basis vector to obtain a measurement result of 0 or 1;
3) sequentially searching the measurement results of the soft quantum neurons of the previous layer connected with the soft quantum neurons from the first soft quantum neuron of the output layer, and sequentially applying respective controlled operations to the soft quantum neurons according to each measurement result;
4) after the controlled operation between all the soft quantum neurons connected with the soft quantum neurons from the upper layer is completed, applying bias evolution operation on the soft quantum neurons to evolve to a final state, and measuring the final state under the calculation basis vector to obtain 0 or 1 as a measurement result;
5) repeating the steps 2) -4) for multiple times to obtain probability distribution of the corresponding measurement result of each soft quantum neuron of the output layer, and mapping the probability distribution with an expected label value corresponding to the dimensionality of the input data to obtain the output result of the soft quantum neural network;
6) calculating the difference between the output result obtained in the step 5) and the expected tag value corresponding to the input data, if the difference is not greater than a set threshold value, terminating iteration, and storing the trained soft quantum neural network model and the optimized parameters thereof; if the difference is larger than the set threshold, adjusting all parameters in the controlled operation and the bias evolution operation, and turning to the step 7);
7) and repeating the steps 2) to 6) until the set upper limit of the number of times of repetition is reached or until the difference between the expected tag values corresponding to the output result and the input data is not greater than the set threshold value, terminating iteration, and storing the trained soft quantum neural network model and the optimized parameters thereof.
Further, if the soft quantum neural network includes a hidden layer, the steps 2) to 5) are:
s1, encoding each component of training data in the training data set to a quantum bit, taking each quantum bit as the initialization state of the soft quantum neuron of the corresponding input layer, and measuring the quantum state of all the soft quantum neurons of the input layer under the condition of calculating basis vector to obtain the measurement result of 0 or 1;
s2, sequentially searching the measurement results of the last layer of soft quantum neurons connected with the soft quantum neurons from the first soft quantum neuron of the first layer of hidden layer to the last soft quantum neuron of the last layer of hidden layer, and sequentially applying respective controlled operation to the soft quantum neurons according to each measurement result;
s3, after the controlled operation between the soft quantum neurons connected with the soft quantum neurons from the previous layer is completed, applying bias evolution operation to the soft quantum neurons to evolve to a final state, measuring the final state under the calculation basis vector to obtain 0 or 1 as a measurement result, and preparing for the controlled operation of the next layer;
s4 traversing each soft quantum neuron of the output layer, sequentially searching the measurement result of the last layer of soft quantum neuron connected with the soft quantum neuron from the first soft quantum neuron of the output layer, and sequentially applying respective controlled operation to the soft quantum neuron according to each measurement result;
s5, after the controlled operation between all the soft quantum neurons connected with the soft quantum neurons from the upper layer is completed, applying bias evolution operation to the soft quantum neurons to evolve to a final state, and measuring the final state under the calculation basis vector to obtain 0 or 1 as a measurement result;
s6 repeating the steps S1-S5 for many times to obtain the probability distribution of the measurement result corresponding to each soft quantum neuron of the output layer, and mapping the probability distribution with the expected label value corresponding to the dimensionality of the input data, namely the output result of the soft quantum neural network.
Further, the training of the soft quantum neural network further comprises verification of a quantum neural network training effect, specifically:
dividing a training data set into a non-intersected training set and a test set, predicting the soft quantum neural network obtained by training on the test set after the data of the training set is completely trained, counting the classification accuracy of the soft quantum neural network, and retraining the soft quantum neural network if the accuracy is lower than a set threshold.
The invention has the following beneficial effects:
(1) the invention is obviously different from the mainstream quantum neural network based on the quantum circuit:
the quantum neural network based on the quantum circuit does not have the characteristics of the neural network in form, namely, the basic neuron and any connection mode between the neurons are defined, so the quantum neural network is called as the quantum neural network, and the quantum neural network is characterized in that a model which has adjustable parameters and can fit a class of nonlinear functions is constructed by using the quantum circuit from an abstract level, and the point is consistent with the classical neural network. The soft quantum neural network of the invention takes soft quantum neurons as basic constituent units, and can define any connection mode between the soft quantum neurons similar to the classical neural network. The soft quantum convolutional neural network has the advantages that the soft quantum neural network has high extensibility, various connection modes of a classical neural network can be highly referenced in theoretical design, the structures such as a convolutional neural network and a cyclic neural network can be conveniently expanded, and the corresponding soft quantum convolutional neural network and the soft quantum cyclic neural network are designed.
(2) The soft quantum neural network does not introduce quantum entanglement; its quantum character does not come from entanglement. Since maintaining quantum entanglement between qubits for a long time is difficult to realize, the soft quantum neural network is more robust than mainstream quantum-circuit-based quantum neural network models.
(3) Compared with mainstream quantum-circuit-based quantum neural networks, in terms of hardware implementation the soft quantum neural network relies only on the construction of single-qubit gates: it only requires building a large number of structurally identical basic devices for realizing the soft quantum neural network and accounting for the connections between elements, so large-scale implementation is easy with existing technology. By contrast, when constructing a large and deep quantum circuit, it is very challenging to maintain the fidelity of the quantum gates in the circuit and to maintain coherence for a long time. This ease of hardware realization embodies the scalability of the soft quantum neural network in implementation.
(4) Compared with the mainstream quantum neural network based on the quantum circuit, the design mode of the soft quantum neural network has high fault tolerance on the quantum evolution and measurement containing noise.
Drawings
FIG. 1 is a schematic diagram of a quantum state input unit.
FIG. 2 is a schematic diagram of a soft quantum neuron designed according to the present invention.
Fig. 3 is a schematic diagram of a soft quantum neural network designed by the present invention.
Fig. 4 is a schematic diagram of data of a crescent decision boundary training set according to embodiment 1 of the present invention.
Fig. 5 is a schematic diagram of a loss function value of a class of data classification tasks according to embodiment 1 of the present invention as a function of the number of training rounds (Epoch).
Fig. 6 is a schematic diagram illustrating the prediction accuracy of a class of data classification tasks on a test set according to embodiment 1 of the present invention as a function of the number of training rounds (Epoch).
Fig. 7 is a schematic diagram of data of a circular decision boundary training set according to embodiment 2 of the present invention.
Fig. 8 is a diagram of the data pattern learned by a soft quantum perceptron trained on the circular decision boundary training set according to embodiment 2 of the present invention.
Detailed Description
The invention is further elucidated with reference to the drawings and the embodiments.
The present invention proposes a soft quantum neural network system implemented by one or more quantum processors, which is an example of a system implemented as a classical or quantum computer program on one or more classical computers or quantum computing devices in one or more locations. Soft quantum as referred to herein refers to a new quantum computing paradigm known as soft quantum computing. Compared with the traditional quantum computation under a reversible and closed system, the soft quantum computation aims at performing non-classical computation by utilizing dissipation and decoherence caused by the environment under a real-world physical system, has tolerance on noisy quantum evolution and measurement, and is a computation paradigm aiming at a specific computation task under an open quantum system.
The soft quantum neural network system comprises an input layer and an output layer, each layer comprising a plurality of soft quantum neurons; each soft quantum neuron is a physical qubit, and its quantum state is a pure state or a mixed state. The input layer is further provided with a quantum state input unit, which prepares a qubit set carrying the corresponding information by encoding classical data in a certain manner, or receives a prepared qubit set as input; the qubit set is the initialization state of the soft quantum neurons of the input layer. Each soft quantum neuron is provided with a quantum state measurement unit, which measures the quantum state of each soft quantum neuron of every layer before the output layer in the computational basis to obtain a measurement result of 0 or 1; each soft quantum neuron of the output layer is also measured, and the measurement result is a single outcome 0 or 1 together with the probability distribution over 0 and 1 obtained after a large number of measurements, which is mapped to the label value of the known classical data to give the output result of the pattern recognition task. A first operation unit, which directionally performs controlled operations between connected soft quantum neurons according to the measurement results, is arranged between the soft quantum neurons. Each soft quantum neuron connected to the previous layer is provided with a second operation unit, which performs an evolution operation on it after the controlled operations between it and all the previous-layer soft quantum neurons to which it is connected have been completed. The optimization unit updates the parameters to be adjusted in the first operation unit and the second operation unit: it compares the difference between the output result of the output layer and the label value of the known classical data and, when the difference exceeds a preset range, optimizes by adjusting the controlled operations of the first operation unit and/or the evolution operation of the second operation unit until the difference is within the preset range. The parameters to be adjusted in the first operation unit and the second operation unit are randomly initialized according to a Gaussian distribution.
in the invention, the first operation unit is controlled operation, in particular controlled Kraus operation; the second operation unit is bias evolution;
the number of soft quantum neurons of the input layer is determined according to the dimensionality of input data, and an expected tag value corresponding to the input data is known; determining the number of soft quantum neurons in an output layer according to the dimension of the label value, and designing a specific form of controlled operation and bias evolution;
in other embodiments, a hidden layer is further included between the input layer and the output layer of the plurality of soft quantum neurons of the present invention, the number of soft quantum neurons of the input layer is determined according to the dimension of the input data, and the expected tag value corresponding to the input data is known; determining the number of layers of the hidden layer, the number of soft quantum neurons in each layer and the number of soft quantum neurons in the output layer according to the dimension of the label value, and designing a specific form of controlled operation and bias evolution;
the quantum state input unit is used for preparing a quantum bit set with corresponding information by encoding classical data with a certain mode or receiving the prepared quantum bit set as input; the quantum bit set is used as an initialization state of each soft quantum neuron positioned in an input layer to wait for subsequent operation; since the subsequent units with quantum properties, such as the quantum state measuring unit, the first operation unit, the second operation unit, and the like, can only operate on the qubits and thus process the information stored therein, the presence of the quantum state input unit is necessary, and its operation diagram is shown in fig. 1.
Preparing a qubit set carrying the corresponding information by encoding classical data in a certain manner proceeds as follows:
in quantum computing, a qubit is the most fundamental operation object of quantum computing, similar to a bit of 0 or 1 in a classical computer. Qubits are generally implemented by a quantum two-level system, having two eigenstates
Figure 2385DEST_PATH_IMAGE001
And
Figure 100002_DEST_PATH_IMAGE014
as the basis for the calculation. One feature that distinguishes quantum computation from classical computation is that a single-quantum bit can be any linear superposition of the two eigenstates mentioned above, and the quantum state of the single-quantum bit can be used
Figure DEST_PATH_IMAGE015
Represents:
Figure 100002_DEST_PATH_IMAGE016
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE017
and
Figure 100002_DEST_PATH_IMAGE018
two complex numbers, called the probability amplitude of the quantum state, satisfy the condition
Figure DEST_PATH_IMAGE019
Figure 473294DEST_PATH_IMAGE001
Available two-dimensional vector
Figure 100002_DEST_PATH_IMAGE020
It is shown that,
Figure 241661DEST_PATH_IMAGE014
available two-dimensional vector
Figure DEST_PATH_IMAGE021
Thus, the single-bit quantum state can be expressed as:
Figure 100002_DEST_PATH_IMAGE022
in addition, the quantum state can also be represented by a density matrix
Figure DEST_PATH_IMAGE023
(pure state) to show:
Figure 100002_DEST_PATH_IMAGE024
the quantum state of the soft quantum neuron in the invention is more general, can be a mixed state, and is expressed by a density matrix mode as follows:
Figure DEST_PATH_IMAGE025
wherein the content of the first and second substances,
Figure 100002_DEST_PATH_IMAGE026
the quantum states used for spreading are orthogonal to each other.
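As a concrete illustration of the pure-state and mixed-state density matrices just described, the following NumPy sketch builds ρ = |ψ⟩⟨ψ| for a superposition state and a mixed state over the orthogonal basis states; the example amplitudes and probabilities are arbitrary choices made for illustration, not values from the patent.

```python
import numpy as np

# Computational basis states |0> and |1> as column vectors.
ket0 = np.array([[1.0], [0.0]], dtype=complex)
ket1 = np.array([[0.0], [1.0]], dtype=complex)

# Pure state |psi> = alpha|0> + beta|1> with |alpha|^2 + |beta|^2 = 1
# (example amplitudes chosen for illustration).
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = alpha * ket0 + beta * ket1
rho_pure = psi @ psi.conj().T                # density matrix |psi><psi|

# Mixed state rho = sum_i p_i |psi_i><psi_i| over orthogonal states.
p = [0.7, 0.3]
rho_mixed = p[0] * (ket0 @ ket0.conj().T) + p[1] * (ket1 @ ket1.conj().T)

# A pure state satisfies Tr(rho^2) = 1, a mixed state Tr(rho^2) < 1.
print(np.trace(rho_pure @ rho_pure).real)    # ~1.0
print(np.trace(rho_mixed @ rho_mixed).real)  # 0.58
```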
Preparing a qubit set carrying the information of the classical data by encoding is specifically as follows: the classical data is encoded by angle encoding, with each component of the classical data used as the angle x of a rotation gate; the rotation gate R(x) is applied to |0⟩ to obtain the quantum state encoding the corresponding component. For example, for an n-dimensional real vector x = (x_1, x_2, …, x_n), one obtains the encoded states R(x_i)|0⟩ = |ψ(x_i)⟩, where x_i is a component of the real vector x and |ψ(x_i)⟩ denotes the quantum state after x_i is encoded, i.e. the initial state of the i-th soft quantum neuron of the input layer.
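A minimal sketch of this angle-encoding step, assuming (as one common but not patent-mandated choice) a Y-rotation R_y(x) applied to |0⟩ for each component x of the classical vector:

```python
import numpy as np

def ry(x):
    """Y-rotation gate R_y(x) (an assumed choice of rotation axis)."""
    return np.array([[np.cos(x / 2), -np.sin(x / 2)],
                     [np.sin(x / 2),  np.cos(x / 2)]], dtype=complex)

def angle_encode(vec):
    """Encode each component x_i of a real vector as R(x_i)|0>,
    giving the initial states of the input-layer soft quantum neurons."""
    ket0 = np.array([1.0, 0.0], dtype=complex)
    return [ry(x) @ ket0 for x in vec]

states = angle_encode([0.4, 1.2])    # hypothetical 2-dimensional input
for s in states:
    print(s, np.linalg.norm(s))      # each encoded state stays normalized
```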
Soft quantum neurons are the basic building blocks of the quantum neural network and are the objects acted on by the first operation unit, the second operation unit and the quantum state measurement unit; a schematic of their structure is shown in Fig. 2, and a plurality of soft quantum neurons can be assembled into a soft quantum neural network, whose structure is shown schematically in Fig. 3.
A soft quantum neuron must first be initialized; after initialization, it sequentially receives the controlled operations applied via the first operation units by all the previous-layer soft quantum neurons connected to it, and then evolves to a final state through the bias evolution of the second operation unit. Before the output layer, the quantum state measurement unit measures it to obtain 0 or 1 as the measurement result, and this measurement result determines the controlled operations on the next-layer soft quantum neurons connected to it.
the first operation unit directionally executes controlled operation between the connected soft quantum neurons, and specifically comprises the following steps:
for all levels in the whole soft quantum neural network system, for
Figure 130649DEST_PATH_IMAGE002
First of a layer
Figure 789033DEST_PATH_IMAGE003
The soft quantum neuron retrieves the measurement result and records the result as
Figure 109156DEST_PATH_IMAGE004
Figure 211104DEST_PATH_IMAGE004
Equal to 0 or 1, if
Figure 875566DEST_PATH_IMAGE005
Then, unit transformation is performed to the next layer of soft quantum neurons connected with the soft quantum neurons
Figure 681848DEST_PATH_IMAGE006
If, if
Figure 263002DEST_PATH_IMAGE007
Then apply arbitrary quantum operations
Figure 253960DEST_PATH_IMAGE008
The next layer of connected soft quantum neurons are different, then
Figure 514040DEST_PATH_IMAGE009
Also different; wherein the content of the first and second substances,
Figure 264959DEST_PATH_IMAGE009
the parameter to be adjusted is an object to be optimized by the optimization unit;
in addition to the soft quantum neurons in the input layer, the soft quantum neurons need to be initialized
Figure 182843DEST_PATH_IMAGE001
Expressed as a density matrix
Figure DEST_PATH_IMAGE033
(ii) a In the manner described above, sequentially according to the soft quantum neurons from the upper layer to which it has a connection
Figure DEST_PATH_IMAGE034
The controlled operation is carried out and the operation is controlled,
Figure DEST_PATH_IMAGE035
respectively representing measurements from a soft quantum neuron with which the previous layer has a connection,
Figure DEST_PATH_IMAGE036
representing the number of soft quantum neurons connected with the upper layer; respectively operate with corresponding quanta
Figure DEST_PATH_IMAGE037
Act on
Figure DEST_PATH_IMAGE038
If the affected soft quantum neuron is jth, then the entire controlled operation can be expressed as:
Figure DEST_PATH_IMAGE039
wherein, the operator
Figure DEST_PATH_IMAGE040
The effect of which is to project a density matrix onto
Figure 33118DEST_PATH_IMAGE001
State or
Figure 669242DEST_PATH_IMAGE014
State;
Figure DEST_PATH_IMAGE041
represent this
Figure DEST_PATH_IMAGE042
A controlled operation
Figure DEST_PATH_IMAGE043
The function of (1) is to have time sequence, and the sequence can not be randomly exchanged;
Figure DEST_PATH_IMAGE044
representing the quantum state of the ith soft quantum neuron connected with the soft quantum neuron in the previous layer before being measured;
Figure DEST_PATH_IMAGE045
means passing through
Figure 427114DEST_PATH_IMAGE042
After controlled operation, obtaining the state of the jth soft quantum neuron;
subsequently, the evolution operation of the second operation unit is executed on the j soft quantum neuron, and the effect of the evolution operation comprises a single-bit quantum gate with the parameter to be optimized, namely a certain soft quantityAfter the sub-neuron is acted upon by all controlled operations of the previous layer of soft quantum neurons to which it has connectivity, the soft quantum neurons are re-gated with arbitrary, noise-containing gates
Figure 717281DEST_PATH_IMAGE010
To obtain a final state
Figure DEST_PATH_IMAGE046
Wherein
Figure 282998DEST_PATH_IMAGE011
The parameter to be adjusted is also an object to be optimized by the optimization unit;
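The measurement-conditioned controlled operations and the subsequent noisy bias evolution of a single soft quantum neuron can be simulated on density matrices as sketched below; the concrete form of K(θ) (a Y-rotation), the depolarizing noise model and all names are illustrative assumptions, since the patent only requires an arbitrary quantum operation K(θ) and an arbitrary noisy rotation gate R(β).

```python
import numpy as np

KET0 = np.array([[1.0], [0.0]], dtype=complex)

def ry(theta):
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)

def depolarize(rho, p):
    """Simple noise model mixing rho with the maximally mixed state."""
    return (1 - p) * rho + p * np.eye(2) / 2

def neuron_forward(prev_bits, thetas, beta, noise=0.05):
    """State of one soft quantum neuron after the controlled operations
    conditioned on the previous-layer measurement results and the
    noisy bias evolution R(beta)."""
    rho = KET0 @ KET0.conj().T                   # initialize to |0><0|
    for b, theta in zip(prev_bits, thetas):
        if b == 1:                               # b = 0 -> identity
            K = ry(theta)                        # assumed form of K(theta)
            rho = K @ rho @ K.conj().T
    R = ry(beta)                                 # bias evolution
    rho = depolarize(R @ rho @ R.conj().T, noise)
    return rho

rho_out = neuron_forward(prev_bits=[1, 0, 1], thetas=[0.3, 0.8, 1.1], beta=0.5)
print(np.trace(rho_out).real)     # trace is preserved (= 1)
print(rho_out[1, 1].real)         # probability of measuring 1
```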
the quantum state measuring unit is used for obtaining a final state by the action object of the soft quantum neuron and the controlled operation action and the bias evolution action of all the soft quantum neurons connected with the last layer, and executing quantum measuring operation on the final state under the calculation basis vector to obtain 0 or 1 as a measuring result;
resetting the system for multiple times for the soft quantum neurons located in the output layer, and repeating all operations from the input layer to the output layer to obtain the probability distribution p of the measurement result of each soft quantum neuron in the output layer; then further mapping the data to a label output value of the input data as a predicted value to give an output result of the mode identification task;
if the soft quantum neuron in the input layer is subjected to quantum measurement under the basis vector of calculation, the measurement result is b, such as Z gate measurement (in this case, the above-mentioned soft quantum neuron is measured by the Z gate)
Figure 376725DEST_PATH_IMAGE001
And
Figure 735025DEST_PATH_IMAGE014
i.e. the two eigenstates of the Z measurement), the measurement result is 0 or 1,0 corresponding to
Figure 738753DEST_PATH_IMAGE001
State, 1 corresponds to
Figure 806198DEST_PATH_IMAGE014
State.
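A short sketch of how such a computational-basis measurement can be sampled and, for an output-layer soft quantum neuron, turned into the probability distribution over 0 and 1 and a label; the number of repetitions and the example state are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(rho, shots=1000):
    """Sample computational-basis outcomes 0/1 from a density matrix and
    estimate the probability distribution over 0 and 1."""
    p1 = float(np.real(rho[1, 1]))      # probability of outcome 1
    samples = rng.random(shots) < p1
    p_hat = samples.mean()
    return {0: 1.0 - p_hat, 1: p_hat}

# Example: a state with a 30 % chance of being measured as 1.
rho = np.array([[0.7, 0.0], [0.0, 0.3]], dtype=complex)
dist = measure(rho)
label = 1 if dist[1] > 0.5 else 0       # map the distribution to a label value
print(dist, label)
```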
After all the soft quantum neurons of the input layer have been measured, a vector B = (b_1, b_2, …, b_n) is obtained, where n denotes the number of soft quantum neurons of the input layer; the controlled operations are then performed on the next-layer soft quantum neurons connected to them. After the soft quantum neurons of the hidden layers and the output layer have been acted on by the first operation unit and the second operation unit, they are measured and the resulting vector B is recorded. The optimization unit compares the output result, i.e. the label output value, given by the quantum state measurement units of the output-layer soft quantum neurons with the expected label value corresponding to the input data, and updates the parameters to be adjusted in the first operation unit and the second operation unit. The comparison uses a loss function; the smaller the loss function value, the smaller the difference between the output result and the expected label value.
in the invention, the loss function can be specifically selected from a cross entropy function, a square loss function and the like.
The method for adjusting the parameters is as follows: a gradient-based approach may be used, in which the difference is measured by defining a loss function, the gradient of the loss function with respect to the adjustable parameters is calculated, and some or all of the adjustable parameters are adjusted appropriately in the direction opposite to the gradient, i.e. the value of the loss function is decreased so that the output result comes closer to the desired output.
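This gradient-based adjustment can be sketched as follows; since the patent does not fix how the gradient is obtained, the example uses a simple central finite-difference estimate on an abstract loss function, with an assumed learning rate and a toy loss.

```python
import numpy as np

def finite_diff_grad(loss_fn, params, eps=1e-3):
    """Estimate d(loss)/d(params) by central finite differences."""
    grad = np.zeros_like(params)
    for k in range(len(params)):
        shift = np.zeros_like(params)
        shift[k] = eps
        grad[k] = (loss_fn(params + shift) - loss_fn(params - shift)) / (2 * eps)
    return grad

def gradient_step(loss_fn, params, lr=0.1):
    """Move the adjustable parameters opposite to the gradient."""
    return params - lr * finite_diff_grad(loss_fn, params)

# Toy example: a squared loss with minimum at (1, -2).
loss = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
params = np.array([0.0, 0.0])
for _ in range(50):
    params = gradient_step(loss, params)
print(params)    # approaches [1, -2]
```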
In the soft quantum neural network provided by the invention, the quantum state input unit, the quantum state measurement unit, the first operation unit, the second operation unit and the soft quantum neuron can be formed by simple quantum computing hardware, and the optimization unit can be realized by a classical computer-assisted mode.
Based on the same inventive concept, the invention also provides a pattern recognition method based on the soft quantum neural network system, which comprises the following steps:
(1) acquiring a data set to be classified;
(2) inputting the data set to be classified into a soft quantum neural network which is trained in advance;
if the data in the data set to be classified is classical data, encoding according to the dimensionality of the classical data to obtain a qubit set with corresponding information, and inputting the qubit set into a soft quantum neural network; if the data in the data set to be classified is the quantum bit set obtained through the preparation, the data are directly input into the soft quantum neural network;
(3) and acquiring an output result of the soft quantum neural network as a classification result of the input data to be classified.
First, the case where the soft quantum neural network has no hidden layer is described; the training of the soft quantum neural network comprises the following steps:
1) preparing a training data set for a corresponding classification task, including input data and expected label values as outputs; the input data may be classical data represented by a matrix or quantum data represented by quantum states;
establishing a soft quantum neural network (excluding a hidden layer);
determining the number of soft quantum neurons of an input layer according to the dimensionality of input data, wherein an expected tag value corresponding to the input data is known; determining the number of soft quantum neurons in an output layer according to the dimension of the label value, and designing a specific form of controlled operation and bias evolution;
For a training data set with a total of N samples, the data used for each training pass can be allocated in one of the following two ways:
a. a different batch of n samples is extracted each time as the single-pass training data; after N/n extractions the whole training data set has been traversed, yielding the corresponding number of single-pass training batches, which can be selected sequentially or at random for training;
b. all data of the training data set are used at once as the single-pass training data;
2) receiving training data, encoding each component of the training data in a training data set to a quantum bit, taking each quantum bit as the initialization state of the soft quantum neuron of the corresponding input layer, and measuring the quantum state of all the soft quantum neurons of the input layer under the condition of calculating basis vector to obtain a measurement result of 0 or 1;
3) sequentially searching the measurement results of the soft quantum neurons of the previous layer connected with the soft quantum neurons from the first soft quantum neuron of the output layer, and sequentially applying respective controlled operations to the soft quantum neurons according to each measurement result;
4) after the controlled operation between all the soft quantum neurons connected with the soft quantum neurons from the upper layer is completed, applying bias evolution operation on the soft quantum neurons to evolve to a final state, and measuring the final state under the calculation basis vector to obtain 0 or 1 as a measurement result;
5) repeating the steps 2) -4) for multiple times to obtain the probability distribution of the corresponding measurement result of each soft quantum neuron of the output layer, and mapping the probability distribution with the expected tag value corresponding to the dimensionality of the input data to obtain the output result of the soft quantum neural network;
6) calculating the difference between the output result obtained in the step 5) and the expected tag value corresponding to the input data, if the difference is not greater than a set threshold value, terminating iteration, and storing the trained soft quantum neural network model and the optimized parameters thereof; if the difference is larger than the set threshold, adjusting all parameters in the controlled operation and the bias evolution operation, and turning to the step 7);
7) and repeating the steps 2) to 6) until the set upper limit of the number of times of repetition is reached or until the difference between the expected tag values corresponding to the output result and the input data is not greater than the set threshold value, terminating iteration, and storing the trained soft quantum neural network model and the optimized parameters thereof.
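To make steps 1) to 7) concrete, the following self-contained sketch trains a minimal soft quantum perceptron (input layer plus a single output neuron) on a tiny toy data set. All modelling choices (Y-rotations for the encoding, the controlled operation K(θ) and the bias evolution R(β), exact enumeration of the input-layer measurement outcomes instead of repeated sampling, finite-difference gradients, the toy data and the hyperparameters) are illustrative assumptions rather than the patent's prescribed implementation.

```python
import numpy as np
from itertools import product

def ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]])

def p_output_one(x, params):
    """Expected probability that the single output-layer soft quantum neuron
    is measured as 1, for a 2-dimensional input x.
    params = [theta_1, theta_2, beta] (controlled rotations + bias evolution)."""
    p_in = [np.sin(xi / 2) ** 2 for xi in x]     # P(1) of each encoded input neuron
    total = 0.0
    for bits in product([0, 1], repeat=len(x)):  # enumerate measurement outcomes
        w = np.prod([p if b else 1 - p for b, p in zip(bits, p_in)])
        rho = np.array([[1.0, 0.0], [0.0, 0.0]])           # output neuron in |0><0|
        for b, theta in zip(bits, params[:-1]):
            if b == 1:                                      # controlled operation K(theta)
                rho = ry(theta) @ rho @ ry(theta).T
        rho = ry(params[-1]) @ rho @ ry(params[-1]).T       # bias evolution R(beta)
        total += w * rho[1, 1]
    return total

def loss(params, X, y):
    preds = np.array([p_output_one(x, params) for x in X])
    return np.mean((preds - y) ** 2)                        # squared loss

# Tiny toy data set (assumed for illustration): label 1 iff x2 > x1.
X = np.array([[0.2, 1.0], [1.2, 0.3], [0.1, 1.4], [1.0, 0.2]])
y = np.array([1, 0, 1, 0])

params, lr, eps = np.array([0.1, 0.1, 0.1]), 0.5, 1e-3
for epoch in range(100):                 # forward pass, loss, parameter update
    grad = np.zeros_like(params)
    for k in range(len(params)):
        d = np.zeros_like(params)
        d[k] = eps
        grad[k] = (loss(params + d, X, y) - loss(params - d, X, y)) / (2 * eps)
    params -= lr * grad
print(loss(params, X, y), params)        # the loss should decrease over training
```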
The case where the soft quantum neural network includes hidden layers is described below; the training of the soft quantum neural network comprises the following steps:
1) preparing a training data set for a corresponding classification task, including input data and expected label values as outputs; the input data may be classical data represented by a matrix or quantum data represented by quantum states;
establishing a soft quantum neural network (including a hidden layer);
determining the number of soft quantum neurons of an input layer according to the dimensionality of input data, wherein an expected tag value corresponding to the input data is known; determining the number of layers of the hidden layer, the number of soft quantum neurons in each layer and the number of soft quantum neurons in the output layer according to the dimension of the label value, and designing a specific form of controlled operation and bias evolution;
For a training data set with a total of N samples, the data used for each training pass can be allocated in one of the following two ways:
a. a different batch of n samples is extracted each time as the single-pass training data; after N/n extractions the whole training data set has been traversed, yielding the corresponding number of single-pass training batches, which can be selected sequentially or at random for training;
b. all data of the training data set are used at once as the single-pass training data;
2) receiving training data, encoding each component of the training data in a training data set to a quantum bit, taking each quantum bit as the initialization state of the soft quantum neuron of the corresponding input layer, and measuring the quantum state of all the soft quantum neurons of the input layer under the condition of calculating basis vector to obtain a measurement result of 0 or 1;
3) traversing from the first soft quantum neuron of the first hidden layer to the last soft quantum neuron of the last hidden layer, sequentially retrieving the measurement results of the previous-layer soft quantum neurons connected to each soft quantum neuron, and sequentially applying the respective controlled operations to it according to each measurement result; for example, if the number of connections is D, D controlled operations are applied in sequence; after the controlled operations from all the connected previous-layer soft quantum neurons have been completed, a bias evolution operation is applied to the soft quantum neuron to evolve it to its final state, which is measured in the computational basis to obtain 0 or 1 as the measurement result, in preparation for the controlled operations of the next layer;
4) traversing each soft quantum neuron of the output layer, sequentially retrieving, starting from the first soft quantum neuron of the output layer, the measurement results of the previous-layer soft quantum neurons connected to it, and sequentially applying the respective controlled operations to it according to each measurement result; after the controlled operations from all the connected previous-layer soft quantum neurons have been completed, a bias evolution operation is applied to the soft quantum neuron to evolve it to its final state, which is measured in the computational basis to obtain 0 or 1 as the measurement result;
5) repeating steps 2) to 4) multiple times to obtain the probability distribution of the measurement results of each soft quantum neuron of the output layer, where the probability that the m-th output-layer soft quantum neuron is measured as 0 is p_m; this distribution is mapped, according to the dimensionality of the input data, to the expected label value to obtain the output result of the soft quantum neural network.
6) Calculating the difference between the output result obtained in the step 5) and the expected tag value corresponding to the input data, if the difference is not greater than a set threshold value, terminating iteration, and storing the trained soft quantum neural network model and the optimized parameters thereof; if the difference is larger than the set threshold, adjusting all parameters in the controlled operation and the bias evolution operation, and turning to the step 7);
7) and repeating the steps 2) to 6) until the set upper limit of the number of times of repetition is reached or until the difference between the expected tag values corresponding to the output result and the input data is not greater than the set threshold value, terminating iteration, and storing the trained soft quantum neural network model and the optimized parameters thereof.
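To make the layer-by-layer flow of steps 2) to 4) concrete, the sketch below propagates a single input through an assumed 2-3-1 topology by sampling the measurement bit of every neuron and conditioning the next layer's controlled operations on those bits. The rotation-based operations, the fully connected topology and the random parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_one(angle):
    """P(measure 1) for R_y(angle)|0>, used both for angle encoding
    and for the rotation-based neuron operations assumed here."""
    return np.sin(angle / 2) ** 2

def layer_forward(prev_bits, thetas, betas):
    """Measurement bits of one layer, each neuron conditioned on the
    previous layer's bits (controlled rotations) plus a bias rotation."""
    bits = []
    for theta_row, beta in zip(thetas, betas):
        angle = beta + sum(t for b, t in zip(prev_bits, theta_row) if b == 1)
        bits.append(int(rng.random() < prob_one(angle)))
    return bits

x = [0.4, 1.1]                                                # 2-dimensional input sample
input_bits = [int(rng.random() < prob_one(xi)) for xi in x]   # step 2)

# Assumed topology: 2 input -> 3 hidden -> 1 output soft quantum neurons.
theta_h = rng.uniform(0, np.pi, size=(3, 2))   # controlled-operation parameters
beta_h = rng.uniform(0, np.pi, size=3)         # bias-evolution parameters
theta_o = rng.uniform(0, np.pi, size=(1, 3))
beta_o = rng.uniform(0, np.pi, size=1)

hidden_bits = layer_forward(input_bits, theta_h, beta_h)      # step 3)
output_bits = layer_forward(hidden_bits, theta_o, beta_o)     # step 4)
print(input_bits, hidden_bits, output_bits)
```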
The training of the soft quantum neural network further includes verification of the training effect, implemented as follows: the training data set is divided into a disjoint training set and test set (for example in the ratio 8:2); after the training-set data have been fully trained on, the soft quantum neural network obtained from each round of training is used to predict on the test set, and its classification accuracy (the proportion of correctly predicted samples in the total number of samples) is computed; if the accuracy is lower than a set threshold, the soft quantum neural network is retrained.
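This verification step can be sketched with scikit-learn's train_test_split; the placeholder data, the stand-in predictor (which in practice would be the trained soft quantum neural network), the 8:2 split and the 0.9 accuracy threshold are assumptions for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))               # placeholder data set
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # placeholder labels

# Disjoint training and test sets in an 8:2 ratio, as in the verification step.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

def predict(x):
    # Stand-in predictor: in practice this is the trained soft quantum
    # neural network's output mapped to a label value.
    return int(x[0] + x[1] > 0)

accuracy = np.mean([predict(x) == t for x, t in zip(X_test, y_test)])
print(f"test accuracy: {accuracy:.2f}")

if accuracy < 0.9:                          # example accuracy threshold
    print("accuracy below threshold -> retrain the soft quantum neural network")
```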
The invention provides two embodiments of the pattern recognition method based on the soft quantum neural network, which comprises the following steps:
example 1:
crescent data is a classical binary data set used for comparing different classical machine learning algorithm performances by a scinit-left official network, the classification decision boundary is named after being similar to a crescent, a task is a point which is correctly classified and respectively positioned at two sides of the decision boundary, and experimental data is generated by using a make _ mos function in a skearn. The experimental data set comprises 200 training samples and 100 testing samples, the dimensionality of input data is 2-dimensional, the label value is 1 or 0, the label above the crescent is 0, and the label below the crescent is 1; a test set data diagram is shown in fig. 4.
The input layer of the constructed quantum neural network comprises 10 soft quantum neurons (each pair of soft quantum neurons encodes the two dimensions of the input data respectively, realizing 5-fold repeated encoding). The two hidden layers comprise 4 and 2 soft quantum neurons respectively, and the output layer is a single soft quantum neuron. An output result smaller than 0.5 is recorded as label 0, and one greater than 0.5 as label 1. The optimization unit adopts a squared loss function as the loss function, and the Adam optimization algorithm is used to train and adjust the parameters in the controlled Kraus operation unit and the bias evolution unit.
Each time 10 data points are extracted from the training set for training, the total number of training rounds is 20. The graph of the change of the loss function value in the training process is shown in fig. 5, and the graph of the change of the test accuracy value in the training process is shown in fig. 6. As can be seen from fig. 6, the test accuracy rapidly increases in the previous 10 iterative trainings, and finally reaches the recognition accuracy of 100%.
Example 2:
in addition, the method mentioned in the literature (Physical Review A, 2018, 98(3): 032309) is used to generate a binary data set with non-linear circular decision boundaries. As shown in fig. 7, data of a circular area distributed at the center of the picture may be classified into one type, denoted as a label 0, and data outside the circular area distributed at the center of the picture may be classified into another type, denoted as a label 1. The experimental data set contains 200 training samples and 100 testing samples, the dimension of the input data is 2-dimensional, and the label value is 1 or 0.
What is built is a soft quantum perceptron model, i.e. a model with only an input layer and an output layer, without a hidden layer. The input layer of the soft quantum perceptron comprises 4 soft quantum neurons (two dimensions of input data are respectively subjected to 2 times of repeated coding), and the output layer is one soft quantum neuron. When the output result is less than 0.5, it is marked as label 0, and when it is greater than 0.5, it is marked as label 1. And the optimization unit adopts a square loss function as a loss function, and an Adam optimization algorithm is used for training and adjusting parameters in the controlled Kraus operation unit and the offset evolution unit.
Each time 10 data points are extracted from the training set for training, and the total number of training rounds is 20. The prediction heat map of the finally trained soft quantum perceptron is shown in Fig. 8, in which a clear circular decision boundary can be seen. This demonstrates the nonlinear classification capability of the soft quantum perceptron model; it is worth noting that the traditional perceptron model can only handle linear classification tasks.
The quantum neural network of the invention takes soft quantum neurons as basic constituent units, is similar to a classical neural network, and can define any connection mode among the soft quantum neurons. The method has the advantages that the method has high extensibility, various connection modes of a classical neural network can be highly referenced in theoretical design, the method can be conveniently expanded to structures such as a convolutional neural network and a cyclic neural network, and a corresponding soft quantum convolutional neural network and a soft quantum cyclic neural network are designed.
Although the preferred embodiments of the present invention have been described in detail, the present invention is not limited to the details of the foregoing embodiments, and various equivalent changes (such as number, shape, position, etc.) may be made to the technical solution of the present invention within the technical spirit of the present invention, and these equivalent changes are all within the protection scope of the present invention.

Claims (10)

1. A soft quantum neural network system implemented by one or more quantum processors, the system comprising:
the input layer and the output layer comprise a plurality of soft quantum neurons, wherein each soft quantum neuron is a physical quantum bit, and the quantum state of each soft quantum neuron is in a pure state or a mixed state;
the quantum state input unit is used for encoding and preparing classical data to obtain a quantum bit set with information of the classical data or receiving the prepared quantum bit set as input; the quantum bit set is the initialization state of each soft quantum neuron of the input layer;
the quantum state measuring unit is used for measuring the quantum state of each soft quantum neuron in each layer before the output layer in the computational basis to obtain a measurement result 0 or 1; measuring each soft quantum neuron positioned in the output layer, wherein the measurement result is a single measurement result 0 or 1 and the probability distribution over 0 and 1 obtained after a large number of measurements, mapping the measurement result to a label value of known classical data, and giving an output result of a pattern recognition task;
a first operation unit for directionally performing a controlled operation between the soft quantum neurons having connections according to the measurement result;
the second operation unit is used for executing evolution operation on the soft quantum neurons connected with the previous layer after controlled operation between the soft quantum neurons and all the soft quantum neurons connected with the previous layer from the previous layer is completed;
and the optimization unit is used for comparing the difference between the output result of the output layer and the known label value of the classical data, and optimizing by adjusting the controlled operation of the first operation unit and/or the evolution operation of the second operation unit when the difference exceeds a preset range until the difference is within the preset range.
2. The soft quantum neural network system of claim 1, wherein the controlled operation of the first operation unit is specifically: except for the soft quantum neurons of the input layer, every soft quantum neuron is initialized to the state |0⟩; for the j-th soft quantum neuron of the i-th layer, its measurement result is recorded as m, where m is 0 or 1; if m = 0, the identity operator I is applied to the next-layer soft quantum neurons connected with it; if m = 1, an arbitrary quantum operation U(θ) is applied, where θ is the parameter to be adjusted and is an object to be optimized by the optimization unit;
the evolution operation of the second operation unit is specifically: after all controlled operations applied by the connected soft quantum neurons of the previous layer have been completed, an arbitrary noise-containing rotation gate R(β) is applied to the soft quantum neuron, where β is the parameter to be adjusted and is an object to be optimized by the optimization unit.
3. The soft quantum neural network system of claim 2, wherein the optimization unit compares the output result of the output layer with the known label value of the classical data and optimizes the difference between them;
in the optimization, a gradient-based method is specifically adopted to adjust the parameters to be adjusted in the first operation unit and the second operation unit: the difference is measured by defining a loss function, the gradient of the loss function with respect to the parameters to be adjusted is calculated, and some or all of the parameters to be adjusted are adjusted in the direction opposite to the gradient.
4. The soft quantum neural network system of claim 1, wherein the number of soft quantum neurons of the input layer is determined according to the dimensionality of the input data, and the number of soft quantum neurons of the output layer is chosen by design, thereby determining the specific form of the evolution from the input layer to the output layer.
5. The soft quantum neural network system of claim 1, wherein one or more hidden layers of soft quantum neurons are further arranged between the input layer and the output layer; the number of soft quantum neurons of the input layer is determined according to the dimensionality of the input data; and the number of hidden layers, the number of soft quantum neurons in each hidden layer and the number of soft quantum neurons in the output layer are chosen by design, thereby determining the specific form of the evolution from the input layer through the hidden layers to the output layer.
6. The soft quantum neural network system of claim 1, wherein the quantum state input unit encodes the classical data to obtain a qubit set carrying the information of the classical data, specifically: using angle encoding, each component of the classical data is taken as the rotation angle α of a rotation gate, and the rotation gate R(α) is applied to the state |0⟩ to obtain the quantum state encoding the corresponding component.
7. A pattern recognition method based on the soft quantum neural network system of any one of claims 1 to 6, characterized by comprising the steps of:
(1) acquiring a data set to be classified;
(2) inputting the data set to be classified into a soft quantum neural network which is trained in advance;
if the data in the data set to be classified are classical data, they are encoded according to their dimensionality to obtain a qubit set carrying the corresponding information, and the qubit set is input into the soft quantum neural network; if the data in the data set to be classified are already prepared qubit sets, they are input into the soft quantum neural network directly;
(3) acquiring the output result of the soft quantum neural network as the classification result of the input data to be classified.
8. The pattern recognition method according to claim 7, characterized in that: the soft quantum neural network is trained as follows:
1) preparing a training data set for a corresponding classification task, including input data and expected label values as outputs;
2) encoding each component of a training datum from the training data set onto a qubit, taking each qubit as the initialization state of the corresponding soft quantum neuron of the input layer, and measuring the quantum state of all soft quantum neurons of the input layer in the computational basis to obtain a measurement result of 0 or 1;
3) starting from the first soft quantum neuron of the output layer, sequentially looking up the measurement results of the previous-layer soft quantum neurons connected with it, and sequentially applying the respective controlled operations to it according to each measurement result;
4) after all controlled operations from the connected soft quantum neurons of the previous layer have been completed, applying the bias evolution operation to the soft quantum neuron to evolve it to a final state, and measuring the final state in the computational basis to obtain 0 or 1 as the measurement result;
5) repeating steps 2)–4) multiple times to obtain the probability distribution of the measurement results of each soft quantum neuron of the output layer, and mapping the probability distribution to the expected label values corresponding to the input data to obtain the output result of the soft quantum neural network;
6) calculating the difference between the output result obtained in step 5) and the expected label value corresponding to the input data; if the difference is not greater than a set threshold, terminating the iteration and storing the trained soft quantum neural network model and its optimized parameters; if the difference is greater than the set threshold, adjusting all parameters in the controlled operations and the bias evolution operations and proceeding to step 7);
7) repeating steps 2)–6) until the set upper limit of the number of repetitions is reached or until the difference between the output result and the expected label value corresponding to the input data is no greater than the set threshold, then terminating the iteration and storing the trained soft quantum neural network model and its optimized parameters.
9. The pattern recognition method according to claim 8, wherein if the soft quantum neural network comprises hidden layers, steps 2) to 5) are replaced by the following:
S1, encoding each component of a training datum from the training data set onto a qubit, taking each qubit as the initialization state of the corresponding soft quantum neuron of the input layer, and measuring the quantum state of all soft quantum neurons of the input layer in the computational basis to obtain a measurement result of 0 or 1;
S2, for each soft quantum neuron from the first soft quantum neuron of the first hidden layer to the last soft quantum neuron of the last hidden layer, sequentially looking up the measurement results of the previous-layer soft quantum neurons connected with it, and sequentially applying the respective controlled operations to it according to each measurement result;
S3, after all controlled operations from the connected soft quantum neurons of the previous layer have been completed, applying the bias evolution operation to the soft quantum neuron to evolve it to a final state, and measuring the final state in the computational basis to obtain 0 or 1 as the measurement result, which prepares the controlled operations of the next layer;
S4, traversing the soft quantum neurons of the output layer: starting from the first soft quantum neuron of the output layer, sequentially looking up the measurement results of the last-hidden-layer soft quantum neurons connected with it, and sequentially applying the respective controlled operations to it according to each measurement result;
S5, after all controlled operations from the connected soft quantum neurons of the previous layer have been completed, applying the bias evolution operation to the soft quantum neuron to evolve it to a final state, and measuring the final state in the computational basis to obtain 0 or 1 as the measurement result;
S6, repeating steps S1–S5 multiple times to obtain the probability distribution of the measurement results of each soft quantum neuron of the output layer, and mapping the probability distribution to the expected label values corresponding to the input data to obtain the output result of the soft quantum neural network.
10. The pattern recognition method of claim 8, wherein the training of the soft quantum neural network further comprises verifying the training effect of the soft quantum neural network, specifically:
dividing the training data set into disjoint training and test sets; after all data of the training set have been used for training, evaluating the trained soft quantum neural network on the test set, computing its classification accuracy, and retraining the soft quantum neural network if the accuracy is lower than a set threshold.
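To make the layer-by-layer procedure of claims 8 and 9 concrete, the following is a minimal classical-simulation sketch offered under stated assumptions rather than as the claimed implementation: every controlled operation of claim 2 is stood in for by a single-qubit R_y rotation, the bias evolution is likewise an R_y rotation, the layers are fully connected, and all names, parameter shapes, and the 4-3-1 topology are illustrative.

```python
import numpy as np

def ry(angle):
    """Single-qubit R_y rotation matrix."""
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]])

def measure_bit(state, rng):
    """Computational-basis measurement of a single-qubit state vector."""
    return int(rng.random() < state[1] ** 2)

def layer_forward(prev_bits, thetas, betas, rng):
    """One fully connected layer of soft quantum neurons.

    thetas[j, k] is the angle applied to neuron j of this layer whenever
    neuron k of the previous layer measured 1; betas[j] is the bias-evolution
    angle of neuron j.  Returns this layer's measured bits.
    """
    out_bits = []
    for theta_row, beta in zip(thetas, betas):
        state = np.array([1.0, 0.0])               # neuron starts in |0>
        for bit, theta in zip(prev_bits, theta_row):
            if bit == 1:                           # measurement-conditioned controlled operation
                state = ry(theta) @ state
        state = ry(beta) @ state                   # bias evolution
        out_bits.append(measure_bit(state, rng))
    return out_bits

def network_forward(x, layers, n_shots=300, seed=0):
    """Estimate each output neuron's probability of measuring 1 by repetition.

    `layers` is a list of (thetas, betas) pairs, one per layer after the
    input layer; the input layer is obtained by angle-encoding x.
    """
    rng = np.random.default_rng(seed)
    counts = np.zeros(len(layers[-1][1]))
    for _ in range(n_shots):
        bits = [int(rng.random() < np.sin(v / 2) ** 2) for v in x]
        for thetas, betas in layers:
            bits = layer_forward(bits, thetas, betas, rng)
        counts += np.array(bits)
    return counts / n_shots

# A 4-3-1 network: 4 input neurons, one hidden layer of 3 neurons, one output neuron.
rng = np.random.default_rng(7)
layers = [
    (rng.uniform(0, np.pi, (3, 4)), rng.uniform(0, np.pi, 3)),   # hidden layer
    (rng.uniform(0, np.pi, (1, 3)), rng.uniform(0, np.pi, 1)),   # output layer
]
probs = network_forward(np.array([0.3, 0.3, 0.7, 0.7]) * np.pi, layers)
print(probs)   # estimated probability of measuring 1 on the output neuron
```

Training would then proceed as in steps 5) to 7) of claim 8: the probabilities returned by network_forward are compared with the expected labels through a loss function, and the angles playing the role of thetas and betas are adjusted, for example with the gradient-based method of claim 3.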