CN116912556A - Picture classification method and device, electronic equipment and storage medium - Google Patents

Picture classification method and device, electronic equipment and storage medium

Info

Publication number
CN116912556A
Authority
CN
China
Prior art keywords
neural network
quantum
classified
quantum neural
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310683308.XA
Other languages
Chinese (zh)
Inventor
于春霖
王迈达
吕启闻
陈岳
曹希
周朋
张鲁峰
李璇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Great Wall Technology Group Co ltd
Original Assignee
China Great Wall Technology Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Great Wall Technology Group Co ltd filed Critical China Great Wall Technology Group Co ltd
Priority to CN202310683308.XA
Publication of CN116912556A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to the technical field of picture classification, and provides a picture classification method and device, an electronic device, and a storage medium. The method includes: obtaining a picture to be classified; performing feature extraction and dimension reduction on the picture to be classified to obtain a target feature vector corresponding to the picture to be classified; inputting the target feature vector into a plurality of quantum neural network layers arranged in parallel for classification, and obtaining a classification result from each quantum neural network layer; and determining a final classification result of the picture to be classified according to the classification results of the quantum neural network layers. With the present application, the feature vector can be processed by quantum neural network layers that each use only a small number of quantum bits to obtain a picture classification result that matches the effect of processing by a quantum neural network layer with a large number of quantum bits, so the effect of classifying pictures with a classical-quantum hybrid neural network can be improved even when the quantum bits supported by a quantum chip are limited.

Description

Picture classification method and device, electronic equipment and storage medium
Technical Field
The application belongs to the technical field of picture classification, and in particular relates to a picture classification method and device, an electronic device, and a storage medium.
Background
At present, classical-quantum hybrid neural networks, obtained by combining a classical neural network with a quantum neural network, are increasingly widely used in the field of picture classification.
However, the performance of a classical-quantum hybrid neural network is limited by the number of quantum bits the quantum chip can support: the more quantum bits the chip supports, the better the network performs. Because the number of quantum bits supported by current quantum chips is limited, the performance of existing classical-quantum hybrid neural networks is low, and the resulting picture classification effect is poor.
Disclosure of Invention
In view of the above, the embodiments of the present application provide a method, an apparatus, an electronic device, and a storage medium for classifying pictures, so as to solve the technical problem that the existing method for classifying pictures by using a classical quantum hybrid neural network has poor effect.
In a first aspect, an embodiment of the present application provides a method for classifying pictures, including:
obtaining pictures to be classified;
performing feature extraction and dimension reduction on the pictures to be classified to obtain target feature vectors corresponding to the pictures to be classified;
Inputting the target feature vector into a plurality of quantum neural network layers which are arranged in parallel for classification processing, and respectively obtaining classification results of each quantum neural network layer;
and determining a final classification result of the pictures to be classified according to the classification result of each quantum neural network layer.
Optionally, the performing feature extraction and dimension reduction on the to-be-classified picture to obtain a target feature vector corresponding to the to-be-classified picture includes:
inputting the pictures to be classified into a trained feature extraction network to perform feature extraction and dimension reduction processing to obtain the target feature vector;
the feature extraction network comprises a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a full connection layer; inputting the pictures to be classified into a trained feature extraction network for feature extraction and dimension reduction processing to obtain the target feature vector, wherein the method comprises the following steps of:
inputting the pictures to be classified into the first convolution layer for feature extraction and dimension reduction processing to obtain a first feature vector;
inputting the first feature vector to the first pooling layer for dimension reduction processing to obtain a second feature vector;
Inputting the second feature vector into the second convolution layer for feature extraction processing to obtain a third feature vector;
inputting the third feature vector to the second pooling layer for dimension reduction processing to obtain a fourth feature vector;
and converting the fourth feature vector into N M-dimensional feature vectors through the full connection layer, determining the N M-dimensional feature vectors as the target feature vector, wherein N represents the preset number of picture categories, and the value of M is determined according to the number of quantum bits of each quantum neural network.
Optionally, the inputting the target feature vector to a plurality of quantum neural network layers arranged in parallel for classification processing includes:
and respectively inputting each M-dimensional feature vector of the N M-dimensional feature vectors into one quantum neural network layer for classification processing.
Optionally, the feature extraction network and the plurality of quantum neural network layers are optimized by training in the following manner:
acquiring a sample image marked with a category label;
obtaining a sample classification result obtained after the sample image is subjected to classification treatment through the feature extraction network and the quantum neural network layers;
And adjusting parameters of the feature extraction network and parameters of the quantum neural network layers in a back propagation mode according to the sample classification result and the class labels.
Optionally, the parameters of the feature extraction network include conversion parameters of a full connection layer, each quantum neural network layer includes a variational circuit sub-network, and the adjusting, according to the sample classification result and the class labels, the parameters of the feature extraction network and the parameters of the plurality of quantum neural network layers by back propagation includes:
adjusting the conversion parameters of the full connection layer by back propagation according to the sample classification result and the class labels, and adjusting the rotation angles of the logic gates in the variational circuit sub-network of each quantum neural network layer.
Optionally, each quantum neural network layer includes a quantum state preparation sub-network, a variational circuit sub-network and a measurement sub-network; taking the target quantum neural network layer as any one of the quantum neural network layers, the inputting the target feature vector into a plurality of quantum neural network layers arranged in parallel for classification processing and respectively obtaining the classification result of each quantum neural network layer includes:
Encoding the target feature vector into a rotation angle of a designated logic gate according to a preset sequence through a quantum state preparation sub-network of the target quantum neural network layer to obtain a quantum state corresponding to the target feature vector, wherein the designated logic gate is a logic gate arranged in the quantum state preparation sub-network of the target quantum neural network layer;
performing variational processing on the quantum state through the variational circuit sub-network of the target quantum neural network layer to obtain the quantum state after the variation;
and measuring the quantum state after the variation through a measurement sub-network of the target quantum neural network layer to obtain a measurement result, and determining a classification result of the target quantum neural network layer according to the measurement result.
Optionally, the determining a final classification result of the to-be-classified picture according to the classification result of each quantum neural network layer includes:
And determining an average value of the classification results of each quantum neural network layer as a final classification result of the picture to be classified.
In a second aspect, an embodiment of the present application provides a picture classifying apparatus, including:
the first acquisition unit is used for acquiring pictures to be classified;
the feature extraction and dimension reduction unit is used for carrying out feature extraction and dimension reduction on the pictures to be classified to obtain target feature vectors corresponding to the pictures to be classified;
the classifying unit is used for inputting the target feature vector into a plurality of quantum neural network layers which are arranged in parallel to carry out classifying treatment, and respectively obtaining a classifying result of each quantum neural network layer;
and the first determining unit is used for determining a final classification result of the picture to be classified according to the classification result of each quantum neural network layer.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the picture classification method according to any one of the first aspect when the processor executes the computer program.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the picture classification method according to any one of the first aspects.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to perform the steps of the picture classification method as set forth in any one of the first aspects above.
The picture classification method, the device, the equipment and the medium provided by the embodiment of the application have the following beneficial effects:
according to the picture classification method provided by the embodiment of the application, the picture to be classified is obtained, the feature extraction and dimension reduction treatment are further carried out on the picture to be classified, the target feature vector corresponding to the picture to be classified is obtained, then the target feature vector is input into a plurality of quantum neural network layers which are arranged in parallel for classification treatment, the classification result of each quantum neural network layer is respectively obtained, and finally the final classification result of the picture to be classified is determined according to the classification result of each quantum neural network layer. By adopting the picture classification method provided by the embodiment of the application, the characteristic vector can be processed through the quantum neural network layers with a small amount of quantum bits to obtain the picture classification result, so that the picture classification effect obtained by processing the quantum neural network layers with a large amount of quantum bits is achieved, and therefore, the picture classification method provided by the embodiment of the application can improve the picture classification effect through the classical quantum hybrid neural network under the condition that the quantum bits supported by the quantum chip are limited.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of an implementation of a method for classifying pictures according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a feature extraction network according to an embodiment of the present application;
FIG. 3 is a flowchart of an implementation of obtaining a target feature vector according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a quantum neural network layer according to an embodiment of the present application;
fig. 5 is a schematic flow chart of obtaining a classification result of each quantum neural network layer according to an embodiment of the present application;
FIG. 6 is a flowchart of an implementation of training optimization according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a picture classifying apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
It is to be understood that the terminology used in the embodiments of the application is for the purpose of describing particular embodiments of the application only, and is not intended to be limiting of the application. In the description of the embodiments of the present application, unless otherwise indicated, "a plurality" means two or more, and "at least one", "one or more" means one, two or more. The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a definition of "a first", "a second" feature may explicitly or implicitly include one or more of such features.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The execution subject of the picture classification method provided by the embodiment of the application may be an electronic device. The electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a desktop computer, a picture classification apparatus, or the like.
The picture classification method provided by the embodiment of the application can be applied to obtaining, with a good classification effect, a final classification result for a picture to be classified by means of a quantum chip that supports only a small number of quantum bits.
Specifically, when a user needs to obtain such a result through a quantum chip with few supported quantum bits, each step of the picture classification method provided by the embodiment of the application can be executed by the electronic device, so that a final classification result with a good classification effect is obtained despite the limited quantum bits.
The classification effect may be judged by corresponding metrics, which may include, by way of example and not limitation, classification accuracy and classification speed.
Referring to fig. 1, fig. 1 is a flowchart illustrating an implementation of a picture classification method according to an embodiment of the present application, where the picture classification method may include S101 to S104, which are described in detail as follows:
In S101, a picture to be classified is acquired.
In the embodiment of the application, the electronic equipment can firstly acquire the pictures to be classified. The pictures to be classified can be input into the electronic device by a user, or can be obtained from a preset storage position by the electronic device in a preset mode. The specific method for acquiring the pictures to be classified is not particularly limited herein.
In order to achieve a better picture classification result, a minimum pixel threshold may be preset for the picture to be classified. The minimum pixel threshold may be set according to actual requirements; for example, it may be set to 150×150. Based on this, the resolution of the picture to be classified acquired by the electronic device should be greater than the minimum pixel threshold, for example greater than 150×150. Before acquiring a picture to be classified, the electronic device may first obtain its resolution; if the resolution is smaller than the preset minimum pixel threshold, the picture is not acquired.
In one possible implementation, after S101, step a may be further included. Wherein:
in step a, dimension reduction processing is performed on the picture to be classified so as to adjust its resolution to the minimum pixel threshold.
In this implementation, to make the subsequent steps easier to carry out, for example the step of performing feature extraction and dimension reduction on the picture to be classified to obtain its target feature vector, the picture to be classified may be down-sampled in advance, before it enters the feature extraction network, so that its resolution is adjusted to the minimum pixel threshold. For example, each picture to be classified may be resized to 150×150.
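As an illustration of this pre-processing step, the following is a minimal sketch assuming the Pillow library; the function name, the rejection behaviour for undersized pictures, and the 150×150 value (taken from the example above) are illustrative assumptions rather than part of the patent.

```python
# A minimal pre-processing sketch, assuming Pillow: reject pictures below the
# minimum pixel threshold and down-sample the rest to exactly that threshold.
from PIL import Image

MIN_SIZE = (150, 150)  # example minimum pixel threshold from the text above

def load_picture_to_classify(path: str) -> Image.Image:
    picture = Image.open(path).convert("RGB")
    if picture.width < MIN_SIZE[0] or picture.height < MIN_SIZE[1]:
        # Pictures below the minimum pixel threshold are not acquired.
        raise ValueError(f"{picture.size} is below the minimum pixel threshold {MIN_SIZE}")
    # Dimension reduction: adjust the resolution to the minimum pixel threshold.
    return picture.resize(MIN_SIZE)
```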
In S102, feature extraction and dimension reduction processing are performed on the picture to be classified, so as to obtain a target feature vector corresponding to the picture to be classified.
In the embodiment of the application, after the electronic equipment acquires the picture to be classified, the electronic equipment can perform feature extraction and dimension reduction processing on the picture to be classified to obtain the target feature vector corresponding to the picture to be classified.
In one possible implementation manner, the image to be classified may be input into a trained feature extraction network to perform feature extraction and dimension reduction processing, so as to obtain a target feature vector. The feature extraction network may be a classical neural network.
Specifically, as shown in fig. 2, fig. 2 is a schematic structural diagram of a feature extraction network according to an embodiment of the present application, where the feature extraction network 2 may include a first convolution layer 21, a first pooling layer 22, a second convolution layer 23, a second pooling layer 24, and a full connection layer 25, where the first convolution layer 21 is connected to the first pooling layer 22, the first pooling layer 22 is connected to the second convolution layer 23, the second convolution layer 23 is connected to the second pooling layer 24, the second pooling layer 24 is connected to the full connection layer 25, and the full connection layer 25 is connected to each quantum neural network layer.
In this implementation manner, the target feature vector may be obtained through S201 to S205 shown in fig. 3, and fig. 3 is a flowchart of an implementation of obtaining the target feature vector according to an embodiment of the present application. The details are as follows:
in S201, a picture to be classified is input to a first convolution layer to perform feature extraction and dimension reduction processing, so as to obtain a first feature vector.
In this implementation, after obtaining a picture to be classified whose resolution has been adjusted to the minimum pixel threshold, the electronic device may input it into the first convolution layer of the feature extraction network, and perform feature extraction and dimension reduction through the first convolution layer to obtain the first feature vector.
In S202, the first feature vector is input to the first pooling layer for performing a dimension reduction process, so as to obtain a second feature vector.
In this implementation manner, after obtaining the first feature vector, the electronic device may input the first feature vector into a first pooling layer of the feature extraction network, and perform a dimension reduction process through the first pooling layer to obtain the second feature vector.
In S203, the second feature vector is input to the second convolution layer to perform feature extraction processing, so as to obtain a third feature vector.
In this implementation manner, after obtaining the second feature vector, the electronic device may input the second feature vector into a second convolution layer of the feature extraction network, and perform feature extraction processing through the second convolution layer to obtain a third feature vector.
In S204, the third feature vector is input to the second pooling layer for performing the dimension reduction processing, so as to obtain a fourth feature vector.
In this implementation manner, after obtaining the third feature vector, the electronic device may input the third feature vector into the second pooling layer of the feature extraction network, and perform the dimension reduction processing through the second pooling layer to obtain the fourth feature vector.
In S205, the fourth feature vector is converted into N M-dimensional feature vectors through the full connection layer, and the N M-dimensional feature vectors are determined as target feature vectors, where N represents a preset number of picture categories, and the value of M is determined according to the number of quantum bits of each quantum neural network.
In this implementation manner, after obtaining the fourth feature vector, the electronic device may input the fourth feature vector into the full connection layer of the feature extraction network, convert the fourth feature vector into N M-dimensional feature vectors through the full connection layer, and determine the N M-dimensional feature vectors as the target feature vector.
Wherein N represents a preset number of picture categories. The number of preset picture categories may be set according to actual requirements, which is not limited herein. For example, the number of preset picture categories may be 4, and when the number of preset picture categories is 4, the fourth feature vector may be converted into 4M-dimensional feature vectors through the full connection layer, and the 4M-dimensional feature vectors may be determined as the target feature vectors, and so on.
The value of M may be determined according to the number of qubits of each quantum neural network. For example, the number of qubits that each quantum neural network has may be 2, so the maximum dimension of the feature vector that each quantum neural network can receive is 2, and thus the value of M may be determined to be 2, based on which the fourth feature vector may be converted into N two-dimensional feature vectors by the full connection layer, and the N two-dimensional feature vectors may be determined to be the target vectors. For example, the number of qubits that each quantum neural network has may be 1, so the maximum dimension of the feature vector that each quantum neural network can receive is 1, and thus the value of M may be determined to be 1, based on which the fourth feature vector may be converted into N one-dimensional feature vectors by the full connection layer, and the N one-dimensional feature vectors may be determined to be target vectors.
In the embodiment of the application, the number of quantum bits of each quantum neural network is determined by the quantum chip corresponding to that network: if the corresponding quantum chip has 2 quantum bits, the quantum neural network has 2 quantum bits; if the corresponding quantum chip has 1 quantum bit, the quantum neural network has 1 quantum bit.
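To make the structure of figs. 2 and 3 concrete, the following is a minimal PyTorch sketch of the feature extraction network (first convolution layer, first pooling layer, second convolution layer, second pooling layer, full connection layer). The channel counts, kernel sizes, the 150×150 input, and the choice of N=4 and M=2 follow the examples above or are assumptions; the patent only fixes the layer order and the N M-dimensional output.

```python
# A minimal PyTorch sketch of the feature extraction network. Channel counts
# and kernel sizes are assumptions; only the layer order and the N x M output
# shape come from the text above.
import torch
import torch.nn as nn

class FeatureExtractionNetwork(nn.Module):
    def __init__(self, n_classes: int = 4, m_qubits: int = 2):
        super().__init__()
        self.n_classes = n_classes
        self.m_qubits = m_qubits
        self.conv1 = nn.Conv2d(3, 8, kernel_size=5)   # first convolution layer
        self.pool1 = nn.MaxPool2d(2)                  # first pooling layer
        self.conv2 = nn.Conv2d(8, 16, kernel_size=5)  # second convolution layer
        self.pool2 = nn.MaxPool2d(2)                  # second pooling layer
        # Full connection layer converts the fourth feature vector into
        # N M-dimensional feature vectors (here 4 x 2 = 8 values).
        self.fc = nn.LazyLinear(n_classes * m_qubits)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.pool1(torch.relu(self.conv1(x)))  # first and second feature vectors
        x = self.pool2(torch.relu(self.conv2(x)))  # third and fourth feature vectors
        x = torch.flatten(x, start_dim=1)
        x = self.fc(x)
        # Reshape into N M-dimensional target feature vectors per picture.
        return x.view(-1, self.n_classes, self.m_qubits)

# Example: a batch of one 150x150 RGB picture yields a (1, 4, 2) tensor.
features = FeatureExtractionNetwork()(torch.randn(1, 3, 150, 150))
```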
In S103, the target feature vector is input to a plurality of quantum neural network layers arranged in parallel for classification processing, and classification results of each quantum neural network layer are obtained respectively.
In the embodiment of the application, after the electronic device determines the N M-dimensional feature vectors as the target feature vectors, the target feature vectors can be input into a plurality of quantum neural network layers which are arranged in parallel for classification processing, so that classification results of each quantum neural network layer are respectively obtained.
In one possible implementation, each M-dimensional feature vector of the N M-dimensional feature vectors may be separately input to a quantum neural network layer for classification processing.
For example, when the target feature vector includes 4 two-dimensional feature vectors (a first two-dimensional feature vector, a second two-dimensional feature vector, a third two-dimensional feature vector, and a fourth two-dimensional feature vector), the first two-dimensional feature vector may be input into the first quantum neural network layer for classification processing, the second two-dimensional feature vector may be input into the second quantum neural network layer for classification processing, the third two-dimensional feature vector may be input into the third quantum neural network layer for classification processing, and the fourth two-dimensional feature vector may be input into the fourth quantum neural network layer for classification processing, thereby respectively obtaining classification results corresponding to the first quantum neural network layer, the second quantum neural network layer, the third quantum neural network layer, and the fourth quantum neural network layer.
As can be seen from the above, when the number of bits of a quantum chip is limited, for example when each chip has only M quantum bits, each quantum neural network likewise has M quantum bits. The fourth feature vector can therefore be decomposed into N M-dimensional feature vectors, and each M-dimensional feature vector is input into its own quantum neural network for classification, yielding a classification result for each quantum neural network layer. In this way, the picture classification result that would otherwise require a quantum neural network layer with a large number of quantum bits can be obtained through a plurality of quantum neural network layers each having a small number of quantum bits.
After each M-dimensional feature vector of the N M-dimensional feature vectors has been input into its own quantum neural network layer for classification, the classification result of each quantum neural network layer is obtained. The classification results of the different quantum neural network layers may be the same or different, depending on the actual situation.
In one possible implementation, as shown in fig. 4, fig. 4 is a schematic structural diagram of a quantum neural network layer provided in an embodiment of the present application, where each quantum neural network layer includes a quantum state preparation sub-network 41, a variational circuit sub-network 42, and a measurement sub-network 43. The quantum state preparation sub-network 41 is connected with the variational circuit sub-network 42, and the variational circuit sub-network 42 is connected with the measurement sub-network 43.
The classification result of each quantum neural network layer can be obtained through S301 to S303 shown in fig. 5, and fig. 5 is a schematic flow chart of obtaining the classification result of each quantum neural network layer according to the embodiment of the present application. The details are as follows:
in S301, a target feature vector is encoded into a rotation angle of a designated logic gate according to a preset order through a quantum state preparation sub-network of a target quantum neural network layer, so as to obtain a quantum state corresponding to the target feature vector.
The designated logic gate is a logic gate arranged in a quantum state preparation sub-network of the target quantum neural network layer.
In this implementation, the target quantum neural network layer may be any quantum neural network layer that receives the target feature vector. For example, when the target feature vector includes 4 one-dimensional feature vectors, the target feature vector may be received and processed through the first quantum neural network layer and the second quantum neural network layer, and then the first quantum neural network layer and the second quantum neural network layer are the target quantum neural network layers.
The electronic device can receive the corresponding target feature vector through the quantum state preparation sub-network of the target quantum neural network layer, and can encode the target feature vector into the rotation angle of a logic gate arranged in the quantum state preparation sub-network of the target quantum neural network layer according to a preset sequence, so that the quantum state corresponding to the target feature vector can be obtained. The above step of obtaining the quantum state corresponding to the target feature vector may be performed for each target quantum neural network layer, so that the quantum state corresponding to each target quantum neural network layer may be obtained.
In S302, variational processing is performed on the quantum state through the variational circuit sub-network of the target quantum neural network layer to obtain the quantum state after the variation.
In this implementation, after the electronic device obtains the quantum state corresponding to each target quantum neural network layer, it can perform variational processing on the corresponding quantum state through the variational circuit sub-network of that target quantum neural network layer to obtain the varied quantum state. The above step of obtaining the varied quantum state may be performed for each target quantum neural network layer, so that the varied quantum state corresponding to each target quantum neural network layer is obtained.
In S303, the measurement sub-network of the target quantum neural network layer is used to measure the quantum state after the variation, so as to obtain a measurement result, and the classification result of the target quantum neural network layer is determined according to the measurement result.
In this implementation, after the electronic device obtains the varied quantum state of each target quantum neural network layer, it can measure the varied quantum state through the measurement sub-network of that target quantum neural network layer to obtain a measurement result, and determine the classification result of the target quantum neural network layer according to the measurement result. The step of obtaining the classification result can be performed for each target quantum neural network layer, so that the classification result corresponding to each target quantum neural network layer is obtained. The specific method for determining the classification result of the target quantum neural network layer according to the measurement result may be set according to the practical application, and is not limited herein.
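The following NumPy sketch simulates one two-qubit quantum neural network layer with the three sub-networks just described: the quantum state preparation sub-network encodes the M=2 feature values as RY rotation angles, the variational circuit sub-network applies trainable RY rotations and a CNOT, and the measurement sub-network reads out a Pauli-Z expectation value. The concrete gate set and the use of a Z expectation as the classification score are assumptions; the patent does not fix them.

```python
# A NumPy simulation of a single two-qubit quantum neural network layer:
# state preparation (angle encoding), variational circuit (trainable RY + CNOT),
# and measurement (Pauli-Z expectation on the first qubit).
import numpy as np

def ry(theta: float) -> np.ndarray:
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
Z = np.diag([1.0, -1.0])
I2 = np.eye(2)

def quantum_layer(features: np.ndarray, angles: np.ndarray) -> float:
    """features: the M=2 dimensional feature vector; angles: trainable rotation angles."""
    # Quantum state preparation sub-network: encode features as RY angles on |00>.
    state = np.zeros(4)
    state[0] = 1.0
    state = np.kron(ry(features[0]), ry(features[1])) @ state
    # Variational circuit sub-network: trainable rotations followed by entanglement.
    state = np.kron(ry(angles[0]), ry(angles[1])) @ state
    state = CNOT @ state
    # Measurement sub-network: Pauli-Z expectation on the first qubit as the score.
    return float(state @ np.kron(Z, I2) @ state)

score = quantum_layer(np.array([0.3, 1.1]), np.array([0.5, -0.2]))
```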
In S104, according to the classification result of each quantum neural network layer, a final classification result of the picture to be classified is determined.
In the embodiment of the application, after the electronic equipment obtains the classification result of each quantum neural network layer, the final classification result of the picture to be classified can be determined according to the classification result of each quantum neural network layer.
In one possible implementation, the average value of the classification results of the quantum neural network layers may be determined as the final classification result of the picture to be classified. It should be noted that the final classification result may also be determined from the per-layer classification results in other manners; the averaging described here is merely exemplary and not limiting.
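Putting S103 and S104 together, the sketch below dispatches the N M-dimensional feature vectors to N parallel quantum neural network layers and averages their results, as described above. The layers are stubbed as ordinary callables purely for illustration; in the full pipeline each would be a quantum neural network layer of its own.

```python
# A sketch of S103-S104: feed each M-dimensional feature vector to its own
# quantum neural network layer, then average the per-layer classification
# results to obtain the final classification result.
import numpy as np

def classify_with_parallel_layers(target_features: np.ndarray, layers) -> float:
    """target_features has shape (N, M); layers holds one callable per layer."""
    per_layer_results = np.array(
        [layer(vec) for layer, vec in zip(layers, target_features)]
    )
    # Final classification result: the average of the per-layer results.
    return float(per_layer_results.mean())

# Example with N=4 categories, M=2 qubits and stubbed quantum layers.
stub_layers = [lambda v: float(np.tanh(v.sum())) for _ in range(4)]
final_result = classify_with_parallel_layers(np.random.rand(4, 2), stub_layers)
```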
It can be seen from the foregoing that, in the picture classification method provided by the embodiment of the present application, the picture to be classified is obtained, feature extraction and dimension reduction are performed on it to obtain a corresponding target feature vector, the target feature vector is input into a plurality of quantum neural network layers arranged in parallel for classification, a classification result is obtained from each quantum neural network layer, and finally the final classification result of the picture to be classified is determined according to the classification results of the quantum neural network layers. With this method, the feature vector can be processed by quantum neural network layers that each use only a small number of quantum bits to obtain a picture classification result, and the classification effect matches that of processing by a quantum neural network layer with a large number of quantum bits.
In the embodiment of the application, before the steps are executed, training optimization can be performed on the feature extraction network and each quantum neural network layer in advance. Referring to fig. 6, fig. 6 is a flowchart of an implementation of training optimization according to an embodiment of the present application, and as shown in fig. 6, training optimization on a feature extraction network and each quantum neural network layer may be implemented through steps S401 to S403. The details are as follows:
in S401, a sample image of a labeled category label is acquired.
In the embodiment of the application, the electronic device can acquire sample images labeled with category labels. The sample images form the training set used to train and optimize the feature extraction network and each quantum neural network layer. A user may input the labeled sample images into the electronic device, or the electronic device may acquire them from a preset location in a preset manner.
In S402, a sample classification result obtained by classifying a sample image through a feature extraction network and a plurality of quantum neural network layers is obtained.
In the embodiment of the application, after the electronic device acquires the sample images marked with the class labels, the electronic device can input each sample image into the feature extraction network and the quantum neural network layers for classification processing to obtain sample classification results corresponding to each sample image.
In S403, parameters of the feature extraction network and parameters of the plurality of quantum neural network layers are adjusted by back propagation according to the sample classification result and the class labels.
In the embodiment of the application, after the electronic equipment obtains the sample classification result corresponding to each sample image, parameters of the feature extraction network and parameters of a plurality of quantum neural network layers can be adjusted in a back propagation mode according to the sample classification result and the class label.
In one possible implementation, the parameters of the feature extraction network may include the conversion parameters of the full connection layer, each quantum neural network layer includes a variational circuit sub-network, and the parameters of each quantum neural network layer may include the rotation angles of the logic gates in the variational circuit sub-network of that layer. Based on this, S403 may be implemented as step b, as follows:
in step b, the conversion parameters of the full connection layer are adjusted by back propagation according to the sample classification results and the class labels, and the rotation angles of the logic gates in the variational circuit sub-network of each quantum neural network layer are adjusted.
In this implementation, the electronic device may adjust the conversion parameters of the full connection layer in the feature extraction network by a back propagation algorithm according to each sample classification result and its class label, and adjust the rotation angles of the logic gates in the variational circuit sub-network of each quantum neural network layer, until the loss function of the classification results falls to a preset loss function threshold, thereby obtaining the target conversion parameters of the full connection layer and the target rotation angles of the logic gates in each variational circuit sub-network. The electronic device may then fix the target conversion parameters and target rotation angles, giving a trained and optimized feature extraction network and quantum neural network layers.
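As a concrete illustration of how a gate rotation angle can be adjusted during this training, the sketch below runs gradient descent on a single rotation angle of a toy one-qubit circuit using the parameter-shift rule. The parameter-shift rule is a common way to obtain gradients of variational circuits, but the patent only states that back propagation is used, so the specific rule, the toy circuit, and the squared-error loss are assumptions.

```python
# A self-contained sketch of one way to update a gate rotation angle during
# training. The parameter-shift rule shown here is an assumption; the patent
# only states that the angle is adjusted by back propagation.
import numpy as np

def expectation(feature: float, angle: float) -> float:
    # Toy single-qubit layer: RY(feature) then RY(angle) on |0>, measured in Z.
    # For this circuit, <Z> = cos(feature + angle).
    return float(np.cos(feature + angle))

def loss(feature: float, angle: float, label: float) -> float:
    return (expectation(feature, angle) - label) ** 2

def parameter_shift_grad(feature: float, angle: float, label: float) -> float:
    # d<Z>/d(angle) via the parameter-shift rule, then the chain rule for the loss.
    shift = np.pi / 2
    d_expectation = 0.5 * (expectation(feature, angle + shift)
                           - expectation(feature, angle - shift))
    return 2.0 * (expectation(feature, angle) - label) * d_expectation

feature, label = 0.4, 1.0
angle, lr = 0.8, 0.1
for _ in range(50):  # simple gradient descent on the rotation angle
    angle -= lr * parameter_shift_grad(feature, angle, label)
```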
Based on the image classification method provided by the above embodiment, the embodiment of the present application further provides an image classification device for implementing the above method embodiment, please refer to fig. 7, fig. 7 is a schematic structural diagram of an image classification device provided by the embodiment of the present application. As shown in fig. 7, the picture classification apparatus 7 may include a first acquisition unit 71, a feature extraction and dimension reduction unit 72, a classification unit 73, and a first determination unit 74. Wherein:
the first acquisition unit 71 is configured to acquire a picture to be classified.
The feature extraction and dimension reduction unit 72 is configured to perform feature extraction and dimension reduction on the picture to be classified, so as to obtain a target feature vector corresponding to the picture to be classified.
The classification unit 73 is configured to input the target feature vector to a plurality of quantum neural network layers arranged in parallel to perform classification processing, and obtain a classification result of each quantum neural network layer.
The first determining unit 74 is configured to determine a final classification result of the picture to be classified according to the classification result of each quantum neural network layer.
Optionally, the feature extraction and dimension reduction unit 72 is specifically configured to:
inputting the pictures to be classified into a trained feature extraction network to perform feature extraction and dimension reduction processing to obtain target feature vectors;
The feature extraction network comprises a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a full connection layer; inputting the pictures to be classified into a trained feature extraction network to perform feature extraction and dimension reduction processing to obtain target feature vectors, wherein the method comprises the following steps:
inputting the pictures to be classified into a first convolution layer for feature extraction and dimension reduction processing to obtain a first feature vector;
inputting the first feature vector into a first pooling layer for dimension reduction processing to obtain a second feature vector;
inputting the second feature vector into a second convolution layer for feature extraction processing to obtain a third feature vector;
inputting the third feature vector into a second pooling layer for dimension reduction processing to obtain a fourth feature vector;
and converting the fourth feature vector into N M-dimensional feature vectors through the full connection layer, determining the N M-dimensional feature vectors as target feature vectors, wherein N represents the number of preset picture categories, and the value of M is determined according to the number of quantum bits of each quantum neural network.
Optionally, the classifying unit 73 is specifically configured to:
and respectively inputting each M-dimensional feature vector of the N M-dimensional feature vectors into a quantum neural network layer for classification processing.
Optionally, the picture classifying apparatus 7 may further include a second acquiring unit, a third acquiring unit, and a parameter adjusting unit. Wherein:
the second acquisition unit is used for acquiring a sample image marked with the category label.
The third acquisition unit is used for acquiring a sample classification result obtained after the sample image is subjected to classification processing through the feature extraction network and the quantum neural network layers.
The parameter adjusting unit is used for adjusting parameters of the feature extraction network and parameters of the plurality of quantum neural network layers in a back propagation mode according to the sample classification result and the category labels.
Optionally, the parameters of the feature extraction network include conversion parameters of the full connection layer, each quantum neural network layer includes a variational circuit sub-network, and the parameter adjusting unit is specifically configured to:
adjust, according to the sample classification result and the class labels, the conversion parameters of the full connection layer by back propagation, and adjust the rotation angles of the logic gates in the variational circuit sub-network of each quantum neural network layer.
Optionally, each quantum neural network layer includes a quantum state preparation sub-network, a variational circuit sub-network and a measurement sub-network; taking the target quantum neural network layer as any one of the quantum neural network layers, the classification unit 73 is specifically configured to:
Encoding the target feature vector into the rotation angle of a designated logic gate according to a preset sequence through a quantum state preparation sub-network of the target quantum neural network layer to obtain a quantum state corresponding to the target feature vector, wherein the designated logic gate is a logic gate arranged in the quantum state preparation sub-network of the target quantum neural network layer;
carrying out variational processing on the quantum state through the variational circuit sub-network of the target quantum neural network layer to obtain the varied quantum state;
and measuring the varied quantum states through a measuring sub-network of the target quantum neural network layer to obtain a measuring result, and determining a classification result of the target quantum neural network layer according to the measuring result.
Optionally, the first determining unit 74 is configured to determine an average value of the classification results of each quantum neural network layer as a final classification result of the picture to be classified.
It should be noted that, because the content of information interaction and execution process between the above units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to the method embodiment specifically, and will not be described herein again.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the application. As shown in fig. 8, the electronic device 8 provided in this embodiment may include: a processor 80, a memory 81, and a computer program 82 stored in the memory 81 and executable on the processor 80, for example a program corresponding to the picture classification method. When the processor 80 executes the computer program 82, the steps of the picture classification method embodiments described above are implemented, for example S101 to S104 shown in fig. 1, S201 to S205 shown in fig. 3, S301 to S303 shown in fig. 5, and S401 to S403 shown in fig. 6. Alternatively, when executing the computer program 82, the processor 80 may implement the functions of the modules/units in the apparatus embodiment described above, for example the functions of the units 71 to 74 shown in fig. 7.
By way of example, the computer program 82 may be partitioned into one or more modules/units that are stored in the memory 81 and executed by the processor 80 to complete the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing particular functions for describing the execution of the computer program 82 in the electronic device 8. For example, the computer program 82 may be divided into the first obtaining unit 71, the feature extraction and dimension reduction unit 72, the classifying unit 73 and the first determining unit 74, and the specific functions of the respective units are described with reference to the relevant descriptions in the corresponding embodiment of fig. 7, which is not repeated here.
It will be appreciated by those skilled in the art that fig. 8 is merely an example of the electronic device 8 and does not limit the electronic device 8, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The processor 80 may be a central processing unit (central processing unit, CPU), but may also be other general purpose processors, digital signal processors (digital signal processor, DSP), application specific integrated circuits (application specific integrated circuit, ASIC), off-the-shelf programmable gate arrays (field-programmable gate array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 81 may be an internal storage unit of the electronic device 8, such as a hard disk or a memory of the electronic device 8. The memory 81 may also be an external storage device of the electronic device 8, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card (flash card) or the like, which are provided on the electronic device 8. Further, the memory 81 may also include both an internal storage unit and an external storage device of the electronic device 8. The memory 81 is used to store computer programs and other programs and data required by the electronic device. The memory 81 may also be used to temporarily store data that has been output or is to be output.
It will be clear to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units is illustrated, and in practical application, the above-mentioned functional allocation may be performed by different functional units according to needs, i.e. the internal structure of the picture classification device is divided into different functional units, so as to perform all or part of the above-mentioned functions. The functional units in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units are also only for distinguishing from each other, and are not used to limit the protection scope of the present application. The specific working process of the units in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, which when executed by a processor, performs the steps of the respective method embodiments described above.
The embodiments of the present application provide a computer program product for causing a terminal device to carry out the steps of the respective method embodiments described above when the computer program product is run on the terminal device.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A picture classification method, comprising:
obtaining pictures to be classified;
performing feature extraction and dimension reduction on the pictures to be classified to obtain target feature vectors corresponding to the pictures to be classified;
inputting the target feature vector into a plurality of quantum neural network layers which are arranged in parallel for classification processing, and respectively obtaining classification results of each quantum neural network layer;
and determining a final classification result of the pictures to be classified according to the classification result of each quantum neural network layer.
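For orientation, the Python sketch below mirrors the four steps recited in claim 1. It is an illustrative outline only: the callables `extract_features`, `quantum_layers` and `aggregate` are hypothetical placeholders standing in for the components detailed in the dependent claims, not identifiers used by the patent.

```python
from typing import Callable, Sequence

def classify_picture(picture,
                     extract_features: Callable,         # feature extraction + dimension reduction
                     quantum_layers: Sequence[Callable],  # parallel quantum neural network layers
                     aggregate: Callable):                # merges the per-layer classification results
    """Hypothetical outline of the claimed pipeline; every callable is a stand-in."""
    # Feature extraction and dimension reduction: one reduced vector per quantum layer.
    feature_vectors = extract_features(picture)
    # Classification processing by the quantum neural network layers arranged in parallel.
    per_layer_results = [layer(v) for layer, v in zip(quantum_layers, feature_vectors)]
    # Final classification result determined from the per-layer results.
    return aggregate(per_layer_results)

# Toy usage with trivial stand-ins (two "layers", two-component vectors).
result = classify_picture(
    picture=None,
    extract_features=lambda _: [[0.1, 0.2], [0.3, 0.4]],
    quantum_layers=[sum, sum],
    aggregate=lambda scores: scores.index(max(scores)),
)
```

Under this reading each quantum layer receives its own reduced feature vector; claims 2 and 3 below fix the shape of those vectors.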
2. The picture classification method according to claim 1, wherein the performing feature extraction and dimension reduction on the pictures to be classified to obtain target feature vectors corresponding to the pictures to be classified comprises:
inputting the pictures to be classified into a trained feature extraction network to perform feature extraction and dimension reduction processing to obtain the target feature vector;
the feature extraction network comprises a first convolution layer, a first pooling layer, a second convolution layer, a second pooling layer and a full connection layer; and the inputting the pictures to be classified into the trained feature extraction network for feature extraction and dimension reduction processing to obtain the target feature vector comprises:
inputting the pictures to be classified into the first convolution layer for feature extraction and dimension reduction processing to obtain a first feature vector;
inputting the first feature vector to the first pooling layer for dimension reduction processing to obtain a second feature vector;
inputting the second feature vector into the second convolution layer for feature extraction processing to obtain a third feature vector;
inputting the third feature vector to the second pooling layer for dimension reduction processing to obtain a fourth feature vector;
and converting the fourth feature vector into N M-dimensional feature vectors through the full connection layer, and determining the N M-dimensional feature vectors as the target feature vector, wherein N represents the preset number of picture categories and the value of M is determined according to the number of quantum bits of each quantum neural network layer.
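A minimal PyTorch sketch of the conv-pool-conv-pool-full-connection structure recited in claim 2 is shown below. The input resolution (28x28 grayscale), channel counts, kernel sizes, the ReLU activations, and the example values N = 10 and M = 4 are all assumptions chosen so the example runs; the patent does not fix any of them.

```python
import torch
import torch.nn as nn

N_CLASSES, M_QUBITS = 10, 4   # assumed values of N and M

class FeatureExtractor(nn.Module):
    """Illustrative sketch of the claimed feature extraction network (all sizes assumed)."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 8, kernel_size=3, padding=1)    # first convolution layer
        self.pool1 = nn.MaxPool2d(2)                              # first pooling layer
        self.conv2 = nn.Conv2d(8, 16, kernel_size=3, padding=1)   # second convolution layer
        self.pool2 = nn.MaxPool2d(2)                              # second pooling layer
        self.fc = nn.Linear(16 * 7 * 7, N_CLASSES * M_QUBITS)     # full connection layer (28x28 input assumed)

    def forward(self, x):                            # x: (batch, 1, 28, 28)
        x = self.pool1(torch.relu(self.conv1(x)))    # feature extraction and dimension reduction
        x = self.pool2(torch.relu(self.conv2(x)))    # further extraction and reduction
        x = self.fc(x.flatten(start_dim=1))
        # N vectors of M components each, one per quantum neural network layer.
        return x.view(-1, N_CLASSES, M_QUBITS)

vectors = FeatureExtractor()(torch.randn(1, 1, 28, 28))   # shape (1, 10, 4)
```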
3. The picture classification method according to claim 2, wherein the inputting the target feature vector into a plurality of quantum neural network layers arranged in parallel for classification processing includes:
and respectively inputting each of the N M-dimensional feature vectors into one quantum neural network layer for classification processing.
4. The picture classification method according to claim 2, wherein the feature extraction network and the plurality of quantum neural network layers are optimized by training in the following manner:
acquiring a sample image marked with a class label;
obtaining a sample classification result obtained after the sample image is subjected to classification processing through the feature extraction network and the plurality of quantum neural network layers;
and adjusting the parameters of the feature extraction network and the parameters of the plurality of quantum neural network layers by back propagation according to the sample classification result and the class label.
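Claim 4 describes joint optimization of the classical feature extraction network and the quantum neural network layers by back propagation. The sketch below shows one such training step in PyTorch under a simplifying assumption: the quantum layers are imitated by small classical linear heads so that the example runs without a quantum simulator. With real variational circuits the same loop applies once circuit gradients are available, for instance via the parameter-shift rule illustrated after claim 5.

```python
import torch
import torch.nn as nn

# Stand-in for the hybrid model of claim 4: a classical extractor followed by "heads"
# that imitate the parallel quantum neural network layers. All names and sizes are
# illustrative assumptions, not identifiers from the patent.
class HybridClassifier(nn.Module):
    def __init__(self, n_classes=10, m=4):
        super().__init__()
        self.extractor = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, n_classes * m))
        self.heads = nn.ModuleList([nn.Linear(m, 1) for _ in range(n_classes)])  # quantum-layer stand-ins
        self.n_classes, self.m = n_classes, m

    def forward(self, x):
        v = self.extractor(x).view(-1, self.n_classes, self.m)
        scores = [head(v[:, i, :]) for i, head in enumerate(self.heads)]
        return torch.cat(scores, dim=1)          # one score per class

model = HybridClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random "sample images" marked with class labels.
images, labels = torch.randn(8, 1, 28, 28), torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = loss_fn(model(images), labels)   # compare the sample classification result with the class labels
loss.backward()                         # back propagation adjusts both sets of parameters
optimizer.step()
```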
5. The picture classification method according to claim 4, wherein the parameters of the feature extraction network include conversion parameters of the full connection layer, each of the quantum neural network layers includes a variational circuit sub-network, and the adjusting the parameters of the feature extraction network and the parameters of the plurality of quantum neural network layers by back propagation according to the sample classification result and the class label comprises:
adjusting the conversion parameters of the full connection layer by back propagation according to the sample classification result and the class label, and adjusting the rotation angles of the logic gates in the variational circuit sub-network of each quantum neural network layer.
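Claim 5 adjusts the rotation angles of the logic gates in each variational circuit sub-network by back propagation. For rotation gates such as RY, the required gradient can be computed with the parameter-shift rule; the NumPy snippet below illustrates it for a single qubit, where the Pauli-Z expectation after RY(theta) applied to |0> equals cos(theta) and the shifted evaluations recover the derivative -sin(theta). This is a generic textbook illustration, not code taken from the patent.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta):
    """<Z> after applying RY(theta) to |0>; analytically this equals cos(theta)."""
    state = ry(theta) @ np.array([1.0, 0.0])
    z = np.diag([1.0, -1.0])
    return float(state.conj() @ z @ state)

def parameter_shift_grad(theta, shift=np.pi / 2):
    """Gradient of <Z> with respect to the rotation angle via the parameter-shift rule."""
    return 0.5 * (expectation_z(theta + shift) - expectation_z(theta - shift))

theta = 0.7
print(expectation_z(theta), np.cos(theta))            # both approximately 0.7648
print(parameter_shift_grad(theta), -np.sin(theta))    # both approximately -0.6442
```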
6. The picture classification method according to claim 1, wherein each quantum neural network layer comprises a quantum state preparation sub-network, a variational circuit sub-network and a measurement sub-network; letting a target quantum neural network layer be any one of the quantum neural network layers, the inputting the target feature vector into the plurality of quantum neural network layers arranged in parallel for classification processing to respectively obtain the classification result of each quantum neural network layer comprises:
encoding the target feature vector into a rotation angle of a designated logic gate according to a preset sequence through a quantum state preparation sub-network of the target quantum neural network layer to obtain a quantum state corresponding to the target feature vector, wherein the designated logic gate is a logic gate arranged in the quantum state preparation sub-network of the target quantum neural network layer;
performing variational processing on the quantum state through the variational circuit sub-network of the target quantum neural network layer to obtain the quantum state after variation;
and measuring the quantum state after variation through the measurement sub-network of the target quantum neural network layer to obtain a measurement result, and determining the classification result of the target quantum neural network layer according to the measurement result.
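The three sub-networks of claim 6 (quantum state preparation by angle encoding, a variational circuit, and measurement) could be realized roughly as in the PennyLane sketch below. The qubit count, the RY angle encoding, the CNOT entangling pattern and the single Pauli-Z readout are assumptions made for illustration; the patent does not commit to these specific choices.

```python
# pip install pennylane  (the circuit structure below is an assumption, not the patent's)
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4                                        # assumed number of qubits M per layer
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def quantum_layer(features, weights):
    # Quantum state preparation sub-network: encode each feature component as an RY rotation angle.
    for i in range(n_qubits):
        qml.RY(features[i], wires=i)
    # Variational circuit sub-network: trainable rotations with CNOT entanglement.
    for layer_weights in weights:
        for i in range(n_qubits):
            qml.RY(layer_weights[i], wires=i)
        for i in range(n_qubits - 1):
            qml.CNOT(wires=[i, i + 1])
    # Measurement sub-network: Pauli-Z expectation value used as the layer's output.
    return qml.expval(qml.PauliZ(0))

features = np.array([0.1, 0.5, 0.9, 1.3])                      # one M-dimensional feature vector
weights = np.random.uniform(0, 2 * np.pi, size=(2, n_qubits))  # rotation angles to be trained
score = quantum_layer(features, weights)                       # value in [-1, 1]
```

Running several such circuits side by side, one per reduced feature vector, would give the parallel arrangement recited in claim 1.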
7. The method according to any one of claims 1 to 6, wherein determining a final classification result of the picture to be classified according to the classification result of each quantum neural network layer comprises:
and determining an average value of the classification results of each quantum neural network layer as a final classification result of the picture to be classified.
8. A picture classifying apparatus, comprising:
the first acquisition unit is used for acquiring pictures to be classified;
the feature extraction and dimension reduction unit is used for carrying out feature extraction and dimension reduction on the pictures to be classified to obtain target feature vectors corresponding to the pictures to be classified;
the classifying unit is used for inputting the target feature vector into a plurality of quantum neural network layers arranged in parallel for classification processing, and respectively obtaining a classification result of each quantum neural network layer;
and the first determining unit is used for determining a final classification result of the picture to be classified according to the classification result of each quantum neural network layer.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the picture classification method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the picture classification method according to any one of claims 1 to 7.
CN202310683308.XA 2023-06-09 2023-06-09 Picture classification method and device, electronic equipment and storage medium Pending CN116912556A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310683308.XA CN116912556A (en) 2023-06-09 2023-06-09 Picture classification method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310683308.XA CN116912556A (en) 2023-06-09 2023-06-09 Picture classification method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116912556A (en) 2023-10-20

Family

ID=88361832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310683308.XA Pending CN116912556A (en) 2023-06-09 2023-06-09 Picture classification method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116912556A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117710761A (en) * 2024-02-06 2024-03-15 中国科学院深圳先进技术研究院 Quantum convolution neural network-based magnetic resonance image classification method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination